Uncommon Descent Serving The Intelligent Design Community

Precious: American atheist finds ENCODE to be bullshot science


Not exactly the way he pronounced it, but read it here. Readers will remember that ENCODE found that, contrary to Darwinian (Christian or otherwise) hopes, there was very little human “junk DNA” — and those people have been having a fit about it ever since:

I have seen a shift in the way science is being conducted in the United States. This shift still reflects a minority of the science being done, but it also represents the majority of the science being reported or disseminated to the public. In short, it appears to me that the pendulum has swung from favoring rigorous science to favoring and rewarding what I will call ‘splash’ science. To be clear, this struggle between rigorous and splash science is neither new nor different from previous generations. Nor is all rigorous science not splash and vice versa. However, I think in the US the pendulum has swung dramatically to the splash at the expense of the rigorous. This change in trajectory is not surprising as funding has constricted immediately following a massive expansion. There are too many mouths at the trough and they are competing for those few morsels of grain. More and more, scientific research is being sold on its revolutionary impact and not on its scientific merit. Of course ‘impact’ sounds much more important than ‘merit’. Hell, important and impact both begin with the letter ‘i’ so there must be something to that. It seems much more science is being sold as ‘paradigm shifting,’ ‘completely unexpected,’ ‘novel’ (the only one that is true, but only in the trivial sense), or ‘needing to rewrite the textbooks.’ In these cases, it’s also 99.99999999% bullshit (e.g. ENCODE). More.

Notice, he chooses ENCODE (“if ENCODE is right, then Evolution is wrong”) to rant about.

Not about obvious utter nonsense like the claims of 100 million habitable planets, or that humans are descended from pigs.

No, never that. If you thought about how stupid that is, you could find yourself questioning their whole project.

Sorry I am a little late to this dialog, but some of the early entries I find interesting. Acartia_bogart: "The current estimate of functional DNA is ~9%. But most scientists would agree that this number will increase..." (5) "One states that it might be as high as 20%, another that it might be as high as 40%. So, which number is correct? 20%? 40%? 80%?" So even if we take the most conservative number from the ENCODE scientist block, 20%, we have still more than doubled the amount of DNA that is known to be functional. A primary argument against ID is that it stymies good science. One of ID's strongest predictions is that junk DNA isn't all that junk. It would appear that the "junk DNA" canard has been very effective at stymieing good science. Moose Dr
Semi Related: podcast - Jonathan Wells: Is There Biological Information Outside of the DNA?, pt. 3 http://intelligentdesign.podomatic.com/entry/2014-06-11T16_35_52-07_00 bornagain77
Mice and rats (not rates). Sorry! anthropic
In his review of Signature in the Cell, Francisco Ayala wrote that the repeating ALU sequences in the DNA were a prime example of non-functional junk. But as Richard Sternberg pointed out in his article, Ayala and Falk Miss the Signs in the Genome (Signature of Controversy, p 71), Ayala's prime example looks quite different when we consider that ALUs and ALU-like sequences are found A) where the mammalian genome is enriched for essential functions; B) in the sections of the mammalian genome with the highest rates of transcription; and C) where mammals have the strongest organizational correlations in their genomes. Why would A, B, and C be true if ALUs and ALU-like sequences are really junk? Further, on point C, when Sternberg compared Short Interspersed Nuclear Elements (SINEs) between mice and rats after their lineages split 22 million years ago, he found something intriguing: the SINEs rose and fell together almost in lockstep. This wasn't supposed to happen after the split, when random forces would prevail and the junky SINEs would go on their merry, uncorrelated way. But lineage-specific mutational insertions AFTER THE SPLIT mirrored each other. So why do they move together for 22 million years if SINEs are junk? By the way, Sternberg predicts that, in the end, we will find that "All the expressed 88.5 percent of our DNA has diverse roles in our development." anthropic
of related note: Design In DNA - Alternative Splicing, Duons, and Dual coding genes - video (5:05 minute mark) http://www.youtube.com/watch?v=Bm67oXKtH3s#t=305 bornagain77
BA77 Thank you for the information you provided in your comments. I strongly believe that the answer to my "yes/no" questions in the above posts #20-22 is simply 'yes'. Yes, I agree: the farther away we look at a system, the easier it is to understand the functionality of its components when we then look at them up close. Both far and close observations, of the whole and of the itemized details, must be combined in order to analyze a complex system made of many interacting parts. The whole system is much more than the sum of its components. The functional procedures - borrowing the term from Dr. Puccio - are also part of the whole system. Therefore we must identify and describe them in order to understand the whole system. Does this make sense to you? Note that I'm not arguing with you, but rather adding, in a way, to what you have expressed in your previous comments in this and other threads. :) Dionisio
Dionisio, Not that I'm qualified to answer any of your questions in any way that will be concise and meaningful for you, but it seems to me, from my own novice perspective, that in defining function one must take overall context into consideration. It also seems readily apparent that the narrower the context allowed to be taken into consideration, the less function will be found. As a prime example of what I am talking about, let's consider a grain of sand compared to the size of the universe. Most people would consider that a grain of sand is meaningless (functionless) compared to the size of the universe, but when overall context is taken into consideration, that grain of sand becomes very important (i.e. functional). Evidence for Belief in God - Rich Deem Excerpt: Isn't the immense size of the universe evidence that humans are really insignificant, contradicting the idea that a God concerned with humanity created the universe? It turns out that the universe could not have been much smaller than it is in order for nuclear fusion to have occurred during the first 3 minutes after the Big Bang. Without this brief period of nucleosynthesis, the early universe would have consisted entirely of hydrogen. Likewise, the universe could not have been much larger than it is, or life would not have been possible. If the universe were just one part in 10^59 larger, the universe would have collapsed before life was possible. Since there are only 10^80 baryons in the universe, this means that an addition of just 10^21 baryons (about the mass of a grain of sand) would have made life impossible. The universe is exactly the size it must be for life to exist at all. 
http://www.godandscience.org/apologetics/atheismintro2.html

1 in 10^60 - the fine-tuning of the mass density of the universe

Sand is made up of silica, which has the formula SiO2. Silicon weighs 28 atomic units and oxygen weighs 16 atomic units, so each SiO2 unit weighs 60 atomic units. There are 6.023 x 10^23 atomic units in a gram, so there are 6.023 x 10^23 / 60 = 1 x 10^22 SiO2 units, or 3 x 10^22 atoms, in a gram. Say a grain of sand is 1 mm across: it has a volume of 0.001 cm^3, and since 1 cm^3 of sand weighs about 2.6 g, a grain of sand will weigh 0.0026 g. To find the number of atoms in a grain of sand, we multiply the number of atoms per gram by the number of grams: 3 x 10^22 x 0.0026 g = 7.8 x 10^19 atoms = 1 grain of sand. http://www.thenakedscientists.com/forum/index.php?topic=6447

Thus 10^79 atoms in the universe, divided by the 10^60 fine-tuning of the mass density, leaves 10^19 atoms, about one grain of sand.

Related note: Does the Universe exist for a purpose? The size of the Universe - Overview of the higher math - video http://www.youtube.com/watch?v=onvUdyvkXvQ

To fully appreciate just how amazing it is to find that the mass density of the universe is balanced to approximately one grain of sand, it is good to ponder just how huge the universe is: The Biggest Stars in the Universe - video https://www.youtube.com/watch?v=aCmJmTYS7Zw bornagain77
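As an aside, the grain-of-sand arithmetic above can be double-checked in a few lines of Python, using the comment's own figures (60 atomic units per SiO2 unit, sand density 2.6 g/cm^3, a 1 mm grain):

```python
# Back-of-envelope check of the grain-of-sand arithmetic quoted above.
avogadro = 6.023e23          # atomic units per gram, as used in the comment
sio2_mass = 28 + 2 * 16      # SiO2 = 60 atomic units
units_per_gram = avogadro / sio2_mass  # ~1e22 SiO2 units per gram
atoms_per_gram = 3 * units_per_gram    # 3 atoms per SiO2 unit, ~3e22
grain_mass = 0.001 * 2.6     # 1 mm^3 grain x 2.6 g/cm^3 = 0.0026 g
atoms_per_grain = atoms_per_gram * grain_mass
print(f"{atoms_per_grain:.1e}")  # 7.8e+19, matching the figure above
```

The result agrees with the comment's 7.8 x 10^19 atoms per grain.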
BA77 Apparently the link doesn't work in the previous post. Here it goes again: https://uncommondescent.com/evolution/a-third-way-of-evolution/ Refer to issues raised in comments starting @ # 9 Does the word 'functions' come to mind when researching the implied mechanisms? Does gpuccio's term 'procedures' come to mind when analyzing the implied mechanisms? Can signaling pathways be related to certain level of functionality? Can regulatory networks be associated with the 'function' concept? Dionisio
BA77 Is there anything about functionality in any of the specific issues raised in the following thread? https://uncommondescent.com.....evolution/ Are the many ‘what’ and ‘how’ questions researchers are trying to answer, in some way related to functionality? Yes, no, maybe? Essentially, what do the terms function, functional, functionality, mean?
Starting at comment # 9 in that thread. Dionisio
BA77 Is there anything about functionality in any of the specific issues raised in the following thread? https://uncommondescent.com/evolution/a-third-way-of-evolution/ Are the many 'what' and 'how' questions researchers are trying to answer, in some way related to functionality? Yes, no, maybe? Essentially, what do the terms function, functional, functionality, mean? Dionisio
Of note, reductive materialism does not do 'context' Contextuality is 'magic ingredient' for quantum computing - June 11, 2012 Excerpt: Contextuality was first recognized as a feature of quantum theory almost 50 years ago. The theory showed that it was impossible to explain measurements on quantum systems in the same way as classical systems. In the classical world, measurements simply reveal properties that the system had, such as colour, prior to the measurement. In the quantum world, the property that you discover through measurement is not the property that the system actually had prior to the measurement process. What you observe necessarily depends on how you carried out the observation. Imagine turning over a playing card. It will be either a red suit or a black suit - a two-outcome measurement. Now imagine nine playing cards laid out in a grid with three rows and three columns. Quantum mechanics predicts something that seems contradictory – there must be an even number of red cards in every row and an odd number of red cards in every column. Try to draw a grid that obeys these rules and you will find it impossible. It's because quantum measurements cannot be interpreted as merely revealing a pre-existing property in the same way that flipping a card reveals a red or black suit. Measurement outcomes depend on all the other measurements that are performed – the full context of the experiment. Contextuality means that quantum measurements can not be thought of as simply revealing some pre-existing properties of the system under study. That's part of the weirdness of quantum mechanics. http://phys.org/news/2014-06-weird-magic-ingredient-quantum.html bornagain77
It is simply insane to believe that a random accumulation of genetic accidents (an accumulation of potholes) rose up and built such an astonishing method for DNA repair (i.e. built automated pothole repair machines). Moreover, how this method of DNA repair works is not reducible to the materialistic framework of Neo-Darwinism. This type of repair is only possible if quantum computation is happening within the DNA. Let me give a little background. What is interesting is that the task facing DNA repair machines, 'fixing every pothole in America before the next rush hour', is analogous to the traveling salesman problem. The traveling salesman problem is an NP-hard (read: very hard) problem in computer science; the problem involves finding the shortest possible route between cities, visiting each city only once. 'Traveling salesman problems' are notorious for keeping supercomputers busy for days.
NP-hard problem – Examples Excerpt: Another example of an NP-hard problem is the optimization problem of finding the least-cost cyclic route through all nodes of a weighted graph. This is commonly known as the traveling salesman problem. http://en.wikipedia.org/wiki/NP-hard#Examples Finding: Bees Solve The Traveling Salesman Problem – October 2010 Excerpt: It is a classic problem in the field of computer science: In what order should a salesman visit his prospects? The traveling salesman problem may appear simple but it has engaged some of the greatest mathematical minds and today engages some of the fastest computers. This makes new findings, that bees routinely solve the problem before pollinating flowers, all the more remarkable. https://uncommondescent.com/intelligent-design/finding-bees-solve-the-traveling-salesman-problem/
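As an aside, the combinatorial blow-up the excerpts describe is easy to see in code. The sketch below is my own illustration (the 5-city distance matrix is made up): it brute-forces the shortest tour by checking all (n-1)! routes.

```python
from itertools import permutations

# Hypothetical symmetric distances between 5 cities (illustrative numbers only).
dist = [
    [0, 2, 9, 10, 7],
    [2, 0, 6, 4, 3],
    [9, 6, 0, 8, 5],
    [10, 4, 8, 0, 6],
    [7, 3, 5, 6, 0],
]

def brute_force_tsp(dist):
    """Check every cyclic tour starting at city 0 -- (n-1)! candidate routes."""
    n = len(dist)
    best_cost, best_route = float("inf"), None
    for perm in permutations(range(1, n)):
        route = (0,) + perm + (0,)  # visit each city once, then return home
        cost = sum(dist[a][b] for a, b in zip(route, route[1:]))
        if cost < best_cost:
            best_cost, best_route = cost, route
    return best_cost, best_route

cost, route = brute_force_tsp(dist)
print(cost, route)
```

Each added city multiplies the number of routes to check: at 20 cities there are already about 10^17 tours, which is the wall that exact solvers run into.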
What is interesting is that quantum computers excel in exactly this ‘narrow’ area of computation:
The Limits of Quantum Computers – March 2008 Excerpt: “Quantum computers would be exceptionally fast at a few specific tasks, but it appears that for most problems they would outclass today’s computers only modestly. This realization may lead to a new fundamental physical principle” http://www.scientificamerican.com/article.cfm?id=the-limits-of-quantum-computers Speed Test of Quantum Versus Conventional Computing: Quantum Computer Wins – May 8, 2013 Excerpt: quantum computing is, “in some cases, really, really fast.” McGeoch says the calculations the D-Wave excels at involve a specific combinatorial optimization problem, comparable in difficulty to the more famous “travelling salesperson” problem that’s been a foundation of theoretical computing for decades.,,, “This type of computer is not intended for surfing the internet, but it does solve this narrow but important type of problem really, really fast,” McGeoch says. http://www.sciencedaily.com/releases/2013/05/130508122828.htm
Since it is obvious that there is not a 'classical' supercomputer in the DNA, or cell, busily computing answers to this monster traveling salesman problem in a purely 'material' fashion by crunching bits, it is readily apparent that this monster 'traveling salesman problem' of DNA repair must somehow be computed by 'non-local' quantum computation within the cell and/or within DNA. Moreover, contrary to what materialists thought possible for quantum entanglement in life, quantum entanglement is found along the entirety of the DNA molecule:
Quantum Entanglement/Information in DNA – video https://vimeo.com/92405752
Hameroff comments on the quantum computation capacity of DNA here:
Is DNA a quantum computer? Stuart Hameroff Excerpt: DNA could function as a quantum computers with superpositions of base pair dipoles acting as qubits. Entanglement among the qubits, necessary in quantum computation is accounted for through quantum coherence in the pi stack where the quantum information is shared,,, http://www.quantumconsciousness.org/dnaquantumcomputer1.htm
The insurmountable problem for Darwinists with the finding of quantum entanglement/information, and computation, in DNA is that quantum entanglement/information requires a non-local, beyond space and time, cause in order to explain its effect.
Looking beyond space and time to cope with quantum theory – 29 October 2012 Excerpt: “Our result gives weight to the idea that quantum correlations somehow arise from outside spacetime, in the sense that no story in space and time can describe them,” http://www.quantumlah.org/highlight/121029_hidden_influences.php
Yet, Neo-Darwinists, since they hold reductive materialism to be true, have no beyond space and time cause to appeal to in order to explain the quantum effect in DNA. Whereas, as a Christian Theist, of course I have a beyond space and time cause to appeal to in order to explain the quantum effect in DNA.
John 1:1 In the beginning was the Word,,,
Finding a beyond space and time effect in DNA is a direct empirical falsification of primary Darwinian reductive materialistic claims. Moreover, finding a beyond space and time effect in DNA also lends strong support to the Christian's contention that we have a soul that lives beyond the death of our material/temporal bodies:
Stuart Hameroff – Does Quantum Biology Support A Quantum Soul? – video https://vimeo.com/29895068
High School Musical 2 – You are the music in me http://www.youtube.com/watch?v=IAXaQrh7m1o
There is nothing wrong with the ENCODE press release. In fact, it was too conservative in its assessment. That ENCODE researchers could be bullied by Darwinists away from their initial estimate of widespread functionality is more a testimony to the unsavory influence Darwinists have on biological science than a true assessment of the evidence. Birney, fresh from compiling the data, stated:
"It's very hard to get over the density of information (in the genome),,, The data says it's like a jungle of stuff out there. There are things we thought we understood and yet it is much, much, more complex. And then (there are) places of the genome we thought were completely silent and (yet) they're (now found to be) teeming with life, teeming with things going on. We still really don't understand that." Ewan Birney - senior scientist - Encyclopedia Of DNA Elements - video http://www.youtube.com/watch?v=Y3V2thsJ1Wc
Birney also stated this:
“It's just been an incredible surprise for me. You say, ‘I bet it's going to be complicated', and then you are faced with it and you are like 'My God, that is mind blowing.'” Ewan Birney - senior scientist - ENCODE 2012 Scientists go deeper into DNA (Video report) (Junk No More) - Sept. 2012 http://bcove.me/26vjjl5a
That certainly does not sound like the accumulation of random genetic accidents to me! Nor does it sound like the vast majority of DNA is dead weight just sitting there along for the ride. In fact, there is very good reason to believe there is not any dead weight in the cell sucking up energy: the cell is found to be optimal in its energy efficiency.
Optimal Design of Metabolism - Dr. Fazale Rana - July 2012 Excerpt: A new study further highlights the optimality of the cell’s metabolic systems. Using the multi-dimension optimization theory, researchers evaluated the performance of the metabolic systems of several different bacteria. The data generated by monitoring the flux (movement) of compounds through metabolic pathways (like the movement of cars along the roadways) allowed researchers to assess the behavior of cellular metabolism. They determined that metabolism functions optimally for a system that seeks to accomplish multiple objectives. It looks as if the cell’s metabolism is optimized to operate under a single set of conditions. At the same time, it can perform optimally with relatively small adjustments to the metabolic operations when the cell experiences a change in condition. http://www.reasons.org/articles/the-optimal-design-of-metabolism
If the DNA were 80% dead weight, as Darwinists hold, then we certainly should not be seeing optimal energy efficiency in the metabolic pathways! As well, the 'optimal' energy efficiency of a cell far outclasses anything man has ever devised in computers:
Life Leads the Way to Invention - Feb. 2010 Excerpt: a cell is 10,000 times more energy-efficient than a transistor. “In one second, a cell performs about 10 million energy-consuming chemical reactions, which altogether require about one picowatt (one millionth millionth of a watt) of power.” This and other amazing facts lead to an obvious conclusion: inventors ought to look to life for ideas.,,, http://creationsafaris.com/crev201002.htm#20100226a
But is there any other evidence besides the ENCODE study, and optimal metabolic pathways, to back up Birney's feeling of awe for what he saw in DNA? Yes! The Wyss institute revealed that information storage capacity of DNA is almost beyond belief. Far, far, outclassing anything man has ever devised for computers:
Information Storage in DNA by Wyss Institute - video https://vimeo.com/47615970
Quote from the preceding video: "The theoretical (information) density of DNA is you could store the total world information, which is 1.8 zettabytes, at least in 2011, in about 4 grams of DNA." Sriram Kosuri PhD. - Wyss Institute

Storing information in DNA - Test-tube data - Jan 26th 2013
Excerpt: Dr Goldman’s new scheme is significant in several ways. He and his team have managed to set a record (739.3 kilobytes) for the amount of unique information encoded. But it has been designed to do far more than that. It should, think the researchers, be easily capable of swallowing the roughly 3 zettabytes (a zettabyte is one billion trillion or 10²¹ bytes) of digital data thought presently to exist in the world and still have room for plenty more. http://www.economist.com/news/science-and-technology/21570671-archives-could-last-thousands-years-when-stored-dna-instead-magnetic
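As an aside, the "about 4 grams" figure in the Kosuri quote can be sanity-checked if one assumes the commonly cited theoretical DNA storage density of roughly 455 exabytes per gram (a figure associated with the same group's 2012 work; treat it here as an assumption):

```python
# Sanity check: how many grams of DNA would hold the world's 2011 data?
density = 455e18          # bytes per gram, assumed theoretical DNA density
world_data_2011 = 1.8e21  # 1.8 zettabytes, as quoted above
grams = world_data_2011 / density
print(round(grams, 1))    # 4.0 grams, matching the quote
```

The arithmetic lands almost exactly on the quoted figure.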
A reasonable person would realize that, since our best computer engineers cannot come close to these kinds of specs for information storage, design is perhaps the best explanation! Moreover, the information in DNA is found to be overlapping:
"Not only are there many different codes in the sequences, but they overlap, so that the same letters in a sequence may take part simultaneously in several different messages." Edward N. Trifonov - 2010
Multiple overlapping codes are something our best computer programmers can only dream of imitating on such a large scale. Moreover, the method in which DNA manages the information inherent within it is enough to make any computer systems engineer drool with envy:
Comprehensive Mapping of Long-Range Interactions Reveals Folding Principles of the Human Genome - Oct. 2009
Excerpt: At the megabase scale, the chromatin conformation is consistent with a fractal globule, a knot-free, polymer conformation that enables maximally dense packing while preserving the ability to easily fold and unfold any genomic locus. http://www.sciencemag.org/cgi/content/abstract/326/5950/289

3-D Structure Of Human Genome: Fractal Globule Architecture Packs Two Meters Of DNA Into Each Cell - Oct. 2009
Excerpt: the information density in the nucleus is trillions of times higher than on a computer chip -- while avoiding the knots and tangles that might interfere with the cell's ability to read its own genome. Moreover, the DNA can easily unfold and refold during gene activation, gene repression, and cell replication. -per Science Daily

Scientists' 3-D View of Genes-at-Work Is Paradigm Shift in Genetics - Dec. 2009
Excerpt: Highly coordinated chromosomal choreography leads genes and the sequences controlling them, which are often positioned huge distances apart on chromosomes, to these 'hot spots'. Once close together within the same transcription factory, genes get switched on (a process called transcription) at an appropriate level at the right time in a specific cell type. This is the first demonstration that genes encoding proteins with related physiological role visit the same factory. http://www.sciencedaily.com/releases/2009/12/091215160649.htm
Moreover, DNA repair is another area that inspires awe (at least it does for me), and is another area which far outclasses anything man has ever accomplished in computers (or anywhere else):
Quantum Dots Spotlight DNA-Repair Proteins in Motion – March 2010 Excerpt: “How this system works is an important unanswered question in this field,” he said. “It has to be able to identify very small mistakes in a 3-dimensional morass of gene strands. It’s akin to spotting potholes on every street all over the country and getting them fixed before the next rush hour.” Dr. Bennett Van Houten – of note: A bacterium has about 40 team members on its pothole crew. That allows its entire genome to be scanned for errors in 20 minutes, the typical doubling time.,, These smart machines can apparently also interact with other damage control teams if they cannot fix the problem on the spot. http://www.sciencedaily.com/releases/2010/03/100311123522.htm
I don't know if there's anything wrong with the work that the ENCODE researchers have done or with their findings - the problem is with their press releases. They release reports claiming 80% of the genome is “functional”, but if you look at what ENCODE researchers themselves say, it's more like 8%-20%. Even Ewan Birney, the researcher responsible for the 80% figure, admits that “conservatively” it’s as low as 9%, but that it is feasible it’s as high as 20%. Some of the ENCODE researchers did an interesting question/answer session on Reddit. Larry Moran asked: "There's no question that the press has announced the death of junk DNA. Do you agree that you have demonstrated function for most of our genome?" One of the ENCODE researchers answered:
ABSOLUTELY NOT: I do NOT think ANYONE has demonstrated function for most of our genome. In fact, ENCODE has not demonstrated function for ANYTHING because we published no functional studies. The only thing ENCODE has done is to find new regions on the genome that are correlated, in terms of their chemical signature (i.e. chromatin state of "openness", transcription factor occupancy, etc.), with other regions that have been proven functional by site-directed experiments. Correlated, no more and no less. And furthermore, it is even impossible to properly set thresholds for what is a real chemical signal and what is an artifact in these assays, as MH and I have discussed elsewhere in this thread. The 80% figure is almost certainly not even real chemical signatures. If you notice, 80% of the genome is the percent of the genome that is mappable so right now, I think the 80% figure simply means that if you sequence any complex genome-wide dataset deeply enough, you will eventually return the entire genome. It's just a signal-to-noise issue: if you keep looking, you'll eventually get all the noise possible: the entire mappable genome. Ewan knows this: in his blog, he says that he could either have said the 80% (low-confidence) figure or the more conservative 20% figure that we are more certain is actually telling us something that's more signal and minimal noise. But he chose the 80% figure in the end and the rest is history.
Acartia_bogart, instead of hypotheticals of God speaking directly to ALL men, and since we are, in this present moment, dealing directly with science, why don't we compare Theism to Naturalism/Materialism? The materialistic and Theistic philosophies make, and have made, several contradictory 'natural' predictions about what evidence we will find. These predictions, and the evidence we have found, can (per K. Popper and I. Lakatos) be compared against one another to see which overarching worldview is correct and which is pseudo-science.
1. Naturalism/Materialism predicted time-space energy-matter always existed. Theism predicted time-space energy-matter were created. Big Bang cosmology now strongly indicates that time-space energy-matter had a sudden creation event approximately 14 billion years ago.

2. Naturalism/Materialism predicted that the universe is a self-sustaining system that is not dependent on anything else for its continued existence. Theism predicted that God upholds this universe in its continued existence. Breakthroughs in quantum mechanics reveal that this universe is dependent on a ‘non-local’, beyond space and time, cause for its continued existence.

3. Naturalism/Materialism predicted that consciousness is an ‘emergent property’ of material reality and thus should have no particularly special position within material reality. Theism predicts consciousness precedes material reality and therefore, on that presupposition, consciousness should have a ‘special’ position within material reality. Quantum mechanics reveals that consciousness has a special, even a central, position within material reality.

4. Naturalism/Materialism predicted the rate at which time passed was constant everywhere in the universe. Theism predicted God is eternal and is outside of time. Special Relativity has shown that time, as we understand it, is relative and comes to a complete stop at the speed of light. (Psalm 90:4 - 2 Timothy 1:9)

5. Naturalism/Materialism predicted the universe did not have life in mind and that life was ultimately an accident of time and chance. Theism predicted this universe was purposely created by God with man in mind. Scientists find the universe is exquisitely fine-tuned for carbon-based life to exist in this universe. Moreover it is found, when scrutinizing the details of chemistry, that the universe is not only fine-tuned for carbon-based life, but is specifically fine-tuned for life like human life (M. Denton).

6. Naturalism/Materialism predicted complex life in this universe should be fairly common. Theism predicted the earth is extremely unique in this universe. Statistical analysis of the hundreds of required parameters which enable complex organic life to be possible on earth gives strong indication the earth is extremely unique in this universe (Gonzalez).

7. Naturalism/Materialism predicted it took a very long time for life to develop on earth. Theism predicted life to appear abruptly on earth after water appeared on earth (Genesis 1:10-11). Geochemical evidence from the oldest sedimentary rocks ever found on earth indicates that complex photosynthetic life has existed on earth as long as water has been on the face of the earth.

8. Naturalism/Materialism predicted the first life to be relatively simple. Theism predicted that God is the source of all life on earth. The simplest life ever found on earth is far more complex than any machine man has made through concerted effort. (Michael Denton PhD)

9. Naturalism/Materialism predicted the gradual unfolding of life would (someday) be self-evident in the fossil record. Theism predicted complex and diverse animal life to appear abruptly in the seas in God’s fifth day of creation. The Cambrian Explosion shows a sudden appearance of many different and completely unique fossils within a very short “geologic resolution time” in the Cambrian seas.

10. Naturalism/Materialism predicted there should be numerous transitional fossils found in the fossil record. Theism predicted sudden appearance and rapid diversity within different kinds found in the fossil record. Fossils are consistently characterized by sudden appearance of a group/kind in the fossil record (disparity), then rapid diversity within that group/kind, and then long-term stability and even deterioration of variety within the overall group/kind, and within the specific species of the kind, over long periods of time. Of the few dozen or so fossils claimed as transitional, not one is uncontested as a true example of transition between major animal forms out of millions of collected fossils.

11. Naturalism/Materialism predicted animal speciation should happen on a somewhat constant basis on earth. Theism predicted man was the last species created on earth. Man (our genus ‘modern homo’, as distinct from the highly controversial ‘early homo’) is the last generally accepted major fossil form to have suddenly appeared in the fossil record. (Tattersall; Luskin)

12. Naturalism/Materialism predicted much of the DNA code was junk. Theism predicted we are fearfully and wonderfully made. ENCODE research into the DNA has revealed a “biological jungle deeper, denser, and more difficult to penetrate than anyone imagined.”

13. Naturalism/Materialism predicted an extremely beneficial and flexible mutation rate for DNA, which was ultimately responsible for all the diversity and complexity of life we see on earth. Theism predicted only God created life on earth. The mutation rate of DNA is overwhelmingly detrimental, detrimental to such a point that it is seriously questioned whether there are any truly beneficial, information-building mutations whatsoever. (M. Behe; JC Sanford)

14. Naturalism/Materialism predicted morality is subjective and illusory. Theism predicted morality is objective and real. Morality is found to be deeply embedded in the genetic responses of humans, as well as in the structure of the universe itself, embedded to the point of eliciting physiological responses in humans before humans become aware of the morally troubling situation and even prior to the event happening.

15. Naturalism/Materialism predicted that we are merely our material bodies, with no transcendent component to our being, and that we die when our material bodies die. Theism predicted that we have minds/souls that are transcendent of our bodies and that live past the death of our material bodies. Transcendent and ‘conserved’ (cannot be created or destroyed), ‘non-local’, beyond space-time matter-energy, quantum entanglement/information, which is not reducible to matter-energy space-time, is now found in our material bodies on a massive scale.
As you can see, when we remove the artificial imposition of materialistic philosophy (methodological naturalism) from the scientific method, and look carefully at the predictions of both the materialistic philosophy and the Theistic philosophy side by side, we find the scientific method is very good at pointing us in the direction of Theism as the true explanation. In fact, it is even very good at pointing us to Christianity:
The center of the universe is Life - General Relativity, Quantum Mechanics, Entropy & the Shroud Of Turin - video http://vimeo.com/34084462
Verse and Music:
Isaiah 46:10 "I make known the end from the beginning, from ancient times, what is still to come. I say, 'My purpose will stand, and I will do all that I please.'"
FFH - One of These Days - video https://www.youtube.com/watch?v=CHdSgQVIb5c
Science and Pseudoscience (transcript) - "In degenerating programmes, however, theories are fabricated only in order to accommodate known facts" - Imre Lakatos (November 9, 1922 – February 2, 1974), a philosopher of mathematics and science, as stated in his 1973 LSE Scientific Method Lecture
http://www2.lse.ac.uk/philosophy/about/lakatos/scienceandpseudosciencetranscript.aspx

"Since, to make accurate predictions, it takes a 'mind' to assess what the future may hold and predict what may take place in that future, how could it be otherwise that Darwinism would fail to make accurate predictions? How is it possible for a materialistic theory (Darwinism), which denies the reality of 'mind', to have any real predictive power?" - UD blogger
BA77: "but ID does not make the grand claim that God designed DNA." BA77, any sufficiently advanced technology is indistinguishable from magic. I forget who said that, but it is true. Along the same line, any intelligence that could create, or direct, all life on earth would be indistinguishable from God. Let's try a hypothetical. What if it were proven beyond doubt that all life on earth was the result of an intelligent designer? And what if this designer (or its descendant) were still alive and told us that his kind had evolved through natural processes very similar to natural selection? Do you really think that the ID crowd would accept this? Or would they try to argue that the designer was himself designed by another intelligent designer, one that did not arise through evolution? Acartia_bogart
Acartia_bogart, but ID does not make the grand claim that God designed DNA! ID makes the minimal claim that Intelligence, over and above unguided Darwinian processes, is the best explanation for the staggering levels of overlapping information we find in DNA. Although I personally push the envelope to try to infer God as the best explanation, and believe the evidence backs me up, I certainly do not represent ID at large, nor the minimal claim that is central to ID theory. bornagain77
Acartia_bogart, there are many ways to infer functionality across the entire genome. Claiming most of the genome is non-functional because we do not yet know its precise function (i.e., arguing from ignorance) is simply insane, whereas inferring functionality for virtually 100% of the genome, even though we do not know the precise function of all of it, is relatively easy. One way to infer widespread functionality, despite not knowing precise function, is to note that the entire genome, contrary to Darwinian presuppositions, is subject to multiple layers of DNA repair:
Repair mechanisms in DNA include:
- A proofreading system that catches almost all errors
- A mismatch repair system to back up the proofreading system
- Photoreactivation (light repair)
- Removal of methyl or ethyl groups by O6-methylguanine methyltransferase
- Base excision repair
- Nucleotide excision repair
- Double-strand DNA break repair
- Recombination repair
- Error-prone bypass
http://www.newgeology.us/presentation32.html

The Evolutionary Dynamics of Digital and Nucleotide Codes: A Mutation Protection Perspective - February 2011
Excerpt: "Unbounded random change of nucleotide codes through the accumulation of irreparable, advantageous, code-expanding, inheritable mutations at the level of individual nucleotides, as proposed by evolutionary theory, requires the mutation protection at the level of the individual nucleotides and at the higher levels of the code to be switched off or at least to dysfunction. Dysfunctioning mutation protection, however, is the origin of cancer and hereditary diseases, which reduce the capacity to live and to reproduce. Our mutation protection perspective of the evolutionary dynamics of digital and nucleotide codes thus reveals the presence of a paradox in evolutionary theory between the necessity and the disadvantage of dysfunctioning mutation protection. This mutation protection paradox, which is closely related with the paradox between evolvability and mutational robustness, needs further investigation."
http://www.benthamscience.com/open/toevolj/articles/V005/1TOEVOLJ.pdf

Contradiction in evolutionary theory - video (the contradiction between extensive DNA repair mechanisms and the necessity of 'random mutations/errors' for Darwinian evolution)
http://www.youtube.com/watch?v=dzh6Ct5cg1o

The Darwinism contradiction of repair systems
Excerpt: The bottom line is that repair mechanisms are incompatible with Darwinism in principle. Since sophisticated repair mechanisms do exist in the cell after all, the thing to discard in the dilemma, to avoid the contradiction, is necessarily the Darwinist dogma.
https://uncommondescent.com/intelligent-design/the-darwinism-contradiction-of-repair-systems/
Another way to infer widespread functionality across the entire genome, despite not knowing its precise function, is empirical:
Jonathan Wells on Darwinism, Science, and Junk DNA - November 2011
Excerpt: Mice without "junk" DNA. In 2004, Edward Rubin and a team of scientists at Lawrence Berkeley Laboratory in California reported that they had engineered mice missing over a million base pairs of non-protein-coding ("junk") DNA—about 1% of the mouse genome—and that they could "see no effect in them." But molecular biologist Barbara Knowles (who reported the same month that other regions of non-protein-coding mouse DNA were functional) cautioned that the Lawrence Berkeley study didn't prove that non-protein-coding DNA has no function. "Those mice were alive, that's what we know about them," she said. "We don't know if they have abnormalities that we don't test for." And University of California biomolecular engineer David Haussler said that the deleted non-protein-coding DNA could have effects that the study missed. "Survival in the laboratory for a generation or two is not the same as successful competition in the wild for millions of years," he argued. In 2010, Rubin was part of another team of scientists that engineered mice missing a 58,000-base stretch of so-called "junk" DNA. The team found that the DNA-deficient mice appeared normal until they (along with a control group of normal mice) were fed a high-fat, high-cholesterol diet for 20 weeks. By the end of the study, a substantially higher proportion of the DNA-deficient mice had died from heart disease. Clearly, removing so-called "junk" DNA can have effects that appear only later or under other circumstances.
https://uncommondescent.com/intelligent-design/jonathan-wells-on-darwinism-science-and-junk-dna/

Shoddy Engineering or Intelligent Design? Case of the Mouse's Eye - April 2009
Excerpt: The (entire) nuclear genome is thus transformed into an optical device that is designed to assist in the capturing of photons. This chromatin-based convex (focusing) lens is so well constructed that it still works when lattices of rod cells are made to be disordered. Normal cell nuclei actually scatter light. So the next time someone tells you that it "strains credulity" to think that more than a few pieces of "junk DNA" could be functional in the cell, remind them of the rod cell nuclei of the humble mouse.
http://www.evolutionnews.org/2009/04/shoddy_engineering_or_intellig.html
Another way to infer widespread functionality across the entire genome, despite lacking knowledge of precise function, is to note the trend in the evidence: every element that Darwinists have insisted is junk has been shown to have function of one kind or another. There has only been an increase in the amount of the genome known to be functional, never a decrease!

Reference Notes for Jonathan Wells' Book - The Myth of Junk DNA - Hundreds of Studies Outlining Function for various elements of 'Junk' DNA
http://docs.google.com/viewer?a=v&q=cache:zGp3gRRDmA0J:www.discovery.org/scripts/viewDB/filesDB-download.php%3Fcommand%3Ddownload%26id%3D7651+Sequence-dependent+and+sequence-independent+functions+of+%E2%80%9Cjunk%E2%80%9D+DNA:+do+we+need+an+expanded+concept+of+biological+information%3F+Jonathan+Wells&hl=en&gl=us&pid=bl&srcid=ADGEESiCq0TQUSKYlr0KNNIDgaGKMM7b3z0iEGiKe_faSd0646SzaYSoCCcNavm523X5TgaGbdQPtDFmN6Yw8IexI44RokfsMKs6q-EEeM_vyYw-zaMB-h_7wKu8JjGREn_JF-CPlkSq&sig=AHIEtbRfG8rv_5eur2oifBsWxHdM_e731g

Moreover:

Astonishing DNA complexity update
Excerpt: The untranslated regions (now called UTRs, rather than 'junk') are far more important than the translated regions (the genes), as measured by the number of DNA bases appearing in RNA transcripts. Genic regions are transcribed on average in five different overlapping and interleaved ways, while UTRs are transcribed on average in seven different overlapping and interleaved ways. Since there are about 33 times as many bases in UTRs as in genic regions, that makes the 'junk' about 50 times more active than the genes.
http://creation.com/astonishing-dna-complexity-update

bornagain77
The ID insistence on labeling large chunks of things that they do not understand as designed by God is simply insane. Acartia_bogart
BA77, even the authors of the ENCODE project do not agree on the level of functionality. One states that it might be as high as 20%, another that it might be as high as 40%. So which number is correct? 20%? 40%? 80%? Acartia_bogart
Contrary to your preferred atheistic beliefs, ENCODE assumptions about functionality, as the videos I highlighted clearly illustrate for all to see, are not wrong. Whereas Darwinian presuppositions about non-functionality, based on ignorance, are not only 'not even wrong' but insane. bornagain77
BA77: "The Darwinian insistence on labeling large chunks of DNA, that they do not understand, as Junk DNA is simply insane:" You are entitled to your beliefs, but that doesn't change the fact that the ENCODE assumptions of functionality are wrong. The only thing that is surprising is that they did not claim 100% functionality because 100% of the DNA is involved in a chemical process every time a cell divides. Acartia_bogart
The Darwinian insistence on labeling large chunks of DNA, that they do not understand, as Junk DNA is simply insane: DNA - Replication, Wrapping & Mitosis - video https://vimeo.com/33882804 Multidimensional Genome – Dr. Robert Carter – video http://www.metacafe.com/watch/8905048/ The Extreme Complexity Of Genes – Dr. Raymond G. Bohlin - video http://www.metacafe.com/watch/8593991/ bornagain77
Thanks, that helps me understand your position more. Nothing wrong with BA77 blasts. They put more on the table. Bateman
Bateman, in short, the ENCODE study defined something as functional if there was any chemical reaction at all, regardless of whether it was transcribed into a protein or was used for other regulatory functions. DNA is a chemical, and as such it will demonstrate chemical activity, but not all of it is necessarily functional. The current estimate of functional DNA is ~9%, but most scientists would agree that this number will increase as new approaches are used to study the DNA and its functionality. The following is a BA77-esque shotgun blast of excerpts from various references:

"So if ever a transcription factor, in any cell, bound however briefly to a stretch of DNA, they declared it to be functional. That's nonsense." "The ENCODE group could only declare function for a sequence by ignoring all other context than the local and immediate effect of a chemical interaction..."
http://freethoughtblogs.com/pharyngula/2013/02/22/encode-gets-a-public-reaming/

"It was already known, for example, that vast portions of the genome are transcribed into RNA. A small amount of that RNA encodes protein, and some serves a regulatory role, but the rest of it is chock-full of seemingly nonsensical repeats, remnants of past viruses and other weird little bits that shouldn't serve a purpose." "Michael Eisen, an evolutionary biologist at the University of California, Berkeley, said in a blog post that this pushed 'a narrative about their results that is, at best, misleading.'"
http://blogs.nature.com/news/2012/09/fighting-about-encode-and-junk.html

"ENCODE's publicity first presented a misleading 'all the textbooks are wrong' narrative about noncoding human DNA."
http://www.sciencedirect.com/science/article/pii/S0960982213002893

"Biochemical activity is not reliable evidence for biological function."
http://www.sciencedirect.com/science/article/pii/S0006291X12024229

"Those kinds of criticisms are common in journal clubs and, certainly, in the blogosphere, but scientific journals generally don't publish them. It's okay to refute the data (as in the arsenic affair) but ideas usually get a free pass no matter how stupid they are. In this case, the ENCODE Consortium did such a bad job of describing their data that journals had to pay attention."
http://sandwalk.blogspot.ca/2013/03/ford-doolittles-critique-of-encode.html

"For example, a random sequence may bind a transcription factor, but that may not result in transcription. The ENCODE authors apply this flawed reasoning to all their functions." "One of the lead authors of ENCODE admitted that the press conference misled people by claiming that 80% of our genome was 'essential and useful.' He put that number at 40% (Gregory 2012), while another lead author reduced the fraction of the genome that is devoted to function to merely 20% (Hall 2012). Interestingly, even when a lead author of ENCODE reduced the functional genomic fraction to 20%..." "In its synopsis of the year 2012, the journal Nature adopted the more modest estimate, and summarized the findings of ENCODE by stating that 'at least 20% of the genome can influence gene expression' (Van Noorden 2012)... Unfortunately, neither 80% nor 20% are based on actual evidence."
http://gbe.oxfordjournals.org/content/early/2013/02/20/gbe.evt028.full.pdf+html

Acartia_bogart
AB, you've been called out on this issue before, and Tjguy just reiterated it. You can't just assert a point without backing it up. I don't know much about ENCODE; could you please explain why it has issues with logic and design? Bateman
AB, could you please enlighten us as to the flaws in logic and design that the ENCODE scientists committed and that have become glaring? I think a better example of that would be the recent splash announcement that inflation has been verified! I think what we see here is evolutionists questioning the evidence simply because it is hard to reconcile with their worldview and leaves them with egg on their face, given their enthusiastic support of the junk DNA idea. tjguy
"... when it is examined, the flaws in logic and design become glaring. We have seen this all too often (e.g., cold fusion, bacteria in Mars rocks, ...)" ... and Piltdown man, tree of life, Miller-Urey, selfish gene, junk DNA, 98% chimp/human, etc., etc., etc. cantor
Talk about overselling your point. ENCODE is only mentioned once in the article, and the rest of the discussion uses different examples. The ENCODE issue, unfortunately, is all too common in science today: the scientists treated it as a marketing and media event rather than using the normal channels for scientific communication. It makes an initial big stir and then, when it is examined, the flaws in logic and design become glaring. We have seen this all too often (e.g., cold fusion, bacteria in Mars rocks, etc.). Acartia_bogart
