
Debate Redux: The Myth of Natural Selection


Philosophers call it incommensurability—when the language and underlying concepts are so different that theorists cannot even have meaningful communication. Anyone who doubts the reality of incommensurability need look no further than this weekend’s “What’s Behind It All? God, Science, and the Universe” debate, where Stephen Meyer explained the random nature of evolution and the limits of natural selection, and evolutionists Lawrence Krauss and Denis Lamoureux denied any such thing, insisting that evolution is not random because, after all, natural selection provides the direction and creates new designs. The funny thing about this particular instance of incommensurability is that the evolutionists’ argument, which is a standard line, is itself incommensurate with evolutionary theory.

11 Replies to “Debate Redux: The Myth of Natural Selection”

  1.
    Lee Spetner says:

    Apparently neither Krauss nor Lamoureux has taken the time to understand evolution. They are just repeating the mantra that evolution is not random because natural selection gives it a direction. Many evolutionists, as far as I know, have abandoned this mantra because they know it is not true. What is important in this context is that mutations are random and statistically independent of harm or benefit to the organism. Moreover, for neo-Darwinian theory (i.e., the Modern Synthesis) to be acceptable as a scientific theory, the probability of common descent (their major claim) must be shown by calculation to be reasonably high. Any competent calculation of this probability has shown it to be vanishingly small. Therefore any attempt to explain the evolution of life using random mutations is a failure. There is ample evidence, however, that the mutations (i.e., genetic changes) that are important for evolution are not random, but are driven by stimuli stemming from environmental change. These stimuli act on a built-in mechanism in the organism to cause it to adapt to the new environment.
    Lee

  2.
    bornagain77 says:

    Natural selection fails on so many levels. First off, it fails mathematically and empirically.

    Just how bad the math and empirical evidence are for natural selection is gone over here:

    When Theory and Experiment Collide — April 16th, 2011 by Douglas Axe
    Excerpt: Based on our experimental observations and on calculations we made using a published population model [3], we estimated that Darwin’s mechanism would need a truly staggering amount of time—a trillion trillion years or more—to accomplish the seemingly subtle change in enzyme function that we studied.
    http://biologicinstitute.org/2.....t-collide/  

    “Shared Evolutionary History or Shared Design?” – Ann Gauger – January 1, 2015
    Excerpt: The waiting time required to achieve four mutations is 10^15 years. That’s longer than the age of the universe. The real waiting time is likely to be much greater, since the two most likely candidate enzymes failed to be coopted by double mutations.
    http://www.evolutionnews.org/2.....92291.html

    Is There Enough Time For Humans to have Evolved from Apes? Dr. Ann Gauger Answers – video
    http://www.youtube.com/watch?v=KN7NwKYUXOs

    Human Evolution: A Facebook Dialog – By Ann Gauger – Nov. 12, 2012
    Excerpt: PM: Is it also possible that the mechanism that you refer to in your video clip is not the only/main one at play?
    Biologic: The mechanism I refer to is based on the standard Darwinian model for evolution. Published population genetics estimates for how long it would take to make *and fix* a single base change to a DNA binding site in a 1 kb segment of DNA are prohibitively long—six million years. To get a second mutation in the same DNA binding site would take in excess of 200 million years.
    Now to go from hominid to human requires many changes, most of them to gene expression patterns. It is much easier to change the DNA binding site than to change the transcription factor’s specificity. And all these mutations must work together and be beneficial to the evolving organism. The window of time available according to the fossil record and phylogenetic estimates is too short for known mechanisms to be sufficient. So do I think there are other things at play?
    Yes.
    http://www.biologicinstitute.o.....ialog?og=1

    More from Ann Gauger on why humans didn’t happen the way Darwin said – July 2012
    Excerpt: Each of these new features probably required multiple mutations. Getting a feature that requires six neutral mutations is the limit of what bacteria can produce. For primates (e.g., monkeys, apes and humans) the limit is much more severe. Because of much smaller effective population sizes (an estimated ten thousand for humans instead of a billion for bacteria) and longer generation times (fifteen to twenty years per generation for humans vs. a thousand generations per year for bacteria), it would take a very long time for even a single beneficial mutation to appear and become fixed in a human population.
    You don’t have to take my word for it. In 2007, Durrett and Schmidt estimated in the journal Genetics that for a single mutation to occur in a nucleotide-binding site and be fixed in a primate lineage would require a waiting time of six million years. The same authors later estimated it would take 216 million years for the binding site to acquire two mutations, if the first mutation was neutral in its effect.
    Facing Facts
    But six million years is the entire time allotted for the transition from our last common ancestor with chimps to us according to the standard evolutionary timescale. Two hundred and sixteen million years takes us back to the Triassic, when the very first mammals appeared. One or two mutations simply aren’t sufficient to produce the necessary changes in the time available. At most, a new binding site might affect the regulation of one or two genes.
    http://www.uncommondescent.com.....rwin-said/
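The waiting-time figures quoted above can be sanity-checked with a back-of-envelope sketch. The Python below is NOT the Durrett and Schmidt model; it is a crude standard neutral-theory estimate, and every parameter value (mutation rate, target size, generation time) is an assumed round number, with the effective population size taken from the excerpt:

```python
# Back-of-envelope sketch of a neutral waiting-time estimate, far
# cruder than the population-genetics models quoted above.

MU = 1e-8          # assumed mutation rate per site per generation
L = 1000           # assumed DNA target region (the "1 kb segment")
N_E = 1e4          # effective population size quoted for humans
GEN_YEARS = 20     # assumed generation time in years

# New copies arise at 2*N_E*L*MU per generation across the population;
# each neutral copy fixes with probability 1/(2*N_E), so "successful"
# (destined-to-fix) mutations arrive at rate L*MU per generation.
gens_until_success = 1 / (L * MU)

# A neutral mutation that does fix drifts for roughly 4*N_E generations.
gens_to_fix = 4 * N_E

years = (gens_until_success + gens_to_fix) * GEN_YEARS
print(f"~{years:.1e} years")
```

Even this stripped-down version lands in the millions of years, the same ballpark as the single-mutation figure quoted in the excerpts.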

    Richard Sternberg applies Durrett and Schmidt’s population genetics model to the hypothetical evolution of whales here

    Whale Evolution Vs. Population Genetics – Richard Sternberg PhD. in Evolutionary Biology – video
    http://www.youtube.com/watch?v=85kThFEDi8o

    Evolution And Probabilities: A Response to Jason Rosenhouse – August 2011
    Excerpt: The equations of population genetics predict – assuming an effective population size of 100,000 individuals per generation and a generation turnover time of 5 years, according to Richard Sternberg’s calculations based on the equations applied in the Durrett and Schmidt paper – that one may reasonably expect two specific co-ordinated mutations to achieve fixation in a timeframe of around 43.3 million years. When one considers the magnitude of the engineering feat, such a scenario is found to be devoid of credibility.
    http://www.uncommondescent.com.....osenhouse/

    In the following podcasts, Casey Luskin interviews Dr. Richard Sternberg, evolutionary biologist and CSC Senior Fellow, whose discussion of whale origins is featured in Illustra Media’s 2015 documentary, “Living Waters: Intelligent Design in the Oceans of the Earth”. Sternberg critiques conventional accounts of whale evolution, noting that neither natural selection nor neutral drift can explain the transition between a land mammal and a fully aquatic whale. Standard evolutionary models of population genetics would require either very large breeding population sizes (greater than that of any species of mammal) or a waiting period four or more times longer than the given 8-9 million years.

    Listen: Evolutionary Biologist Richard Sternberg on the Problem of Whale Origins – September 9, 2015
    http://www.evolutionnews.org/2.....99201.html

    podcast – Dr. Richard Sternberg: Whale Evolution and Living Waters, Pt. 2
    http://www.discovery.org/multi.....ters-pt-2/

  3.
    bornagain77 says:

    Ironically, Durrett and Schmidt, with the very un-Darwinian 216-million-year estimate from their population genetics model, had originally been trying to refute Behe’s 1 in 10^20 empirical observation for the ‘edge of evolution’ with that hypothetical mathematical model.

    Waiting Longer for Two Mutations – Michael J. Behe
    Excerpt: Citing malaria literature sources (White 2004) I had noted that the de novo appearance of chloroquine resistance in Plasmodium falciparum was an event of probability of 1 in 10^20. I then wrote that ‘for humans to achieve a mutation like this by chance, we would have to wait 100 million times 10 million years’ (1 quadrillion years)(Behe 2007) (because that is the extrapolated time that it would take to produce 10^20 humans). Durrett and Schmidt (2008, p. 1507) retort that my number ‘is 5 million times larger than the calculation we have just given’ using their model (which nonetheless “using their model” gives a prohibitively long waiting time of 216 million years). Their criticism compares apples to oranges. My figure of 10^20 is an empirical statistic from the literature; it is not, as their calculation is, a theoretical estimate from a population genetics model.,,,
    The difficulty with models such as Durrett and Schmidt’s is that their biological relevance is often uncertain, and unknown factors that are quite important to cellular evolution may be unintentionally left out of the model. That is why experimental or observational data on the evolution of microbes such as P. falciparum are invaluable,,,
    http://www.discovery.org/a/9461
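The numbers quoted in the Behe excerpt can be cross-checked with simple arithmetic: his “100 million times 10 million years” and Durrett and Schmidt’s “5 million times larger” retort are mutually consistent figures:

```python
# Arithmetic check of the figures quoted in the excerpt above.
behe_years = 100e6 * 10e6   # "100 million times 10 million years"
# = 1e15, i.e. the "1 quadrillion years" stated in parentheses.

ds_years = 216e6            # Durrett and Schmidt's model estimate
ratio = behe_years / ds_years
print(f"Behe / Durrett-Schmidt ratio: {ratio:.2e}")
# ~4.6 million, consistent (to rounding) with their "5 million times
# larger" characterization.
```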

    Don’t Mess With ID (Overview of Behe’s ‘Edge’ and Durrett and Schmidt’s paper at the 20:00 minute mark) – Paul Giem – video
    http://www.youtube.com/watch?v=5JeYJ29-I7o

    Although 216 million years is certainly very antagonistic to Darwinian claims, Dr. Behe responded to Durrett and Schmidt’s “attempted rebuttal” in a 5 part essay:

    Waiting Longer for Two Mutations, Parts 1-5
    http://behe.uncommondescent.com/2009/03/

    A summary at the end of part 5 is here:

    Waiting Longer for Two Mutations, Part 5 – Michael J. Behe – March 2009
    Excerpt: “as I show above, when simple mistakes in the application of their model to malaria are corrected, it agrees closely with empirical results reported from the field that I cited. This is very strong support that the central contention of The Edge of Evolution is correct: that it is an extremely difficult evolutionary task for multiple required mutations to occur through Darwinian means, especially if one of the mutations is deleterious. And, as I argue in the book, reasonable application of this point to the protein machinery of the cell makes it very unlikely that life developed through a Darwinian mechanism.”
    http://behe.uncommondescent.co.....ns-part-5/

    Moreover, Behe’s empirical observation for the ‘Edge of Evolution’ is now borne out in the laboratory:

    The Vindication of Michael Behe – podcast/video – 2014
    https://www.youtube.com/watch?v=itkxFbyzyro

    Michael Behe – Empirically Observed 1 in 10^20 Limit of Evolution – video – Lecture delivered in April 2015 at Colorado School of Mines
    25:56 minute quote – “This is not an argument any more that Darwinism cannot make complex functional systems; it is an observation that it does not.”
    https://www.youtube.com/watch?v=9svV8wNUqvA

    Guide of the Perplexed: A Quick Reprise of The Edge of Evolution – Michael Behe – August 20, 2014
    Excerpt: *Any particular adaptive biochemical feature requiring the same mutational complexity as that needed for chloroquine resistance in malaria is forbiddingly unlikely to have arisen by Darwinian processes and fixed in the population of any class of large animals (such as, say, mammals), because of the much lower population sizes and longer generation times compared to that of malaria. (By “the same mutational complexity” I mean requiring 2-3 point mutations where at least one step consists of intermediates that are deleterious, plus a modest selection coefficient of, say, 1 in 10^3 to 1 in 10^4. Those factors will get you in the neighborhood of 1 in 10^20.)
    *Any adaptive biological feature requiring a mutational pathway of twice that complexity (that is, 4-6 mutations with the intermediate steps being deleterious) is unlikely to have arisen by Darwinian processes during the history of life on Earth.,,,
    What’s more, Nicholas White’s factor of 1 in 10^20 already has built into it all the ways to evolve chloroquine resistance in P. falciparum. In the many malarial cells exposed to chloroquine there have surely occurred all possible single mutations and probably all possible double mutations — in every malarial gene — yet only a few mutational combinations in pfcrt are effective. In other words, mutation and selection have already searched all possible solutions of the entire genome whose probability is greater than 1 in 10^20, including mutations to other genes. The observational evidence demonstrates that only a handful are effective. There is no justification for arbitrarily inflating probabilistic resources by citing imaginary alternative evolutionary routes.
    http://www.evolutionnews.org/2.....89161.html
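A crude reading of the factors Behe lists above does land in the stated neighborhood. The per-site mutation rate below is an assumed round number, not a figure from the excerpt:

```python
# Rough combination of the factors in the excerpt: two specific point
# mutations (with a deleterious intermediate) plus a modest selection
# coefficient. MU is an assumed round-number mutation rate.
MU = 1e-8
p_two_specific = MU * MU          # both specific changes together
for s in (1e-3, 1e-4):            # the "modest selection coefficient"
    print(f"s = {s:g}: combined factor ~ {p_two_specific * s:.0e}")
```

With those assumed inputs the combined factor comes out around 10^-19 to 10^-20, i.e. “in the neighborhood of 1 in 10^20” as the excerpt says.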

    Because of such insurmountable problems from mathematics and empirical evidence, some Darwinists, at least those who are honest with the math and the empirical evidence, have abandoned natural selection as a major player in the theory of evolution.

    On Enzymes and Teleology – Ann Gauger – July 19, 2012
    Excerpt: People have been saying for years, “Of course evolution isn’t random, it’s directed by natural selection. It’s not chance, it’s chance and necessity.” But in recent years the rhetoric has changed. Now evolution is constrained. Not all options are open, and natural selection is not the major player, it’s the happenstance of genetic drift that drives change. But somehow it all happens anyway, and evolution gets the credit.
    http://www.evolutionnews.org/2.....62391.html

    Majestic Ascent: Berlinski on Darwin on Trial – David Berlinski – November 2011
    Excerpt: The publication in 1983 of Motoo Kimura’s The Neutral Theory of Molecular Evolution consolidated ideas that Kimura had introduced in the late 1960s. On the molecular level, evolution is entirely stochastic, and if it proceeds at all, it proceeds by drift along a leaves-and-current model. Kimura’s theories left the emergence of complex biological structures an enigma, but they played an important role in the local economy of belief. They allowed biologists to affirm that they welcomed responsible criticism. “A critique of neo-Darwinism,” the Dutch biologist Gert Korthof boasted, “can be incorporated into neo-Darwinism if there is evidence and a good theory, which contributes to the progress of science.”
    By this standard, if the Archangel Gabriel were to accept personal responsibility for the Cambrian explosion, his views would be widely described as neo-Darwinian.
    http://www.evolutionnews.org/2.....53171.html

    Kimura’s Quandary
    Excerpt: Kimura realized that Haldane was correct,,, He developed his neutral theory in response to this overwhelming evolutionary problem. Paradoxically, his theory led him to believe that most mutations are unselectable, and therefore,,, most ‘evolution’ must be independent of selection! Because he was totally committed to the primary axiom (neo-Darwinism), Kimura apparently never considered his cost arguments could most rationally be used to argue against the Axiom’s (neo-Darwinism’s) very validity.
    John Sanford PhD. – “Genetic Entropy and The Mystery of the Genome” – pg. 161 – 162

  4.
    bornagain77 says:

    A graph featuring ‘Kimura’s Distribution’ being properly used is shown in the following video:

    Evolution vs Genetic Entropy – Andy McIntosh – video – ‘Kimura’s Distribution’ 59:27 minute mark
    https://youtu.be/-GLJE4FbHnk?t=3567

    As you can see from the graph of ‘Kimura’s Distribution’ in the preceding video, the belief that many mutations are completely neutral, as held in ‘neutral theory’, is a fallacious Darwinian belief born of theoretical concerns: concerns that seek, minus natural selection, to preserve at least some explanatory power for random mutations.
    Yet both theoretical concerns and empirical findings undermine this Darwinian hope for the complete neutrality of mutations.

    “Moreover, there is strong theoretical reasons for believing there is no truly neutral nucleotide positions. By its very existence, a nucleotide position takes up space, affects spacing between other sites, and affects such things as regional nucleotide composition, DNA folding, and nucleosome building. If a nucleotide carries absolutely no (useful) information, it is, by definition, slightly deleterious, as it slows cell replication and wastes energy.,, Therefore, there is no way to change any given site without some biological effect, no matter how subtle.”
    – John Sanford – Genetic Entropy and The Mystery of The Genome – pg. 21 – Inventor of the ‘Gene Gun’

    Multiple Overlapping Genetic Codes Profoundly Reduce the Probability of Beneficial Mutation George Montañez 1, Robert J. Marks II 2, Jorge Fernandez 3 and John C. Sanford 4 – May 2013
    Excerpt: It is almost universally acknowledged that beneficial mutations are rare compared to deleterious mutations [1–10].,, It appears that beneficial mutations may be too rare to actually allow the accurate measurement of how rare they are [11].
    http://www.worldscientific.com.....08728_0006

    Unexpectedly small effects of mutations in bacteria bring new perspectives – November 2010
    Excerpt:,,, using extremely sensitive growth measurements, doctoral candidate Peter Lind showed that most mutations reduced the rate of growth of bacteria by only 0.500 percent. No mutations completely disabled the function of the proteins, and very few had no impact at all. Even more surprising was the fact that mutations that do not change the protein sequence had negative effects similar to those of mutations that led to substitution of amino acids. A possible explanation is that most mutations may have their negative effect by altering mRNA structure, not proteins, as is commonly assumed.
    http://www.physorg.com/news/20.....teria.html

    As was pointed out by Dr. McIntosh in the ‘Evolution vs Genetic Entropy’ video, a major problem for Natural Selection, (that is so easy to understand that even a child can understand it), is what is termed the “Princess and the Pea” paradox. Dr. John Sanford goes over the “Princess and the Pea” paradox in the following video at the 8:14 minute mark:

    Dr. John Sanford: Genetic Entropy and the Mystery of the Genome – video
    https://youtu.be/eY98io7JH-c?t=495

    The basic point of the “Princess and the Pea” paradox is that Natural Selection can only select for an entire organism and cannot select for individual nucleotides that are buried underneath the ‘Princess’s mattress’ of 100 trillion cells, with each cell containing 3 billion base pairs of DNA and approximately a billion protein molecules. It is something like a brain surgeon trying to do brain surgery with boxing gloves on. A similar analogy is to imagine trying to write a new computer program by allowing random errors to enter the programs of many computers, then throwing out every computer whose program crashed and keeping only those computers which had not yet crashed. (Come to think of it, perhaps that is how Windows 10 was written 🙂 ).

    As you can somewhat see from that analogy, when selecting at the whole-organism level there is nothing to prevent slightly deleterious mutations from accumulating in the computer programs that have not yet crashed.
    This intuitively obvious implication is borne out in numerical simulations.

    Can Purifying Natural Selection Preserve Biological Information? – May 2013 –
    Paul Gibson, John R. Baumgardner, Wesley H. Brewer, John C. Sanford
    In conclusion, numerical simulation shows that realistic levels of biological noise result in a high selection threshold. This results in the ongoing accumulation of low-impact deleterious mutations, with deleterious mutation count per individual increasing linearly over time. Even in very long experiments (more than 100,000 generations), slightly deleterious alleles accumulate steadily, causing eventual extinction. These findings provide independent validation of previous analytical and simulation studies [2–13]. Previous concerns about the problem of accumulation of nearly neutral mutations are strongly supported by our analysis. Indeed, when numerical simulations incorporate realistic levels of biological noise, our analyses indicate that the problem is much more severe than has been acknowledged, and that the large majority of deleterious mutations become invisible to the selection process.,,,
    http://www.worldscientific.com.....08728_0010   
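The “selection threshold” effect described in the excerpt can be illustrated with a toy simulation. The sketch below is not the authors’ program; it is a minimal model with assumed parameter values (mutation rate, selection coefficient, noise level), showing low-impact deleterious mutations accumulating roughly linearly when fitness noise swamps their tiny individual effects:

```python
import random

# Toy sketch of selection-threshold accumulation: each generation every
# genome gains ~Poisson(U) slightly deleterious mutations; truncation
# selection keeps the noisily-measured fitter half. All parameters are
# illustrative assumptions.
random.seed(0)
POP, GENS = 200, 200
U = 1.0          # new deleterious mutations per genome per generation
S = 1e-4         # tiny selection coefficient per mutation
NOISE = 0.05     # environmental noise on measured fitness

def poisson(lam):
    # Knuth's method; fine for small lambda
    L, k, p = pow(2.718281828, -lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

counts = [0] * POP                     # mutation count per genome
for gen in range(GENS):
    counts = [c + poisson(U) for c in counts]          # mutation
    # noisy fitness, then keep the top half (truncation selection)
    scored = sorted(counts,
                    key=lambda c: (1 - S) ** c + random.gauss(0, NOISE),
                    reverse=True)
    counts = scored[:POP // 2] * 2     # survivors leave two offspring

print("mean mutations per genome:", sum(counts) / POP)
```

Because a per-mutation fitness difference of 10^-4 is buried under noise of 0.05, selection removes almost nothing, and the mean count climbs close to the mutation supply of one per generation.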

    Biological Information – Purifying Selection 12-20-2014 by Paul Giem – video
    https://www.youtube.com/watch?v=SGJZDsQG4kQ

    Selection Threshold Severely Constrains Capture of Beneficial Mutations – John C. Sanford – 2013
    Concluding comments
    Our findings raise a very interesting theoretical problem — in a large genome, how do the millions of low-impact (yet functional) nucleotides arise? It is universally agreed that selection works very well for high-impact mutations. However, unless some new and as yet undiscovered process is operating in nature, there should be selection breakdown for the great majority of mutations that have small impact on fitness. We have now shown that this applies equally to both beneficial and deleterious mutations, and we have shown that selection interference is especially important when there are high-impact beneficial mutations. We conclude that only a very small fraction of all non-neutral mutations are selectable within large genomes. Our results reinforce and extend the findings of earlier studies [1–13], which in general employed many simplifying assumptions and rarely included more than a single source of biological noise. We show that selection breakdown is not just a simple function of population size, but is seriously impacted by other factors, especially selection interference. We are convinced that our formulation and methodology (i.e., genetic accounting) provide the most biologically-realistic analysis of selection breakdown to date.
    http://www.worldscientific.com.....08728_0011

    Biological Information – Mutation Count & Can Synergistic Epistasis Halt Mutation Accumulation 1-17-2015 by Paul Giem – video
    https://www.youtube.com/watch?v=6gdoZk_NbmU

    The fact that slightly deleterious mutations are accumulating in our genomes is also borne out empirically.

    Human Genetic Variation Recent, Varies Among Populations – (Nov. 28, 2012)
    Excerpt: Nearly three-quarters of mutations in genes that code for proteins — the workhorses of the cell — occurred within the past 5,000 to 10,000 years,,,
    “One of the most interesting points is that Europeans have more new deleterious (potentially disease-causing) mutations than Africans,”,,,

    per science daily

  5.
    bornagain77 says:

    I went to the mutation database website and found:

    The Human Gene Mutation Database
    The Human Gene Mutation Database (HGMD®) represents an attempt to collate known (published) gene lesions responsible for human inherited disease.
    Mutation total (as of Feb. 27, 2016) – 179,235
    http://www.hgmd.cf.ac.uk/ac/

    Dr. Sanford comments:

    Critic ignores reality of Genetic Entropy – Dr John Sanford – 7 March 2013
    Excerpt: Where are the beneficial mutations in man? It is very well documented that there are thousands of deleterious Mendelian mutations accumulating in the human gene pool, even though there is strong selection against such mutations. Yet such easily recognized deleterious mutations are just the tip of the iceberg. The vast majority of deleterious mutations will not display any clear phenotype at all. There is a very high rate of visible birth defects, all of which appear deleterious. Again, this is just the tip of the iceberg. Why are no beneficial birth anomalies being seen? This is not just a matter of identifying positive changes. If there are so many beneficial mutations happening in the human population, selection should very effectively amplify them. They should be popping up virtually everywhere. They should be much more common than genetic pathologies. Where are they? European adult lactose tolerance appears to be due to a broken lactase promoter [see Can’t drink milk? You’re ‘normal’! Ed.].
    African resistance to malaria is due to a broken hemoglobin protein [see Sickle-cell disease. Also, immunity of an estimated 20% of western Europeans to HIV infection is due to a broken chemokine receptor—see CCR5-delta32: a very beneficial mutation. Ed.] Beneficials happen, but generally they are loss-of-function mutations, and even then they are very rare!
    http://creation.com/genetic-entropy

    Of related note:

    If Modern Humans Are So Smart, Why Are Our Brains Shrinking? – January 20, 2011
    Excerpt: John Hawks is in the middle of explaining his research on human evolution when he drops a bombshell. Running down a list of changes that have occurred in our skeleton and skull since the Stone Age, the University of Wisconsin anthropologist nonchalantly adds, “And it’s also clear the brain has been shrinking.”
    “Shrinking?” I ask. “I thought it was getting larger.” The whole ascent-of-man thing.,,,
    He rattles off some dismaying numbers: Over the past 20,000 years, the average volume of the human male brain has decreased from 1,500 cubic centimeters to 1,350 cc, losing a chunk the size of a tennis ball. The female brain has shrunk by about the same proportion. “I’d call that major downsizing in an evolutionary eyeblink,” he says. “This happened in China, Europe, Africa—everywhere we look.”

    http://discovermagazine.com/20.....-shrinking
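The arithmetic in the Discover excerpt checks out: a 150 cc loss really is about the volume of a tennis ball (the ~6.7 cm ball diameter below is an assumed standard value, not a figure from the article):

```python
import math

# Sanity-check the brain-volume figures quoted above.
loss_cc = 1500 - 1350                 # stated male brain volume loss
r_cm = 6.7 / 2                        # assumed tennis ball radius, cm
ball_cc = (4 / 3) * math.pi * r_cm ** 3
print(loss_cc, "cc lost vs tennis ball of", round(ball_cc), "cc")
print(f"{loss_cc / 1500:.0%} reduction over 20,000 years")
```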

    Moreover, to the extent that natural selection can be said to do anything, it is found that natural selection is far more likely to reduce genetic information than to increase it.

    From a Frog to a Prince – video (17:00 minute mark Natural Selection Reduces Genetic Information) – No Beneficial Mutations – Gitt – Spetner – Denton – video
    https://www.youtube.com/watch?v=ClleN8ysimg&feature=player_detailpage#t=1031

    “…but Natural Selection reduces genetic information and we know this from all the Genetic Population studies that we have…”
    Maciej Marian Giertych – Population Geneticist – member of the European Parliament
    EXPELLED – Natural Selection And Genetic Mutations – video
    https://www.youtube.com/watch?v=6z5-15wk1Zk

    “We found an enormous amount of diversity within and between the African populations, and we found much less diversity in non-African populations,” Tishkoff told attendees today (Jan. 22) at the annual meeting of the American Association for the Advancement of Science in Anaheim. “Only a small subset of the diversity in Africa is found in Europe and the Middle East, and an even narrower set is found in American Indians.”
    Tishkoff; Andrew Clark, Penn State; Kenneth Kidd, Yale University; Giovanni Destro-Bisol, University “La Sapienza,” Rome, and Himla Soodyall and Trefor Jenkins, WITS University, South Africa, looked at three locations on DNA samples from 13 to 18 populations in Africa and 30 to 45 populations in the remainder of the world.

    Finding links and missing genes: Catalog of large-scale genetic changes around the world – October 1, 2015
    Excerpt: “When we analysed the genomes of 2500 people, we were surprised to see over 200 genes that are missing entirely in some people,” says Jan Korbel, who led the work at EMBL in Heidelberg, Germany.,,,
    African genomes harboured a much greater diversity overall.
    http://www.sciencedaily.com/re.....094723.htm

    Moreover, as if the ‘princess and the pea’ paradox were not devastating enough for natural selection, it is now found that, dimensionally speaking, natural selection is not even on the right playing field in the first place. “Although living things occupy a three-dimensional space, their internal physiology and anatomy operate as if they were four-dimensional.”

    Post-Darwinist – Denyse O’Leary – Dec. 2010
    Excerpt: They quote West et al. (1999),
    “Although living things occupy a three-dimensional space, their internal physiology and anatomy operate as if they were four-dimensional. Quarter-power scaling laws are perhaps as universal and as uniquely biological as the biochemical pathways of metabolism, the structure and function of the genetic code and the process of natural selection.”
    They comment,
    “In the words of these authors, natural selection has exploited variations on this fractal theme to produce the incredible variety of biological form and function’, but there were severe geometric and physical constraints on metabolic processes.”
    “The conclusion here is inescapable, that the driving force for these invariant scaling laws cannot have been natural selection. It’s inconceivable that so many different organisms, spanning different kingdoms and phyla, may have blindly ‘tried’ all sorts of power laws and that only those that have by chance ‘discovered’ the one-quarter power law reproduced and thrived.”
    Quotations from Jerry Fodor and Massimo Piatelli-Palmarini, What Darwin Got Wrong (London: Profile Books, 2010), p. 78-79.
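The quarter-power scaling referred to in the West et al. quote (e.g. Kleiber’s law, metabolic rate ∝ mass^(3/4)) is a straight line of slope 3/4 on a log-log plot. The sketch below uses synthetic toy data, assumed for illustration only, to show how the exponent is read off:

```python
import math

# Illustrative quarter-power scaling: toy data generated exactly from
# rate = mass^(3/4), then the exponent recovered as the log-log slope.
masses = [0.02, 0.5, 70.0, 4000.0]     # kg, mouse-to-elephant scale
rates = [m ** 0.75 for m in masses]    # exact 3/4-power toy values

slope = (math.log(rates[-1]) - math.log(rates[0])) / \
        (math.log(masses[-1]) - math.log(masses[0]))
print(f"fitted exponent: {slope:.2f}")
```

Real metabolic data scatter around this line, but the fitted exponent across many taxa clusters near 3/4 rather than the 2/3 that simple surface-to-volume (3-dimensional) geometry would suggest, which is the point the quoted authors are pressing.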

    The reason why ‘4-Dimensional’ quarter power scaling laws are impossible for natural selection to explain is that Natural Selection operates at the 3-Dimensional level of the organism, and the ‘4-Dimensional’ quarter power scaling laws are simply ‘invisible’ to natural selection. The reason why 4-Dimensional things are, for all practical purposes, completely invisible to 3-Dimensional things is best illustrated by ‘flatland’:

    Dr Quantum – Flatland – video
    https://www.youtube.com/watch?v=BWyTxCsIXE4

    The reason why the internal physiology and anatomy of living things operate as if they were four-dimensional is that information, which is transcendent of matter and energy, is constraining the thermodynamics of living systems to be so far out of thermodynamic equilibrium. Dr. Stephen Meyer puts the transcendent nature of information like this:

    Intelligent design: Why can’t biological information originate through a materialistic process? – Stephen Meyer – video
    http://www.youtube.com/watch?v=wqiXNxyoof8

    “One of the things I do in my classes, to get this idea across to students, is I hold up two computer disks. One is loaded with software, and the other one is blank. And I ask them, ‘what is the difference in mass between these two computer disks, as a result of the difference in the information content that they possess?’ And of course the answer is, ‘Zero! None! There is no difference as a result of the information.’ And that’s because information is a mass-less quantity. Now, if information is not a material entity, then how can any materialistic explanation account for its origin? How can any material cause explain its origin?
    And this is the real and fundamental problem that the presence of information in biology has posed. It creates a fundamental challenge to the materialistic, evolutionary scenarios because information is a different kind of entity that matter and energy cannot produce.

    In the nineteenth century we thought that there were two fundamental entities in science: matter, and energy. At the beginning of the twenty-first century, we now recognize that there’s a third fundamental entity, and it’s ‘information’. It’s not reducible to matter. It’s not reducible to energy. But it’s still a very important thing that is real; we buy it, we sell it, we send it down wires.
    Now, what do we make of the fact, that information is present at the very root of all biological function? In biology, we have matter, we have energy, but we also have this third, very important entity; information. I think the biology of the information age, poses a fundamental challenge to any materialistic approach to the origin of life.”
    -Dr. Stephen C. Meyer earned his Ph.D. in the History and Philosophy of science from Cambridge University for a dissertation on the history of origin-of-life biology and the methodology of the historical sciences.

  6. 6
    bornagain77 says:

    George Ellis, a former close colleague of both Stephen Hawking and Roger Penrose, adds his considerable opinion here:

    Recognising Top-Down Causation
    George Ellis, University of Cape Town
    Excerpt: page 5: A:
    Causal Efficacy of Non Physical entities:
    Both the program and the data are non-physical entities, indeed so is all software. A program is not a physical thing you can point to, but by Definition 2 it certainly exists. You can point to a CD or flashdrive where it is stored, but that is not the thing in itself: it is a medium in which it is stored.
    The program itself is an abstract entity, shaped by abstract logic. Is the software “nothing but” its realisation through a specific set of stored electronic states in the computer memory banks? No it is not because it is the precise pattern in those states that matters: a higher level relation that is not apparent at the scale of the electrons themselves. It’s a relational thing (and if you get the relations between the symbols wrong, so you have a syntax error, it will all come to a grinding halt). This abstract nature of software is realised in the concept of virtual machines, which occur at every level in the computer hierarchy except the bottom one [17]. But this tower of virtual machines causes physical effects in the real world, for example when a computer controls a robot in an assembly line to create physical artefacts.
    Excerpt page 7: The assumption that causation is bottom up only is wrong in biology, in computers, and even in many cases in physics, for example state vector preparation, where top-down constraints allow non-unitary behaviour at the lower levels. It may well play a key role in the quantum measurement problem (the dual of state vector preparation) [5]. One can bear in mind here that wherever equivalence classes of entities play a key role, such as in Crutchfield’s computational mechanics [29], this is an indication that top-down causation is at play.
    http://fqxi.org/data/essay-con.....s_2012.pdf

    Moreover, Dr. Andy C. McIntosh, who is Professor of Thermodynamics and Combustion Theory at the University of Leeds (which I believe is the highest teaching/research rank in the U.K. university hierarchy), has written a peer-reviewed paper in which he holds that it is ‘non-material information’ which is constraining the local thermodynamics of a cell to be in such an extremely high non-equilibrium state:

    Information and entropy – top-down or bottom-up development in living systems?
    Excerpt: This paper highlights the distinctive and non-material nature of information and its relationship with matter, energy and natural forces. It is proposed in conclusion that it is the non-material information (transcendent to the matter and energy) that is actually itself constraining the local thermodynamics to be in ordered disequilibrium and with specified raised free energy levels necessary for the molecular and cellular machinery to operate.
    A.C. McINTOSH – Dr Andy C. McIntosh is Professor of Thermodynamics and Combustion Theory at the University of Leeds (the highest teaching/research rank in the U.K. university hierarchy)
    http://journals.witpress.com/paperinfo.asp?pid=420

    Moreover, Dr. McIntosh holds that regarding information as independent of energy and matter ‘resolves the thermodynamic issues and invokes the correct paradigm for understanding the vital area of thermodynamic/organisational interactions’.

    Information and Thermodynamics in Living Systems – Andy C. McIntosh – 2013
    Excerpt: ,,, information is in fact non-material and that the coded information systems (such as, but not restricted to the coding of DNA in all living systems) is not defined at all by the biochemistry or physics of the molecules used to store the data. Rather than matter and energy defining the information sitting on the polymers of life, this approach posits that the reverse is in fact the case. Information has its definition outside the matter and energy on which it sits, and furthermore constrains it to operate in a highly non-equilibrium thermodynamic environment. This proposal resolves the thermodynamic issues and invokes the correct paradigm for understanding the vital area of thermodynamic/organisational interactions, which despite the efforts from alternative paradigms has not given a satisfactory explanation of the way information in systems operates.,,,
    http://www.worldscientific.com.....08728_0008

    Dr. McIntosh’s contention that ‘non-material information’ must be constraining life to be so far out of thermodynamic equilibrium has been borne out empirically. Classical information in the cell has now been physically measured and is shown to correlate with the thermodynamics of the cell:

        Maxwell’s demon demonstration (knowledge of a particle’s position) turns information into energy – November 2010
        Excerpt: Scientists in Japan are the first to have succeeded in converting information into free energy in an experiment that verifies the “Maxwell demon” thought experiment devised in 1867.,,, In Maxwell’s thought experiment the demon creates a temperature difference simply from information about the gas molecule temperatures and without transferring any energy directly to them.,,, Until now, demonstrating the conversion of information to energy has been elusive, but University of Tokyo physicist Masaki Sano and colleagues have succeeded in demonstrating it in a nano-scale experiment. In a paper published in Nature Physics they describe how they coaxed a Brownian particle to travel upwards on a “spiral-staircase-like” potential energy created by an electric field solely on the basis of information on its location. As the particle traveled up the staircase it gained energy from moving to an area of higher potential, and the team was able to measure precisely how much energy had been converted from information.
    http://www.physorg.com/news/20.....nergy.html

    Demonic device converts information to energy – 2010
    Excerpt: “This is a beautiful experimental demonstration that information has a thermodynamic content,” says Christopher Jarzynski, a statistical chemist at the University of Maryland in College Park. In 1997, Jarzynski formulated an equation to define the amount of energy that could theoretically be converted from a unit of information2; the work by Sano and his team has now confirmed this equation. “This tells us something new about how the laws of thermodynamics work on the microscopic scale,” says Jarzynski.
    http://www.scientificamerican......rts-inform
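As an illustrative aside (not from the articles above): the Jarzynski/Landauer relation mentioned in the excerpt puts the maximum work obtainable per bit of information at kT·ln 2. A quick back-of-envelope check of that figure at room temperature:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
T = 300.0           # assumed room temperature, K

# Landauer/Jarzynski bound: maximum work extractable per bit of information
energy_per_bit = k_B * T * math.log(2)

print(f"{energy_per_bit:.3e} J per bit")  # on the order of 3e-21 joules
```

The tiny size of this number is why demonstrating the conversion required a nano-scale experiment in the first place.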

    As well, it is now found that ‘non-local’, beyond space-time matter-energy, Quantum entanglement/information ‘holds’ DNA (and proteins) together:

    Quantum entanglement holds together life’s blueprint – 2010
    Excerpt: When the researchers analysed the DNA without its helical structure, they found that the electron clouds were not entangled. But when they incorporated DNA’s helical structure into the model, they saw that the electron clouds of each base pair became entangled with those of its neighbours. “If you didn’t have entanglement, then DNA would have a simple flat structure, and you would never get the twist that seems to be important to the functioning of DNA,” says team member Vlatko Vedral of the University of Oxford.
    http://neshealthblog.wordpress.....blueprint/

    The DNA Mystery: Scientists Stumped By “Telepathic” Abilities – Sept, 2009
    Scientists are reporting evidence that contrary to our current beliefs about what is possible, intact double-stranded DNA has the “amazing” ability to recognize similarities in other DNA strands from a distance. Somehow they are able to identify one another, and the tiny bits of genetic material tend to congregate with similar DNA. The recognition of similar sequences in DNA’s chemical subunits, occurs in a way unrecognized by science. There is no known reason why the DNA is able to combine the way it does, and from a current theoretical standpoint this feat should be chemically impossible.
    http://www.dailygalaxy.com/my_.....ities.html

    “What happens is this classical information (of DNA) is embedded, sandwiched, into the quantum information (of DNA). And most likely this classical information is never accessed because it is inside all the quantum information. You can only access the quantum information or the electron clouds and the protons. So mathematically you can describe that as a quantum/classical state.”
    Elisabeth Rieper – Classical and Quantum Information in DNA – video (Longitudinal Quantum Information resides along the entire length of DNA discussed at the 19:30 minute mark; at 24:00 minute mark Dr Rieper remarks that practically the whole DNA molecule can be viewed as quantum information with classical information embedded within it)
    https://youtu.be/2nqHOnVTxJE?t=1176

  7. 7
    bornagain77 says:

    Of particular note, quantum entanglement is its own distinct ‘physical resource’, like matter and energy, that can be used as a “quantum information channel” so as to perform quantum computation.

    Quantum Entanglement and Information
    Quantum entanglement is a physical resource, like energy, associated with the peculiar nonclassical correlations that are possible between separated quantum systems. Entanglement can be measured, transformed, and purified. A pair of quantum systems in an entangled state can be used as a quantum information channel to perform computational and cryptographic tasks that are impossible for classical systems. The general study of the information-processing capabilities of quantum systems is the subject of quantum information theory.
    http://plato.stanford.edu/entries/qt-entangle/

    It should be noted that quantum computation in proteins and DNA would go a long way towards explaining the unsolved enigmas of protein folding and DNA repair:

    Physicists Discover Quantum Law of Protein Folding – February 22, 2011
    Quantum mechanics finally explains why protein folding depends on temperature in such a strange way.
    Excerpt: First, a little background on protein folding. Proteins are long chains of amino acids that become biologically active only when they fold into specific, highly complex shapes. The puzzle is how proteins do this so quickly when they have so many possible configurations to choose from.
    To put this in perspective, a relatively small protein of only 100 amino acids can take some 10^100 different configurations. If it tried these shapes at the rate of 100 billion a second, it would take longer than the age of the universe to find the correct one. Just how these molecules do the job in nanoseconds, nobody knows.,,,
    Today, Luo and Lo say these curves can be easily explained if the process of folding is a quantum affair. By conventional thinking, a chain of amino acids can only change from one shape to another by mechanically passing though various shapes in between.
    But Luo and Lo say that if this process were a quantum one, the shape could change by quantum transition, meaning that the protein could ‘jump’ from one shape to another without necessarily forming the shapes in between.,,,
    Their astonishing result is that this quantum transition model fits the folding curves of 15 different proteins and even explains the difference in folding and unfolding rates of the same proteins.
    That’s a significant breakthrough. Luo and Lo’s equations amount to the first universal laws of protein folding. That’s the equivalent in biology to something like the thermodynamic laws in physics.
    http://www.technologyreview.co.....f-protein/
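The arithmetic behind the excerpt’s “longer than the age of the universe” claim (Levinthal’s paradox) is easy to reproduce. A sketch using the article’s own assumed figures of 10^100 configurations sampled at 100 billion per second:

```python
configurations = 1e100      # possible shapes for a ~100 amino acid protein (quoted figure)
rate = 1e11                 # configurations tried per second (quoted figure)

search_time = configurations / rate         # seconds for an exhaustive search
universe_age = 13.8e9 * 365.25 * 24 * 3600  # ~4.4e17 seconds

# exhaustive sampling would take an absurd multiple of the universe's age
print(f"~{search_time / universe_age:.1e} universe ages")
```

This is only the brute-force estimate the article quotes; it is the gap between that number and the observed nanosecond folding times that Luo and Lo’s quantum transition model is trying to close.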

    Quantum Dots Spotlight DNA-Repair Proteins in Motion – March 2010
    Excerpt: “How this system works is an important unanswered question in this field,” he said. “It has to be able to identify very small mistakes in a 3-dimensional morass of gene strands. It’s akin to spotting potholes on every street all over the country and getting them fixed before the next rush hour.” Dr. Bennett Van Houten – of note: A bacterium has about 40 team members on its pothole crew. That allows its entire genome to be scanned for errors in 20 minutes, the typical doubling time.,, These smart machines can apparently also interact with other damage control teams if they cannot fix the problem on the spot.
    http://www.sciencedaily.com/re.....123522.htm

    Of related note, classical information is shown to be a subset of quantum entanglement/information here:

    Quantum knowledge cools computers: New understanding of entropy – June 2011
    Excerpt: No heat, even a cooling effect;
    In the case of perfect classical knowledge of a computer memory (zero entropy), deletion of the data requires in theory no energy at all. The researchers prove that “more than complete knowledge” from quantum entanglement with the memory (negative entropy) leads to deletion of the data being accompanied by removal of heat from the computer and its release as usable energy. This is the physical meaning of negative entropy. Renner emphasizes, however, “This doesn’t mean that we can develop a perpetual motion machine.” The data can only be deleted once, so there is no possibility to continue to generate energy. The process also destroys the entanglement, and it would take an input of energy to reset the system to its starting state. The equations are consistent with what’s known as the second law of thermodynamics: the idea that the entropy of the universe can never decrease. Vedral says “We’re working on the edge of the second law. If you go any further, you will break it.”
    http://www.sciencedaily.com/re.....134300.htm

    And in quantum mechanics it is information that is primarily conserved, not matter or energy.

    Quantum no-hiding theorem experimentally confirmed for first time
    Excerpt: In the classical world, information can be copied and deleted at will. In the quantum world, however, the conservation of quantum information means that information cannot be created nor destroyed. This concept stems from two fundamental theorems of quantum mechanics: the no-cloning theorem and the no-deleting theorem. A third and related theorem, called the no-hiding theorem, addresses information loss in the quantum world. According to the no-hiding theorem, if information is missing from one system (which may happen when the system interacts with the environment), then the information is simply residing somewhere else in the Universe; in other words, the missing information cannot be hidden in the correlations between a system and its environment.
    http://www.physorg.com/news/20.....tally.html

    Quantum no-deleting theorem
    Excerpt: A stronger version of the no-cloning theorem and the no-deleting theorem provide permanence to quantum information. To create a copy one must import the information from some part of the universe and to delete a state one needs to export it to another part of the universe where it will continue to exist.
    http://en.wikipedia.org/wiki/Q.....onsequence

    Moreover, and most importantly, quantum entanglement/information simply refuses to be reduced to any materialistic, i.e. Darwinian, explanation:

    Looking beyond space and time to cope with quantum theory – 29 October 2012
    Excerpt: “Our result gives weight to the idea that quantum correlations somehow arise from outside spacetime, in the sense that no story in space and time can describe them,”
    http://www.quantumlah.org/high.....uences.php

    Physicists find extreme violation of local realism in quantum hypergraph states – Lisa Zyga – March 4, 2016
    Excerpt: Many quantum technologies rely on quantum states that violate local realism, which means that they either violate locality (such as when entangled particles influence each other from far away) or realism (the assumption that quantum states have well-defined properties, independent of measurement), or possibly both. Violation of local realism is one of the many counterintuitive, yet experimentally supported, characteristics of the quantum world.
    Determining whether or not multiparticle quantum states violate local realism can be challenging. Now, in a new paper, physicists have shown that a large family of multiparticle quantum states called hypergraph states violates local realism in many ways. The results suggest that these states may serve as useful resources for quantum technologies, such as quantum computers and detecting gravitational waves.,,,
    The physicists also showed that the greater the number of particles in a quantum hypergraph state, the more strongly it violates local realism, with the strength increasing exponentially with the number of particles. In addition, even if a quantum hypergraph state loses one of its particles, it continues to violate local realism. This robustness to particle loss is in stark contrast to other types of quantum states, which no longer violate local realism if they lose a particle. This property is particularly appealing for applications, since it might allow for more noise in experiments.
    http://phys.org/news/2016-03-p.....alism.html
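The “violation of local realism” discussed in these excerpts is standardly quantified with the CHSH inequality: any local-realistic model obeys |S| ≤ 2, while entangled particles can reach 2√2 ≈ 2.83. A minimal illustration (my own sketch, not from the article) using the textbook singlet-state correlation E(a, b) = −cos(a − b) at the optimal measurement angles:

```python
import math

def E(a, b):
    # quantum-mechanical singlet-state correlation for detector angles a and b
    return -math.cos(a - b)

# standard angle choices that maximize the quantum violation
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)

print(abs(S))  # 2*sqrt(2), beyond the classical bound of 2
```

No assignment of pre-existing values to the four measurement outcomes can push |S| past 2, which is exactly what the experiments cited above exploit.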

  8. 8
    bornagain77 says:

    Besides providing direct empirical falsification of neo-Darwinian claims that information is emergent from a material basis, the implication of finding ‘non-local’, beyond space and time, and ‘conserved’ quantum information in molecular biology on such a massive scale, in every DNA and protein molecule, is fairly, and pleasantly, obvious.
    That pleasant implication, of course, being the fact that we now have strong physical evidence suggesting that we do indeed have an eternal soul that lives beyond the death of our material bodies.

    Does Quantum Biology Support A Quantum Soul? – Stuart Hameroff – video
    https://www.youtube.com/watch?v=iIyEjh6ef_8

    Quantum Entangled Consciousness – Life After Death – Stuart Hameroff – video
    https://www.youtube.com/watch?v=jjpEc98o_Oo

    Verse and Music:

    Matthew 16:26
    What good will it be for someone to gain the whole world, yet forfeit their soul? Or what can anyone give in exchange for their soul?

    Evanescence – My Heart Is Broken
    https://www.youtube.com/watch?v=f1QGnq9jUU0

  9. 9
    bornagain77 says:

    semi related:

    GENETIC ENTROPY – It’s Down NOT Up… – references/links
    http://www.geneticentropy.org/#!properties/ctzx

  10. 10
    Anaxagoras says:

    What William Provine had to say on Natural Selection:

    1. Natural selection was the primary mechanism at every level of the evolutionary process. This simple statement raises two major problems for me now. As John Endler has argued eloquently in Natural Selection in the Wild (1986), natural selection is not a mechanism. Natural selection does not act on anything, nor does it select (for or against), force, maximize, create, modify, shape, operate, drive, favor, maintain, push, or adjust. Natural selection does nothing. Natural selection as a force belongs in the insubstantial category already populated by the Becker/Stahl phlogiston (Endler 1986) or Newton’s ‘ether.'” [pg. 199]
    * * *
    “Having natural selection select is nifty because it excuses the necessity of talking about the actual causation of natural selection. Such talk was excusable for Charles Darwin, but inexcusable for evolutionists now. Creationists have discovered our empty “natural selection” language, and the “actions” of natural selection make huge, vulnerable targets.[p.200]”

    Will Provine. The Origin of Theoretical Population Genetics (University of Chicago Press, 1971), reissued in 2001.

  11. 11
    bornagain77 says:

    Of related interest to the 4-Dimensional quarter power scaling mentioned towards the end of comment 5 is the following finding:

    Scaling of Brain Metabolism and Blood Flow in Relation to Capillary and Neural Scaling – 2011
    Excerpt: Brain is one of the most energy demanding organs in mammals, and its total metabolic rate scales with brain volume raised to a power of around 5/6. This value is significantly higher than the more common exponent 3/4 (4-dimensional quarter power scaling) relating whole body resting metabolism with body mass and several other physiological variables in animals and plants.,,,
    Moreover, cerebral metabolic, hemodynamic, and microvascular variables scale with allometric exponents that are simple multiples of 1/6, rather than 1/4, which suggests that brain metabolism is more similar to the metabolism of aerobic (exercised) muscle than of the resting body. Relation of these findings to brain functional imaging studies involving the link between cerebral metabolism and blood flow is also discussed.,,
    General Discussion Excerpt:
    ,,It should be underlined that both CBF and CMR scale with brain volume with an exponent of about 5/6, which is significantly different from the exponent 3/4 relating whole body resting specific metabolism with body volume [1], [2], [3]. Instead, the cerebral exponent 5/6 is closer to an exponent,, characterizing maximal body specific metabolic rate and specific cardiac output in strenuous exercise [43], [44]. In this sense, the brain metabolism and its hemodynamics resemble more the metabolism and circulation of exercised muscles than other resting organs, which is in line with the empirical evidence that brain is an energy expensive organ [10], [17], [18]. This may also suggest that there exists a common plan for the design of the microcirculatory system in different parts of the mammalian body that uses the same optimization principles [45].,,
    http://www.ncbi.nlm.nih.gov/pm.....MC3203885/
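For readers wondering how such allometric exponents are measured at all: one fits a straight line to log(metabolic rate) versus log(mass), and the slope of that line is the exponent. A minimal sketch on noise-free synthetic data generated with the 3/4 (Kleiber) exponent — illustrative only, since real datasets are noisy and the constants here are made up:

```python
import math
import random

random.seed(0)

# synthetic body masses spanning several orders of magnitude
masses = [10 ** random.uniform(-2, 4) for _ in range(200)]
# metabolic rates following Kleiber's law: B = a * M**(3/4), with a = 3.0 (arbitrary)
rates = [3.0 * m ** 0.75 for m in masses]

# least-squares slope on log-log axes recovers the scaling exponent
xs = [math.log(m) for m in masses]
ys = [math.log(b) for b in rates]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))

print(round(slope, 3))  # 0.75
```

The brain study quoted above does the same kind of log-log regression, just on cerebral blood flow and metabolic rate against brain volume, which is where its 5/6 and 1/6-multiple exponents come from.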

    IMHO, finding 1/4 power scaling for the body compared to 1/6 scaling for the brain is certainly very suggestive that mind/consciousness is of an even higher dimensional value than the 4-Dimensional body/soul is. Which is what would be expected under Christian presuppositions.

    Also of note:

    The Puzzling Role Of Biophotons In The Brain – Dec. 17, 2010
    Excerpt: In recent years, a growing body of evidence shows that photons play an important role in the basic functioning of cells. Most of this evidence comes from turning the lights off and counting the number of photons that cells produce. It turns out, much to many people’s surprise, that many cells, perhaps even most, emit light as they work.
    In fact, it looks very much as if many cells use light to communicate. There’s certainly evidence that bacteria, plants and even kidney cells communicate in this way. Various groups have even shown that rats’ brains are literally alight thanks to the photons produced by neurons as they work.,,,
    ,,, earlier this year, one group showed that spinal neurons in rats can actually conduct light.
    ,, Rahnama and co point out that neurons contain many light sensitive molecules, such as porphyrin rings, flavinic, pyridinic rings, lipid chromophores and aromatic amino acids. In particular, mitochondria, the machines inside cells which produce energy, contain several prominent chromophores.
    The presence of light sensitive molecules makes it hard to imagine how they might not be influenced by biophotons.,,,
    They go on to suggest that the light channelled by microtubules can help to co-ordinate activities in different parts of the brain. It’s certainly true that electrical activity in the brain is synchronised over distances that cannot be easily explained. Electrical signals travel too slowly to do this job, so something else must be at work.,,,
    (So) It’s a big jump to assume that photons do this job.
    http://www.technologyreview.co.....the-brain/

    Exodus 34:29-30:
    “Moses didn’t realize as he came back down the mountain with the tablets that his face glowed from being in the presence of God. Because of this radiance upon his face, Aaron and the people of Israel were afraid to come near him.”

    Matthew 17:1-2
    After six days Jesus took with him Peter, James and John the brother of James, and led them up a high mountain by themselves. There he was transfigured before them. His face shone like the sun, and his clothes became as white as the light.

Leave a Reply