Uncommon Descent Serving The Intelligent Design Community

Darwinists are not usually software engineers


And, in some cases, the reverse is also true:

Biologists are trained to use common descent as an organizing principle for all their data, and for most biologists the Darwinian mechanism comes in the same package. Evidently they don’t see much reason to doubt this mechanism. There are exceptions to the rule, of course, among biologists who have read Mike Behe or Stephen Meyer carefully. Some proponents of intelligent design believe in common descent, but not in natural selection.

On the other hand, computer scientists and software engineers are trained in design principles, and also have real experience of how complex functional systems appear and change constructively. In particular, software engineers know how complex it can be to implement a “simple” change. From this perspective, the Darwinian story is a lot less plausible. Natural selection seems just a little bit too easy.
– Andrew Jones, “Why We Don’t Evolve Software: A Computer Scientist Considers Darwinian Theory” at Evolution News and Science Today

See also: Does information theory support design in nature?


We don’t often hear space and time described as a quantum error-correcting code
But some argue that the same codes used to prevent errors in quantum computers might give space-time “its intrinsic robustness.” They certainly make it sound as though our universe is designed.
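The basic idea behind such error-correcting codes can be illustrated with their simplest classical analogue: a three-bit repetition code, the classical counterpart of the quantum bit-flip code. This is only an illustrative sketch; real quantum codes protect quantum amplitudes, not just classical bits.

```python
# Minimal sketch of the classical analogue of the quantum bit-flip code:
# a 3-bit repetition code. Each logical bit is stored redundantly, and a
# majority vote recovers it even if any single copy is flipped.

def encode(bit):
    """Store one logical bit as three physical copies."""
    return [bit, bit, bit]

def corrupt(codeword, position):
    """Flip one physical bit -- the kind of error the code tolerates."""
    damaged = list(codeword)
    damaged[position] ^= 1
    return damaged

def decode(codeword):
    """Majority vote: correct as long as at most one copy was flipped."""
    return 1 if sum(codeword) >= 2 else 0

# Any single-bit error is corrected:
for pos in range(3):
    assert decode(corrupt(encode(1), pos)) == 1
    assert decode(corrupt(encode(0), pos)) == 0
```

The robustness claimed for space-time in the article is of this general kind: redundancy in how information is stored makes the encoded content insensitive to local disturbances.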

Follow UD News at Twitter!

It’s estimated that 99+% of all species that have existed on this planet have gone extinct.
Is that based on the assumption that universal common descent is true? Just asking, because I don’t think that means anything beyond a statement of fact – if true. ET
It’s estimated that 99+% of all species that have existed on this planet have gone extinct. And yet some bit-twiddler says “Natural selection seems just a little bit too easy.” Give me a break! You know what I learned about biology from my 26 years working as a software engineer? ABSOLUTELY NOTHING! Pater Kimbridge
a few more notes:
How life changes itself: the Read-Write (RW) genome – 2013
Excerpt: Research dating back to the 1930s has shown that genetic change is the result of cell-mediated processes, not simply accidents or damage to the DNA. This cell-active view of genome change applies to all scales of DNA sequence variation, from point mutations to large-scale genome rearrangements and whole genome duplications (WGDs). This conceptual change to active cell inscriptions controlling RW genome functions has profound implications for all areas of the life sciences.
http://www.ncbi.nlm.nih.gov/pubmed/23876611

Life Leads the Way to Invention – Feb. 2010
Excerpt: a cell is 10,000 times more energy-efficient than a transistor. “In one second, a cell performs about 10 million energy-consuming chemical reactions, which altogether require about one picowatt (one millionth millionth of a watt) of power.” This and other amazing facts lead to an obvious conclusion: inventors ought to look to life for ideas.,,, Essentially, cells may be viewed as circuits that use molecules, ions, proteins and DNA instead of electrons and transistors. That analogy suggests that it should be possible to build electronic chips – what Sarpeshkar calls “cellular chemical computers” – that mimic chemical reactions very efficiently and on a very fast timescale.
https://crev.info/2010/02/life_leads_the_way_to_invention/

Logical Reversibility of Computation – C. H. Bennett – 1973
Excerpt from last paragraph: The biosynthesis and biodegradation of messenger RNA may be viewed as convenient examples of logically reversible and irreversible computation, respectively. Messenger RNA, a linear polymeric informational macromolecule like DNA, carries the genetic information from one or more genes of a DNA molecule, and serves to direct the synthesis of the proteins encoded by those genes. Messenger RNA is synthesized by the enzyme RNA polymerase in the presence of a double-stranded DNA molecule and a supply of RNA monomers (the four nucleotide pyrophosphates ATP, GTP, CTP, and UTP) [7]. The enzyme attaches to a specific site on the DNA molecule and moves along, sequentially incorporating the RNA monomers into a single-stranded RNA molecule whose nucleotide sequence exactly matches that of the DNA. The pyrophosphate groups are released into the surrounding solution as free pyrophosphate molecules. The enzyme may thus be compared to a simple tape-copying Turing machine that manufactures its output tape rather than merely writing on it. Tape copying is a logically reversible operation, and RNA polymerase is both thermodynamically and logically reversible.,,,
http://www.cs.princeton.edu/courses/archive/fall04/cos576/papers/bennett73.html

Dichotomy in the definition of prescriptive information suggests both prescribed data and prescribed algorithms: biosemiotics applications in genomic systems – 2012 – David J D’Onofrio, David L Abel and Donald E Johnson
Excerpt: The DNA polynucleotide molecule consists of a linear sequence of nucleotides, each representing a biological placeholder of adenine (A), cytosine (C), thymine (T) and guanine (G). This quaternary system is analogous to the base two binary scheme native to computational systems. As such, the polynucleotide sequence represents the lowest level of coded information expressed as a form of machine code. Since machine code (and/or micro code) is the lowest form of compiled computer programs, it represents the most primitive level of programming language.,,, An operational analysis of the ribosome has revealed that this molecular machine with all of its parts follows an order of operations to produce a protein product. This order of operations has been detailed in a step-by-step process that has been observed to be self-executable. The ribosome operation has been proposed to be algorithmic (R-algorithm) because it has been shown to contain a step-by-step process flow allowing for decision control, iterative branching and halting capability. The R-algorithm contains logical structures of linear sequencing, branch and conditional control. All of these features at a minimum meet the definition of an algorithm and, when combined with the data from the mRNA, satisfy the rule that Algorithm = data + control. Remembering that mere constraints cannot serve as bona fide formal controls, we therefore conclude that the ribosome is a physical instantiation of an algorithm.,,, The correlation between linguistic properties examined and implemented using Automata theory gives us a formalistic tool to study the language and grammar of biological systems in a similar manner to how we study computational cybernetic systems. These examples define a dichotomy in the definition of Prescriptive Information. We therefore suggest that the term Prescriptive Information (PI) be subdivided into two categories: 1) Prescriptive data and 2) Prescribed (executing) algorithm. It is interesting to note that the CPU of an electronic computer is an instance of a prescriptive algorithm instantiated into an electronic circuit, whereas the software under execution is read and processed by the CPU to prescribe the program’s desired output. Both hardware and software are prescriptive.
http://www.tbiomed.com/content/pdf/1742-4682-9-8.pdf

Learning from Bacteria about Social Networking (Information Processing) – video
Excerpt: I (Dr. Ben-Jacob) will show illuminating movies of swarming intelligence of live bacteria in which they solve optimization problems for collective decision making that are beyond what we, human beings, can solve with our most powerful computers.
http://www.youtube.com/watch?v=yJpi8SnFXHs

“The very notion that nanotechnology, the functional complexity of which is beyond the ability of modern science to create intentionally, came about mindlessly and accidentally is as unbelievably stupid as it is false. If atheistic science knew even one way to build technology from scratch that could also manufacture more instances of itself from available raw materials, then it might be able to begin explaining how such a technological feat could have occurred mindlessly and accidentally, because it would then at least have some idea of what would be required for something like that to take place. As it is, atheistic science insists that that which it has no idea how to make happen on purpose happened accidentally. The stupidity of that is something like jungle savages insisting, even though they didn’t have any idea how to manufacture one, that the laptop PC they found came about accidentally. The functional complexity of life’s nanotechnology is light years beyond our own.”
Harry – UD Blogger

“The truly extraordinary claim — indeed, the wildly and irresponsibly outrageous claim — is that a highly scalable, massively parallel system architecture incorporating a 4-bit digital coding system and a super-dense, information-rich, three-dimensional, multi-layered, multi-directional database structure with storage, retrieval and translation mechanisms, utilizing file allocation, concatenation and bit-parity algorithms, operating subject to software protocol hierarchies could all come about through a long series of accidental particle collisions. That is beyond extraordinary. It is preposterous. It is laughable.”
Eric Anderson

“Although the tiniest living things known to science, bacterial cells, are incredibly small (10^-12 grams), each is a veritable micro-miniaturized factory containing thousands of elegantly designed pieces of intricate molecular machinery, made up altogether of one hundred thousand million atoms, far more complicated than any machine built by man and absolutely without parallel in the non-living world.”
Michael Denton, “Evolution: A Theory in Crisis,” 1986, p. 250

Earth’s Biosphere Is Awash in Information – June 29, 2015
Excerpt: In this remarkable paper, Landenmark, Forgan, and Cockell of the United Kingdom Centre for Astrobiology at the University of Edinburgh attempt “An Estimate of the Total DNA of the Biosphere.” The results are staggering: “Modern whole-organism genome analysis, in combination with biomass estimates, allows us to estimate a lower bound on the total information content in the biosphere: 5.3 × 10^31 (±3.6 × 10^31) megabases (Mb) of DNA. Given conservative estimates regarding DNA transcription rates, this information content suggests biosphere processing speeds exceeding yottaNOPS values (10^24 Nucleotide Operations Per Second).”,,,
,,,let’s ponder the scale of this information content and processing speed. A yottaNOPS is a lotta ops! Each prefix multiplies the prior one by a thousand: kilo, mega, giga, tera, peta, exa, zetta, yotta. A “yottabase” doesn’t even come close to the raw information content of DNA they estimate: 5.3 × 10^31 megabases. That’s the same as 5.3 × 10^37 bases, but a yottabase is only 10^24 bases (a trillion trillion bases). This means that the information content of the biosphere is about 5.3 × 10^13 yottabases (roughly 53 trillion yottabases). They estimate that living computers perform a yottaNOPS, or 10^24 nucleotide operations per second, on this information. You can pick yourself off the floor now.,,,
“Storing the total amount of information encoded in DNA in the biosphere, 5.3 × 10^31 megabases (Mb), would require approximately 10^21 supercomputers with the average storage capacity of the world’s four most powerful supercomputers.”
How much land surface would be required for 10^21 supercomputers (a “zetta-computer”)? The Titan supercomputer takes up 404 m^2 of space. If we assume just 100 m^2 for each supercomputer, we would still need 10^23 square meters to hold them all. Universe Today estimates the total surface of Earth (including the oceans) at 510 million km^2, which equates to 5.1 × 10^14 m^2. That leaves us more than eight orders of magnitude short, meaning we would need roughly 200 million Earths to have enough space for all the computers needed to match the equivalent computing power life performs on DNA!
http://www.evolutionnews.org/2015/06/earths_biospher097221.html
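The quoted estimates can be sanity-checked with a few lines of arithmetic. The 5.3 × 10^31 megabase figure is from the Landenmark et al. paper; the 100 m^2-per-supercomputer footprint is the excerpt's own simplifying assumption. The conversion works out to about 5.3 × 10^13 yottabases and roughly 2 × 10^8 Earths of floor space:

```python
# Sanity check of the quoted biosphere-information estimates.
# Values are taken from the excerpt above; the 100 m^2 footprint per
# supercomputer is the excerpt's own simplifying assumption.

MEGABASE = 1e6            # bases per megabase
YOTTABASE = 1e24          # bases per yottabase

total_mb = 5.3e31         # biosphere DNA, in megabases (Landenmark et al.)
total_bases = total_mb * MEGABASE        # ~5.3e37 bases
yottabases = total_bases / YOTTABASE     # ~5.3e13 yottabases

supercomputers = 1e21     # machines needed to store it (quoted estimate)
footprint_m2 = supercomputers * 100      # 1e23 m^2 at 100 m^2 each
earth_surface_m2 = 5.1e14                # total surface of Earth

earths_needed = footprint_m2 / earth_surface_m2   # ~2e8 Earths

print(f"{yottabases:.1e} yottabases")    # ~5.3e13
print(f"{earths_needed:.1e} Earths")     # ~2.0e8
```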
The Ham-Nye Creation Debate: A Huge Missed Opportunity – Casey Luskin – February 4, 2014
Excerpt: “The unique informational narrative of living systems suggests that life may be characterized by context-dependent causal influences, and in particular, that top-down (or downward) causation -- where higher-levels influence and constrain the dynamics of lower-levels in organizational hierarchies -- may be a major contributor to the hierarchal structure of living systems.” (Sara Imari Walker and Paul C. W. Davies, “The algorithmic origins of life,” Journal of the Royal Society Interface, 10: 20120869 (2012).)
http://www.evolutionnews.org/2014/02/the_ham-nye_deb081911.html

Recognising Top-Down Causation – George Ellis
Excerpt: ,,,However there are many topics that one cannot understand by assuming this one-way flow of causation. The flourishing subject of social neuroscience makes clear how social influences act down on individual brain structure [2]; studies in physiology demonstrate that downward causation is necessary in understanding the heart, where this form of causation can be represented as the influences of initial and boundary conditions on the solutions of the differential equations used to represent the lower level processes [3]; epigenetic studies demonstrate that biological development is crucially shaped by the environment [4],,,
Excerpt, page 5: A: Causal Efficacy of Non-Physical Entities: Both the program and the data are non-physical entities, indeed so is all software. A program is not a physical thing you can point to, but by Definition 2 it certainly exists. You can point to a CD or flashdrive where it is stored, but that is not the thing in itself: it is a medium in which it is stored. The program itself is an abstract entity, shaped by abstract logic. Is the software “nothing but” its realisation through a specific set of stored electronic states in the computer memory banks? No it is not, because it is the precise pattern in those states that matters: a higher level relation that is not apparent at the scale of the electrons themselves. It’s a relational thing (and if you get the relations between the symbols wrong, so you have a syntax error, it will all come to a grinding halt). This abstract nature of software is realised in the concept of virtual machines, which occur at every level in the computer hierarchy except the bottom one [17]. But this tower of virtual machines causes physical effects in the real world, for example when a computer controls a robot in an assembly line to create physical artefacts.
Excerpt, page 7: The assumption that causation is bottom up only is wrong in biology, in computers, and even in many cases in physics,,,, Life and the brain: living systems are highly structured modular hierarchical systems, and there are many similarities to the digital computer case, even though they are not digital computers. The lower level interactions are constrained by network connections, thereby creating possibilities of truly complex behaviour. Top-down causation is prevalent at all levels in the brain: for example it is crucial to vision [24,25] as well as the relation of the individual brain to society [2]. The hardware (the brain) can do nothing without the excitations that animate it: indeed this is the difference between life and death. The mind is not a physical entity, but it certainly is causally effective: proof is the existence of the computer on which you are reading this text. It could not exist if it had not been designed and manufactured according to someone’s plans, thereby proving the causal efficacy of thoughts, which like computer programs and data are not physical entities.
http://fqxi.org/data/essay-contest-files/Ellis_FQXI_Essay_Ellis_2012.pdf

Ask an Embryologist: Genomic Mosaicism – Jonathan Wells – February 23, 2015
Excerpt: I now know as an embryologist,,, Tissues and cells, as they differentiate, modify their DNA to suit their needs. It’s the organism controlling the DNA, not the DNA controlling the organism.
http://www.evolutionnews.org/2015/02/ask_an_embryolo093851.html
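The ribosome-as-algorithm excerpt above claims the ribosome satisfies "Algorithm = data + control": sequential reading of codon data plus a halting rule. That framing can be sketched in a few lines. This is a toy illustration of the analogy only (a real ribosome is vastly more involved), using a deliberately tiny subset of the standard genetic code:

```python
# Toy sketch of the "ribosome as algorithm" analogy from the
# D'Onofrio/Abel/Johnson excerpt above: a sequential read of codons (the
# data) combined with a halting rule (the control).

CODON_TABLE = {           # tiny illustrative subset of the genetic code
    "AUG": "Met", "UUU": "Phe", "GGC": "Gly",
    "UAA": None, "UAG": None, "UGA": None,   # stop codons: halt
}

def translate(mrna):
    """Read codon-by-codon, appending residues until a stop codon halts."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        residue = CODON_TABLE.get(mrna[i:i+3])
        if residue is None:                  # stop codon (or unknown): halt
            break
        protein.append(residue)
    return protein

assert translate("AUGUUUGGCUAA") == ["Met", "Phe", "Gly"]
```

The loop is the linear sequencing, the stop-codon check is the conditional control, and the mRNA string is the prescriptive data, which is all the excerpt's "Algorithm = data + control" formula asserts.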
Of note, the ‘bottom up’ reductive materialistic explanations of Darwinian evolution simply don’t have, and can never have, the capacity to be ‘top down’ in their causation. In the following article, “Quantum physics problem proved unsolvable: Gödel and Turing enter quantum physics,” which studied the derivation of macroscopic properties from a complete microscopic description, the researchers remark that even a perfect and complete description of the microscopic properties of a material is not enough to predict its macroscopic behaviour. They further comment that their findings challenge the reductionists’ point of view, as the insurmountable difficulty lies precisely in the derivation of macroscopic properties from a microscopic description.
Quantum physics problem proved unsolvable: Gödel and Turing enter quantum physics - December 9, 2015 Excerpt: A mathematical problem underlying fundamental questions in particle and quantum physics is provably unsolvable,,, It is the first major problem in physics for which such a fundamental limitation could be proven. The findings are important because they show that even a perfect and complete description of the microscopic properties of a material is not enough to predict its macroscopic behaviour.,,, "We knew about the possibility of problems that are undecidable in principle since the works of Turing and Gödel in the 1930s," added Co-author Professor Michael Wolf from Technical University of Munich. "So far, however, this only concerned the very abstract corners of theoretical computer science and mathematical logic. No one had seriously contemplated this as a possibility right in the heart of theoretical physics before. But our results change this picture. From a more philosophical perspective, they also challenge the reductionists' point of view, as the insurmountable difficulty lies precisely in the derivation of macroscopic properties from a microscopic description." http://phys.org/news/2015-12-quantum-physics-problem-unsolvable-godel.html
Supplemental note:
Information is Physical (but not how Rolf Landauer meant) – video
https://www.youtube.com/watch?v=H35I83y5Uro

(December 2018) – the physical reality of immaterial information
https://uncommondesc.wpengine.com/intelligent-design/a-new-unified-model-of-specified-complexity/#comment-669817
John 1:1-4 In the beginning was the Word, and the Word was with God, and the Word was God. He was with God in the beginning. Through him all things were made; without him nothing was made that has been made. In him was life, and that life was the light of all mankind.
a few notes:
"In 2003 renowned biologist Leroy Hood and biotech guru David Galas authored a review article in the world’s leading scientific journal, Nature, titled, “The digital code of DNA.”,,, MIT Professor of Mechanical Engineering Seth Lloyd (no friend of ID) likewise eloquently explains why DNA has a “digital” nature: “It’s been known since the structure of DNA was elucidated that DNA is very digital. There are four possible base pairs per site, two bits per site, three and a half billion sites, seven billion bits of information in the human DNA. There’s a very recognizable digital code of the kind that electrical engineers rediscovered in the 1950s that maps the codes for sequences of DNA onto expressions of proteins.” Casey Luskin - Every Bit Digital DNA’s Programming Really Bugs Some ID Critics – March 2010 Venter: Life Is Robotic Software - July 15, 2012 Excerpt: “All living cells that we know of on this planet are ‘DNA software’-driven biological machines comprised of hundreds of thousands of protein robots, coded for by the DNA, that carry out precise functions,” said (Craig) Venter. http://crev.info/2012/07/life-is-robotic-software/ Programming of Life - video https://vimeo.com/136580064 "Human DNA is like a computer program but far, far more advanced than any software we've ever created." - Bill Gates, The Road Ahead, 1996, p. 188 "Biological systems are the most parallel systems ever studied and we hope to use our better understanding of how living systems handle information to design new computational paradigms, programming languages and software development environments. " - (Computational and Systems Biology - CosBi) Multiple Overlapping Genetic Codes Profoundly Reduce the Probability of Beneficial Mutation George Montañez 1, Robert J. Marks II 2, Jorge Fernandez 3 and John C. Sanford 4 – published online May 2013 Excerpt: In the last decade, we have discovered still another aspect of the multi-dimensional genome. 
We now know that DNA sequences are typically “ poly-functional” [38]. Trifanov previously had described at least 12 genetic codes that any given nucleotide can contribute to [39,40], and showed that a given base-pair can contribute to multiple overlapping codes simultaneously. The first evidence of overlapping protein-coding sequences in viruses caused quite a stir, but since then it has become recognized as typical. According to Kapronov et al., “it is not unusual that a single base-pair can be part of an intricate network of multiple isoforms of overlapping sense and antisense transcripts, the majority of which are unannotated” [41]. The ENCODE project [42] has confirmed that this phenomenon is ubiquitous in higher genomes, wherein a given DNA sequence routinely encodes multiple overlapping messages, meaning that a single nucleotide can contribute to two or more genetic codes. Most recently, Itzkovitz et al. analyzed protein coding regions of 700 species, and showed that virtually all forms of life have extensive overlapping information in their genomes [43]. 38. Sanford J (2008) Genetic Entropy and the Mystery of the Genome. FMS Publications, NY. Pages 131–142. 39. Trifonov EN (1989) Multiple codes of nucleotide sequences. Bull of Mathematical Biology 51:417–432. 40. Trifanov EN (1997) Genetic sequences as products of compression by inclusive superposition of many codes. Mol Biol 31:647–654. 41. Kapranov P, et al (2005) Examples of complex architecture of the human transcriptome revealed by RACE and high density tiling arrays. Genome Res 15:987–997. 42. Birney E, et al (2007) Encode Project Consortium: Identification and analysis of functional elements in 1% of the human genome by the ENCODE pilot project. Nature 447:799–816. 43. Itzkovitz S, Hodis E, Sega E (2010) Overlapping codes within protein-coding sequences. Genome Res. 20:1582–1589. 
Conclusions: Our analysis confirms mathematically what would seem intuitively obvious - multiple overlapping codes within the genome must radically change our expectations regarding the rate of beneficial mutations. As the number of overlapping codes increases, the rate of potential beneficial mutation decreases exponentially, quickly approaching zero. Therefore the new evidence for ubiquitous overlapping codes in higher genomes strongly indicates that beneficial mutations should be extremely rare. This evidence combined with increasing evidence that biological systems are highly optimized, and evidence that only relatively high-impact beneficial mutations can be effectively amplified by natural selection, lead us to conclude that mutations which are both selectable and unambiguously beneficial must be vanishingly rare. This conclusion raises serious questions. How might such vanishingly rare beneficial mutations ever be sufficient for genome building? How might genetic degeneration ever be averted, given the continuous accumulation of low impact deleterious mutations? http://www.worldscientific.com/doi/pdf/10.1142/9789814508728_0006 Scientists Have Stored a Movie, a Computer OS, and an Amazon Gift Card in a Single Speck of DNA "The highest-density data-storage device ever created." - PETER DOCKRILL - 7 MAR 2017 Excerpt: In turn, Erlich and fellow researcher Dina Zielinski from the New York Genome Centre now say their own coding strategy is 100 times more efficient than the 2012 standard, and capable of recording 215 petabytes of data on a single gram of DNA. For context, just 1 petabyte is equivalent to 13.3 years' worth of high-definition video, so if you feel like glancing disdainfully at the external hard drive on your computer desk right now, we won't judge. 
http://www.sciencealert.com/scientists-have-stored-a-movie-a-computer-os-and-an-amazon-gift-card-in-a-single-speck-of-dna Three Subsets of Sequence Complexity and Their Relevance to Biopolymeric Information - David L. Abel and Jack T. Trevors - Theoretical Biology & Medical Modelling, Vol. 2, 11 August 2005, page 8 "No man-made program comes close to the technical brilliance of even Mycoplasmal genetic algorithms. Mycoplasmas are the simplest known organism with the smallest known genome, to date. How was its genome and other living organisms' genomes programmed?" http://www.biomedcentral.com/content/pdf/1742-4682-2-29.pdf The Multi-dimensional Genome--impossible for Darwinism to account for-- by Dr Robert Carter - video (15:52 minute mark: Comparing the Computer Operating Systems of Linux to the much more sophisticated operating systems of Regulatory Networks in e-coli) https://youtu.be/K3faN5fU6_Y?t=952 Comparing genomes to computer operating systems - Van - May 2010 Excerpt: we present a comparison between the transcriptional regulatory network of a well-studied bacterium (Escherichia coli) and the call graph of a canonical OS (Linux) in terms of topology,,, http://www.ncbi.nlm.nih.gov/pubmed/20439753 regulatory network of a well-studied bacterium (Escherichia coli) and the call graph of a canonical OS (Linux) - Picture of comparison http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2889091/figure/F1/
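Seth Lloyd's "two bits per site, ... seven billion bits" remark quoted above follows directly from the size of the DNA alphabet: four symbols carry log2(4) = 2 bits each. A minimal sketch of that digital mapping (the particular bit assignment below is arbitrary, chosen only for illustration):

```python
import math

# The four-letter DNA alphabet carries log2(4) = 2 bits per base, which is
# where Seth Lloyd's "two bits per site" figure comes from.
BITS_PER_BASE = math.log2(4)          # = 2.0

ENCODE = {"A": "00", "C": "01", "G": "10", "T": "11"}   # one arbitrary mapping
DECODE = {bits: base for base, bits in ENCODE.items()}

def dna_to_bits(seq):
    return "".join(ENCODE[base] for base in seq)

def bits_to_dna(bits):
    return "".join(DECODE[bits[i:i+2]] for i in range(0, len(bits), 2))

seq = "GATTACA"
assert bits_to_dna(dna_to_bits(seq)) == seq   # the mapping is lossless

# 3.5 billion sites x 2 bits/site = 7 billion bits, matching the quote.
human_sites = 3.5e9
print(human_sites * BITS_PER_BASE)    # 7000000000.0 bits
```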
Maybe because engineers have to think? :) jawa
