
The Elegance of Computational Brute Force, and its Limitations


Although for many years I was a classical concert pianist, I was raised by a wonderful father, the most brilliant scientist I have ever known, who imparted to me a love of science.

My love of mathematics and science never left me, and my superb education in these disciplines has served me well, since I now earn my living as a software engineer in aerospace R&D.

My first experience with computational search algorithms involved AI game theory, which you can read about here.

Brute (but intelligently designed) computational force can do some interesting things (and even elegant things, as you can discover from my perfect-play endgame databases), but only in domains with restricted search horizons, and only if the search algorithms are intelligently designed with a goal in mind.
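To make the point concrete, here is a minimal sketch (in Python, and in no way the perfect-play endgame databases mentioned above) of what brute-force search can do when the horizon is tightly restricted: exhaustive minimax over single-pile Nim, with rules assumed purely for illustration (take one to three stones; whoever takes the last stone wins).

```python
# A minimal sketch of brute-force game solving, assuming a toy single-pile
# Nim variant (take 1-3 stones, last stone wins). Memoization ensures every
# reachable position is evaluated exactly once.
from functools import lru_cache

@lru_cache(maxsize=None)
def winning(stones: int) -> bool:
    """True if the player to move can force a win from this position."""
    # A position is a win if some legal move leaves the opponent in a loss.
    return any(not winning(stones - take)
               for take in (1, 2, 3) if take <= stones)

# Enumerate every position up to 21 stones: a tiny "perfect play" table.
table = {n: winning(n) for n in range(22)}
print(table)  # multiples of 4 come out as losses for the player to move
```

The table is only computable because the state space is tiny, and the search only accomplishes anything because the goal (the win condition) was specified in advance by the programmer.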

As a result of my interest in, experience with, and knowledge of computational search algorithms and combinatorial mathematics, it immediately became obvious to me that the Darwinian notion that a blind search (with no goal, no design, and hopelessly inadequate probabilistic resources) represents a reasonable or even rational explanation of the origin of all of biology is a transparently preposterous proposition.
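A back-of-envelope illustration of the probabilistic-resources point; the alphabet size, sequence length, and number of trials below are arbitrary assumptions chosen for the arithmetic, not claims about any particular biological molecule.

```python
# Illustrative arithmetic only: the space of possible sequences grows
# exponentially with length, so blind sampling visits a vanishing fraction.
from math import log10

alphabet = 20            # assumed alphabet size (amino-acid-like)
length = 100             # assumed sequence length
space = alphabet ** length

trials = 10 ** 40        # an assumed, generously large number of blind tries
fraction_sampled = trials / space

print(f"search space      ~ 10^{log10(space):.0f} sequences")
print(f"fraction sampled  ~ 10^{log10(fraction_sampled):.0f}")
```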

Design, from whatever source, is the only logical explanation, and the Darwinian hypothesis of random errors filtered by natural selection deserves its appropriate place at the apogee of the ash heap of junk-science history.

Comments
DrRec: "Even Behe lists multiple “adaptive gain of functional coded elements” in his latest review*." This is not the first time you have put forward this red herring, so I believe it is high time to weigh in and put a stop to your misreading of the situation. Behe has, for some time, and as further illustrated in the paper you cited, been looking for what he calls the "edge of evolution" or the boundary where traditional evolutionary mechanisms can actually do something. Behe goes through many examples of mutations and tries to categorize them to see what lesson can be learned. While there are a small handful of what could be viewed as "gain of function" mutations, the takeaway from Behe's careful review is most decidedly *not* that natural processes can readily come up with new informational structures. Further, even in those cases where there is arguably a gain of function, Behe shows that such "gain" almost inevitably results from the breakage of an existing part or system. Indeed, a large part of the point of Behe's paper is to propose what he calls the "First Rule of Adaptive Evolution," namely that in particular circumstances a fitness advantage can sometimes be gained by breaking or blunting a functional coded element. The strong takeaway from all this is that (i) naturalistic processes are terrible at producing information gain (indeed, we're still waiting for a decent example of information gain beyond the absolutely trivial), and (ii) even in those cases where a survival advantage has been conferred by a mutation, it is typically the result of breaking or blunting an existing functional element, not from creating some new informational element.Eric Anderson
September 16, 2011 at 11:04 PM PDT
Two points:

1) Functional information DOES increase in nature (de novo genes, novel activities). Even Behe lists multiple "adaptive gain of functional coded elements" in his latest review*. Functional information increases in directed evolution (novel activities), due to mutation and recombination. Functional information increases in genetic algorithms.

2) Saying "intelligent selection, where some function is actively measured or rewarded by the algorithm, and natural selection, where the only advantage is a 'natural' advantage ... the only function which has any effect in the algorithm is reproductive function" is a distinction without a difference. A genetic algorithm, where the best solution survives, reproduces, and goes on to the next round; a directed evolution experiment, where sufficiently good bacteria or genes coding enzymes go on; or nature, where the fastest, smartest cheetah goes on to survive and reproduce, are materially equivalent. The effect is the same.

So now evolution is a blind search filtered by reproductive advantage. That is your source of active information. Dembski and you seem comfortable calling the equivalent an active source of information everywhere except in nature. Why is that? I still do not grasp the material difference that makes selection "active information" in Ev or Avida, but not in nature.

*http://www.lehigh.edu/bio/pdf/Behe/QRB_paper.pdf

DrREC
September 16, 2011 at 8:58 PM PDT
DrREC: The only "active information" added by the natural principle of positive selection is that, in a reproducing population, information giving a reproductive advantage is likely to expand. The natural principle of negative selection, on the other hand, makes information which gives a reproductive disadvantage more likely to be lost. That's all.

You want to call that a fitness function, be my guest, but I have pointed many times to the fundamental difference between intelligent selection, where some function is actively measured or rewarded by the algorithm, and natural selection, where the only advantage is a "natural" advantage, descending from a very simple logical principle and not from any added information about a search space, and therefore the only function which has any effect in the algorithm is reproductive function.

So, at best, a Darwinian algorithm is a blind search for reproductive advantage. The only "active" information in that algorithm is a simple logical implication: what reproduces better usually expands, what reproduces worse is usually lost.

The simple point made by Gil here, and with which I absolutely agree, is that there is no way such an algorithm, based on blind search with only that addition of "active information" (if we want to call it that), and no more, can even begin to explain what we see in the biological world. In Gil's words: "Design, from whatever source, is the only logical explanation, and the Darwinian hypothesis of random errors filtered by natural selection deserves its appropriate place at the apogee of the ash heap of junk-science history."

gpuccio
September 16, 2011 at 8:19 PM PDT
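A minimal sketch of the distinction gpuccio draws above, using a toy genetic algorithm: in one run the selection step actively measures closeness to a pre-specified target, in the other the selection step carries no information about that target at all. The genome length, population size, mutation rate, and target string are illustrative assumptions, not anyone's published model.

```python
# Toy comparison: selection that measures a pre-specified target versus
# selection that knows nothing about it. All parameters are illustrative.
import random

GENOME_LEN, POP, GENERATIONS, MUT_RATE = 20, 50, 100, 0.05
TARGET = [1] * GENOME_LEN                 # a goal known to the programmer

def match(genome):
    """How many positions agree with TARGET."""
    return sum(a == b for a, b in zip(genome, TARGET))

def mutate(genome):
    return [bit ^ (random.random() < MUT_RATE) for bit in genome]

def run(score):
    """Generic loop: the top-scoring half survives and reproduces."""
    pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP)]
    for _ in range(GENERATIONS):
        pop.sort(key=score, reverse=True)
        parents = pop[:POP // 2]
        pop = parents + [mutate(random.choice(parents))
                         for _ in range(POP - POP // 2)]
    return max(match(g) for g in pop)     # judged against TARGET either way

# Selection that actively measures the target function:
print("target-measuring selection:", run(match), "/", GENOME_LEN)

# Selection that carries no information about the target (constant score,
# so survival is arbitrary with respect to TARGET):
print("target-blind selection:    ", run(lambda g: 0), "/", GENOME_LEN)
```

In a typical run only the first search closes in on the target; whether natural selection is better modeled by the first loop or the second is exactly the point in dispute in this thread.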
Dembski states in his abstract:
Though not denying Darwinian evolution or even limiting its role in the history of life, the Law of Conservation of Information shows that Darwinian evolution is inherently teleological. Moreover, it shows that this teleology can be measured in precise information-theoretic terms.
Thus, DrREC, do you concede the main point of Dembski's paper: that IF neo-Darwinism could generate functional information in life (for which no empirical demonstration has been forthcoming), that gain in functional information would have to be the result of the design built into nature???

bornagain77
September 16, 2011 at 8:15 PM PDT
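For reference, a minimal sketch of the information-theoretic bookkeeping in the Dembski-Marks paper cited in this thread: if blind search succeeds with probability p and an assisted search succeeds with probability q, the "active information" contributed by the assist is log2(q/p) bits. The probabilities below are made up purely for illustration.

```python
# Illustrative numbers only; the quantities follow the Dembski-Marks
# "active information" bookkeeping (blind vs. assisted search success).
from math import log2

p = 1e-9   # assumed probability that unassisted (blind) search succeeds
q = 1e-3   # assumed probability that the assisted search succeeds

endogenous = -log2(p)               # difficulty of the problem for blind search
exogenous = -log2(q)                # difficulty remaining under the assist
active = endogenous - exogenous     # equivalently log2(q / p)

print(f"active information supplied by the assist ~ {active:.1f} bits")
```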
Interesting post. Two questions:

1) Are genetic algorithms brute force algorithms? I thought they were quite different.

2) Is evolution a blind search? Dr. Dembski, for example, contrasts Darwinian processes and blind searches: "Searches that operate by Darwinian selection, for instance, often significantly outperform blind search." (From the abstract of "Life's Conservation Law: Why Darwinian Evolution Cannot Create Biological Information" by William A. Dembski and Robert J. Marks II.)

I guess the question is whether the fitness function provided by the algorithm (called an active source of information) is analogous to the fitness function that emerges from natural selection. I can't think of a rationale for finding one an active source of information, and the other not. Can you?

DrREC
September 16, 2011 at 6:54 PM PDT
