Industry is constantly searching for technologies that maximize profits and minimize costs. The software industry is no exception (the world software market exceeds $300 billion).
Today some computers can perform quadrillions of floating-point operations per second (10^15 FLOPS). It would be technically possible to implement on such computers the paradigm of unguided evolution (random variation + selection) to obtain new programs by randomly modifying old ones. So why do software houses pay legions of human programmers to develop applications from scratch when an automatic process could do the job? They could save truckloads of money by automating the software development workflow, at least in large part if not entirely.
To get an idea, let's perform two simplified calculations: the speed of biological evolution (BES) versus the speed of computer-aided evolution (CAES).
Biological evolution speed
Consider an initial population of 10^9 bacteria with generation/reproduction time = 40 minutes and mutation rate = 0.003 mutations per genome per generation. We have an initial biological evolution speed BES = (0.003 × 10^9) / (40 × 60) = 1250 mutations/sec.
Computer aided evolution speed
Consider a single 10^15 FLOPS computer and suppose, for the sake of argument, that a program "mutation" costs the equivalent of 1000 floating-point operations. We get a computer-aided evolution speed CAES = 10^15 / 1000 = 10^12 mutations/sec.
Since, according to Darwin, unguided biological evolution was able to spontaneously produce all 500 million species on Earth (from bacteria to man) in 3 billion years (the biological evolution time, BET), computer-aided evolution could automatically produce software containing an equivalent overall amount of functional complex specified information in what we call the "computer-aided evolution time" (CAET). In other words, we posit that the product of speed × time is equal for biological evolution and for computer evolution:
CAET × CAES = BET × BES
CAET = (BET × BES) / CAES
In numbers:
CAET = (3×10^9 × 1250) / 10^12 = 3.75 years
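For anyone who wants to check the arithmetic, here is a short Python sketch that simply redoes the three calculations above with the post's own inputs (nothing new is assumed):

# Quick check of the back-of-the-envelope figures above, using the post's own inputs.
population = 1e9          # bacteria
mutation_rate = 0.003     # mutations per genome per generation
generation_s = 40 * 60    # 40-minute generation time, in seconds

BES = mutation_rate * population / generation_s   # mutations per second
CAES = 1e15 / 1000                                # 10^15 FLOPS / 1000 flop per "mutation"
BET_years = 3e9                                   # assumed duration of biological evolution

CAET_years = BET_years * BES / CAES

print(f"BES  = {BES:,.0f} mutations/sec")   # 1,250
print(f"CAES = {CAES:.0e} mutations/sec")   # 1e+12
print(f"CAET = {CAET_years} years")         # 3.75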
Evolution applied to software programming would produce, in less than 4 years, software equivalent to the organizational information that present and past organisms contain. Then, again, why don't software houses save billions of dollars in salaries by applying Darwinian evolution to software creation?
My short answer: because Darwinian evolution accomplishes exactly nothing when the goal is to create systems. It is, in principle, fully incapable of creating even the smallest system. If it were capable of doing so even a little, software producers would use it. To put it differently, if Charles Darwin were right, Bill Gates would be far richer than he is…
—
I know in advance the objection that evolutionists could raise. They always deny everything: "it is false that the software industry doesn't use evolution; in fact there are evolutionary algorithms," or something like that.
My counter-objection: evolutionary algorithms (EAs) are programs designed to converge by iteration to a particular solution of a very specific problem. Invoking EAs to refute my claim that the software industry doesn't use evolution to create software is as nonsensical as saying that, in mathematics, the iterative methods for approximating the root of an equation (like Newton's method) could create the whole of mathematics. In other words, EAs are designed routines that can be useful in certain cases to solve very small sub-problems. I wrote "in certain cases" because in other cases EAs fail, and fail spectacularly, exactly as iterative methods in math do under certain conditions. The bottom line is that EAs are toys that can do nothing to solve the big picture, the creation of an entire software project from zero. Here the analogy is strict: EAs are unable to create new software applications, just as Darwinian evolution is unable to create new organisms. And software producers know that.
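To make the Newton's-method analogy concrete, here is a minimal Python sketch (a toy of my own, not drawn from any EA library). The same iterative routine that converges neatly to one pre-framed root can also fail outright from an unlucky starting point; that is the sense in which such methods solve narrow problems rather than create mathematics.

# Newton's method: x_{n+1} = x_n - f(x_n)/f'(x_n)
def newton(f, df, x0, steps=20):
    x = x0
    for _ in range(steps):
        x = x - f(x) / df(x)
    return x

# Converges: the root of x^2 - 2, starting from a reasonable guess.
print(newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0))   # ~1.4142135 (sqrt 2)

# Fails: the classic cycling case f(x) = x^3 - 2x + 2 started at x0 = 0
# bounces between 0 and 1 forever and never reaches the real root near -1.77.
print(newton(lambda x: x**3 - 2 * x + 2, lambda x: 3 * x * x - 2, x0=0.0))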
I agree EAs are specific programming modules and mainly used as efficient search functions.
I would also like to show that even energy considerations show support for Intelligent Design:
Any system, be it biological or physical, tends to move toward a low level of energy consumption, so that the total energy the system uses is spent efficiently.
A random system will always use more energy than an intelligent, directed system, since randomness means doing the job allocated to the system by chance.
Neo-Darwinists claim that random chance can arrange amino acid sequences to produce specific proteins. We can model this randomness with a binomial probability distribution.
We can envision two types of random attempts in the neo-Darwinian system:
a. The organism tries to improve its chance of success by increasing the number of random attempts, without regard to the energy expended.
b. The organism conserves energy by reducing the number of attempts, which leads to a reduced rate of success.
Since the system always tends toward lower energy consumption, we can ignore scenario 'a'. Consider scenario 'b':
Let us denote the probability of success of an amino acid arrangement by 'p'. Then the probability of no more than 2 successes per 1000 attempts (a generous, very high allowance) in a given time span is:
(-1 + p)^998 (1 + 998 p + 498501 p^2)
If we simplify the above, we get
(-1 + p)^998 (1 + 499 p (2 + 999 p))
If p = 0.1 the probability will be
1.09 × 10^-42
For p = 0.9 the probability will be staggeringly low
4.04 × 10^-993
Let's denote the energy required for one attempt at arranging the amino acids randomly by the symbol 'e', so for 1000 attempts the energy expended will be 1000 × e = 1000e.
A directed, intelligent system will need only 1 attempt to arrange the proper amino acid sequence, so its cost is a single 'e'. Since the probability of random success is infinitesimal, for all practical purposes the random system expends E = 1000e, a thousand times the energy of that single directed attempt.
Conclusion: the random, chance system for arranging amino acids to produce a protein will expend energy in proportion to the number of attempts the system makes. An organism can conserve energy by reducing random attempts, which would invariably reduce the probability of success further, or it can increase the attempts without regard to energy conservation, which would be highly inefficient and unlikely as a viable biological process.
In fact the conclusion is so intuitive that we need not have used any math at all.
Below is the simple Mathematica input required to derive the above results. Of course, you can use whatever software (or your mind, pencil, and paper) you are comfortable with.
Probability[x <= 2, x \[Distributed] BinomialDistribution[1000, p]]
Output:
(-1 + p)^998 (1 + 998 p + 498501 p^2)
FullSimplify[(-1 + p)^998 (1 + 998 p + 498501 p^2)]
Output:
(-1 + p)^998 (1 + 499 p (2 + 999 p))
(-1 + p)^998 (1 + 499 p (2 + 999 p)) /. p -> 0.1
(-1 + p)^998 (1 + 499 p (2 + 999 p)) /. p -> 0.9
Brilliant. Certainly a little functioning code is a lower target than functioning biological systems.
I’m actually kinda interested in selvaRajan’s comment @ 1. To be honest, I don’t quite understand the entire thrust of his argument, but there seems to be something useful there that isn’t quite the same as other 2nd Law arguments.
selvaRajan, can you rephrase your argument, or explain in another way?
Every paper/book that has been published on Computational Evolutionary Algorithms should really use the full and correct title for this approach:
“Intelligently Designed Evolutionary Algorithms”.
This would help remove any confusion of this approach with evolution as understood by materialists such as Dawkins, Myers etc.
Thanks niwrad! Definitely a keeper. I’ve pointed this disparity out to atheists a few times (If Darwinism is true why can’t we design computer ‘super’ programs with it?) but never had the math as an example to back the disparity up.
A few related notes:
niwrad,
Since a piece of software's true success at a company is based on end-customer satisfaction, the software company would have to mutate some code, compile it, and see whether people like it. This would require operating the software. Of course, manuals to operate the software would not exist; the customer would have to discover what it's useful for. Unless, maybe, they kept a bunch of programmers off to the side to test each new mutated program and figure it out (which, of course, drifts back toward intelligent selection, like the Dawkins weasel).
Seems to me the best test of whether a software program can evolve is to make a program that can copy itself many times over with various mutation rates. If any progeny code can out-survive the parent code, then it 'wins' the right to reproduce. Create a virtual world of limited-resource code snippets and see if the replicating software code can make something greater than the sum of its parts.
I think this has been done, and expect that what will happen in the end is that the code will only simplify to smaller code… and that’s about the end of it.
This is why, if Darwinian evolution were true, we should expect to find that the only surviving organism is of the simplest possible kind, a single-celled organism, and one that will consume ANY other creature or organic material other than itself… but such a creature doesn't exist. I wonder why. 😛 Of course, Darwinists might argue 'but it's complicated'. I would agree with them in a way, not because of complications in the explanation, but rather because of the complicated reasoning behind seeking such an explanation.
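For anyone tempted to try the experiment described above, here is a minimal Python sketch of such a replicator world. It is only a toy of my own: the bit-string genomes, the mutation rates, and the "limited resources" rule (only the cheapest copies survive) are assumptions made for illustration, not anyone's published simulation.

import random

random.seed(0)  # reproducible toy run

def mutate(genome, rate=0.01):
    """Copy a bit-string genome with point mutations, deletions, and rare insertions."""
    out = []
    for bit in genome:
        r = random.random()
        if r < rate:             # point mutation: flip the bit
            out.append(1 - bit)
        elif r < 2 * rate:       # deletion: drop the bit
            continue
        else:
            out.append(bit)
    if random.random() < rate:   # occasional insertion
        out.append(random.randint(0, 1))
    return out

# Start with 50 random 100-bit "programs"; each generation every genome copies
# itself twice, and the limited resource budget keeps only the 50 cheapest
# (shortest) copies.
population = [[random.randint(0, 1) for _ in range(100)] for _ in range(50)]
for generation in range(200):
    offspring = [mutate(g) for g in population for _ in range(2)]
    population = sorted(offspring, key=len)[:50]

print("mean genome length after 200 generations:",
      sum(len(g) for g in population) / len(population))

In this toy setup the survivors simply shrink toward empty genomes, which is the "only simplify to smaller code" outcome predicted above; getting anything else requires the designer to build a richer fitness function into the world.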
I don’t think it’s possible to simulate unguided evolution in a computer.
But say it is possible. I wonder if dropping a huge rock on the PC would lead to greater diversification of the software that survived.
Dr. David Berlinski: Random Mutations (to computer programs?) – video
http://www.youtube.com/watch?v=DGaUEAkqhMY
What Is The Genome? It’s Not Junk! (Linux Operating System Compared To Life) – Dr. Robert Carter – video
http://www.metacafe.com/watch/8905583/
Comparing genomes to computer operating systems – Van – May 2010
Excerpt: we present a comparison between the transcriptional regulatory network of a well-studied bacterium (Escherichia coli) and the call graph of a canonical OS (Linux) in terms of topology,,,
http://www.ncbi.nlm.nih.gov/pubmed/20439753
Programming of Life – Biological Computers – Ch. 6
http://www.youtube.com/watch?v.....F11E2FB840
3-D Structure Of Human Genome: Fractal Globule Architecture Packs Two Meters Of DNA Into Each Cell – Oct. 2009
Excerpt: the information density in the nucleus is trillions of times higher than on a computer chip — while avoiding the knots and tangles that might interfere with the cell’s ability to read its own genome. Moreover, the DNA can easily unfold and refold during gene activation, gene repression, and cell replication.
http://www.sciencedaily.com/re.....142957.htm
Do you believe Richard Dawkins exists?
Excerpt: DNA is the best information storage mechanism known to man. A single pinhead of DNA contains as much information as could be stored on 2 million two-terabyte hard drives.
http://creation.com/does-dawkins-exist
DNA Computer
Excerpt: DNA computers will work through the use of DNA-based logic gates. These logic gates are very much similar to what is used in our computers today with the only difference being the composition of the input and output signals.,,, With the use of DNA logic gates, a DNA computer the size of a teardrop will be more powerful than today’s most powerful supercomputer. A DNA chip less than the size of a dime will have the capacity to perform 10 trillion parallel calculations at one time as well as hold ten terabytes of data. The capacity to perform parallel calculations, much more trillions of parallel calculations, is something silicon-based computers are not able to do. As such, a complex mathematical problem that could take silicon-based computers thousands of years to solve can be done by DNA computers in hours.
http://www.tech-faq.com/dna-computer.html
The reason evolution isn't used in software programming is that, at its heart, it's a brute-force approach. Try all combinations, and then toss out the ones that won't work.
Given enough time and resources, it will find the most efficient solution, assuming such a solution exists at all. But given the exponential nature of most real world computing problems, there are a nigh infinite number of ineffective and inefficient solutions to blindly stumble upon before you hit the efficient solutions.
To make this brute force search work, the search field must be reduced. When done by programmers, this is an example of intelligently guided “evolution”. The precision with which solutions are selected is also more an example of artificial breeding than natural selection.
And of course, evolution can’t apply if it doesn’t have something to start with; random bit-flips are not known for their ability to create brand new functional software programs.
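To put a rough number on that "nigh infinite" search space, here is a back-of-the-envelope Python sketch (my own arithmetic, reusing the post's 10^12 mutations-per-second figure as an assumed search rate):

from math import log10

# Size of the raw search space for even a tiny program, versus the time to enumerate it.
PROGRAM_BYTES = 1024                 # a 1 KB program
SEARCH_RATE = 10**12                 # candidate variants tried per second (assumed)
SECONDS_PER_YEAR = 3600 * 24 * 365

bits = 8 * PROGRAM_BYTES
log10_variants = bits * log10(2)     # log10 of 2^8192 possible bit patterns
log10_years = log10_variants - log10(SEARCH_RATE * SECONDS_PER_YEAR)

print(f"distinct 1 KB bit patterns: ~10^{log10_variants:.0f}")          # ~10^2466
print(f"years to enumerate them at 10^12/sec: ~10^{log10_years:.0f}")

Even before selection enters the picture, blind enumeration of a toy-sized program space is hopeless, which is why the search field has to be reduced by the programmer.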
I joined this forum just to comment on this, after lurking for years. As a computer programmer I can tell you that without the intelligence of the programmer, no computer program can 'evolve'. If a program were ever able to evolve, it would have had to have been designed to do that in the first place.
SirHamster:
Yeah. This is why Darwinian evolution is a pile of superstitious nonsense. The problem with evolution and materialist OOL theories is that, no matter how good a combination gets, the succeeding random mutations or transformations will destroy it.
I used to think about this as well. Not exactly as niwrad puts it here (which is good), but somewhat related. If a person wanted to start a software company where new software would be developed only on the basis of random mutation and selection, and that person asked evolutionary biologists to invest in such a company, how many of them would do it?
Mung said: “I don’t think it’s possible to simulate unguided evolution in a computer.”
Ironically, when I was writing a blackjack program, I found out that the "random" function wasn't so random. I had mistakenly used the same seed for all the iterations. When I was testing the program, I found that I could predict the sequence of "cards" that came up during play. After troubleshooting the problem, I discovered that I needed to use a different seed every time I called the "random" function. I solved the problem by using the timestamp as a seed. After I did that, I came to the conclusion that computers cannot achieve pure randomness. This is why you will never see me sitting at a slot machine in Vegas.
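For readers who have not hit this pitfall, here is a small Python sketch of the same effect (a hypothetical example, not the original blackjack code): reusing one seed reproduces the identical "shuffle" every time, while a time-based seed varies from run to run.

import random
import time

def deal(seed, cards=5):
    rng = random.Random(seed)   # a separate generator so runs are comparable
    deck = list(range(52))
    rng.shuffle(deck)
    return deck[:cards]

print(deal(42))           # always the same five "cards"
print(deal(42))           # ...identical to the line above
print(deal(time.time()))  # time-based seed: differs from run to run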
How to Simulate Unguided Evolution in a Computer.
ME: I don’t think it’s possible to simulate unguided evolution in a computer.
But i think it could be an interesting topic of discussion, if niwrad or Gil or someone else were to start a thread on it.
Some people might be amazed at how much intelligent design there has to be to even come close.
A rather good method to get a better seed is to mix a pile of truly random data into the equation along with the timestamp. This data can be generated relatively simply by collecting mouse movements or static from a radio.
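In Python terms, a minimal sketch of that idea (my addition, not Sebestyen's code) is to pull the seed from the operating system's entropy pool, which is itself fed by hardware and timing noise rather than by the clock alone:

import random
import secrets

seed = secrets.randbits(128)   # 128 bits from the OS entropy source
rng = random.Random(seed)      # an ordinary PRNG, now unpredictably seeded
print(rng.sample(range(52), 5))

# For values that must stay unpredictable (keys, tokens), skip the PRNG entirely
# and use the secrets module or random.SystemRandom directly.
print(secrets.token_hex(8))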
Sebestyen
Software programs would not run if it were not for computer architectures, so it should not be surprising at all that programs run on computers that appear well-designed to run software programs.
What shall we call this principle? Suggestions?
The macthropic principle?