COMPLEXITY =/=> EVOLUTION
Many Darwinists equate complexity with evolution. They point to the fossil record's increase in complexity over time as the very thing that defines Evolution. But is increasing complexity always a good thing? The history of computers is instructive.
Your iPhone and laptop computer are built on base-2, principally because flip-flops and early binary circuits were easy to make; even the earliest computer memory, built from circular ferrite cores, was two-state. This base-2 necessity led to an explosion in the study of Boolean algebra and binary logic in the 1950s, which demonstrated that everything you could do in base-10 could be done in base-2.
By the late '50s, the Russians were falling further and further behind the US in computer technology, and being the math nerds they are, they wondered whether binary computers were just the first step in a necessary evolution of computers. If evolution was complexity, then the obvious next evolutionary step should be ternary logic, or 3-state systems. Rather than the binary states (0, 1), they built circuits that used (-1, 0, 1) as logic states.
After much effort, they had their first ternary computer up, running, and programmed, and they could compare it to the US binary machines. It was abysmally slower. Worse, it was slower even than ternary logic emulated in software on a binary machine; for you FORTRAN aficionados, this was the FORTRAN 66 arithmetic IF, the three-way branch. I personally translated the 40-line FORTRAN 66 program CURFIT from Philip Bevington's 1969 "Data Analysis" textbook into Turbo Pascal, and wrestled for a whole week with the ternary logic. It was devilish: ultra-compact, but a royal pain. Five years later I was translating it into C (Turbo Pascal having died an early death) and read Kernighan and Ritchie's classic text, where they said about "clever" ternary algorithms the same thing my poetry instructor had told me in college: "Kill all your little darlings." Or as the Brits would say, "Too clever by half."