Uncommon Descent: Serving The Intelligent Design Community

The Fundamental Law of Intelligent Design


After ten years in the ID movement, and after suffering through many debates, if someone were to ask me what the most fundamental law is upon which the ID case rests, I would have to say it is the law of large numbers (LLN). It is the law that tells us that a set of fair coins randomly shaken will converge on 50% heads and not 100% heads. It is the law that tells us systems will tend toward disorganization rather than organization. It is the law of math that makes the 2nd law of thermodynamics a law of physics. Few notions in math are accorded the status of law. We have the fundamental theorem of calculus, the fundamental theorem of algebra, and the fundamental theorem of arithmetic — but the law of large numbers is not just a theorem, it is promoted to the status of law, almost as if to emphasize its fundamental importance to reality.

Understanding the law of large numbers requires first understanding the notion of expected value (or expectation value). Rather than giving the somewhat brutal mathematical formalism of expected value, let me give an illustration with coins. If we have a large set of fair coins, there is an expectation that approximately 50% of the fair coins will be heads after a vigorous shaking or flipping of the coins (a random process). That is, the expected value for the proportion of heads is 50%.

As we examine sets of coins that are very large (say 10,000 coins), the outcome will converge so close to 50% heads, so frequently, that from a practical standpoint we can say the proportion will be at or near 50% with every shaking of the set. If we consider each coin in the set as a “trial”, the example illustrates the law of large numbers. Formally stated, the law of large numbers says:

the average of the results obtained from a large number of trials should be close to the expected value, and will tend to become closer as more trials are performed.

Law of Large Numbers
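The convergence the formal statement describes is easy to check numerically. Below is a minimal sketch (Python, standard library only; the sample sizes and the seed are arbitrary illustrative choices) that shakes progressively larger sets of fair coins and reports the proportion of heads:

```python
import random

random.seed(1)  # fixed seed so the illustration is reproducible

def shake(n_coins):
    """Flip n_coins fair coins once; return the proportion of heads."""
    heads = sum(random.random() < 0.5 for _ in range(n_coins))
    return heads / n_coins

# As the number of trials grows, the observed proportion of heads
# tends closer to the expected value of 0.5.
for n in (10, 100, 1_000, 10_000, 100_000):
    print(f"{n:>7} coins: {shake(n):.4f}")
```

With 10 coins the proportion can easily come out 0.3 or 0.7; by 100,000 coins it is pinned very near 0.5, which is exactly the convergence the law asserts.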

How does this play out for ID? Before answering that question, let me classify 3 kinds of designs (or forms of organization).

A. Non-functional ordered objects (like all fair coins heads, or homochirality in biology)

B. Non-functional dis-ordered, but recognizably designed objects (like a set of numbered coins organized according to a pre-specified pattern, a binary representation of Hamlet, DNA strings that identify GMOs, etc.)

C. Functional objects (like components assembled into a functioning machine, a software bit stream, etc.)

In this essay, I’ll illustrate design using the law of large numbers with the “non-functional ordered objects”. I’ll save for later discussion the illustration of design in the more challenging cases of “non-functional dis-ordered, but recognizably designed objects” and “functional objects”.

If I had 500 fair coins in a box, all heads, I would conclude that the 100% proportion of heads is far from the expected value of 50% heads; this is a significant violation of the law of large numbers for random processes, so a random process is rejected as the mechanism that created the all-heads pattern. By convention, the ID community classifies objects as designed if they do not conform to the products of law and chance. Whether they are designed in the ultimate sense is a separate question, but the practical rejection of the chance hypothesis in this case is unassailable.
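To put a number on "far from the expected value": the count of heads in 500 fair flips is binomially distributed with mean n·p = 250 and standard deviation sqrt(n·p·(1-p)) ≈ 11.18, so the all-heads outcome sits roughly 22 standard deviations from expectation. A quick check (Python, standard library only):

```python
import math

n, p = 500, 0.5
mean = n * p                     # expected number of heads: 250.0
sd = math.sqrt(n * p * (1 - p))  # standard deviation: ~11.18

# Distance of the all-heads outcome (500 heads) from expectation,
# measured in standard deviations.
z = (n - mean) / sd
print(f"sd = {sd:.2f}, z = {z:.2f}")  # sd = 11.18, z = 22.36
```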

A typical mistake ID proponents make is saying, “the all-heads pattern happens on average only 1 out of 2^500 times, therefore the chance hypothesis is rejected”. The Darwinists will counter by saying, “that pattern is no more special than any other, since every pattern happens only 1 out of 2^500 times, therefore all-coins-heads is consistent with the chance hypothesis”. Last year, Darwinists at The Skeptical Zone tried to pull that same rhetorical stunt on me with these words:

if you have 500 flips of a fair coin that all come up heads, given your qualification (“fair coin”), that outcome is perfectly consistent with fair coins,

Law of Large Numbers vs. Keiths

But I came prepared to counter their maneuvers. 🙂 They obviously didn’t anticipate I’d debate them from an unorthodox angle, namely the law of large numbers and the application of expected value. I pointed out that, based on the binomial distribution and an expectation value of 50% heads, 100% heads is a violation of the law of large numbers and hence a violation of the chance hypothesis from a practical standpoint. My opponents in the debate were thrown into disarray, but, as always, they never admitted defeat in the exchange. They camped out at UD and would not rest until I confessed the following creed:

if you have 500 flips of a fair coin that all come up heads, given your qualification (“fair coin”), that outcome is perfectly consistent with fair coins,

Law of Large Numbers vs. Keiths

I told them, “no dice”. Or maybe I should have said, “no coins.” The Darwinists at The Skeptical Zone fared so badly that even arch-Darwinist Jeffrey Shallit felt it necessary to call his associates out on their folly.

The advantage of using the law of large numbers is that it brings clarity to the probability arguments. It negates the Darwinists’ claim that “every pattern is just as improbable as another, therefore design is nothing special”. The 500-coins-heads example illustrates how to apply the law of large numbers in identifying designs for non-functional ordered objects; certain patterns are indeed special because their very nature is at variance with the chance hypothesis.

It occurred to me, since the law of large numbers was such a fruitful way to refute the materialists on the question of non-functional ordered designs, how about we use the law of large numbers when dealing with other more challenging kinds of designed objects? Those ideas, Designer willing, will be explored in subsequent discussions.

NOTES:

For some history of debates with Darwinists over the Law of Large Numbers see:

SSDD: A 22-sigma event is consistent with the physics of fair coins?

Law of Large Numbers vs. Keiths

Siding with mathgrrl on a point, alternative to CSI V2

Comments
Matzke: Any other individual sequence of 500 heads/tails also has the same probability. So the law of large numbers doesn’t get you anywhere.
I'm a total amateur on statistics, but I'll give it a go anyway: there are many more individual sequences that produce 50% heads than the one sequence that produces 100% heads. When we produce many sequences, we can sort the sequences into mathematical sets based on percentage. The set with 50% heads accommodates the most sequences; next will be the set with 49% heads and the set with 51% heads; then 48% and 52%, and so on; finally the set with 100% heads and the set with 100% tails, most likely empty, because each can accommodate only one possible sequence. It follows that a sequence with 50% heads has the highest probability and a sequence with 100% heads (or a sequence with 100% tails) has the lowest probability. If I'm right, Matzke cannot say that "any other individual sequence of 500 heads/tails also has the same probability", because sequences that fall in the '50% heads set' have the highest probability.
Box
December 15, 2013, 07:50 PM PDT
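Box's counting argument can be made exact with binomial coefficients: there are C(500, 250) distinct sequences with exactly 250 heads, but only one sequence with 500 heads. A sketch using Python's standard-library `math.comb`:

```python
from math import comb

# Number of distinct 500-flip sequences with exactly k heads is C(500, k).
exactly_half = comb(500, 250)  # on the order of 10**149 sequences
all_heads = comb(500, 500)     # exactly 1 sequence

# Every individual sequence has probability 2**-500, but the *proportion*
# 50% heads is realized by vastly more sequences than 100% heads.
print(exactly_half > 10**148, all_heads)
```

This is the precise sense in which Matzke's point (all sequences are equiprobable) and Box's point (proportions near 50% dominate) are both true: the equiprobable sequences are distributed very unevenly across proportions.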
it [the law of large numbers] is a theorem in mathematics with no relevance to reality
Neil, are you so determined to disagree that you'll say something so outrageous? That has to be the quote of the day.
scordova
December 15, 2013, 07:47 PM PDT
The LLN doesn’t say anything about the outcome of a single experiment. It is only relevant to the averaged outcome of many experiments.
I mentioned that every coin can be treated as an independent trial when the entire set is shaken, or if the coins are individually shaken, so the LLN applies. You're not even reading what I wrote, much less refuting it. Keep trying, or simply acknowledge I was right.
scordova
December 15, 2013, 07:39 PM PDT
Dr. Matzke, to what do I owe the honor of a visit by the world's top lobbyist for Darwin?
Your actual argument should be something about independent specification of a pattern. You’re the ID advocate, I shouldn’t have to help you out!
All coins heads is an independent specification. You're the one who is making a clueless remark. But I chose this example to show that the chance hypothesis can be rejected in some cases even without an explicit independent specification.
Any other individual sequence of 500 heads/tails also has the same probability. So the law of large numbers doesn’t get you anywhere.
Wrong; the law of large numbers says the 500 coins should approach the expectation value of 50% heads, whereas 100% heads lies the maximum number of standard deviations from it that is possible in principle (approaching 22 sigma). Hence, as a practical matter, the chance hypothesis can be rejected. If you were in a chem lab and performed some basic experiment that had been done a zillion times before by other experimenters, but it gave you results 22 sigma from expectation, wouldn't you think you'd raise some eyebrows? :-) Now if you want to argue that 500 fair coins all heads is consistent with the predicted experimental expectation for a random process, go right ahead. If you want to say that you accept the chance hypothesis for this case, feel free to record such an assertion. If not, I take it you agree that I've successfully demonstrated the chance hypothesis, practically speaking, can be rejected. So what say you, Nick: is chance a practical explanation for 500 fair coins all heads or not?
scordova
December 15, 2013, 07:37 PM PDT
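For concreteness, the exact probability behind that 22-sigma figure is easy to compute: the single all-heads outcome has probability 2^-500, roughly 3 × 10^-151. A one-line check (Python; 0.5**500 is a power of two, so the float value is exact):

```python
# Exact probability that 500 independent fair flips all land heads.
p_all_heads = 0.5 ** 500
print(p_all_heads)  # roughly 3.05e-151
```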
Few notions in math are accorded the status of law. We have the fundamental theorem of calculus, the fundamental theorem of algebra, and the fundamental theorem of arithmetic — but the law of large numbers is not just a theorem, it is promoted to the status of law, almost as if to emphasize its fundamental importance to reality.
This is nonsense. It is just a theorem. That it is called a law has to do with its history. It isn't any kind of promotion of a theorem to a law. Moreover, by itself, it is a theorem in mathematics with no relevance to reality. It can have relevance when aspects of reality are modeled with a mathematical model for which the law of large numbers happens to be applicable. For reference, here's a link to the Wikipedia page.
If we have large set of fair coins, there is an expectation that approximately 50% of the fair coins will be heads after a vigorous shaking or flipping of the coins (a random process). That is, the expected value for the proportion of heads is 50%.
The expected value is exactly 50%. If you tossed a bunch of coins and expected exactly 50% to be heads, that would be foolish. "Expected value" is a technical expression in the mathematics, and is not what you should expect to happen. Instead of randomly tossing coins, let's randomly shuffle up the molecules in the atmosphere. The law of large numbers says you should get mild weather. However, we actually get thunderstorms, hurricanes, tornadoes, etc. It's only on average that it will be mild. Individual instances can be far from that average. Of course, with the weather we don't have completely random shuffling of molecules. We have other forces such as the Coriolis force, the heat from the sun, the orbiting of the earth around the sun, the effects of volcanoes, etc. Similarly, other forces affect the chemical reactions that can occur. It's just a mistake to assume that you will always get something near the average. You quote this:
if you have 500 flips of a fair coin that all come up heads, given your qualification (“fair coin”), that outcome is perfectly consistent with fair coins, Law of Large Numbers vs. Keiths
However, keiths was talking about the outcome of a single tossing experiment (involving a sequence of 500 flips). As the Wikipedia article clearly points out, "the law of large numbers (LLN) is a theorem that describes the result of performing the same experiment a large number of times." The LLN doesn't say anything about the outcome of a single experiment. It is only relevant to the averaged outcome of many experiments.
Neil Rickert
December 15, 2013, 07:28 PM PDT
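Neil's distinction between the technical "expected value" and what one should literally expect can itself be quantified: exactly 250 heads in 500 flips has probability of only about 3.6%, while landing within about two standard deviations of 250 is close to certain. A sketch with the exact binomial distribution (Python, standard library only; the ±22-head band is an illustrative choice of roughly two standard deviations):

```python
from math import comb

n = 500

def p_heads(k):
    """Exact probability of exactly k heads in n fair coin flips."""
    return comb(n, k) / 2 ** n

p_exact_half = p_heads(250)                              # ~0.036
p_within_2sd = sum(p_heads(k) for k in range(228, 273))  # 250 +/- 22 heads
print(f"P(exactly 250 heads) = {p_exact_half:.4f}")
print(f"P(228..272 heads)    = {p_within_2sd:.3f}")
```

So a single shaking of 500 coins almost never gives exactly the expected value, yet it almost always lands near it; that is the sense in which both sides of this exchange can cite the same distribution.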
I pointed out that, based on the binomial distribution and an expectation value of 50% heads, 100% heads is a violation of the law of large numbers and hence a violation of the chance hypothesis from a practical standpoint.
Any other individual sequence of 500 heads/tails also has the same probability. So the law of large numbers doesn't get you anywhere. Your actual argument should be something about independent specification of a pattern. You're the ID advocate, I shouldn't have to help you out!
NickMatzke_UD
December 15, 2013, 05:14 PM PDT
