Having participated at UD for 8 years now, I can say that criticizing Darwinism and OOL over and over again is like beating a dead horse for 8 years straight. We only dream up more clever, effective, and creative ways to beat the dead horse of Darwinism, but it’s still beating a dead horse. It’s amazing we still have a readership that enjoys seeing the debates play out, given we know which side will win the debates about Darwin…
Given this fact, I’ve turned to some other questions that have been of interest to me and readers. One question that remains outstanding (and may never have an answer) is how much information is in an artifact. This may not be as easy to answer as you think. For example, if I take an uncompressed sound file that is 10 gigabits in size and subject it to various compression algorithms, I may come up with different results. One algorithm may yield 5 gigabits, another 1 gigabit, and another 0.5 gigabits. What then is the size of the file? How many bits of information are in the file, given we can represent it in a variety of ways, all with differing numbers of bits?
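To make the puzzle concrete, here is a minimal Python sketch (the data is a made-up stand-in for the sound file, not a real recording) showing that three standard compressors report three different sizes for the very same input:

```python
import bz2
import lzma
import zlib

# Hypothetical stand-in for an uncompressed file: repetitive data mixed
# with varied bytes, so the three compressors disagree on output size.
data = (b"ABCD" * 1000 + bytes(range(256))) * 50

for name, compress in [("zlib", zlib.compress),
                       ("bz2", bz2.compress),
                       ("lzma", lzma.compress)]:
    print(name, len(compress(data)), "bytes, from", len(data), "bytes")
```

Each compressor reports a different size, so “the” number of bits in the file is not a single well-defined quantity.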
Now, let us see how this enigma plays out for CSI. CSI is traditionally a measure of improbability, since improbability arguments are at the heart of ID. Improbability can be related to the Shannon metric for information: an event of probability P carries -log2(P) bits.
Suppose I have four sets of fair coins, and each set contains 500 coins. Let us label the sets A, B, C, D.
Suppose further that each set of coins is all heads. We assert then that CSI exists, and each set has 500 bits of CSI. So far so good, but the paradoxes will appear shortly.
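For a single set, the 500-bit figure falls right out of the surprisal formula above; a quick Python check:

```python
import math

n_coins = 500
p_all_heads = 0.5 ** n_coins     # exact in floating point: 2**-500
bits = -math.log2(p_all_heads)   # Shannon surprisal, I = -log2(P)
print(bits)                      # 500.0 bits for one all-heads set
```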
If I asked, “What then is the total information content of the four sets of coins?”, one might rightly say:
The probability that all four sets of coins are heads is the probability of all 2000 coins being heads, namely 1 in 2^2000. That improbability corresponds to 2000 bits, and that is the amount of CSI collectively represented by all 4 sets of coins. The amount of CSI is 2000 bits.
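For what it’s worth, here is a quick sanity check of that arithmetic in Python; exact integers are used because 2^-2000 underflows an ordinary float:

```python
# Probability view: 2000 independent fair coins all heads has probability
# P = 1 / 2**2000. For an exact power of two, -log2(P) is just the exponent.
denominator = 2 ** 2000                    # P = 1 / denominator
bits_total = denominator.bit_length() - 1  # exact -log2(P) for a power of two
print(bits_total)                          # 2000 bits across all four sets
```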
But someone might come along and say,
Wait a minute, each set is a duplicate of the others, so we should count only 1 set as having the true amount of information. The information content is no greater than 500 bits: the duplicate sets don’t count, and there is no increase in information because of them. The amount of CSI is 500 bits.
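The two answers track two different measures. Under the probability (surprisal) view, independent events multiply and their bits add. Under a compression view, which echoes the sound-file puzzle above, a duplicate adds almost nothing, because “repeat the first set three more times” is a short instruction. A rough Python illustration of the latter, with zlib as a crude stand-in for ideal compression and random bytes standing in for a generic 500-bit set:

```python
import os
import zlib

one_set = os.urandom(63)   # ~500 random bits standing in for one coin set
four_sets = one_set * 4    # four identical copies of the same set

print(len(zlib.compress(one_set)))    # near the raw size: random data resists compression
print(len(zlib.compress(four_sets)))  # far less than 4x that: duplicates are cheap
```

(The actual all-heads sets would compress to almost nothing even singly; random bytes just make the duplication effect easier to see.)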
So is the total amount of CSI for all 4 sets combined 2000 bits or 500 bits? I think the correct answer is 2000 bits, because CSI is a measure of improbability (Bill mentioned somewhere he thought about using the term “Specified Improbability”). I’ve given the number I think is correct, but what do the readers think? How many bits of CSI are there?
I post this discussion because ID proponents at UD have some disagreement over the issue. I think these are interesting topics to discuss, but we could always go back to beating the dead horse of Darwinism.
NOTES:
HT Eric Anderson, who posed a very thought-provoking question that inspired this thread:
If I measure the Shannon information in one copy of War and Peace and then measure the Shannon information in another copy of War and Peace, do I now have twice as much information as I had?
To help clarify the issues at hand, I thought about sets of coins instead of War and Peace. Readers are invited to weigh in. I think this is a topic that deserves consideration and rigor.