Uncommon Descent Serving The Intelligent Design Community

Darwinism: Why its failed predictions don’t matter

From Wayne Rossiter, author of Shadow of Oz: Theistic Evolution and the Absent God, at his book blog:

It’s an odd pattern. It was this problem that came to mind as I recently revisited Living with Darwin: Evolution, Design and the Future of Faith, by Philip Kitcher. Kitcher is a philosopher at Columbia University, and he specializes in biology. His book was published by Oxford University Press, and was the recipient of the 2008 Lannan Notable Book Award. We should take his views seriously.

His book begins with a forceful assertion: “From the perspective of almost the entire community of natural scientists world-wide, this continued resistance to Darwin is absurd. Biologists confidently proclaim that Darwin’s theory of evolution by natural selection is as well established as any of the major theories in contemporary science.”

This is not really a prediction. But it is a statement that was wrong even before it was penned. More.

People who are committed to intellectual integrity in their own work often miss this central point: Once a bully pulpit like Darwinism has been established, the occupant does not need to be correct, accurate, or even useful. He can be a drag on the system. He can lead the march into degenerate science. He can, incidentally, fix you good if you try to offer an alternative view, however well grounded.

Bullies are not dislodged by being shown to be wrong, only by being successfully opposed. Efforts so far have been commendable, but quite honestly, more is needed.

See also: Biologist Wayne Rossiter on Joshua Swamidass’ claim that entropy = information

Follow UD News at Twitter!

Comments
It's been a long time since this discussion took place. My initial position was counter to the position that ID opponents sometimes take. They say that ANY hand dealt from a standard deck of cards is highly improbable, and yet it happens all the time. From this they then go on to say that highly improbable events happen all the time, and so ID's arguments against evolution involving improbabilities are illegitimate. The thought came to me the other day that this is what took place in this very prolonged discussion: we never took notice of the actual probability scheme we were dealing with. We weren't properly identifying the actual 'event' we had at hand (pardon the pun). Here's what it now looks like to me. We have two independent probabilities that have become confused. We have the probability of dealing a hand. Let that be P(Hand). And we have the probability of a certain set of cards ending up in a 'hand' consisting of 5 cards dealt from a standard, shuffled deck. Let this be P(X1,X2,X3,X4,X5), with the X's being random variables. The probability of the 'event' associated with 'dealing a hand' is then: P(Hand) x P(X1,X2,X3,X4,X5). But P(Hand), as I've argued throughout, is 1. Hence the probability of any "particular" hand, as jdk stipulates, is 1 x P(X1,X2,X3,X4,X5) = P(X1,X2,X3,X4,X5). So, this is where the confusion comes in: we overlook this compound event, which involves both the face value of the cards and the act of shuffling and dealing. So, to say in the case of dealing cards that highly improbable events happen all the time is simply to confuse P(Hand) x P(X1,X2,X3,X4,X5) with P(Hand). PaV
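[A minimal C sketch of the arithmetic PaV describes, not anything PaV himself posted; it prints both the ordered and the unordered count for a five-card deal, since the comment could be read either way:]

#include <stdio.h>

int main(void) {
    /* Five cards dealt in order from a 52-card deck */
    double ordered = 52.0 * 51.0 * 50.0 * 49.0 * 48.0;          /* 311,875,200 */
    /* Five-card sets with order ignored: divide by 5! */
    double unordered = ordered / (5.0 * 4.0 * 3.0 * 2.0 * 1.0); /* 2,598,960 */

    /* P(Hand) = 1: some hand is always dealt.
       P(X1,...,X5) is 1 over the relevant count for any one particular hand. */
    printf("P(particular ordered deal) = 1 in %.0f\n", ordered);
    printf("P(particular 5-card set)   = 1 in %.0f\n", unordered);
    return 0;
}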
Yes, nothing but a theological speculation. jdk
Miller's claim is a theological one unless he only claims to be speculating. hnorman5
Thanks, Phinehas: I appreciate that, and your other comments about the value of this conversation. I too have learned a lot by thinking about the issues and trying to articulate my thoughts. jdk
EA:
Another problem with Miller’s approach: it isn’t “evolution” as the term is understood in academia, a fact which earned him a lot of scorn from the materialist evolutionists.
Indeed. The term is used to mean, as G.G. Simpson put it: "Man is the result of a purposeless and natural process that did not have him in mind." But, of course, the moment that "purposeless" is a demonstrable scientific proposition, ID's efforts to detect purpose are also validated as such. Which puts materialists making such claims in quite the pickle. Phinehas
jdk @429 My apologies if I implied that it was your position. I meant that it was my position, and that what you wrote had informed it, and in fact changed it in significant ways over the course of this thread. Phinehas
Phinehas writes,
I think, given what jdk has argued successfully @105 and @192, that you would not be entitled to claim that the probability of drawing that permutation was 1 in 4 x 10^21.
I disagree that that is my position, or that my posts said that. The question of whether a hand is significant or not is a matter of human cognition. That is an additional element added to the issue of theoretical probability. Every hand has a probability of 1 in 4 sextillion: I have repeatedly said that. That is the probability irrespective of any specification anyone has or hasn't made. I know you and others have argued otherwise, but I don't agree with your interpretation. So, to be clear, the probability is what it is, given a certain set of parameters, irrespective of whether any specifications are made by anyone. I did explain that the probability of getting a significant hand probably gets smaller as the sample space gets larger. I will be amazed, and conclude cheating, if I get all spades in order and not if I get {4 clubs, 7 diamonds, jack diamonds, ace hearts ... in some non-patterned fashion}, but NOT because all spades in order is more or less likely than {4 clubs, 7 diamonds, jack diamonds, ace hearts ... in some non-patterned fashion}. jdk
But let's remember Miller's approach was essentially an effort to have his cake and eat it too: still believe in God while keeping his other foot firmly in the camp of fully materialistic explanations for life and biology. Rather cute, to be sure, but not an impressive intellectual stance to take. One problem with Miller's approach: there is no evidence for it. Another problem with Miller's approach: it isn't "evolution" as the term is understood in academia, a fact which earned him a lot of scorn from the materialist evolutionists. Eric Anderson
Origenes @421 and Phinehas: We, as intelligent beings, have the ability to collapse probabilities. Indeed, this is precisely how we act as causal agents in the real world. I would note that this ability is also a fundamental aspect of the very concept of intelligence, which by its very etymology means: to choose between [contingent possibilities]. Unlike inanimate matter, we have the ability to choose: to act, not just be acted upon. Eric Anderson
jdk:
You have described the basic explanation that Ken Miller uses in “Finding Darwin’s God” for how God interacts with the natural world without violating any physical laws or otherwise “miraculously” intervening in the world.
Interesting. I haven't read "Finding Darwin's God" but I imagine Miller and I would still differ quite a bit on whether or not God was constrained to interacting with the world in this way, or whether it must never be His intention to interact otherwise. Phinehas
DS:
Am I now entitled to claim that the probability of drawing that permutation was/is 1 in 4 x 10^21? Am I entitled to claim a very unlikely event occurred? If so, why?
I think, given what jdk has argued successfully @105 and @192, that you would not be entitled to claim that the probability of drawing that permutation was 1 in 4 x 10^21. This is based on the idea that some percentage of 13-card hands have an admittedly amorphous pre-specification of being "significant." If we suppose that, for a 13-card hand as has been described, the number of potential "significant" hands is 5% of the total number of possible configurations, then I would say the probability of drawing that permutation would be 1 in 4 x 10^21 x 0.05 (or whatever the proper math would be for what I've described, since I have more confidence in my logic than in my math :P). But these are still very long odds, so I think you would indeed be entitled to claim that a very unlikely event occurred. Phinehas
to Dave at 423: I think you have described an inconsistency in some of the ideas being presented about the role specifications play. to Phinehas at 422: (very off-topic!) You have described the basic explanation that Ken Miller uses in "Finding Darwin's God" for how God interacts with the natural world without violating any physical laws or otherwise "miraculously" intervening in the world. jdk
I've probably said just about all I have to say about this subject, but I wonder about one issue that came up earlier. Suppose I draw a 13-card permutation P at random, and it definitely looks "random"---no obvious pattern. I think it's being argued that I am not entitled to then claim that the probability of that outcome occurring was 1 in 4 x 10^21. Or perhaps I just can't claim that a very unlikely event occurred. However, suppose the permutation came up in ascending order of face value (ace, 2, 3, ...), with the suits alternating hearts/diamonds/hearts/diamonds... . I didn't predict this, that's just the permutation that was drawn. Am I now entitled to claim that the probability of drawing that permutation was/is 1 in 4 x 10^21? Am I entitled to claim a very unlikely event occurred? If so, why? daveS
Origenes:
This doesn’t violate quantum randomness, because a selection can be non-random at the macro level, but random at the micro level.
Fascinating insight. I've often thought about a sort of corollary to this. Quantum randomness seems to allow for a high degree of improbability at the micro level so long as the probabilities even out at the macro level. If I were a being who wished to exert control over circumstances, but in a way such that my fiddling could not be definitively denied as a random occurrence, quantum randomness provides quite the convenient setup. Behind its veil, for instance, I could set up the butterfly sneezes that dictate weather patterns to my heart's content so long as the macro-probabilities fell out into the expected patterns. To carry this thought further, supposing I had complete access to the quantum state of every particle, I might, if I so desired, even set up the XOR logical operation described above in someone's brain as a function of my will rather than that of the person to whom the brain belonged. Phinehas
Phinehas: But it is all intuition, feeling like there is an important truth just outside my mind’s ability to grasp it.
Same here. Something that VJTorley once wrote seems somehow relevant:
…[I]t is easy to show that a non-deterministic system may be subject to specific constraints, while still remaining random. These constraints may be imposed externally, or alternatively, they may be imposed from above, as in top-down causation. To see how this might work, suppose that my brain performs the high-level act of making a choice, and that this act imposes a constraint on the quantum micro-states of tiny particles in my brain. This doesn’t violate quantum randomness, because a selection can be non-random at the macro level, but random at the micro level. The following two rows of digits will serve to illustrate my point. 1 0 0 0 1 1 1 1 0 0 0 1 0 1 0 0 1 1 0 0 1 0 0 0 0 1 1 0 1 1 0 1 1 1 0 1 The above two rows of digits were created by a random number generator. The digits in some of these columns add up to 0; some add up to 1; and some add up to 2. Now suppose that I impose the non-random macro requirement: keep the columns whose sum equals 1, and discard the rest. I now have: 1 0 1 1 1 0 0 0 0 0 1 0 1 0 0 0 1 1 0 1 1 0 Each row is still random (at the micro level), but I have now imposed a non-random macro-level constraint on the system as a whole (at the macro level). That, I would suggest, is what happens when I make a choice.
His illustration may have to do with what a 'collapse' from randomness into outcome might be and how we might be involved in that process. Origenes
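[VJTorley's two-row illustration is easy to reproduce; here is a minimal C sketch, not his original code. The row length of 18 is an assumption, and rand() supplies the bits, so the surviving columns will differ from his example:]

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define LEN 18  /* assumed row length; the original rows may differ */

int main(void) {
    int row1[LEN], row2[LEN];
    srand((unsigned)time(NULL));

    /* micro level: two rows of random bits */
    for (int i = 0; i < LEN; i++) {
        row1[i] = rand() % 2;
        row2[i] = rand() % 2;
    }

    /* macro-level constraint imposed "from above":
       keep only the columns whose sum equals 1 */
    printf("kept columns (row1/row2): ");
    for (int i = 0; i < LEN; i++) {
        if (row1[i] + row2[i] == 1)
            printf("%d/%d ", row1[i], row2[i]);
    }
    printf("\n");
    return 0;
}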
Origenes:
Yes correct Kitcher! But there is one small problem, you did not independently specify the hand.
And even that wouldn't be a big deal without, "But you did," which is where I feel like there is a claim that something happened which did not in reality happen. To expand on #412, I just get the sense that the questions we are asking are reaching down to the very core of our experience of reality, especially as it relates to probability and time. But it is all intuition, feeling like there is an important truth just outside my mind's ability to grasp it. Phinehas
Kitcher: “Consider the humdrum phenomenon suggested by [Michael] Behe’s analogy with bridge. You take a standard deck of cards and deal 13 to yourself. What is the probability that you get exactly those cards in exactly that order?
What is the question here? After 13 cards are dealt a particular hand is guaranteed. Probability 1. Works all the time. Furthermore, since there is no independent specification to match, the term "exactly" is completely inappropriate in Kitcher's question, since all possible hands would have been equally "exact". Each and every hand would "exactly" match the post-specification. So, again, what is being asked? Perhaps the question is: What is the probability that the hand exactly matches the post-specification? Well the answer to that is, assuming that post-specification is done accurately: probability 1. Or the question is the following (and this option seems most likely to me):
What is the probability that you get exactly those cards in exactly that order, if you had independently specified it?
Now things make sense:
Kitcher: The answer is one in 4 x 10^21.
Yes correct Kitcher! But there is one small problem, you did not independently specify the hand. - - - - - P.S. I too would like to express my genuine appreciation for jdk and DS. - - - - - P.P.S. Phinehas #412 makes my head spin. Origenes
Thanks, Phinehas. The questions from you and others here have been very challenging and interesting as well. daveS
I realize we are getting to a point where it may feel like we are talking past each other and that this can cause tension and frustration to rise. Before that happens, I would really like to express my genuine appreciation to jdk and DS for what has been a very enlightening and fascinating discussion for me, helping me to nail down exactly what I believe about the situation and why through their wonderfully challenging questions and objections. Seriously. THANK YOU. Phinehas
jdk:
No “that” means the particular throw. I am looking at the coin and heads is up. That is what “that” refers to.
OK. You are looking at the coin and heads is up. Is it possible for heads not to be up? If that question causes some confusion, it should. It is the same confusion that is swirling around this whole notion of trying to retrofit a particular onto something generic that already happened. Phinehas
DS: But you didn't say "3" you said "the number that was generated." This is more like saying "n" than saying "3". And "n" is a variable. It is a stand-in that can mean any one of the possible numbers. If "n" is not initialized before the random event, then it is undefined. Assigning it a value after the fact doesn't change this. Phinehas
If your “that” means “I got one of the two possibilities
No "that" means the particular throw. I am looking at the coin and heads is up. That is what "that" refers to. jdk
Phinehas,
The problem is that there is no first “particular number”. There is simply “the number that was generated” which was not a “particular number” but a “generic number”. The probability of getting a “generic number” is 1 in 1.
"Generic number"? In this experiment, exactly one number comes up. Just like when you roll a 6-sided die, you get exactly one element of {1, 2, 3, 4, 5, 6}. I just rolled a die, and a "3" came up, for example. daveS
I think Origenes said something earlier about probability collapsing. This makes me wonder if we aren't touching a bit on quantum physics here. I'm decidedly a layman on the subject, but it seems to me that the act of "observing" the random event may collapse the probability wave function into a single outcome. This may be why trying to retrofit a "particular" just doesn't work. Anyway, FFT. Phinehas
DS:
There is no second “particular number”. There is simply the number that was generated. The probability of getting that number was 1 in 4 x 10^21, since all the numbers have that same probability of coming up.
The problem is that there is no first "particular number". There is simply "the number that was generated" which was not a "particular number" but a "generic number". The probability of getting a "generic number" is 1 in 1. Phinehas
jdk:
Am I justified in saying “The probability of that having happened is 1/2?” I understand what you are saying, but I would like you to assume that your point is made, and just answer my question.
If your "that" means "I got one of the two possibilities" then you will always get one of the two possibilities, which indicates a probability of 1/1 not 1/2. And your "that" more correctly means "I got one of the two possibilities" than it does "I got a particular outcome" when there was not really a particular outcome specified prior to the random event. Phinehas
re 404: I know you said all the same stuff as before, but you didn't answer the simple question. I flip a coin, and get heads. Am I justified in saying “The probability of that having happened is 1/2?” Notice, no one called heads. There was no prior specification. I just flipped a coin. Am I justified in saying “The probability of that having happened is 1/2?” I understand what you are saying, but I would like you to assume that your point is made, and just answer my question. jdk
Origenes,
DaveS: Do you accept the truth of the statement “All outcomes in the experiment of drawing a random 13-card permutation are equally likely”?
Yes of course. And “All outcomes” is, in fact, a (compressed) list of all sextillion possible specifications. Here is a snapshot:
Specification 200645: king of hearts, 10 of spades, 3 of clubs, jack of clubs, 9 of diamonds, queen of hearts, 4 of clubs, 8 of clubs, 10 of hearts, king of diamonds, 6 of diamonds, queen of diamonds, and an ace of spades Specification 200646: king of hearts,...
But you won't accept that:
For all permutations of 13 cards E, E occurs about once in every 4 x 10^21 trials.
without further qualification, is true? daveS
Phinehas,
Yes, but this is not all he is saying. He is equating the random number you generate with a particular number after the fact.
There is no second "particular number". There is simply the number that was generated. The probability of getting that number was 1 in 4 x 10^21, since all the numbers have that same probability of coming up. daveS
jdk:
I flip a coin, and get heads. Am I justified in saying “The probability of that having happened is 1/2?”
Again, the probability of what having happened. 1) The probability of matching that one particular specification? 2) The probability of matching one of the two possible specifications? Repeating myself... The equivocation is in supposing that 2 is the same as 1. It is only 2 that has happened in reality (as opposed to theoretically, or in our imagination), unless one of the two possible specifications has been made “particular” before the random event. You could write out the two possible specifications and attempt to claim based on this that a particular outcome has been specified, but in reality, you’ve only attempted to make all of the possible outcomes “particular” which tends to violate the meaning of the word a bit. In any case, when all possible specifications are “particular,” then your probability of “particular” permutations over possible permutations becomes 2/2, or 1. The equivocation is in then pretending that the probability is 1/2, but that would only be the case if a single particular outcome had been specified in reality prior to the random event rather than theoretically or in our imagination or merely by using the label "particular." Now, I could see an argument being made that this is simply an alternate way of looking at the same thing. That it is an issue of semantics. Or that it is a potato, po-tah-to sort of thing. If that is the case, then I think the perspective Origenes and I have been describing has the added advantage of lining up better with our intuition, expectation, or experience of surprise. The vast gulf between these realities and the idea that "the improbable happens all the time" is not highlighted as well with the example of a coin as it is with something as complex as Hamlet. Here, we can see the difference in sharp relief in the notion that something unimaginably improbable to the point of being practically impossible (given the finite resources of the known universe) can be claimed to happen every time I run a program to generate a string of characters the length of Hamlet. But do I experience any shock at this? Does it violate my expectations in any way? Which is more in line with my intuition or response to what happened in reality? 1) That something has occurred which has a 1 in 3.4 × 10^183,946 chance of occurring? 2) That something has occurred which has a 1 in 1 chance of occurring? For me, my intuition, level of surprise, shock, expectations, etc. all line up much more strongly with 2 than with 1. Which tends to suggest that 2 may be the perspective which lines up more closely with reality. Phinehas
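[A rough check on the Hamlet figure quoted above: assuming a 26-letter alphabet and a text of about 130,000 characters (both are assumptions on my part, not numbers from the thread), matching a pre-specified text of that length comes out to roughly 1 in 10^183,946, consistent with the 3.4 x 10^183,946 cited. A minimal C sketch of that arithmetic:]

#include <stdio.h>
#include <math.h>

int main(void) {
    /* assumed values, not taken from the thread */
    double alphabet = 26.0;      /* letters only, no spaces or punctuation */
    double length   = 130000.0;  /* rough character count for Hamlet */

    /* probability of matching a pre-specified text of that length is
       (1/alphabet)^length; work with the base-10 exponent to avoid underflow */
    double exponent = length * log10(alphabet);
    printf("about 1 chance in 10^%.1f\n", exponent);
    return 0;
}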
Phinehas @403
But what is the probability that you and I are both crazy?
Infinitesimal, I would say. But then again "improbable things happen all the time." Or do they? :) Origenes
The simplest question that I can possibly imagine asking. I flip a coin, and get heads. Am I justified in saying "The probability of that having happened is 1/2?" Phinehas and Origenes, can you just focus on and answer that question, please. jdk
Origenes @374 Thanks! I appreciate your contribution as well. I feel like I totally grok where you are coming from and completely agree. It seems like a rather obvious perspective to me, and I would likely start doubting my own sanity if I were the only one who saw it. But what is the probability that you and I are both crazy? :) Phinehas
DaveS @398
DaveS: Do you accept the truth of the statement “All outcomes in the experiment of drawing a random 13-card permutation are equally likely”?
Yes of course. And "All outcomes" is, in fact, a (compressed) list of all sextillion possible specifications. Here is a snapshot:
..... Specification 200645: king of hearts, 10 of spades, 3 of clubs, jack of clubs, 9 of diamonds, queen of hearts, 4 of clubs, 8 of clubs, 10 of hearts, king of diamonds, 6 of diamonds, queen of diamonds, and an ace of spades Specification 200646: king of hearts, .....
See #401, Phinehas has nailed it. Origenes
jdk: This get's to the heart of the equivocation:
Then, when a hand was dealt, I could look at the hand and look at the list, find the match, and say “Oh yes, the probability was 1 in sextillion.” Is that true?
The question is: The probability of what was 1 in a sextillion? 1) The probability of matching that one particular specification? 2) The probability of matching one of the specifications on your list? The equivocation is in supposing that 2 is the same as 1. It is only 2 that has happened in reality (as opposed to theoretically, or in our imagination), unless one of the specifications on the list has been made "particular" before the random event. By writing out all sextillion possible specifications, you appear to be trying to say that a particular outcome has been specified, but in reality, you've only attempted to make all of the possible outcomes "particular" which tends to violate the meaning of the word a bit. In any case, when all possible specifications are "particular," then your probability of "particular" permutations over possible permutations becomes sextillion/sextillion, or 1/1. Phinehas
Origenes, I keep suggested we consider simpler situations. So I repeat, I actually asked the same question in the other thread, which I’ll repeat here: 1. If I study, in theory, what would happen if I create a random sequence by flipping a coin three times, I can figure out what the possible random sequences are (HHH, HHT, etc) and I can figure out the probability of each of those: P(HHH) = 1/8, P{HHT) = 1/8 etc. Is there anything wrong about that sentence? 2. Then, if I actually flip three coins and get HHT, I can say “the probability of that having happened is 1/8. Is there anything wrong about that sentence? jdk
DS:
Rather, he is saying that any time you do the card draw, you are essentially generating a random number between 1 and 4 x 10^21, each of which is expected to occur once every 4 x 10^21 trials.
Yes, but this is not all he is saying. He is equating the random number you generate with a particular number after the fact. But it was not a particular number before being generated. It was undefined. Then, after the random event, Kitcher tries to assign the value of the particular number back to the random number as though a particular number were generated rather than a random one. Generating a random number is not the same as generating a particular number. A random number can be any one of the possible specifications but a particular number can be only one of the specifications. At the time of the random event, it is only a random number that is matched by the event, not a particular number. The probability of matching a random (unspecified) number with a random generation is 1. The probability of matching a particular (specified) number with a random generation is 1/n. Phinehas
Origenes, Do you accept the truth of the statement "All outcomes in the experiment of drawing a random 13-card permutation are equally likely"? daveS
Jdk @395
If I listed all 4 sextillion hands, and wrote down that the probability of each was 1 in 4 sextillion, would that count as an independent specification?
Good question. In fact DaveS' "all permutations of 13 cards" (see #392) is also a 'list' of all 4 sextillion hands. To answer your question, yes, it is an independent specification, or rather, it is a list of all possible independent specifications.
Then, when a hand was dealt, I could look at the hand and look at the list, find the match, and say “Oh yes, the probability was 1 in sextillion.”
The probability that you find a match on the list is 1. Try this with one independent specification, then, and only then, you have your 1 in sextillion probability. Origenes
Origenes,
Inexact word usage which fails to distinguish between the following two events:
I don't think I've ever heard of a requirement that this distinction be made. As far as I know, "everyone" accepts the truth of the statement "For every outcome E in the experiment of flipping a fair coin, E occurs about once in every two trials". daveS
I actually asked the same question in the other thread, which I'll repeat here: 1. If I study, in theory, what would happen if I create a random sequence by flipping a coin three times, I can figure out what the possible random sequences are (HHH, HHT, etc) and I can figure out the probability of each of those: P(HHH) = 1/8, P{HHT) = 1/8 etc. Is there anything wrong about that sentence? 2. Then, if I actually flip three coins and get HHT, I can say “the probability of that having happened is 1/8. Is there anything wrong about that sentence? jdk
If I listed all 4 sextillion hands, and wrote down that the probability of each was 1 in 4 sextillion, would that count as an independent specification? Then, when a hand was dealt, I could look at the hand and look at the list, find the match, and say "Oh yes, the probability was 1 in sextillion." Is that true? jdk
DaveS @392:
For all permutations of 13 cards E, E occurs about once in every 4 x 10^21 trials.
Inexact word usage which fails to distinguish between the following two events: (1) For all permutations of 13 cards E — with an independent specification —, E occurs about once in every 4 x 10^21 trials. (2) For all permutations of 13 cards E — with a specification informed by the outcome —, E occurs each and every time. Origenes
Rossiter says:
If your life depended on you dealing yourself a king of hearts, 10 of spades, 3 of clubs, jack of clubs, 9 of diamonds, queen of hearts, 4 of clubs, 8 of clubs, 10 of hearts, king of diamonds, 6 of diamonds, queen of diamonds, and an ace of spades—in that order—you are a dead man. If you dealt a hand every second for the next 1,000,000,000,000,000 years, you would (on average) get that hand once. That’s how the argument is supposed to work.
Since all outcomes in the experiment are equally likely, we can say:
For all permutations of 13 cards E, E occurs about once in every 4 x 10^21 trials.
So whether the experiment is performed in the past, present, or future, it is true that:
Every time the experiment is performed, the outcome is one which occurs about once in every 4 x 10^21 trials.
Therefore:
When I drew a 13-card permutation last week, the outcome was one that occurs about once in every 4 x 10^21 trials
Finally:
When I drew a 13-card permutation last week, the outcome that occurred was one that has probability about 1/(4 x 10^21)
daveS
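[For reference, the 4 x 10^21 figure used throughout the thread is just the count of ordered 13-card deals from a 52-card deck, 52 x 51 x ... x 40; a minimal C check of the arithmetic:]

#include <stdio.h>

int main(void) {
    /* number of equally likely 13-card permutations from a 52-card deck */
    double permutations = 1.0;
    for (int cards_left = 52; cards_left > 52 - 13; cards_left--)
        permutations *= cards_left;

    printf("13-card permutations: %.3e\n", permutations);       /* ~3.954e+21 */
    printf("probability of any one of them: %.3e\n", 1.0 / permutations);
    return 0;
}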
Eric Anderson @388 Kitcher smuggles the specification in after the hand has been dealt. The specification is informed by the outcome. This is not how it is supposed to go. As Rossiter points out, we need an independent specification and then try to match it with the outcome.
Rossiter: You can try that ‘til the cows come home. .... If your life depended on you dealing yourself a king of hearts, 10 of spades, 3 of clubs, jack of clubs, 9 of diamonds, queen of hearts, 4 of clubs, 8 of clubs, 10 of hearts, king of diamonds, 6 of diamonds, queen of diamonds, and an ace of spades—in that order—you are a dead man. If you dealt a hand every second for the next 1,000,000,000,000,000 years, you would (on average) get that hand once. That’s how the argument is supposed to work.
Origenes
Leaving aside Kitcher's mistake, Eric, I believe you agreed above that the probability of his having been dealt the exact, particular hand he was dealt is 1 in 4 sextillion. True? jdk
jdk @387
jdk: But do you agree that if you throw a 1, you are justified in saying “the probability of that happening was 1/6”?
We have to be very careful about wording here. Throwing a die and the subsequent ‘automatic collapse’ into a particular outcome has no probability of 1/6. Instead it has a probability of 1. Similarly, dealing 13 cards, which automatically collapses into a particular 13-card hand, has a probability of 1. So, in the context of throwing a die, what exactly has a probability of 1/6? The probability is 1/6 for an independent specification to match the outcome of the throw. The specification must originate independently of the outcome, because if the specification is informed by the outcome (a la Kitcher), then the overall probability remains 1. And here is why — the ‘Kitcher method’:
(1) throw of a die, which automatically collapses into a particular outcome — probability 1. (2) specification informed by the outcome — probability 1.
Both events have a probability of 1, so the overall probability is 1 — contrary to Kitcher’s claim. I argue that the only way to get to a 1/6 probability is to introduce an independent specification. Origenes
Origenes @386: Yes, part of the problem is that Kitcher completely ignores the critical aspect of specification (probably because he doesn't understand the design inference). So he tricks himself (and unfortunately others) into thinking that he is comparing apples and apples. He's not. He's comparing apples and oranges. Without taking into account specification his exercise is meaningless and amounts to nothing more than the pedestrian observation that if you are dealt a hand of cards you will get a hand of cards. He is falling victim to the two mistakes I outline in my new OP. Eric Anderson
But do you agree that if you throw a 1, you are justified in saying "the probability of that happening was 1/6"? jdk
jdk: ... I would say that “the probability of throwing a 1, 2, 3, 4, 5 or 6 = 1.
I agree. Kitcher's trick is this: he throws a die, getting 1, 2, 3, 4, 5 or 6 (probability = 1), and next claims that it is a 1/6 event. I know you (jdk) don't want to discuss Kitcher, but for those who do I think this is it. Kitcher deals a 13-card hand, and whether the outcome is X, Y, or whatever, he claims that we witnessed a 1 in 4 x 10^21 event. "Improbable things happen all the time". Not so. Origenes
Good. I never said (2). So you and I agree that "the probability of throwing a 1 is 1/6". Instead of (2) (which I did not say) I would say that "the probability of throwing a 1, 2, 3, 4, 5 or 6 = 1". This just means that when I throw a die, the number that comes up must be one of the numbers in the sample space, which is the set {1 2 3 4 5 6}. This is a trivial result. Then when I state that all the elements in the sample space are equiprobable, which is the case, that means that the probability of throwing a 1 is 1/6, the probability of throwing a 2 is 1/6, etc. This is really some language ambiguity that ought to be easy to clean up. That is why I wrote "If I roll a die, each possibility (a 1 or 2 or 3 etc.) has a probability = 1/6", so that it would be clear that I wasn't merely saying what you wrote in (2). jdk
jdk @383
jdk: That is, the probability of throwing a 1 is 1/6, the probability of throwing a 2 is 1/6, etc.
What do you mean exactly? (1) the probability of throwing a 1 is 1/6. or (2) the probability of throwing a 1, 2, 3, 4, 5 or 6 is 1/6 - - - - I agree with (1) and disagree with (2). Origenes
Then why did you disagree when I wrote 1) above: 1. If I roll a die, each possibility (a 1 or 2 or 3 etc.) has a probability = 1/6 That is, the probability of throwing a 1 is 1/6, the probability of throwing a 2 is 1/6, etc. That's all I'm saying. Do you agree with the part following "That is," ... jdk
Jdk: I am not talking about Kitcher or his argument.
But I am and guess what, it is the topic of this thread.
Jdk: If I roll a die 600 times, I will get approximately 100 1’s, 100 2’s, 100 3’s, etc.
Yes of course. Origenes
I am not talking about Kitcher or his argument. I am asking a simple question about probability. Is this statement true: If I roll a die 600 times, I will get approximately 100 1's, 100 2's, 100 3's, etc. jdk
jdk @379 Good questions jdk. Regarding (1), rolling a die automatically collapses into having a particular number — this is guaranteed. Regarding (2), pre-specifying a number and then rolling a die does not automatically collapse into a match. That is the difference between (1) and (2). This difference tells us that the probabilities of the two events are not the same. (1) is an event a la Kitcher and succeeds every time. Each time one rolls a die a particular number comes up, and each time one can claim a hindsight probability of 1/6. Event (2) succeeds in only 1/6 of the tries. Again, there is a difference. Origenes
Origenes, I'm asking you to think of a simpler situation so we can understand the general situation better: Do you believe either or both of the following statements are true? 1. If I roll a die, each possibility (a 1 or 2 or 3 etc.) has a probability = 1/6 2. If you pre-specify a number, and then roll a die, the probability that the die will match your pre-specified number = 1/6 jdk
Jdk @375 If I understand you correctly, you are saying that pre- or post-specification makes no difference wrt probabilities. That does not reflect reality. In the real world buying a lottery ticket is obviously not an improbable event. Subsequently looking at the number and calculating probabilities after the fact should not change that. What is (unfortunately) improbable in reality is winning a lottery. It cannot be the case that winning a lottery has the same probability as buying a ticket. Right? - - - - edit: Buying a ticket "collapses into" having a particular number. Having a ticket does not collapse into winning a lottery. Origenes
My argument is that Kitcher’s claim, “But you did, and you have witnesses to testify that your records are correct,” is akin to running Variant 2 or 3, but pretending you’ve run Variant 1.
I'm not sure that you have read Kitcher's quote in context. His whole point is that we should not be impressed by the fact you can calculate a very small probability for almost any observation. In other words, he is, more or less, saying don't confuse Variant 3 for Variant 1. wd400
Phinehas, Yes, 10^-9 is correct. I actually don't think what Kitcher says is analogous to any of these programs. There is no comparison of a randomly generated number with a particular number. Rather, he is saying that any time you do the card draw, you are essentially generating a random number between 1 and 4 x 10^21, each of which is expected to occur once every 4 x 10^21 trials. This is not a case of doing something and then pretending to have done something else. daveS
Origenes, perhaps you will look at 364:
If there are n lottery tickets and all are sold, the probability that you got the particular number you did is 1/n. That is true whether you write anything down the week beforehand, or not. If you did write a number down the week before, the probability that the number you wrote down matches the actual ticket you bought is also 1/n.
I'll note I set up a spreadsheet (see 361) that demonstrates the above with dice. jdk
Phinehas #371 #373 I very much appreciate your effort.
I don’t think the issue is in the mathematics but in the rhetoric of how the problem is set up. To avoid the ambiguity of the English language, perhaps the rigor of a computer language would help.
This may very well be the way forward. Origenes
DS: That is what I would expect as well (10^-9 I think). My argument is that Kitcher's claim, "But you did, and you have witnesses to testify that your records are correct," is akin to running Variant 2 or 3, but pretending you've run Variant 1. Phinehas
Phinehas, I think the probabilities are 10^-10, 1, and 0. I could be misreading something though. daveS
jdk, DS: As I have been saying, I don't think the issue is in the mathematics but in the rhetoric of how the problem is set up. To avoid the ambiguity of the English language, perhaps the rigor of a computer language would help. I offer the following three variants of a program:
// Variant 1
int a_particular_integer;
int a_random_integer;

a_random_integer = RNG();          // Random number between 0 and 999999999
a_particular_integer = 123456789;  // Int is pre-specified

if (a_particular_integer == a_random_integer) {
    // What is the probability this executes?
}
// Variant 2
int a_particular_integer;
int a_random_integer;

a_random_integer = RNG();                 // Random number between 0 and 999999999
a_particular_integer = a_random_integer;  // Int is defined by result of RNG

if (a_particular_integer == a_random_integer) {
    // What is the probability this executes?
}
// Variant 3
int a_particular_integer;  // Int is unspecified
int a_random_integer;

a_random_integer = RNG();  // Random number between 0 and 999999999

if (a_particular_integer == a_random_integer) {
    // What is the probability this executes?
}

a_particular_integer = a_random_integer;  // Int is post-specified
For each of the variants, what is the probability that the code inside the if block will execute? Where appropriate, please support your answer with an explanation. Phinehas
PaV: FYI, my OP is up here: https://uncommondescent.com/intelligent-design/confusing-probability-the-every-sequence-is-equally-improbable-argument/ Eric Anderson
Origenes, do you agree with my statements in 364? Are you willing to look at the spreadsheet? This has nothing to do with magic, or whether someone is a creationist. jdk
Origenes (and everyone), I have a suggestion. We can use the symbols P1, P2, ..., Pk to denote the 13-card permutations (so k is about 4 x 10^21). We all agree that for every i from 1 to k, P(Pi) = 1 in 4 x 10^21. (Or for notational purists, P({Pi}) = 1 in 4 x 10^21). P(guessing the next permutation) = 1 in 4 x 10^21 also. Furthermore, P({P1, P2, ..., Pk}) = 1. We could use this notation rather than the ambiguous phrase "getting any 13-card permutation", or even "getting any particular 13-card permutation". This or similar notation might eliminate some of the confusion and talking past one another. daveS
I mean, there isn't, is there? If you assume the lottery is only drawn once all possible tickets are sold (and each is sold only once) then the sentence you reproduce above is true (right?) and not in conflict with anything I have said. wd400
Wd400:
Origenes: So the chance of writing down the winning numbers of a lottery a week in advance is equal to having a particular lottery ticket?
There is no magic in writing down a hand.
If so, then there is also no magic in writing down the winning numbers of a lottery a week in advance. Perhaps one has to be a 'creationist' to see the magic in certain things. Origenes
So, the probability of writing down a 13-card hand and subsequently getting it is equal to getting any 13-card hand? Don’t you guys see the problem here?
No. There is no magic in writing down a hand. The deck of cards doesn't change because you did so. Exactly one of the possible ordered hands matches the specification, so the odds are 1/S. If all you mean is that declaring a minute probability for a particular observed event after the fact isn't very impressive, then you are making Kitcher's point for him. I'm not sure why this simple little point is meeting such resistance. wd400
I hope you look at my spreadsheet, Origenes. I think it addresses the exact issue you are thinking about. If there are n lottery tickets and all are sold, the probability that you got the particular number you did is 1/n. That is true whether you write anything down the week beforehand, or not. If you did write a number down the week before, the probability that the number you wrote down matches the actual ticket you bought is also 1/n. Please study my spreadsheet, where n = 6, because this is easier to see with real numbers that we can relate to. jdk
Origenes,
So, the probability of writing down a 13-card hand and subsequently getting it is equal to getting any 13-card hand? Don’t you guys see the problem here?
I would say your statement is correct if you replace "any" by "any particular", as per PaV's suggestion. As written, I don't agree with it. Going back to your die example, the probability of matching a 5 is equal to the probability of {1}, and to the probability of {2}, and so on. No one is saying that the probability of matching the 5 is equal to the probability of {1, 2, 3, 4, 5, 6}. Have you thought about my question in #341? I've answered a whole string of your questions in the meantime. daveS
Jdk:
Origenes: So, the probability of writing down a 13-card hand and subsequently getting it is equal to getting any 13-card hand?
Yes, if by “getting any 13-card hand?” you mean getting the particular hand you got out of all possible hands. If you mean what I think you mean, there is no problem.
So the chance of writing down the winning numbers of a lottery a week in advance is equal to having a particular lottery ticket? "There is no problem"? Really? Origenes
Yes, if by "getting any 13-card hand?" you mean getting the particular hand you got out of all possible hands. If you mean what I think you mean, there is no problem. It would be much better to think about simpler situations, where we can visualize the events better. Here is a link to a shared google spreadsheet that illustrates these two scenarios we have been talking about, in terms of rolling a die 720 times. Probability dice Refresh the page to get a new set of numbers so you can run multiple trials. The spreadsheet uses a random number generator to pick a random number from 1 to 6. It does this 720 times in order to get a fairly large sample. Column A just rolls 720 dice and counts the number of times each number shows up. These will be about 1/6 (17%) for each number. This is to be expected. Columns C and D pick a random pre-specification and then roll the die to see if the specification matches the roll. Column E has a 1 for a match and 0 for no match. This will also happen about 1/6 of the time. There isn't any way to let you make your own specification before each die throw - you'll have to throw your own dice for that, but that would be equivalent to the second scenario. Note the scenario of making a specification and getting a match is exactly the same (1/6) as just getting a particular number, such as 5 or 2 or any one of the other numbers. Is there anything surprising about this? jdk
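[For anyone who prefers code to a spreadsheet, here is a minimal C sketch of the same two scenarios; rand() stands in for the spreadsheet's random number generator, with 720 trials as above:]

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void) {
    int counts[7] = {0};   /* counts[1..6]: how often each face comes up */
    int matches = 0;       /* how often a random pre-specification is matched */
    const int trials = 720;

    srand((unsigned)time(NULL));
    for (int i = 0; i < trials; i++) {
        int spec = rand() % 6 + 1;   /* the pre-specification, chosen first */
        int roll = rand() % 6 + 1;   /* the die roll */
        counts[roll]++;              /* tally faces, as in column A */
        if (roll == spec)            /* tally matches, as in columns C-E */
            matches++;
    }

    for (int face = 1; face <= 6; face++)
        printf("face %d: %d of %d rolls\n", face, counts[face], trials);
    printf("rolls matching the pre-specification: %d of %d\n", matches, trials);
    return 0;
}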
Jdk #359
1) has a probability of 1/6 3) The probability of throwing any one of the numbers, irrespective of whether anyone pre-specifies it or not, is also 1/6.
So, the probability of writing down a 13-card hand and subsequently getting it is equal to getting any 13-card hand? Don't you guys see the problem here? Origenes
Yes, Origenes is not clear. I think by 2 he means what is the probability that you will get one of the numbers in the sample space, which is 1, of course. However, I don't see why that situation contrasts with 1, which appears to be the probability of a roll matching a prior specification. 1) has a probability of 1/6 2) has a probability of 1, if he means what I think he means. But 3) The probability of throwing any one of the numbers, irrespective of whether anyone pre-specifies it or not, is also 1/6. jdk
It is a very simple event really. Just throw a die, and see if you get a particular number.
Just cross posted. If you do mean the event {1, 2, 3, 4, 5, 6}, then they have different probabilities. The probability of that event, which is the entire sample space, is of course 1. If you mean one of the singleton events I listed, they each have probability 1/6, as does (1). daveS
DaveS @356 It is a very simple event really :) Just throw a die, and see if you get a particular number. Origenes
Origenes,
To be clear, do the following two events have equal probability: (1) writing down “5”, subsequently throwing a die, getting a “5”. (2) throwing a die, getting a particular number — which can be 1,2,3,4,5 or 6.
It's not clear to me what the last event is. Could you list the outcomes in this event? Edit: I'm not sure if in (2) you're talking about one of {1}, {2}, {3}, {4}, {5}, {6}, or if you mean {1, 2, 3, 4, 5, 6}. If you tell me which, I can answer the question. daveS
DaveS, Jdk To be clear, do the following two events have equal probability: (1) writing down “5”, subsequently throwing a die, getting a “5”. (2) throwing a die, getting a particular number — which can be 1,2,3,4,5 or 6. Origenes
I highly encourage people to run the following experiment: Pick a number (either the same one each time or a different one each time), roll the dice, and see if you have a match. Keep track of the number of matches. Do this enough times (I recommended 96) to get some good data. I predict you will get a match about 1/6 of the time. What do you predict? Am I wrong or right about what will happen? I highly encourage you to try it. Let's get some real world data to discuss. jdk
jdk: [and others]
1) Write down a number between 1 and 6, throw a die, and see if the result matches the number you wrote down. Do this quite a few times, say 96, writing a new number down each time. Keep track of how many matches there are.
If you read @347, you'll see I make a big thing about "matches." PaV
Origenes,
Wait a minute … Event (1) involves checking if my pre-specified “5” matches the outcome! This is different from “just rolling the die”.
Yes, certainly.
You tell me.
Well it's your experiment, so you have to tell us. I think you have in this last post, though, so all is well. The two events are equally likely as jdk said. You can test it. Write down "5", roll the die, and check whether it comes up "5". Repeat a large number of times, then divide the number of times the die did come up 5 by the number of trials. It will likely be close to 1/6 with a large number of trials. daveS
DaveS @349
How is the 5 chosen? If it’s also chosen at random from the numbers 1 through 6 (so essentially another die roll), the events are not equally likely. (1) has probability 1/36 rather than 1/6.
Agreed. Let's suppose that the 5 is chosen by free will and that probability does not apply, IOWs the probability of choosing "5" is 1.
If you do simply write down a 5, and the experiment consists of just rolling the die, they are equally likely.
Wait a minute ... Event (1) involves checking if my pre-specified "5" matches the outcome! This is different from "just rolling the die".
You need to state exactly what the experiment is, what the sample space is, and what the probability distribution is.
You tell me. - - - - - Jdk @348
2) Select a number, such as 5. Throw the dice 96 times and count the number of 5’s.
As Phinehas said, this is not what I said. Origenes
@358 No, (2) does not match what Origenes said. No number was selected first. The number was defined by the throw. Phinehas
Origenes,
Tell me this. What is the difference wrt probability between (1) and (2): (1) writing down “5”, subsequently throwing a die, getting a “5”. (2) throwing a die, getting a “5”. Are both events supposed to be equally probable? If so, then there is something very wrong here.
Edited due to reading jdk's post: How is the 5 chosen? If it's also chosen at random from the numbers 1 through 6 (so essentially another die roll), the events are not equally likely. (1) has probability 1/36 rather than 1/6. If you do simply write down a 5, and the experiment consists of just rolling the die, they are equally likely. You need to state exactly what the experiment is, what the sample space is, and what the probability distribution is. daveS
re 345. Yes, both are equally probable. Run an experiment yourself. 1) Write down a number between 1 and 6, throw a die, and see if the result matches the number you wrote down. Do this quite a few times, say 96, writing a new number down each time. Keep track of how many matches there are. 2) Select a number, such as 5. Throw the dice 96 times and count the number of 5's. I predict you will get success about 1/6 of the time in both cases, proving that you are wrong when you say "Are both events supposed to be equally probable? If so, then there is something very wrong here." Both events are equally probable, and the experiment will show you are wrong. jdk
If the probability involved in selecting numbered balls in a lottery is, let's say, 1 in 5 x 10^7, then according to what has been said here, EACH ticket has a probability of 1 in 5 x 10^7. Now the drawing of the balls, along with the selling of tickets with ordered arrangements of numbers, constitutes what a lottery IS. One, without the other, is pointless. Now, we presume 50 million tickets are sold. The probability of a lottery is the combination of the 'events' associated with every lottery ticket sold, and the selection of the balls. Since the sales of individual tickets are independent events (sometimes there's multiple winners; hence, there's replacement), and the selection of the numbered balls is independent of the selling of tickets (or else there's fraud), then the probability of winning a lottery is the combined event wherein your ticket matches the numbers selected. Hence, two independent events. Hence the probability of winning . . . And, here, I stop. Because something is wrong in all of this. I now change my mind about several things: 1.) The probability of dealing a hand of cards, or being dealt a hand of cards, is 1.0. 2.) The probability of selecting thirteen cards from a deck of 52 cards face down on a table is also 1.0 3.) The probability of writing down an hypothetical hand of cards is 1.0 4.) The probability of selecting your own numbers for a lottery ticket, or receiving a machine-generated one, is 1.0 Now, where does all this come from? I'm not sure. But it seems to me that when we deal with 'probabilities,' we're dealing with a kind of 'dual space,' to borrow from vector calculus. The only REAL probabilities have to do with matching events. And, it is "matching" that is CRITICAL in all truly probabilistic events. From an ID perspective, this is exactly what a "specification" is: it is a "match" between one probabilistic event that is "dual" to another probabilistic event. So, e.g., I flip a coin, and if it's heads, I write down a 1, and if it's tails, I write down 0. If we enter "dual" space--or what is its equivalent, "sample space"--then we can say that there is a "probability of 50% that the coin will be heads on average." Now we can make all of this 'real' by writing down the outcomes. (But this only makes the numbers we're writing down "existential," and not truly REAL.) But wait a second: I flip a coin. The probability of it being either a heads or a tails is 1.0. I roll a die: the probability that it will be either a 1, 2, 3, 4, 5, or 6 is 1.0. Dealing cards: 1.0; selecting cards from a deck on a table: 1.0, etc. To try and close in on what I mean by "dual space" and the "match" this requires, let's look at a couple of things: 1.) The chance of selecting numbers for a lottery is 1.0. The chance of selecting numbered balls from some kind of vacuum chamber is 1.0. A true lottery consists in "matching" your sequence of numbers with the numbers that came up using the vacuum chamber. Your ticket, and the numbers from the vacuum chamber, are "dual" to one another. The numbers from the vacuum chamber are "dual" to all the lottery tickets, and they are "dual" to the chosen numbers. 2.) The chance of getting a Bridge hand is 1.0. The chance of writing down an hypothetical Bridge hand is 1.0. The probability of "matching" these two probabilities that exist in the 'dual space' of sample space (that is, that the hand you received and the hand you wrote down are the SAME) is 1 in 4 x 10^21. Other examples could be given, but I think we all see the main idea here.
Probabilities, in and of themselves, do not exist in our world; rather, they exist in the world of "sample space." [Just as an aside, the Schrodinger Equation actually doesn't exist in the real world, but in the realm of 'configuration space' according to Schrodinger; and it is something that is almost universally lost sight of, because, using the equation, you end up with the right results.] For probabilities to become "real," something must be "dual" to them, and "match." So, e.g., I roll two dice and I come up with 10 7's in a row. Can I now go to Las Vegas and say, "Give me some money, I rolled 10 7's in a row"? Of course not. But when I do go to Las Vegas, I put money down before rolling the dice, and the casino "matches" the bet: I'm saying I'll roll a number, and then "match" it, before I roll a 7 or an 11; meanwhile, the casino says that I can't. Because of this "matching," the probability has now left "sample space," or "dual space," and is now REAL, since the numbers rolled by the dice must now "match" certain outcomes. I think this is where all the confusion exists. IOW, when we say things---about cards, dice, or lottery tickets---that "improbable things happen all the time," what we're really saying is that all kinds of "improbable" things happen in "sample space," or, better yet, "dual space" ["dual space" can be thought of as existential "sample space"]. The truly improbable events, or REAL events involving improbability, occur when two "improbable" events in "dual space" match up. This "matching" involves a one-to-one mapping of one event to another. And, so, unless something is "specified," all the events that are commonly called "improbable" are only events in "dual space." And this is what Dembski was getting at, and what is at the heart of ID. If we take these notions over to biology, then the "machinery of the cell" that produces proteins from DNA can be thought of in much the same way as the machinery that is responsible for printing out a lottery ticket. There must be some kind of random number generator involved; there must be some kind of electronic communication taking place to validate the purchase; there's the printer, etc. So, based on what I've been saying, the chance of printing a lottery ticket is 1.0. And, so, the probability of DNA producing a protein is 1.0 (well, not exactly, just in the same way as a lottery ticket printer can break down, and then need to be fixed [by intelligent agents :) ]). Now, I said that the probability of producing a protein via cell machinery is 1.0, and then qualified it. Well, the probability of duplicating DNA is 1.0. Well, actually, it's not. It's more, as we know, like 0.9999999. And, so, real probabilities occur. (that is: the 'actual' mutation rate of cells) Well, if there are no real improbabilities involved in cell machinery, then what is ID's argument? Well, ID's argument has ALWAYS been about the improbabilities involved with one kind of 'pattern' "matching" another specified 'pattern,' which also includes improbability. So, the probability of producing Protein A is 1.0 (roughly), and that of producing Protein B is 1.0 (roughly). That's been my argument concerning hands and lottery tickets. But we know that in the cell, somehow, Protein A has to MATCH Protein B in "sequence space" if they are to form a 'protein complex.' What is the minimum number of amino acids that need to form bonds in order for the 'complex' to form? I'll just guess 10 as a minimum. Well, we're dealing with a "match" in "amino acid sequence space." So, this is a REAL probability.
And we know that the cell uses roughly 20 different amino acids. What, then, is the REAL probability (not in the "dual space" of "amino acid sequence space," since now we have a "match") of this "match" occurring (like a lottery ticket "matching" the numbers selected from a vacuum chamber, or like a hand "matching" the Bridge hand we wrote down right before the deal)? It is 1 in 20^10, or roughly 1 in 10^13. And what if, as we know happens in cells, there are protein complexes made of several proteins, let's say 5? Then the probability is roughly 1 in 10^65.

I just googled this: "largest protein complex in yeast," yeast being the simplest example of a eukaryote. Here's one of the answers. It says that the largest complex is made of 11 proteins. Then, to be conservative in calculating the probabilities, let's say only 5 amino acids are needed for a valid binding site, and since each of the 11 sites has been bound together independently of the others, we must multiply the individual probabilities together. We then have [20^5]^11. When I used an online calculator, it came back with this answer: "Infinity." Well, not quite. It is [20^5]^11 = [2^5]^11 x [10^5]^11, roughly (3.6 x 10^16) x (10^55), so the probability is about 1 in 3.6 x 10^71.

Can "errors" in the cell's machinery bring about this REAL improbable event? No, intelligent agency is needed. That's the argument ID makes. PaV
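As a quick check of the arithmetic in the comment above, here is a minimal Python sketch; the biological inputs (10 matching residues for one interface, 5 residues per site across 11 sites, 20 amino acids) are PaV's own assumed figures and are not verified here.

# Sanity check of the figures cited above (assumed inputs, illustration only).
one_interface = 20 ** 10          # ~1.0e13
five_proteins = (20 ** 10) ** 5   # ~1.1e65
eleven_sites = (20 ** 5) ** 11    # ~3.6e71

print(f"20^10     = {one_interface:.3e}")
print(f"(20^10)^5 = {five_proteins:.3e}")
print(f"(20^5)^11 = {eleven_sites:.3e}")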
jdk: I do not agree with this part:
Whichever one happens has a 1 in 4 x 10^21 probability of happening.
Whichever one happens has a 1 in 1 chance of happening. ***** I am not a programmer, but this is what comes to mind when thinking about this issue.
int a_particular_integer;
int a_random_integer;
a_random_integer = RNG();
if (a_particular_integer == a_random_integer) {
    // Kitcher's claim that something improbable happened is TRUE
}
a_particular_integer = a_random_integer;
Though a_particular_integer purports to be a particular integer (after all, that's what we've called it), it is in reality undefined at the time of the evaluation. The fact that it gets defined after the fact doesn't help. Assuming this would compile, once a_particular_integer is defined, we could then jump back up to the 3rd line and do a proper evaluation. Note that jumping back up to the 4th line is not the same as jumping to the 3rd line, but Kitcher acts as though it is! This is the equivocation. Phinehas
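A minimal sketch (Python, hypothetical names, using a die for simplicity) of the contrast Phinehas is drawing: fixing the "particular" value before the random draw, versus filling it in afterwards from the outcome. In the first case the match rate tracks the stated probability; in the second the "match" is guaranteed by construction.

import random

TRIALS = 100_000

# Case 1: the target is specified BEFORE the draw ("jump to line 3").
target = 5                                   # pre-specified face of a die
pre_hits = sum(random.randint(1, 6) == target for _ in range(TRIALS))

# Case 2: the "target" is defined AFTER the draw ("jump to line 4").
post_hits = 0
for _ in range(TRIALS):
    roll = random.randint(1, 6)
    target_after = roll                      # specification copied from the outcome
    post_hits += (roll == target_after)      # always true

print(f"pre-specified match rate : {pre_hits / TRIALS:.3f}  (about 1/6)")
print(f"post-specified match rate: {post_hits / TRIALS:.3f}  (always 1.0)")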
DaveS Tell me this. What is the difference wrt probability between (1) and (2): (1) writing down "5", subsequently throwing a die, getting a "5". (2) throwing a die, getting a "5". Are both events supposed to be equally probable? If so, then there is something very wrong here. Origenes
I see that both Phinehas and Origenes are in today's conversation, so I ask them both what they think of 340 also. jdk
Yes, those are true statements. Can you give me a thumbs-up or down on my #341? daveS
The probability of 5 coming up is 1/6. The probability of 1, 2, 3, 4, 5, or 6 coming up is 1. Origenes
Origenes,
So in order to meet the requirement of a pre-specification you offer the ‘pre-specification’ “each permutation”, which entails all possible outcomes? That doesn’t make sense, since a specification is supposed to be about a subset of all possibilities.
So I take it you would not agree with this statement, either?
Consider the experiment of rolling a fair six-sided die and recording the number that comes up. Then each number between 1 and 6 inclusive is expected to be recorded about once every six trials.
daveS
There are 4 x 10^21 specifications, just like when you flip a coin there are two specifications.

Specification #1 is a particular hand: say all spades in order ...
Specification #5,668,559,453 is a particular hand: say {3 clubs, 5 diamonds, jack spades, 4 clubs, 4 hearts, 9 hearts, 10 spades, ace clubs, queen clubs, ace diamonds, 6 diamonds, 7 clubs, 3 hearts} ...
Specification #4 x 10^21 is a particular hand: say all diamonds in order.

Those are all the possibilities: 4 x 10^21 individually specified hands. One of them will happen. Whichever one happens has a 1 in 4 x 10^21 probability of happening.

Specification #5,668,559,453 has a 1 in 4 x 10^21 probability of happening.
Specification #5,668,559,454 has a 1 in 4 x 10^21 probability of happening.
Specification #5,668,559,455 has a 1 in 4 x 10^21 probability of happening.

And so on for all the rest of the 4 x 10^21 specified hands. Do you agree with all that, Phinehas, and if not, what part and why? jdk
DaveS #338 So in order to meet the requirement of a pre-specification you offer the 'pre-specification' "each permutation", which entails all possible outcomes? That doesn't make sense, since a specification is supposed to be about a subset of all possibilities. Origenes
This is getting ridiculous. Let me try once more to find a statement we can agree upon. How about this?
Each permutation of 13 cards is expected to occur about once every 4 sextillion trials.
daveS
DaveS: But let’s say I remove the “expected”: Every time I press the button to start the program a 13-card permutation will be drawn which occurs only about once every 4 sextillion trials.
Doesn't work. The expectation is still in there — "which occurs" means "which is expected to occur". Probabilities are about expectations concerning the occurrence of pre-specified events. Origenes
Phinehas, Please see my reformulation in #334. daveS
DS:
What do you think of this restatement: Every time I press the button to start the program a 13-card permutation will be drawn which would be expected to occur only about once every 4 sextillion trials.
I think the following is just as valid, and has the added benefit of lining up well with actual expectations:
Every time I press the button to start the program a 13-card permutation will be drawn which would be expected to occur every time I push the button.
Phinehas
Origenes, We could have determined that every permutation of 13 cards is expected to occur about once every 4 sextillion trials before we drew the cards. But let's say I remove the "expected":
Every time I press the button to start the program a 13-card permutation will be drawn which occurs only about once every 4 sextillion trials.
What do you think now? daveS
DaveS @332 How about this:
Every time I press the button to start the program a 13-card permutation will be drawn which would be expected to occur only about once every 4 sextillion trials, IF it was specified in advance.
The point is of course that it was NOT specified in advance. How can you expect something if it is not specified in ADVANCE? Origenes
Phinehas, What do you think of this restatement:
Every time I press the button to start the program a 13-card permutation will be drawn which would be expected to occur only about once every 4 sextillion trials.
daveS
DS:
“But you did” clearly means “but you did draw that particular permutation”.
But you will always draw that particular permutation when that particular permutation is defined after the fact by some 13-card permutation that was drawn. That's why we get the absolute weirdness of this:
Every time I press the button to start the program [that generates a random string with the length of Hamlet], what almost surely will never happen is almost surely guaranteed to happen.
If what Kitcher is doing is allowed, then what is almost surely guaranteed to happen is that the program will generate a particular string that it almost surely would never generate. And this guaranteed occurrence of the never-gonna-happen happens every time I push the button. (Note that when I say "a particular string" here, I am using precisely the equivocation at the root of this issue, but if Kitcher is allowed to use it, I should be able to do so too. So, we must all pretend I am actually talking about a particular string, since I've used the words, even though I am not.) It's almost as though Kitcher, et al have an unshakable belief in the miraculous, for how else would you describe the miraculous but as an occurrence which likely never would have happened by chance? Phinehas
Eric Anderson,
C’mon, that is precisely why he equivocates. His example does not line up with Behe’s, even though he offers it as a refutation of Behe.
Here's the structure of Kitcher's statement:
What is the probability that you get X? The answer is one in 4 x 10^21 But you did.
What do you think he means by "But you did"? The natural reading would be, "But you did get X". If you don't think he means "But you did get X", but rather "But you did get Y" where Y is not X, how do you tell what Y actually is? daveS
jdk @325: Thanks for the clarification. So you've been discussing the design inference, but not in the particular cases of biology or the cosmos. Fair enough. Eric Anderson
daveS @327:
Through the entire passage, he is referring only to the event and probability of a particular permutation being drawn.
C'mon, that is precisely why he equivocates. His example does not line up with Behe's, even though he offers it as a refutation of Behe. Complete failure on Kitcher's part. Probably not intentional. Likely he just doesn't understand the design inference. Eric Anderson
Phinehas, No. Kitcher does not equivocate. Nowhere does he confuse the probability of some permutation being drawn with the probability of a particular permutation being drawn. Through the entire passage, he is referring only to the event and probability of a particular permutation being drawn. "But you did" clearly means "but you did draw that particular permutation". daveS
DS:
It involves equivocation. “what almost surely will never happen” corresponds to a particular 13-card permutation. “what is almost surely guaranteed to happen” is that some 13-card permutation will be drawn.
And Bingo was his name-o! Yes, you've nailed it exactly. It involves equivocation between these two things. 1) A particular 13-card permutation 2) Some 13-card permutation will be drawn But this is the exact same equivocation involved in Kitcher's original statement:
“Consider the humdrum phenomenon suggested by [Michael] Behe’s analogy with bridge. You take a standard deck of cards and deal 13 to yourself. What is the probability that you get exactly those cards in exactly that order? [1] The answer is one in 4 x 10^21…But you did [2], and you have witnesses to testify that your records are correct.” [emphasis mine]
The first highlighted section is talking about a particular 13-card permutation. The second is talking about the fact that some 13-card permutation was drawn. It is this equivocation that I've been pointing out since #5. Phinehas
Eric writes,
So the discussion has inevitably turned to specification and design detection.
I think specification is just part of probability theory: if I say the probability of heads = 1/2, heads is the specification: you have to state the probability of something to even make a probability statement.

As the discussion progressed, I became interested in what I think I called post-specifications: a sense after the fact that a result had such a significant pattern that we might wonder if it was truly chance. Someone with absolutely no interest in design arguments about the world, of the sort that go on here at UD, is going to be surprised, and concerned about fairness, if his bridge opponent has all 13 spades, for instance.

In fact, fairness is a standard aspect of probability theory. In basic theory, we assume the coin is fair: exact 50-50 odds. However, in more advanced theory, which leads to very important real-world applications, we can look at whether an unlikely particular result (say 90 out of 100 heads) is more likely to be the result of a weighted coin than it is to be the low-probability result of a fair coin. So specification, significance, and "cheating" are all aspects of probability theory to discuss.

With all that said, what I am not interested in, for various reasons, and haven't discussed, is common design issues such as fine-tuning of the universe, protein structure, the structure of genes, etc. Others occasionally bring those up, but I haven't entered into any discussion about them. jdk
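A minimal sketch (Python) of the comparison jdk describes: how 90 heads in 100 flips weighs against a fair coin versus a weighted one. The weighted coin's bias of p = 0.9 is an illustrative assumption, not a figure from the discussion.

from math import comb

n, k = 100, 90

def binom_pmf(n, k, p):
    """Probability of exactly k heads in n flips of a coin with P(heads) = p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

p_fair = binom_pmf(n, k, 0.5)        # ~1.4e-17
p_weighted = binom_pmf(n, k, 0.9)    # ~0.13

# Probability of a result at least this extreme under the fair-coin hypothesis.
p_tail_fair = sum(binom_pmf(n, i, 0.5) for i in range(k, n + 1))

print(f"P(exactly 90 heads | fair coin)    = {p_fair:.2e}")
print(f"P(exactly 90 heads | p = 0.9 coin) = {p_weighted:.3f}")
print(f"P(90 or more heads | fair coin)    = {p_tail_fair:.2e}")
print(f"likelihood ratio (weighted : fair) = {p_weighted / p_fair:.2e}")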
PS: Some UD posts of reference:

On the stat mech issue: https://uncommondescent.com/intelligent-design/of-s-t-r-i-ng-s-nanobots-informational-statistical-thermodynamics-and-evolution/
Second law: https://uncommondescent.com/informatics/ud-guest-post-dr-eugen-s-on-the-second-law-of-thermodynamics-plus-vs-evolution/
Hoyle, Walker & Davies: https://uncommondescent.com/intelligent-design/hoyle-with-updates-from-walker-and-davies-on-cosmological-fine-tuning-a-common-sense-interpretation-of-the-facts-suggests-that-a-super-intellect-has-monkeyed-with-the-physics-as-well-as-the-c/
Collins on Fine Tuning: https://uncommondescent.com/fine-tuning/robin-collins-on-cosmological-fine-tuning/
On FSCO/I: https://uncommondescent.com/intelligent-design/btb-q-where-does-the-fscoi-concept-come-from/

kairosfocus
JDK (and others), I suggest you ponder the key statistical mechanics result on the dominance of certain clusters of states in systems, through the relative statistical weight of said clusters consistent with given macro conditions. Any given individual microstate will typically be vastly improbable in a highly contingent system. But states can be reasonably clustered, and a strong pattern emerges: overwhelmingly predominant groups. In that context, the weight of those groups overwhelms that of special, specifiable clusters, indicating the almost certain direction of spontaneous change and the persistence in the overwhelming clusters in the absence of constraints.

So, we can see that microstates from the predominant group will be unsurprising in a way that contrasts dramatically with allegedly spontaneously getting to special, specifiable, compressed-description clusters. Indeed, this is close to the statistical form of the second law of thermodynamics. This then directly ties to the issue of islands of function deeply isolated in vast config spaces dominated by non-functional states, and it is a red herring led away to a strawman caricature to in effect divert from this issue.

The design inference is connected to a known way to get to islands of function: intelligently directed configuration. Indeed, the only empirically validated way, as is evident from the text strings in this thread.

I suggest you ponder L K Nash's discussion of the binomial distribution for cases of, say, 500 bits or 1,000 bits with possible values 0, 1. You will find a dominant peak near 50:50 0/1 for even odds. Special clusters will be maximally implausible on blind search but can readily be explained on design, e.g. ASCII-coded text strings. Mandl gives an easy physical instantiation: a weak B field giving defined direction to a paramagnetic substance with a like number of elementary units, say atoms for convenience. We can extend to magnetic media and chance and/or mechanical-necessity-driven random text generation vs writing text in English or a machine code, etc. KF kairosfocus
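A minimal sketch (Python) of the binomial concentration kairosfocus points to: for 500 bits with even odds, the overwhelming majority of the 2^500 configurations lie close to the 50:50 peak, while any single fully specified string is a vanishing fraction. The +/- 25 window is an arbitrary illustrative choice.

from math import comb

n = 500
total = 2 ** n

# Fraction of all 500-bit strings whose number of 1s lies within +/- 25 of 250.
near_peak = sum(comb(n, k) for k in range(225, 276))
print(f"fraction with 225..275 ones: {near_peak / total:.4f}")   # the large majority

# By contrast, a single fully specified 500-bit string is one configuration in 2^500.
print(f"one specific string: 1 in {total:.3e}")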
jdk @320: As mentioned, I haven't gone back to read all 300+ comments in this thread, but thank you for pointing me to your two specific comments. You have some excellent thoughts, and although we could quibble over some details, there is much we agree upon. You have even hinted at some of the very things I wish to cover in my OP if I get around to it. Hey, don't steal my thunder! :)
But people keep protesting that no, it is wrong to say of a bridge hand that “this hand was improbable”, so it doesn’t seem to be such a pedestrian conclusion.
Was this in the context of whether design can be detected? Was it inferred? If not, and it was clear from the discussion that it was a pure probability analysis and not any kind of argument against design (more technically correctly stated: that the examples were given under the assumption that there was not design), then I agree with you.
One of the things I have done in this thread is ponder a bit the ideas of specifications and significance, because I find those interesting also. I have pointed to 105 and 192 as long posts where I reflect and brainstorm a bit.
I see. So the discussion has inevitably turned to specification and design detection. Just as I suspected. :)
In general, I am interested in the role of chance and causality, and the ideas associated with the fact that each moment flows into the next one in ways that include what are, at least from our perspective, contingent events. However, I also am a retired math teacher, and just like probability as a subject, in part because grasping it correctly is so slippery.
Well said and good stuff. Definitely interesting topics worth discussion. ----- I apologize if it sounded like I was trying to dispute your motives. Indeed, I was trying to assume, and still do assume, they were as you stated. However, the interesting part of the probability discussion is indeed in relation to specification and design detection. If I can get my OP up soon, I'll be very interested in your thoughts and additional ideas. Eric Anderson
Reading back over this, I see a typo in my comment at 315. This thread being so awash with misunderstanding already, it seems best to re-write it in full.

Before anyone writes something down, and presuming a uniformly random probability of writing down a particular hand, then sure, the probability of writing down a specific hand in order and then drawing that same hand in the same order is 1/(4 x 10^21)^2.

The probability of writing down a hand (under the same provisos as above) and then getting it is 1 in 4 x 10^21. Another way of seeing this is that the probability of matching the hand (in order) once you have written it down is 1 in 4 x 10^21 (the hand you just wrote down is one of 4 x 10^21 possible hands, after all). wd400
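A minimal sketch (Python) of wd400's point, scaled down to a hypothetical 6-card deck with 2-card ordered hands (6 x 5 = 30 possibilities) so the frequencies are easy to see: writing down some hand and then matching it happens about 1 time in 30, while writing down one fixed, pre-chosen hand and also matching it happens about 1 time in 30^2.

import random

DECK = list(range(6))      # hypothetical 6-card deck
TRIALS = 900_000

def draw_hand():
    return tuple(random.sample(DECK, 2))   # an ordered 2-card deal

fixed_target = (0, 1)      # one particular pre-chosen hand
match_written = 0          # deal matches whatever hand was written down
match_fixed = 0            # written hand AND deal both equal the fixed target

for _ in range(TRIALS):
    written = draw_hand()  # "writing down" a uniformly random hand
    dealt = draw_hand()
    match_written += (written == dealt)
    match_fixed += (written == fixed_target and dealt == fixed_target)

print(f"P(deal matches the written hand)      ~ {match_written / TRIALS:.5f} (1/30 = {1/30:.5f})")
print(f"P(written AND dealt equal one target) ~ {match_fixed / TRIALS:.6f} (1/900 = {1/900:.6f})")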
Eric writes,
I should add that I expressed to jdk some doubt that anyone would be spending a lot of time on the “improbable things happen all the time” observation outside of the design context, because the observation is quite pedestrian and not worthy of particular fascination as a general matter.
But people keep protesting that no, it is wrong to say of a bridge hand that "this hand was improbable", so it doesn't seem to be such a pedestrian conclusion. And there have been more issues than just that one that have stimulated my "fascination" and my participation in this discussion. I'm not quite sure why being interested in the ways people do and do not understand probability, and enjoying discussing and trying to clarify what I see as misconceptions, should strike one as odd.

One of the things I have done in this thread is ponder a bit the ideas of specifications and significance, because I find those interesting also. I have pointed to 105 and 192 as long posts where I reflect and brainstorm a bit.

In general, I am interested in the role of chance and causality, and the ideas associated with the fact that each moment flows into the next one in ways that include what are, at least from our perspective, contingent events. However, I also am a retired math teacher, and just like probability as a subject, in part because grasping it correctly is so slippery. And I get addicted to somewhat argumentative discussion sometimes. :-) So I ask that you don't try to read more into my participation in this thread than is warranted by what I have written. jdk
PaV @294: I have not followed this entire thread, having joined late, and I doubt I will have an opportunity to go back and read any of the first couple hundred comments. I am agreeing with jdk as to a specific point: namely, improbable things happen all the time. I have also been taking him at his word (which he has stated to me at least twice in the last 48 hours) that he is approaching this topic not with reference to design detection, but simply as a matter of basic probability. I should add that I expressed to jdk some doubt that anyone would be spending a lot of time on the "improbable things happen all the time" observation outside of the design context, because the observation is quite pedestrian and not worthy of particular fascination as a general matter. As a potential argument against design, however, it is like Medusa, with various incarnations of the argument repeatedly rearing their ugly heads. Your comment suggests to me that indeed there may be more going on here than just a fascination with the naked concept of probability. I suspect that many other commenters on this thread are also interested in the topic, not as a naked discussion of probability, but precisely because they think it has something to do with the design inference. In any case, I am more than happy to grant the point that improbable events are exceedingly common. It is true. It is readily observable. And it has no negative impact on the design inference. Indeed, the design inference takes it into account. ----- Lastly, as I said, I haven't reviewed all your comments, but my suspicion is that you are driving at an important point -- one that I hinted at to Origenes @276. I thought perhaps I had already posted an OP on this issue, but in reviewing my posts I don't see it now, although I did find a partially-completed post in my drafts folder (one of many). If I get some time in the next 48 hours I'll dust off my notes and do a new OP on this important issue. In brief, it has to do with the categorization of specification, and there are some important nuances people need to keep in mind if they are to avoid being misled by the "improbable things happen all the time" line of argumentation. Eric Anderson
WD400, in cases of FSCO/I the obvious specificity is being found on a deeply isolated island of function in a config space large enough to utterly swamp the sol-system or observed-cosmos scale atomic resources used in a blind search. 500 - 1,000 bits does it. You really need new objections; things like this have long since passed any reasonable sell-by date. KF PS, Walker and Davies tell us a lot about such islands:
In physics, particularly in statistical mechanics, we base many of our calculations on the assumption of metric transitivity, which asserts that a system’s trajectory will eventually [--> given "enough time and search resources"] explore the entirety of its state space – thus everything that is physically possible will eventually happen. It should then be trivially true that one could choose an arbitrary “final state” (e.g., a living organism) and “explain” it by evolving the system backwards in time choosing an appropriate state at some ’start’ time t_0 (fine-tuning the initial state). In the case of a chaotic system the initial state must be specified to arbitrarily high precision. But this account amounts to no more than saying that the world is as it is because it was as it was, and our current narrative therefore scarcely constitutes an explanation in the true scientific sense. We are left in a bit of a conundrum with respect to the problem of specifying the initial conditions necessary to explain our world. A key point is that if we require specialness in our initial state (such that we observe the current state of the world and not any other state) metric transitivity cannot hold true, as it blurs any dependency on initial conditions – that is, it makes little sense for us to single out any particular state as special by calling it the ’initial’ state. If we instead relax the assumption of metric transitivity (which seems more realistic for many real world physical systems – including life), then our phase space will consist of isolated pocket regions and it is not necessarily possible to get to any other physically possible state (see e.g. Fig. 1 for a cellular automata example).
[--> or, there may not be "enough" time and/or resources for the relevant exploration, i.e. we see the 500 - 1,000 bit complexity threshold at work vs 10^57 - 10^80 atoms with fast rxn rates at about 10^-13 to 10^-15 s leading to inability to explore more than a vanishingly small fraction on the gamut of Sol system or observed cosmos . . . the only actually, credibly observed cosmos]
Thus the initial state must be tuned to be in the region of phase space in which we find ourselves [--> notice, fine tuning], and there are regions of the configuration space our physical universe would be excluded from accessing, even if those states may be equally consistent and permissible under the microscopic laws of physics (starting from a different initial state). Thus according to the standard picture, we require special initial conditions to explain the complexity of the world, but also have a sense that we should not be on a particularly special trajectory to get here (or anywhere else) as it would be a sign of fine–tuning of the initial conditions. [ --> notice, the "loading"] Stated most simply, a potential problem with the way we currently formulate physics is that you can’t necessarily get everywhere from anywhere (see Walker [31] for discussion). ["The “Hard Problem” of Life," June 23, 2016, a discussion by Sara Imari Walker and Paul C.W. Davies at Arxiv.]
kairosfocus
Specified on what basis? Independently specified or an after-the-fact specified a la Kitcher?
Which of these do you think best describes most creationist probability calculations? wd400
wd400 @314
You are illustrating Kitcher’s point.
I do not think so.
The probability of a particular event depends on how tightly it is specified.
Specified on what basis? Independently specified, or specified after the fact a la Kitcher? I'm attempting to point out that there is a fundamental difference, and that probabilities assigned in the latter way do not conform to reality.
It is easy to start from almost any observation and then assign it a minuscule probability.
Eric Anderson said something important in this respect.
EA: there is an important nuance to address when we actually get down to making the specification, particularly in cases of otherwise non-functional strings, like card deals, coin tosses and the like.
His point may have to do, I believe, with the fact that when we get into arbitrarily treating random non-functional events as wholes — while in fact they are not — things no longer make sense. IOWs, what Kitcher does is treat a dealt 13-card hand as a whole, as something specific and meaningful, while, in fact, it is just a snapshot, if you will, of an ongoing random process of card shuffling. Yet in other words, "dffufdhvk.vmrp9u-4u80ytg9" should be treated differently than "I think therefore I am". Origenes
PaV, I don't know why you persist in this error that the number of unordered hands is ~4x10^21. It has been demonstrated to you many times that this is not the case. As for your latest comment: Before anyone writes something down, and presuming a uniformly random probability of writing down a hand, then sure, the probability of writing down a specific hand in order then drawing that same hand in order is 1/(4 x 10^21). The probability of writing down any hand then getting it is 1 in (4 x 10^21)^2. Another way of seeing this is that the probability of matching the hand (in order) once you have written it down is 1 in 4 x 10^21 (the hand you just wrote down is one of 4x10^21 possible hands, after all). wd400
Origenes, You are illustrating Kitcher's point. The probability of a particular event depends on how tightly it is specified. It is easy to start from almost any observation and then assign it a minuscule probability. We shouldn't be easily impressed by such calculations. When someone somewhere sees the face of Jesus burnt in their toast, most sane people shrug their shoulders and make nothing of it. That's partly because lots of people burn their toast every day (there have been many trials) and partly because lots of burn patterns look a bit like a face, especially to our face-spotting brains (it's not that tightly specified). wd400
So we have:

(1) The probability of getting a particular 13-card hand is 1/S.
(2) The probability that a 13-card hand matches an independent specification is 1/S.

Now the question is: if these probabilities are the same, then why is it that (1) is guaranteed*, while (2) (almost) never happens? "Improbable events happen all the time" only goes for type (1) events and never for type (2) events? If so, why do they have the same probability of occurring?

- - -

* thanks to Kitcher's after-the-fact specification

Origenes
jdk:
1 in 4 x 10^21
You've stated that the probability of a hand being dealt is this very probability. I submit that the probability of receiving the exact hand of cards you wrote down in anticipation of the hand you would shortly receive is also 1 in 4 x 10^21. So, in the ONE event that comprises the 'dealing of hand' and the 'writing down of the sequence of the hand you will receive,' being one that involves TWO independent events, I would suppose the probability of that event is (1 in 4 x 10^21)^2. Or, put another way, the probability you give, is it for the dealing of the hand, or the writing down of the exact hand? PaV
Phinehas,
DS: Is there no contradiction in this? Every time I press the button to start the program [that generates a random string with the length of Hamlet], what almost surely will never happen is almost surely guaranteed to happen. Or, why is the above not an accurate depiction wrt probability theory?
It involves equivocation. "what almost surely will never happen" corresponds to a particular 13-card permutation. "what is almost surely guaranteed to happen" is that some 13-card permutation will be drawn. daveS
Phinehas,
DS: If this is taken literally, one experiment of dealing 13 random cards leading to a particular hand of 13 cards would still have a 1/1 probability. Did you mean theoretically performed?
Yes, and I mean over a large number of trials. I'm basically just trying to describe the frequentist conception of probability. daveS
Phinehas,
LOL! This is just an appeal to authority couched in different language. I may not be as sharp on the math as others, but I can spot a rhetorical ploy a mile away. And as I’ve been saying, the issue with Kitcher and many others on this thread is not their mathematics, but their rhetorical sleights-of-hand.
Nonsense. I have a very elementary understanding of probability and associated philosophical issues, which I understand are significant. I'm not in charge of "accepted probability theory", so I am sincere when I say if you have some suggestions which might bring probability theory more in alignment with reality, by your lights, have at it. daveS
DS:
I know you didn’t ask me, but if you think there is some problem with accepted probability theory, you are certainly free to suggest some modifications to improve it.
LOL! This is just an appeal to authority couched in different language. I may not be as sharp on the math as others, but I can spot a rhetorical ploy a mile away. And as I've been saying, the issue with Kitcher and many others on this thread is not their mathematics, but their rhetorical sleights-of-hand. Phinehas
DS:
An event has low probability if and only if it occurs a small number of times compared to the total number of times the experiment is performed.
If this is taken literally, one experiment of dealing 13 random cards leading to a particular hand of 13 cards would still have a 1/1 probability. Did you mean theoretically performed? Phinehas
DS: Is there no contradiction in this? Every time I press the button to start the program [that generates a random string with the length of Hamlet], what almost surely will never happen is almost surely guaranteed to happen. Or, why is the above not an accurate depiction wrt probability theory? Phinehas
Phinehas,
This is really the crux of my objection, so I do wish you’d at least attempt to address it rather than simply appealing to the authority of accepted probability theory. I get the accepted probability theory, but am questioning whether it makes sense when compared to reality.
I guess I'm guilty of the same thing. I know you didn't ask me, but if you think there is some problem with accepted probability theory, you are certainly free to suggest some modifications to improve it. Maybe they will catch on. I don't see a pressing need to do so, but I'm not a probabilist or statistician. daveS
Phinehas,
Suppose I have a computer program that will spit out a random string the length of Hamlet. We know that the odds of it spitting out a specified configuration (such as Hamlet) are so small that we can say it almost surely will never happen.* Does it really make sense that every time I press the button to start the program, what almost surely will never happen is now almost surely guaranteed to happen? In a theoretical system where the above is the case, might we not suppose it is the theoretical system that is flawed rather than reality?
I think you're reading more into the concept of "low probability" than is justified. An event has low probability if and only if it occurs a small number of times compared to the total number of times the experiment is performed. This doesn't mean that events which actually do occur generally have reasonably large probability. Many of these 13-card permutations we have been talking about might actually occur just one or even zero times in the complete history of our universe. This is despite the fact that people around the world play bridge all the time, where the permutations are generated constantly. There really is no contradiction in saying that one can generate very low probability events at will. daveS
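A minimal sketch (Python) of the frequentist point daveS is making: every deal yields some ordered 13-card hand, yet across a large number of deals no particular hand is expected to recur, because each one has probability about 2.5 x 10^-22.

import random
from math import perm

deck = list(range(52))
deals = 100_000

seen = set()
for _ in range(deals):
    hand = tuple(random.sample(deck, 13))   # an ordered 13-card deal
    seen.add(hand)

print(f"deals made          : {deals}")
print(f"distinct hands seen : {len(seen)}")          # almost surely equal to the number of deals
print(f"possible hands      : {perm(52, 13):.3e}")   # ~3.95e21
print(f"probability of each : {1 / perm(52, 13):.2e}")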
When flipping a coin the probability you will get one of the possibilities (heads or tails) is exactly 1 (not nearly 1).
In reality, it is only nearly 1, since you might have a heart attack, or the coin might land on its edge, or...
When you flip the coin, you now have an actual event. That particular event was heads, and the probability of it having happened was 1/2.
In reality, there is no probability of something having happened. If it has happened then, in reality, the probability is 1.0. It is certain that it has happened, so talk of probability is rather meaningless at that point. Again, the issue here isn't in the mathematics, but in the sleight-of-mind, ambiguity of language, imagination, etc. What actually happened, apart from imagination, was that a fair coin was flipped to see which of the two potential outcomes would be instantiated. One of the two potential outcomes was then observed.
End of case. I will once again give up. This is fundamental and basic accepted probability theory.
But if basic accepted probability theory brings you to this:
Suppose I have a computer program that will spit out a random string the length of Hamlet. We know that the odds of it spitting out a specified configuration (such as Hamlet) are so small that we can say it almost surely will never happen.* Does it really make sense that every time I press the button to start the program, what almost surely will never happen is almost surely guaranteed to happen?
Might we not suppose it is the theoretical system that is flawed rather than reality? This is really the crux of my objection, so I do wish you'd at least attempt to address it rather than simply appealing to the authority of accepted probability theory. I get the accepted probability theory, but am questioning whether it makes sense when compared to reality. Phinehas
When you haven’t specified anything, the probability when flipping a coin is nearly 1.0 that you will get one of the possible specifications (heads or tails). The real specification for the flip when no specification is given is heads or tails.
When flipping a coin the probability you will get one of the possibilities (heads or tails) is exactly 1 (not nearly 1). The probability of each is 1/2. It makes no difference whether someone specifies what is going to happen, or not. When you flip the coin, you now have an actual event. That particular event was heads, and the probability of it having happened was 1/2. End of case. I will once again give up. This is fundamental and basic accepted probability theory. jdk
jdk: To see why the above has to be the case, it helps to look at something other than a single coin. Suppose I have a computer program that will spit out a random string the length of Hamlet. We know that the odds of it spitting out a specified configuration (such as Hamlet) are so small that we can say it almost surely will never happen.* Does it really make sense that every time I press the button to start the program, what almost surely will never happen is now almost surely guaranteed to happen? In a theoretical system where the above is the case, might we not suppose it is the theoretical system that is flawed rather than reality? * See: Infinite Monkey Theorem on Wikipedia Phinehas
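For scale, a minimal sketch (Python) of the numbers behind Phinehas's Hamlet example. The text length and alphabet size used here are rough, assumed figures for illustration only.

from math import log10

alphabet = 30        # assumed: letters, space, and a little punctuation
length = 130_000     # rough character count of Hamlet (assumed figure)

# log10 of the probability that one uniformly random string equals one fixed text
log_p = -length * log10(alphabet)
print(f"P(random string == the fixed text) ~ 10^{log_p:.0f}")

# Yet every run of the generator produces SOME string of that length with probability 1.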
jdk:
I flip a coin and it comes up heads. Am I justified in saying that the probability of that having happened is 1/2, even if no one “called heads” before the flip?
The probability of what having happened? When you haven't specified anything, the probability when flipping a coin is nearly 1.0 that you will get one of the possible specifications (heads or tails). The real specification for the flip when no specification is given is heads or tails. What actually happened is that you got one of either heads or tails, exactly as specified. The probability of this is 1/1 and not 1/2.
Would I be justified in saying the probability of this particular flip is 1/2.
But at the time of the flip, it wasn't a particular flip. It was a random flip. We're only trying to pretend it was a particular flip after the fact. It wasn't. If you specify a specific outcome ahead of time, then it becomes a particular flip and not a random flip. Phinehas
1 in 4 x 10^21 jdk
jdk: @ 279, among other statements you said this:
Even in bridge, where order doesn't count, the number of hands is 635 billion, so when you are dealt a hand it is likely that that hand has never been dealt before because the probability of that hand is so low.
So, we're basically in agreement as to how to look on Bridge hands. You also said this:
The probability of your getting the particular hand you get is 1/n, a very small number (2.5 x 10^-22).
Here's two different scenarios:

First scenario: I'm at a table playing Bridge. You say that the odds of getting my hand are one in 635 billion. Okay. But, if we include the order in which they were dealt, it is 1 in 4 x 10^21. Okay, you're saying the odds of my getting the hand I got were 1 in 4 x 10^21.

Second scenario: Before the dealer deals, I write down the suits, numbers/faces, and order of 13 cards. The dealer then deals me my cards. The order, numbers/faces, and suits of the cards I receive match exactly what I wrote down.

What is the probability of the second scenario happening? This is what you said @ 124:
Now, a winning set of numbers, a specification, is chosen by the lottery administrators. The probability of your having the winning ticket is 1 out of 10 billion also. But this is a different probability even though it has the same numerical value.
Aren't the probabilities of "independent" events supposed to be multiplied together? Now, if the probability of getting a lottery ticket, or of getting a Bridge hand is 1.0, then, to me, everything works out. PaV
re 292: You write,
I still think this is only true in some sort of technical, imaginary sense that doesn’t really line up with reality.
It is true in a technical, mathematical sense. You write,
The above rests on an ambiguity of language where we can use the words “particular hand” for a hand that is not really particular. If it is not pre-specified as the exact cards to match ahead of time, then it is not in reality a “particular hand."
I flip a coin and it comes up heads. Am I justified in saying that the probability of that having happened is 1/2, even if no one "called heads" before the flip? Would I be justified in saying the probability of this particular flip is 1/2? How do you answer these questions? jdk
The probability of a bridge hand is NOT 1 in 4 x 10^21. I am really baffled that you don't get that. And yes, you can say what I said about every bridge hand that is ever dealt to anyone: they all have an equal, very small, probability of 1 in 635 billion. jdk
jdk: @127:
But if someone looks at their hand and thinks to himself, “I am virtually certain no one in the universe has ever had this particular hand, even if this game has been played billions of times, because this particular hand, like all hands, has a extremely small probability of happening”, they would be making an accurate statement. Notice that this second statement has nothing to do with specifications. It is just a true statement reflecting the fact that the probability of any hand is 1/S.
I didn't see this before. It was addressed to someone else. jdk, I ask you, if you know that the probability of ANY Bridge hand is 1 in 4 x 10^21, then couldn't that statement have been made of any hand? That is, "looking" at the hand is not necessary. This observation seems to me to indicate that the probability of any Bridge hand is 1 in 52!/(39!)(13!). Why? Because, if I had the time---my Bridge partners might become impatient!---I could sort my cards any way I wanted; and so I could then go through 13! of such combinations. It would always be the same hand I was dealt. It was for this reason, I suspect, that Kitcher had to specify "if order is important." But notice that the probability of the initially dealt hand was, in fact, 1 in 4 x 10^21. All the combinations come from the person holding the hand mixing them up. That is, a secondary event. PaV
Eric A @ 287:
jdk: Therefore, an event of very low probability occurred. Events of very low probability happen all the time.
Eric: Absolutely agree.
What jdk has left out is that there was an earlier discussion about the probability of a hand, and it was pointed out by me that although the probability of any hand is low, there are, as jdk has stated, just that many ways of dealing a hand. They cancel. The probability of dealing out a hand of cards is 1.0. It is NOT an improbable event. Rolling three dice is not an improbable event. What would be improbable is if the dice hit the table they were being rolled on and then jumped in the air and stayed there. That would be improbable. No, the improbabilities involved in the rolling of the dice occur in the minds of intelligent beings, and are "associated" with the rolling of the dice, and no more.

This would be an improbable event: two people throw three dice on the same table and they end up with the same numbers. But a single roll has a probability of one. It's going to happen. In the case of cards, receiving a Bridge hand is completely probable. It, too, is going to happen, as I mentioned above.

Here's an improbable event. I list 13 cards on a piece of paper, and then place the piece of paper with the list of cards facing down. Then I'm dealt a hand. I turn over the piece of paper, and find that I got the exact same hand. (Even if the order isn't the same, we're still dealing with a highly improbable event.) And, Eric, this has to do, of course, with a "specification" being involved in the two examples I've given. Maybe you would like to comment a little further. PaV
PaV: I'm trying to understand, and explain, and I think your subtle point is just confused and wrong. You write,
However, since there are 13! hands that are combinations of one another, to select one of these makes the odds different. The odds are now 13! x 1 in 4 x 10^21, or 13! x 39!/52! = [13! x 39!]/52!.”
Since the one permutation (out of 4 x 10^21) can be rearranged into 13! other permutations, each one of which is the same combination, you have 13! times fewer combinations than permutations. If in the middle of the problem you change your interest from permutations to combinations, then the probabilities have changed because you have changed what you're studying. Nothing subtle there. You write,
This means that there are then (52!)/(39!)(13!) combinations, and 52!/39! permutations of a deck of cards chosen 13 at a time.
This is absolutely true, and has been mentioned a number of times, assuming you have understood the parentheses in the formula: (52!)/[(39!)(13!)] combinations, and 52!/39! permutations, of a deck of cards chosen 13 at a time. But we all agree with that. All the manipulations of how it happens don't change the number of possibilities. And since the probabilities are just the reciprocals of the number of possibilities, we are thus agreed on the probabilities. And I read over your other two examples, and I'm convinced you are confused. If we deal hands and declare that order is important, we have a different situation than if we declare that order is not important. That is up to us. And, how can I say it more clearly: in bridge, order is not important, so bridge is not what Kitcher's example is about.
All you have to say, contra Kitcher, is that the odds of receiving a Bridge hand is 1 in 4 x 10^21. Why? Because we're not dealing with entire sample space at once, and the possibility of a combination occurring in normal act of dealing is zero.
This paragraph makes no sense. You don't ever deal with the sample space at once; you select one element out of the sample space. And I have no idea what "the possibility of a combination occurring in normal act of dealing is zero" means. jdk
jdk:
The probability of your getting the particular hand you get is 1/n, a very small number (2.5 x 10^-22). Therefore, an event of very low probability occurred. Events of very low probability happen all the time.
I still think this is only true in some sort of technical, imaginary sense that doesn't really line up with reality. The above rests on an ambiguity of language where we can use the words "particular hand" for a hand that is not really particular. If it is not pre-specified as the exact cards to match ahead of time, then it is not in reality a "particular hand." The pre-specification in reality is "13 random cards dealt from a 52 card deck." It is only a sleight-of-hand (or mind, or language) that would allow us to suppose that the cards dealt are a "particular hand" rather than "13 random cards dealt from a 52 card deck." I believe that it is only the pre-specification in reality that determines how improbable an event actually is. The sleight-of-mind comes with getting the specification confused and supposing that it is a "particular hand" (which is actually unspecified while purporting to be highly specified) when in fact "13 cards dealt from a deck of 52" is the real specification. This is reinforced by the intuition that highly improbable occurrences ought to be surprising. It ought to be surprising to be dealt a "particular hand" but it ought not be surprising at all to be dealt "13 random cards from a 52 card deck." The lack of surprise should clue us in to the reality that nothing truly improbable has happened. Again, it seems like the idea that events of very low probability happen all the time is a violation of common sense on the face of it. When I sit down to deal out a hand, is it true in reality that the unimaginably improbable is not only probable, but a near certainty? Surely, that cannot be right, can it? Surely, there is a better way to look at the issue that doesn't blatantly harm logic and common sense. Phinehas
jdk: @278: I think everyone misses the subtle point I'm stressing (overstressing?).

Imagine the young girl who, for the next twenty centuries, shuffles a deck, places the cards face down on a table, selects 13 cards at random forming a pile, which she then shuffles so that the order of the original selection is lost. She now flips them over, lays them out left to right, and writes down the sequence of cards. The probability of each selection of 13 cards is 1/52 x 1/51 x . . . x 1/40 = 1 in 4 x 10^21.

But, over the centuries, as she writes down all these various hands, she notices that some of them are similar. In fact, they have the same cards, but simply in a different left-to-right order. She finds that, whatever hand she picks, she finds 13! less one more of them. So, she says to herself, "If I'm not interested in the order of the cards, but just that these groups of 13 cards contain the same 13 cards, then my odds of coming up with such a grouping of cards are 13! times higher than if I 'select' ANY one of these orderings of the 13 cards."

She then goes on to find out: "There are 4 x 10^21 hands in total that are different when I select 13 cards out of the deck. The likelihood of EACH ONE of these hands is therefore 1 in 4 x 10^21. However, since there are 13! hands that are combinations of one another, to select one of these makes the odds different. The odds are now 13! x 1 in 4 x 10^21, or 13! x 39!/52! = [13! x 39!]/52!." This means that there are then (52!)/(39!)(13!) combinations, and 52!/39! permutations of a deck of cards chosen 13 at a time. So, all the numbers work out. It's simply a matter of understanding what is happening.

It seems to me, please correct me if you see otherwise, that since the cards, after being selected in a particular order, are then shuffled, the order of the original selection is lost, and no longer has anything to do with what happens later. After shuffling the cards, the cards are then placed from left to right. This is a NEW order, independent of the original selection. And this ordering of the shuffled pile will be completely equivalent to the sample space of all hands formed by selecting 13 cards from a shuffled deck of 52 cards. Whether she shuffles the pile or not, the same sample space will eventually turn up. IOW, ordering is what separates permutations from combinations, not what constitutes the actual probability of any event of 'dealing' a hand of 13 cards.

**********************************

Another example: I have a bag full of 52 balls that are not numbered. I pull out 13 balls. What are the odds of this event? 1.0. Now I have a bag full of 52 balls numbered from 1 to 52. I pull out 13 balls. What is the probability of doing this? 1 in 4 x 10^21, since each individual ball can be distinguished. Now, I look at the balls, and I see what their numbers are. I confirm to you that, indeed, all of the numbers come from the set of numbers 1 to 52, and none of them are duplicated. Do you need more information than that to tell me the probability of the selection? I say, "no." You don't need to know the 'order' of how they were selected. All you need to know for computing its probability is that the balls were numbered, and nothing funny happened.

********************************

Finally: You shuffle a deck of cards, and you deal out all the cards into four hands. What is the probability of those hands? For each of them, courtesy of wd400, the probability, as it should be, is 1 in 4 x 10^21. 
But you say, “No, wait a second, that's if order is important, because that's the only way of getting us away from combinations which can affect the probabilities.” To which I respond: Do you really want to tell me that there is a chance that any two of those four hands dealt are 'combinations' of one another? Of course they're not. So the whole idea of 'combinations' is moot. All you have to say, contra Kitcher, is that the odds of receiving a Bridge hand is 1 in 4 x 10^21. Why? Because we're not dealing with entire sample space at once, and the possibility of a combination occurring in normal act of dealing is zero. This is why people are free to sort their individual hands once they've been dealt. The order in which the cards were dealt affects nothing. PaV
Origenes, incidentally, there is an important nuance to address when we actually get down to making the specification, particularly in cases of otherwise non-functional strings, like card deals, coin tosses and the like. I'm not sure if I've already posted an OP on this before. I'll check my notes and, if needed, may do a quick post on this point sometime in the next week. Eric Anderson
Origenes @276: There is nothing special about Kitcher's example. His is just another case of failing to understand that the design inference is not based on improbability alone. This is the whole point of a specification. I actually have the last sentence you quoted highlighted in my copy of his book. Rather than going into the issue in detail again (there have been previous posts on this), I'll just reproduce here my note in the margin: "This is a failed argument, but requires some careful analysis to understand Kitcher's mistake. The issue is not whether improbable things can happen. They do. All the time. The issue is this: in the context of comparing possible explanations for a complex and specified event, which explanation is most rational: blind chance or intelligence." Eric Anderson
re 287: Great! :-) jdk
jdk @279:
I have no interest in Kitcher: either what use he makes of probability in his book or the way he phrased this situation. I am interested in the probability theory.
Fair enough.
Therefore, an event of very low probability occurred. Events of very low probability happen all the time.
Absolutely agree. I'm not sure I have a lot to add, as we are in agreement here. ----- Now there is much more that could be added if we want to discuss how this observation might or might not apply to the design inference . . . Eric Anderson
But I am not "using this as a means against the design inference." I mentioned putting symbols on the cards just to make them more random than suits and numbers. jdk
#283: I do not begrudge the interest itself jdk. But my eyebrows go up when this is used as a means against the design inference. Let me put symbols on some cards... Upright BiPed
wd400: @275. Very good! And it points out that dealing, and simply picking up cards from a table top, are the same. I think you mentioned that in the prior post. Very helpful, and thoughtful. PaV
FWIW: I have lots of other interests also, but there may not be much overlap between mine and yours. jdk
most certainly Upright BiPed
re 280: I like probability theory. Maybe you don't. Different strokes for different folks, you know. jdk
#279 Really? This is what captures your fancy? Upright BiPed
In case Eric actually wants to get involved here, I'd like to summarize the situation, and my position.

1. I have no interest in Kitcher: either what use he makes of probability in his book or the way he phrased this situation. I am interested in the probability theory.

2. The situation is to deal 13 cards from a 52 card deck, in order. Since order is being considered, this is a permutation, sometimes written 52 permute 13 or 52 pick 13. The formula for this is 52!/39!. Thus, there are 4 x 10^21 possible hands.

Let S = the set of all possible hands. This is the sample space. Let n = 4 x 10^21. This is the number of elements in the sample space.

You are dealt a hand. (It makes no difference how you get it: first 13 cards off the top of the deck, selected randomly from the deck, 13 cards dealt to 4 people around a table, etc.) The probability of your getting the particular hand you get is 1/n, a very small number (2.5 x 10^-22). Therefore, an event of very low probability occurred. Events of very low probability happen all the time.

Even in bridge, where order doesn't count, the number of hands is 635 billion, so when you are dealt a hand it is likely that that hand has never been dealt before because the probability of that hand is so low. jdk
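A minimal sketch (Python) checking the figures in jdk's summary: the number of ordered 13-card hands, the probability of any one of them, and the unordered (bridge) count.

from math import perm, comb

ordered = perm(52, 13)      # 52!/39!, about 3.95e21 ordered hands
unordered = comb(52, 13)    # 52!/(39! * 13!), about 6.35e11 bridge hands

print(f"ordered hands   : {ordered:.3e}")
print(f"P(one ordered)  : {1 / ordered:.2e}")     # ~2.5e-22
print(f"unordered hands : {unordered:.3e}")       # ~635 billion
print(f"P(one unordered): {1 / unordered:.2e}")   # ~1.6e-12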
re 273 and 274: This is wrong, PaV. However, the only way I know to explain it to you would be to consider a much smaller deck of cards and go through the numbers where we can visualize the actual hands. But I'll briefly try, anyway. You write,
It’s NOT that the improbability of any individual hand changes, rather, there’s just more hands that you can form.
The probability of each hand is nothing but the reciprocal of the number of hands. If there are 4 x 10^21 possible hands, then the probability of each is 1 out of 4 x 10^21. If you increase the number of possible hands, you decrease the probability of each. Also, the number 4 x 10^21 is the number of ordered hands: the number of permutations. There is only one way for an ordered hand to exist. There are 13! different permutations that can be made by rearranging the cards in a single combination. That is why there are more permutations than combinations.

Just to be clear:

The number 52!/39! is 4 x 10^21, the number of permutations, which is what we are studying.
The number 52!/(39!•13!) is 6.35 x 10^11, the number of combinations, which is a different issue. You can think of it as exactly the same situation as above, getting 13 of 52 cards, but, since you don't care about order, rearranging them any way you wish, as when people sort a bridge hand. Any one combination can be obtained from 13! different permutations. There are far fewer combinations; therefore each particular combination is more probable than a particular permutation.

Also, you are confused about reciprocals: 1 out of 4 x 10^21 is not 4 x 10^-21, because you have to reciprocate the 4 also. The probability of any particular permutation is 1/(4 x 10^21) = 2.5 x 10^-22.

Last, I am not clear, and I am not sure you are either, when you write,
or a probability of (13!)(39!)/(52!). So the total number of hands is (52!)/(39!)(13!).
I think you mean (52!/39!)•13!, but you might mean 52!/(39!•13!). The first of those, which creates a number bigger than 4 x 10^21, is wrong, as I have explained. The second is the correct formula for the number of combinations, which is less than the number of permutations, as I have also explained. jdk
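The smaller deck jdk mentioned earlier is easy to set up explicitly. A toy Python sketch (a hypothetical 4-card deck, drawing 2) that lists every permutation and every combination, so the factorial relationship between the two counts is visible:

    from itertools import permutations, combinations

    deck = ["A", "K", "Q", "J"]                # a toy 4-card deck (hypothetical example)
    ordered = list(permutations(deck, 2))       # 4!/2! = 12 ordered draws
    unordered = list(combinations(deck, 2))     # 4!/(2!*2!) = 6 unordered hands

    print(len(ordered), ordered)       # ('A','K') and ('K','A') are counted separately
    print(len(unordered), unordered)   # each combination corresponds to 2! = 2 permutations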
PaV, I think you're using a conditional probability calculation here when it's not appropriate. Consider the simplified example where just the 4 kings are dealt. The 4 cards are shuffled then dealt to the 4 players. It's clear that each player has probability 1/4 of getting any particular king. However, I think that you would say the probabilities of the players getting the cards they were dealt are 1/4, 1/3, 1/2, and 1/1 (in order going from player 1 to player 4). These numbers could be interpreted as conditional probabilities, but that's not what we're talking about. Even player 4 has probability 1/4 of getting any particular king. daveS
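daveS's four-kings claim is easy to test by brute force. A rough Monte Carlo sketch in Python (the 100,000-trial figure is arbitrary):

    import random
    from collections import Counter

    kings = ["KH", "KD", "KC", "KS"]
    counts = Counter()
    trials = 100_000
    for _ in range(trials):
        deck = kings[:]
        random.shuffle(deck)
        counts[deck.index("KH") + 1] += 1   # which player (1-4) received the king of hearts

    for player in sorted(counts):
        print(player, counts[player] / trials)   # each frequency comes out near 0.25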
Eric Anderson @271
Eric Anderson: .... (2) his [Kitcher's] lack of understanding of the probability arguments, which he seriously botches ...
Eric, would you care to expand on that a little? And could you perhaps say a few words on the following:
Kitcher: “Consider the humdrum phenomenon suggested by [Michael] Behe’s analogy with bridge. You take a standard deck of cards and deal 13 to yourself. What is the probability that you get exactly those cards in exactly that order? The answer is one in 4 x 10^21…But you did, and you have witnesses to testify that your records are correct.”
Origenes
Do your calculation for the second position. Then it’s 1/52 x 1/50 x 1/46 x 1/42...
No. What are the odds the second player gets the Ace of Spades then the King of Spades? The first card to player one has to be anything other than those two cards (50/52); the first card player two gets must be the AoS, and there are 51 cards left, so 1/51. Player 3's first card can't be the King of Spades (49 of the 50 remaining cards are fine)... All told you get: (50/52 * 1/51 * 49/50 * 48/49 * 47/48 * 1/47) = (1/52 * 50/51 * 49/50 * 48/49 * 1/48) = (1/52 * 1/51). The distinction you are trying to make exists only in your misunderstanding of probability. All you have to do is use the permutation formula when order matters, the combination one otherwise. wd400
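The arithmetic in wd400's comment can be checked exactly with Python's fractions module; this is just a verification of the identities, under the round-the-table dealing order described above:

    from fractions import Fraction as F

    player2 = F(50, 52) * F(1, 51) * F(49, 50) * F(48, 49) * F(47, 48) * F(1, 47)
    player1 = F(1, 52) * F(50, 51) * F(49, 50) * F(48, 49) * F(1, 48)
    print(player2, player1, F(1, 52) * F(1, 51))   # all three reduce to 1/2652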
The last part of my last post reads better this way: However, if any combination is allowed, there are then 13! more possible hands, and so the final probability including combinations is 13! x 4 x 10^-21: that is, 13! hands, EACH having a probability of 4 x 10^-21, or a probability of (13!)(39!)/(52!). So the total number of hands is (52!)/(39!)(13!). It's NOT that the improbability of any individual hand changes, rather, there's just more hands that you can form. PaV
wd400:
i.e. the chance of getting any two cards dealt to you in order is the same, no matter if they are taken from the top of the deck or dealt around a table.
This is true for each position of the deal; but this isn't to say that all four positions have equal probabilities calculated in this fashion. Do your calculation for the second position. Then it's 1/52 x 1/50 x 1/46 x 1/42, etc. (I'm assuming that, e.g., the 50/51 represents the probability that the second hand will receive some card, and not any particular card, from the shuffled deck, and so forth.) jdk:
If you consider the cards in the order you get them, the number of possible hands you can get is 4 x 10^21.
But it is here where all the subtlety lies. All combinations of any ordered Bridge hand also have the probability of being chosen of 1 in 4 x 10^21. However, if any combination is allowed, there are then 13! more possible hands, and so the final probability including combinations is 13! x 4 x 10^-21: that is, 13! hands, EACH having a probability of 4 x 10^-21. It's NOT that the improbability of any individual hand changes, rather, there's just more hands that you can form. Though subtle, I think this distinction is crucial. PaV
nm daveS
Living with Darwin: Evolution, Design and the Future of Faith, by Philip Kitcher.
Quick drive-by comment as I just noticed the above. Coincidentally, I just finished reading Kitcher's book yesterday. Did not do a review on Amazon yet, but if I do it will probably receive 1 or 2 stars. 1 star for content. Perhaps 1 more star because he seems to be a nice guy and does at least offer a partially-sincere olive branch of sorts at the end of his lengthy final diatribe chapter against all traditional religion. Kitcher makes some interesting points about challenges to certain theological positions and has a few thoughts worth considering. Unfortunately, his book is seriously marred by: (1) his lack of understanding of intelligent design, his critique of which was apparently the main impetus for the book, (2) his lack of understanding of the probability arguments, which he seriously botches, and (3) his naive and simplistic assertions that Darwinism has the answers and that the serious and deep flaws in the Darwinian narrative are but minor hiccups that can be ignored or will be resolved soon enough. Eric Anderson
So NONE of those hands would have a probability of 1/52 x 1/51 x 1/50 x 1/49, which is the case of the hand I held forth as an example.
Another way to see this can't be right is to consider the fact that after the deck is shuffled, any particular permutation of the entire deck, say:
C1, C2, C3, C4, C5, C6, C7, C8, ..., C49, C50, C51, C52
has exactly the same chance of occurring as the following three other permutations:
C2, C3, C4, C1, C6, C7, C8, C5, ..., C50, C51, C52, C49
C3, C4, C1, C2, C7, C8, C5, C6, ..., C51, C52, C49, C50
C4, C1, C2, C3, C8, C5, C6, C7, ..., C52, C49, C50, C51
(where groups of 4 cards are cyclically permuted themselves). If these four distinct permutations of the deck are dealt successively, dealing to player 1 first each time, then the four 13-card hands/permutations "rotate around the table". Since each player is equally likely to get any of these four particular hands, it must be true that they are each equally likely to get any particular hand. As there are about 4 x 10^21 of them, the chance is always 1 in 4 x 10^21 of getting any particular hand. daveS
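daveS's symmetry argument can also be checked by simulation on a smaller scale. A sketch with a hypothetical 8-card deck, dealing 2 cards each to 4 players round-robin: every ordered hand for player 1 turns up with roughly the same frequency, 1/(8*7), just as if the top two cards had been taken straight off the deck.

    import random
    from collections import Counter

    deck = list(range(8))          # a toy 8-card deck
    counts = Counter()
    trials = 200_000
    for _ in range(trials):
        d = deck[:]
        random.shuffle(d)
        counts[(d[0], d[4])] += 1  # player 1 receives the 1st and 5th cards dealt

    print(len(counts))             # 56 possible ordered 2-card hands (8 * 7)
    print(min(counts.values()) / trials, max(counts.values()) / trials)  # both near 1/56, about 0.0179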
Very good, wd400. jdk
You normally deal out cards to four players with the first player receiving the 1st, 5th, 9th, 13th card, etc. This is subtly different from dealing one person the first thirteen cards. You'll notice that the odds of getting your hand in a regular game of Bridge involve, for the first player dealt a hand: 1/52 for the first card, 1/47 for the second card, 1/43 for the third card, etc. So NONE of those hands would have a probability of 1/52 x 1/51 x 1/50 x 1/49, which is the case of the hand I held forth as an example.
For you to get the second card you received, it had to not be dealt 2nd, 3rd or 4th. You will find that (1/52 * 50/51 * 49/50 * 48/49 * 1/48) = (1/52 * 1/51), i.e. the chance of getting any two cards dealt to you in order is the same, no matter if they are taken from the top of the deck or dealt around a table. wd400
PaV, I don't agree. It makes no difference whether you get 13 cards dealt straight off the top of the deck, or get one of the hands dealt in the normal way in a game of bridge, or you or the girl picks 13 cards randomly from the deck. If you consider the cards in the order you get them, the number of possible hands you can get is 4 x 10^21. The distinctions you make about how you get the cards don't make any difference. jdk
Phinehas at 262: That is well-stated, I think, although I would substitute "very close to" for "virtually identical." One reason is that, as you recognize, significance is a gray area. In 100 rolls, would 12 6's right in the middle count as significant? There are lots of judgment calls that affect the situation. jdk
jdk:
In your example, once you state you are going to take 13 cards from a 52-card deck (clarifying whether order does or does not count) you create a sample space of all possible hands that can happen, and you can then calculate the probability of each one of them happening.
I understand this, and appreciate it; yet, there seems to be a distinction that can be made between the example I gave, and the example Kitcher gave. Kitcher gives an example of a hand, naming the cards, and then says the probability is 1 in 4 x 10^21 after saying "if order is important," or something along those lines. In my example---and this is a fine distinction, but, nevertheless a distinction---I say that thirteen cards are dealt, without specifying them, and then, along the line, to further clarify what I meant, I presumed that it was the first thirteen cards from the top of the deck, and worked out the probability that way. This is a different scenario, though to some degree related to Kitcher's scenario. I objected to saying that 'order' was important because I felt that there already was an order: that is, the actual dealing of the cards: 1st, 2nd, and so forth. This isn't how cards are normally dealt. You normally deal out cards to four players with the first player receiving the 1st, 5th, 9th, 13th card, etc. This is subtly different from dealing one person the first thirteen cards. You'll notice that the odds of getting your hand in a regular game of Bridge involve, for the first player dealt a hand: 1/52 for the first card, 1/47 for the second card, 1/43 for the third card, etc. So NONE of those hands would have a probability of 1/52 x 1/51 x 1/50 x 1/49, which is the case of the hand I held forth as an example. That is, if the girl had 52 cards, face down on a table, and then dealt them out to four people, we would NOT encounter the same probability for each hand as we do in the case of the single, straightforward deal of the top thirteen cards, which corresponds to the young girl selecting 13 cards out of the deck, in an order of first to last, and giving them to one person (one pile). I think you have to acknowledge that, or, better yet, would agree to that. I think this is the reason for my hesitation, and why I said that a permutation formula is being forced onto the scenario I proposed. Normally, you would not work out the 'probability' of each person receiving their hand in the way I did above, in the case of a four-handed deal. What is supposed is this: each person ends up with thirteen cards, and when we compare hands, then we see that the probability of each hand of 13 is 1 in 4 x 10^21 "if order is important," or 1 in (52)!/(39)!(13)! "if it's not." I think these are two different ways of looking at the hand: my way, which looks at raw probabilities associated with how many cards remain when the selection is made; and, the normal way of looking at hands, which is independent of any probabilities associated with dealing. IOW, it is not said that the probability of a Bridge hand is THIS if you were the first player to be dealt a hand, and THAT if you were the second player. There's an "independence" present that is NOT present in my scenario. It is a distinction, I believe. PaV
Phinehas: I noticed this pattern for: 3, 4, 6, 9, 12, 15, 18
I did not notice position 4. That ruins the pattern. Bad design! Origenes
Origenes:
However the numbers at position 3, 6, 9, .. and so forth are the sum of the two numbers that precede them. Did you notice? If not, what is your verdict now: randomness or design?
I noticed this pattern for: 3, 4, 6, 9, 12, 15, 18. Based on your last post and the emerging pattern, I'm guessing position 4 was accidental. Which means it wasn't by design. Which likely means your sequence is a combination of randomness and design, probably by design. Which reveals that there are layers to this thing we call design. Which makes me want to take a nap. :) Phinehas
jdk: What I mean is this: With three dice, you thought there might be something like 10% of the possible combinations that could be considered significant. With 10 dice, you thought this percentage would likely go down quite a bit. If this trend continues, then moving to 100 dice would result in even less of a percentage of the possible configurations being significant. Eventually, as you continue to add dice, the number of significant combinations compared to the total number of combinations would be so small that the probability of getting one of the significant combinations would be virtually identical to the probability of getting a pre-specified configuration. Does this make sense? Phinehas
Phinehas:
Origenes:
2, 1, 3, 4, 2, 6, 5, 1, 6, 2, 3, 5, 3, 1, 4, 1, 1, 2, 4, 2, 6
Is this sequence best explained by design or e.g. the throw of a die?
Having looked at your numbers, though I see some things that could indicate a pattern, I can’t figure out anything that would explain the numbers all the way through. So, I’d probably go with the throw of a die.
You are correct Phinehas. Not every number can be explained by design. However the numbers at position 3, 6, 9, .. and so forth are the sum of the two numbers that precede them. Did you notice? If not, what is your verdict now: randomness or design? Origenes
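The claimed pattern is easy to verify mechanically; a quick Python check (purely illustrative) of which positions hold the sum of the two numbers that precede them:

    seq = [2, 1, 3, 4, 2, 6, 5, 1, 6, 2, 3, 5, 3, 1, 4, 1, 1, 2, 4, 2, 6]
    hits = [i + 1 for i in range(2, len(seq)) if seq[i] == seq[i - 1] + seq[i - 2]]
    print(hits)   # [3, 4, 6, 9, 12, 15, 18, 21] -- every third position, plus position 4 by accident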
Thanks, Phinehas, for even reading and thinking about those lengthy posts. I think a key distinction is one that PaV brought up, although I'll put it in my own words. Probability as a subject of pure mathematics is not interested in significance. Significance is something added by our human cognitive ability to recognize and ascribe meaning to patterns: an ability that is at the heart of our ability to create knowledge that goes far beyond issues of probability. As I pointed out several times, if you have a situation where the elements already exhibit a pattern, such as in a deck of cards, we will see more significance in particular hands than we would if we had 52 cards with 52 symbols that had no intrinsic patterns of order or categorization. You write,
Further, though significance will always be greater than probability, as probability lowers significance appears to approach probability. Do you agree with this?
I'm afraid this sentence doesn't make sense to me, but my writing time is up, so perhaps I'll have time to think about it later. jdk
jdk: Thank you for your posts @105 and @192. I think they get to the heart of the issue for me. I especially found this part interesting:
Without going through any more analysis, I am virtually certain that the ratio of throws that exhibit a pattern in comparison to the total 60 million possible throws would be much, much smaller than with three dice. That is, there would be a much larger percentage of total hands that did not exhibit much, if any, pattern: they would be 1's on our significance scale.
This seems intuitive to me as well. If this is the case, would it be fair to say that a discussion of significance* is not completely divorced from a discussion of probability? In other words, significance is not orthogonal to probability. Further, though significance will always be greater than probability, as probability lowers significance appears to approach probability. Do you agree with this? ***** * Where "significance" means the probability of getting a significant hand (based on some arbitrary cutoff on your admittedly grey significance scale). Phinehas
Origenes:
Is this sequence best explained by design or e.g. the throw of a die?
Having looked at your numbers, though I see some things that could indicate a pattern, I can't figure out anything that would explain the numbers all the way through. So, I'd probably go with the throw of a die. Interestingly, a pattern that I do not currently perceive could be pointed out to me that would immediately and forcefully put me over in the design camp. Also interestingly, once in the design camp because a pattern had been revealed, I expect no new information would be able to put me back into the throw of a die camp. Phinehas
You can only have a probability, in the mathematical sense we are discussing, if you describe, unambiguously, the parameters that produce the events: for example, flip a fair coin three times in a row. Once you have those parameters, then you also have a sample space, because the parameters define the possible events. Then you can discuss the probabilities of the individual events, and the sum of all probabilities must equal 1. Therefore, in mathematical probability, all events have to be a part of a defined sample space. In your example, once you state you are going to take 13 cards from a 52-card deck (clarifying whether order does or does not count) you create a sample space of all possible hands that can happen, and you can then calculate the probability of each one of them happening. I appreciate that you are "mulling things over." Probability is a very slippery subject. jdk
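jdk's coin example makes a nice minimal illustration of a defined sample space. A short Python sketch (three flips of a fair coin, as in the comment):

    from itertools import product

    sample_space = list(product("HT", repeat=3))   # 8 equally likely outcomes
    p = {outcome: 1 / len(sample_space) for outcome in sample_space}

    print(len(sample_space))   # 8
    print(sum(p.values()))     # 1.0 -- the probabilities over the sample space sum to 1
    print(p[("H", "H", "H")])  # 0.125 -- every particular outcome has the same probability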
Why is it that a random process (shuffling of cards, throw of a fair die) is assumed to be responsible for e.g. the 13 cards? Isn't that the wrong approach to the issue? To me it makes more sense to present a bridge hand, or a sequence of numbers and then ask: what is the best explanation, randomness or design? And why? For instance this sequence:
2, 1, 3, 4, 2, 6, 5, 1, 6, 2, 3, 5, 3, 1, 4, 1, 1, 2, 4, 2, 6
Is this sequence best explained by design or e.g. the throw of a die? Origenes
wd400 and jdk: I keep mulling these things. I read through #229, and see your logic there. I still feel like the act of dealing involves probabilities that are independent of the entire sample space of possible hands. It seems to me the individual act of choosing 13 cards has its own probability, independent of other 'acts' of dealing. Nonetheless, I can see that in the act of dealing, order is important. So, let me ask you, is it possible to look at the selection of 13 cards from a deck of cards as a probability event, in and of itself? Or, put another way, is it simply, and always, a subset of the sample space? PaV
wd400 @253 If due to "selection" only one species survives — e.g. Deinococcus Radiodurans — would you consider that an increase or decrease of biological information? Origenes
Why isn't it glaringly obvious that less variability means less information?
Well, no. Some definitions of information are almost exactly "decreased variability", so I guess you will have to tell me which definition of information you had in mind. If it is just the informal one you use in the rest of your post then it doesn't seem to be very relevant to evolutionary biology. wd400
wd400 @250 Why isn't it glaringly obvious that less variability means less information? By comparison, if conservative voices in the media are silenced — if the liberal MSM becomes "fixed" — then we get less information overall. Origenes
Simplifying is good, Dave! We can visualize and even list all the possibilities in the two scenarios you offer in 248. jdk
It is completely in line with my broader claim “selection elimination reduces variety and biological information” which you (also) prefer to ignore. The higher the proportion of loci that become “fixed” in a population, the lower are nucleotide variability and average heterozygosity in that population.
Right, but what has that to do with information? edit: (or more to the point, how is this a decrease in information) wd400
wd400:
Origenes: “To become fixed in the population” is a euphemism for “massive loss of information”
Well this just isn’t so. You often say things like this, but have never been able to support this claim, or even explain why you think it is the case.
It is completely in line with my broader claim "selection elimination reduces variety and biological information" which you (also) prefer to ignore. The higher the proportion of loci that become "fixed" in a population, the lower are nucleotide variability and average heterozygosity in that population.
wd400: ... of course selection only operates once variety is present. Variety arises by mutation
Sure, all selection elimination does is hinder evolution. Origenes
Here's a simple experiment that PaV, or anyone could carry out. Take the four kings from a standard deck. These four cards will be shuffled thoroughly, and two will be drawn at random and placed in front of you, one on the left, one on the right. You are offered two options: A) You win $100 if the card on the left is the king of hearts and the card on the right is the king of diamonds. B) You win $100 if the king of hearts and the king of diamonds are drawn (there is no requirement regarding which is on the left and which is on the right). Which option would you choose in order to maximize your chance of winning? Furthermore, getting to the questions raised in PaV's post 244, what are the probabilities of winning in each scenario? daveS
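For what it's worth, the two options can be worked out by listing all twelve equally likely ordered draws; a brute-force Python sketch:

    from itertools import permutations

    kings = ["KH", "KD", "KC", "KS"]
    draws = list(permutations(kings, 2))   # 4 * 3 = 12 ordered (left, right) draws

    a = sum(1 for left, right in draws if left == "KH" and right == "KD")
    b = sum(1 for left, right in draws if {left, right} == {"KH", "KD"})

    print(a / len(draws))   # Option A: 1/12, about 0.083
    print(b / len(draws))   # Option B: 2/12 = 1/6, about 0.167 -- twice as likely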
As wd400 points out, if you pick up a true bridge hand, that is a combination, and there are far fewer possibilities. Imagine you have 13 cards in order. There are 52!/39! ways to do that, which is a short way of writing 52 • 51 • 50 • ... • 40. However, if you ignore order, then there are 13! different ways you could get those cards, because there are 13 cards you could have got first, 12 second, etc. Therefore the formula for the number of combinations of 13 cards without order counting (a true bridge hand) is 52!/(39!•13!). Both wd400 and I have provided this formula previously. To summarize, if you pick 13 out of 52 cards, in order, there are 4 x 10^21 possibilities. If you choose 13 out of 52, where order doesn't count, there are 6.35 x 10^11 possibilities. jdk
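A one-line Python check of the relationship jdk describes (the factor separating the two counts is exactly 13!):

    import math

    perms = math.perm(52, 13)    # 52!/39!, ordered hands
    combs = math.comb(52, 13)    # 52!/(39!*13!), unordered hands
    print(perms == combs * math.factorial(13))   # True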
Are you sure...
What is the probability of selecting those 13 cards, irrespective of their final order: it seems to me that it is the familiar 1/52 x 1/51 x …..1/40. I don’t see how their final order affects the selection process. (This, of course, would be very different if you were trying to match cards; then the order of the cards you’re trying to match would impose an order on the selection process: having the right 13 cards isn’t enough; they have to be in the right order as well.)
Imagine you were trying to match the hand "All spades in order A,K,Q....,4,3,2". To do that you would have to get the ace of spades on your first draw, right? What are the odds of doing that? 1/52 yeah? Then you would need to get the K from the 51 remaining cards, so that's 1/51. Carry on and you'll see you arrive at the familiar expression 1/52 * 1/51 * ... * 1/40 = 39!/52!, i.e. 1 in 52!/39!. That's the probability of drawing an ordered hand; it can't also be the probability of drawing the same cards in any order. (And, of course, there are 13! orderings of any bridge hand, which gets us back to the difference between combinations and permutations I discussed in 229) wd400
wd400: I'm doing just fine, thank you. PaV
jdk: I'll try to do a better job of making my example clear. The whole point of having the cards face down is simply to keep 'order' out of the picture temporarily. The whole point of shuffling twice is to illustrate that the initial selection of the cards involves probabilities not predicated on the order the cards will eventually take. IOW, the girl could have put down the cards from left to right as she picked them. That would have been Order #1. Or, she could have put down the cards from left to right after the first shuffle. That would have been Order #2; but she shuffles again, and now we have Order #3. Neither of these shuffles has anything to do with the young girl's original selection of the 13 cards from the laid-out deck of cards. Now, it's my impression that you see the situation of the cards being 'face down,' as being an entirely different scenario from having the cards being 'face up.' I don't see why exactly. Let's remember what happens in a real game of Bridge: the deck is shuffled while all the cards are 'face down.' Then they're dealt out 'face down.' Then they're picked up with the deck marking, and not the card marking, showing outwards. I.e., I don't know what my playing partner's cards are. In the case of the young girl selecting cards, we know that the cards are marked; we just have to flip them over to see what they are. And we know there is some kind of order; because the first card ends up on the bottom of the pile, and the last card ends up on top. But none of that enters into the probability of the individual selection of each of the 13 cards. For example: what if the girl selected the cards from a deck of cards placed on a table with the cards face up (these could be any 13 cards out of the deck). And suppose she ordered the cards from left to right. Then she decides to order the very same cards from right to left in reverse order. Then she decides to take the leftmost card and replace it with the one in the middle, and swaps the rightmost card with the card to the immediate left of the middle card, etc. I imagine there are 13! ways of ordering the cards, ONCE they've been selected. What is the probability of selecting those 13 cards, irrespective of their final order: it seems to me that it is the familiar 1/52 x 1/51 x .....1/40. I don't see how their final order affects the selection process. (This, of course, would be very different if you were trying to match cards; then the order of the cards you're trying to match would impose an order on the selection process: having the right 13 cards isn't enough; they have to be in the right order as well.) Doesn't this all come down to this fact: in a Bridge hand, the person holding the hand is not obliged to hold the cards in any particular order? That's how it strikes me. He could have the Hearts on the left, and Clubs on the right; or, vice versa. He/she could have the order going from Ace high downwards, ordering left to right, or, 2 low onward, ordering left to right. Rolling dice is different. The order is immediately important. I roll a die: it's a six. Again I roll it: it's a three. And, so on. I see there being an order to the roll of the die; and, if we switch over to two dice, obviously in a game of dice, 'craps', if you roll a 9, then another 9, and then a seven, this would be very different from rolling 9, 7, 9. But this restriction doesn't seem to apply to Bridge, or most card games. PaV
“To become fixed in the population” is a euphemism for “massive loss of information”
Well this just isn't so. You often say things like this, but have never been able to support this claim, or even explain why you think it is the case. Until you do I am not sure how I can correct your misunderstanding. (As to your other point, of course selection only operates once variety is present. Variety arises by mutation) wd400
Wd400 @205
Dawkins makes the point that using some naive specification might show that a biological sequence or trait is very improbable. The silliest examples being “this is 100 amino acids long, and there are 20 amino acids so there is only a 1/(20^100) chance this protein exists!”
Absolutely silly indeed. Anyone who claims that this number reflects the chance that such a protein exists simply assumes all the exquisite apparatuses that are required to produce proteins, which is, indeed, beyond silly. Obviously the chance is way way smaller.
Dawkins' point is that biological traits are not the result of single draws from a probability distribution (like the card examples), but of evolution along lineages.
Unfortunately Dawkins has no point.
There is no need to get every amino acid right in one draw, as each individual change increases the fitness of the creature that has it, meaning they are likely to become fixed in the population and all individuals in subsequent generations start with improvements from previous ones, which they can in turn improve.
This is based on the silly assumption that improvements are all lined up, like stepping stones, towards a goal. Not only is this completely baseless, but if it were true, then it would point to design — see 'conservation of information' #206. What is also being overlooked is that selection elimination reduces variety and biological information. "To become fixed in the population" is a euphemism for "massive loss of information" and little else. Thirdly, selection elimination can only 'not eliminate' an invention after that invention already exists, so it can't actually invent. Summing up Dawkins' "point": we have sheer dumb luck (allegedly) coming up with all sorts of viable organisms and selection elimination wielding its scythe. Question is: How does that help evolution? Origenes
PaV writes,
We’ve all agreed that the probability of ‘getting a hand’ is 1.0. And the probability of getting a Bridge hand (13 cards dealt to you), if order is important, is 1 in 4 x 10^21.
Yes, if you mean "getting a particular Bridge hand (13 cards dealt to you), if order is important ...." We have agreed to this multiple times throughout this thread, I believe. Then PaV writes,
The only basic disagreement was regarding 13 cards dealt to yourself, with the order not being important. I’ll stick to the argument I make in #232.,
I think in all discussions order has been important, but I'll look at 232 anyway. And in 232, I see in your example order is important, as you write at the end "now you lay them down, face up, from left to right." I assume by left to right you are saying that the order is important. First, it makes no difference whether the six-year-old girl picks the cards or you shuffle the deck and deal the first 13. You write,
She does, and places the card, face down, off to the side. You ask her to do this a second, third and thirteenth time.
Does she keep the pile in order? Since they are face down I presume that means we don't know what the 13 cards are. Then you write,
She selects one of the cards, and places on the pile, off to the side, with all the cards face down.
I don't understand this sentence. Do you mean she selects another card (a 14th card) and places it on the pile also? Then you write,
Now, you take the 13 cards she selected, and you shuffle them. And now you can lay them down on the table, first card on the left, and the last card on the far right. But you don’t. Instead, you shuffle them three more times, and now you lay them down, face up, from left to right.,
First, it makes no difference how many times you shuffle them: we are assuming you randomize them, so we now have no way of knowing the order they were originally in. Now when you lay them out face up, they are in order. Then you ask,
What was the probability of selecting these cards?
This is unclear. Since these are the only 13 cards in the pile, the probability of selecting those cards = 1, because they are the only cards in play at this time. If you mean, what is the probability of selecting them in the same order as they were in the original pile, the answer is 1 out of 13! = 1 out of 6,227,020,800. You write,
Let me answer these questions for you (since I think we both agree on this): The probability of the first card is 1 in 52, the second is 1 in 51, etc, etc = 1 in 4 x 10^21.
If you only have 13 cards in the original pile, and are asking if the order of the face-up cards matches the order they were in when face down, then that probability is 1/13 • 1/12 • etc. There aren't 52 cards involved anymore, just 13. If you are asking what is the probability of getting those 13 cards in order in respect to the whole deck, then yes, the probability is 1 out of 4 x 10^21. But if that's what you mean, I don't understand what the significance is of the girl creating the pile face down and then shuffling the cards: that adds nothing to the situation. You picked 13 out of 52 cards and arranged them in order, which is exactly what a permutation is, and 52 permute 13 = 4 x 10^21. jdk
jdk:
Also, 52!/39! is not “the total number of permutations of a deck of 52 cards.” That is just 52!. 52!/39! is the total number of ways of picking 13 elements from a set of 52 in order.
In the previous sentence, I mentioned hands of 13. I simply omitted that, which was an error in writing, not in math. Where do you think the number 39! came from? Out of thin air? PaV
PaV,
The only basic disagreement was regarding 13 cards dealt to yourself, with the order not being important.
Is there anyone here who is talking about the situation where order is not important? It seems we all, including Rossiter, are considering order to be important:
When you deal yourself thirteen cards, you have no idea what your hand will be. And, it doesn’t matter in any meaningful way. You don’t have to get a certain hand. You simply get thirteen cards. Now, the thrust of the probability argument is to say that you must get a particular hand (thirteen cards in a particular order).
daveS
You are beyond hope or help. wd400
wd400:
The only “point” I can imagine you are trying to make is so muddle-headed I wouldn’t ascribe to anyone. So, you can spell it out or we can drop it.
If you can't figure it out, then it's better we discuss it further. PaV
wd400:
So, you are asking what proportion of all possible draws contain the same cards, in any order.
There's nothing embarrassing at all. You've simply misunderstood what I was addressing. You're the only one posing this question. Read 232. PaV
Getting to be embarrassing? A/mats passed that threshold a long time ago. Now it just IS embarrassing. Truth Will Set You Free
This is getting to be embarrassing,
Who said anything about “matching a hand”?
You want to know the probability of obtaining a particular hand (the one you happened to draw) ignoring the order in which it was drawn. So, you are asking what proportion of all possible draws contain the same cards, in any order. If you want to calculate that one card at a time, then you start by saying "I could have drawn any of these 13 cards first, and still ended up with the same set of cards." That's 13/52. And so on and so forth. Stop and think about what you are saying for a moment. You agree that 52 * 51 * 50 ... * 40 is mathematically identical to 52!/39!, right? And you agree this is the formula for the number of k-permutations of a set (that is, the number of ordered subsets)? But you are also claiming this calculation gives you the number of unordered hands? So you are claiming the probability of drawing a hand in any order is mathematically identical to the probability of drawing a hand in a specific order? wd400
jdk: I made a mistake in applying this to the words of Kitcher. In fact, we were talking about Rossiter's analysis of Kitcher's argument. Below is part of a response I made to daveS. Notice the words in bold; these words have been in the background here, and certainly in the back of my mind. The exchange with daveS @49 included this:
PaV: Well, what do you think Kitcher was saying? It's not clear from the context of the quote. But Rossiter's take is that Kitcher is saying that Behe is wrong, and that getting a hand of 13 cards is a highly probable event. Rossiter then goes on to say this:
This is asinine, because it misses the entire thrust of the probability argument. When you deal yourself thirteen cards, you have no idea what your hand will be. And, it doesn’t matter in any meaningful way. You don’t have to get a certain hand. You simply get thirteen cards. Now, the thrust of the probability argument is to say that you must get a particular hand (thirteen cards in a particular order). You can try that ‘til the cows come home.
I un-concede the point. We've all agreed that the probability of 'getting a hand' is 1.0. And the probability of getting a Bridge hand (13 cards dealt to you), if order is important, is 1 in 4 x 10^21. The only basic disagreement was regarding 13 cards dealt to yourself, with the order not being important. I'll stick to the argument I make in #232. PaV
jdk: I went back to @5 and saw Kitcher's quote. You're correct. He did mention "order," and, so, he definitely was thinking in terms of permutations. Point conceded. Now, here's what I have in mind: there are 52 cards of a deck placed face down on a table. You ask a six-year-old girl to pick a card. She does, and places the card, face down, off to the side. You ask her to do this a second, third and thirteenth time. She selects one of the cards, and places on the pile, off to the side, with all the cards face down. Now, you take the 13 cards she selected, and you shuffle them. And now you can lay them down on the table, first card on the left, and the last card on the far right. But you don't. Instead, you shuffle them three more times, and now you lay them down, face up, from left to right. What was the probability of selecting these cards? Did shuffling affect these odds? Let me answer these questions for you (since I think we both agree on this): The probability of the first card is 1 in 52, the second is 1 in 51, etc, etc = 1 in 4 x 10^21. Second question: No. The shuffling does not affect the probability involved in how they were selected. Now, the only question is this: can we equate the random process of a 'shuffle' with the random process of 'a child selecting cards one at a time'? I think we can. PaV
wd400:
(since any of cards in the specified hand could have been drawn first while still matching the hand)
Who said anything about "matching a hand"? PaV
PaV writes,
You have 52 cards, and you deal them out. Kitcher doesn't say, "Since we're interested in the order of all possible Bridge hands, then the total number of possible Bridge hands where order is important is 4 x 10^21." IF he had said that, then I would have agreed with you. But he didn't say that.
The quote from Kitcher in 5 is,
You take a standard deck of cards and deal 13 to yourself. What is the probability that you get exactly those cards in exactly that order? The answer is one in 4 x 10^21…
So Kitcher did indeed say exactly what PaV says he didn't. This is a permutation. Also, Kitcher did not call them bridge hands. The quote above started with, "Consider the humdrum phenomenon suggested by [Michael] Behe's analogy with bridge." He then described the permutation situation quoted above. Kitcher's example was not about bridge hands. As I have said a number of times, we are not talking about bridge hands, because in bridge hands order is not important. Those are combinations, which wd400 correctly distinguishes in 229. PaV asks,
So, jdk, I’ll ask you: what is the probability of being dealt the first 13 cards of a shuffled deck without replacement? You don’t end up with (52)!/(39)!, which is the total number of permutations of a deck of 52 cards.
This is a confused sentence. The number of different ways you can be dealt the first 13 cards from a well shuffled deck without replacement is indeed 52!/39! = 4 x 10^21, which is the formula for the permutation of 13 things taken from a set of 52. The probability, therefore, of getting a particular hand, such as all the spades in order, is 1 out of 4 x 10^21. Also, 52!/39! is not "the total number of permutations of a deck of 52 cards." That is just 52!. 52!/39! is the total number of ways of picking 13 elements from a set of 52 in order. I think further discussion with PaV about this is futile. jdk
Looking back through a couple of these posts, this is a pretty good question to ask yourself, PaV.
Let me ask you this: what is the probability of dealing 13 cards from a single deck when the order is in one direction, and then the order is in the opposite direction? IOW, does it matter a bit whether the Ace of Spades is dealt first, or last. What matters is if it was the ‘first’ card selected, or the ‘thirteenth.’ That’s all.
Right -- so if order did not matter then it would not make a difference if the ace of spades was first or last. That being the case, you wouldn't start your calculation with 1/52 (since it didn't have to be the ace first) but 13/52 (since any of the cards in the specified hand could have been drawn first while still matching the hand). With your next draw there are 12 cards that could match the hand, and 51 left in the deck. So you have (13/52) x (12/51) x (11/50)... That means that in your earlier calculations, presuming order doesn't matter, you were out by a factor of 13 in the first term, 12 in the second, 11 in the third, and so on. Or 13! overall. Multiply your probability by 13! and you end up with the inverse of the number of combinations of 13 cards from a 52-card deck, which would be the relevant calculation if the order did not matter. (You can check this easily enough numerically; also recall that the formula for the number of k-permutations is n!/(n-k)!, while for combinations it is n!/((n-k)!*k!), so there are fewer combinations by a factor of k!.) wd400
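wd400's correction can be confirmed exactly; a small Python sketch using exact fractions:

    from fractions import Fraction as F
    import math

    p = F(1)
    for i in range(13):
        p *= F(13 - i, 52 - i)   # (13/52) * (12/51) * ... * (1/40)

    print(p == F(1, math.comb(52, 13)))                        # True: 1 over the number of combinations
    print(p == F(1, math.perm(52, 13)) * math.factorial(13))   # True: the ordered probability times 13!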
wd400 @205
Of the possible genes encoding protein chains 153 amino acids in length, only about one in a hundred trillion trillion trillion trillion trillion trillion is expected to encode a chain that folds well enough to perform a biological function. [Axe, Undeniable, Ch.10]
Dawkins: However many ways there may be of being alive, it is certain that there are vastly more ways of being dead, or rather not alive. You may throw cells together at random, over and over again for a billion years, and not once will you get a conglomeration that flies or swims or burrows or runs, or does anything, even badly, that could remotely be construed as working to keep itself alive.
Axe: The very same principle applies at levels above and below the cell. Coherent skeletons are impossibly rare among random arrangements of bones, as are coherent body plans among random arrangements of organs, and molecular machines among random arrangements of folded proteins, and folded proteins among random arrangements of amino acids. According to our analysis, none of these inventions had any prospect of coming together by accident. They all required insight. Dawkins still thinks natural selection can do the work of insight, but we know better. Interestingly, his own words point to the gaping hole in Darwin’s theory, which we saw back in chapter 7. Natural selection happens only after cells are arranged in ways that work to keep the organism alive, so selection can hardly be the cause of these remarkable arrangements. Darwin’s simplistic explanation has failed, and the millions who have followed him have nothing but his outdated assumption to stand on. The stepping stones over which these followers think life has skittered from one form to the next are definitely not explained by natural selection. Selection steps to forms that already exist, so it doesn’t explain the forms themselves, much less the intricately engineered circumstances that would have been needed for these forms to be connected through lines of descent. And the problem never goes away. Because the impossibility of accidental invention is at the root, and because each new form of life amounts to a new high-level invention, the origin of the thousandth new life form is no more explicable in Darwinian terms than the origin of the first. Even if we suppose a first insect to have been formed somehow—without trying to explain how—all the countless insects that differ substantially from that first one would still be new top-level inventions. The component inventions common to all insects would have had their specific representations in that first insect, but a great many of these components would have had to be substantially reworked to suit each new insect. This would have been a staggering feat of re-engineering in itself, to say nothing of the great variety of newcomponents that would have had to be invented from scratch. In the end, each new form of life amounts to a stunning new invention, and since the hallmark of invention is functional coherence—which accidental causes can’t explain—we rightly see each form as a distinct masterpiece. Accident is out of the picture. Stepping stones connecting these masterpieces are either a figment of our storyteller imaginations or proof that God has, at times, converted the world into an exquisite nanofabrication facility. There is no substitute for brilliance, so either the stones are part of the brilliance or they aren’t anything at all. The genius of life is not in question. The only question is how The Genius of life did his work. [Axe, Undeniable, Ch.11]
Origenes
The only "point" I can imagine you are trying to make is so muddle-headed I wouldn't ascribe to anyone. So, you can spell it out or we can drop it. In the other half of this thread you seem to argue that two ways to denote a mathematically identical probability calculation are somehow different on a metaphysical level? I'm not sure if you know, but an ordered sample of size k taken without replacement is called a "k-permutation" of a set. So, if we take order to be important (as is specified in the example), then these bridge hands are certainly permutations. wd400
jdk:
The question is: is a permutation calculation the correct calculation for the situation we have been discussing, where 13 cards are dealt in order?
My simple answer is: no.
(I did write one sentence at 207 where I agreed with most of a post of wd400, but even there I am mainly thinking about general probability models applied to the world, not about any particular biological knowledge that I have.)
You're simply accepting that what wd400 says is correct. How do you know that to be true? The claim evolutionary biologists make is that not even TWO mutations have to happen simultaneously in order to explain the kind of biological diversity we see in this world. It's not a believable claim. And real evidence--Behe's The Edge of Evolution--makes this quite clear. The question is: Is each mutational step taken in the evolution from one species to another a separate event, with its own limited improbability, or, are multiple mutational events necessary with a combined improbability that is astronomical? What should be obvious is this: if EVERY viable evolutionary step is but some small improbability away, then there should be a regression to the mean, and only a blob would exist. No, astronomical improbabilities are needed for there to be any kind of stability of species. Put another way: microevolution can occur one step at a time. There is no evidence whatsoever that macroevolution--true species creation--can occur in this same fashion. PaV
wd400: Try real hard to think of it; maybe it'll come to you. PaV
jdk:
(The formula for this is 52!/39!.)
To me it looks like you're determined to use the permutation formula. But other probability formulas exist. And you've already written it out. Here's what I mean: in Kitcher's example of dealing a Bridge Hand, does he mention anything about comparing one hand to another? He doesn't. You have 52 cards, and you deal them out. Kitcher doesn't say, "Since we're interested in the order of all possible Bridge hands, then the total number of possible Bridge hands where order is important is 4 x 10^21." IF he had said that, then I would have agreed with you. But he didn't say that. So, jdk, I'll ask you: what is the probability of being dealt the first 13 cards of a shuffled deck without replacement? You don't end up with (52)!/(39)!, which is the total number of permutations of a deck of 52 cards. You end up with: 1/52 x 1/51 x 1/50 x ... x 1/40. When you cancel equal numbers in the numerator and the denominator, you end up with the exact same series of multiplications. But just because the probabilities are the same doesn't mean we have the right to say Kitcher was interested in permutations when he makes no mention of that.
PaV, in respect to the 4 x 10^21 possible hands that we have been discussing, are A and B the same hand or different hands?
In terms of considering how many permutations can be made from a deck of 52 cards, yes, they're different. In terms of what Kircher describes, no, they're not any different. Let me ask you this: what is the probability of dealing 13 cards from a single deck when the order is in one direction, and then the order is in the opposite direction? IOW, does it matter a bit whether the Ace of Spades is dealt first, or last. What matters is if it was the 'first' card selected, or the 'thirteenth.' That's all. PaV
Let me know what the point is, I guess. wd400
wd400: You side-stepped my point: clever, but, nevertheless, evasive. PaV
PaV writes,
What is it, exactly, that you KNOW about biology?
FWIW, I know quite a bit about some aspects of biology, especially anatomy and physiology, but very little about biochemistry. I am not qualified to talk about the structure of proteins, for instance. I will once again note that I have been discussing pure probability, and haven't made any comments about anything to do with biology. (I did write one sentence at 207 where I agreed with most of a post of wd400, but even there I am mainly thinking about general probability models applied to the world, not about any particular biological knowledge that I have.) jdk
Oops, I just saw PaV's response at 214 to Dave, so let me skip what I just wrote, and reply this way. Dave wrote,
Do you agree that in Kitcher’s experiment, getting the clubs in order ace, 2, 3, …, jack, queen, king and getting the clubs in order king, queen, jack, …, 2, ace would be counted as distinct events?
and PaV replied,
Only if you’re going to use a permutation calculation.
I'm not sure that PaV answered the question. The question is: is a permutation calculation the correct calculation for the situation we have been discussing, where 13 cards are dealt in order? In other posts you talked about bridge hands, or throwing coins all at once, or "whether the Jack of Diamonds was the first, or the fourth, or the thirteenth, doesn't matter," but all of those are situations in which a permutation calculation would be incorrect. jdk
PaV writes,
The Kitcher example involved dealing the first 13 cards. Whether the Jack of Diamonds was the first, or the fourth, or the thirteenth, doesn't matter. Apparently you have difficulty seeing this. The first card dealt—whatever it is—has a chance of being dealt of 1 in 52; and the second card, no matter what it is, has a chance of 1 in 51. Why? Because there are only 51 cards left after the first one is dealt. I don't know why you can't see this.
Of course I understand the calculation for dealing the cards in order: the number of possible hands is 52 • 51 • 50 • ... • 40 = 4 x 10^21. (The formula for this is 52!/39!.) However, hands which have the jack of diamonds first are different from hands that have the jack of diamonds 4th, or 13th, or in any other position. So let me ask PaV a question, based on Dave's post at 212, that should help clarify our mutual understandings: Let A = ace, 2, 3, 4 ... king of spades, dealt in that order. Let B = king, queen, jack, ... ace of spades in that order (the same cards as A but in the opposite order). PaV, in respect to the 4 x 10^21 possible hands that we have been discussing, are A and B the same hand or different hands? jdk
There is no crocoduck, so no. wd400
wd400: Do the "croco-duck" and the 'duck-billed platypus' share a common ancestor? PaV
croco-duck (typo in the first case) is an example of a particularly silly creationist confusion. wd400
wd400: Try to say something of some significance. It's not like you to post rot. Your "corcoduck" is meant to convey something? I'll give you trillions of trillions of years, and new protein domains will not arise. PaV
daveS: Only if you're going to use a permutation calculation. PaV
Of course they don’t think this is what happened, because if they did, then they would have to abandon evolution.
I mean, even creationists don't actually think proteins were created by one-off draws from an urn full of amino acids. Do they?
Doug Axe’s work shows just how isolated protein domains are
Axe tried to make a corcoduck -- evolving one modern protein into another one. Again, no one thinks this is how proteins came about. Modern proteins are the result of (in some cases) billions of years of evolutionary divergence, it is no great surprise they are quite separate from each other after all this time. wd400
PaV, Do you agree that in Kitcher's experiment, getting the clubs in order ace, 2, 3, ..., jack, queen, king and getting the clubs in order king, queen, jack, ..., 2, ace would be counted as distinct events? daveS
jdk:
Hi wd400. I’ve only been commenting on the pure probability issue, but I agree with you all that wrote in 205.
And to what, and why, do you agree? What is it, exactly, that you KNOW about biology? PaV
jdk:
I'm not sure how clear he is that he has changed the situation considerably, just as he doesn't seem to understand the difference between receiving the components of an event in order (cards, dice, coins) and receiving them all at once so there is no discernible order.
I believe that it is you who are wrong about this. The Kitcher example involved dealing the first 13 cards. Whether the Jack of Diamonds was the first, or the fourth, or the thirteenth, doesn't matter. Apparently you have difficulty seeing this. The first card dealt---whatever it is---has a chance of being dealt of 1 in 52; and the second card, no matter what it is, has a chance of 1 in 51. Why? Because there are only 51 cards left after the first one is dealt. I don't know why you can't see this. PaV
wd400:
No one thinks biological proteins evolved as they are from a one-off draw from an urn full of amino acids!
Of course they don't think this is what happened, because if they did, then they would have to abandon evolution. Doug Axe's work shows just how isolated protein domains are. And this fact tells us that it is a sheer impossibility to move from one domain to another simply by chance. This would require ALL protein domains to have been present from the beginning. And, now, pray tell, how did THAT come about? By chance? PaV
jdk/daveS/Origines:
I think that PaV is saying that when one views 1,000 coin flips as isolated events
This is what I meant. DaveS wrote: match “one letter at a time”. This IS the idea. And this IS how you "climb" Mt. Improbability. PaV
Hi wd400. I've only been commenting on the pure probability issue, but I agree with all that you wrote in 205. I find it odd that PaV just now brings up the situation where you try to match the component parts of the pattern one at a time, when that is unlike anything that has been discussed on this thread, including the original example of dealing 13 cards in order. I'm not sure how clear he is that he has changed the situation considerably, just as he doesn't seem to understand the difference between receiving the components of an event in order (cards, dice, coins) and receiving them all at once so there is no discernible order. jdk
WD400 #205
WD400: If fitness landscapes are relatively smooth for a given trait, then evolution can be much more like PaV's example than the probability of particular bridge hands. There is no need to get every amino acid right in one draw, as each individual change increases the fitness of the creature that has it, meaning they are likely to become fixed in the population and all individuals in subsequent generations start with improvements from previous ones, which they can in turn improve.
Here is where 'conservation of information' becomes relevant. In this article Dembski cites Kauffman saying:
If mutation, recombination, and selection only work well on certain kinds of fitness landscapes, yet most organisms are sexual, and hence use recombination, and all organisms use mutation as a search mechanism, where did these well-wrought fitness landscapes come from, such that evolution manages to produce the fancy stuff around us?
Dembski goes on saying:
Kauffman’s observation here is entirely in keeping with conservation of information. Indeed, he offers this observation in the context of discussing the No Free Lunch theorems, of which conservation of information is a logical extension. The fitness landscape supplies the evolutionary process with information. Only finely tuned fitness landscapes that are sufficiently smooth, don’t isolate local optima, and, above all, reward ever-increasing complexity in biological structure and function are suitable for driving a full-fledged evolutionary process. So where do such fitness landscapes come from? Absent an extrinsic intelligence, the only answer would seem to be the environment …. Okay, so the environment supplies the information needed to drive biological evolution. But where did the environment get that information? From itself? The problem with such an answer is this: conservation of information entails that, without added information, biology’s information problem remains constant (breaks even) or intensifies (gets worse) the further back in time we trace it. The whole magic of evolution is that it’s supposed to explain subsequent complexity in terms of prior simplicity, but conservation of information says that there never was a prior state of primordial simplicity — the information, absent external input, had to be there from the start. …. But without intelligent input, conservation of information implies that as we regress biological information back in time, the amount of information to be accounted for never diminishes and may actually increase. .
I believe Dembski’s point is clear: assuming smooth fitness landscapes, as Dawkins and wd400 do, is, in effect, an attempt to hide the problem of origin of information in the ‘environment’. Origenes
I suspect PaV's example is motivated by Dawkins' "climbing Mt Improbable" metaphor. Dawkins makes the point that using some naive specification might show that a biological sequence or trait is very improbable. The silliest examples being "this is 100 amino acids long, and there are 20 amino acids so there is only a 1/(20^100) chance this protein exists!" Dawkins' point is that biological traits are not the result of single draws from a probability distribution (like the card examples), but evolution along lineages. If fitness landscapes are relatively smooth for a given trait, then evolution can be much more like PaV's example than the probability of particular bridge hands. There is no need to get every amino acid right in one draw, as each individual change increases the fitness of the creature that has it, meaning they are likely to become fixed in the population and all individuals in subsequent generations start with improvements from previous ones, which they can in turn improve. Even then, working out the probability of arriving at a particular sequence is not a case of just adding probabilities (it requires some quite fancy calculus actually). And, of course, there is much more wrong with the "1/20^(protein_length)" specification. No one thinks biological proteins evolved as they are from a one-off draw from an urn full of amino acids! wd400
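To make the contrast concrete, here is a minimal Python sketch. It is not taken from Dawkins, Dembski, or Perakh; the target string, alphabet, and keep-what-does-not-hurt acceptance rule are arbitrary illustrative assumptions. It only shows why a cumulative, stepwise search needs vastly fewer trials than a single all-at-once draw.

import random
import string

# Illustrative assumptions: a short target, a 26-letter alphabet, and a
# "keep any single-letter change that doesn't reduce the match count" rule.
# This is a toy search, not a model of any real evolutionary process.
TARGET = "METHINKS"
ALPHABET = string.ascii_uppercase

def random_guess():
    return "".join(random.choice(ALPHABET) for _ in TARGET)

def cumulative_search():
    """Change one randomly chosen position at a time; keep changes that
    do not reduce the number of matching letters."""
    current = random_guess()
    steps = 0
    while current != TARGET:
        pos = random.randrange(len(TARGET))
        candidate = current[:pos] + random.choice(ALPHABET) + current[pos + 1:]
        if sum(a == b for a, b in zip(candidate, TARGET)) >= \
           sum(a == b for a, b in zip(current, TARGET)):
            current = candidate
        steps += 1
    return steps

if __name__ == "__main__":
    random.seed(0)
    trials = [cumulative_search() for _ in range(20)]
    print("average steps, cumulative search:", sum(trials) / len(trials))
    # A single all-at-once draw matches with probability (1/26)**8,
    # i.e. roughly one success per 2.1e11 draws, far beyond what the
    # cumulative search needs.
    print("expected draws for a one-off match:", 26 ** len(TARGET))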
jdk @200 I think that PaV is saying that when one views 1,000 coin flips as isolated events — not as a sequence — then you can focus on each result individually. Matching the first flip takes only two flips on average. And so on. 2,000 flips (on average) to match all the isolated events. He contrasts these isolated events with the situation when one considers the events to form a whole. Now, due to our decision to lump the events together, we need 2^1000 flips to match the pattern. In #186 PaV writes: "It's only so when the "mind" decides to "group together" individual events into a "dependent" whole. ... Notice that it is the "mind" doing this, and deciding this." PaV, in effect, points out the role of the mind wrt probabilities, which is profoundly interesting. - - - - BTW Is no one going to answer my question at the bottom of post #194? edit: DaveS beat me to it in #201 Origenes
re 198: I see no place where Perakh mentions throwing 100 dice. jdk
Hmmm. That is an entirely different scenario than any we have discussed in this thread. jdk
jdk, I interpreted PaV's 2000 flips calculation to apply to a scenario where you are given a pattern of length 1000 to match "one letter at a time". For example: HTTHTH ... You flip a coin until it matches the first letter, "H". Once that is achieved, you flip the coin until it matches the second letter, "T". Then you go on to the third letter, try to match the "T", and so on. Matching one letter at a time would be expected to take 2000 flips on average, so I'm guessing that's what he had in mind. daveS
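For anyone who wants to see the gap between the two readings, here is a small simulation, scaled down to a length-10 pattern since 2^1000 is far out of reach; the pattern, seed, and run counts are arbitrary choices for illustration.

import random

random.seed(1)
N = 10
pattern = [random.randrange(2) for _ in range(N)]

def flips_matching_one_at_a_time(pattern):
    """Flip until the current position matches, then move to the next."""
    flips = 0
    for target in pattern:
        while True:
            flips += 1
            if random.randrange(2) == target:
                break
    return flips

def tries_matching_all_at_once(pattern):
    """Generate whole runs of N flips until the entire sequence matches."""
    tries = 0
    while True:
        tries += 1
        if [random.randrange(2) for _ in range(N)] == pattern:
            return tries

runs = 2000
avg_flips = sum(flips_matching_one_at_a_time(pattern) for _ in range(runs)) / runs
avg_tries = sum(tries_matching_all_at_once(pattern) for _ in range(100)) / 100

print(f"one position at a time: about {avg_flips:.1f} flips (theory: 2*N = {2*N})")
print(f"whole sequence at once: about {avg_tries:.0f} tries (theory: 2^N = {2**N})")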
Also, there are some confusions in what you wrote: First, if you flip the coins 1000 times, there are 2^1000 possibilities. If you already have a pattern you are trying to match, the probability of a match is 1/2^1000. Therefore, on average (though not with certainty), you would need to repeat the 1000-flip sequence 2^1000 times in order to get a match. This calculation involves multiplying, not adding: 1/2 • 1/2 • ... 1000 times. So your statement that "So, on average, you need only 2000 flips to match the pattern you find." is wrong. If you flip the coins 2000 times, you'll only get two of the possible 1000-flip events, and of course you are very unlikely to match your pattern if you look at only two events. Also, as I have explained, but you don't seem to understand, if you flip all the coins AT ONCE, you don't know what order they came in, so that is different than flipping the coin 1000 times in succession. But if you flip them in order, you are correct that it will take, on average, 2^1000 flips to match the pattern. jdk
jdk:
Give me an example of the difference between linked and added.
You have the results, supposedly, of 1,000 coin flips. Then, on average, to get the first position, you need two flips; and two flips, on average, for each of the other 999 positions. So, on average, you need only 2000 flips to match the pattern you find. Whereas, collectively, the pattern, considered as a whole, would need 2^1000 flips to match the pattern. IOW, if you flipped 1000 coins AT ONCE, it would take, on average, 2^1000 of those group flips to match the pattern. PaV
jdk: Perakh begins with a sequence of 100 4's. He says that this is highly unlikely. Then, he says, what about a sequence of 10 4's? Still unlikely, just like the 100 4's. That's where he gets his "psychological" side of things. PaV
Also, PaV, at 194 you write,
And that includes Perakh, who wants to divide the sequence of 100 4’s into 10 sequences of 10 4’s.
I don't see where he does that. The example you quoted from his book is just about ten throws. Can you show where he discusses throwing 100 4's? jdk
PaV, you write,
You’d make a lousy evolutionary biologist if you think the probabilities are linked and not added.
Leaving out the biology, and just talking probability, give me an example where probabilities are added. And what does "linked" mean? Give me an example of the difference between linked and added. jdk
Jdk: So the interesting issue here is to understand better why C has significance to us, to use the word I am suggesting, and A doesn't. I think the key issue here is, as you point out, that human beings are good at and inclined towards pattern recognition. When we recognize a pattern, we then attach a significance to that throw that we don't attach to a throw such as A above, which has no clear pattern at all. Pattern recognition is a very important part of our cognitive ability to understand the world, and one of the tools we use to build a base of knowledge about the world. ... This is the "human" part of the situation that goes beyond the pure probability: the part that adds the "dual" aspect you speak of.
I would like to note that something can have significance because we understand that it requires knowledge. A sequence of prime numbers is an obvious example and my sequence in post #194 is another.
THE UNIVERSAL DESIGN INTUITION Tasks that we would need knowledge to accomplish can be accomplished only by someone who has that knowledge. In other words, whenever we think we would be unable to achieve a particular useful result without first learning how, we judge that result to be unattainable by accident. ... I use the term universal design intuition—or simply design intuition—to refer to this common human faculty by which we intuit design. ... I intend to show that the universal design intuition is reliable when properly used ... The design intuition is utterly simple. Can you make an omelet? Can you button a shirt? Can you wrap a present? Can you put sheets on a bed? Tasks like these are so ordinary that we give them little thought, and yet we weren’t born with the ability to do them. Most of the training we received occurred so early in life that we may struggle to recall it, but we have only to look at a young person still in the training years to be reminded that all of us had to be taught. Whether we taught ourselves these skills or were taught by others, the point is that knowledge had to be acquired in the form of practical know-how. Everyday experience consistently shows us that even simple tasks like these never accomplish themselves. If no one makes breakfast, then breakfast goes unmade. Likewise for cleaning up after breakfast, for making the bed, and so on. [Douglas Axe, ‘Undeniable’, Ch.2]
Origenes
PaV: What Perakh has done, perhaps unwittingly, is to say that any group of ten rolls of a die ALL have a probability of 1 in 10^6. But that’s not so. It’s only so when the “mind” decides to “group together” individual events into a “dependent” whole. … So, contra Perakh, what the “mind” does, or doesn’t do, is essential: and is not simply a “psychological” reaction. It is knowledge at work using information that the “mind” has acquired over the lifetime of the individual.
I would like to note that in life we are confronted with real wholes which cannot be arbitrarily cut up and turned into newly arranged wholes at will. Also with numbers we can suddenly recognize a pattern and infer design as the best explanation:
2, 1, 3, 4, 2, 6, 5, 1, 6, 2, 3, 5, 3, 1, 4, 1, 1, 2, 4, 2, 6
Is this sequence best explained by design or the throw of a die? Origenes
jdk: You'd make a lousy evolutionary biologist if you think the probabilities are linked and not added. That's all we get from them. And that includes Perakh, who wants to divide the sequence of 100 4's into 10 sequences of 10 4's. PaV
Hi PaV. This is really long, but I have been comprehensive about some things, and tried to illustrate with examples to make everything really clear. I have no idea if anyone will really read all this with the intent of digesting it, but I'm interested, so I did it anyway! :-) Way back at 148, you wrote,
jdk, [you wrote],
I think what is confusing is that there is a difference between the event, the specification, and the significance of the event. The significance lies outside the realm of the probability situation itself, but rather in some broader context.
I think that you’re onto something here. This is, in a way, what we’re wrangling about. But, my hunch is that there is something deeper lurking here. For example, all of this discussion is bringing me to the point of view that probability acts like a ‘dual space,’ or, at the very least, there’s some kind of duality associated with it. And that this duality, then, can give rise to misunderstandings. Hopefully some clarity will emerge.
I agree that this part of the subject is worth exploring. Our human interpretation of events adds a layer of significance and meaning that goes beyond the purely theoretical probability considerations. It is also closely associated with the idea of specifications, at least from one point of view. This is the subject Perakh was addressing in the section you quote, which he titled Psychological Aspects of Probability. I'm willing to use his example, but I'd like to use my formulation about there being some significance to us, from a broader context than just the pure probability. I think this approach is consistent with your use of the idea of "duality": that we compare the probabilistic event with other things we know about the world to determine the significance of an event. I'm not very interested in responding to Perakh's thoughts on the matter, though: I would rather make my own points, and hear yours and others here, than worry much about exactly what Perakh said.

Let me first be a little formal here: The situation is that we roll a fair die 10 times, and record the numbers rolled in order. There are thus 6^10 (60,466,176) possible events that can occur. Call each of these events a "throw." The sample space is this entire set of possible throws. Call the sample space S. The number of elements in the sample space is 60,466,176. Call this number n. All the events are equiprobable, so the probability of any particular throw is 1/n.

So consider the following two throws, per Perakh: A = (3, 5, 6, 2, 6, 5, 6, 4, 1, and 1) C = (4, 4, 4, 4, 4, 4, 4, 4, 4, and 4) It is a true fact that both of these have a probability of 1/n. From the point of view of pure probability, these are equivalent events: C is just one of n possible events. However, obviously, we would respond to C very differently than A. To A our response would be, "Well that's one of the many results I could get - nothing special here." But to C our response would be, "That's hard to believe. I'm inclined to think something else is going on here besides pure chance." (I might actually have stronger feelings than that, and feel pretty sure some kind of cheating has gone on.)

So the interesting issue here is to understand better why C has significance to us, to use the word I am suggesting, and A doesn't. I think the key issue here is, as you point out, that human beings are good at and inclined towards pattern recognition. When we recognize a pattern, we then attach a significance to that throw that we don't attach to a throw such as A above, which has no clear pattern at all. Pattern recognition is a very important part of our cognitive ability to understand the world, and one of the tools we use to build a base of knowledge about the world. (There is a whole subset of child psychology that studies the development of this skill in children.) This is the "human" part of the situation that goes beyond the pure probability: the part that adds the "dual" aspect you speak of.

However, merely seeing a pattern is not enough to make us question chance. As I have mentioned, if the sample space is small, where nothing is improbable, even if we see a pattern, we might very well not question chance. So, here is a fairly extensive analysis of a simpler situation: throwing three dice. Once we understand the issues well, we can expand to the much larger sample space arising from throwing 10 dice. If I throw three dice, and they come up all 6's, that has a 1/216 chance (not 1/108 as you mention).
If we've played enough dice games with three dice, we've seen that happen, and will not be unduly surprised. (And note, the probability of all three dice being the same number, like 1 1 1, 2 2 2, etc., is only 1/36, so that is pretty common.) I think the main issue here is the question of what proportion of the events exhibit a pattern to which we attach a significance. For instance, we could list all the possible events when we throw three dice: there are only 216 of them. Suppose we decided to mark which of those 216 were significant because of having a pattern. For instance, the six events where all three dice are the same would qualify as having a pattern. What about 1 2 3? This is just as likely as 6 6 6, and it has a pattern. Would we consider it as significant? What about 2 4 6? What about 1 6 1? What about 4 2 6 (all even numbers)?

I have two points to make about these examples: 1. There is not a clearcut dividing line between throws that exhibit a pattern and those that don't. We might even be tempted to devise a "significance scale", with 5 = very significant pattern (6 6 6), 3 = somewhat significant pattern (1 6 1, perhaps, or at least 2 4 6) and 1 = no significant pattern (3 5 2).

2. Suppose we did categorize all 216 possibilities, picking some criteria to separate all the significant patterns from those throws lacking a significant pattern. What percent of the sample space is events that are significant? Let the set SP = all the throws which have a significant pattern. Let m = the number of events in SP. The big question is what is the ratio m/n? What percentage of the possible throws are in SP? That is, when we throw three dice, what is the probability m/n that we will get a throw that exhibits a significant pattern? My intuitive guess is that the percentage is pretty high. If you count all the throws where the dice are the same, all the straights going forward and backward, and all the "skippy straights" such as 2 4 6, we already have almost 10% of all possibilities as significant. Therefore, when we throw 6 6 6 we are a bit surprised, but not amazed, not solely because the odds are 1/216, but because we have a pretty high probability of throwing some significant throw or other. For instance if m/n = 10% (which includes just the throws listed above), the odds of throwing a significant throw vs throwing a non-significant throw are 1 to 9, and those are reasonable odds. The 10% is really what is important, not the 1/216 ratio, if all we are trying to explain is getting a throw with a significant pattern as opposed to getting some specific, particular pattern.

Two more notes to add to this explanation. Since there is no clearcut distinction between significant and non-significant throws, we might, as I suggested above, apply a scale from 5 to 1. This is then a place where we could calculate an expected significance value: take the number of 5's times the percentage of throws that are labeled 5's + the number of 4's times the percentage of throws that are labeled 4's + etc. Then we would have an expected significance value EV for throwing three dice. I have very little idea what EV might be: the very first step of assigning a value to each throw would involve some judgments upon which people would disagree. However, if all throws were insignificant (no one saw any patterns at all), EV would equal 1. If half the throws were insignificant, and the other 1/2 average about 3, for instance, EV would equal 2. If every throw was very significant (which isn't the case at all), EV would equal 5.
So we see that the lower EV is, the less likely you will get a significant throw, so the more likely that you will be surprised if you do get one.

Last point: an important one. The dice already contain a pattern built in: the numbers from 1 to 6. Most of the patterns we see arise from the relationship of the numbers themselves: they are ordered, and they group into known sets of evens and odds. Suppose instead we had just six symbols that had no order and could not be categorized into any groups. This would reduce the number of significant hands because there would be fewer patterns to recognize. All three dice with the same symbol, such as # # #, would stay the same, a 5. But there would be nothing to compare to straights or evens in the above example. We might recognize $ % $ as a pattern, but that would be no different than 5 1 5 in the regular dice, and that would be no more than a 2, I think. Therefore, the expected significance value EV for this situation would be lower than that of the regular dice, because there would be fewer significant hands to compare to. That is, the probability of getting & & & would be the same as the probability of 6 6 6, but & & & would possibly surprise us more because the dice with symbols have fewer significant throws.

To summarize, before going on to ten dice: We definitely see patterns, and we intuitively (even if we are not very sophisticated about theoretical probability) make an estimate of the ratio of significant events in comparison to the total number of possible events. If that ratio is relatively large (say 10%) we are unlikely to infer something other than chance no matter what throw we get. And another, more precise way to measure significance would be to assign a significance value to every throw, and then create an expected significance value EV. The higher the EV, the more likely we will get significant hands, and thus the less likely we are to be surprised when we get one.

======== Now, to wrap this up, let's jump to 10 dice. Without going through any more analysis, I am virtually certain that the ratio of throws that exhibit a pattern in comparison to the total 60 million possible throws would be much, much smaller than with three dice. That is, there would be a much larger percentage of total hands that did not exhibit much, if any, pattern: they would be 1's on our significance scale. Because of this, when we do get a significant hand such as all 4's, or 5 5 5 5 5 6 6 6 6 6, it is a much more unlikely event than getting a non-significant hand. So to return to the example at the start of this post. A = (3, 5, 6, 2, 6, 5, 6, 4, 1, and 1) and C = (4, 4, 4, 4, 4, 4, 4, 4, 4, and 4) both are equally improbable from a pure probability point of view. But A is a member of a very large number of non-significant throws and C is a member of a much, much smaller set of significant throws. It is this ratio that we are thinking about, intuitively, when we declare that C could not have happened by chance but A did. We are looking for patterns to which we attach significance vs those we don't. If I specified a particular throw 3 1 4 3 1 6 4 5 2 4, I would have a 1 out of 60 million chance of getting that throw. But if I just specified that I was going to throw a non-significant throw, I would have a very good chance of matching that specification. Conversely, therefore, the chances of throwing a significant throw are very small.
It's not the absolute probability that we are interested in when we look for significance, it's the ratio of events that are and aren't significant that is ... well, significant. Way too much typing. I'm done, and shot the whole evening! jdk
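jdk's "almost 10%" estimate for three dice is easy to check by brute force. The sketch below formalizes his own examples (all three alike, straights up or down, and "skippy straights") as one possible set of criteria; where exactly to draw the line remains a judgment call, as he says.

from itertools import product

# One way to formalize the "patterned" throws named above: all three dice
# alike, straights in order (up or down), and skippy straights such as 2 4 6.
def looks_patterned(throw):
    a, b, c = throw
    if a == b == c:
        return True                          # e.g. 6 6 6
    if b - a == c - b and abs(b - a) in (1, 2):
        return True                          # 1 2 3, 3 2 1, 2 4 6, 5 3 1, ...
    return False

throws = list(product(range(1, 7), repeat=3))
m = sum(looks_patterned(t) for t in throws)
n = len(throws)
print(f"{m} of {n} ordered throws look patterned: m/n = {m/n:.1%}")  # 18 of 216, about 8%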
PaV @186 @189 Connecting the dots. Very interesting. Origenes
PaV, you write,
The basic idea is that when you see a pattern, then the pattern is produced by a series, or sequence, of events, each of which has a probability, which probabilities must be multiplied together.
This is true whether you see a pattern or not. This is just how probability works: when successive independent events occur, you multiply the probabilities. As I said before, there is nothing in our current discussion where adding probabilities applies. You write,
It's like the example I gave before: if you throw one die, a hundred times in a row, we know the probability of each throw/event is low; if, however, you throw 100 dice 'all at once,' then each throw/event is highly improbable. It's the probability associated with the first die, plus that of the second, plus that of the third, etc. So, the probabilities are multiplied.
This is true (with the exception noted below). It is exactly the principle we have been using in all our examples. However, as a clarification, since we are considering the order of the throw, we need to either throw a die 100 times successively, or number the 100 dice so we know which is die 1, which is die 2, etc. If we don't care about order, we can throw all 100 at the same time. However, we have been talking about order in all our examples, so I think we need to talk about throwing a die many times, one throw at a time, not throwing 100 at the same time. By the way, I'm working on a longer post that may shed some light on our topic. And I must repeat, I am not at all even thinking about how this might apply to "cellular realities" or any other aspect of biology. But I am interested in why we see some throws as significantly improbable, and others not, and that is what I'm working on trying to explain. jdk
jdk: I'm fully capable of getting things backwards: it's a sort of dyslexia in remembering things. But that said, the basic idea is that when you see a pattern, then the pattern is produced by a series, or sequence, of events, each of which has a probability, which probabilities must be multiplied together. The basic way of "climbing Mt. Improbable" is to take 100 steps, with each step having nothing to do with the other steps. Or, you might say, one mutation at a time. The whole idea of irreducible complexity is to say that this isn't going to happen; that to arrive at the complexity of proteins, single steps won't get you there. So, it's all about: do you "add," or do you "multiply"? My insight is that when a "pattern" is glimpsed, then those single steps become fused together. It's a "probabilistic ensemble," to borrow from thermodynamics. It's like the example I gave before: if you throw one die, a hundred times in a row, we know the probability of each throw/event is low; if, however, you throw 100 dice 'all at once,' then each throw/event is highly improbable. It's the probability associated with the first die, plus that of the second, plus that of the third, etc. So, the probabilities are multiplied. I think common sense should tell you that when it comes to cellular realities, the improbabilities are way too high. Perakh, and his acolytes, simply tell us to put common sense to one side. BTW: most of what you posted is available online; but, it is a helpful summary, having all those various elements in one condensed setting. PaV
Here is a link to a notesheet I used in my Probability and Stats chapter when I taught high school Pre-Calculus. It might make a good reference for anyone interested. Probability Notesheet

It is, I think, a succinct summary of all the concepts we have been using in this discussion, and more. Google will calculate factorials and combinatorial numbers for you: type 5! or 5 choose 2. I don't think it computes permutations, so to get 52 pick 13, you have to use the formula 52!/39! And here it all is, although some of the formatting, such as indents, doesn't translate well to the screen.

Pre-Calculus: Chapter 12 Notes on Counting and Probability

1) Definition of variables: E and F are events. An event is a particular occurrence that can happen in a given situation. n = N(E) is the number of ways that event E can happen. p = P(E) is the probability that event E will happen. S = the sample space: the total set of possible events that can happen in a given situation. Therefore, N(S) is the total number of possible events in a given situation.

2) The AND rule (the multiplication principle): N(E and F) = N(E) • N(F); P(E and F) = P(E) • P(F). To find the total number of ways (or probability) that event E will happen and then event F will happen, you multiply the number of ways (or probabilities) that each will happen.

3) The OR rule (the addition principle): N(E or F) = N(E) + N(F); P(E or F) = P(E) + P(F). To find the total number of ways (or probability) that event E will happen or event F will happen, you add the number of ways (or probabilities) that each will happen. (This applies only if the events are mutually exclusive.)

4) The NOT rule (the complement principle): P(not E) = 1 – P(E). Either an event happens (E) or it doesn't happen (not E). The probability that the event won't happen is 100% minus the probability that E will happen. The event is called not E, and the probability is called the complement of P(E). For example, if the probability of throwing a 5 on a die is 17%, then the probability of not throwing a 5 is 83%.

Counting Principles

5) Counting with Replacement: N(E repeated k times) = n^k. Assume Event E can happen in n ways. Assume also that if E is repeated, it can again happen in n ways (this is called with replacement). If event E is repeated k times then the total number of possible events is n^k.

6) Counting without Replacement: If the number of ways event E can happen is reduced by one (1) each time it repeats, then we say that the event happens without replacement. There are three situations.

a) n Factorial (n!): n! is defined as n (n – 1) (n – 2) • ..... • 1. For example, 3! = 3•2•1 = 6. This is the number of ways that a set of n objects can be arranged in order.

b) Permutations (nPk) (when order counts): nPk = n!/(n – k)! This is read "n Pick k". This is the number of ways that a subset of k objects can be picked from a set of n objects, in order.

c) Combinations (nCk) (when order does not count): nCk = n!/(k! • (n – k)!) This is read "n Choose k". This is the number of ways that a subset of k objects can be picked from a set of n objects, without regard to order. The combinatorial numbers nCk can be summarized in Pascal's Triangle.
nC0 = nCn = 1 (only 1 way to choose none or all)
nC1 = nCn–1 = n (n ways to choose 1, or all but 1)
nCk = nCn–k (choosing some (k) is the same as not choosing the rest (n–k))

7) Definition of Probability: Probability = (number of ways the desired event can happen)/(total number of ways all possible events can happen); P(E) = N(E)/N(S). (S = Sample space: all the possible events.)

Two Types of Complex Probability Situations

8) Binomial Probability: BP(n, k) = nCk • p^k • q^(n-k). Variables: p = probability of a success on one trial; q = probability of a failure on one trial (q = 1 – p); n = number of trials; k = number of successes.
Example: A baseball player gets a hit 30% of the time. What is the probability that she will get 3 hits in 5 at-bats? p = 30% chance of getting a hit in one at-bat; q = 70% chance of not getting a hit in one at-bat; n = 5 at-bats; k = 3 hits. P(3 hits) = 5C3 • 30%^3 • 70%^2 = 13%.

9) Combinatorial Probability: Assume a set of n objects contains a subset of m objects. Assume k objects are chosen from the set of n objects. What is the probability that all of the objects will be chosen from the subset of m objects? The combinatorial probability formula is CP(n, m, k) = mCk/nCk.
Example: A group of 20 people contains 12 women. Three people are chosen at random. What is the probability that they will all be women? P(3 women) = 12C3/20C3 = 220/1140 = 19%.

Maybe someone will find all this useful. jdk
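For anyone who would rather check the notesheet's two worked examples by machine, here is a short Python rendering of the formulas using the standard library's comb and perm functions; the numbers are the ones from the examples above.

from math import comb, perm

def binomial_probability(n, k, p):
    """nCk * p^k * (1-p)^(n-k): probability of exactly k successes in n trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def combinatorial_probability(n, m, k):
    """mCk / nCk: probability that k items drawn from n all come from a subset of m."""
    return comb(m, k) / comb(n, k)

# 52 pick 13: ordered deals of 13 cards from a 52-card deck (about 4 x 10^21)
print(f"52 pick 13 = {perm(52, 13):.2e}")

# Baseball example: 3 hits in 5 at-bats with a 30% batting average (about 13%)
print(f"P(3 hits in 5 at-bats) = {binomial_probability(5, 3, 0.30):.0%}")

# Committee example: 3 people drawn from 20, all from the 12 women (about 19%)
print(f"P(3 women) = {combinatorial_probability(20, 12, 3):.0%}")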
Hi PaV. I appreciate your lengthy post, and have some relevant things to say, I think, which I hope to get to later today. But one sentence jumped out at me:
When a “pattern” is detected, then “independent events” become “dependent events,” and the probabilities are no longer added but MULTIPLIED!
I don't believe this is correct. The basic rule is that if you want the probability of A or B you add the probabilities, but if you want the probability of A and B, as in successive independent events, you multiply the probabilities. The probability of throwing a 1 or a 6 is 1/6 + 1/6 = 1/3. For throwing a 1 and a 6 (either successively, or with two dice which are distinct from each other), the probability is 1/6 • 1/6 = 1/36. These are basic rules for computing probabilities. jdk
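A brute-force enumeration confirms both numbers; this is only a sanity check of the two rules jdk states, nothing more.

from itertools import product

# OR rule on a single throw: P(1 or 6) = 1/6 + 1/6 = 1/3
single = range(1, 7)
print(sum(1 for x in single if x in (1, 6)) / 6)          # 0.333...

# AND rule over all 36 ordered outcomes of two throws:
# P(first throw is 1 AND second throw is 6) = 1/6 * 1/6 = 1/36
outcomes = list(product(range(1, 7), repeat=2))
print(sum(1 for a, b in outcomes if a == 1 and b == 6) / len(outcomes))  # 0.0277...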
jdk: I looked on my computer to see if I had written anything about Perakh's treatment of ID. And, I had. Looking over it, from a "probabilistic" point of view, I imagine we could easily say that Perakh had made no errors. [I do remember, however, that in his book he criticized how Behe used probabilities, and, on one occasion, did so in a way that contradicted what he had written in an appendix. But, no need to go into that. I haven't the time, nor disposition. And, again, it wasn't that he got his probability theory wrong, but that he was being dishonest (it's also possible that he simply made a mistake)] However, along the lines of what I had said earlier about there being a kind of "duality" when it comes to probabilities (remember, e.g., the cards written in invisible ink, and then the markings appearing. When there are no markings the actual 'event' of dealing the cards tells us nothing; it is only when the markings appear that we can, and do, make distinctions. This points to how the mind is involved in probabilities. We'll run into that in what I'm going to post from my Word file). And it is here that I believe Perakh misses the mark: From Chapter 13, Unintelligent Design
Consider an experiment with a die, where events are sets of ten trials each. We assume an honest die as well as independence of outcomes. If we toss the die once, each of the six possible outcomes has the same chance of happening, 1/6. Assume that in the first trial the outcome was, say, 3. Then we toss the die the second time. It is the same die, tossed in the same way, with the same six equally probable outcomes. To get an outcome of 3 is as probable as any of the five other outcomes. The tests are independent, so the outcome of each subsequent trial does not depend on the outcomes of any of the preceding trials. Now toss the die in sets of ten trials each. Assume that the first event is as follows: A (3, 5, 6, 2, 6, 5, 6, 4, 1, and 1). We are not surprised in the least since we know that there are 6^10 (that is 60,466,176) possible, equally probable events. Event A is just one of them and does not stand alone in any respect among those over sixty million events, so it could have happened in any set of ten trials as well as any other of those sixty million variations of numbers. Let us assume that in the second set of ten trials the event is B (6, 5, 5, 2, 6, 3, 4, 1, and 6). Again, we have no reason to be surprised by such a result since it is just another of those millions of possible events, and there is no reason for it not to happen. So far the probability theory seems to agree with common sense. Assume now that in the third set of ten trials the event is C (4, 4, 4, 4, 4, 4, 4, 4, 4, and 4---the "all 4s" event described earlier in the chapter). I am confident that in such a case everybody would be amazed, and the immediate explanation of that seemingly "improbable" event would be the suspicion that either the die has been tampered with or that it was tossed using some sleight of hand. While cheating cannot be excluded, this event does not necessarily require such an assumption. Indeed, what was the probability of event A? It was one in over sixty million. Despite the exceedingly small probability of A its occurrence did not surprise anybody. What was the probability of event B? Again, only one in over sixty million but we were not amazed at all. What was the probability of event C? The same one in over sixty million, but this time we are amazed. Why does "all 4s" seem amazing? Only for psychological reasons. It seems easier to assume cheating on the part of the dice-tossing player than the never-before-seen occurrence of all 4s in ten trials. What is not realized is that the overwhelming majority of events other than this one were never seen, either. There are so many possible combinations of ten numbers, composed of six different unique numbers, that each of them occurs extremely rarely. The set of ten identical numbers seems psychologically to be "special" among combinations of different numbers, but for probability theory this set is not special. To view an event as special means abolishing the premise of the probabilistic estimate---the postulate of a fair die. . . . There is no doubt that the viewpoint of probability theory is correct, even in the face of contradictory common sense. Such a human psychological reaction to an improbable event such as ten identical outcomes in a set is as wrong as the suggestion to a pilot of a spacecraft lagging behind to increase her speed if she wishes to overcome a craft ahead of hers in orbit.
I suppose Perakh means to say by the 'spacecraft' analogy, that by "increasing" one's speed, you won't "overcome" the other craft, but will, in fact, simply enter some different orbit. IOW, reality is "counter-intuitive." This is, of course, the evolutionist's argument, as though intuition was left behind with the discovery of Copernicus. If this is what Perakh means, it's almost a non sequitur. Again, I hearken back to the notion of "duality." That notion points out that the human mind is very much a part of probability. And it is a part of probability when the ideas of probability begin to give us "knowledge," or "information." Here, Perakh uses the word "psychological." His use of this word betrays his deliberate attempt to sidestep the issue of "information." The reason that 10 4's in a row catches our eyes is that we humans, familiar with the game of dice, "know" the improbability of such an ACTUAL event happening. In the world of probability, they would all be the same; but in the real world we live in, we know that this is highly improbable. Thus, TWO worlds: duality. Now, as I say, Perakh dismisses this. But let's take a closer look. On the first roll of a single die, any number, 1 to 6, is expected; on the second roll, any number is again expected, but the roll of TWO 4's in a row is noticed; three 4's in a row is NOT expected; nor the 4th, nor the 5th, and exceedingly the remaining five 4's that will be rolled. So, in the 'world' of Perakh, and 'pure' probability, 10 4's is just like any other roll of the die ten times. But, in the "real" world of intelligent beings, we "know" that something's up. If you were at a craps table and rolled 10 sevens in a row, the pit boss would come looking for you. So, in the "real" world the following occurs (in our minds, beds of intelligence and knowledge): when a die is rolled and it comes up with a number, we think that this has happened independently of the first roll; this makes EACH roll of the die a 1 in 6 event. They're all "independent" events. However, when we see 4 after 4 being rolled, we begin to link these 'seemingly' independent events together. So, now, the likelihood of a 4 in the first roll is 1 in 6; but now the probability of the second roll is 1 in 36; and the third, 1 in 108, with the tenth 4 being rolled having a probability of 1 in 10^6. What Perakh has done, perhaps unwittingly, is to say that any group of ten rolls of a die ALL have a probability of 1 in 10^6. But that's not so. It's only so when the "mind" decides to "group together" individual events into a "dependent" whole. (And the "mind" doesn't normally do this when viewing the individual roll of a die.) Notice that it is the "mind" doing this, and deciding this. It's just like the shuffling of cards printed in invisible ink. The mind cannot make distinctions in that case; and, probability theory cannot apply. So, contra Perakh, what the "mind" does, or doesn't do, is essential: and is not simply a "psychological" reaction. It is knowledge at work using information that the "mind" has acquired over the lifetime of the individual. ++++++++++++++++++++++++++++++ To the UD community, and any on-lookers: I actually think I've stumbled upon a very important point in ID theory, and it has to do with the whole notion of "pattern," or "specification." When a "pattern" is detected, then "independent events" become "dependent events," and the probabilities are no longer added but MULTIPLIED! It is the "mind" that connects individual events into a greater whole.
We do this all the time, in all sorts of ways, every day. It's called "pattern recognition." I recognize someone's face from a distance. I recognize my car in the middle of a parking lot. I recognize the face of Abraham Lincoln on the penny I have in my hand. Sometimes it happens, though, where we recognize a pattern--notice that the 'pattern' has to be linked to our base of knowledge (as Dembski includes in his paper on Specification)--and, if this pattern involves some underlying series of events involving probability, our "minds" then connect seemingly "independent events" and combine them into one "inter-dependent" event. So, e.g., a series of 1's and 0's numbering 1,000 comes to be "recognized" as representing the first 10 prime numbers using ASCII code (Dembski's example), and the string of numbers goes from simply being a SERIES of events, each with a probability of 1 in 2, to ONE event, with a probability of 1 in 2^1000. Are we justified in saying that a protein is not just the sum of events, each of which is 1 in 4? (the 4 nucleotides that form the DNA code) Is this simply some "psychological" trick being played on us? Are we simply failing to be "counter-intuitive"? Well, the fact that machinery inside living cells can convert this sequence of nucleotides into codons, and codons into amino acids, and amino acids into proteins strongly suggests that the CELL recognizes a pattern, and the entire sequence as a "combined event": i.e., the cell machinery does not see a single nucleotide in isolation from the remaining ones, but views the entire sequence as a "combined event." This is the nub of the difference that holds apart those who hold to ID and those, like Dawkins, who want to climb Mt. Improbable. jdk: I'm interested in your reaction, even though I've strayed away from 'pure' probability theory. PaV
I'm with Dave. I'm not interested at all in Perahk's anti-ID position. jdk
EugeneS, If I were to attempt to read Perakh's mind, I would guess he is probably interested in correcting some misconceptions as well. Everyone likes to get their 2 cents in. But I'm sure no one is interested in my opinions about Perakh's motives. Does Perakh make any incorrect statements about probability in this book? That's what I really care about here. daveS
Let's put it plainly. All the likes of Shallit, Perakh, Dawkins, Brian Cox, etc. want is to exclude God from their lives. For this purpose they abuse their own rational faculty, science and whatever else it takes to abuse. Fine, their own lives, their own decisions. But there invariably comes a time to pay the bill... Eccl. 11:9. EugeneS
to PaV re 180. You write,
Do you disagree with the notion that the expectation value of receiving a bridge hand is 1.0?
First, as Dave and I have pointed out, I believe you are using the term "expectation value" when you mean probability. See post 117. But yes, I agree with that. In fact, I thought we had agreed on that long ago. Here is a summary of two main points: 1. The probability of being dealt a hand is 1, because we are assuming a hand gets dealt. More formally, if we have a sample space S with n equiprobable events, then when an event happens it has a probability = 1 of being an element of the sample space. That is by definition, as the sample space is the set of all events that can happen in the situation. 2. The probability of getting a particular hand is 1 out of 4 x 10^21 (assuming we are taking order into account.) More formally, the probability of any particular event is 1/n. Does this agree with what you are trying to say? I thought that we had all agreed on this quite a few posts ago. jdk
to PaV re 178: Perakh specifically says at the start of his example that "all n tickets are distributed." He, and I, are obviously using the lottery as a model for a situation where you have 1,000,000 equiprobable events, and one of them is selected. So I don't think your objections in 178 are relevant to, nor accurately depict, what Perakh actually wrote in the example I summarized. Perhaps you could read that section again and refresh your memory, starting on page 385. It's only about three pages long. jdk
jdk: Do you disagree with the notion that the expectation value of receiving a bridge hand is 1.0? You mention a weighted average. Well, the odds of ANY bridge hand are 1 in 4 x 10^21. And there is only ONE way of getting that hand. So, if you have an expectation value defined as the number of ways of getting a hand x the odds of getting that hand, this product summed over all possible hands, and then divided by the odds of any ONE bridge hand summed over all possible hands, you get 1.0. Am I missing something? PaV
Origines: Dembski has modified his approach to information in a way that involves searches and probability distributions. In it he finds that successful searches involve information, and that that information can be traced back to some kind of input of information, and this can be linked to a designer. In his new approach, the upper probability bound is not needed. That's my summation. But, I'm guessing you got that much, too. So, I'm not sure what's making you scratch your head. PaV
jdk:
In a lottery with a million numbers, there is a probability of 1 that someone will win, and a probability of 1 out of 1,000,000 that a particular person (let us say, Joe) will win.
Perakh says this, and you say that his analysis is right. This statement is wrong. There is not a probability of 1 that someone will win. There is a probability of 1 only upon the condition that you will keep selecting numbers until someone wins. And this involves intentionality on the part of an "intelligent" being. The whole argument of ID is that "intelligence" is the only way of overcoming extreme improbabilities. Again, the probability, outside human agency, is not 1 that someone will win. For example, we know of lotteries where no one has the winning ticket. It is then extended, and you have super lotteries. I'm not exactly sure how this works, but I'm guessing it's along these lines. Everything rolls over to the next week's lottery. Eventually, someone wins. Now, where Perakh is being 'slick' is by not mentioning the word "eventually," and by not specifically pointing out that this probability is 1.0 only if there have been an appropriate number of lottery tickets sold. These are important, and I think deceptive, omissions. And it is done so as to say, "Well, look, in the lottery your chance of winning is only 1 in 100 million. And yet someone wins all the time." The deception is that no mention is made that this is simply how a lottery is "designed" to work (interesting choice of a word, eh?). It cannot work otherwise. And, that lotteries work by selling a sufficient amount of tickets. Naturally, if the odds are 1 in 100 million, and you give out $60 million, but only sold 2,000 tickets at a dollar apiece, there will be one, and only one, lottery. So, this gets into how many lottery tickets are sold. And the probability is one simply because enough, generally, are sold. Bringing this across to biology, if the odds of a particular protein sequence are 1 in 10^75, this means that for it to "happen," in the same way that a "winning ticket" happens, there would have to be on the order of 10^75 cell divisions/replications, more or less. And it is this impossibility that Perakh wants to avoid, and does, by failing to bring in the reason for his saying the probability is very high, nearly 1, of there being a winning ticket. So, it is said: "Your chance of winning the lottery is 1 in 100 million; yet someone always wins. See, highly improbable things happen all the time." And, with this 'wave of the hand,' evolution becomes probable--or even 'expected.' This is disingenuous. PaV
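Whether "someone always wins" really does depend on how many tickets are sold, as PaV argues, and on how the example is set up (jdk notes elsewhere that Perakh assumes all n tickets are distributed, which forces a winner). As a rough sketch, if k tickets are instead bought independently at random against 1-in-100-million odds, the chance of at least one winner is 1 - (1 - 1/N)^k; the ticket counts below are made up for illustration.

# N possible numbers, k tickets bought independently at random:
# P(at least one winner) = 1 - (1 - 1/N)^k
N = 100_000_000          # 1-in-100-million odds per ticket

for k in (2_000, 1_000_000, 100_000_000, 300_000_000):
    p_someone_wins = 1 - (1 - 1 / N) ** k
    print(f"{k:>11,} tickets sold -> P(at least one winner) = {p_someone_wins:.1%}")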
Thanks, hnorman5. Perakh wrote his book back in 2003 and Kitcher's book was in 2008, so Perakh wouldn't have been responding specifically to Kitcher. But I've never understood what argument of Kitcher's people were concerned about here anyway. A phrase of Kitcher's triggered this discussion, and it seems to me we've reached quite a bit of agreement as we've tried to clarify what the issues are. Also, I've paid no attention to any arguments, by Perakh or anyone else, about how all this does, or might, apply to the universe, or life: I've been interested in the pure mathematics. jdk
jdk at 170 I think your comments on Perakh's take on multiple lottery wins are correct. What's odd is that he seems to be arguing on the wrong side. He clearly explains the role of specification and refutes Kitcher's argument. I'm not sure if this represents a concession on his part that he's declaring irrelevant, or what. He does go on to say that the universe could have been created by high probability events but I don't know if that's the main thrust of his argument. hnorman5
Origenes, I now understand the difference between the 1/S and 1/S^2 situation, I think. We'll use your lottery example with 1,000,000 numbers (but I'm also thinking of your earlier example with just heads and tails, in 154). Let S = the number of equiprobable events in the sample space. S = 1,000,000 in the lottery example and S = 2 with a coin. The probability of any one particular event happening is 1/S. Now, specify one of the events, such as E = 672,483 in the lottery (or E = T with a coin.) The probability of a particular event (Joe's ticket, or the flip of the coin) matching the specification is 1/S. We have agreed on this, and you describe it in 171 when you write,
Joe buys a lottery ticket — nr. 672483 (1) The Kitcher question: "What is the probability that you get exactly those numbers in exactly that order?" Answer: 1 in 1,000,000 (assuming that all tickets are available). Suppose that there is one winning ticket and it is Joe's ticket. (2) What is the chance that Joe's ticket is the winning ticket? Put another way: what is the chance that a random lottery number generator produces the numbers sequence that matches Joe's specification? Answer: 1 in 1,000,000.
So far so good. Now you introduce a different question.
(3) What is the chance for anyone to get exactly the 672483 lottery ticket AND this ticket to be the winning ticket? Answer: 1 in 1,000,000^2
What you have done here is introduce a third element in the situation: essentially an additional specification for the winning lottery ticket number, which is already the specification for Joe to be a winner. If you decide beforehand that 672,483 is the number in question, you are asking what is the probability Joe has this number (1/S) and what is the probability that this number is the winning number (also 1/S). Joe may have that number but it might not be the winning number: probability of that is 1/S. The winning number might be that number, but Joe doesn't have it: the probability of that is 1/S. The probability for Joe "to get exactly the 672483 lottery ticket AND this ticket to be the winning ticket" is thus 1/S^2. What I don't understand is why this is significant. The more basic situation is you call heads or tails and then I flip the coin. There is an event and a specification, and a 1/S probability they match. What would be a meaningful situation where you would have a specification for the specification? The situation you have here is like this: I am going to flip the coin and you are going to call it heads or tails. Your chance of success is 1/2. However, another person is standing off to the side and quietly says to his friend, "I think he's going to call tails." The third person has added a specification concerning your choice of heads or tails. Therefore there is a 1/4 chance that you will call tails (1/2) and the coin will actually land tails (1/2). This is the situation you described in 154. jdk
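A quick simulation of this coin scenario, assuming for simplicity that the caller picks heads or tails at random, shows the two probabilities side by side.

import random

# One person calls the flip, a bystander predicts the call will be "tails",
# then the coin is flipped. Call matching coin: about 1/2. Call being tails
# AND coin landing tails: about 1/4 (1/S * 1/S with S = 2).
random.seed(2)
trials = 100_000
call_matches_coin = 0
call_tails_and_coin_tails = 0

for _ in range(trials):
    call = random.choice(["heads", "tails"])
    coin = random.choice(["heads", "tails"])
    if call == coin:
        call_matches_coin += 1
    if call == "tails" and coin == "tails":
        call_tails_and_coin_tails += 1

print(f"call matches coin:               {call_matches_coin / trials:.3f}  (about 1/2)")
print(f"call is tails AND coin is tails: {call_tails_and_coin_tails / trials:.3f}  (about 1/4)")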
Thanks. I appreciate knowing that we agree, Also, I really know nothing about Dembski's conservation of information approach, and the article you linked to looks long and not straightforward, so I'll pass on thinking about that unless you can show an example of what he is talking about. jdk
Jdk: If I have a very large sample space but also a very large number of events which match a specification, then am I justified in inferring chance and not design if the ratio of potential matching events to sample space is large enough?
Jdk, you are obviously correct IMO. Meanwhile I am trying to understand Dembski. Here he is saying that 'conservation of information' renders his universal probability bound of 1 in 10^150 irrelevant:
Dembski: .... The animating impulse behind Shallit’s email, and one that Felsenstein seems to have taken to heart, is that having seen my earlier work on conservation of information, they need only deal with it (meanwhile misrepresenting it) and can ignore anything I subsequently say or write on the topic. Moreover, if others use my work in this area, Shallit et al. can pretend that they are using my earlier work and can critique them as though that’s what they did. Shallit’s 2003 paper that Felsenstein cites never got into my newer work on conservation of information with Robert Marks, nor did Felsenstein’s 2007 paper for which he desires a response. Both papers key off my 2002 book No Free Lunch along with popular spinoffs from that book a year or two later. Nothing else. So, what is the difference between the earlier work on conservation of information and the later? The earlier work on conservation of information focused on particular events that matched particular patterns (specifications) and that could be assigned probabilities below certain cutoffs. Conservation of information in this sense was logically equivalent to the design detection apparatus that I had first laid out in my book The Design Inference (Cambridge, 1998). In the newer approach to conservation of information, the focus is not on drawing design inferences but on understanding search in general and how information facilitates successful search. The focus is therefore not so much on individual probabilities as on probability distributions and how they change as searches incorporate information. My universal probability bound of 1 in 10^150 (a perennial sticking point for Shallit and Felsenstein) therefore becomes irrelevant in the new form of conservation of information whereas in the earlier it was essential because there a certain probability threshold had to be attained before conservation of information could be said to apply. The new form is more powerful and conceptually elegant. Rather than lead to a design inference, it shows that accounting for the information required for successful search leads to a regress that only intensifies as one backtracks. It therefore suggests an ultimate source of information, which it can reasonably be argued is a designer. I explain all this in a nontechnical way in an article I posted at ENV a few months back titled “Conservation of Information Made Simple” (go here).
Origenes
Hi Origenes. I think I understand your 1/s^2 example now, and will respond shortly. But I'm also interested in hearing what you think about the middle part of 165 above, where I responded to you by writing,
I agree with the good summary of issues (1) and (2). I disagree with (2) in that it only considers the situation where the sample space is large and the specification space is small. In that situation, yes we would conclude design, such as in the situation I mentioned above as “ludicrously improbable” However, I think it’s important to realize that it is the relative size of the sample space and the specification space that determines the probability of the match between the event and the specification. An obvious situation is throwing three coins. The sample space is so small that there is no specification for which a match would be considered too improbable to be due to chance. The other situation is one I’ve mentioned several times, but Origenes has not acknowledged as relevant: the situation where the specification is broad enough to include a large number of matching events. The example I’ve used above is if the specification is “deal 13 cards in order without any red face cards”, the specification space is so large that this will happen about 16% of the time. Therefore, if it happens, I’m sure we would believe that chance is a sufficient explanation, and that design would not be a warranted conclusion.
So what do you think? If I have a very large sample space but also a very large number of events which match a specification, then am I justified in inferring chance and not design if the ratio of potential matching events to sample space is large enough? Is my question clear? jdk
A lottery is a good example. Suppose a lottery with a million numbers. Joe buys a lottery ticket — nr. 672483 (1) The Kitcher question: "What is the probability that you get exactly those numbers in exactly that order?" Answer: 1 in 1,000,000 (assuming that all tickets are available). Suppose that there is one winning ticket and it is Joe's ticket. (2) What is the chance that Joe's ticket is the winning ticket? Put another way: what is the chance that a random lottery number generator produces the numbers sequence that matches Joe's specification? Answer: 1 in 1,000,000. Note that, although probabilities are the same, it is much easier buying a ticket than winning the lottery. (3) What is the chance for anyone to get exactly the 672483 lottery ticket AND this ticket to be the winning ticket? Answer: 1 in 1,000,000^2 Origenes
Here is what Perakh is saying about the lottery: In a lottery with a million numbers, there is a probability of 1 that someone will win, and a probability of 1 out of 1,000,000 that a particular person (let us say, Joe) will win. [I'll add: So when Joe wins, we, the outside observers, are not surprised, because we had no prior specification about who would win, and someone had to win. Joe, however, did have a prior specification about the winner, namely himself, so he is amazed, and considers himself lucky that he hit the 1 in a million jackpot.] In a lottery with only 100 people, there is a probability of (1/100)^3 that Joe will win three times in a row, which is 1 in a million, which is the same probability as Joe winning the bigger lottery. Why are we not surprised when Joe wins the big lottery, not concerned at all about the chance occurrence of a 1 in a million event, and yet we are surprised, and might even suspect cheating, if Joe wins the smaller lottery three times in a row? Perakh explains, and I now think he is right, that even though we may not be aware of it, we are not comparing Joe's chances of winning, which are the same in both cases, but rather are comparing the probability of someone winning. In the first case (a million people) the probability of someone winning = 1. In the second case, there are 100 people who might win three times in a row, so the probability of someone winning three times in a row is 100/1,000,000, which is 1/10,000. So it is 10,000 times less likely that someone will win the little lottery three times in a row than it is that someone will win the big lottery, which is certain, even though the odds of a particular person (Joe) winning the big lottery or three times in a row in the small lottery are the same. So we are justified in being surprised, and perhaps even suspecting cheating, if Joe wins the little lottery three times in a row, but not justified in suspecting cheating if he wins the big lottery, because we aren't really thinking about Joe in particular, but rather about the generic "someone" winning. I think Perakh has a good point, and I don't see anything wrong about this analysis. jdk
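The two figures in this analysis are easy to verify, either directly or by simulation; the sketch below uses jdk's numbers (100 players, three consecutive weekly draws).

import random

# A *named* player wins three fair weekly draws with probability (1/100)^3;
# *someone* does so with probability 100 * (1/100)^3 = 1/10,000 (whoever wins
# week 1 must simply win weeks 2 and 3 as well).
p_named_player = (1 / 100) ** 3
p_someone = 100 * p_named_player
print(p_named_player, p_someone)          # 1e-06  0.0001

# Simulation of "someone wins three weeks in a row"
random.seed(3)
runs = 2_000_000
hits = sum(
    1
    for _ in range(runs)
    if random.randrange(100) == random.randrange(100) == random.randrange(100)
)
print(f"simulated: {hits / runs:.6f}  (theory: 0.0001)")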
Deleted - I understand now that something I said was wrong; see the corrected explanation of Perakh's lottery example above. jdk
jdk:
Maybe this is the distinction PaV meant to highlight.
Yes, you're right. That is what I meant about equivocation. I'm not sure I want to spend a lot of time re-reading Perakh and digging into it again. But, that said, the comments you made reflect my disappointment with him. You seem to take a more benign view of his treatment of the lottery. Just from memory, that is where I thought he was way off and at points made little sense; it is there that I think he is particularly wrong. I'm happy you took the time to look. Realize that many people who disagree with ID look to Perakh as an authority on these matters. PaV
jdk,
The hard subject, which he doesn’t handle very well and which we are struggling with also, is the issue of some specifications having meaning or significance to us for various reasons, and thus seeming more improbable when they occur than when a non-meaningful event of equal probability occurs. He discusses cognitive and psychological issues in addressing low probability matches between specifications and events, but I don’t think he is very clear. But again, I think we are struggling with understanding these issues also. I did find his example of multiple lottery wins interesting, and think there may be an important point lurking there but would have to study it a bit to explain.
I only have access to a few pages in chapter 13 (and elsewhere, where he describes the multiple lottery wins), but I found it less than 100% convincing (or perhaps it's a good explanation, given that he is writing for those of us with limited knowledge of stats). Obviously we all would intuitively conclude there is some cheating going on if some particular person won a large lottery three weeks in a row. But how to justify this rigorously? I can only think back to a mostly forgotten class where likelihood ratio tests were discussed, in which you compare two specific competing hypotheses proposed to account for the data. The null hypothesis is that the lottery is fair. What precisely is the alternative hypothesis? One option is that the person who is cheating sets up the lottery so he wins every week. Clearly that would beat the null hypothesis. But there are innumerably many ways the lottery could be rigged. For example, maybe it's set up so that whoever wins during the first week will win every subsequent week (so it's fair in week 1, but rigged thereafter). I don't know how to "objectively" choose an alternative hypothesis in this case. daveS
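To make the likelihood-ratio idea concrete, here is a toy sketch. The two hypotheses are ones supplied purely for illustration (daveS did not commit to them): H0, the N-player lottery is fair every week; H1, whoever wins week 1 is rigged to win every later week.

```python
# Toy likelihood-ratio comparison for the data "Joe won weeks 1, 2 and 3".
N = 100  # assumed number of players

p_data_given_h0 = (1 / N) ** 3   # fair lottery: three independent 1/N wins
p_data_given_h1 = 1 / N          # rigged after week 1: Joe only needs to win week 1

likelihood_ratio = p_data_given_h1 / p_data_given_h0
print(likelihood_ratio)          # 10000.0: H1 fits the data far better, but, as noted
                                 # above, choosing H1 in the first place is the hard part
```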
I read Chapter 13 of Perakh's book. First, I didn't think it was very good. As someone who has written math materials explaining probability to good high school students, I found his explanation of basic probability principles correct, but not very well done. (And I did disagree with some of his statements, but they don't relate to the subjects we are discussing.) Second, I only paid attention to the part about pure probability. I paid no attention to all the asides about the origin of life or Bible codes. Third, I think he pretty much said things we have said, such as the probability of getting some hand is 1, but the probability of getting a particular hand is 1/S. (He uses n for the number of events in S.) The hard subject, which he doesn't handle very well and which we are struggling with also, is the issue of some specifications having meaning or significance to us for various reasons, and thus seeming more improbable when they occur than when a non-meaningful event of equal probability occurs. He discusses cognitive and psychological issues in addressing low probability matches between specifications and events, but I don't think he is very clear. But again, I think we are struggling with understanding these issues also. I did find his example of multiple lottery wins interesting, and think there may be an important point lurking there but would have to study it a bit to explain. With all that said, PaV, can you point to some statements (or even page numbers and paragraphs) that you most take issue with? jdk
Hi all. I've been busy out of town the past two days, but I'm glad to see there have been some more comments, because I've been thinking about these issues. There are two issues I'd like to write more on when I have more time (but soon): 1. The role and nature of specifications in general in probability theory. 2. The situation Origenes describes, which he says has a probability of (1/S)^2 as opposed to 1/S. I've been trying to understand the difference between the two situations to my own satisfaction. But here are some shorter replies to today's posts. PaV writes,
I was trying to point out the elusive nature of the word “probability”, or its inverse, “improbability.” It is a word given to equivocation.
I am not quite sure what this means. I don't think the concept of probability itself is elusive mathematically. Given the parameters of a situation, we can usually calculate the probabilities (at least in the simple discrete situations we have been considering). Possibly PaV means to contrast probable and improbable, which are definitely vague and not clearly defined. Some of the things we have been discussing (specifying a particular 13-card hand and then having those 13 cards actually get dealt, in order) are, to coin a rigorous mathematical term, ludicrously improbable. :-) On the other hand, getting dealt a royal straight flush is fairly improbable (about 1 out of 650,000), but it does happen. On the third hand, getting heads four times in a row happens about 6% of the time, so I don't think we would consider that improbable. Maybe this is the distinction PaV meant to highlight. Origenes writes,
Summing up: There are two types of events being discussed here. (1) the actualization of a sequence produced by some randomized chance process (dealing of cards, flipping a coin, stones on the ground). (2) matching an independent specification/pattern. (1) is easily explained by chance. Chance is very good at explaining the actualization of a sequence. In fact, it is completely unsurprising that some 'particular' sequence actualized. The emergence of some particular sequence is guaranteed by the chance process that produces it. (2) matching an independent specification is only connected to the chance process of making uninformed guesses with respect to a huge search space. Only sheer dumb luck can help here. Therefore chance is NOT good at all at explaining why a sequence matches an independent specification. Contrary to (1), here, with respect to matching a specification, chance has no effective mechanism. Therefore events of type (2) are better explained by design.
I agree with the good summary of issues (1) and (2). I disagree with (2) in that it only considers the situation where the sample space is large and the specification space is small. In that situation, yes, we would conclude design, such as in the situation I mentioned above as "ludicrously improbable." However, I think it's important to realize that it is the relative size of the sample space and the specification space that determines the probability of the match between the event and the specification. An obvious situation is throwing three coins. The sample space is so small that there is no specification for which a match would be considered too improbable to be due to chance. The other situation is one I've mentioned several times, but Origenes has not acknowledged as relevant: the situation where the specification is broad enough to include a large number of matching events. The example I've used above is: if the specification is "deal 13 cards in order without any red face cards," the specification space is so large that this will happen about 16% of the time. Therefore, if it happens, I'm sure we would believe that chance is a sufficient explanation, and that design would not be a warranted conclusion. To tie these two points of PaV's and Origenes' together, let
S = the number of events in the sample space,
P = the number of events in the specification space (the elements which match the specification), and
D = some low probability which is the dividing line between things we might attribute to chance and those things which are too improbable to be due to chance, and for which design is a reasonable conclusion. (Dembski proposed a very small number: I forget what he called it.)
Then if P/S is less than D, we conclude design. It is the ratio P/S that is important, not the absolute value of S. I will point out that we have only been considering situations in which all the events are equiprobable, which often is not the case, such as when considering the sum of two dice. This adds another complication to calculating probabilities. The sample space is the numbers from 2 to 12, but the probability of matching a specification of 7 is different from the probability of matching a specification of 12 (a quick enumeration is sketched after this comment). PaV writes,
Read Chapter 13 of Perakh’s book. See what, and how, he argues.
I was in an internet group with Perakh at one time, and I have his book, but have never read it: it looked too ponderous to read then, and it still does. However, I looked at the sub-titles of chapter 13 and saw that the first one is "Estimation of Probabilities Is Often Tricky," which is almost verbatim what I used to tell my probability students. Probability calculations are often not obvious at all, and often the results are counter-intuitive. So maybe I'll look at Chapter 13 and see what I think. jdk
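As referenced in the comment above, a small enumeration (a sketch, nothing more) showing that the two-dice sums 2 through 12 are not equiprobable, so specifications like "the sum is 7" and "the sum is 12" have different probabilities of being matched:

```python
# Sketch: enumerate all 36 equally likely ordered rolls of two fair dice.
from itertools import product
from collections import Counter

sum_counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
print(sum_counts[7] / 36)    # ~0.167: probability of matching the specification "sum is 7"
print(sum_counts[12] / 36)   # ~0.028: probability of matching the specification "sum is 12"
```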
And I don't feel motivated to dig through his chapter and type it all up. Rest assured. Look here: https://books.google.com/books?id=8yuUnv4dF9YC&pg=PA270&lpg=PA270&dq=mark+perakh+chapter+13+unintelligent+design&source=bl&ots=vcO8Pe2tan&sig=W5nNM8-G7ZivQmF7Ha4var2vMyE&hl=en&sa=X&ved=0ahUKEwj0sriNo5HUAhVD6mMKHVRBDxkQ6AEIPDAE#v=onepage&q=mark%20perakh%20chapter%2013%20unintelligent%20design&f=false PaV
PaV,
Yes, right now, this thread. Read Chapter 13 of Perakh’s book. See what, and how, he argues.
Well, I was referring to participants in this thread. I think we are all in agreement on the probabilities of the events in question. If you have specific passages of the Perakh book that you want to post and discuss, I would be interested in reading them. I don't feel motivated enough to request the book at my local library though. daveS
daveS: Yes, right now, this thread. Read Chapter 13 of Perakh's book. See what, and how, he argues. There used to be wars. Years ago. But now most of those who disagree with us have moved on; and we're happy they have, since there's a limit on how many times you can suffer someone saying the same thing over and over again. And it's true on both sides. At some point, you agree to disagree. Let science work its magic. Oh, wait, science doesn't work by magic! :) PaV
Summing up: There are two types of events being discussed here. (1) the actualization of a sequence produced by some randomized chance process (dealing of cards, flipping a coin, stones on the ground). (2) matching an independent specification/pattern. (1) is easily explained by chance. Chance is very good at explaining the actualization of a sequence. In fact, it is completely unsurprising that some 'particular' sequence actualized. The emergence of some particular sequence is guaranteed by the chance process that produces it. (2) matching an independent specification is only connected to the chance process of making uninformed guesses with respect to a huge search space. Only sheer dumb luck can help here. Therefore chance is NOT good at all at explaining why a sequence matches an independent specification. Contrary to (1), here, with respect to matching a specification, chance has no effective mechanism. Therefore events of type (2) are better explained by design.
Luskin: His [Dembski's] point is that some unlikely events should NOT be attributed to design, but rather are best explained by chance. Dembski's fundamental premise is that Miller's random poker hand is a perfectly good example of an unlikely event which is best explained by chance. But what happens when one is dealt 50 consecutive royal flushes? What happens when the stones spell out "Welcome to Wales by British Railways"? Clearly, not all unlikely events are best explained by chance, especially when they conform to a special type of pattern. Dembski calls this conformation to a pattern "specification." The design inference therefore requires unlikelihood (related to complexity) coupled with specification. Miller implies that Dembski infers design by the mere unlikelihood of an event, but Miller egregiously ignores the fact that according to Dembski, we must also have specification to infer design. Dembski even uses this very example of dealing a hand of cards when illustrating an unlikely but yet non-designed event. (See how this is implied in Dembski's essay "Intelligent Design as a Theory of Information.") Ken Miller has put forth a patently false straw-man characterization of intelligent design arguments in order to falsely allege refutations to the public. Source
Origenes
PaV,
I’m happy that you agree. Our opponents here, though, won’t accept what seems to be trivially obvious. It makes it hard to have actual discourse.
Hm. I haven't seen anyone here refusing to accept the part of your post #146 I agreed with. As far as I can tell, everyone has been on the same page on that issue from the start. daveS
daveS: I'm happy that you agree. Our opponents here, though, won't accept what seems to be trivially obvious. It makes it hard to have actual discourse. PaV
jdk:
Actually rolling 200 dice is the same type of event, but the sample space is much bigger, and therefore the probabilities of various results are much smaller: the probability of rolling 200 6’s is about 1 out of 4 x 10^155.
I was trying to point out the elusive nature of the word "probability", or its inverse, "improbability." It is a word given to equivocation. PaV
Origenes, Yes, then I do agree with your calculation. daveS
DaveS @149
DaveS: Is the specification not assumed to be a chance event, as jdk inquired about?
In my calculations (see e.g. #154) I assume that the specification can be either heads or tails with equal probability. Let’s go back to where it all started.
Kitcher: “You take a standard deck of cards and deal 13 to yourself. What is the probability that you get exactly those cards in exactly that order? The answer is one in 4 x 10^21.” *
Now ask the next question: suppose one independent specification; what is the probability that the post-specification matches this independent specification? The answer is (again) one in 4 x 10^21. Now let's ask the third and final question: What is the probability that you get exactly those cards in exactly that order AND that the post-specification matches an independent specification? The answer is 1 in (4 x 10^21)^2. - - - - (*) Dembski would agree with Kitcher's probability, e.g. he writes: "Indeed, any given arrangement of stones is but one of almost infinite possible arrangements." ['The Design Inference', p. xi] Origenes
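For reference, a short sketch (Python 3.8+ assumed) that just computes the raw numbers used in the comment above. Whether 1/S or 1/S^2 is the relevant probability is exactly what the thread goes on to dispute; the sketch only does the arithmetic.

```python
# Sketch: the sample-space size for 13 cards dealt in order from 52, and the
# two probabilities contrasted above.
import math

S = math.perm(52, 13)      # ordered deals of 13 cards from a 52-card deck
print(S)                   # 3954242643911239680000, i.e. ~4 x 10^21
print(f"{1 / S:.2e}")      # ~2.5e-22: Kitcher's "one in 4 x 10^21"
print(f"{1 / S**2:.2e}")   # ~6.4e-44: the 1/S^2 figure for "this exact deal AND it matches the specification"
```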
I see there has been some discussion of the "significance" of events that occur. "Significance" is significant -- okay, "significance" is important -- only to the extent that significant events occupy a smaller subset of total outcomes. In any event, it is the specification, not the significance, that is important. hnorman5
Jdk @145
Jdk: Origenes, I think you are wrong, but this is my last comment on it.
And then you go on to argue something which I have repeatedly told you I agree with.
Jdk: If you specify one or the other of heads or tails, choosing any method you wish including flipping another coin, what is the probability that my coin flip will match your specification? The answer is 1/2, not 1/4.
Like I said before, I agree. You are correct. It is true what you are saying. We are in agreement on this. It is, however, irrelevant to my claim. Allow me to explain. The probability that it is heads and that it matches the specification is 1/4, because it is one out of four possibilities which all have equal probability 1/4:
(1) Heads & specification 'heads'
(2) Heads & specification 'tails'
(3) Tails & specification 'heads'
(4) Tails & specification 'tails'
Again, yes, you are correct: (1) and (4) are both events in which a coin flip matches the specification. However, they are two different events. The problem with your question "what is the probability that a coin flip matches the specification?" is that it does not differentiate between events (1) and (4). In effect it asks: "what is the probability of getting either (1) or (4)?" My following question does differentiate between (1) and (4): What is the chance that 'heads' becomes actual and matches a specification? Answer: only one event out of four possibilities, therefore 1/4. So here is my claim, which you dispute, again:
Origenes: For any sequence, the probability of becoming actual (1/S) and matching an independent specification (1/S) = 1/S^2
Origenes
Re 151: Hmmm. The sentence "Rolling a pair of dice is an improbable event" is inaccurate and misleading language. Each result of rolling two dice has a probability, and the sum of the probabilities of all the results in the sample space equals 1. However, none of the results are improbable (however vaguely we define that) because the sample space is small. The probability of rolling two 6's is 1/36, and no one who has played dice games would call this improbable. Actually rolling 200 dice is the same type of event, but the sample space is much bigger, and therefore the probabilities of various results are much smaller: the probability of rolling 200 6's is about 1 out of 4 x 10^155. (A quick check of that number is sketched after this comment.) You write,
So, I think it is more accurate to say something along these lines: “The probabilities associated with being dealt a particular Bridge hand is very low.
Yes, this is what we have been saying, although I don't think the word "associated" adds anything to the statement. Also, I don't think the plural is accurate, as there is only one probability. (And, as I pointed out, we're not really talking about bridge hands.) So I would rewrite your sentence as "The probability of being dealt any particular hand of 13 cards in order is very low." (In fact, 1 out of 4 x 10^21.) jdk
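A one-off check (a sketch only) of the 200-dice number referenced in the comment above:

```python
# Sketch: exact count of the ways 200 fair dice can land.
n_outcomes = 6 ** 200
print(f"{n_outcomes:.2e}")     # ~4.27e+155: rolling 200 sixes is about 1 chance in 4 x 10^155
print(len(str(n_outcomes)))    # 156 digits in the exact integer
```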
PaV,
I’ll presume that by “trivially true” you mean that the probability of a hand being dealt is 1.
Yes, we assume the experiment is carried out, so some element of the sample space is chosen for certain.
Now, when I said that the “expected value” of being dealt a hand is 1.0, that is also true. That might seem trivial to you, but others use it in a non-trivial way here at UD.
I'm saying that's not what the expected value of a random variable is. The EV is a weighted average of the outcomes. For example, suppose we roll a fair die and record the number that comes up. The EV of this random variable is 1/6*1 + 1/6*2 + ... + 1/6*6 = 3.5. This is an average of the possible outcomes 1 through 6, all with equal weights 1/6. If the die is altered, making a "6" twice as likely to come up as each of the other numbers, then the new EV is 1/7*1 + 1/7*2 + ... + 1/7*5 + 2/7*6, or about 3.86.
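A minimal sketch of the expected-value arithmetic just described (not daveS's code; the "loaded" weights simply make a 6 twice as likely as each other face):

```python
# Sketch: expected value as a weighted average of the faces of a die.
def expected_value(faces, weights):
    total = sum(weights)
    return sum(face * weight / total for face, weight in zip(faces, weights))

faces = [1, 2, 3, 4, 5, 6]
print(expected_value(faces, [1, 1, 1, 1, 1, 1]))  # 3.5 for a fair die
print(expected_value(faces, [1, 1, 1, 1, 1, 2]))  # ~3.857 when a 6 is twice as likely
```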
The probability of ANY hand being dealt is, therefore, the product of 4 X 10^21 possible hands x the 1 in 4 x 10^21 probability each ‘particular’ (or, ‘specified’) hand = 1. There is ZERO possibility you will NOT get a hand that is part of the SET consisting of ALL possible Bridge Hands. I would think you would agree to that. ** OTOH, if you look over the entire set of 4 x 10^21 Bridge Hands and say: “What are my chances of getting that ‘particular’ hand,” and that hand only; then, the probability is this: the probability of getting a hand (1.0) times the probability of getting that particular (specified) hand, which is now NOT the entire ‘set’ of possible hands, but just one, or 1 in 4 x 10^21.
Yes.
So, the distinction is between “any” and “any particular”. It is correct to say that the probability of being dealt “any particular” Bridge hand is 1 in 4 x 10^21; but it is not correct to say that the probability of being dealt “any” Bridge hand is 1 in 4 x 10^21. This might seem a trivial distinction; but, it is not.
I agree it's not trivial, and in fact I agree with it. At least I think it's a reasonable distinction, given the inherent ambiguity in human language. daveS
jdk:
From the point of view of the math only, every particular hand that is dealt is an improbable event
See, here's the confusion: I don't think this statement can be right. What if I say: "Rolling a pair of dice is an improbable event"? You might answer: "No, the probabilities involved in rolling two dice are not very low." Now, if I say: "Rolling two hundred dice at once is an improbable event," you would likely answer, "Yes." But it's the SAME action. (I've got big hands; so, I could probably accommodate 50 to 70 dice; but Shaquille O'Neal could handle all 200, I'm sure.) So, I think it is more accurate to say something along these lines: "The probabilities associated with being dealt a particular Bridge hand is very low." I'll go back to the example of 52 blank playing cards. Except this time, the reason the cards are blank is because they've all been printed in 'invisible ink.' Now, they're all blank. They're shuffled, re-shuffled, and dealt into 4 hands. There is no "probability" associated with these hands at all. They're completely interchangeable. But now we turn all the cards up and spray them with some chemical that will make the markings visible. Now "probability" has become "associated" with them. Do you see what I mean by probabilities being "associated" with the Bridge hands, but not inherent in them? This might seem like a trivial distinction, but ID opponents routinely say something like this: "Each lottery ticket is highly improbable, but someone always wins." This is supposed to mean that improbable events occur all the time, so why do ID proponents get so worked up about improbabilities? Yes, this might be along the lines of "significance." PaV
PaV writes,
So, the distinction is between “any” and “any particular”. It is correct to say that the probability of being dealt “any particular” Bridge hand is 1 in 4 x 10^21; but it is not correct to say that the probability of being dealt “any” Bridge hand is 1 in 4 x 10^21.
I agree with that, and believe I have been careful in writing that "any particular hand" has a probability of 1 out of 4 x 10^21. Note: we are not actually talking about bridge hands, because in bridge the order you get the cards in is unimportant. Bridge hands are combinations, in which order doesn't count, and 52 choose 13 is about 6.35 x 10^11. We have been talking about 13 cards in order, a permutation, and 52 pick 13 is about 4 x 10^21. jdk
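The two counts contrasted above can be checked directly; a small sketch (assuming Python 3.8+ for math.comb and math.perm):

```python
# Sketch: unordered bridge hands vs. ordered 13-card deals.
import math

bridge_hands = math.comb(52, 13)    # "52 choose 13": order does not matter
ordered_deals = math.perm(52, 13)   # "52 pick 13": order matters

print(f"{bridge_hands:.2e}")        # ~6.35e+11
print(f"{ordered_deals:.2e}")       # ~3.95e+21, the 4 x 10^21 used throughout the thread
```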
Origenes, Is the specification not assumed to be a chance event, as jdk inquired about? We can use the formula P(E & F) = P(E | F)*P(F) here: P(match spec & toss is heads) = P(match spec | toss is heads)*P(toss is heads) If the specification was heads, then the right-hand side is 1*1/2 = 1/2. If the specification was tails, then the right-hand side is 0*1/2 = 0. Without any probabilities for the specification being heads or tails, I don't see how we can say anything beyond this. For example, suppose you always choose the specification "heads". Then P(match spec & toss is heads) is 1/2. If you always choose the specification "tails", then P(match spec & toss is heads) is 0. If you specify heads half the time and tails half the time via some chance process independent of the coin toss, then P(match spec & toss is heads) is 1/4. daveS
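The three cases daveS lists can be tabulated directly from the product rule; a minimal sketch, where the only assumptions are a fair coin and, in the third case, a specification chosen independently of the toss:

```python
# Sketch: P(match spec AND toss is heads) = P(match spec | toss is heads) * P(toss is heads)
p_heads = 0.5

cases = {
    "always specify heads": 1.0 * p_heads,                       # 0.5
    "always specify tails": 0.0 * p_heads,                       # 0.0
    "specify heads half the time, independently": 0.5 * p_heads, # 0.25
}
for description, probability in cases.items():
    print(f"{description}: {probability}")
```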
jdk:
I think what is confusing is that there is a difference between the event, the specification, and the significance of the event. The significance lies outside the realm of the probability situation itself, but rather in some broader context.
I think that you're onto something here. This is, in a way, what we're wrangling about. But, my hunch is that there is something deeper lurking here. For example, all of this discussion is bringing me to the point of view that probability acts like a 'dual space,' or, at the very least, there's some kind of duality associated with it. And that this duality, then, can give rise to misunderstandings. Hopefully some clarity will emerge. PaV
jdk:
But the place where I disagree is that I think it is accurate to say every hand is an improbable event.
I hope my last post (#146) influences you to see why I think it isn't accurate. PaV
daveS:
Sorry to be stubborn, but we didn’t agree to that. We do agree that the probability of a hand being dealt is 1 in this experiment, not the expected value. That’s trivially true, of course.
I'll presume that by "trivially true" you mean that the probability of a hand being dealt is 1. Now, when I said that the "expected value" of being dealt a hand is 1.0, that is also true. That might seem trivial to you, but others use it in a non-trivial way here at UD. If you want to say that ANY specified Bridge hand being dealt to someone amounts to a probability of 1 in 4 x 10^21, I would also agree to that. The distinction I draw is better illustrated using the notion of sets. Anytime cards are shuffled and dealt in Bridge, the set of all possible Bridge hands encompasses 4 x 10^21 hands. That's the 'set' of all possible Bridge hands. As the cards are being dealt, ALL of those possibilities are in play. None have been eliminated. So, there are all 4 x 10^21 ways of dealing out the hands which remain as possible hands that can be dealt. The probability of ANY hand being dealt is, therefore, the product of 4 X 10^21 possible hands x the 1 in 4 x 10^21 probability each 'particular' (or, 'specified') hand = 1. There is ZERO possibility you will NOT get a hand that is part of the SET consisting of ALL possible Bridge Hands. I would think you would agree to that. OTOH, if you look over the entire set of 4 x 10^21 Bridge Hands and say: "What are my chances of getting that 'particular' hand," and that hand only; then, the probability is this: the probability of getting a hand (1.0) times the probability of getting that particular (specified) hand, which is now NOT the entire 'set' of possible hands, but just one, or 1 in 4 x 10^21. So, the probability of getting a 'specified' hand is: 1 x 1 in [4 x 10^21]; that is, 2.5 x 10^-22. In sum, to say about Bridge hands that "the probability of getting dealt ANY Bridge hand is 1 in 4 x 10^21," and not reserving that phrase to describe only a situation in which a 'specified' Bridge hand has been "pre-specified," is to not accurately describe what is actually happening. So, the distinction is between "any" and "any particular". It is correct to say that the probability of being dealt "any particular" Bridge hand is 1 in 4 x 10^21; but it is not correct to say that the probability of being dealt "any" Bridge hand is 1 in 4 x 10^21. This might seem a trivial distinction; but, it is not. PaV
Origenes, I think you are wrong, but this is my last comment on it. If you specify one or the other of heads or tails, choosing any method you wish including flipping another coin, what is the probability that my coin flip will match your specification? The answer is 1/2, not 1/4. jdk
Jdk @142
Jdk: If you consider your specification also a random event …
I would rather say that the probability of getting heads or tails is separate from the probability of matching the independent specification. What is the chance of getting heads? 1/2. And what is the chance that heads matches the independent specification? 1/2. What is the chance of getting both? (1/2)^2.
Jdk: … but the probability that your specification (whichever it is) matches the actual coin toss is 1/2.
True; however, the probability of the event "it is heads and it matches the specification" is 1/4, which is consistent with my original claim, which you disputed.
Origenes: For any sequence, the probability of becoming actual (1/S) and matching an independent specification (1/S) = 1/S^2
- - - @143
Jdk: Just having a match between a specification and an event does not necessarily mean that that match could not have been by chance. Other factors have to be taken into consideration: the size of the sample space and the size of the subset that matches the specification.
Yes, Dembski agrees with you; that's why he proposed the universal probability bound. Origenes
re 139: Yes, I am indirectly alluding to there being some probabilities of a match between an event and a specification that we have no problem considering due to chance, such as 16%, and some that we absolutely would not attribute to chance, such as 1 out of 4 x 10^21. But that was not my main point. My main point was that if a specification is broader than just picking one event in the sample space, such as all hands with no red face cards, then the probability of a match with the specification can be large enough that we would attribute it to chance, if there are a large number of events in the sample space (hands in this case) that match the specification. Just having a match between a specification and an event does not necessarily mean that that match could not have been by chance. Other factors have to be taken into consideration: the size of the sample space and the size of the subset that matches the specification. jdk
The subject is the probability that an event will match a specification. The probability that the flip of the coin matches your specification (calling heads or tails) is 1/2. If you consider your specification also a random event, then the probability you call heads and it is heads is 1/4, and the probability you call tails and it is tails is 1/4, but the probability that your specification (whichever it is) matches the actual coin toss is 1/2. I don't think there is anything clearer that I can say. jdk
Jdk @138
Jdk: But Origenes, do you see that two of the four possibilities you list involve the specification matching: (1) and (4)? Therefore, the probability that your specification matches the actual event is 2/4, or 1/2, as I said.
True, but irrelevant to my point. Of course only one of the four possibilities I listed can become actualized, and the chance that it becomes actualized is (1/2)^2. - - - - Origenes
to PaV at 136: First, I am not following, am only vaguely aware of, and am not interested in arguments that people have applied to ID. But I agree with Dave that you are not using the term "expected value" correctly. I agree that there is nothing improbable about getting dealt a hand. But the place where I disagree is that I think it is accurate to say every hand is an improbable event. I think what is confusing is that there is a difference between the event, the specification, and the significance of the event. The significance lies outside the realm of the probability situation itself, but rather in some broader context. Suppose I am dealt 13 cards and get the singleton queen of spades. If I am playing hearts with no passing, this is bad news and I bemoan my bad luck in my hand matching this specification. If I am playing bridge, I am less concerned. The probability of the event is the same in both cases, but the significance changes based on context. So I think, PaV, that you are concerned about the significance, or lack thereof, that various people have attributed to certain things happening. That, to me, is an additional matter that goes beyond the probability itself. From the point of view of the math only, every particular hand that is dealt is an improbable event: the hand 2 hearts, 5 clubs, jack spades, ... 8 clubs, 2 diamonds is a hand that has almost certainly never been dealt and never will be again. Without any reference to any specification or any arguments about significance, the hand I got had a probability of 1 out of 4 x 10^21 of happening. jdk
Jdk @130
Jdk: what if the specification is broader than just a single unique hand? For instance, if the specification is that the hand will have no red face cards, the probability of matching the specification is about 16%, and so if this specification were met we would most likely consider it a matter of chance, not design.
If I understand you correctly, you are pointing out that there is a ‘grey area’ out there. I agree. Dembski proposed the universal probability bound:
A degree of improbability below which a specified event of that probability cannot reasonably be attributed to chance regardless of whatever probabilistic resources from the known universe are factored in.[1]
Origenes
But Origenes, do you see that two of the four possibilities you list involve the specification matching: (1) and (4)? Therefore, the probability that your specification matches the actual event is 2/4, or 1/2, as I said. jdk
PaV,
Both you and daveS agreed that the expected value of a hand being dealt is 1.0, and this simply confirms that ID opponents have been wrong. Both the expected value and the reality of the action involved tell us that there is nothing at all improbable about receiving a 'hand.' And to then conflate this reality with the probabilities associated with this realistic action is absurd.
Sorry to be stubborn, but we didn't agree to that. We do agree that the probability of a hand being dealt is 1 in this experiment, not the expected value. That's trivially true, of course. daveS
jdk: There is nothing you say, or have said, that I don't perfectly understand. It is simple combinatorics. We all agree on the essentials. What you are missing is the kind of arguments that have been made, and continue to be made, in an effort to refute ID arguments. Inane positions are taken. The basic confusion is one of saying that because a particular act has occurred, and we ASSOCIATE a low probability with the outcome of this act, this proves that improbable events occur all the time. This is patent nonsense. The illustration I gave of dealing blank cards distinguishes between the "act" of dealing and shuffling, and the "configuration space" ASSOCIATED with the "objects" being shuffled and dealt. They're not the same thing, and yet those opposed to ID continue to conflate the two realities, hoping in this way to blunt the criticism that ID makes of the improbability of even one, single protein domain arising in random fashion. Both you and daveS agreed that the expected value of a hand being dealt is 1.0, and this simply confirms that ID opponents have been wrong. Both the expected value and the reality of the action involved tell us that there is nothing at all improbable about receiving a 'hand.' And to then conflate this reality with the probabilities associated with this realistic action is absurd. PaV
Jdk @ 128
Jdk:
Origenes: For any sequence, the probability of becoming actual (1/S) and matching an independent specification (1/S) = 1/S^2
I believe that is wrong, Origenes. Let’s make this very simple.
I am a huge fan of very simple.
Jdk: I am going to flip a coin, and you are going to call it heads or tails.
Ok! You flip the coin.
Jdk: The sample space [for flipping the coin] contains just two equally probable events, heads or tails, so each has a probability = 1/2.
Correct. Now I am in another room without any knowledge about the outcome. I will produce the independent specification by making a guess and writing either 'heads' or 'tails' on a piece of paper. Probability 1/2. This expands the sample space, because now we have 4 possibilities, each with a probability of (1/2)^2:
(1) Heads, specification heads
(2) Heads, specification tails
(3) Tails, specification heads
(4) Tails, specification tails
Origenes
daveS: Read his book "Unintelligent Design." You can find out for yourself. See Chapter 13. He completely dismisses Borel's Law. PaV
I'm afraid that what you write doesn't make sense, PaV. We agree that you are going to get dealt a hand. The probability of your getting dealt a hand is 1 because we agree to deal you a hand. The probability of your getting the exact hand you get is 1/S where S = 4 x 10^21 because there are S hands that you could possibly get, and you got one of them. Do we agree about this? As I wrote to Origenes, this is no different than saying that when you flip a coin, the probability of getting heads is 1/2. There are two possibilities, and you got one of them. The only difference is that there are 4 x 10^21 possibilities, not 2. (P.S. I am making no effort whatsoever to apply this to biology. I'm just trying to discuss the mathematics of probability. I hope that is clear.) jdk
PaV,
Mark Perakh, and biologists who buy into his nonsense.
Would you mind providing a specific example where Perakh denies some elementary fact of probability, calculates some probability incorrectly, or the like? daveS
jdk:
The first is the probability that you got the particular ticket you did. The second is the probability that your ticket matches the winning numbers. The second is in comparison to the specification, and the first is in comparison to all the other possible tickets in the sample space of tickets.
No, the probability of your getting a lottery ticket is ONE. The probability of your getting a lottery ticket with a particular number on it is 1 in 10 billion. Take a deck of cards. Erase all contents on its face, so that each card is blank. There remain 52 blank cards. You shuffle them. Then reshuffle them. Then you pass out the cards from left to right, forming 4 stacks of 13 cards. What's the probability of this happening? It is the same probability as doing the very same thing with numbered and face cards. Do you see this? If you don't, then you're missing the point. There is NO probability you won't receive a hand either way. However, with numbers, suits, and face cards, there is a way of differentiating 'hands,' whereas there is not when they are all blank. But simply printing something on them in NO way affects the act of shuffling and dealing. Many anti-ID proponents argue in this fashion. They essentially say that DNA sequence space exists, so it can't be improbable. This is where kairosfocus's reference comes in:
It should then be trivially true that one could choose an arbitrary “final state” (e.g., a living organism) and “explain” it by evolving the system backwards in time choosing an appropriate state at some ’start’ time t_0 (fine-tuning the initial state). In the case of a chaotic system the initial state must be specified to arbitrarily high precision. But this account amounts to no more than saying that the world is as it is because it was as it was, and our current narrative therefore scarcely constitutes an explanation in the true scientific sense.
daveS:
There are? Can you name a few?
Mark Perakh, and biologists who buy into his nonsense. PaV
Origenes writes,
What chance is NOT very good at is explaining why a sequence matches an independent specification. This is better explained by design.
I'd like to offer some clarification to this statement. In regards to the situation we have been discussing, this is true: If you write down 13 cards in order (the specification) and then I deal 13 cards, if the dealt hand matches the specification we will definitely think that design (in this case, cheating somehow) was involved. The probability of this happening is 1/S (S = 4 x 10^21), and that is so ludicrously improbable that no one would consider it just a matter of chance. I think we are all agreed on this. However, and this is a point I've made but you haven't responded to, Origenes, what if the specification is broader than just a single unique hand? For instance, if the specification is that the hand will have no red face cards, the probability of matching the specification is about 16%, and so if this specification were met we would most likely consider it a matter of chance, not design. So, your comment above is valid if the sample space is large and the number of events which match the specification is small in proportion to the size of the sample space, but it is not necessarily true if the number of events which match the specification is a significant proportion of the overall sample space. jdk
jdk, Could you also say that the events "becoming actual" and "matching an independent specification" are not independent, therefore the "simple" product rule P(E and F) = P(E)*P(F) doesn't apply? daveS
re 126:
For any sequence, the probability of becoming actual (1/S) and matching an independent specification (1/S) = 1/S^2
I believe that is wrong, Origenes. Let's make this very simple. I am going to flip a coin, and you are going to call it heads or tails. The flip of the coin is the event. The sample space contains just two equally probable events, heads or tails, so each has a probability = 1/2. Your calling it heads or tails is the specification that the event needs to match. You call heads. There is no probability involved here. Heads is one of the two possibilities: obviously your specification has to be in the sample space. The probability of the coin flip matching your specification is just 1/2. It is not (1/2)^2. ----- Here is another way to look at it. Consider this situation: I flip a coin twice in a row. 1) What is the probability I throw heads twice? 1/2 x 1/2 = 1/4. 2) What is the probability that both coins are the same? This can be rephrased this way: given the result of the first flip, what is the probability that the second matches it? This probability is 1/2. This is easily checked by listing all four possibilities (HH, HT, TH, TT) and noting that 2 out of 4 have matching coins. This is called a conditional probability, written P(A|B), meaning the probability of A given that B has already happened. This is actually equivalent to the specification situation, where B is the specification and A is the event that needs to match the specification. The first flip is the specification and the second is the event. It makes no difference whether the first flip is just chosen or happens by chance: the probability of the second flip matching it is 1/2, not (1/2)^2. jdk
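The two-flip comparison above can be checked by brute-force enumeration; a small sketch:

```python
# Sketch: enumerate the four equally likely outcomes of two fair coin flips.
from itertools import product

outcomes = list(product("HT", repeat=2))  # HH, HT, TH, TT
p_two_heads = sum(1 for a, b in outcomes if (a, b) == ("H", "H")) / len(outcomes)
p_match = sum(1 for a, b in outcomes if a == b) / len(outcomes)

print(p_two_heads)  # 0.25: probability of heads twice
print(p_match)      # 0.5:  probability the second flip matches the first
```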
follow-up #126 Chance is very good at explaining the actualization of a sequence. In fact, it is completely unsurprising that some 'particular' sequence actualized. The emergence of some particular sequence is guaranteed. What chance is NOT very good at is explaining why a sequence matches an independent specification. This is better explained by design. Origenes
Matching an independent specification is an event distinct from the actualization of a sequence. A hand is dealt and a sequence is actualized with probability 1/S, irrespective of whether this hand matches an independent specification. Next we ask: 'does this low-probability event match an independent specification?' This is a separate issue. What is the chance that it does? Again 1/S. Therefore: For any sequence, the probability of becoming actual (1/S) and matching an independent specification (1/S) = 1/S^2. ---- p.s. there is no role for an independent specification in Kitcher's narrative. Origenes
WD400 (and JDK et al), what forces other than those of chemistry and physics (including statistical thermodynamics) are there in Darwin's warm pond or the like pre-life environment? Especially, such as would make the origin of cell-based life not so vastly beyond the search capacity of the observable cosmos as would make no difference? Where, if you wish to suggest a stepping-stones model, kindly show its empirical warrant. While you are at it, show that the islands of function issue, as most recently highlighted by Davies and Walker, is not relevant, is not fundamental. KF PS: Davies and Walker:
In physics, particularly in statistical mechanics, we base many of our calculations on the assumption of metric transitivity, which asserts that a system’s trajectory will eventually [--> given "enough time and search resources"] explore the entirety of its state space – thus everything that is physically possible will eventually happen. It should then be trivially true that one could choose an arbitrary “final state” (e.g., a living organism) and “explain” it by evolving the system backwards in time choosing an appropriate state at some ’start’ time t_0 (fine-tuning the initial state). In the case of a chaotic system the initial state must be specified to arbitrarily high precision. But this account amounts to no more than saying that the world is as it is because it was as it was, and our current narrative therefore scarcely constitutes an explanation in the true scientific sense. We are left in a bit of a conundrum with respect to the problem of specifying the initial conditions necessary to explain our world. A key point is that if we require specialness in our initial state (such that we observe the current state of the world and not any other state) metric transitivity cannot hold true, as it blurs any dependency on initial conditions – that is, it makes little sense for us to single out any particular state as special by calling it the ’initial’ state. If we instead relax the assumption of metric transitivity (which seems more realistic for many real world physical systems – including life), then our phase space will consist of isolated pocket regions and it is not necessarily possible to get to any other physically possible state (see e.g. Fig. 1 for a cellular automata example).
[--> or, there may not be "enough" time and/or resources for the relevant exploration, i.e. we see the 500 - 1,000 bit complexity threshold at work vs 10^57 - 10^80 atoms with fast rxn rates at about 10^-13 to 10^-15 s leading to inability to explore more than a vanishingly small fraction on the gamut of Sol system or observed cosmos . . . the only actually, credibly observed cosmos]
Thus the initial state must be tuned to be in the region of phase space in which we find ourselves [--> notice, fine tuning], and there are regions of the configuration space our physical universe would be excluded from accessing, even if those states may be equally consistent and permissible under the microscopic laws of physics (starting from a different initial state). Thus according to the standard picture, we require special initial conditions to explain the complexity of the world, but also have a sense that we should not be on a particularly special trajectory to get here (or anywhere else) as it would be a sign of fine–tuning of the initial conditions. [ --> notice, the "loading"] Stated most simply, a potential problem with the way we currently formulate physics is that you can’t necessarily get everywhere from anywhere (see Walker [31] for discussion). ["The “Hard Problem” of Life," June 23, 2016, a discussion by Sara Imari Walker and Paul C.W. Davies at Arxiv.]
kairosfocus
Hi PaV. You write,
Somehow, according to your logic, and I suppose daveS’s logic, every lottery ticket that is printed will win.
No, and I don't see how you could think that is what I am saying. Let's make a new situation: a lottery where you buy a ticket that contains 5 2-digit numbers. There are 10 billion possible tickets. When you buy a ticket, the probability of your getting your particular set of numbers is 1 out of 10 billion. This probability is true irrespective of whether the lottery is ever held: you have a set of numbers that is 1 out of the 10 billion you might have gotten when you bought your ticket. Now, a winning set of numbers, a specification, is chosen by the lottery administrators. The probability of your having the winning ticket is 1 out of 10 billion also. But this is a different probability even though it has the same numerical value. The first is the probability that you got the particular ticket you did. The second is the probability that your ticket matches the winning numbers. The second is in comparison to the specification, and the first is in comparison to all the other possible tickets in the sample space of tickets. These are two different things. To help illustrate, consider the case in which you only have to match the first four numbers to win. Then the probability of winning is 1 out of 100 million, even though the probability of your getting the ticket you got is still 1 out of 10 billion. (These numbers are checked in a short sketch after this comment.) P.S. I also don't know enough about genes to discuss the implications of all this to biology. However, you agreed with my statement that
This is not to say that specifications aren’t important. It is to say that there are two separate events to consider: the probability of the event itself and the probability of the event matching a specification.
The biological example you give seems to be in agreement with the point I am making that the probability of a random event matching a specification is separate from the probability of the random event occurring. jdk
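As referenced above, the arithmetic in the 5-number lottery example can be checked in a couple of lines; a sketch, assuming each of the five numbers ranges over 00-99:

```python
# Sketch: ticket counts and winning probabilities for the 5 x 2-digit lottery above.
tickets = 100 ** 5        # 10,000,000,000 possible tickets (ten billion)
print(tickets)
print(1 / tickets)        # 1e-10: probability of holding any one particular ticket
print(1 / 100 ** 4)       # 1e-08: probability of matching only the first four numbers
```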
jdk:
Your 2nd and 3rd lines are about comparing an event to a specification, which adds an additional component to just looking at the event itself. Being dealt a hand is a separate issue from that hand matching a specification.
If a DNA sequence consists of 300 nucleotides, each of which is 1 of the 4 possible nucleotides cellular life permits, then how does this sequence of 300 nucleotides become a sequence of 100 amino acids, each of which is one of the 20 permitted in cells? The point is that we could use combinatorics to come up with a probability for ANY stretch of DNA sequence, and, no matter where you look along the 3 x 10^9 nucleotides making up the human genome, the probability will ALWAYS be the same. (The sizes of these sequence spaces are sketched after this comment.) Now, given that that is the case, then please define a "gene."
This is not to say that specifications aren’t important. It is to say that there are two separate events to consider: the probability of the event itself and the probability of the event matching a specification.
Yes. And in the cell there are a whole lot of events that go into "specifying" DNA sequence. Not just any-old sequence will do. It needs to be specific. And this is determined via protein-protein interactions. So, you see, there are two separate "events." The "second event" is the requirement of a protein having the right shape, and the right binding regions, and these binding regions in the right place along the length of the amino acid chain. Work out the probabilities. They're extremely small. PaV
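As referenced above, the raw sizes of the two sequence spaces invoked are easy to compute; this sketch does only the counting and says nothing about which sequences are functional:

```python
# Sketch: raw combinatorial sizes of the sequence spaces mentioned above.
nucleotide_space = 4 ** 300   # possible 300-nucleotide DNA stretches (4 bases per position)
protein_space = 20 ** 100     # possible 100-amino-acid chains (20 amino acids per position)

print(f"{nucleotide_space:.2e}")  # ~4.1e+180
print(f"{protein_space:.2e}")     # ~1.3e+130
```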
PaV,
Somehow, according to your logic, and I suppose daveS’s logic, every lottery ticket that is printed will win.
No, that doesn't follow from anything we've said.
There are mathematicians and biologists who protest this simple truth.
There are? Can you name a few? daveS
jdk:
The probability of drawing some sequence and the probability of drawing any particular sequence are statements that are true whether anyone ever makes a specification or not.
Somehow, according to your logic, and I suppose daveS's logic, every lottery ticket that is printed will win. "But, of course not," you would say. "However," you would add," each lottery ticket has the same probability of being printed up." Yet, how could you possibly have a lottery if there is not some specific target selected? It is absolutely mind-numbing that this distinction isn't obvious to anyone who spends a few minutes thinking about it. In biology, the problem is far worse. You need two proteins to match up with one another; so, they have to "co-specify" one another. jdk:
Is this distinction all that is being discussed?
Yes. And, somehow, this continues to mystify. daveS:
If you called those probabilities instead, we have agreed they are correct for this experiment. The probability of drawing some sequence is 1. The probability of drawing a particular sequence is 1 in 4 x 10^21.
There are mathematicians and biologists who protest this simple truth. PaV
Origenes, you seem to skip over what I think should be line 2 in 119: The probability of drawing any particular sequence is 1 in 4 x 10^21. The probability of drawing some sequence and the probability of drawing any particular sequence are statements that are true whether anyone ever makes a specification or not. Your 2nd and 3rd lines are about comparing an event to a specification, which adds an additional component to just looking at the event itself. Being dealt a hand is a separate issue from that hand matching a specification. If I am dealt a hand, the probability of being dealt that particular hand is 1 out of S = 4 x 10^21. Before the deal it was just one of S possibilities. After the deal, it exists: it is the one out of the S possibilities that became an actuality. If someone says, "Ah yes, that is the hand I was expecting to get," that would be, in your words, a dependent specification. It would also be a silly thing to say. But if someone looks at their hand and thinks to himself, "I am virtually certain no one in the universe has ever had this particular hand, even if this game has been played billions of times, because this particular hand, like all hands, has an extremely small probability of happening," they would be making an accurate statement. Notice that this second statement has nothing to do with specifications. It is just a true statement reflecting the fact that the probability of any hand is 1/S. Specifications involve matching the hand with one or more possible hands, either specified beforehand or noticed afterwards because of some meaningful pattern. Specifications themselves are not part of the theoretical analysis of the probability of something. They are an additional component that comes from comparing the event with some additional criteria. This is not to say that specifications aren't important. It is to say that there are two separate events to consider: the probability of the event itself and the probability of the event matching a specification. (And, as I have pointed out a couple of times, the specification can be broader than just specifying a specific event, in which case the probability of the event and the probability of matching the specification are not the same. For instance, in 114 I used the example of the specification being "all cards the same suit.") jdk
The probability of drawing some sequence is 1. The probability of drawing an independently* specified sequence is 1 in 4 x 10^21. The probability of drawing a dependently** specified sequence (Kitcher's method) is 1. - - - - * by 'independently specified' is meant, specified independently from the dealt hand. ** by 'dependently specified' is meant, specified on the basis of the dealt hand. Origenes
The probability of pulling out a number = 1.0, but this is trivial. In more formal language, it says that all possible events are in the sample space (this is true by the definition of sample space), so that whatever event happens, it will be a member of the sample space. The probability of pulling out a particular number is 1 in 4 x 10^21. Is this distinction all that is being discussed? P.S. Yes, to Dave: expected value is not the same concept as probability. I just took PaV to mean probability in 116. jdk
PaV, Hm. That's not how I use the term "expectation". This wikipedia page describes what I understand it to mean. If you called those probabilities instead, we have agreed they are correct for this experiment. The probability of drawing some sequence is 1. The probability of drawing a particular sequence is 1 in 4 x 10^21. daveS
daveS and jdk: Apparently the only way I'm going to point out to you the distinction I'm making is to use "expectation values" (a la Bob'O'H). The "expectation value" of pulling out a number from the bowl full of numbers is 1.0. The "expectation value" of pulling out a specific number from the bowl is 1 in 4 x 10^21. Should we talk about the "probability" of something when its 'expected' value is 1.0? You can, if you like. But it is not germane to the realm of probabilities. 1 x (1 x 10^-3000) is 1 x 10^-3000. It's like talking about the "zero vector": trivial. PaV
jdk,
Notice that the two probabilities in 1A1 and 1B1 are the same numbers, but they are probabilities of dfferent things: the probability of guessing right vs the probability of just getting the particular hand you did. In 1A1 you are amazed, and would probably conclude cheating of some type, but in 1B1 there is nothing amazing at all: you just got a random hand.
This is a good point. I guess you could think of the sample space in 1A1 as ordered pairs of sequences, where the event in question occurs if the two sequences in the ordered pair match. In 1B1, the sample space consists of just single sequences. At least that's one way to differentiate between the two scenarios. daveS
Re 111: Yes, sticking in your hand and drawing out a number involves probabilities: you have a 1 out of S = 4 x 10^21 chance of drawing out any particular number, so the probability of drawing the particular number you drew is 1/S. If there were only three numbers in the bowl, you would have a 1/3 chance of drawing out any particular number. The fact that someone did, or didn't, make any specification doesn't affect the probabilities of the actual event itself. The probability of the event matching the specification is a different probability than the probability of the event itself. As I explained in 105, if you specify some number of hands N, then the probability of guessing correctly is N/S, but the probability of the hand that came up is still 1/S. For example, if your specification is "all cards the same suit", then there are 4 x 13! = 2.5 x 10^10 hands that will match the specification, so the probability of matching the specification is about 1 out of 1.6 x 10^11, while the probability of whatever hand was dealt is still just 1 out of 4 x 10^21. jdk
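The "all cards the same suit" figures above can be verified directly; a short sketch (Python 3.8+ assumed):

```python
# Sketch: how many ordered 13-card deals are all one suit, and the resulting probabilities.
import math

matching = 4 * math.factorial(13)   # pick a suit, then any ordering of its 13 cards
S = math.perm(52, 13)               # all ordered 13-card deals

print(f"{matching:.2e}")            # ~2.49e+10
print(f"{matching / S:.2e}")        # ~6.3e-12, about 1 out of 1.6 x 10^11
print(f"{1 / S:.2e}")               # ~2.5e-22 for any one particular ordered deal
```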
PaV,
Here, I disagree. Kitcher fails to understand the whole notion of a predetermined “pattern,” and confuses dealing out cards–with no real probability other than 1.0–and the probability of a particular grouping of different cards found in the dealt hands. They’re two separate realities: one falls squarely within the realm of probabilities, while the other is squarely outside that realm.
I don't know what you mean by something "falling outside the realm of probabilities". We've already agreed that the probability of getting any particular permutation is 1 in 4 x 10^21. That's the case whether you specify some permutation beforehand or not. [Edit: What jdk said] I don't see what else there is to discuss here. daveS
The probability of a particular hand is 1 out of S = 4 x 10^21 irrespective of whether anyone makes a specification or not, or even if a hand, or any hand ever, gets dealt. The probability is a theoretical construct based on the parameters of the situation. jdk
daveS:
Probabilities do “come into play” whenever you define an experiment (such as drawing 13 cards from a deck).
And we're talking about two completely different "experiments." That's the whole point being missed here. And it is an important, and vital, one. Two experiments: Let's say there is a bowl with 4 x 10^21 different numbers marked on them. First experiment: stick your hand in the bowl and select one of the numbers. Second experiment: declare which of those numbers you will select; then stick your hand in the bowl and select one of the numbers. The act of selecting a number involves no probability; yet, ID opponents want to say, "Oh, yes it does involve probabilities. In fact, the first experiment is completely equivalent to the second experiment." And they're completely wrong, as you can see. PaV
daveS:
Well, yes, depending on how the experiment is defined. Kitcher says that the outcomes of the experiment are permutations of 13 cards, so we can calculate probabilities of events in that experiment.
You missed the point: the particular permutation of cards I receive is one thing; however, the FACT that I will absolutely (unless the dealer dies before completing the deal) receive the hand makes "receiving the hand" fall outside of probabilities. It exists simply by definition of what cards and dealing are.
Yes. I would say that’s included in the definition of the experiment—you must always receive 13 cards. A thinking person could not think otherwise.
Apparently you didn't miss the point. This was it.
All I’m saying is that Kitcher’s statement in the passage I responded to is accurate.
Here, I disagree. Kitcher fails to understand the whole notion of a predetermined "pattern," and confuses dealing out cards--with no real probability other than 1.0--and the probability of a particular grouping of different cards found in the dealt hands. They're two separate realities: one falls squarely within the realm of probabilities, while the other is squarely outside that realm. It's a fundamental mistake that opponents of ID make all the time. PaV
wd400: Here's what a commentator wrote--quoting Wickramasinghe about their book: "The upshot of all this was that nothing was decisively resolved. We did not convince our opponents as we had set out to do, and we also lost many friends! The prudence of taking on so powerful an institution like the British Museum must on retrospect be called to question." Maybe by "brainless" you mean "thinking outside of the box." But, of course, if NO ONE thinks outside of the box, then you're not really free to think, are you? Today, the "box" is called being "politically correct." PaV
wd400:
They in no way relate to any process that has been suggested to produce the results.
And what processes are those that you propose?
How much biological function exists in sequence space is an interesting question, but these uninformed calculations are not related to those questions in any obvious way.
You seem to patently contradict yourself in this sentence. Just exactly how do you specify "sequence space"? PaV
Hi Origenes, First, I want to say that I have no knowledge of, and no interest in, whatever Kitcher had to say, or what point he is trying to make. I'm just interested in the discussion about probability. But I don't think you are correct when you say,
They are not probabilities of different things. What is involved when one determines “the probability of just getting the particular hand you did”? First one makes a post-specification and next, one goes ‘back in time’ and calculates the probability of guessing right.
Every possible hand has an equal probability of 1/S of occurring. This is the probability irrespective of whether anyone ever deals a hand or not. Once a hand is dealt, 1/S is simply the probability of getting that particular hand, no matter what that hand is. No pre- or post-specification is involved in simply dealing a hand. If you make a pre-specification, then you are comparing that pre-specification with the hand that happened. If you only guessed one hand, then the probability of guessing right is also 1/S. If you were to choose four possible hands as pre-specifications, then the probability of guessing right is 4 x 1/S = 4/S. You write,
Go through the process step-by-step and you will see that this is how it works.
I'm not sure what process you are talking about. The process I outlined is: 1. Guess a set of possible hands N (maybe N = 1, or maybe N > 1) and write them down. These are the pre-specifications. 2. Given your pre-specifications, the probability of your guessing right is N/S. 3. Deal a hand and see if you guessed right. In this case, there is no post-specification (that is, you aren't looking for any pattern other than a match with one of your pre-specifications). That's all. What about this seems wrong? jdk
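For what it's worth, step 2 above can be written out exactly; this is only a sketch, and the N values are my own example choices (the 2 x 10^21 case corresponds to the 50% figure mentioned elsewhere in the thread):

```python
# Sketch of the pre-specification probability N/S for a few example values of N.
from fractions import Fraction
from math import perm

S = perm(52, 13)                      # equally likely ordered deals
for N in (1, 4, 2 * 10**21):          # example pre-specification counts
    p = Fraction(N, S)                # chance the dealt hand matches one of the N guesses
    print(f"N = {N:<22d}  P(match) ~ {float(p):.3g}")
```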
Jdk: Notice that the two probabilities in 1A1 and 1B1 are the same numbers, but they are probabilities of different things: the probability of guessing right vs the probability of just getting the particular hand you did.
I do not agree. They are not probabilities of different things. What is involved when one determines “the probability of just getting the particular hand you did”? First one makes a post-specification and next, one goes ‘back in time’ and calculates the probability of guessing right. Go through the process step-by-step and you will see that this is how it works. After the ‘time travel’ is completed Kitcher welcomes you back and shouts: “You did it!” The problem for Kitcher is that time travel — and knowledge of 'future events' that comes with it — changes the odds dramatically. Origenes
FFT: First, probabilities exist irrespective of whether anyone has offered a specification beforehand or not, or even if a hand is dealt or not. Probabilities are theoretical constructs given certain parameters. The probability of getting all spades in order exists and can be calculated, given a deck of cards, whether anyone ever deals a hand of bridge or not. Looking for a pattern, or recognizing a pattern, is a separate issue. To discuss further, I'd like to make a distinction between what you might call a pre-specification and a post-specification. Also, for brevity, let S = 4 x 10^21. Situation 1: a standard deck of cards. 1A1: you declare before hand (pardon the pun) a certain order, like 3 clubs, 6 diamonds, jack spades, etc. Then you deal the cards. Your chance of having guessed right is 1/S. I would call this a pre-specification because you declared what you are looking for before the event. 1B1: you make no pre-specification. The hand is dealt, and you get 5 hearts, ace clubs, king clubs, etc: just a random looking bunch of cards. The probability of this hand is 1/S. Notice that the two probabilities in 1A1 and 1B1 are the same numbers, but they are probabilities of different things: the probability of guessing right vs the probability of just getting the particular hand you did. In 1A1 you are amazed, and would probably conclude cheating of some type, but in 1B1 there is nothing amazing at all: you just got a random hand. 1A2. To understand better the role of pre-specification in 1A1, consider choosing a number of possible hands N. Obviously, your surprise at matching the dealt hand is lessened depending on N: if you choose 2 x 10^21 hands, you have a 50% chance of matching the dealt hand, and wouldn't be surprised at all if that happened. 1B2. Again, you make no pre-specification. This time you get all the spades in order. Even though you made no pre-specification, this hand is so obviously special that you would be amazed. However, the probability of "being amazed" is not really 1/S, because there are other hands that would also stimulate the same sense of specialness: all the other suits in order, all the cards in a suit in reverse order, all the spades in any order, etc. Let us call this sense of a special hand occurring a post-specification. Obviously, the probability of matching a post-specification is greater than 1/S, because there are some number N of hands that would trigger our judgment that the hand was special. This is where things get fuzzy: how big is N? All the spades in order is obvious. How about ace clubs, 4 diamonds, 7 hearts, 10 spades, king clubs, 3 diamonds, etc., where the values go up by 3 and the suits advance by 1 with each card? Is that special enough to be a member of N? What about the same pattern of numbers but the suits in no particular order? How about the first 12 cards following a pattern, but the 13th not? Is that still a member of N? And this example suggests that there might be "degrees" of specialness, so that some hands are clearly members of N while for others it is a judgment call that people might disagree about. The point is that pre-specifications are a well-defined set, but post-specifications are not. There is no a priori way of determining exactly what the probability is of matching a post-specification because we really couldn't, I think, clearly list all possible hands that we would consider and all the ones we wouldn't.
1C. Now consider a deck of 52 cards that each contain a distinct symbol, but for which there are no subgroups of any kind. In this case, I don't think there are any possible post-specifications because there is no pattern to notice after the fact of the hand being dealt: every hand is unique, with probability 1/S. Of course, you could make a pre-specification, and have just the same chance of being right as with a regular deck of cards. But with this second deck of cards, there are no post-specifications. I think this example illustrates that post-specifications involve some type of judgment on our part about what is meaningful. Also, the probabilities of post-specifications are harder to calculate, and fuzzier, because we don't know what subset of possible events out of the sample space S we would consider meaningful, and whether there might be degrees of meaningfulness, so to speak. So, FFT. jdk
PS: One clarification:
Let’s not play games. When a particular “pattern” is being looked for–a “pattern” which we will call a “specification”–then probabilities come into play. And they mount up in a hurry.
Probabilities do "come into play" whenever you define an experiment (such as drawing 13 cards from a deck). daveS
PaV,
I’m in a Bridge game. The deck is shuffled. I’m dealt the cards. I pick them up. I sort them by suits. Is there any kind of ‘probability’ involved with this? If there is, then what is it?
Well, yes, depending on how the experiment is defined. Kitcher says that the outcomes of the experiment are permutations of 13 cards, so we can calculate probabilities of events in that experiment.
Now, if there were a 1 in 100 chance that I wouldn't be dealt a hand, or that the hand was somehow defective, then common sense tells you that the game of Bridge wouldn't exist. Who would be so foolish as to play it? No, it is a GIVEN that I will be dealt, and receive, a hand.
Yes. I would say that's included in the definition of the experiment---you must always receive 13 cards. A thinking person could not think otherwise.
Now, if I sit down for a game of Bridge, and I say ahead of time that I want to receive, in this order, the Ace of Hearts, 10 Diamonds, etc, etc., then there isn’t enough time in the world for that to happen.
Yes, at least it's exceedingly unlikely it will happen in any reasonable timeframe.
Let’s not play games. When a particular “pattern” is being looked for–a “pattern” which we will call a “specification”–then probabilities come into play. And they mount up in a hurry.
Ok, sounds reasonable.
What’s the story about if you start with one penny in the first square of a chess board, and then double that amount when moving from one square to the next one on the same line, by the time you reach the end, you have 2^63 times $00.01, or $9.22 x 10^16, or 9200 Trillion dollars. Now what if you increased this by 20, the inverse of the probability of selecting a particular amino acid? Just think of 10 specified amino acids in a chain of 300 amino acids. Well, the improbability is 1 in 20^10, or 1 in 10^13. So, for the ancestor of the elephant to make one change in a portion consisting of a stretch of 10 amino acids, this would take 10 trillion replications. Given a population size of a million, 10^6, and given the transition from “mouse to elephant” took 24 million years (see NatGeo article), and given that female elephants give rise to very few offspring in their 60 years of fecundity (let’s say one offspring every ten years), then there isn’t sufficient time for a “specified” 10 amino acid section of a SINGLE protein to arise in all that time. And we want to explain the change of a mouse to an elephant as coming about in a random fashion? Can you possibly close your eyes to the implications of all of this?
I'm not. All I'm saying is that Kitcher's statement in the passage I responded to is accurate. I'm not asserting that it's a valid criticism of Behe's argument, or ID in general. I'm leaving that judgement (and the evaluation of your elephant computation) to people who know something about biology, amino acids, &c. Edit: Breaking my rule of not getting into biology, who said elephants had to evolve? Is this not the sharpshooter fallacy that Origenes referred to? daveS
In what ways are these calculations brainless?
They in no way relate to any process that has been suggested to produce the results.
Was Sir Fred Hoyle brainless? Is that your position?
I certainly think the person who co-wrote a whole book claiming Archaeopteryx was a fraud despite knowing almost nothing about geology or vertebrate biology was quite capable of acting brainlessly, yes. There is no reason to think proteins are pre-specified. How much biological function exists in sequence space is an interesting question, but these uninformed calculations are not related to those questions in any obvious way. wd400
wd400:
It is not an attempt to dismiss probability arguments generally but to undercut the kind of brainless specifications you sometimes see (tornado in a junkyard, number of amino acids to the 20th power etc).
In what ways are these calculations brainless? Was Sir Fred Hoyle brainless? Is that your position? And, do you take the position that proteins aren't "pre-specified"? (I'll presume that you will accept that proteins are "specified".) daveS: I'm in a Bridge game. The deck is shuffled. I'm dealt the cards. I pick them up. I sort them by suits. Is there any kind of 'probability' involved with this? If there is, then what is it? This is how a Bridge game is played. Now, if there were a 1 in 100 chance that I wouldn't be dealt a hand, or that the hand was somehow defective, then common sense tells you that the game of Bridge wouldn't exist. Who would be so foolish as to play it? No, it is a GIVEN that I will be dealt, and receive, a hand. How can a thinking person think otherwise? Now, if I sit down for a game of Bridge, and I say ahead of time that I want to receive, in this order, the Ace of Hearts, 10 Diamonds, etc, etc., then there isn't enough time in the world for that to happen. Let's not play games. When a particular "pattern" is being looked for--a "pattern" which we will call a "specification"--then probabilities come into play. And they mount up in a hurry. You know the story: start with one penny on the first square of a chess board and double that amount when moving from one square to the next; by the time you reach the end of the board, you have 2^63 times $0.01, or $9.22 x 10^16, which is about 92,000 trillion dollars. Now what if you multiplied by 20 at each step instead, 20 being the inverse of the probability of selecting a particular amino acid? Just think of 10 specified amino acids in a chain of 300 amino acids. Well, the improbability is 1 in 20^10, or about 1 in 10^13. So, for the ancestor of the elephant to make one change in a portion consisting of a stretch of 10 amino acids, this would take 10 trillion replications. Given a population size of a million, 10^6, and given that the transition from "mouse to elephant" took 24 million years (see the NatGeo article), and given that female elephants give rise to very few offspring in their 60 years of fecundity (let's say one offspring every ten years), there isn't sufficient time for a "specified" 10 amino acid section of a SINGLE protein to arise in all that time. And we want to explain the change of a mouse to an elephant as coming about in a random fashion? Can you possibly close your eyes to the implications of all of this? PaV
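For readers who want to check the arithmetic in the comment above, here is a short sketch; the offspring tally is my own reading of the stated assumptions (one million females, one offspring per ten years, 24 million years), not a figure given explicitly:

```python
# Rough check of the chessboard and 10-amino-acid numbers above.
last_square = 0.01 * 2**63             # dollars on the 64th square: ~$9.2e16
print(f"last square of the board : ${last_square:.3g}")

needed = 20**10                        # 1 chance in 20^10 for a 10-residue specification
print(f"20**10                   = {needed:.3g}")     # ~1.02e13, i.e. ~10 trillion

# Implied number of trials under the stated assumptions (my reading of them):
offspring = 1_000_000 * 24_000_000 / 10
print(f"offspring in 24 Myr      ~ {offspring:.3g}")  # ~2.4e12, short of 20**10
```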
Origenes - in this case, because I know the context. I know the inhabitants of that island speak and write English, so I know it is more likely that they set this pattern up themselves. Bob O'H
WD400, We hear your scornful dismissal: "brainless." What is it that you fear so much that you scorn rather than address? Let's see: AA's can be chained, as can D/RNA, forming strings, check. Any one can be followed by any other, check. (A lot more can happen too, in organic chem, but let us set that aside.) Now, we therefore see X-X-X-X . . . X, n times, where each X can be 20 states (typically -- yes there are extra AAs too but they are rare). Or for D/RNA 4 states, ignoring the new artificial elements that were recently added. So, we see 20 possibilities x 20 x 20 . . . x 20, n times over, or else 4 x 4 x 4 . . . x 4, n times over. That is simple mathematics. Therefore by definition an n-element 20-state string specifies a configuration space of 20^n possibilities. Likewise, a similar string of 4-state elements specifies a space of order 4^n. So, your first quarrel is with the mathematics of strings, not a prosperous case to be in. Now, obviously, we assume a flat random distribution of elements -- there is no credible evidence of preferred sequences, which would run counter to the ability to store info and is also counter to the chaining chemistry, where the info storage is structurally at right angles to the chaining. Now, that means that we are dealing with raw info storage of 4.32 bits/symbol for AA's and 2 bits/symbol for D/RNA chains. This is similar to memory registers. Now, let us go to a Darwin warm pond or the like pre-life environment. Tabula rasa. How do we find, from simple "soup" elements, the chaining and folding and information processing, on blind chance and mechanical necessity? Not to mention the functional organisation and encapsulation that makes metabolism and the NC machinery that assembles proteins go. Or, the implied von Neumann kinematic self-replicator? No, we cannot beg the question and assume an existing life form; the first problem is that pre-life environment, and we are neglecting the thermodynamics and the chemical possibilities way beyond this, so that the information and search challenge are actually much worse than any objection you may have to naive flat random approaches. See how your contempt-laced dismissal is now itself readily seen to be brainless? Now, we know that the genome for a first living cell is likely to be in the range 100 - 1,000 k bases. That is coming from a config space of scale 4^100,000 or much worse. Remember, given what else could be going on, this is a lower estimate on the challenge. Our sol system is ~ 10^57 atoms and the observed cosmos ~ 10^80. Generously, let us assume all are in play, never mind that most are H or He and are in zones not conducive to life chemistry. 10^17 s, about 10^12 - 10^14 chem rxns per second, on generous organic rates. Search capacity: 10^(57 + 14 + 17) = 10^88 for our sol system. The config space for 100,000 4-possibility elements is of order 9.98 x 10^60,205. Negligibly different from zero search. Nor would going up to the observed-cosmos level make a material difference. But is life written into the laws of the cosmos, so that the result is in effect forced? Given the fine tuning challenge, that would be fine tuning on steroids, raising an almost forced conclusion of design. It's bad enough already when we see what is in the laws that forces H, He, C and O to greatest abundance, and puts N close by. Stars, periodic table, water, organic chemistry, proteins, in one shot. The cosmos is fine tuned to produce the building blocks for that. Design practically screams out at us already. As Sir Fred Hoyle full well understood.
Coming back to OOL, objectors to the design inference have no empirically warranted mechanism to causally account for the functionally specific complex organisation and information in cell-based life, on blind chance and/or mechanical necessity. If you doubt this, simply summarise: ________ then give the people who identified this ______, with what evidence __________, and identify their Nobel prizes for solving the OOL challenge: _______. Apart from rhetorical stunts, those blanks simply cannot be filled in. Bluff called, and we are confident that evo mat advocates cannot provide the evidence to back their rhetoric. Going on, in life as studied, yes, we find patterns in AAs and in DNA codes etc, such that exactly flat random distributions are not found. That is not driven by chemical or physical bias, so it is a reflection of how codes and functions need to be organised, and in so being organised, they will not be at the flat random end. Durston et al published a great analysis on that in 2007. Going on, we now find something more: proteins come in clusters in AA sequence space, including many domains with one or a few members, deeply isolated. In short, lo and behold, islands of function in very large config spaces. So, no smooth gradient of easy incremental improvements on an across-the-board basis. The fitness function model so often put up applies within an island of function, not on the config space as a whole. So, how do we account for body plans from the Cambrian revolution to our own, on models that critically depend on incrementalism to achieve stepwise development? We simply cannot, and the spaces for body plans run like 10 - 100+ million bases each, much much harder. Indeed, round down to 1/2 a bit on average per base of info, and say that only 10% of the genome counts -- plenty of room for junk DNA. With 10 mn bases, that is 5 mn bits, and 500,000 bits to account for. We are still in impossible-to-search blind-search challenge territory. So who is really being "brainless" now, please, and why? KF kairosfocus
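The orders of magnitude quoted above can be checked in a few lines; this is a sketch only, taking the comment's own assumptions at face value (flat random strings, ~10^57 atoms, ~10^14 reactions per second, ~10^17 seconds):

```python
# Sketch checking the orders of magnitude quoted above.
from math import floor, log2, log10

print(f"bits per amino acid : {log2(20):.2f}")   # ~4.32
print(f"bits per DNA base   : {log2(4):.2f}")    # 2.00

exp = 100_000 * log10(4)                         # log10 of 4^100000
mantissa = 10 ** (exp - floor(exp))
print(f"4^100000 ~ {mantissa:.2f} x 10^{floor(exp)}")      # ~9.98 x 10^60205

print(f"sol-system search capacity ~ 10^{57 + 14 + 17}")   # 10^88 states examined
```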
I have a question for someone here who believes in evolution. From the perspective of Darwinian evolution, an environmental condition is something to which an organism must adapt in order to improve its chances of survival and reproduction. If, for example, this environmental condition is the 'intron-exon gene structure', then adapting to it simply means gaining the ability to remove introns and join exons. Since this ability comes from a group of proteins (the RNA splicing machine), while the information that codes for them is written in the DNA, adapting to an environment simply means finding the right arrangement of nucleotides in the DNA. This relationship between an environmental condition and the right arrangement of nucleotides is similar to a question-answer relationship. In other words, the intron-exon gene structure is the question, while the arrangement of nucleotides in the DNA that codes for RNA splicing is the correct answer to this question. Hence, adapting to a particular environmental condition is like providing the right answer to a particular question. Given this analogy, you need to explain to me how to find the correct answer to a particular question via evolutionary mechanisms. Of course, there is one constraint here: you are not allowed to see the question. We know that evolution has no intelligence and no mind, so it cannot see, read, think, perceive ... it cannot grasp the problem. Hence, I have this question written in my Word document but you are not allowed to see it. But you can use whatever evolutionary mechanism you choose, e.g. functional shift, exaptation, co-option, selection, duplication ... to provide the correct answer. In other words, you are allowed to combine existing letters, words and sentences that exist in books, newspapers, magazines, dictionaries, the internet or in your mind. You can do whatever you want in creating new combinations of linguistic elements. The only constraint is your inability to use engineering and intelligent design principles in solving the problem. You are unable to notice or become aware of the question, or in other words, you are unable to create a mental representation of the perceived question and then, by using your cognitive faculties, co-opt the right combination of letters, words and sentences according to this mental representation. In short, no intelligence is allowed. Also, you cannot communicate with me about the partial accuracy of the answer, since communication is an intelligent activity, and we know that evolution is not intelligent and therefore it is not able to communicate. Let me explain this by using the above-mentioned environmental condition of the intron-exon gene structure. Adaptation to this environment consists of at least four subprocesses: to recognize mRNA and its intron-exon boundaries, then to cut the introns out, to rearrange the exons and finally to release the mRNA molecule. Only when a combination of nucleotides in the DNA that codes for all these subprocesses exists can natural selection act, and not before. For example: if we assume the existence of splicing helper proteins that assemble at the intron-exon borders to guide small nuclear ribonucleoproteins to form a splicing machine, this partial correctness of the splicing process won't cause introns to magically disappear without a complete splicing machine. Meaning, evolution is able to select an organism only when all four subprocesses are in existence. That is why you cannot communicate with me to determine the partial accuracy of the answer.
Here I will also presuppose that all functional words already exist in the "word pool". So, you don't need to create new words from scratch, but you can co-opt the right answer from the words that already exist in books, newspapers, dictionaries, and so on. We are told by evolutionists that co-option is a powerful evolutionary mechanism. They say that the parts necessary to build new molecular machines could be taken from other molecular machines and combined into the new machine being constructed. Hence, in answering the question you are allowed to take 'functional' words from the existing "word pool". Now just think about the extent of the problem. The subject of the question can be any aspect of reality that can be expressed in words. So there is a potential for a nearly infinite number of questions. And since you don't know what the question is, you don't know what words or letters to use, how to combine them, or how many words constitute the correct answer. But we are told by evolutionists that in the real world the path towards the right solution is not carried out by random means. So, there you go ... try to solve this problem via whatever non-random evolutionary means you choose, be it functional shift, exaptation, co-option, selection, duplication ... and let me know when you find the correct answer. forexhr
Bob O'H @96 :) On a more serious note, assuming that you consider a hoax to be designed, you clearly see design. Why is that? Why not shrug your shoulders and say: “things with low-probability happen all the time”? Origenes
Origenes @ 92 - that would clearly be a hoax. You've clearly never travelled on BR. Bob O'H
wd400, I wouldn't call those specifications brainless so much as naive. But they're usually naive proposals made where informed specifications don't exist. Also, while an unweighted distribution is, surely, unlikely, an arbitrary weighted distribution is just as likely to work against what you want/need as for it. While treating such naive proposals as necessarily definitive is unfair, they aren't at all unfair when properly considered with respect to our general ignorance; and simply dismissing them while demanding that a better likelihood will arrive is, well, fallacious. LocalMinimum
Have any of you read the passage in context? It's pretty clear that Kitcher's point is that the particular pre-specification of a hypothesis is important. It is not an attempt to dismiss probability arguments generally but to undercut the kind of brainless specifications you sometimes see (tornado in a junkyard, number of amino acids to the 20th power etc). wd400
Origenes, I would infer that the stones were deliberately placed to spell out that message. I don't know that it would have much to do with probability though. daveS
DaveS What would you do when stones spell out “Welcome to Wales by British Railways”? Would you: (1) infer design because this has a low-probability arising by chance. (2) Shrug your shoulders and say: "things with low-probability happen all the time" ? Origenes
Origenes, I expressed some reservations about the word "improbable" above in #84. Would you agree that the event you described would occur very rarely (by human standards, and relative to the number of trials)? daveS
Dave, the bottom line is that according to Kitcher’s reasoning, which you endorse, everything is ‘improbable’, and the term has lost its meaning entirely. Origenes
Origenes, Edit---I misread your calculation. No one is saying this is some great feat. It's rather trivial. It's just that you've drawn one particular sequence out of an enormous sample space, and it's so rare that it will most likely never occur again. daveS
Suppose you repeat the process 10 times. You’ll now have received 10 standard bridge hands, 10 sets of 13 cards, each one delivered in a particular order. Specify the order in which all these cards were received, and the probability is 1 in 4^10 x 10^210, which is approximately 1 in 10^216. And yet ... you did it! Origenes
4 x 10 to the 21st power is still an astronomical number even if you don't get exactly 1 success for every one of those tries. But then, Kitcher says that this poses no problem for other reasons. You just have to specify after the deal. hnorman5
Bob O'H and jdk My mistake. Thanks for the clarification. mike1962
mike1962 - you've missed jdk's point, which is about getting exactly the expected number. If you have 10 coin tosses, the probability of getting exactly 5 is 0.246 (from a binomial distribution). Actually, if you have more tosses, the probability is even less: with 100 tosses, the probability of getting exactly 50 is 0.08. Of course, with 101 tosses the probability of getting exactly the expected number (50.5 heads) is zero. Bob O'H
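The binomial figures above are easy to reproduce with the standard library; a minimal sketch, assuming a fair coin:

```python
# Probability of getting exactly k heads in n tosses of a fair coin.
from math import comb

def pmf(k, n, p=0.5):
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(f"P(exactly  5 in  10) = {pmf(5, 10):.3f}")    # 0.246
print(f"P(exactly 50 in 100) = {pmf(50, 100):.3f}")  # 0.080
# The expected number of heads is n*p (5 and 50 respectively), yet landing on
# it exactly is itself a fairly improbable outcome -- the point under discussion.
```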
PaV,
Dave, don’t take it the wrong way, but I think it’s a violation of common sense. If you’re a trained statistician, then maybe this language will help: the “likelihood” that you will receive a “bridge hand” is 1. The “likelihood” you’ll receive a particular, specified hand is 1 in 4 x 10^21. Let’s distinguish.
I'm not a trained statistician (to say the least), but I speak of the "probability" of getting a particular hand/sequence assuming some hypothesis, and of the "likelihood" of a particular hypothesis given some observation(s). With that substitution, I would agree with your numbers, with the caveat that the "hands" you describe are ordered. I don't see any violation of common sense in my position, though.
In biology, large parts of proteins must have a specific sequence of amino acids. This, then, becomes astronomically improbable; and the possibility of something like that coming about by chance is essentially zero. That’s the ID argument.
Ok, but I'm not going to say anything about topics which require some knowledge of biology. One comment about my previous posts: In retrospect, I would stick with the adjective "low-probability" rather than "improbable". The connotations of "improbable" seem to indicate an outcome which is especially unlikely compared to the others, and that's not what I mean. daveS
jdk: But there is only a certain probability that the expected number of successes will happen. If I flip a coin 10 times, I would “expect” five heads, but in fact the probability of not getting five heads is about three times as great as the probability of getting 5 heads. This is incorrect. It appears you are generating all 10 coin tosses (bits) at once, in which case that's 252 integers (from 0 to 1023) that have five bits set, 252 out of 1024, or about 0.246. However throwing a fair coin repeatedly does not generate all the bits at once, the system has no "memory", which means each throw is completely independent from all previous throws. Therefore, you should definitely expect very close to a 50% distribution of heads vs tails, and the results will tighten up closer to 50% over time. mike1962
jdk - it depends on what you're doing! The expectation is a straightforward way to summarise the full distribution in a single number, and often it is important in determining what happens in the long term. For example, whether an epidemic will occur or die out (if the expected number of infections per parental infection is > 1, then the probability of an epidemic depends on the expectation). Bob O'H
But there is only a certain probability that the expected number of successes will happen. If I flip a coin 10 times, I would "expect" five heads, but in fact the probability of not getting five heads is about three times as great as the probability of getting 5 heads. So the expected number of successes is sort of a misleading idea, perhaps. jdk
Oh, BTW, if you multiply the probability by the number of attempts, you get the expected number of successes (assuming independence). Bob O'H
Thanks, Bob and PaV. I agree it is not the main issue, but I'm glad to know that we are talking about probability here. jdk
PaV - not so. You still have a probability. The likelihood is the probability of data given parameters and a model (or sometimes is just proportional to that, because the constant of proportionality isn't important for the problem at hand). Bob O'H
jdk: If you multiply the probability by the number of attempts, I've been told here at UD that you get "likelihood," not probability. I just took a look at how "likelihood" is used, and it appears to be used in stochastic processes where the probability density distribution is held constant, and the mean and the variance become variables, with actual data then determining the actual distribution. So, I would agree, the "probability" of receiving a "bridge hand," where any bridge hand will do, is 1. PaV
jdk - I agree (I am now a full professor of statistics, so I can claim a modicum of knowledge here). But I don't think it's the crux of the matter, which is whether you receive 13 cards, or given that you receive 13 cards, they are the cards specified. I haven't looked at all of this thread, but I suspect that the probability that you receive the 13 cards specified, given you have looked at the hand, becomes important. Bob O'H
I have taught probability at the high school level, not college. I just looked up the difference between likelihood and probability, and I don't think, perhaps, that PaV is using likelihood correctly. I think he means probability. I'm curious what others, especially Dave, think about the distinction between the two words. jdk
PaV @73
... the “likelihood” that you will receive a “bridge hand” is 1. The “likelihood” you’ll receive a particular, specified hand is 1 in 4 x 10^21. Let’s distinguish.
Exactly. Now Kitcher might say: "You need a specification? No problemo! I can provide a specification of the bridge hand on the basis of that very same bridge hand, each and every time." Our answer must be that this is NOT allowed. The specification (information!) must stem from a source independent from the bridge hand that was dealt. Do you agree? Origenes
daveS:
I take it some in this thread feel this is a violation of probability theory, but it’s really not.
Dave, don't take it the wrong way, but I think it's a violation of common sense. If you're a trained statistician, then maybe this language will help: the "likelihood" that you will receive a "bridge hand" is 1. The "likelihood" you'll receive a particular, specified hand is 1 in 4 x 10^21. Let's distinguish. In biology, large parts of proteins must have a specific sequence of amino acids. This, then, becomes astronomically improbable; and the possibility of something like that coming about by chance is essentially zero. That's the ID argument. PaV
nm daveS
It makes sense to speak of the probability of an event ONLY IF an independently given specification is involved. A specification independent from the event itself. What doesn't count as 'specification' is summing up all the parts of the event on the basis of that very event itself — the Miller/Kitcher method. Again, the specification must arise/exist independent of the event. Similarly, if we speak of information wrt proteins, we do not refer to proteins themselves, we are referring to information stored in e.g. DNA. Proteins do not describe themselves. Proteins (and rocks) do not contain "information" about themselves. In general, the claim that 'everything contains information', in the sense that everything contains "information" about itself, is therefore wrong. If true, it would render the term 'information' meaningless. If something contains information in the true sense, then it must be about something else. Put another way, DNA does not contain information in the sense that all parts describe DNA itself, but instead, the information in DNA is about something else, e.g. proteins. Bottom line: information about proteins must be independent from proteins. The probabilities that ID is interested in work in the same way. If one is dealt 13 random cards, nothing improbable has happened. However, things can become interesting — improbable — in two ways: (1) the 13 cards match a specification that was given prior to (and thus independently of) the event. (2) It is unknown whether there exists an independent specification, but the 13 cards match a well-known specification, such as "royal flush". IOWs improbability comes into play only when there is, or when there is the suspicion of, an independent specification. Origenes
PaV,
Yes, “once-in-a-million-years” things happen at least 5 times a day.
Yes, this is quite literally true. I'm not saying that the same one-in-a-million-year event is going to happen 5 times in a day, but rather 5 distinct one-in-a-million-year events will happen in a day. Go into any casino, and you can observe 5 such events in a matter of minutes. I take it some in this thread feel this is a violation of probability theory, but it's really not. (Edit) An example: Go to the roulette wheel and write down the next 10 numbers that come up. There are about 6 x 10^15 possibilities for the outcome, all equally likely (assuming an American roulette wheel). This outcome is expected to occur only about once every 200 million years, assuming the experiment is performed once per second around the clock. That's one of the 5 improbable events right there. daveS
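The roulette numbers above check out; a short sketch, assuming a 38-pocket American wheel and one 10-spin experiment per second:

```python
# How rare is any particular 10-spin roulette sequence?
outcomes = 38 ** 10                                  # ~6.3e15 equally likely sequences
years = outcomes / (60 * 60 * 24 * 365.25)           # one experiment per second
print(f"{outcomes:.2e} sequences, ~{years:.0e} years between repeats of a given one")
```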
daveS:
Yet such sequences occur very frequently in a bridge game.
Yes, "once-in-a-million-years" things happen at least 5 times a day. PaV
PaV,
And that means the probability of being dealt A bridge hand is 1; while, being dealt “sequence S” is 1 in 4 x 10^21, and would require, on average, 4 x 10^21 bridge hand being dealt before “sequence S” arises.
Yes, it typically will take a looong time (over 100 trillion years) before a specific sequence comes up, under the assumptions I described above. That's why any particular sequence is very improbable. It's plausible that the vast majority will never be drawn in the history of the universe. Yet such sequences occur very frequently in a bridge game. I don't know anything about biology or the origin of life, or what specific sequences are actually necessary for life to exist, so I can't comment on those matters. daveS
daveS:
For the sake of practicality would you allow me to instead do the experiment with card shuffles? Here’s the procedure: 1) Shuffle a deck of cards well and record the sequence S obtained. 2) Repeatedly shuffle the cards, keeping track of whether they end up in sequence S or not. (N.B.: This S is the sequence obtained after shuffle #1, throughout the experiment) 3) After many repetitions, divide the number of times the cards did end up in sequence S by the total number of trials. Obviously, neither of us expect to get a fraction of 1, correct?
The recording of "the sequence S" is what in ID we call a specification. As I've explained twice already, without a "specification" every permutation of a bridge hand is equal to another, and that means you have an equal number of possible configurations as you do the level of improbability for each configuration. This cancels out. And that means the probability of being dealt A bridge hand is 1; while being dealt "sequence S" is 1 in 4 x 10^21, and would require, on average, 4 x 10^21 bridge hands being dealt before "sequence S" arises. The whole argument is this: if a "specific" "configuration" is needed for evolution to even begin---for example cytochrome c---then how is this configuration (one of 20^100 possibilities: 20 different amino acids, and a 100 a.a. sequence) arrived at? If you want to sweep the problem under the carpet and answer that this is a question that has to do with "origin of life," as many Darwinists do, then we get into the area of de novo genes--genes that are specific to certain taxa, or even species, and found nowhere else. What are the implications? What should I think? PaV
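For scale, the 20^100 figure cited above is easy to compute; this is only the raw count of 100-residue sequences, with no claim here about how many of them are functional:

```python
# Raw size of the 100-residue sequence space (20 amino acid choices per position).
from math import log10
print(f"20**100 ~ 10^{100 * log10(20):.1f}")   # ~10^130.1
```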
Here's a thought experiment. Show a group of people a sand pile and a living, breathing lizard. Tell them that both represent complex arrangements of matter but that both arose by chance. Chances are they'll believe you with regard to the sand pile but not the lizard. The layman will likely say, sure, the sand pile is complex, but with the lizard you have to get all that complexity in the right place or it doesn't work. But most any configuration of sand can make a sand pile. The mathematician will likely speak of sets and subsets. But they're both talking about complex specified information. No amount of unspecified complexity can show us how specified complexity is produced. hnorman5
PaV, Ok, I'm going to set aside the issue of permutations and improbability.
On average, yes, I think I would get one. And why don’t you do the experiment? Prove me wrong.
I'm not sure you understand what I meant. Allow me to clarify. For the sake of practicality would you allow me to instead do the experiment with card shuffles? Here's the procedure: 1) Shuffle a deck of cards well and record the sequence S obtained. 2) Repeatedly shuffle the cards, keeping track of whether they end up in sequence S or not. (N.B.: This S is the sequence obtained after shuffle #1, throughout the experiment) 3) After many repetitions, divide the number of times the cards did end up in sequence S by the total number of trials. Obviously, neither of us expect to get a fraction of 1, correct? daveS
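The procedure above can actually be run if the deck is shrunk; here is a sketch with a 5-card stand-in deck, chosen so that matches occur often enough to count (with 52 cards the fraction would simply sit at 0 for any feasible number of trials):

```python
# daveS's thought experiment, scaled down to a 5-card deck (5! = 120 orderings).
import random

deck = list(range(5))
random.shuffle(deck)
S = tuple(deck)                    # the sequence recorded after shuffle #1

trials, hits = 100_000, 0
for _ in range(trials):
    random.shuffle(deck)
    hits += tuple(deck) == S       # did this shuffle reproduce sequence S?

print(f"fraction matching S: {hits / trials:.4f}  (theory: {1/120:.4f}, nowhere near 1)")
```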
daveS:
Please read the Kitcher quote again. He speaks of the event of getting a particular set of 13 cards in a particular order. That’s a permutation of 13 of the 52 cards, by definition. Why do you think the probability you arrived at equals 1 over 52P13? This is elementary combinatorics.
Don't you see that when you draw randomly from a deck of 52 cards, the "order" is already reflected in the fact that you change the odds for each card selected? You're beating a dead horse here.
Did you read my post #57? I could sit here all day drawing 13-card sequences, each of which is only expected to be drawn once every 100 trillion years (if I can perform the draws fast enough). If I draw 1 sequence per second, I can generate 86400 of these in a 24-hour period.
If you can't see the glaring contradiction in a sentence that reads "improbable events occur constantly," then you are also unable to see the distinction between a specified collection of variables, and an unspecified collection of variables. It is truly amazing this isn't obvious to you.
After a sufficient number of trials, divide the number of times the gravel came up in exactly the same configuration by the total number of trials. Do you think you will get 1?
On average, yes, I think I would get one. And why don't you do the experiment? Prove me wrong. PaV
I have a pot of 20 amino acids. I draw out random ones and make a chain of, say, 100. Can I say "Voila, look what I have done, Cytochrome C! Boy, I am better than I thought."? Or do I get junk with no meaningful activity? Does any 100 make a useful protein, or is there some sort of pattern necessary? Allen Shepherd
PaV,
We’re not dealing (pardon the pun) with permutations here. If you had a deck of cards spread on the table, and from that you picked 13 cards, to keep one collection of 13 distinct from another collection of 13 cards, then, yes, “order” matters. But we’re dealing with a ‘card deal’. And the first pick is 1/52; then the next pick is 1/51 etc. One doesn’t need the permutation formula for this. And, so, the “order” doesn’t matter. If you’re interested in how to ‘group’ the cards into ‘hands’ of 13 cards, then the ‘hands’ can be either ordered, or not ordered, and you would use either the formula for permutations or combinations, respectively.
Please read the Kitcher quote again. He speaks of the event of getting a particular set of 13 cards in a particular order. That's a permutation of 13 of the 52 cards, by definition. Why do you think the probability you arrived at equals 1 over 52P13? This is elementary combinatorics.
This sentence contradicts itself. Can you see that? This is the problem. It is obvious that, by definition, “improbable events” cannot occur “constantly,” yet you blithely state it as fact.
Did you read my post #57? I could sit here all day drawing 13-card sequences, each of which is only expected to be drawn once every 100 trillion years (if I can perform the draws fast enough). If I draw 1 sequence per second, I can generate 86400 of these in a 24-hour period.
And, to make once again the point I made earlier: yes, there are probably “millions of individual stones,” but there are probably 10^200 ways in which they configure themselves; and the one calculation is simply the INVERSE of the other. And any number times its inverse is equal to ONE. So, the probability of the gravel assuming the configuration you find it in is 1. QED.
Vastly more than 10^200 ways, I would guess. Here's a way to test this. Get a truckful (or shovelful) of gravel. Dump it out on the ground and record the exact configuration of the stones (thought experiment obviously). Now scoop up the gravel and repeat the experiment over and over under similar conditions [clarification: I mean for this to be like a card shuffle, so not exactly the same conditions]. After a sufficient number of trials, divide the number of times the gravel came up in exactly the same configuration by the total number of trials. Do you think you will get 1? daveS
daveS:
Do we however agree that: There are 3954242643911239680000 ordered sequences of 13 distinct cards (permutations). There are 635013559600 unordered sets of 13 distinct cards (combinations).
We're not dealing (pardon the pun) with permutations here. If you had a deck of cards spread on the table, and from that you picked 13 cards, to keep one collection of 13 distinct from another collection of 13 cards, then, yes, "order" matters. But we're dealing with a 'card deal'. And the first pick is 1/52; then the next pick is 1/51 etc. One doesn't need the permutation formula for this. And, so, the "order" doesn't matter. If you're interested in how to 'group' the cards into 'hands' of 13 cards, then the 'hands' can be either ordered, or not ordered, and you would use either the formula for permutations or combinations, respectively. In the permutation formula, you would have 52 x 51 x 50 x 49 x . . . x 3 x 2 x 1 in the numerator, and 39 x 38 x 37 x . . . x 3 x 2 x 1 in the denominator. The last 39 terms of the numerator cancel with those of the denominator leaving 52 x 51 x 50 x . . . x 41 x 40 possibilities, and the probability of any permutation would be 1 in 4 x 10^21. When you select one card from a deck of 52 cards, then the probability of choosing the first card is 1 in 52, and, without replacement, the probability of choosing the second card is 1 in 51, etc. Same numbers. Same odds. In terms of all the different kinds of permutations that result from a deck of 52 cards, order matters; but order doesn't come into play in terms of what we're discussing here. daveS:
Yes, it’s true. For example, in a card game, improbable events occur constantly.
This sentence contradicts itself. Can you see that? This is the problem. It is obvious that, by definition, "improbable events" cannot occur "constantly," yet you blithely state it as fact.
There is a road being repaved near my home, and I witnessed a truck dump a large load of gravel. The individual stones ended up in one of who knows how many possible configurations due to a number of factors. That was like shuffling a deck of cards, except with probably millions of individual stones instead of just 52 cards.
And, to make once again the point I made earlier: yes, there are probably "millions of individual stones," but there are probably 10^200 ways in which they configure themselves; and the one calculation is simply the INVERSE of the other. And any number times its inverse is equal to ONE. So, the probability of the gravel assuming the configuration you find it in is 1. QED. PaV
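The two counts being discussed in this exchange (and the cancellation described above) can be verified exactly with the standard library; a minimal sketch:

```python
# Ordered deals vs unordered hands of 13 cards from 52.
from math import comb, factorial, perm

print(perm(52, 13))   # 3954242643911239680000 ordered deals
print(comb(52, 13))   # 635013559600 unordered hands
# The cancellation argument: 52!/39! leaves 52*51*...*40, the same count as perm(52, 13).
assert factorial(52) // factorial(39) == perm(52, 13)
```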
If Kitcher's argument is correct, then under what circumstances, if any, would it be appropriate to make an argument from probability? If you completely invalidate unlikelihood as an argument, then there's no need for natural selection in an evolutionary model. After all, all of those blind mutations, regardless of how convenient, are no more improbable than any other mutations. hnorman5
Origenes,
What does Kitcher mean when he says: (paraphrasing) “only a chance of 1 in 4 x 10^21, so it should happen only once every 100 trillion years, BUT you did it just now! And with my method you can do it all day long! Probabilities are irrelevant”
I think he means simply that improbable events happen all the time. He definitely doesn't imply "probabilities are irrelevant" or anything of the sort.
Doesn’t Kitcher’s “But you did!” imply that something very improbable happened, when in fact this is not the case? Does he not paint the bull’s eye around the arrow after it was shot?
Well, as I argued, an improbable event did occur. I don't think he committed the sharpshooter fallacy, because, following the analogy, he didn't claim that he had been aiming for that particular outcome. Every outcome of that experiment is very improbable, that's all. daveS
DaveS @57
I can’t read Kitcher’s mind, but I really doubt he expects his readers to make such an obvious error.
So Dave, what is your take on it? What does Kitcher mean when he says: (paraphrasing) "only a chance of 1 in 4 x 10^21, so it should happen only once every 100 trillion years, BUT you did it just now! And with my method you can do it all day long! Probabilities are irrelevant" Doesn't Kitcher's "But you did!" imply that something very improbable happened, when in fact this is not the case? Does he not paint the bull's eye around the arrow after it hit the wall? Origenes
Origenes,
What he (implicitly) wants us to do is to assume that a pre-specification has been met: "... But you did, and you have witnesses to testify that your records are correct."
I can't read Kitcher's mind, but I really doubt he expects his readers to make such an obvious error. Each outcome in the 13-card draw experiment has probability about 1 in 4 x 10^21. Suppose this experiment is performed once per second 24 hours a day, 7 days per week. Then the outcome I obtained and posted in #12 would be expected to occur about once every 100 trillion years. That's an improbable event by my standards. Note that I'm not saying this is comparable to successfully predicting which cards will be drawn next. Unless I had psychic powers, that would also only occur about once every 100 trillion years. daveS
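The "100 trillion years" figure above follows directly, assuming one 13-card draw per second:

```python
# Expected wait for one pre-chosen ordered 13-card outcome at one draw per second.
from math import perm
years = perm(52, 13) / (60 * 60 * 24 * 365.25)
print(f"~{years:.2e} years")   # ~1.25e14, i.e. on the order of 100 trillion years
```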
DaveS: Yes, it’s true. For example, in a card game, improbable events occur constantly.
During a card game hands are being dealt. No improbability here. What is improbable is to have correctly predicted the hand that you are being dealt. IOWs the improbability of getting a particular hand is connected to its pre-specification. Kitcher offers no such pre-specification. What he (implicitly) wants us to do is to assume that a pre-specification has been met: “ … But you did, and you have witnesses to testify that your records are correct.” Kitcher's "But you did.." implies "you have guessed it just right" and/or "you dealt yourself exactly the pre-specified hand". It is painting the bull's eye around the arrow after the shot. It's nonsense for sure. See also #50. Origenes
PaV,
If the first card is an Ace of Spades, or a King of Hearts, doesn’t matter since every card has the same likelihood of being chosen: 1/52. The only variable is that the total number of cards to select from changes after each selection if the card is not replaced. If you want to include “order,” then each card would have a 1 in 52 chance of being ‘selected,’ and a 1 in 13 chance of being in the right order. This would increase the improbability.
I'm still not following this reasoning. Do we however agree that: There are 3954242643911239680000 ordered sequences of 13 distinct cards (permutations). There are 635013559600 unordered sets of 13 distinct cards (combinations). If so, then I'll move on.
But, then, Kitcher wants to go on to say that a highly improbable event, such as a particular bridge hand, happens all the time. How would you answer him?
Yes, it's true. For example, in a card game, improbable events occur constantly. There is a road being repaved near my home, and I witnessed a truck dump a large load of gravel. The individual stones ended up in one of who knows how many possible configurations due to a number of factors. That was like shuffling a deck of cards, except with probably millions of individual stones instead of just 52 cards. daveS
daveS: Yes, I meant "1 in 3.95 x 10^21."
That is the probability of any particular permutation being drawn, and that’s an ordered collection of cards. I don’t know what you mean about the first and last cards not mattering.
Whether the first card is an Ace of Spades or a King of Hearts doesn't matter, since every card has the same likelihood of being chosen: 1/52. The only variable is that the total number of cards to select from changes after each selection if the card is not replaced. If you want to include "order," then each card would have a 1 in 52 chance of being 'selected,' and a 1 in 13 chance of being in the right order. This would increase the improbability.
Shuffle a deck well, draw 13 cards at random, and record the permutation of cards drawn. The outcome of this experiment is a permutation. This experiment has about 4 x 10^21 equally likely outcomes. Therefore the event you just witnessed has/had probability about 1 in 4 x 10^21.
But, then, Kitcher wants to go on to say that a highly improbable event, such as a particular bridge hand, happens all the time. How would you answer him? PaV
PaV,
Again, this is a minor point. However, the numbers are: 1/52 x 1/51 x 1/50 x . . . x 1/40 = 3.95 x 10^21. Which is the first card, or the last, doesn't matter, if the cards aren't being replaced.
I assume on the right you mean the reciprocal of 3.95 x 10^21. That is the probability of any particular permutation being drawn, and that's an ordered collection of cards. I don't know what you mean about the first and last cards not mattering.
Well, what do you think Kitcher was saying? It’s not clear from the context of the quote.
He means something like this: Shuffle a deck well, draw 13 cards at random, and record the permutation of cards drawn. The outcome of this experiment is a permutation. This experiment has about 4 x 10^21 equally likely outcomes. Therefore the event you just witnessed has/had probability about 1 in 4 x 10^21. daveS
Phinehas,
Do you really think there is no quantitative, mathematical strategy for helping you? Are you not curious about whether one might exist?
Yes, I am curious whether such a strategy exists. Do you know of any? I did suggest above that perhaps sequences with especially short descriptions might more likely be hand chosen, given human nature/psychology. On the other hand, maybe not. If I were trying to create a random-looking sequence by hand, I would definitely avoid such sequences. I think any such strategy would have to be developed empirically, by understanding how human-created sequences typically deviate from random sequences, and devising statistical tests to differentiate between them.
Intuitively, I'd suppose that the odds of the first hand being randomly drawn are close to 1 / 1. I'd also suppose that the odds of the second hand being randomly drawn are minuscule, perhaps even as small as 1 / 4 x 10^21. Let's say I chose these as starting points. Can you offer anything to demonstrate that my intuition is definitively wrong?
No, but it's not clear to me this would extend to a comprehensive strategy that is actually helpful overall. daveS
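One way to make the empirical approach daveS sketches above concrete is to pick a candidate statistic, measure its distribution under genuine shuffling, and treat hands far out in the tail as suspicious. A minimal sketch in Python, assuming one arbitrarily chosen statistic (the longest run of consecutive same-suit cards) purely for illustration rather than a worked-out test battery:

import random
from collections import Counter

SUITS = "CDHS"
RANKS = "23456789TJQKA"
DECK = [r + s for s in SUITS for r in RANKS]   # 52 two-character labels, e.g. "7C"

def longest_same_suit_run(hand):
    # Length of the longest stretch of consecutive cards sharing a suit.
    best = run = 1
    for a, b in zip(hand, hand[1:]):
        run = run + 1 if a[1] == b[1] else 1
        best = max(best, run)
    return best

# Empirical distribution of the statistic over many genuinely random deals.
trials = 100_000
dist = Counter(longest_same_suit_run(random.sample(DECK, 13)) for _ in range(trials))
print(sorted(dist.items()))   # long runs (8 or more) essentially never occur by chance

An all-clubs hand like the one quoted elsewhere in this thread scores 13 on this statistic, which is why a test of this kind would flag it even though its probability equals that of any other permutation.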
Origenes: Nice summary! PaV
Kitcher: You take a standard deck of cards and deal 13 to yourself. What is the probability that you get exactly those cards in exactly that order? The answer is one in 4 x 10^21…But you did, and you have witnesses to testify that your records are correct.”
Miller throws a 3 with a fair die.
Kitcher: "What is the probability that you get a 3?"
Miller: "1/6."
Kitcher: "But you did it in one single throw!"
Miller: "Indeed. Let's try it again!"
Miller throws a 5 with a fair die.
Kitcher: "What is the probability that you get a 5?"
Miller: "1/6."
Kitcher: "But you did it in one single throw!"
Miller: "It must be my lucky day! Let's try it again!"
Miller throws a 2 with a fair die.
Kitcher: "What is the probability that you get a 2?"
Miller: "1/6."
Kitcher: "But you did it in one single throw!"
Miller: "Wow! Let's try it again!"
..... Origenes
daveS:
No, order is definitely important. Otherwise you’re talking about combinations of cards, and there are “only” 635013559600 rather than about 4 x 10^21 of those.
Again, this is a minor point. However, the numbers are: 1/52 x 1/51 x 1/50 x . . . x 1/40 = 3.95 x 10^21. Which card is first, or last, doesn't matter, if the cards aren't being replaced.
Well, I just don’t know what we disagree on. If you can point to something which I have said that is factually incorrect, then please do so.
Well, you haven't really said what you think Kitcher's quote means. You wrote this:
Yes. I don’t think we have any disagreement. This is all consistent with what the Kitcher quote that I responded to, as far as I can tell.
Well, what do you think Kitcher was saying? It's not clear from the context of the quote. But Rossiter's take is that Kitcher is saying that Behe is wrong, and that getting a hand of 13 cards is a highly probable event. Rossiter then goes on to say this:
This is asinine, because it misses the entire thrust of the probability argument. When you deal yourself thirteen cards, you have no idea what your hand will be. And, it doesn’t matter in any meaningful way. You don’t have to get a certain hand. You simply get thirteen cards. Now, the thrust of the probability argument is to say that you must get a particular hand (thirteen cards in a particular order). You can try that ‘til the cows come home.
As a start, do you agree with Rossiter's critique of Kitcher? PaV
Rossiter puts it like this:
When you deal yourself thirteen cards, you have no idea what your hand will be. And, it doesn’t matter in any meaningful way. You don’t have to get a certain hand. You simply get thirteen cards. ....
Rossiter also notices that Kitcher leaves out the pre-specification. He seems to imply that odds are irrelevant at this point.
... Now, the thrust of the probability argument is to say that you must get a particular hand (thirteen cards in a particular order). You can try that ‘til the cows come home.
Indeed, with a pre-specification we find ourselves in a completely different situation.
If your life depended on you dealing yourself a king of hearts, 10 of spades, 3 of clubs, jack of clubs, 9 of diamonds, queen of hearts, 4 of clubs, 8 of clubs, 10 of hearts, king of diamonds, 6 of diamonds, queen of diamonds, and an ace of spades—in that order—you are a dead man. If you dealt a hand every second for the next 1,000,000,000,000,000 years, you would (on average) get that hand once. That’s how the argument is supposed to work.
- - - - - - Kitcher's trick seems to be that he fabricates a hypothetical situation where a pre-specified hand has been dealt against all odds. Suppose you win the lottery .... Again Kitcher:
You take a standard deck of cards and deal 13 to yourself. What is the probability that you get exactly those cards in exactly that order?
What he asks us to do is: (1) Suppose the pre-specification of a particular hand. (2) Suppose that the dealt hand matches the pre-specification. IOWs 'suppose you win the lottery'. Next Kitcher asks us:
What is the probability that you get exactly those cards in exactly that order? The answer is one in 4 x 10^21…But you did, and you have witnesses to testify that your records are correct.”
Suppose you win the lottery. What were the odds? Low. Yeah, but you did it anyway! And I am sure Kitcher can play this game all day long with every hand he deals himself. Did Kitcher invent a way to beat the odds every time? Is this Nobel Prize stuff? Well only if you are willing to accept his 2 suppositions ... Origenes
DS:
I doubt that my decisions would be strongly correlated with whether the sequences were actually hand chosen or not.
In my hypothetical, doing this is exactly your task. You must do your best to make the correlation as strong as possible. Do you really think there is no quantitative, mathematical strategy for helping you? Are you not curious about whether one might exist? Intuitively, I'd suppose that the odds of the first hand being randomly drawn are close to 1 / 1. I'd also suppose that the odds of the second hand being randomly drawn are minuscule, perhaps even as small as 1 / 4 x 10^21. Let's say I chose these as starting points. Can you offer anything to demonstrate that my intuition is definitively wrong? Phinehas
to Origenes: I haven't actually paid any attention to the part about Kitcher, Miller, et al. My apologies if my contribution hasn't been useful or on target. jdk
Phinehas,
You don’t get to know anything else. You have to decide based on the cards alone. Everything before the cards get displayed on the screen is a black box to you.
That puts me in a tough spot. It's then essentially a psychological question, as humans are notoriously bad at distinguishing between randomly chosen sequences and hand chosen sequences. I would probably just go with my gut reaction each draw, guessing what percentage of people would identify the sequence as hand chosen. I doubt that my decisions would be strongly correlated with whether the sequences were actually hand chosen or not. daveS
Jdk @42
Jdk: If in fact all spades comes up, the probability that it indeed did come up is now 1, because there it is. The probability that would happen in respect to the past is one out of S, but the probability that it did happen is 1, because it did happen. However, if this is the issue I’m not sure how meaningful this is.
The issue is important because Kitcher, Miller & co attempt to use the confusion over "but it did happen!" to argue against any ID arguments from probability — see #19
Kitcher: “Consider the humdrum phenomenon suggested by [Michael] Behe’s analogy with bridge. You take a standard deck of cards and deal 13 to yourself. What is the probability that you get exactly those cards in exactly that order? The answer is one in 4 x 10^21…But you did, and you have witnesses to testify that your records are correct.”
Origenes
DS:
But this is an interesting example. If I actually were working for the casino, I would guess that more gamblers would say the first sequence was drawn randomly than would say the second one was drawn randomly, so that would affect the betting odds. But I can’t say anything quantitative. I would need to run some trials first in order to set the odds to ensure that this is a profitable game for the casino. I would also want to know on average what percentage of the sequences were to be hand selected and what sort of procedure would be used.
For the purposes of this hypothetical scenario: You don't get to know anything else. You have to decide based on the cards alone. Everything before the cards get displayed on the screen is a black box to you. Additionally, the casino simply takes a small percentage of the total money bet. Your job is to make the odds as close to fair as possible so that as many people as possible bet, confident that they are betting on a fair game where the potential rewards make as much sense as possible when compared to the risks. Phinehas
And try as I might, I don't see what you guys are arguing about. It seems like everyone is in agreement about the number of possible outcomes, the probability of any one specific hand coming up, and the probability of some hand coming up, which is 1. But perhaps the distinction is this. Before the draw, the probability of all spades coming up in order when the draw is made (in the future) is 1 out of S, where S = 4 x 10^21. If in fact all spades comes up, the probability that it indeed did come up is now 1, because there it is. The probability that would happen in respect to the past is one out of S, but the probability that it did happen is 1, because it did happen. However, if this is the issue I'm not sure how meaningful this is. Also PaV writes, "The "order" is not important. The cards are interchangeable. But this is only a minor point." The order is important in this discussion: this is about permutations, not combinations. If order isn't significant, then a hand (having all spades, say, but not knowing or caring whether they were dealt in order) is a combination, and there are only about 6.35 x 10^11 possibilities. But perhaps this is not what PaV means. jdk
PaV,
First, the “order” is not important. The cards are interchangeable. But this is only a minor point.
No, order is definitely important. Otherwise you're talking about combinations of cards, and there are "only" 635013559600 rather than about 4 x 10^21 of those.
I hope you see the great difference between these two cases.
Well, I just don't know what we disagree on. If you can point to something which I have said that is factually incorrect, then please do so. daveS
Another issue is what percentage of hands would include some type of sequence that we would recognize as meaningful. Would the sequence 1 club, 4 diamonds, 7 hearts, etc., be meaningful because the numbers went up by 3 every time? What if the cards laid out my birthdate and time? That would be meaningful to me but not to anyone else. What if someone pre-specified a code that no one else knew anything about, with on average two cards (such as 4 diamonds and jack spades) representing a letter such as A, and a hand then spelled out a specific message that only the person with the code recognized? My point is that there are some gray areas in distinguishing hands which look purely random from ones which look as though they might have been selected on purpose. jdk
daveS:
However, when you “specify” a particular hand ahead of time, then instead of there being 10^21 possible hands that can be dealt, now there is only ONE hand that meets the requirement we’re setting ahead of time.
You don't seem to be dealing with the importance of this statement. Please take another look at my last post. daveS:
What I take the passage to be saying is that anytime I draw 13 cards at random from a deck, the chance of exactly those cards coming up in exactly that order is about 1 in 4 x 10^21. Indeed that was the probability of this particular permutation coming up, let’s say from the point of view of a time before the draws.
First, the "order" is not important. The cards are interchangeable. But this is only a minor point. However, the highlighted portion of what you wrote seems to indicate you haven't digested my prior statement. Let me try to be more detailed. The prior and posterior probability of a "hand" is the same. This is the point you seem to be stressing. We agree. What changes is not the probability, but the number of "acceptable outcomes," for want of a better phrase. If ANY permutation is acceptable, then there are 4 x 10^21 possible permutations that are "acceptable outcomes." And if the probability of EACH "outcome" is 1 in 4 x 10^21, then the "likelihood" of being dealt a hand is 1. However, if only ONE possible permutation is an "acceptable outcome," which would be the case if you, ahead of time, "specified" the ONE "outcome" you deemed "acceptable," then there is only a 1 in 4 x 10^21 chance that you will be dealt this singular "acceptable outcome." I hope you see the great difference between these two cases. PaV
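PaV's two cases can be checked directly by simulation. A minimal sketch in Python, where the "pre-specified" target is simply one permutation fixed before the trials begin (nothing here depends on card faces, so plain integers stand in for the 52 cards):

import random

DECK = list(range(52))              # 52 distinct cards; their faces are irrelevant here
target = random.sample(DECK, 13)    # one permutation fixed in advance

trials = 100_000
got_some_hand = 0
matched_target = 0
for _ in range(trials):
    hand = random.sample(DECK, 13)  # an ordered 13-card deal
    got_some_hand += 1              # some permutation is always dealt
    matched_target += (hand == target)

print(got_some_hand / trials)       # 1.0 -- "any permutation is acceptable"
print(matched_target / trials)      # 0.0 in practice; the true chance is 1 in about 3.95 x 10^21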
Origenes,
Without a pre-specification, it is guaranteed that something comes up which has a 1 in 4 x 10^21 chance of coming up.
As long as we're clear on what the "something" is, this is accurate. We are guaranteed that one of approximately 4 x 10^21 equally likely outcomes (permutations) will come up. daveS
Phinehas,
Consider the following hypothetical scenario. ***
First thing, I would tell the casino that they should fire me and hire a statistician. An amateur math enthusiast probably would not do well in this situation. :) But this is an interesting example. If I actually were working for the casino, I would guess that more gamblers would say the first sequence was drawn randomly than would say the second one was drawn randomly, so that would affect the betting odds. But I can't say anything quantitative. I would need to run some trials first in order to set the odds to ensure that this is a profitable game for the casino. I would also want to know on average what percentage of the sequences were to be hand selected and what sort of procedure would be used. daveS
Without a pre-specification, it is guaranteed that something comes up which has a 1 in 4 x 10^21 chance of coming up. That doesn't have a nice ring to it. Origenes
Phinehas,
You mean other than the one I and others are proposing? When the pre-specification is 13 random cards, it makes no sense to pretend that the calculation of probabilities involves 13 specific cards. What is wrong with the above statement?
Kitcher clearly states that the probability of getting those exact cards in that exact order is 1 in 4 x 10^21. I knew what he was talking about, and didn't think he was "pretending" something was the case when it wasn't. As the wider context that Origenes posted shows, there is a question of whether we could accuse Kitcher of committing a fallacy, but in the passage I responded to, that's not apparent.
If what I am saying lines up well with the experience of surprise and disbelief that accompanies probabilities, why the reluctance to consider it? Why not follow the evidence where it leads?
Where does it lead? I already stated that I didn't believe you got that sequence using a random card generator. That is somewhat difficult to justify is all I am saying. We could talk about collections of sequences that, for example, have a very short description (all clubs, say) and how they are "naturally" specified in some way, but that leads to the question of why I'm singling out "having a very short description" as especially significant. daveS
PaV,
Yes, all of this is inarguable, and the same numbers I used. However, when you "specify" a particular hand ahead of time, then instead of there being 10^21 possible hands that can be dealt, now there is only ONE hand that meets the requirement we're setting ahead of time. So, now, instead of there being 10^21 opportunities to get a "hand," with the chance of each hand being 1/10^21, we have just 1 opportunity of getting that "hand," while there are still 10^21 possible "hands." Thus, when the "hand" is specified ahead of time, the "odds" of getting that particular "hand" dealt to you are no longer 1 but 1 in 10^21. This, too, is inarguable.
Yes. I don't think we have any disagreement. This is all consistent with what the Kitcher quote that I responded to, as far as I can tell. daveS
DS: Consider the following hypothetical scenario. A Las Vegas casino is testing a new game of chance. It works like this: A list of 13 cards is displayed on a television screen. Players then place bets on whether the cards were randomly drawn or hand selected and ordered by an intelligent agent. If they guess correctly, they win. If they guess incorrectly, they lose. The casino hires you as a mathematician to set the betting odds for each hand that is to be displayed to players. You do not get to see whether the hand was randomly drawn (RD) or hand selected (HS). You must simply set your odds based only on the hand itself. Once you determine the betting odds, this gets displayed on the screen beneath the hand, and the betting begins. The first hand you see is:
7 of clubs, 6 of clubs, 2 of diamonds, king of spades, 2 of clubs, 3 of clubs, 5 of diamonds, ace of hearts, 10 of hearts, 6 of clubs, king of diamonds, 5 of spades, 3 of diamonds
What do you set the betting odds at for this hand (RD/HS)? And why? Did you use any math? What was it? The second hand you see is:
Ace of clubs, King of clubs, Queen of clubs, Jack of clubs, 10 of clubs, 9 of clubs, 8 of clubs, 7 of clubs, 6 of clubs, 5 of clubs, 4 of clubs, 3 of clubs, 2 of clubs
What do you set the betting odds at for this hand (RD/HS)? And why? Did you use any math? What was it? Phinehas
DS:
Phin: So, do you believe I got the outcome described via a true random card generator? If you don’t doubt this because it defies probabilities, then why do you doubt it?
DS: I don’t believe you got it using a random card generator, even though I know the two permutations have exactly the same chance of coming up. My reasons for believing you did not use a random card generator are at least partly psychological, though.
This seems a bit vague. Maybe it is something you could ruminate on a bit more? No need for any science-stopper, right?
DS: I don’t know there is a truly objective way to justify these reasons.
You mean other than the one I and others are proposing? When the pre-specification is 13 random cards, it makes no sense to pretend that the calculation of probabilities involves 13 specific cards. What is wrong with the above statement? If what I am saying lines up well with the experience of surprise and disbelief that accompanies probabilities, why the reluctance to consider it? Why not follow the evidence where it leads? Phinehas
daveS:
The probability of getting that particular permutation was 1 in 4 x 10^21. The probability of getting 13 cards was presumably 1. Agreed? This is really inarguable.
Yes, all of this is inarguable, and the same numbers I used. However, when you "specify" a particular hand ahead of time, then instead of there being 10^21 possible hands that can be dealt, now there is only ONE hand that meets the requirement we're setting ahead of time. So, now, instead of there being 10^21 opportunities to get a "hand," with the chance of each hand being 1/10^21, we have just 1 opportunity of getting that "hand," while there are still 10^21 possible "hands." Thus, when the "hand" is specified ahead of time, the "odds" of getting that particular "hand" dealt to you are no longer 1 but 1 in 10^21. This, too, is inarguable. PaV
PaV: You used some program to “deal you” 13 cards. Did you specify which 13 cards you wanted ahead of time? No. Nothing was “specified.” .... IOW, you were dealt a hand.
You've nailed it. Similarly, in my example #23, a point on the wall was hit. After the fait accompli 'they' want to talk inappropriate hypotheticals and probabilities. Origenes
PaV,
Nothing was "specified." When you were dealt your hand, there were 10^21 different possibilities for dealing your hand (nothing being "specified" in advance). And the chance of getting the hand you were dealt was 1 in 10^21. So, 10^21 x 1/10^21 = 1. IOW, you were dealt a hand.
Edit: I initially misread your post. I'm not claiming that particular permutation was specified beforehand, as I've pointed out. The probability of getting that particular permutation was 1 in 4 x 10^21. The probability of getting 13 cards was presumably 1. Agreed? This is really inarguable. daveS
Origenes,
More at home in cosmology?
No, not really. I'm not equipped to get into a detailed discussion of either field. daveS
daveS:
Yes, and that’s exactly the type of event that I (and Kitcher) are talking about. In other words, drawing cards from the top of a well-shuffled deck and getting the exact cards I listed in my post #13, and in the exact same order.
You used some program to "deal you" 13 cards. Did you specify which 13 cards you wanted ahead of time? No. Nothing was "specified." When you were dealt your hand, there were 10^21 different possibilities for dealing your hand (nothing being "specified" in advance). And the chance of getting the hand you were dealt was 1 in 10^21. So, 10^21 x 1/10^21 = 1. IOW, you were dealt a hand. PaV
Correction---I guess it's actually Behe's probability calculation for a scenario Doolittle came up with. daveS
daveS,
DaveS: Yes, I think citing the sharpshooter fallacy is the “obvious” ID response.
I have read about it some time ago. Anyway, this is my personal version of it.
DaveS: I would have to know something about biology ...
More at home in cosmology? :) Lushkin wrote this:
For example, in cosmology the required specification is an objectively understandable configuration of the physical laws and constants of the universe. Not just any improbable configuration will do. You need one that allows life to exist. The vast vast majority of configurations don’t yield any or all of the following: matter, heavy elements, molecules, galaxies, stars, solar systems, habitable planets, or even elements like carbon. So it’s not hard to understand the specification required for cosmic design: you need a configuration that produces a life-friendly universe. Thus, the laws of the universe exhibit high CSI.
The point being that here there is, arguably, a real existing target in advance (a life-friendly universe), contrary to Kitcher's 13 random cards, which is a "fake-target". Origenes
Origenes, Yes, I think citing the sharpshooter fallacy is the "obvious" ID response. I would have to know something about biology (which I don't) and Doolittle's actual calculation to know whether it's an effective one. daveS
daveS If one shoots an arrow at a large enough wall, it will certainly hit a point on that wall. Note that it makes little sense talking about probabilities at this stage, since hitting a point on the wall is a sure thing. We can only say that the probability that some point on the wall will be hit is 1. Now, after a particular point X is being hit, the naturalists ask us the following question:
“Suppose that point X was a target in advance, what would have been the odds to hit specifically point X and not one of the other points on the wall?”
Note, that this is pure fantasy land. Point X was not a target in advance. Suggesting that it was is akin to painting a bull’s eye around it after the fact. But, being good sports, we accept, arguendo, that point X was a target in advance and we answer: “Given that scenario, the odds of hitting point X are extremely low.” Now, the naturalists (Miller, Kitcher and many others) play what they imagine is a trump card:
“But it did happen! So unlikely things happen all the time! Your ID-probabilities are irrelevant!”
But, it did not happen. That is, the scenario did not happen. It is not true that point X was a target in advance and was subsequently hit. What actually happened was that some point on the wall was hit with 100% certainty. After the fact, each point that was hit would have been ‘promoted’ by naturalists as ‘hypothetical target in advance point X’. They would have painted a bull's eye around every point that was hit. It’s trickery. Nothing unlikely took place. Origenes
PaV,
Now we ask, what is the chance that blindfolded, you will touch the screen 13 times, and then “match” the list you drew up ahead of time? Now it’s 1 in 10^21.
Yes, and that's exactly the type of event that I (and Kitcher) are talking about. In other words, drawing cards from the top of a well-shuffled deck and getting the exact cards I listed in my post #13, and in the exact same order. Even after the fact, it's correct to say that the chance of drawing that particular permutation, with no extra information given, is/was 1 in 4 x 10^21. That is true whether the draws occur in the past, present, or future. I think it's reasonable to ask what this has to do with ID or Doolittle's calculation, but this event is what Kitcher accurately described in the quote. daveS
daveS: Pardon my stepping in on your discussion with Phinehas, but I think some clarification of the following comment is necessary:
Yes, the chance of getting 13 randomly drawn cards is essentially 1. And the chance of getting the 13 cards in the order I described is about 1 in 4 x 10^21. I think we both agree on that?
Yes, I agree that the chance of getting the drawn cards is essentially 1, since this depends only on a fair dealer who knows how to deal a deck of cards to four persons. But the chance of getting the 13 cards in the order you describe is not 1 in 4 x 10^21. It, too, is 1. Here's what I mean. Imagine you have a large computer screen in front of you that shows all the cards in a deck in the rectangular area of the screen. Now, imagine that each time you touched a card on the screen, it went into a "hand," with the card now being eliminated from the screen, and the remaining cards randomly scrambled across the computer screen. Let's imagine you touch the screen 13 times. Then, you have a 13-card hand. Now, imagine you do this "blindfolded." There you have it: 13 randomly selected cards. However, the "chance" of this happening is not 1 in 10^21, but basically 1---since all you have to do is touch the screen 13 times.
Here's where ID comes in: what if, then, you list out ahead of time a "hypothetical" hand of 13 cards? You can list whatever cards you like--any group of 13 will do. Now we ask, what is the chance that blindfolded, you will touch the screen 13 times, and then "match" the list you drew up ahead of time? Now it's 1 in 10^21.
Amino acids make up proteins. There are around 20 of them. Proteins perform specific biological functions, and the order of those amino acids is critical. How can this happen randomly? Sir Fred Hoyle writes that he dismissed Darwinism as a youngster because of its slim chance of overcoming these odds. He then calculates them for cytochrome c, a protein involved in cell division. You see, NS can't tinker if cell reproduction doesn't happen. So how did this universally needed protein arise? It's about 100 a.a.s long. PaV
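The arithmetic PaV gestures at for a roughly 100-residue protein is the same "one pre-specified target" calculation as the card case. A minimal sketch of just that arithmetic, which takes no position on whether an exact single sequence is the right biological model (that is precisely what the thread disputes):

from math import log10

# Chance of hitting one exact pre-specified sequence of 100 positions,
# each filled independently from 20 amino acids (the naive model only).
p = 20.0 ** -100
print(p)                 # about 1.27e-130
print(100 * log10(20))   # about 130.1, i.e. roughly 1 chance in 10^130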
Origenes, I do agree with the statement that highly improbable things happen all the time. Whether Kitcher's argument against ID has any merit, I will leave to the biologists. daveS
DaveS
daveS I just don’t see that Kitcher is saying anything very extraordinary in that passage, although as I said the wording could be better.
Here is the context:
Kitcher also butchers the ID argument from probability. The argument has traditionally been that the likelihood of producing a particular feature, given only naturalistic (non-intelligent) causes, is prohibitively small. For example (as Kitcher quotes), Doolittle has argued that the likelihood of getting a random assembly of the blood-clotting cascade is one in 1,230,000,000,000,000,000. But, Kitcher finds this line of argument absurd, not because the probability calculation is wrong, but because highly improbable things happen all of the time (an argument I deal with in my second book). Kitcher argues, “Consider the humdrum phenomenon suggested by [Michael] Behe’s analogy with bridge. You take a standard deck of cards and deal 13 to yourself. What is the probability that you get exactly those cards in exactly that order? The answer is one in 4 x 10^21…But you did, and you have witnesses to testify that your records are correct.”
Do you agree with Kitcher? What he does is: he throws a dart at the wall, then paints a bull's eye around it, and proudly exclaims that his throw was right in the center of the bull's eye. Origenes
Phinehas,
The pre-specification was 13 randomly drawn cards. Which is actually super-likely, as born out by your example where you requested 13 random cards from a card generator and got exactly that.
Yes, the chance of getting 13 randomly drawn cards is essentially 1. And the chance of getting the 13 cards in the order I described is about 1 in 4 x 10^21. I think we both agree on that?
Given Kitcher’s argument, the probability of me getting this sequence in this order is obviously no different than the probability of you getting the cards you got in the order you got them, right? So, nothing to see here? Nothing to explain? Let’s just move along? So, do you believe I got the outcome described via a true random card generator? If you don’t doubt this because it defies probabilities, then why do you doubt it?
I don't believe you got it using a random card generator, even though I know the two permutations have exactly the same chance of coming up. My reasons for believing you did not use a random card generator are at least partly psychological, though. I don't know there is a truly objective way to justify these reasons.
I just don’t get why you and Kitcher would even attempt to deny something that seems patently obvious to me.
Hm. What do you think we are denying? I just don't see that Kitcher is saying anything very extraordinary in that passage, although as I said the wording could be better. I think it's overwhelmingly likely that the sequence I drew had never been drawn in the history of the universe (and perhaps never will be again). So by that standard, it's a very unlikely event. daveS
If you had a deck of cards with 52 different symbols that themselves contained no pattern, then every 13 card hand would look just the same to us: we would recognize no hand as particularly surprising. Using a deck of cards that contains a pattern that we already consider meaningful complicates this discussion, I think. jdk
DS:
Yes, but I don’t think the Kitcher passage contradicts this. It doesn’t claim that some pre-specified super-unlikely event occurred. At least I didn’t interpret it to say this.
It wasn't pre-specified super-unlikely. But it was pre-specified. The pre-specification was 13 randomly drawn cards. Which is actually super-likely, as borne out by your example where you requested 13 random cards from a card generator and got exactly that. A practical way to think about probabilities is to consider how surprised one might be at the outcome or even how difficult it would be to believe it was randomly generated. Your outcome was not the least bit surprising. Nor is it difficult to believe that it was randomly generated. I tried a similar program online and got the following cards in the order specified:
Ace of clubs, King of clubs, Queen of clubs, Jack of clubs, 10 of clubs, 9 of clubs, 8 of clubs, 7 of clubs, 6 of clubs, 5 of clubs, 4 of clubs, 3 of clubs, 2 of clubs
Given Kitcher's argument, the probability of me getting this sequence in this order is obviously no different than the probability of you getting the cards you got in the order you got them, right? So, nothing to see here? Nothing to explain? Let's just move along? So, do you believe I got the outcome described via a true random card generator? If you don't doubt this because it defies probabilities, then why do you doubt it? I'm pretty sure you have a strong math background, so I don't think anything I'm saying is news to you. I just don't get why you and Kitcher would even attempt to deny something that seems patently obvious to me. Still, I'm open to being corrected if I've got it all wrong. Phinehas
Phinehas,
You can call the result of that random draw a “particular permutation,” and it certainly will be one after the fact. But at the time of the drawing, the specification is still 13 random cards.
Yes, but I don't think the Kitcher passage contradicts this. It doesn't claim that some pre-specified super-unlikely event occurred. At least I didn't interpret it to say this. daveS
Origenes:
What does that tell us about the state of philosophy?
Yikes! Even so, come quickly... Phinehas
DS: It is certainly true that the probability of drawing the following is quite low.
7 of clubs, 6 of clubs, 2 of diamonds, king of spades, 2 of clubs, 3 of clubs, 5 of diamonds, ace of hearts, 10 of hearts, 6 of clubs, king of diamonds, 5 of spades, 3 of diamonds
But until you have the specification defined as you've done above, the actual specification (rather than some fuzzy imagined one) is 13 randomly drawn cards. You can call the result of that random draw a "particular permutation," and it certainly will be one after the fact. But at the time of the drawing, the specification is still 13 random cards. And 13 random cards is what you will (very likely) get. What you will very likely not get is:
7 of clubs, 6 of clubs, 2 of diamonds, king of spades, 2 of clubs, 3 of clubs, 5 of diamonds, ace of hearts, 10 of hearts, 6 of clubs, king of diamonds, 5 of spades, 3 of diamonds
Phinehas
Phinehas,
When you use the phrase “particular permutation” above, in what way is the following not true? (particular permutation == 13 random cards)
Here's what I understand is supposed to happen. I just used some random card generator to deal 13 cards. The results:
7 of clubs, 6 of clubs, 2 of diamonds, king of spades, 2 of clubs, 3 of clubs, 5 of diamonds, ace of hearts, 10 of hearts, 6 of clubs, king of diamonds, 5 of spades, 3 of diamonds
What I take the passage to be saying is that anytime I draw 13 cards at random from a deck, the chance of exactly those cards coming up in exactly that order is about 1 in 4 x 10^21. Indeed that was the probability of this particular permutation coming up, let's say from the point of view of a time before the draws. Obviously the conditional probability of drawing that permutation, given that it was drawn, is 1. Presumably that's not what either of us is calculating, however, since in your case you mentioned how a heart attack could prevent the draws from being completed. daveS
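For reference, the "random card generator" step daveS describes amounts to a single ordered sample without replacement; a minimal sketch (the card-naming scheme is just for readability):

import random

DECK = [rank + " of " + suit
        for suit in ("clubs", "diamonds", "hearts", "spades")
        for rank in ("ace", "2", "3", "4", "5", "6", "7", "8",
                     "9", "10", "jack", "queen", "king")]

hand = random.sample(DECK, 13)   # an ordered draw of 13 cards, no replacement
print(hand)                      # one of the roughly 3.95 x 10^21 possible permutations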
Phinehas: How does this joker have any business writing a book? Seriously. The stupid is blowing my mind.
You are absolutely correct. But here is the scary part:
He earned his B.A. in Mathematics/History and Philosophy of Science from Christ's College, Cambridge in 1969, and his PhD in History and Philosophy of Science from Princeton University in 1974, where he worked closely with Carl Hempel and Thomas Kuhn. ... Kitcher is past president of the American Philosophical Association. In 2002, Kitcher was named a fellow of the American Academy of Arts and Sciences, and he was awarded the inaugural Prometheus Prize from the American Philosophical Association in 2006 in honour of extended achievement in the philosophy of science. He has trained a number of prominent philosophers of science, including Peter Godfrey-Smith at the City University of New York Graduate Center, Kyle Stanford at the University of California at Irvine, Michael R. Dietrich at Dartmouth College, and Bruce Glymour at Kansas State University, and C. Kenneth Waters at the University of Calgary and Michael Weisberg at the University of Pennsylvania as undergraduates. His appointments and service have included: Editorial Board, Philosophy of Science, 1985–1994. Editor-in-Chief, Philosophy of Science, 1994–1999. Governing Board, Philosophy of Science Association, 1987–1991. Member NIH/DOE Working Group on the Ethical, Legal, and Social Implications of the Human Genome Project, 1995–1997. Representative of the American Philosophical Association to Section L of the American Association for the Advancement of Science, 1995–1998. Member, Board of Officers, American Philosophical Association, 1996–99. Philosophy Referee for John Simon Guggenheim Memorial Foundation, 1994— [wiki]
What does that tell us about the state of philosophy? Origenes
DS: When you use the phrase "particular permutation" above, in what way is the following not true? (particular permutation == 13 random cards) Will the above not hold true until you've defined "particular permutation" to mean something other than 13 random cards? The strength of the argument relies entirely on "particular permutation" meaning something other than 13 random cards, but until you define those 13 cards in some way other than the 13 random cards that are to be drawn, they are still 13 random cards. And as long as they are 13 random cards, it is legitimate to calculate the probability of drawing 13 random cards. Besides, which makes more sense? Every time you get dealt 13 random cards you've instantiated a 1.0 probability of getting 13 random cards? Or every time you get dealt 13 random cards you've instantiated a highly improbable occurrence of getting your "particular permutation?" Bridge players are constantly defying improbability with each and every hand? Or the entirely mundane and probable is happening again and again. Phinehas
Phinehas,
What particular permutation? That’s the point. You are saying words that imply a particular permutation without actually giving a particular permutation. Instead, the actual specification defining your “particular permutation” is 13 random cards.
The particular permutation that you just dealt yourself, whatever it is. As he states, "You take a standard deck of cards and deal 13 to yourself". I will stress again that the probability is of getting that particular permutation, given only that 13 cards will be dealt, certainly not given that that hand was dealt. The way the author phrased it is not ideal. daveS
Folks, Philip Johnson nailed it in his reply to Lewontin:
For scientific materialists the materialism comes first; the science comes thereafter. [Emphasis original] We might more accurately term them "materialists employing science." And if materialism is true, then some materialistic theory of evolution has to be true simply as a matter of logical deduction, regardless of the evidence.
[--> notice, the power of an undisclosed, question-begging, controlling assumption . . . often put up as if it were a mere reasonable methodological constraint; emphasis added. Let us note how Rational Wiki, so-called, presents it:
"Methodological naturalism is the label for the required assumption of philosophical naturalism when working with the scientific method. Methodological naturalists limit their scientific research to the study of natural causes, because any attempts to define causal relationships with the supernatural are never fruitful, and result in the creation of scientific "dead ends" and God of the gaps-type hypotheses."
Of course, this ideological imposition on science that subverts it from freely seeking the empirically, observationally anchored truth about our world pivots on the deception of side-stepping the obvious fact since Plato in The Laws Bk X, that there is a second, readily empirically testable and observable alternative to "natural vs [the suspect] supernatural." Namely, blind chance and/or mechanical necessity [= the natural] vs the ART-ificial, the latter acting by evident intelligently directed configuration. [Cf Plantinga's reply here and here.] And as for the god of the gaps canard, the issue is, inference to best explanation across competing live option candidates. If chance and necessity is a candidate, so is intelligence acting by art through design. And it is not an appeal to ever- diminishing- ignorance to point out that design, rooted in intelligent action, routinely configures systems exhibiting functionally specific, often fine tuned complex organisation and associated information. Nor, that it is the only observed cause of such, nor that the search challenge of our observed cosmos makes it maximally implausible that blind chance and/or mechanical necessity can account for such.]
That theory will necessarily be at least roughly like neo-Darwinism, in that it will have to involve some combination of random changes and law-like processes capable of producing complicated organisms that (in Dawkins’ words) "give the appearance of having been designed for a purpose." . . . . The debate about creation and evolution is not deadlocked . . . Biblical literalism is not the issue. The issue is whether materialism and rationality are the same thing. Darwinism is based on an a priori commitment to materialism, not on a philosophically neutral assessment of the evidence. Separate the philosophy from the science, and the proud tower collapses. [Emphasis added.] [The Unraveling of Scientific Materialism, First Things, 77 (Nov. 1997), pp. 22 – 25.]
KF kairosfocus
DS: What particular permutation? That's the point. You are saying words that imply a particular permutation without actually giving a particular permutation. Instead, the actual specification defining your "particular permutation" is 13 random cards. Phinehas
Phinehas,
What is the probability that you get exactly those cards in exactly that order?
I think the two "exactly"s are important here. There are about 4 x 10^21 permutations of 13 cards chosen from a deck of 52. Therefore the chance of getting that particular permutation (given only that 13 cards will be dealt) is what the author quoted. daveS
Kitcher argues, “Consider the humdrum phenomenon suggested by [Michael] Behe’s analogy with bridge. You take a standard deck of cards and deal 13 to yourself. What is the probability that you get exactly those cards in exactly that order? The answer is one in 4 x 10^21…But you did, and you have witnesses to testify that your records are correct.”
Good grief. What cards in exactly what order? The probability that you'll get 13 random cards is nearly 1.0 (nearly, since I suppose you could have a heart attack mid-deal). How does this joker have any business writing a book? Seriously. The stupid is blowing my mind. Phinehas
But, as noted above, quite honestly, in terms of upsetting the bully pulpit, more is needed. News
Love the Berlinski quote. Thanks for sharing. Truth Will Set You Free
Evolution and Climate Change: Solid in that they are flexible enough to accommodate any reality. Andrew asauber
The "as well established as any of the major theories in contemporary science" chestnut always reminds me of Berlinski:
I disagree [with Paul R. Gross’ assertion] that Darwin’s theory is as “solid as any explanation in science.” Disagree? I regard the claim as preposterous. Quantum electrodynamics is accurate to thirteen or so decimal places; so, too, general relativity. A leaf trembling in the wrong way would suffice to shatter either theory. What can Darwinian theory offer in comparison?
Barry Arrington
