Intelligent Design

Multiverse Mavens Hoisted on Own Petard


Several factors are combining to increase belief (of the “faith” variety, not the “demonstrated fact” variety) in the multiverse among materialists. Two of these factors are relevant to ID at the biological and cosmological levels. At the biological level, materialists are beginning to understand that the probability that life arose by random material processes is so low (estimated in this article, written by materialists, to be on the order of 10^-1018) that infinite universes are required for it to have occurred, the implication being that we just happen to live in the ever-so-lucky universe where it all came together.

At the cosmological level, the probability that the fine tuning of the universe necessary for the existence of life arose by sheer coincidence is so low that again the multiverse is invoked to provide infinite “probabilistic resources” to do the job (see here).

Of course, there is another possible explanation for both the emergence of life and the fine tuning of the universe. These phenomena may be the results of acts of a super powerful being whom we might call God.

Obviously, the whole reason materialists have invoked the multiverse in the first place is to avoid resorting to agency to explain the emergence of life and cosmological fine tuning. But isn’t it obvious that, given the very premises invoked by materialists in the multiverse scenarios, we can just as easily conclude that God exists?

Here is how the logic runs: The materialist says, “Yes, the probability that life emerged through random material processes is vanishingly small, but in an infinite multiverse everything that is not logically impossible is in fact instantiated, and we just happen to live in the lucky universe where life was instantiated. Similarly, we happen to live in the Goldilocks universe (which, again, is one of infinite universes) where the physical constants are just right for the existence of life.”

But the theist can play this game too. “The existence of God is not logically impossible. In an infinite number of universes everything that is not logically impossible is in fact instantiated, and we just happen to live in one of those universes in which God is instantiated.”

I do not believe in the multiverse. The entire concept is a desperation “Hail Mary” pass in which logical positivists and their materialist fellow travelers are attempting to save a philosophical construct on the brink of destruction. The point is that materialists’ own multiverse premise leads to the conclusion that God exists more readily than the opposite conclusion. Ironically, far from excluding the existence of God, if the multiverse exists, God must also exist.

86 Replies to “Multiverse Mavens Hoisted on Own Petard”

  1. 1
    Mats says:

The multiverse seems “plausible” only if you are a materialist determined not to “allow a Divine Foot in the door”.

  2. 2

    Well said! Infinite universes is as crazy an idea as Hilbert’s Hotel.

  3. 3
    Heinrich says:

    I thought logical positivism died out years ago. Are there really any of them still running around?

  4. 4
    Collin says:

    Heinrich,

I don’t suppose you have something to say about the meat of Arrington’s article, do you?

    The multiverse could be real. I don’t know. But so could fairies.

  5. 5
    Barry Arrington says:

    Heinrich at [3]. LP’s are like the psycho in a bad slasher film series. Every time you think he’s dead he rises again to wreak more havoc.

  6. 6
    F2XL says:

    So we’re still up in arms about the prospect of an infinite multiverse? Assuming a functionalist interpretation of the mind/body problem, do I have any reason to trust my own thoughts if the article is true?

  7. 7
    Heinrich says:

    Can you point me to some contemporary logical positivists, Barry? I’m interested to see how they resurrected the Vienna Circle’s ideas, and how they deal with Popper, for example.

    Collin – I’m genuinely curious about logical positivists, which is why I asked.

  8. 8
    Bantay says:

    How interesting that the multiverse materialists are willing to endorse a mathematically miraculous universe but not a theologically miraculous universe.

Seems to me the multiverse scenario is not very helpful. Consider this: There is absolutely zero, zilch, nada positive evidence for a multiverse. Those who wish to avoid intelligent agency in origin-of-the-universe models may claim the same against design: no positive evidence for design of the universe. Wouldn’t that mean that both are on equal epistemological footing?

  9. 9
    Sooner Emeritus says:

    Barry Arrington,

    All statements about the objective probability of the universe (multiverse) being the way it is are absurd. That such statements come from the mouths of well known scientists does not make them meaningful. The philosopher Hugh Mellor has spoken particularly well on this matter. The gist is that we get to objective probability only with repeatable experiments, and there is no way to regard the universe as the outcome of a repeatable experiment. We have no empirical access to a universe generator, or the universe is not what we mean by “universe.”

    We can say, “What is the probability that process X yields outcome E?” But we cannot say meaningfully, “What is the probability that process Z results in universe U in which process X results in outcome E?” If we should develop a means of observing the hypothetical process Z, then the known universe will have expanded.

  10. 10
    JDH says:

    What I have always wanted to ask a believer in the multiverse ( for sake of discussion we will call him Richard):

Hello Richard, my name is John. You know, if there are an infinite number of universes, there probably are an infinite number of times when believer John comes up to materialist Richard and engages him in conversation about belief in God. Probably in most of these conversations, believer John will get around to asking non-believer Richard if he wants to change his mind and believe in God. Is this one of the universes where Richard says “YES”? If not, why not?

  11. 11
    Barry Arrington says:

Heinrich at [7], see “Reconsidering Logical Positivism”
    http://www.amazon.com/Reconsid.....0521624762

Like I said above, just when you think it’s dead, it rises from the grave.

  12. 12
    Sotto Voce says:

    Barry,

You completely fail to understand the logical structure of multiverse arguments. The claim is not that everything that is logically possible will be instantiated somewhere in an infinite multiverse. An infinite sample space does not guarantee that every possible event has non-zero probability. A simple example: say you’re picking a number at random from the set of all even numbers. This set has infinitely many members, yet there is no chance you’re ever going to get a 5. Similarly, an infinite multiverse does not need to include every logically possible world.
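The zero-probability point can be checked with a quick simulation (a minimal sketch; a large finite set of even numbers stands in for the infinite one):

```python
import random

random.seed(1)
# Draw uniformly from a large finite stand-in for the set of even
# numbers: {0, 2, 4, ..., 999998}. However many draws you take, the
# outcome 5 never occurs -- it has probability zero even though the
# sample space is unbounded in the idealized case.
draws = [2 * random.randrange(500_000) for _ in range(100_000)]
print(sum(1 for d in draws if d == 5))  # 0: no draw is ever 5
```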

    So what is the usual structure of multiverse theories? The claim is that you have an infinite number of universes instantiating different values for fundamental physical constants and different initial conditions. Crucially, the fundamental laws of physics will not differ between the different universes in this ensemble. So the multiverse does not contain every logical possibility. It contains (at most) every universe that can be generated according to our fundamental laws by varying the constants. It is quite plausible that no such universe will contain a God that fits anything like the Christian conception. For instance, since all the universes in the ensemble are bound by the same physical laws, they could not generate an omnipotent being (since such a being would, by definition, not be bound by physical laws).

    Your conception of a multiverse model seems to be based solely on Tegmark’s Ultimate Ensemble, which is sort of close to what you describe. Tegmark’s position is an extreme minority position even among defenders of the anthropic principle.

I should say that I am pretty unsympathetic to the multiverse as a purported solution to the fine-tuning problem, mainly because I do not think fine-tuning is a genuine problem.

  13. 13
    Sotto Voce says:

Also, if you think the multiverse picture is somehow associated with logical positivism, then you do not understand logical positivism. Positivists were vehemently critical of theories without clear empirically verifiable consequences. This is precisely the charge made against multiverse models: it is unclear what would constitute empirical confirmation of them. So it is the critics of the multiverse, rather than its proponents, who are making positivist arguments.

  14. 14
    Sooner Emeritus says:

    Sotto Voce,

    I also do not see fine-tuning as a genuine problem, and I’d like to hear more from you on the matter.

  15. 15
    Upright BiPed says:

    Note to self:

    Education cannot kill rationality. I refuse to believe it. There are too many who are a contradiction to the idea. So, what can? What is the force that binds such educated people to utter stupidity?

    Let me know when I find out.

  16. 16
    Mark Frank says:

    Sooner Emeritus, Sotto Voce

    I agree. The fine tuning problem is not a real problem and it is meaningless to talk about the probability of our universe having the laws it does.

Also delighted to see Hugh Mellor is still going – he taught me as an undergraduate in 1969. Although I don’t accept his distinction between epistemic and physical probability. I think an epistemic probability is just a subjective estimate of a physical probability.

  17. 17
    Heinrich says:

    Thank you for your response, Barry. I’m genuinely surprised that logical positivism is being resurrected. However, from the link, I get the impression it’s being done by philosophers, rather than by anyone involved in physics. So how are they relevant to the many worlds interpretation?

  18. 18
    Heinrich says:

    Just to add – my impression (and this really isn’t my area) is that the many worlds interpretation isn’t open to empirical testing, i.e. it isn’t verifiable. So surely logical positivists would exclude it from science?

    As I write, I’m not an expert, so I assume I’m missing something – all explanations welcome!

  19. 19
    Collin says:

    Heinrich,

    sorry. I thought you were being sarcastic and snippy. It’s hard not to misinterpret things online sometimes.

  20. 20
    Nakashima says:

    Sooner @8,

    The gist is that we get to objective probability only with repeatable experiments, and there is no way to regard the universe as the outcome of a repeatable experiment.

    At best, we can say that Dr Koonin was engaging in a Bayesian, not frequentist, probabilistic argument, correct?

  21. 21
    Heinrich says:

    That’s OK Collin. I know things get a little partisan around here, so each side tends to automatically think the worst of the other.

  22. 22
    pelagius says:

    Barry,

    Your argument depends on the assumption that in an infinite multiverse, all logical possibilities will be instantiated. As Sotto Voce pointed out, that assumption is false.

    Yet even if we grant your assumption arguendo, your conclusion does not follow, for the arguments you cite, one for the atheist and one for the theist, are not logically equivalent.

We know that life exists in our universe, so we can conclude with certainty that the physical constants of our universe permit life.

    We don’t know that God exists in our universe. Therefore we cannot conclude that “we just happen to live in one of those universes in which God is instantiated.” If your argument were correct, the most we could conclude is that God exists in some universes. We still wouldn’t know that he exists in ours.

    Even without these two problems, your argument would still fail, because it assumes that God is part of the universe rather than its creator. A God who is merely part of the universe is not the God whose existence you are trying to demonstrate.

  23. 23
    Sotto Voce says:

    Sooner @ 14,

    My problem with the fine tuning argument is the same as yours, I think. I do not understand the rationale for placing a uniform probability distribution over the space of all possible values of the fundamental constants.

    The principle of indifference is a very useful rule for setting one’s epistemic probabilities, but I regard this as an a posteriori fact about our universe. I see no justification for applying the principle to an ensemble of universes.

  24. 24
    Sotto Voce says:

    Heinrich,

    There are in fact conceivable tests of contemporary versions of many-worlds interpretations of quantum mechanics, but these tests are basically impossible to realize given our current technology. The basic idea is that since the many-worlds interpretation denies that the wave function collapses, there will be possible measurements that would have different results if a collapse interpretation were true.

    However, it is important to note that the many-worlds interpretation is significantly different from multiverse theories. The former posits a branching structure to reality, so the separate “worlds” are not completely isolated. They have common pasts. In fact, there is even a very small probability that branching worlds will reconverge. The multiverse, on the other hand, consists of multiple universes that are completely causally segregated.

  25. 25
    Upright BiPed says:

    We are so very fortunate that there exist farmers and ranchers to bring forth food; caregivers to seek the humbled; engineers to make things work.

    It provides plenty of support for other fanciful endeavors.

  26. 26
    vjtorley says:

    Hi everyone,

    First of all, I’d like to thank Barry for a very interesting post.

    Although I think that there are some excellent arguments that can be made for God’s existence even if there is a multiverse, I would have to concur with Pelagius’ criticisms in #22 of Barry’s version of the argument that God must exist, even in a multiverse: (i) as formulated, Barry’s argument shows that God exists in some universes, but not that He exists in ours; (ii) the problem with saying that we live in a universe in which God is instantiated is that it makes God part of this universe, rather than its Creator.

    I’d also agree with Sotto Voce’s comment in #12 that Barry’s “multiverse” argument for God’s existence implicitly assumes Tegmark’s version of the multiverse, in which all logical possibilities are instantiated in some universe.

    Having said that, one could try to reformulate Barry’s argument, using the notion of a world, defined as a set of entities that are able to interact causally with one another. Thus saying that God exists “in” this world simply means that God interacts with some entities belonging to this world, not that God is part of some bigger entity. Additionally, since God is by definition a necessary Being, it follows that God cannot exist in just one world; if He exists in one world, then God exists in all worlds. One could then argue:

    (1) The existence of God is not logically impossible.
    (2) In Max Tegmark’s multiverse, everything that is not logically impossible is in fact instantiated.
    (3) Therefore God is instantiated in some world in Max Tegmark’s multiverse.
    (4) But if God is instantiated in some world in Max Tegmark’s multiverse, He is instantiated in all worlds (by definition of the term “God”).
    (5) Therefore God exists in our world, if we live in Max Tegmark’s multiverse.

    This is of course a version of the ontological argument. However, an atheist could question premise (1). As we cannot fully understand the nature of God, we don’t know for certain that God’s existence is even logically possible. All we can say is that God’s logical impossibility has not been demonstrated.

    The other problem with the argument is that Tegmark could make a smart counter-move: he could rule out the existence of entities that exist in more than one world, on the grounds that if any such entity existed, it could then interact with beings in more than one world, which would violate the definition of a world as a causally closed network of interacting entities. (If entity E can interact with A and B in world 1, and with C and D in world 2, then A, B, C and D all belong in the same world, since they can interact “via” E – which violates the initial stipulation that world 1 and world 2 are separate.) Thus Tegmark could argue that if there are multiple worlds, then there is no necessary Being: in other words, God’s existence would be logically impossible.

What does this prove? Absolutely nothing, except the truth of the old philosophical adage that one person’s modus ponens is another’s modus tollens. Or, as computer scientists are fond of saying: garbage in, garbage out. Starting with false philosophical premises, you are liable to derive false conclusions.

    In the next post, I’ll argue that in fact, a good argument for God’s existence can be made, even if there is a multiverse.

  27. 27
    bornagain77 says:

    I think a very solid argument can be made for the existence of God in this universe.

    As pointed out yesterday, from the double slit experiment, it is impossible for 3 dimensional material reality to give rise to the conscious observation that causes the infinite “higher dimensional” wave to collapse to its “uncertain” 3-D particle state, for it is impossible for 3-D material reality to give rise to that which it is absolutely dependent on for its own reality in the first place.
    Yet the “cause” of our conscious observation is not sufficient, in and of itself, to explain the “effect” of universal collapse, of the higher dimensional waves to their 3-D states, for each central conscious observer in the universe. Thus a higher cause of a universal consciousness must exist to sufficiently explain the cause for the effect we witness for universal wave collapse.

    I believe that line of thought is very similar to this ancient line of reasoning:

    “The ‘First Mover’ is necessary for change occurring at each moment.”
    Michael Egnor – Aquinas’ First Way
    http://www.evolutionnews.org/2.....first.html

    I found that centuries old philosophical argument, for the necessity of a “First Mover” accounting for change occurring at each moment, to be validated by quantum mechanics. This is since the possibility for the universe to be considered a “closed system” of cause and effect is removed with the refutation of the “hidden variable” argument. i.e. There must be a sufficient transcendent cause (God/First Mover) to explain the quantum wave collapse to the “uncertain” 3D effect for “each moment” of the universe.

    This following study solidly refutes the “hidden variable” argument that has been used by materialists to try to get around the Theistic implications of this instantaneous “spooky action at a distance” found in quantum mechanics.

    Quantum Measurements: Common Sense Is Not Enough, Physicists Show – July 2009
    Excerpt: scientists have now proven comprehensively in an experiment for the first time that the experimentally observed phenomena cannot be described by non-contextual models with hidden variables. http://www.sciencedaily.com/re.....142824.htm

    (of note: hidden variables were postulated to remove the need for “spooky” forces, as Einstein termed them—forces that act instantaneously at great distances, thereby breaking the most cherished rule of relativity theory, that nothing can travel faster than the speed of light.)

    Though the lack of a hidden variable finding is very nice “icing on the cake”, the logic that material reality cannot precede the consciousness it is dependent on for its own reality is what solidly seals the argument for the Theistic position.

  28. 28
    vjtorley says:

    Hi everyone,

    I would strongly recommend Robin Collins’ Fine-Tuning Website at http://home.messiah.edu/~rcoll.....ing/ft.htm , for anyone who wants to read a robust defense of the fine-tuning argument. It contains a number of valuable essays on the subject of fine-tuning.

    For a good overview, I’d recommend Collins’ paper, “A Theistic Perspective on the Multiverse Hypothesis,” especially the section on the multiverse at http://home.messiah.edu/~rcoll.....k.htm#_1_5 which argues that a multiverse generator would still need to be designed, and the concluding section at http://home.messiah.edu/~rcoll.....k.htm#_1_6 which argues that the underlying beauty of the laws of nature is powerful evidence of their having been designed.

    For readers who don’t have a strong philosophical background, Collins’ paper, “God, Design, and Fine-Tuning” might be more helpful, as it presents the basic fine-tuning argument, along with Collins’ responses to some of the standard objections to the argument.

    To address methodological concerns that some contributors (Sooner Emeritus, Sotto Voce, Mark Frank and Nakashima) have raised concerning fine-tuning, I’d recommend Collins’ paper, “How to Rigorously Define Fine-Tuning.”

    Finally, the paper, “Evidence for Fine-Tuning” is also well worth reading, as it highlights the strongest pieces of scientific evidence for fine-tuning and warns against some scientifically faulty fine-tuning arguments that are still circulating in the literature.

    Sorry, but that’s all I have time for at the moment. I shall return within the next 24 hours.

  29. 29
    Sooner Emeritus says:

    Nakashima (20):

    At best, we can say that Dr Koonin was engaging in a Bayesian, not frequentist, probabilistic argument, correct?

    It seems that Koonin regards the probability of a possible macroscopic history with property A to be the ratio of the (finite) number of possible histories with property A to the (finite) number of possible histories. See “Probability / chance / randomness” in Table 1, especially. This is frequentism, and it assumes that all possible macroscopic histories are equiprobable. But, to adapt Sotto Voce’s remarks in 23, why should they be equiprobable? The Principle of Indifference is an outdated dictum to modelers. It does not behoove nature to exhibit indifference. Some modelers would introduce a measure of descriptive complexity on histories, and would say that histories of lower complexity are of higher probability than histories of higher complexity.

    In the appendix, Koonin starts out by making particular assumptions about the properties of an O-region, and this is essentially a uniformity assumption. Furthermore, these assumptions are based on observations of our own O-region, and I don’t know why it should be regarded as typical. So I don’t see anything objective in the probabilities he estimates.

    Koonin ends the appendix by saying,

    The model considered here is not supposed to be realistic by any account. It only serves to illustrate the difference in the demands on chance for the origin of different versions of the breakthrough system (see Fig. 1) and hence the connections between these versions and different cosmological models of the universe. [emphasis mine]

    I think his belief that he’s getting at something objective is evident here. And, as I’ve said, he’s not owning up to his assumptions.

  30. 30
    Mark Frank says:

    #28

    vjtorley

Collins’ “How to Rigorously Define Fine-Tuning” is interesting. It is a pity it is incomplete. However, at first glance I can’t see that it adds anything new to the main philosophical debate.

In general, when someone attempts to estimate the probability of a physical constant supporting life, it comes down to a ratio:

L/R

where L is the range of values that support life and R is the range of “possible” values that constant might take. At one level there are two problems with this.

    1) The definition of R. What is the possible range of values for G?

    2) What scale to use for L and R? Why not logL/logR or L^2/R^2? Assuming it is possible to define R then each of these transformations would greatly change the ratio and it is possible to find a transformation to make the ratio almost any value you want.
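The scale-dependence in (2) can be made concrete with a quick calculation (a minimal sketch; the values of L and R are purely illustrative, not figures anyone in the thread has committed to):

```python
import math

# Hypothetical life-permitting range L and total "possible" range R.
L, R = 3e3, 1e40

linear = L / R                             # ~3e-37
log_scale = math.log10(L) / math.log10(R)  # ~0.087
squared = L**2 / R**2                      # ~9e-74

# The same physical situation yields "probabilities" spanning over
# 70 orders of magnitude, depending on which scale is treated as
# the natural one.
print(linear, log_scale, squared)
```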

    Collins tries to get to grips with these but not very convincingly.

However, I think these problems just reflect a deeper problem with the nature of probability, one that underlies much of the ID movement: the idea that you can legitimately use the principle of indifference in almost any context to come up with an epistemic probability of an outcome, and that this has some kind of objective justification independent of any frequency-based evidence. Dembski uses it all the time and I am sure it is wrong.

The problems with the principle of indifference are well-known. Most importantly, you can come up with different results for the probability of an outcome depending on what you are indifferent to. This is reflected in problem 2 above. Are we indifferent to the value of G or log G or G^2 or what? It may appear obvious that you can use the principle of indifference to estimate the probability of a die coming down with six uppermost, but actually this is based on our massive experience of dice-like objects. If in practice sixes came down more often in the long run, the principle of indifference would have to concede to the reality of actual frequencies.

    In the end classical/logical and epistemic probabilities are best guesses at hypothetical frequentist probabilities. An epistemic probability is subjective. It is my attempt to estimate, given certain conditions H (which may represent my current relevant knowledge or a hypothesis), how frequently we would get outcome X were we able to reproduce H many, many times.

    Once it is phrased like that then we realise how little we can say about the probability of a universe having certain fundamental values. We have no knowledge at all of what H is or how it relates to X.

  31. 31
    tgpeeler says:

    Sotto @ 12 “The claim is that you have an infinite number of universes instantiating different values for fundamental physical constants and different initial conditions.”

I’m curious about the concept of an “infinite” number of universes. This phrase gets thrown around all the time but the very idea of it is incoherent to me. How can there be an actual infinite number of anything physical? If these universes are physical, and at least one of them is (the one I live in), then I can always add one more to the count of however many there are. Ergo, no infinite number of universes.

    I’m even more curious about how anyone can actually consider the “multiverse” as a solution or explanation for anything but I don’t expect an answer to that.

  32. 32
    pelagius says:

    tgpeeler:

I’m curious about the concept of an “infinite” number of universes. This phrase gets thrown around all the time but the very idea of it is incoherent to me. How can there be an actual infinite number of anything physical? If these universes are physical, and at least one of them is (the one I live in), then I can always add one more to the count of however many there are. Ergo, no infinite number of universes.

    tgpeeler,

    Your problem is not with physical infinity but rather with the concept of infinity itself.

    To see this, consider the following argument which contains the same mistake as the one you presented:

Mathematicians claim that the set of integers is infinite. This strikes me as incoherent. How can there be an actual infinite number of integers? I can always add one more to the count of however many there are. Ergo, no infinite number of integers.

    Do you see the problem?

  33. 33
    Upright BiPed says:

    Pelagius,

If I give you 5 integers (2, 4, 19, 23, 7), then there are five numbers you have been given. What is the total number of integers?

  34. 34
    vjtorley says:

    Mark Frank (#30)

    Thank you for your post. It seems to me that Dr. Robin Collins has already answered many of the questions you ask. In particular, I would refer you to the article “God and Fine-tuning” by Collins at http://home.messiah.edu/~rcoll.....tuning.rtf . It carefully avoids making inflated claims about fine-tuning, and even hoses down some of the poorly supported claims in the literature, after putting forward six solid cases of fine-tuning.

    Let’s talk about gravity. You ask:

    What is the possible range of values for G?

    Here’s what Collins has to say:

    Now, the forces in nature can be thought of as spanning a range of G_0 to (10^40.G_0), at least in one widely-used dimensionless measure of the strengths of these forces (Barrow and Tipler, pp. 293 – 295). (Here, G_0 denotes the strength of gravity, with (10^40.G_0) being the strength of the strong force.)….

    [A]s shown in section B of the appendix, stars with life-times of more than a billion years (as compared to our sun’s life-time of ten billion years) could not exist if gravity were increased by more than a factor of 3000. This would have significant intelligent life-inhibiting consequences.

Of course, an increase in the strength of gravity by a factor of 3000 is a lot, but compared to the total range of strengths of the forces in nature (which span a range of 0 to (10^40.G_0) as we saw above), this still amounts to a one-sided fine-tuning of approximately one part in 10^36. On the other hand, if the strength of gravity were zero (or negative, making gravity repulsive), no stars or other solid bodies could exist. Thus, zero could be considered a lower bound on the strength of gravity. Accordingly, the intelligent-life-permitting values of the gravitational force are restricted to the range 0 to ((3×10^3).G_0), which is about one part in 10^36 of the total range of forces. This means that there is a two-sided fine-tuning of gravity of at least one part in 10^36.
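The arithmetic in that last step is easy to verify (a minimal check, taking Collins’ ranges at face value):

```python
# Life-permitting upper bound for gravity (3000 * G_0) versus the
# total span of force strengths (10^40 * G_0), both in units of G_0,
# as quoted from Collins above.
life_permitting = 3e3
total_range = 1e40

ratio = life_permitting / total_range
print(ratio)  # 3e-37, i.e. "about one part in 10^36" as Collins says
```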

    Next, you object to the use of a linear scale, when comparing the range of life-permitting values, L, with the range of allowable values, R.

    What scale to use for L and R? Why not logL/logR or L^2/R^2? Assuming it is possible to define R then each of these transformations would greatly change the ratio and it is possible to find a transformation to make the ratio almost any value you want.

    This is equivalent to asking: why should we look at G? Why not log G or G^2? Here’s what Robin Collins wrote in answer to your question, in an earlier 1998 paper at http://www.discovery.org/a/91 :

    The answer to this question is to require that the proportion used in calculating the probability be between real physical ranges, areas, or volumes, not merely mathematical representations of them. That is, the proportion given by the scale used in one’s representation must directly correspond to the proportions actually existing in physical reality. As an illustration, consider how we might calculate the probability that a meteorite will fall in New York state instead of somewhere else in the northern, contiguous United States. One way of doing this is to take a standard map of the northern, contiguous United States, measure the area covered by New York on the map (say 2 square inches) and divide it by the total area of the map (say 30 square inches). If we were to do this, we would get approximately the right answer because the proportions on a standard map directly correspond to the actual proportions of land areas in the United States. On the other hand, suppose we had a map made by some lover of the East coast in which, because of the scale used, the East coast took up half the map. If we used the proportions of areas as represented by this map we would get the wrong answer since the scale used would not correspond to real proportions of land areas. Applied to the fine-tuning, this means that our calculations of these proportions must be done using parameters that directly correspond to physical quantities in order to yield valid probabilities. In the case of gravity, for instance, the gravitational constant G directly corresponds to the force between two unit masses a unit distance apart, whereas U does not. (Instead, U corresponds to the square of the force.) Thus, G is the correct parameter to use in calculating the probability. (Emphases mine – VJT.)

    Finally, you object to the epistemic principle of indifference. You write:

    In the end classical/logical and epistemic probabilities are best guesses at hypothetical frequentist probabilities. An epistemic probability is subjective. It is my attempt to estimate, given certain conditions H (which may represent my current relevant knowledge or a hypothesis), how frequently we would get outcome X were we able to reproduce H many, many times.

    I think this line of argumentation is unduly pessimistic regarding what we can and cannot know. To cite Collins again from a recent paper (“The Teleological Argument: An Exploration of the Fine-Tuning of the Universe” in The Blackwell Companion to Natural Theology):

    According to the restricted Principle of Indifference, when we have no reason to prefer any one value of a variable p over another in some range R, we should assign equal epistemic probabilities to equal ranges of p that are in R, given that p constitutes a “natural variable.” A variable is defined as “natural” if it occurs within the simplest formulation of the relevant area of physics. When there is a range of viable natural variables, then one can only legitimately speak of the range of possible probabilities, with the range being determined by probabilities spanned by the lower and upper bound of the probabilities determined by the various choices of natural variables…

    Several powerful general reasons can be offered in defense of the Principle of Indifference if it is restricted in the ways explained earlier. First, it has an extraordinarily wide range of applicability. As Roy Weatherford notes in his book, Philosophical Foundations of Probability Theory, “an astonishing number of extremely complex problems in probability theory have been solved, and usefully so, by calculations based entirely on the assumption of equiprobable alternatives [that is, the Principle of Indifference]” (1982, p. 35). Second, in certain everyday cases, the Principle of Indifference seems the only justification we have for assigning probability. To illustrate, suppose that in the last 10 minutes a factory produced the first 20-sided die ever produced (which would be a regular icosahedron). Further suppose that every side of the die is (macroscopically) perfectly symmetrical with every other side, except for each side having different numbers printed on it. (The die we are imagining is like a fair six-sided die except that it has 20 sides instead of six.) Now, we all immediately know that upon being rolled the probability of the die coming up on any given side is one in 20. Yet we do not know this directly from experience with 20-sided dice, since by hypothesis no one has yet rolled such dice to determine the relative frequency with which they come up on each side. Rather, it seems our only justification for assigning this probability is the Principle of Indifference: that is, given that every side of the die is macroscopically symmetrical with every other side, we have no reason to believe that it will land on one side versus any other. Accordingly, we assign all outcomes an equal probability of one in 20.
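The icosahedral-die intuition is easy to check against frequencies, with the caveat that a simulation presupposes the very equiprobability it exhibits (the random generator is fair by construction), so it illustrates rather than justifies the Principle of Indifference:

```python
import random

random.seed(0)  # fixed seed for reproducibility
N = 200_000
counts = [0] * 20
for _ in range(N):
    counts[random.randrange(20)] += 1  # roll a fair 20-sided die

freqs = [c / N for c in counts]
print(min(freqs), max(freqs))  # both hover near 1/20 = 0.05
```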

    I hope that answers your questions.

  35.
    pelagius says:

    Upright Biped,

    There are infinitely many integers.

  36.
    Sotto Voce says:

    Upright Biped,

    I hope you don’t mind me answering on pelagius’s behalf. The total number of integers is aleph null.

    Tgpeeler,

    You say you can “always add one more to the count”. I suspect you are making the same error that Barry makes in his post. An infinite set doesn’t have to include everything. It can leave things out, so you can add things to the set. Going back to my earlier example, the set of all even numbers does not contain 5, yet it is an infinite set. I could add 5 to it. So the mere fact that something can be added to a set does not show that it is not infinite. The curious thing about an infinite set is that adding one more member does not actually change the size of the set. Transfinite arithmetic is weird.
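The point about unioning an element into an infinite set can be made concrete by modeling a possibly-infinite set as a membership predicate rather than as a finite collection (an illustrative sketch; the function names are my own):

```python
# An infinite set can be represented by its membership test.
def is_even(n):
    """Membership test for the (infinite) set of even integers."""
    return n % 2 == 0

def union(pred, extras):
    """Union of a predicate-defined set with a finite set of extras."""
    return lambda n: pred(n) or n in extras

# The set of even integers with 5 adjoined: {x : x/2 in Z} ∪ {5}.
evens_plus_five = union(is_even, {5})

print(is_even(5), evens_plus_five(5), evens_plus_five(4))  # False True True
```

Adding 5 yields a different (still infinite) set, and both sets have the same cardinality, aleph null.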

  37.
    Upright BiPed says:

    Pelagius,

    Correct. So, there is no number that represents infinity; any number you sample can be increased – which was exactly TGP’s point.

    Sotto,

    Aleph null is a set; a set is an ensemble of distinct objects. The number 5 cannot be added to the infinite set of even integers. That particular set does not include the number of DeSoto hubcaps either.

  38.
    Sooner Emeritus says:

    Mark Frank,

    A physicist commenting at another blog offered a fine slogan: “No probability without process.” As best I can recall from my reading in the philosophy of probability, everyone agrees that physical probability should be treated as the relative frequency of outcomes of a repeatable experiment. A repeatable experiment is, among other things, a known process.

    Only with a time machine could one observe an origin of life. Without a time machine, all one can do is to use information in the present to guess a process in the past, and then assign probability according to the guess (model). This probability is certainly not a physical entity. It derives from the modeler’s supposition as to what was going on 4 billion years ago.

    It is the idea that you can legitimately use the principle of indifference in almost any context to come up with an epistemic probability of an outcome and that this has some kind of objective justification independently of any kind of frequency based evidence. Dembski uses it all the time and I am sure it is wrong.

    It’s not just the unbridled application of the Principle of Indifference, but more generally the fallacy of treating epistemic probabilities as physical probabilities. Dembski rejects natural causation of an easily described, prehistorical event in favor of intelligent design (increase of physical probability) when a hypothetical process has an outcome in the event (“target”) with very low subjective probability. It is clearly wrong to infer that an intelligence intervened to change physical probability when the deficit in probability is merely subjective.

  39.
    R0b says:

    Upright Biped:

    So, there is no number that represents infinity;

    Transfinite numbers do.

    Aleph null is a set;

    Aleph-null is a number — a transfinite cardinal number.

    The number 5 cannot be added to the infinite set of even integers.

    Sure it can:
    {x:x/2∈Z} ∪ {5}

  40.
    vjtorley says:

    Sooner Emeritus (#38)

    Thank you for your post. Be careful what you wish for – you just might get what you want.

    “No probability without process.” That’s your epistemic maxim. Let’s see where that leads.

    Improbability is the converse of probability. “No probability without process” entails “No improbability without process.”

    This means that many anti-theistic arguments from the evil in the world, of the form: “There’s probably no God, because if there were a God, then He would have prevented evil X, Y and Z” are invalid, because they don’t specify HOW God would have or should have done so. “No improbability without process.”

    It also means that anti-theistic arguments of the form: “There’s probably no God, because all the intelligent beings I’ve seen are highly fragile, multi-part complex entities” are invalid, because they are based on induction – which is also rendered problematic by your maxim, because mere repetition of a phenomenon tells us nothing about the underlying process, if there is one. And since, by your maxim, there can be “No probability without process,” we can’t calculate the probability of an intelligent Being’s existing which is simple (i.e. NOT complex), as most classical theists believe.

    Finally, not everything can happen by virtue of some underlying process. Some natural processes are basic – they just happen. Quarks exchange gluons. How do they do that? I don’t know. That’s just what a quark does. Does that mean we can’t make probabilistic statements about quarks?

    At the very least, you will have to concede that anti-theistic probability arguments are as problematic as the theistic arguments you criticize. And I respectfully submit that your maxim does not supply a warrant for induction, or for the degree of trust you should place in your own cognitive processes.

    To borrow a phrase from Barry, skeptics are indeed hoisted on their own petard.

  41.
    Sotto Voce says:

    Upright,

    As R0b said, aleph null is not a set, it is a cardinality. Just like the set {a, b, c, d} has cardinality 4, the set of all natural numbers has cardinality aleph null. It is just as much of a number as 4 is.

    As for your claim that 5 cannot be added to the infinite set of even numbers, I take it you mean that it cannot be added without changing the set. That is, once 5 is added, the resulting set is no longer the set of even numbers. This is true, of course, but irrelevant to tgpeeler’s point. He/she said that we can always add another world to the set of worlds that are ostensibly part of the multiverse. My interpretation of this claim was that there are logically possible universes that are not physically possible, so there are universes “left out” of the multiverse. Of course, adding such a universe to the set will change the set, just like adding 5 to the set of even numbers, but why does this matter? Maybe I have misinterpreted tgpeeler’s point (it was a bit opaque), in which case I would be glad to be corrected.

  42.
    Sotto Voce says:

    Perhaps I should have been slightly more precise in my last comment. I said that aleph null is not a set, but a number of developments of set theory define cardinal numbers as sets (for instance, 0 might be defined as the empty set, 1 as the set containing the empty set, 2 as the set containing 1 and the empty set, and so on). If you define cardinalities in this way then, yes, aleph null is a set, but so is 4. It remains the case that aleph null is just as much of a number as 4.

  43.
    Upright BiPed says:

    ROb,

    Infinity is not a number, it is a concept.

    The question asked had to do with a measurable number of physical objects.

    You can visit the question in comment #31. Perhaps you read set theory into the question.

    I don’t.

  44.
    Mark Frank says:

    vjtorley

    You will not be surprised to know that I disagree with Collins in all three regards. He expresses his arguments clearly but they are hardly new.

    I am going to duck the first one. It leads to some difficult discussions about what is a “possible” range for a physical constant and it is not necessary to win this debate to prove my point.

    The second and third are two sides of the same coin. To say that we don’t know what scale to use when comparing possible ranges of a physical constant is mathematically identical to saying we don’t know what probability distribution function to use when considering a particular scale. Collins tries to justify the choice of one scale by writing:

    the proportion given by the scale used in one’s representation must directly correspond to the proportions actually existing in physical reality.

    but this is not as straightforward as he makes it sound.

    Consider the analogy of the map that he gives. It may well be true that a meteorite is just as likely to hit any square mile of the earth’s surface as any other. We can estimate that because we know that meteorites come from pretty much any direction and behave fairly chaotically when they hit the atmosphere (i.e. we learn from experience that this is a good scale to use). But consider a radiation based phenomenon from the Sun – say a burst of particular type of radiation hitting the earth’s surface. Such a burst is far more likely to hit a square mile near the equator than a square mile near the poles. In this case the relevant “physical reality” is the disc that the earth presents to Sun rather than the sphere of the earth’s surface.

    Something similar applies to G. It is true that G corresponds to the force between two unit masses a unit distance apart, but what claim has Force to be a physical reality? It is just a mathematically convenient way of treating the movement of objects. One could rewrite Newton’s laws based on a concept, call it rootforce, which is the square root of our current definition of force.

    In that case the rootforce between two objects of masses m1 and m2 = sqrt(G)*sqrt(m1*m2)/r

    It is much more convenient to deal with force than rootforce – but in what sense is it more of a physical reality?
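The rootforce reformulation is a perfectly consistent bookkeeping device, as a quick numerical check confirms (arbitrary illustrative masses and separation; SI units):

```python
import math

G = 6.674e-11               # Newtonian gravitational constant
m1, m2, r = 5.0, 3.0, 2.0   # arbitrary masses (kg) and separation (m)

force = G * m1 * m2 / r**2
rootforce = math.sqrt(G) * math.sqrt(m1 * m2) / r

# The two descriptions carry identical physical content:
print(math.isclose(rootforce**2, force))  # True
```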

    This leads then into the principle of indifference. Why are we indifferent to equal ranges of G but not indifferent to equal ranges of root G?

    Although Collins says:
    Several powerful general reasons can be offered in defense of the Principle of Indifference if it is restricted in the ways explained earlier.

    He actually offers just two reasons in the passage you quote.
    First, it has an extraordinarily wide range of applicability.

    Second,in certain everyday cases, the Principle of Indifference seems the only justification we have for assigning probability.

    I would argue that the principle of indifference is actually just a heuristic to help us estimate or guess the underlying objective probability. Given that, then the first defence is really irrelevant. If it is a good heuristic then it will work in a wide range of situations.
    The second case is more interesting. To take the example of the 20-sided die, Collins writes:

    given that every side of the die is macroscopically symmetrical with every other side, we have no reason to believe that it will land on one side versus any other. Accordingly, we assign all outcomes an equal probability of one in 20.

    But of course every side is not identical. They have different numbers on them. How do we know that this is not a relevant difference? Because of our experience with other objects. We have learned what factors determine the probability of an object falling with a particular face upwards. Indeed if you were superstitious you might well believe that whatever is inscribed on the face of the die does determine the probability of that face landing upwards.

    You might argue that we could remove the numbers. But there will always be differences. One side would be the one closest to you before throwing it, one side will be the one that was previously on top, etc. The “principle of indifference” is really “the principle of relevant indifference”. And in the end the judge of relevance is experience.

    Which comes back to the fact that we have no experience to fall back on when it comes to fundamental physical constants.

    I don’t entirely agree with “No probability without process.” It seems to me that you can substitute observation of repeated trials instead of process. But if you have no possibility of repeated trials then you must have some idea of the process to make even a guess at the probability.

  45.
    Nakashima says:

    Dr Torley,

    Thanks for the links to Collins’ site. I think I’ve been through some of his stuff before.

    From the work you directed my attention to,

    For now, we can think of the unconditional epistemic probability of a proposition as the degree of confidence or belief we rationally should have in the proposition; the conditional epistemic probability of a proposition R on another proposition S can roughly be defined as the degree to which the proposition S of itself should rationally lead us to expect that R is true.

    Since this epistemic probability is grounded in our notions of confidence and rationality, I found it hard to use in spots where Collins freely admits that he has no idea what the free parameters are at the level of QCD, but continues blithely on anyway to assert that ranges are meaningful at other levels. I’m sorry, I have no prior confidence or rational expectations at the QCD level. Collins does nothing to support an argument that because two forces are in a ratio of 1:10^40 it is appropriate to take 1 as lower bound and 10^40 as upper bound of the possible variation range in either one. Surely the range of variation is an attribute of the free parameter itself, not the happenstance ratio of one free parameter to another. Why should the ratio of the mass of the electron to the muon establish a range, and not the electron to the tau, by his reasoning?

    Since we are on the subject of parameters for life supporting universes, I’ll mention that thinkers as diverse as Ed Fredkin and Stephen Wolfram have floated the idea that our universe is actually fully quantized, and that the Theory of Everything could be a set of cellular automata rules. So I think it is appropriate to bring back the CA as life discussion that I had hoped to have with you at the tail of another thread. What say you?

  46.
    R0b says:

    vjtorley, quoting Collins:

    The answer to this question is to require that the proportion used in calculating the probability be between real physical ranges, areas, or volumes, not merely mathematical representations of them. That is, the proportion given by the scale used in one’s representation must directly correspond to the proportions actually existing in physical reality.

    Unfortunately, this approach is still arbitrary, as Mark Frank has pointed out. Suppose we take his meteor example and look at the size of the meteor rather than its impact location. Do we assume uniform probability over possible circumferences, surface areas, or volumes?
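The question has teeth: a distribution that is uniform in one parameterization is non-uniform in another. A small simulation of hypothetical meteor sizes (normalized to a maximum radius of 1; the setup is my own illustration) makes the point:

```python
import random

random.seed(1)
N = 100_000

# Probability that the radius exceeds 0.9, under two readings of "uniform":
p_radius = sum(random.random() > 0.9 for _ in range(N)) / N           # uniform in radius
p_volume = sum(random.random() ** (1/3) > 0.9 for _ in range(N)) / N  # uniform in volume

print(p_radius, p_volume)  # roughly 0.10 versus 0.27
```

Sampling uniformly in volume and converting back to radius (radius = volume^(1/3)) nearly triples the probability of a "large" meteor, so the choice of variable does real work.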

  47.
    vjtorley says:

    Mr. Nakashima, Mark Frank and R0b:

    I’ve just been having a look at Robin Collins’ latest 80-page essay (2009) on fine-tuning, which is available at http://commonsenseatheism.com/.....gument.pdf . It’s actually chapter 4, “The Teleological Argument: An Exploration of the Fine-Tuning of the Universe,” in The Blackwell Companion to Natural Theology, edited by William Lane Craig and J. P. Moreland. 2009. Blackwell Publishing Ltd. ISBN: 978-1-405-17657-6. (From what I hear, the essays in this volume are all of a high caliber; for skeptics, it’s definitely a “must-buy.”) I’d strongly recommend that you have a look at Collins’ latest essay, because his treatment of the subject is absolutely exhaustive. There is no objection to fine-tuning that he hasn’t anticipated, and he rebuts them all. I’ll deal with some of the objections that have been raised on this thread in the next few posts.

  48.
    vjtorley says:

    Mr. Nakashima (#45)

    Thank you for your post. You raise the very pertinent issue of how the range should be defined for a natural constant of physics:

    I’m sorry, I have no prior confidence or rational expectations at the QCD level. Collins does nothing to support an argument that because two forces are in ratio of 1:10^40 that it is appropriate to take 1 as lower bound and 10^40 as upper bound of the possible variation range in either one.

    Actually, I believe Collins takes 0 rather than 1 as the lower bound, but that’s a minor quibble.

    The underlying principle that Collins is appealing to here is that the domain of applicability for a physical concept (e.g. gravity, or the strong force) should be taken as the range of values over which we have no reason to believe that the concept would break down.

    As Collins points out in his essay, we do have reason to believe that at very high energies, our force-related concepts would break down. These high energies represent a natural cutoff point for the application of our force-related concepts. At these high energies, space and time are no longer continuous.

    Mr. Nakashima, in your post you mentioned the possibility that that our universe is actually fully quantized, and that the Theory of Everything could be a set of cellular automata rules. If I read him correctly, Robin Collins would not be at all fazed if this idea, which has been floated by Ed Fredkin and Stephen Wolfram, actually turned out to be correct.

    Anyway, without further ado, I’ll enclose a relevant quote from Collins’ lengthy essay. I hope it addresses your concerns about range:

    4.5. Examples of the EI region

    In the past, we have found that physical theories are limited in their range of applicability – for example, Newtonian mechanics was limited to medium-sized objects moving at slow speeds relative to the speed of light. For fast-moving objects, we require special relativity; for massive objects, General Relativity; for very small objects, quantum theory….

    There are good reasons to believe that current physics is limited in its domain of applicability. The most discussed of these limits is energy scale….

    The limits of the applicability of our current physical theories to below certain energy scales, therefore, translate into a limit on our ability to determine the effects of drastically increasing the value of a given force strength – for example, our physics does not tell us what would happen if we increased the strong nuclear force by a factor of 10^1,000

    Further, we have no guarantee that the concept of a force strength itself remains applicable from within the perspective of the new physics at such energy scales…

    Thus, by inductive reasoning from the past, we should expect not only entirely unforeseen phenomena at energies far exceeding the cutoff, but we even should expect the loss of the applicability of many of our ordinary concepts, such as that of force strength.

    The so-called Planck scale is often assumed to be the cutoff for the applicability of the strong, weak, and electromagnetic forces. This is the scale at which unknown quantum gravity effects are suspected to take place, thus invalidating certain foundational assumptions on which current quantum field theories are based, such as continuous space-time (see e.g. Peacock 1999, p. 275; Sahni & Starobinsky 1999, p. 44). The Planck scale occurs at the energy of 10^19 GeV (billion electron volts), which is roughly 10^21 times higher than the binding energies of protons and neutrons in a nucleus. This means that we could expect a new physics to begin to come into play if the strength of the strong force were increased by more than a factor of ~10^21…

    Effective field theory approaches to gravity also involve General Relativity’s being a low-energy approximation to the true theory. One common proposed cutoff is the Planck scale

  49.
    Toronto says:

    If there was a god, and there was a multi-verse, why bother fine-tuning any particular universe?

    We might simply have been put in a universe that is most beneficial to us.

  50.
    Sooner Emeritus says:

    vjtorley (40),

    You are using “probability” equivocally. I know that you hold a doctoral degree in philosophy, but you probably (pun, as well as example of usage, intended) have not studied the philosophy of probability. The article on interpretation of probability in the online Stanford Encyclopedia of Philosophy is an excellent place to start reading.

    I probably annoy people with my excessive use of highlighting, but here I go again:

    A physicist commenting at another blog offered a fine slogan: “No probability without process.” As best I can recall from my reading in the philosophy of probability, everyone agrees that physical probability should be treated as the relative frequency of outcomes of a repeatable experiment.

    The “probability” in your response to this was not physical. In fact, people do not assign numerical probabilities to propositions in that sort of discourse, and it’s not clear to me that there is a rational way to do so. The attachment of “probably” to a proposition is often just a rhetorical trick for sidestepping subjective preference and giving the impression of objectivity.

    As for hoisting me on my own petard, you’re making an error that is common at UncommonDescent, namely to jump to the conclusion that someone who objects to your argument objects to God. The fact is that I regard so-called proofs and disproofs of God as idolatry of reason.

    I should mention that Koonin comes perilously close to giving, and perhaps lays between the lines, an argument from improbability to an infinite multiverse. If he were to make this argument explicit, I would object to it just as I do IDists’ arguments from improbability. No one should question whether “I calls ’em like I sees ’em.”

  51.
    vjtorley says:

    Mark Frank (#44) and R0b (#46):

    You both raise some valid points regarding the scale that we should use when assessing whether a constant is fine-tuned. R0b writes:

    Suppose we take his [Collins’] meteor example and look at the size of the meteor rather than its impact location. Do we assume uniform probability over possible circumferences, surface areas, or volumes?

    Let’s go back to gravity, which I discussed in an earlier post (#34). In that post, I quoted Collins as saying that stars with life-times of more than a billion years (as compared to our sun’s life-time of ten billion years) could not exist if gravity were increased by more than a factor of 3000.

    Now, I can sympathize with skeptics who might object that being able to increase the strength of gravity up to 3000 times doesn’t sound like fine-tuning, unless you set it against the backdrop of a very large range (0 to 10^40). But actually, gravity is much, much more finely-tuned than that, as I found out when reading Collins’ latest essay:

    2.3.2 Fine-tuning of gravity

    There is, however, a fine-tuning of gravity relative to other parameters. One of these is the fine-tuning of gravity relative to the density of mass-energy in the early universe and other factors determining the expansion rate of the Big Bang – such as the value of the Hubble constant and the value of the cosmological constant. Holding these other parameters constant, if the strength of gravity were smaller or larger by an estimated one part in 10^60 of its current value, the universe would have either exploded too quickly for galaxies and stars to form, or collapsed back on itself too quickly for life to evolve. The lesson here is that a single parameter, such as gravity, participates in several different fine-tunings relative to other parameters.

    [Footnote: This latter fine-tuning of the strength of gravity is typically expressed as the claim that the density of matter at the Planck time (the time at which we have any confidence in the theory of Big Bang dynamics) must have been tuned to one part in 10^60 of the so-called critical density (e.g. Davies 1982, p. 89).] (Emphases mine – VJT.)

    Look at that. “One part in 10^60.” Amazing, isn’t it? Wouldn’t you agree that’s pretty finely-tuned?

    But Collins isn’t finished yet. He has anticipated Mark Frank’s objection (#44) that the fine-tuning “surprise factor” diminishes markedly, if one suitably redefines the natural constant:

    One could rewrite Newton’s laws based on a concept, call it rootforce, which is the square root of our current definition of force.

    In that case the rootforce between two objects of masses m1 and m2 = sqrt(G)*sqrt(m1*m2)/r

    It is much more convenient to deal with force than rootforce – but in what sense is it more of a physical reality?

    Robin Collins considers an even nastier example, involving 100-th roots:

    3.3.2 Restricted Principle of Indifference

    In the case of the constants of physics, one can always find some mathematically equivalent way of writing the laws of physics in which (W_r)/(W_R) [the ratio of the restricted range of life-permitting values for a physical constant to the total range of values over which the constant can vary – VJT] is any arbitrarily selected value between zero and one. For example, one could write Newton’s law of gravity as F = (U^100.m_1.m_2)/r^2, where U is the corresponding gravitational constant such that U^100 = G. If the comparison range for the standard gravitational constant G were from 0 to 10^100.G_0, and the life-permitting range were from 0 to 10^9.G_0, that would translate to a comparison range for U of 0 to 10.U_0 and a life-permitting range of 0 to (1.2).U_0, since (10.U_0)^100 = 10^100.G_0 and ((1.2).U_0)^100 ≈ 10^9.G_0. (Here G_0 is the present value of G and U_0 would be the corresponding present value of U.) Thus, using G as the gravitational constant, the ratio, (W_r)/(W_R), would be (10^9.G_0)/(10^100.G_0) = 1/10^91, and using U as the “gravitational constant,” it would be ((1.2).U_0)/(10.U_0), or 0.12, a dramatic difference! Of course, F = (U^100.m_1.m_2)/r^2 is not nearly as simple as F = (G.m_1.m_2)/r^2, and thus the restricted Principle of Indifference would only apply when using G as one’s variable, not U.

    …In the next section, however, we shall see that for purposes of theory confirmation, scientists often take those variables that occur in the simplest formulation of a theory as the natural variables. Thus, when there is a simplest formulation, or nontrivial class of such formulations, of the laws of physics, the restricted Principle of Indifference circumvents the Bertrand Paradoxes.
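Collins’ arithmetic in the passage above checks out; working in units of the present value G_0 (so G_0 = 1):

```python
# Units chosen so that G_0 = 1 (ranges expressed as multiples of the present value).
G_life, G_total = 1e9, 1e100    # Collins' life-permitting and comparison ranges for G
ratio_G = G_life / G_total      # 1/10^91

# U is defined by U**100 = G, so both range endpoints shrink by a 100th root.
U_life, U_total = G_life ** 0.01, G_total ** 0.01
ratio_U = U_life / U_total      # ≈ 0.12 — Collins' "dramatic difference"

print(ratio_G, ratio_U)
```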

    Next, Collins argues that refusal to recognize the restricted Principle of Indifference would have the unacceptable consequence that highly accurate scientific predictions do not count as a valid reason for accepting a scientific theory:

    3.3.3. Natural variable assumption

    Typically, in scientific practice, precise and correct novel predictions are taken to significantly confirm a theory, with the degree of confirmation increasing with the precision of the prediction. We shall argue, however, that the notion of the “precision” of a prediction makes sense only if one privileges certain variables – the ones that I shall call the natural variables. These are the variables that occur in the simplest overall expression of the laws of physics. Thus, epistemically privileging the natural variables as required by the restricted Principle of Indifference corresponds to the epistemic practice in certain areas of scientific confirmation; if scientists did not privilege certain variables, they could not claim that highly precise predictions confirm a theory significantly more than imprecise predictions….

    From examples like the one cited earlier, it is also clear that W_R precision also depends on the choice of the natural variable, as we explained for the case of fine-tuning. [W_R is the total range of possible values for a constant of nature – VJT.] So it seems that in order to speak of the predictive SD [significant digit] or W_R precision for those cases in which a theory predicts the correct experimental value for some quantity, one must assume a natural variable for determining the known predictive precision. One could, of course, deny that there exists any nonrelative predictive precision, and instead claim that all we can say is that a prediction has a certain precision relative to the variable we use to express the prediction. Such a claim, however, would amount to a denial that highly accurate predictions, such as those of QED [Quantum Electro-Dynamics – VJT], have any special epistemic merit over predictions of much less precision. This, however, is contrary to the practice of most scientists. In the case of QED, for instance, scientists did take the astounding, known precision of QED’s prediction of the g-factor [gyromagnetic ratio – VJT] of the electron, along with its astoundingly accurate predictions of other quantities, such as the Lamb shift, as strong evidence in favor of the theory. Further, denying the special merit of very accurate predictions seems highly implausible in and of itself. Such a denial would amount to saying, for example, that the fact that a theory correctly predicts a quantity to an SD [significant digit] precision of, say, 20 significant digits does not, in general, count significantly more in favor of the theory than if it had correctly predicted another quantity with a precision of two significant digits. This seems highly implausible.

    Collins is of course aware of cases where there may be some doubt as to which variable we should use:

    [C]onsider the case in which we are told that a factory produces cubes between 0 and 10 meters in length, but in which we are given no information about what lengths it produces. Using our aforementioned principle, we shall now calculate the epistemic probability of the cube being between 9 and 10 meters in length. Such a cube could be characterized either by its length, L, or its volume, V. If we characterize it by its length, then since the range [9,10] is one-tenth of the possible range of lengths of the cube, the probability would be 1/10. If, however, we characterize it by its volume, the ratio of the range of volumes is: [1,000 – 9^3]/1,000 = [1,000 – 729]/1,000 = 0.271, which yields almost three times the probability as for the case of using length. Thus, the probability we obtain depends on what mathematically equivalent variable we use to characterize the situation.
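    Collins’ two calculations are easy to verify. Here is a minimal sketch (mine, not Collins’) that reproduces both numbers:

```python
# Bertrand-style cube paradox: the epistemic probability that a cube
# produced by the factory is between 9 and 10 meters long depends on
# which variable we treat as uniformly distributed.

# Parameterizing by length L, uniform on [0, 10] meters:
p_length = (10 - 9) / (10 - 0)
print(p_length)  # 0.1

# Parameterizing by volume V = L^3, uniform on [0, 1000] cubic meters:
p_volume = (10**3 - 9**3) / (10**3 - 0)
print(p_volume)  # 0.271

# Same event, same factory, two different "probabilities".
```

    The paradox is precisely that both computations follow from the Principle of Indifference, yet they disagree.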

    Collins replies that in this particular case, there is a genuine ambiguity:

    In analogy to Bertrand’s Cube Paradox for the Principle of Indifference, in the case of the aforementioned cube it seems that we have no a priori way of choosing between expressing the precision in terms of volume or in terms of length, since both seem equally natural. At best, all we can say is that the predicted precision is somewhere between that determined by using length to represent the experimental data and that determined by using volume to represent the experimental data.

    Collins’ admission that there may be some “hard cases” in no way undermines his fundamental point, that “the notion of the ‘precision’ of a prediction makes sense only if one privileges certain variables – the ones that I shall call the natural variables… [I]f scientists did not privilege certain variables, they could not claim that highly precise predictions confirm a theory significantly more than imprecise predictions.”

    And returning briefly to the meteor example discussed by R0b above: I think it’s pretty clear that area is the relevant variable to consider.

  52. 52
    vjtorley says:

    Sooner Emeritus (#38, #49)

    Thank you for your posts. Like you, I tend to highlight a lot. It helps if one is reading a lengthy post – you can skim more quickly that way.

    First, I’d like to offer my sincere apologies for assuming that you were a skeptic, in my earlier posts.

    My post in #40 was intended purely as a dig at the double standards of some skeptics who balk at the probabilities invoked by the fine-tuning argument, but then proceed to invoke far vaguer “probabilities” (based on nothing more than subjective hunches) when discussing the problem of evil, or the likelihood of there being an incorporeal Deity.

    Although probability is not my specialty in philosophy, I am certainly aware of the difference between epistemic and physical probabilities, which you mentioned in your last post (#49). So is Collins, whose lengthy essay I’m currently summarizing. Accordingly, I’ll cite some remarks by Collins which address your objections.

    You object to the fine-tuning argument on the ground that it conflates epistemic and physical probabilities:

    Only with a time machine could one observe an origin of life. Without a time machine, all one can do is to use information in the present to guess a process in the past, and then assign probability according to the guess (model). This probability is certainly not a physical entity. It derives from the modeler’s supposition as to what was going on 4 billion years ago…

    It is clearly wrong to infer that an intelligence intervened to change physical probability when the deficit in probability is merely subjective.

    Collins responds in section 3.1 of his essay (“The need for epistemic probability”) that on the contrary, epistemic probability is extensively used in scientific confirmation, and that it often precedes any application of physical and/or statistical probability:

    Consider, for example, the arguments typically offered in favor of the Thesis of Common Ancestry, continental drift theory, and the atomic hypothesis. The Thesis of Common Ancestry is commonly supported by claiming that a variety of features of the world – such as the structure of the tree of life – would not be improbable if this thesis is true, but would be very improbable under other contending, nonevolutionary hypotheses, such as special creation….

    Similar lines of reasoning are given for accepting continental drift theory. For example, the similarity between the animal and plant life on Africa and South America millions of years ago was considered to provide significant support for continental drift theory. Why? Because it was judged very unlikely that this similarity would exist if continental drift theory were false, but not if it were true.

    Finally, consider the use of epistemic probability in the confirmation of atomic theory. According to Wesley Salmon (1984, pp. 219–20), what finally convinced virtually all physical scientists by 1912 of the atomic hypothesis was the agreement of at least 13 independent determinations of Avogadro’s number based on the assumption that atomic theory was correct…

    Since some of the probabilities in the aforementioned examples involve singular, nonrepeatable states of affairs, they are not based on statistical probabilities, nor arguably other non-epistemic probabilities. This is especially evident for the probabilities involved in the confirmation of atomic theory since some of them involve claims about probabilities conditioned on the underlying structure and laws of the universe being different – e.g. atoms not existing. Hence, they are not based on actual physical propensities, relative frequencies, or theoretical models of the universe’s operation. They therefore cannot be grounded in theoretical, statistical, or physical probabilities. (Emphases mine – VJT.)

    All in all, I think Collins has made a good case that epistemic probabilities cannot be eliminated from science, and that their use may even precede the use of physical probabilities invoked by scientists.

    And now, over to you.

  53. 53
    R0b says:

    vjtorley, if we’re interested in the size of the meteor, it seems that circumference, surface area, and volume are all valid measures. Collins’ acknowledgement of Bertrand’s Paradox shows that he’s at least aware of the issue, and I’ll have to read his work before I opine on whether he has resolved the paradox adequately in the case of fine-tuning.

  54. 54
    vjtorley says:

    Pelagius, Sotto Voce, R0b:

    All of you have discussed the concept of transfinite numbers in previous posts. How does this affect Collins’ fine-tuning argument?

    It may interest you to know that Robin Collins is actually quite friendly to the idea of an infinite universe. Here is what he writes in his essay, “Universe or Multiverse? A Theistic Perspective” at http://home.messiah.edu/~rcoll.....20talk.htm :

    Indeed, the fact that the multiverse scenario fits well with an idea of an infinitely creative God, and the fact that so many factors in contemporary cosmology and particle physics conspire together to make an inflationary multiverse scenario viable significantly tempts me toward seriously considering a theistic version of it. This temptation is strengthened by the fact that science has progressively shown that the visible universe is vastly larger than we once thought, with a current estimate of some 300 billion galaxies with 300 billion stars per galaxy. Thus, it makes sense that this trend will continue and physical reality will be found to be much larger than a single universe.

    Of course, one might object that creating a fine-tuned universe by means of a universe generator would be an inefficient way for God to proceed. But this assumes that God does not have any other motive for creation – such as that of expressing his/her infinite creativity and ingenuity – than creating a life-permitting cosmos using the least amount of material. But why would one make this assumption unless one already had a preexisting model of God as something more like a great engineer instead of a great artist? Further, an engineer with infinite power and materials available would not necessarily care much about efficiency. (Emphases mine – VJT.)

    In that essay, Collins goes on to argue that even if a multiverse generator of baby universes exists, the multiverse generator itself – whether of the inflationary variety or some other type – still needs to be “well-designed” in order to produce life-sustaining universes. He goes on to argue in detail that an inflationary multiverse generator would need:

    (i) A mechanism to supply the energy needed for the bubble universes (the inflaton field);

    (ii) a mechanism to form the bubbles (Einstein’s equation of general relativity);

    (iii) a mechanism to convert the energy of the inflaton field to the normal mass/energy we find in our universe (i.e. Einstein’s mass-energy equivalence relation, combined with an hypothesized coupling between the inflaton field and normal mass/energy fields); and

    (iv) a mechanism that allows enough variation in constants of physics among universes (e.g. superstring theory).

    Concludes Collins:

    In sum, even if an inflationary multiverse generator exists, it along with the background laws and principles have just the right combination of laws and fields for the production of life-permitting universes: if one of the components were missing or different, such as Einstein’s equation or the Pauli-exclusion principle, it is unlikely that any life-permitting universes could be produced.

    In his more recent essay (2009), “The Teleological Argument: An Exploration of the Fine-Tuning of the Universe” at http://commonsenseatheism.com/.....gument.pdf , Collins discusses the multiverse scenario in much greater depth, in section 6.

    I’ll quote some brief remarks Collins makes about using what he calls an unrestricted multiverse as an argument against fine-tuning. The unrestricted multiverse is one in which all possible worlds exist:

    6.2. Critique of the unrestricted multiverse

    To begin our argument, consider a particular event for which we would normally demand an explanation, say, that of Jane’s rolling a six-sided die 100 times in a row and its coming up on six each time… DTx is the state of affairs of [die] D’s coming up 100 times in a row on six for that particular sequence of die rolls. Normally, we should not accept that DTx simply happened by chance; we should look for an explanation…

    Now, for any possible state of affairs S – such as DTx – UMU [the unrestricted multiverse – VJT] entails that this state of affairs S is actual…

    Hence, the mere fact that [UMU] entails the existence of our universe and its life-permitting structure cannot be taken as undercutting the claim that it is improbable without at the same time undercutting claims such as that DTx is improbable.

    So there you have it. “Hey, we live in an unrestricted multiverse! Sooner or later, 100 sixes in a row was bound to come up!” I wouldn’t like to try that line in Vegas.
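    For a sense of scale, the probability in Collins’ die example can be computed exactly (my illustration, not his):

```python
from fractions import Fraction

# Probability that 100 independent rolls of a fair die all come up six.
p = Fraction(1, 6) ** 100
print(float(p))  # roughly 1.5e-78

# The denominator, 6^100, has 78 digits; this is why "it was bound to
# happen in some universe" is no comfort at a Vegas craps table.
print(len(str(6**100)))  # 78
```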

    Next, Collins addresses restricted multiverses (such as inflationary-superstring multiverse), which may still contain an infinite number of universes, but without being exhaustive of all possibilities. I’ll quote a brief summary extract:

    6.3. The inflationary-superstring multiverse explained and criticized

    6.3.5 Conclusion

    The aforementioned arguments do not show that inflationary cosmology is wrong or even that scientists are unjustified in accepting it. What they do show is that the inflationary multiverse offers no help in eliminating either the fine-tuning of the laws of nature or the special low-entropic initial conditions of the Big Bang. With regard to the special low-entropic initial conditions, it can explain the special conditions of the Big Bang only by hypothesizing some other, even more special, set of initial conditions.

    I think it is fair to conclude that the existence of the multiverse, even if confirmed, in no way undermines the fine-tuning argument.

  55. 55
    vjtorley says:

    Sooner Emeritus (#49)

    I’d like to address your remark:

    The fact is that I regard so-called proofs and disproofs of God as idolatry of reason.

    Surprisingly, I think Robin Collins would agree with you. To support my point, I’d like to quote a few extracts from Collins’ latest essay, “The Teleological Argument: An Exploration of the Fine-Tuning of the Universe” at http://commonsenseatheism.com/.....gument.pdf , where Collins is discussing the question of whether LPU (the existence of a life-permitting universe) lends evidential support to T, the theistic hypothesis that “there exists an omnipotent, omniscient, everlasting or eternal, perfectly free creator of the universe whose existence does not depend on anything outside itself”, as opposed to NSU [the naturalistic single-universe hypothesis] and the naturalistic multiverse hypothesis:

    7.1 The “who designed God?” objection

    [T]his objection [Who made God? – VJT] would arise only if either T were constructed solely to explain the fine-tuning, without any independent motivation for believing it, or one considered these other motivations as data and then justified T by claiming that it is the best explanation of all the data. Our main argument, however, is not that T is the best explanation of all the data, but only that given the fine-tuning evidence, LPU strongly confirms T over NSU…

    The existence of God is not a hypothesis that is being offered as the best explanation of the structure of the universe, and hence it is not relevant whether or not God is an explanatorily better (e.g. simpler) terminus for ultimate explanation than the universe itself. Nonetheless, via the restricted version of the Likelihood Principle (Section 1.3), the various features of the universe can be seen as providing confirming evidence for the existence of God. One advantage of this way of viewing the situation is that it largely reconciles the views of those who stress a need for faith in coming to believe in God and those who stress reason. They each play a complementary role.

    8. Conclusion

    As I developed in Sections 1.3 and 1.4, the fine-tuning argument concludes that, given the evidence of the fine-tuning of the cosmos, LPU [the existence of a life-permitting universe] significantly confirms T [theism] over NSU [the naturalistic single-universe hypothesis]. In fact, as shown in Section 5.2, a good case can be made that LPU conjoined with the existence of evil significantly confirms T over NSU. This does not itself show that T is true, or even likely to be true; or even that one is justified in believing in T. Despite this, I claimed that such confirmation is highly significant – as significant as the confirmation that would be received for moral realism if we discovered that extraterrestrials held the same fundamental moral beliefs that we do and that such an occurrence was very improbable under moral antirealism… This confirmation would not itself show that moral realism is true, or even justified. Nonetheless, when combined with other reasons we have for endorsing moral realism (e.g. those based on moral intuitions), arguably it tips the balance in its favor. Analogous things, I believe, could be said for T [theism].

    I hope that clears matters up, regarding the status of the fine-tuning argument for theism.

  56. 56
    Nakashima says:

    Dr Torley,

    Yes, part of the problem with establishing this range is knowing whether the sub-range that permits life is a single continuous region. At different points it seems that Collins allows that it might not be true, that it actually might be necessary to take the sum of several non-contiguous sub-regions, but then basically abandons this caution. He seems really over the top in agreeing with Leslie’s fly-on-the-wall analogy that the local region is all that needs to be taken into account, even if the wall is thick with flies elsewhere. How does that make sense to someone trying to calculate the total area of the sub-regions?

    BTW, I see your quotes correct Collins’ documents (on his website) from “Plank” to “Planck”. I have to admit that while reading, this kind of error did not increase my confidence (or rational expectation) in his argument.

    Again, in other spots Collins toys briefly with the idea that the parameter G actually could have a negative value. But then this is abandoned. Why? I’m not worried that allowing negative values for G will increase the eventual appearance of fine tuning. I’m worried that Collins’ method for arriving at a range is completely arbitrary.

    It seems to me that Collins’ heuristics for arriving at a range are either completely grounded in our experience of this universe, applied at the wrong level of physical reality, or arbitrary. The question of whether a non-contiguous region holds parameter settings that support life seems to me to be strongly supported by 1D and 2D automata capable of universal computation, and 2D CA replicators in the tradition of von Neumann.
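    For readers unfamiliar with the automata Mr. Nakashima mentions: Rule 110 is a one-dimensional cellular automaton that has been proved capable of universal computation. A minimal simulation (purely illustrative, and no part of Collins’ essay) looks like this:

```python
RULE = 110  # the update table is encoded in the binary digits of 110

def step(cells):
    """Advance a ring of 0/1 cells by one generation of Rule 110."""
    n = len(cells)
    return [
        (RULE >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# A single live cell already produces Rule 110's characteristic
# left-growing triangular pattern:
row = [0] * 15 + [1]
for _ in range(8):
    print("".join(".#"[c] for c in row))
    row = step(row)
```

    The point relevant to the thread: the rule numbers that support complex, computation-capable behavior do not form one contiguous band, which is the analogy being drawn to non-contiguous life-permitting parameter regions.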

  57. 57
    Nakashima says:

    Dr Torley,

    Physicists being convinced by 13 experiments does not sound to me like epistemic probability. It sounds completely frequentist. Similarly with plants and continental drift. We have lots of experience (aka frequently repeated trials) with plant ranges, and when they are interrupted by rivers, lakes, mountains, etc. An argument that two ranges of plants were once connected is based on experience at many other scales of time and space.

  58. 58
    tgpeeler says:

    Pelagias @ 32 “Mathematicians claim that the set of integers is infinite. This strikes me as incoherent. How can there be an actual infinite number of integers? I can always add one more to the count of how ever many there are. Ergo, no infinite number of integers.”

    Yes, I do, but apparently you do not. I’ll try again. The CONCEPT of infinity is abstract. There can be no actual infinite number of anything. As soon as you reify your “infinite” number of integers with an actual number then it’s no longer infinite. So the CONCEPT of an infinite number of integers remains coherent but placing an actual value on what that is, is not. To speak of an actual infinite number of universes is logically incoherent. I hope this helps.

  59. 59
    tgpeeler says:

    oops, Pelagius

    Sotto @ 36 “The curious thing about an infinite set is that adding one more member does not actually change the size of the set. Transfinite arithmetic is weird.”

    Indeed it is. One only has to check into Hilbert’s hotel to understand that.

    Infinity is abstract, not concrete. That permeates the definition of what infinity means. So to say that we have an infinite number of universes is to say that we have an infinite set of concrete things. Not possible. By definition. Which makes the discussion of an infinite multiverse a fool’s errand. I’ve been on plenty of those, mind you, and will be on many more, no doubt, but not this one.

    Another thing about the infinite (a very fascinating subject which I know very little about) is that some mathematicians (I forget the guy’s name now who started this – Cantor maybe?) say that some infinite sets are larger than other infinite sets (has to do with one to one correspondence). In a conversation with a mathematics professor (the poor guy was trying to get me to understand this) I said to him that if one infinite set could be larger than another infinite set it had to be infinitely larger. He said, this is true. He couldn’t explain it beyond saying it and neither can I. Mind bending stuff, no error. Just like God.

  60. 60
    pelagius says:

    tgpeeler:

    The CONCEPT of infinity is abstract.

    So is the CONCEPT of ‘four’.

    There can be no actual infinite number of anything. As soon as you reify your “infinite” number of integers with an actual number then it’s no longer infinite.

    Why would you reify infinity with a finite number? They’re not the same. You might as well reify six with four.

    So the CONCEPT of an infinite number of integers remains coherent but placing an actual value on what that is, is not.

    By “an actual value” you appear to mean “a finite value”. If so, then of course it is incoherent to place a finite value on infinity. Infinity is not finite.

    To speak of an actual infinite number of universes is logically incoherent. I hope this helps.

    It helps me to understand the mistake you are making.

    Infinity is abstract, not concrete. That permeates the definition of what infinity means. So to say that we have an infinite number of universes is to say that we have an infinite set of concrete things. Not possible. By definition.

    You seem to be confused over the terms ‘abstract’ and ‘concrete’. The concept ‘four’ is abstract, but that doesn’t mean that a set of four concrete things is logically impossible. Similarly, the concept ‘infinity’ is abstract, but that doesn’t mean that an infinite number of universes is logically incoherent.

    …mathematicians …say that some infinite sets are larger than other infinite sets (has to do with one to one correspondence). In a conversation with a mathematics professor (the poor guy was trying to get me to understand this) I said to him that if one infinite set could be larger than another infinite set it had to be infinitely larger. He said, this is true.

    Yes. For example, the set of even integers is just as large as the set of all integers, because they can be brought into a one-to-one correspondence with each other.

    The set of real numbers, on the other hand, is larger than the set of integers. You cannot bring the integers into a one-to-one correspondence with the real numbers. If you try, there will always be an infinite number of reals left over.
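    The one-to-one correspondence pelagius describes can be written down explicitly; a small sketch (mine) for the integers and the even integers:

```python
# The map n -> 2n pairs every integer with a distinct even integer and
# misses none, so the two sets have the same cardinality even though
# the evens are a proper subset of the integers.

def to_even(n):
    return 2 * n

def from_even(m):
    assert m % 2 == 0, "not an even integer"
    return m // 2

# A finite sample can only illustrate the pairing, not prove it, but
# the round trip works for every integer:
sample = list(range(-5, 6))
assert all(from_even(to_even(n)) == n for n in sample)
assert len({to_even(n) for n in sample}) == len(sample)
```

    No such map exists from the integers onto the reals; that is Cantor’s diagonal argument, and it is what makes one infinite set genuinely larger than another.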

  61. 61
    vjtorley says:

    Mr. Nakashima (#56)

    Thank you for your post. By the way, “vjtorley” is fine with me.

    I’d like to begin with physicist John Leslie’s “fly-on-the-wall” analogy, which Robin Collins cites in his paper. Here’s the relevant passage from Leslie:

    If a tiny group of flies is surrounded by a largish fly-free wall area then whether a bullet hits a fly in the group will be very sensitive to the direction in which the firer’s rifle points, even if other very different areas of the wall are thick with flies. So it is sufficient to consider a local area of possible universes, e.g., those produced by slight changes in gravity’s strength… It certainly needn’t be claimed that Life and Intelligence could exist only if certain force strengths, particle masses, etc. fell within certain narrow ranges… All that need be claimed is that a lifeless universe would have resulted from fairly minor changes in the forces etc. with which we are familiar. (1989, pp. 138–9)

    Your comment:

    Yes, part of the problem with establishing this range is knowing whether the sub-range that permits life is a single continuous region. At different points it seems that Collins allows that it might not be true, that it actually might be necessary to take the sum of several non-contiguous sub-regions, but then basically abandons this caution. He seems really over the top in agreeing with Leslie’s fly-on-the-wall analogy that the local region is all that needs to be taken into account, even if the wall is thick with flies elsewhere. How does that make sense to someone trying to calculate the total area of the sub-regions?

    I would agree with you that if it makes sense to speak of a physical constant as being able to take any value between minus infinity and infinity, then we have no idea whether the total length of the life-permitting regions is finite or infinite. All we can talk about is our own relatively narrow band, in which our concepts of physics meaningfully apply. This is the epistemically illuminated region.

    As Leslie points out, it is still a remarkable fact that within this large but finite region, a lifeless universe would have resulted from fairly minor changes in the forces etc. with which we are familiar. Wouldn’t you agree?

  62. 62
    vjtorley says:

    Mr. Nakashima (#56)

    In your post, you ask why Collins does not address the possibility that the gravitational constant G could have a negative value:

    Again, in other spots Collins toys briefly with the idea that the parameter G actually could have a negative value. But then this is abandoned. Why? I’m not worried that allowing negative values for G will increase the eventual appearance of fine tuning. I’m worried that Collins’ method for arriving at a range is completely arbitrary.

    Three points in reply. First, as I read it, Collins’ fine-tuning argument for G relates only to the magnitude of G, rather than its sign. This is apparent from the following extract from his essay, “God, Design and Fine-Tuning” at http://academic.udayton.edu/Wi.....tuning.htm :

    Suppose, for instance, that the “theoretically possible” range, R, of values for the strength of gravity is zero to the strength of the strong nuclear force between those protons – that is, 0 to 10^40.G_0, where G_0 represents the current value for the strength of gravity. As we saw above, the life-permitting range r for the strength of gravity is at most 0 to 10^9.G_0. Now, of itself (specifically, apart from the knowledge that we exist), the atheistic single-universe hypothesis gives us no reason to think that the strength of gravity would fall into the life-permitting region instead of any other part of the theoretically possible region. Thus, assuming the strength of the forces constitute a real physical magnitude, the principle of indifference would state that equal ranges of this force should be given equal probabilities, and hence the probability of the strength of gravity falling into the life-permitting region would be at most r/R = 10^9/10^40 = 1/10^31. (Bold type mine – VJT.)

    Second, even if one were to allow negative values for G into the epistemic range, this range would still be bounded below by the Planck scale, which is often assumed to be the cutoff for the applicability of the strong, weak, and electromagnetic forces, and which is commonly proposed as a cutoff point for gravity as well. Thus including possible negative values of G would at most double the epistemic range.
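    As a back-of-envelope check (my arithmetic, not Collins’), the quoted indifference calculation, and the effect of admitting negative values of G, run as follows:

```python
# Collins' figures: life-permitting range r = 0 to 10^9 G_0,
# theoretically possible range R = 0 to 10^40 G_0.
r = 1e9
R = 1e40

p = r / R
print(f"{p:.0e}")  # 1e-31, i.e. a 1-in-10^31 chance under indifference

# Allowing negative values of G of comparable magnitude at most
# doubles the range, which hardly changes the conclusion:
p_with_negatives = r / (2 * R)
print(f"{p_with_negatives:.1e}")  # 5.0e-32
```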

    Third, the foregoing considerations do nothing to weaken the force of Collins’ strongest fine-tuning argument for G – that it is fine-tuned to a value of 1 in 10^60 on either side. The following extract is taken from section 2.3.2 of his essay, “The Teleological Argument: An Exploration of the Fine-Tuning of the Universe” (page 215):

    There is, however, a fine-tuning of gravity relative to other parameters. One of these is the fine-tuning of gravity relative to the density of mass-energy in the early universe and other factors determining the expansion rate of the Big Bang – such as the value of the Hubble constant and the value of the cosmological constant. Holding these other parameters constant, if the strength of gravity were smaller or larger by an estimated one part in 10^60 of its current value, the universe would have either exploded too quickly for galaxies and stars to form, or collapsed back on itself too quickly for life to evolve.

    Notice too that if G is too small, stars cannot form. Presumably this would also be the case if G were negative.

    I conclude that Collins’ fine-tuning argument for G is substantially sound. The incredible degree of fine-tuning observed (1 in 10^60) should surprise anyone who reflects on the fact.

    By the way, I’d be very interested to hear about whether CA theory makes any predictions for the values of physical constants, and for the degree of fine-tuning we should expect to observe. Also, what does CA theory say about the range of values for these constants?

    I’ve just had another thought about Leslie’s “fly-on-the-wall” argument. It seems to me that the skeptic, in arguing that we should not be unduly surprised if the bullet hits a fly in the sparsely populated illuminated region of the wall, is implicitly assuming a “God’s-eye” view of the wall: if there are lots of flies elsewhere, then it’s quite likely that the bullet will hit a fly. The same applies when the skeptic argues that a potentially infinite variety of other forms of life, which we cannot conceive of, may exist in universes with parameters completely different from our own.

    What strikes me as ironical here is that the skeptic does not believe in the existence of any entity possessing this “God’s-eye” view of reality, but he/she is invoking this “God’s-eye” view of reality, which (on his/her account) no being can possibly have, in order to argue against God. I have to say I find that odd.

  63. 63
    Mark Frank says:

    vjtorley

    I have spent much of the last day thinking about Collins’ essay and I have to admit I underestimated him. It is thought provoking; in particular the example of inferring the truth of QED from the very accurate measurement of g. This made me think hard about the nature of this inference. Clearly it is a powerful reason for accepting QED but I cannot accept his attempt to phrase the inference in terms of comparative likelihood. The point being there is no alternative hypothesis to compare it to. I found it more helpful to think of it in terms of Bayesian priors.

    But after a while I realised that, while this is fascinating, it is not my main problem with the fine tuning argument. After all, if some genius came up with a naturalistic unified theory of everything, based on independent evidence, which predicted that the fundamental constants of the universe would have the values we observe, then I would take that as very strong evidence for that theory (especially if the genius did not know the values of those constants beforehand).

    The key here is the “independent evidence”. I know that you and Collins believe there is independent evidence for a “universe designer”. I don’t. Therefore I find the explanation “there must be a designer that wanted it this way” to be hopelessly ad hoc. The outcome is built into the hypothesis. It is the same fundamental problem with all design hypotheses. Unless the hypothesis is sufficiently specific about the designer so that it is possible to have evidence for that designer other than the data you are trying to explain then it is just gremlins in the attic writ large.

    Every so often the debate at UD rises above the trivial and you are usually involved – thanks.

  64. 64
    Nakashima says:

    Dr Torley,

    Please allow me to continue to use an honorific you have obviously earned.

    As Leslie points out, it is still a remarkable fact that within this large but finite region, a lifeless universe would have resulted from fairly minor changes in the forces etc. with which we are familiar. Wouldn’t you agree?

    If it were true, I might! I think the Olbermann paper on the resonances that generate C or O in stars is the best start in that direction. But as Collins points out, that ratio of C:O production can’t be considered a free parameter of the system.

    This is like arguing that the flicker rate of the sun is fine tuned to 0. If the sun flickered on and off, it would have a dramatic effect on life on earth. If the sun were on for 6 months, then off for 6 months, could life have arisen naturally? We can calculate a range of solar flicker values, from 0 to the length of time it takes a photon to cross the sun. Blah, blah, blah. It doesn’t matter because solar flicker rate is not a free parameter.

    We don’t know the free parameters of the universe. The Standard Model has 19. They are things like the mass of the electron and the CKM 23-mixing angle. What is the possible range of the CKM 23-mixing angle (actual value = 2.4 degrees)? 0 to 360? What is the smallest increment? Is there a Planck angle? Why would I be concerned with the interval 2.3 to 2.5 degrees rather than 0-360?
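Nakashima's worry about comparison ranges can be made concrete. Under a uniform prior (itself an assumption), the "fine-tuning" probability is just a ratio of interval widths, so it is hostage to whatever comparison range one picks. A quick sketch using the angle numbers from the comment (purely illustrative):

```python
def tuning_probability(life_lo, life_hi, range_lo, range_hi):
    """P(constant lands in the life-permitting interval), assuming a
    uniform prior over the chosen comparison range."""
    return (life_hi - life_lo) / (range_hi - range_lo)

# CKM 23-mixing angle, actual value ~2.4 degrees, with the
# life-permitting interval taken (arbitrarily) as 2.3-2.5 degrees:
print(tuning_probability(2.3, 2.5, 0, 360))  # full-circle comparison range
print(tuning_probability(2.3, 2.5, 0, 90))   # a narrower assumed range
```

The two answers differ by a factor of four for no physical reason, which is exactly the point: without a principled comparison range, the probability is not well defined.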

    Collins can’t analyze the universe at this level, and is inconsistent with his answers to these questions. This is a large source of frustration to me in reading his work.

  65.
    Nakashima says:

    Dr Torley,

    You should be aware that my discussion of Collins’ positions and arguments is drawn completely from the single paper you referred me to earlier in this thread – How to Rigorously Define Fine-Tuning.

    Again, the point is not the actual size of the range Collins can establish for G, it is that his procedure for choosing the range is not rigorous. Could G be negative, and gravity actually be repulsion? Having brought up the possibility, Collins doesn’t show that he has a good procedure for rejecting it.

    Holding these other parameters constant, if the strength of gravity were smaller or larger by an estimated one part in 10^60 of its current value, the universe would have either exploded too quickly for galaxies and stars to form, or collapsed back on itself too quickly for life to evolve.

    Perhaps you should have bolded the phrase Holding these other parameters constant. Is this estimate a clue to fine tuning or to the idea that these are not independent parameters?

    Yes, Leslie. Why should his fly shooting analogy be relevant? Going back to the start of the discussion on probability, we know by experience that fine tuning the gun angle to hit a particular fly is difficult. Frequentism. We don’t have that warrant of experience for creating universes.

  66.
    tgpeeler says:

    re Pelagius @ 60 “You seem to be confused over the terms ‘abstract’ and ‘concrete’. The concept ‘four’ is abstract, but that doesn’t mean that a set of four concrete things is logically impossible. Similarly, the concept ‘infinity’ is abstract, but that doesn’t mean that an infinite number of universes is logically incoherent.”

    This is rich. I know that it is possible to have a concrete number of 4, 5, 6, or whatever number of things, even though the numbers themselves are abstract. What you fail to grasp is that BY DEFINITION there cannot be a concrete number of infinite things. You say that it is logically coherent for an infinite number of universes to exist. That’s absurd on the face of it. Let me try to make it even simpler. If it’s physical, it’s countable. If it’s countable, it’s not infinite. Because part of the definition of “infinite” is being uncountable. This is the logical absurdity that, at one point in your last post, it looked like you might understand.

    By the way, the reason we empirically KNOW the universe is finite, that it began, that it is not infinitely old or eternal, is because it still has not reached a maximum state of entropy. If the universe was “infinitely old” (a logical impossibility) all of the stars would have burned out by now. But they haven’t. Therefore, the universe isn’t eternal. It had a beginning.

    I cannot say for certain whether there is more than one universe and neither can anyone else. It is certainly logically possible that there are more. But I do know that there are not an infinite number of them. Because if you start showing them to me I can count them. Which means, of course, there is not an infinite number of them.

  67.
    Mark Frank says:

    #66

    But I do know that there are not an infinite number of them. Because if you start showing them to me I can count them. Which means, of course, there is not an infinite number of them.

    You might want to Google “countable infinity”. You will find out about the difference between countable and uncountable infinities.

  68.
    pelagius says:

    tgpeeler:

    What you fail to grasp is that BY DEFINITION there cannot be a concrete number of infinite things.

    That sentence doesn’t make sense. Are you trying to say that there cannot be an infinite number of concrete things?

    You say that it is logically coherent for an infinite number of universes to exist. That’s absurd on the face of it. Let me try to make it even simpler. If it’s physical, it’s countable. If it’s countable, it’s not infinite.

    You evidently aren’t familiar with the concept “countably infinite”. Try googling it.

    By the way, the reason we empirically KNOW the universe is finite, that it began, that it is not infinitely old or eternal, is because it still has not reached a maximum state of entropy. If the universe was “infinitely old” (a logical impossibility) all of the stars would have burned out by now. But they haven’t. Therefore, the universe isn’t eternal. It had a beginning.

    Not true. That fact by itself is not enough to establish the finite age of the universe. The Steady State Theory of Fred Hoyle (who is beloved of IDers and creationists everywhere for his fallacious “tornado in a junkyard” argument) posited an infinitely old universe, but it did not require that the universe be in a state of maximum entropy.

    But I do know that there are not an infinite number of [universes]. Because if you start showing them to me I can count them.

    If I start showing them to you then you can start to count them. But that’s not the issue. The issue is whether you will ever finish. How do you know that you will?

    Your argument boils down to this:

    1. There cannot be an infinite number of physical things.

    2. A universe is a physical thing.

    3. Therefore, there cannot be an infinite number of universes.

    The problem is with your assumption in step 1. How do you know that there cannot be an infinite number of physical things?

  69.
    vjtorley says:

    Mr Nakashima (#65)

    Thank you for your post, and thank you also for letting me know that you have only been reading one of Collins’ papers. I would strongly suggest that you have a look at Collins’ 2009 paper, The Teleological Argument: An Exploration of the Fine-Tuning of the Universe. It is exhaustive: I think it’s fair to say there’s not a single objection to the argument in the literature that Collins doesn’t address somewhere in his latest paper.

    You raise a good point about the fine-tuning argument for G, at 1 part in 10^60, when you write:

    Perhaps you should have bolded the phrase Holding these other parameters constant. Is this estimate a clue to fine tuning or to the idea that these are not independent parameters?

    But Collins has anticipated this objection too. Here’s what he writes on page 215, in Section 2.3.2 of his essay:

    [Footnote] 10. This latter fine-tuning of the strength of gravity is typically expressed as the claim that the density of matter at the Planck time (the time at which we have any confidence in the theory of Big Bang dynamics) must have been tuned to one part in 10^60 of the so-called critical density (e.g. Davies 1982, p. 89). Since the critical density is inversely proportional to the strength of gravity (Davies 1982, p. 88, eqn. 4.15), the fine-tuning of the matter density can easily be shown to be equivalent to the aforementioned claim about the tuning of the strength of gravity. Of course, if one cites this fine-tuning of gravity, one cannot then treat the fine-tuning of the force of the Big Bang or matter density of the Big Bang as an independent fine-tuning. (See Section 5.1.1 for how to combine cases of fine-tuning.) (Emphases mine – VJT.)
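The inverse proportionality the footnote appeals to is the standard Friedmann critical density; stated here from standard cosmology (not from Collins’s text), it is:

```latex
\rho_c = \frac{3H^2}{8\pi G}
\qquad\Longrightarrow\qquad
\rho_c \propto \frac{1}{G}
```

so a tuning of the Planck-time matter density to one part in 10^60 of \(\rho_c\) re-expresses as an equivalent tuning of G, which is why Collins warns against counting the two as independent cases.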

    Now, I only cited gravity in my preceding discussion in #51. I didn’t talk about the force of the Big Bang or matter density of the Big Bang, so I think I’ve been fair here. What Collins is saying is that we need to avoid double-counting our cases of fine-tuning.

    In section 5.1.1, on pages 252-253, Collins develops his point at further length:

    Some have faulted the fine-tuning arguments for only varying one constant at a time, while keeping the values of the rest fixed. For example, Victor Stenger claims that, “One of the many major flaws with most studies of the anthropic coincidences is that the investigators vary a single parameter while assuming all the others remain fixed!” (2007, p. 148).

    This issue can be easily addressed for a case in which the life-permitting range of one constant, C_1, does not significantly depend on the value that another constant, C_2, takes within its comparison range, R_2. In that case, the joint probability of both C_1 and C_2 falling into their life-permitting ranges is simply the product of the two probabilities….[Supporting mathematical calculations follow – VJT.] Thus, we can treat the two probabilities as effectively independent.

    When will two constants be independent in this way? Those will be cases in which the factors responsible for C_1’s being life-permitting are effectively independent of the factors responsible for C_2’s being life-permitting. For example, consider the case of the fine-tuning of the cosmological constant (C_1) and the fine-tuning of the strength of gravity (C_2) relative to the strength of materials – that is, the first case of the fine-tuning of gravity discussed in Section 2.3.2. The life-permitting range of gravity as it relates to the strength of materials does not depend on the value of the cosmological constant…. This means that the joint probability of both gravity and the cosmological constant’s falling into their life-permitting ranges is the product of these two probabilities: W_r/W_R for gravity times W_r/W_R for the cosmological constant. This same analysis will hold for any set of fine-tuned constants in which the life-permitting range for each constant is independent of the values the other constants take in their respective EI ranges: e.g., the set consisting of the fine-tuning of the strong nuclear force needed for stable nuclei and the previously discussed example of the fine-tuning of gravity. (Emphases mine – VJT.)
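Collins’s independence condition, as quoted, reduces to ordinary multiplication of probabilities. A minimal sketch (the ratios below are invented placeholders, not Collins’s numbers):

```python
def joint_probability(ratios):
    """Product of independent W_r/W_R fine-tuning ratios, per the
    independence condition quoted above."""
    p = 1.0
    for r in ratios:
        p *= r
    return p

# Three hypothetical constants with individual ratios 1e-3, 1e-5, 1e-2:
print(joint_probability([1e-3, 1e-5, 1e-2]))  # on the order of 1e-10
```

The whole dispute, of course, is whether the constants really are independent: if the life-permitting range of one shifts with the value of another, the product formula no longer applies.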

    Regarding the possibility of G having a negative value, I think Collins deals adequately with this in the following quote from page 212 in section 2.2:

    In classical physics, the amount of force is given by Newton’s law, F = G.m_1.m_2/r^2, where F is the force of attraction between two masses, m_1 and m_2, separated by a distance r, and G is the gravitational constant (which is simply a number with a value of 6.672 × 10^-11 N.m^2/kg^2). Now consider what would happen if there were no universal, long-range attractive force between material objects, but all the other fundamental laws remained (as much as possible) the same. If no such force existed, then there would be no stars, since the force of gravity is what holds the matter in stars together against the outward forces caused by the high internal temperatures inside the stars. This means that there would be no long-term energy sources to sustain the evolution (or even existence) of highly complex life. Moreover, there probably would be no planets, since there would be nothing to bring material particles together, and even if there were planets (say because planet-sized objects always existed in the universe and were held together by cohesion), any beings of significant size could not move around without floating off the planet with no way of returning. This means that embodied moral agents could not evolve, since the development of the brain of such beings would require significant mobility. For all these reasons, a universal attractive force such as gravity is required for embodied moral agents. (Emphases mine – VJT.)

    The same points apply if gravity were negative.
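Newton’s law from the quoted passage is easy to put into code, and doing so highlights how extreme the 1-in-10^60 figure is: it lies some forty-four orders of magnitude below what double-precision arithmetic can even represent (the Earth-Moon numbers are rounded textbook values):

```python
G = 6.674e-11  # gravitational constant, N*m^2/kg^2 (rounded)

def gravity(m1, m2, r):
    """Newton's F = G*m1*m2/r^2, as in the Collins passage above."""
    return G * m1 * m2 / r**2

print(gravity(5.97e24, 7.35e22, 3.84e8))  # Earth-Moon force, roughly 2e20 N

# A 1-part-in-10^60 change in G is invisible to ordinary floating point,
# which resolves only about 1 part in 10^16:
print(G * (1 + 1e-60) == G)  # True
```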

    I hope that answers your questions.

  70.
    tgpeeler says:

    P @ 68 “Not true. That fact by itself is not enough to establish the finite age of the universe. The Steady State Theory of Fred Hoyle (who is beloved of IDers and creationists everywhere for his fallacious “tornado in a junkyard” argument) posited an infinitely old universe, but it did not require that the universe be in a state of maximum entropy.”

    The steady state theory of Fred Hoyle is nonsense and has been recognized as such for years. He was also an advocate of panspermia. hee hee. Yeah, aliens did it. Being infinitely old would require maximum entropy whether his theory required it or not. There is a burn rate for the energy in the universe and unless things have radically changed in the last few weeks there is a finite amount of energy in the universe (see first law of thermodynamics). This means it will eventually run out. This means the universe can’t possibly be infinitely old. The argument goes like this: If the universe were infinitely old it would have reached a state of maximum entropy by now (see second law of thermodynamics). But it hasn’t reached a state of maximum entropy. Therefore, it isn’t infinitely old. This isn’t difficult. It’s called modus tollens and it’s a valid form of deduction. That means if the premises are true the conclusion is true. So tell me that the second law of thermodynamics somehow doesn’t eventually lead to a maximum state of entropy. As for the rest, this really isn’t the place to split definitional hairs about what infinite means so I give up on that. And this too, actually. 🙂
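For reference, the schema being invoked here can be written out (the P/Q labels are added for clarity, not from the comment):

```latex
\frac{P \rightarrow Q \qquad \neg Q}{\therefore\; \neg P}
\qquad
\begin{aligned}
P &: \text{the universe is infinitely old}\\
Q &: \text{maximum entropy has been reached}
\end{aligned}
```

The form is valid; the live question, taken up in the replies, is whether the conditional premise P → Q is actually true for an open or fluctuating universe.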

  71.
    pelagius says:

    tgpeeler:

    Being infinitely old would require maximum entropy whether [Hoyle’s] theory required it or not.

    Only if you also assume that the universe is closed. In Hoyle’s theory, it was not — matter was being continuously created.

    There is a burn rate for the energy in the universe and unless things have radically changed in the last few weeks there is a finite amount of energy in the universe (see first law of thermodynamics). This means it will eventually run out.

    No, the first law tells us that energy (including mass) is conserved, which means that it will never run out. Don’t confuse energy with entropy.

  72.
    Sotto Voce says:

    Tgpeeler,

    The second law is a statistical law. For any finite system in a maximum entropy state, there is some non-zero probability that it will fluctuate into a lower entropy state. Of course, this is a very low probability event, but in an infinitely old universe it will occur an infinite number of times. So the observation of non-maximal entropy is not incompatible with an infinite past. Boltzmann, the man who first formulated statistical mechanics, proposed that the universe was infinitely old, had been in a maximum entropy state in the past and has since fluctuated into its current low entropy state.

    Of course this is all academic. We currently have excellent evidence that the age of the universe is finite. Just an interesting sidebar to the discussion.

    On the more substantive point, current cosmology strongly suggests that the universe is spatially infinite (although things are still far from conclusive). Do you think a spatially infinite universe is logically impossible?

  73.
    bornagain77 says:

    An actual infinite does not reside with space, as we understand it, but resides with transcendent information.

    Space, as we know it, is shown by general relativity to be inextricably linked with time, and thus space is being “created”, or expanded, as we go into the future. Space has a finite past and thus a definite point of origination in the past, as does time; space also has a definite point of origination into the future with the passing of time. Both time and space are also linked, fundamentally, to the second law of thermodynamics, as Penrose notes.
    Will God, as revealed in Christ, fundamentally change this arrangement sometime in the future when He returns, so that time and space, as we understand them, are radically changed so as not to incorporate the second law at such a fundamental level? I would think so, at first glance, since the second law is so closely tied to death and decay, but as to the degree of the change I have not the foggiest.

  74.
    Clive Hayden says:

    tgpeeler,

    So tell me that the second law of thermodynamics somehow doesn’t eventually lead to a maximum state of entropy.

    Some folks claim that the oscillating model of the universe relieves this difficulty, as if entropy were renewed at each oscillation, similar to how a player in a video game has full strength after each death, and this model is about as realistic as that video game too 😉

  75.
    vjtorley says:

    Mark Frank (#63) and Mr. Nakashima (#64)

    Thank you for your posts, both of which raise excellent points.

    I would agree that the fine-tuning hypothesis needs to make more specific predictions in order to be judged scientifically useful, and also in order to rebut the philosophical charge of being too ad hoc.

    Interestingly, the first person to use fine-tuning to make a scientific prediction was the atheist, Fred Hoyle. To quote Wikipedia:

    An early paper of Hoyle’s made an interesting use of the anthropic principle. In trying to work out the routes of stellar nucleosynthesis, he observed that one particular nuclear reaction, the triple-alpha process, which generates carbon, would require the carbon nucleus to have a very specific energy for it to work. The large amount of carbon in the universe, which makes it possible for carbon-based life-forms (e.g. humans) to exist, demonstrated that this nuclear reaction must work. Based on this notion, he made a prediction of the energy levels in the carbon nucleus that was later borne out by experiment.

    However, those energy levels, while needed in order to produce carbon in large quantities, were statistically very unlikely. Hoyle later wrote:

    Would you not say to yourself, “Some super-calculating intellect must have designed the properties of the carbon atom, otherwise the chance of my finding such an atom through the blind forces of nature would be utterly minuscule.” Of course you would… A common sense interpretation of the facts suggests that a superintellect has monkeyed with physics, as well as with chemistry and biology, and that there are no blind forces worth speaking about in nature. The numbers one calculates from the facts seem to me so overwhelming as to put this conclusion almost beyond question.

    Regarding future predictions from fine-tuning, I think it may be necessary to invoke a principle of maximal elegance. That’s why I’m interested in Garrett Lisi’s E8 theory. As one Web article puts it: “In Lisi’s model, the base is a four-dimensional surface—our spacetime—and the fiber is the E8 Lie group, a complicated 248 dimensional shape, which some mathematicians consider to be the most beautiful shape in mathematics.” (See http://theoryoforder.com/blog/.....rsymmetry/ .) I would expect an infinite Mind to make the universe in the most elegant fashion, if there is one that stands out from all the others.

    Would you consider that to constitute evidence for theism?

  76.
    Mark Frank says:

    #75

    I would expect an infinite Mind to make the universe in the most elegant fashion, if there is one that stands out from all the others.

    Would you consider that to constitute evidence for theism?

    I am afraid not. It supposes that the infinite mind is fond of elegance and therefore, to quote Sober, “builds the observational outcome into the theory it is supposed to test”. Or to use Collins’ terminology, I lack “independent motivations for believing the hypothesis apart from the confirming data”. Suppose the universe, or some aspect of the universe, turns out not to be elegant but actually extremely chaotic and ugly. Would that now be evidence for an infinite mind that is fond of the chaotic and ugly?

  77.
    tgpeeler says:

    Sotto @ 36 “An infinite set doesn’t have to include everything.”

    I agree. I don’t recall ever claiming that one did. Did I?

  78.
    tgpeeler says:

    p @ 71 “No, the first law tells us that energy (including mass) is conserved, which means that it will never run out. Don’t confuse energy with entropy.”

    I’m not. Conserved means neither created nor destroyed (yet here we are), which means no more of it is being “made”, which means it is finite. The energy changes from ‘available’, a state of lower entropy, to ‘unavailable’, a state of higher entropy. Maximum entropy is a state with no ‘available’ energy. I hope this helps.

    p.s. I’ve found that when making assumptions upon which I build a system of conclusions that it works better if the assumptions are true. Just a thought.

  79.
    tgpeeler says:

    Sotto @ 72. “The second law is a statistical law.”

    Of course it is, but there’s a reason it is called a LAW. Is there any ‘law’ in nature that is more certain than this one? Eddington thought not. It’s the physical equivalent of the law of non-contradiction in logic. If your argument breaks it, it’s wrong. If your theory ‘breaks’ the 2nd law of thermodynamics it’s wrong. See perpetual motion machines.

    It’s logically possible that heat can move from a cooler object to a warmer object or that a drop of dye in a glass of water will remain a drop and not diffuse throughout the water in the glass but it’s not physically possible that these things will ever occur because of the statistical LAW they would be violating.

    “Of course this is all academic. We currently have excellent evidence that the age of the universe is finite. Just an interesting sidebar to the discussion.”

    Yes, ‘we’ do. Well, not me, but everyone who knows seems to say so. I would add that, once again, the empirical world offers up evidence that validates a logical truth. There is an argument from pure reason that says the age of the universe is finite and lo and behold the evidence confirms it. Reason is the ultimate ruler when it comes to truth.

    “Do you think a spatially infinite universe is logically impossible?”

    Yes, I do. But let me explain before you declare me beyond hope. By definition it is impossible to have a finite infinite or an infinite finite. So how can the universe be finite (in terms of matter/energy and age) yet be infinite in terms of space? That said, I think the universe could expand indefinitely, but that is not the same as it being infinite. Or so I say. The infinite, as I understand its technical sense and as I use the term, is always abstract. So, for example, I would agree that the set of integers, or positive integers, or negative integers, is infinite. But that is a conceptual infinite. As soon as you start to write them down, i.e. make them concrete, then you are dealing with a finite set, even if it is one that goes on indefinitely. It’s just impossible to have an infinite number of finite things. Just like it’s impossible to have a square circle. Even God can’t make a square circle. It violates reason (law of identity) and therefore cannot possibly be true.

  80.
    tgpeeler says:

    Clive @ 74

    Exactly. 🙂

  81.
    tgpeeler says:

    Mark @ 76 “I am afraid not.”

    What would you consider evidence for theism?

  82.
    pelagius says:

    tgpeeler:

    I’ve found that when making assumptions upon which I build a system of conclusions that it works better if the assumptions are true. Just a thought.

    Indeed, which is why I asked this question earlier:

    The problem is with your assumption in step 1. How do you know that there cannot be an infinite number of physical things?

    tgpeeler:

    It’s logically possible that heat can move from a cooler object to a warmer object or that a drop of dye in a glass of water will remain a drop and not diffuse throughout the water in the glass but it’s not physically possible that these things will ever occur because of the statistical LAW they would be violating.

    The fact that the SLoT is a statistical law means that it is physically possible, just very, very unlikely.

    So, for example, I would agree that the set of integers, or positive integers, or negative integers, is infinite. But that is a conceptual infinite. As soon as you start to write them down, i.e. make them concrete, then you are dealing with a finite set, even if it is one that goes on indefinitely.

    And earlier you wrote:

    But I do know that there are not an infinite number of [universes]. Because if you start showing them to me I can count them.

    At any point in time, the set of integers that you have already written down is finite. That is because you are writing them down one at a time. Nevertheless, the set of all integers remains infinite. You will never finish writing them down.

    Likewise, suppose that there are an infinite number of universes, and I begin showing them to you at a constant rate. At any point in time, the number of universes you have counted is finite, because you are seeing them one at a time. Nevertheless, the number of universes remains infinite. You will never finish counting.
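This counting scenario is exactly what mathematicians mean by “countably infinite”: an enumeration exists, but it has no end. A toy sketch (the universe labels are stand-ins, obviously):

```python
from itertools import count, islice

def universes():
    """A countably infinite enumeration: every element is reached at
    some finite step, yet the enumeration never terminates."""
    for n in count(1):
        yield f"universe-{n}"

# At any moment only finitely many have been counted...
print(list(islice(universes(), 5)))
# ...but there is no last one: for every n, universe n+1 is still ahead.
```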

    The logic is the same in both cases. The idea of an infinitude of universes is just as coherent as the idea of an infinitude of integers.

    Do you see your mistake?

  83.
    Sotto Voce says:

    Tgpeeler,

    If we assign the second law the modal status you recommend, it would straightforwardly contradict both classical mechanics and quantum theory. According to the Poincare Recurrence Theorem, any closed Hamiltonian system must eventually return arbitrarily close to its initial conditions. This means that if we start with a drop of dye localized at the center of a glass of water and wait long enough, it MUST eventually return to this state. Of course, in the relative short term it will diffuse, as the second law predicts. But in the long term it must eventually violate the second law. Of course, it’s not really a violation once you understand that the law is statistical. Say it is a law that a coin I flip has a 50% chance of landing heads. If I flip the coin long enough I am very likely to eventually get a string of, say, 100000 heads. Taken by itself, this string might seem like a violation of my law, but understood as a part of a much longer sequence, it isn’t.
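The coin-flip point can be checked numerically: a run that is improbable in a short sequence becomes near-certain given enough flips. A Monte Carlo sketch (the run length and flip counts are arbitrary small-scale stand-ins for the 100,000-heads example):

```python
import random

def has_run(flips, k):
    """True if k consecutive heads (True values) occur in flips."""
    streak = 0
    for heads in flips:
        streak = streak + 1 if heads else 0
        if streak >= k:
            return True
    return False

def p_run(n, k, trials=2000, seed=0):
    """Monte Carlo estimate of P(a run of k heads within n fair flips)."""
    rng = random.Random(seed)
    hits = sum(
        has_run((rng.random() < 0.5 for _ in range(n)), k)
        for _ in range(trials)
    )
    return hits / trials

# A run of 8 heads is uncommon in 100 flips but essentially certain
# within 20,000 flips:
print(p_run(100, 8), p_run(20_000, 8, trials=200))
```

This is the statistical reading of the second law in miniature: “violations” are not impossible, merely so rare that their expected waiting time dwarfs any short observation window.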

    As for your rejection of actual infinites: I don’t deny it is a coherent position, but I still don’t see any compelling reason to hold it. You’ve just asserted that it’s impossible. It’s true that I (or a Turing machine) can’t generate an infinite output in finite time, but in the multiverse scenario we’re not talking about the infinity being generated in finite time. It’s just there. I still fail to see why you think this is conceptually impossible (I presume you mean conceptual impossibility rather than logical impossibility. An actual infinity is clearly logically possible – it doesn’t entail a contradiction.)

  84.
    tgpeeler says:

    P and SV – going skiing for a week. No puter. Some things to think about. thanks.

  85.
    pelagius says:

    tgpeeler,

    Have fun skiing. And watch out for the infinite char lift. 🙂

  86.
    pelagius says:

    Oops. char -> chair
