Uncommon Descent Serving The Intelligent Design Community

Biology prof: How can we really know if the universe is fine-tuned?


From Waynesburg U biology prof Wayne Rossiter, author of Shadow of Oz: Theistic Evolution and the Absent God, a question about claims for fine tuning of the universe:

My major concern with arguments from fine-tuning in cosmology is: how do we really get from observations of precision to statements of probability? To say that something is precise is not to say that it is improbable. Those are two different things.

As a third quick analogy, if we studied the fall patterns of icicles from the roof of my home, we might find that their placement is incredibly precise. Given the vast surface area a given icicle could fall on (my yard, the road, my neighbor’s yard, etc.), the fact that they consistently fall within a very narrow area directly below the edge of the roof (they more or less fall straight down) seems absurdly precise. Absurdly precise, if it was logical to entertain the possibility of icicles falling in ways other than straight down. But the presence of gravity and the lack of strong winds make this highly precise phenomenon highly probable. Said plainly, it would be absurd to treat the falling of an icicle straight down and the falling of it laterally into my neighbor’s yard as equally likely.

But, I think that’s the sort of assumption being made in the argument from cosmological fine-tuning. To say that such-and-such a physical parameter rests upon a razor’s edge does tell us something. It tells us that any small change in the setting of that parameter would lead to a universe drastically different from the one we live in, and likely one that could never even produce material objects (let alone life) as we understand it. Fair enough. I agree. What it doesn’t tell us is how likely any of those other settings are. More.
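Rossiter's point, that precision and improbability come apart, can be made concrete with a toy simulation of his icicle analogy. The distributions and numbers below are illustrative assumptions only, not physics: a tightly peaked distribution (gravity pulling icicles nearly straight down) makes a narrow landing zone highly probable, while treating every landing spot as equally likely would make the very same zone look wildly improbable.

```python
import random

def landing_fraction(sample, zone=(-0.5, 0.5), trials=100_000):
    """Fraction of simulated icicle landings inside a narrow zone (metres)."""
    lo, hi = zone
    return sum(lo <= sample() <= hi for _ in range(trials)) / trials

def peaked():
    # Gravity: landings cluster tightly under the roof edge.
    return random.gauss(0.0, 0.2)

def uniform():
    # Indifference: every spot in a 20 m strip treated as equally likely.
    return random.uniform(-10.0, 10.0)

print(landing_fraction(peaked))   # ~0.99: precise AND highly probable
print(landing_fraction(uniform))  # ~0.05: same precision, now improbable
```

The "precision" of the outcome is identical in both runs; only the assumed distribution over alternatives changes the probability, which is exactly the gap Rossiter says fine-tuning arguments need to bridge.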

Thoughts?

See also: Copernicus, you are not going to believe who is using your name. Or how.

Follow UD News at Twitter!

Comments
Of supplemental note to post 133: the discovery of a 'Dark Age' for the early universe uncannily matches up with the Bible passage in Job 38:4-11.
Job 38:4-11 “Where were you when I laid the foundations of the earth? Tell me if you have understanding. Who determined its measurements? Surely you know! Or who stretched a line upon it? To what were its foundations fastened? Or who laid its cornerstone, When the morning stars sang together, and all the sons of God shouted for joy? Or who shut in the sea with doors, when it burst forth and issued from the womb; When I made the clouds its garment, and thick darkness its swaddling band; When I fixed my limit for it, and set bars and doors; When I said, ‘This far you may come but no farther, and here your proud waves must stop!’”

History of the Universe - Timeline Graph Image
http://www.der-kosmos.de/pics/CMB_Timeline300_gr.jpg

Job 26:10 He marks out the horizon on the face of the waters for a boundary between light and darkness.

Proverbs 8:26-27 While as yet He had not made the earth or the fields, or the primeval dust of the world. When He prepared the heavens, I was there, when He drew a circle on the face of the deep.

Planck satellite unveils the Universe -- now and then (w/ Video showing the mapping of the 'sphere' of the Cosmic Microwave Background Radiation with the Planck satellite) - 2010
http://phys.org/news197534140.html#nRlv
bornagain77
December 22, 2016 at 04:14 PM PDT
DS, again, look at the matter: we are talking about not just constants (which simply do not partake of the sort of necessity that pi etc. do) but quantities, circumstances and the like. And, even more importantly, I am not arguing that such necessity must be or is the case. I am simply pointing out that if such a wide variety of things, not critically dependent on one another, must all be in a resonance zone for our kind of cell-based life rooted in C-chemistry in aqueous medium to be, then the locking is not in the phenomena. It lies elsewhere, in some force that locks. You can choose to deny or be super-skeptical about it, but that is not going to change the force of the point, which should be readily evident to anyone who looks into it. And a force -- in the broad sense -- that locks up so many disparate things is going to have to be pretty carefully set up itself, which means fine tuning has been displaced up one level. Worse, if there is actually a law of metaphysical necessity that locks up all sorts of things to life-permitting zones, as we can list on and on, then that necessity is highly "suspicious," too. The suggestion that maybe it is a metaphysical necessity that cannot be averted in any world -- no more than a world can exist without two-ness in it -- is not only utterly implausible but does not evade the point of fine tuning, were it to actually hold. So, I turn about the challenge: on what grounds do you wish to suggest that such an idea is anything but special pleading of a most implausible nature, and how does it evade the point that something that sets up reality to be necessarily much like the world we observe would itself be fine tuned? Worse, how does it then address the force of the mathematics based on the observations of our cosmos, which allows sensitivity analysis and from that leads to the conclusion that we are in a narrow resonance? KF

kairosfocus
December 22, 2016 at 02:21 PM PDT
JAD, you may appreciate this if you don't already have it:
How The Stars Were Born - Michael D. Lemonick
http://www.time.com/time/magazine/article/0,9171,1376229-2,00.html

For the first 400,000 years of our universe’s expansion, the universe was a seething maelstrom of energy and sub-atomic particles. This maelstrom was so hot that sub-atomic particles trying to form into atoms would have been blasted apart instantly, and so dense that light could not travel more than a short distance before being absorbed. If you could somehow live long enough to look around in such conditions, you would see nothing but brilliant white light in all directions. When the cosmos was about 400,000 years old, it had cooled to about the temperature of the surface of the sun. The last light from the "Big Bang" shone forth at that time. This "light" is still detectable today as the Cosmic Background Radiation.

This 400,000 year old “baby” universe entered into a period of darkness. When the dark age of the universe began, the cosmos was a formless sea of particles. By the time the dark age ended, a couple of hundred million years later, the universe was lit up again by the light of some of the galaxies and stars that had formed during this dark era.

It was during the dark age of the universe that the heavier chemical elements necessary for life, carbon, oxygen, nitrogen and most of the rest, were first forged, by nuclear fusion inside the stars, out of the universe’s primordial hydrogen and helium. It was also during this dark period that the great structures of the modern universe were first forged. Super-clusters, of thousands of galaxies stretching across millions of light years, had their foundations laid in the dark age of the universe. During this time the infamous “missing dark matter” was exerting more gravity in some areas than in others, drawing in hydrogen and helium gas and causing the formation of mega-stars. These mega-stars were massive, weighing in at 20 to more than 100 times the mass of the sun.
The crushing pressure at their cores made them burn through their fuel in only a million years. It was here, in these short-lived mega-stars under these crushing pressures, that the chemical elements necessary for life were first forged out of the hydrogen and helium. The reason astronomers can’t see the light from these first mega-stars, during this dark era of the universe’s early history, is because the mega-stars were shrouded in thick clouds of hydrogen and helium gas. These thick clouds prevented the mega-stars from spreading their light through the cosmos as they forged the elements necessary for future life to exist on earth. After about 200 million years, the end of the dark age came to the cosmos. The universe was finally expansive enough to allow the dispersion of the thick hydrogen and helium “clouds”. With the continued expansion of the universe, the light of normal stars and dwarf galaxies was finally able to shine through the thick clouds of hydrogen and helium gas, bringing the dark age to a close. (adapted from How The Stars Were Born - Michael D. Lemonick)

The Elements: Forged in Stars – video
https://www.youtube.com/watch?v=B-LXUHJmzzc
bornagain77
December 22, 2016 at 02:11 PM PDT
JAD,
It appears that DaveS wants to take the law or necessity option. However, if you are going to argue that all the apparent fine-tuning we see is necessary you are going to have to explain from where it originated. If it was there “in the beginning,” how did it get there? In other words, you’ve avoided the chance option but what does that really get you?
Frankly, I'm not even thinking about explaining apparent fine tuning at the moment, so I don't expect my "argument", such as it is, will have much bearing on that. What I am thinking about is this particular step in KF's reasoning, which says, more or less, that if our physical constants are necessary in some sense, then there must exist a higher-order realm of tuneable entities which generated those constants. On its face, that seems very similar to positing the existence of a multiverse---which is also unfalsifiable and unconfirmable.

daveS
December 22, 2016 at 01:12 PM PDT
KF,
DS, constants of physics are not logically compelled in any possible world, such as would the value of e or that of pi or phi. And that is before we get to things like charge balance or the proportions of normal matter and anti-matter, etc. Similarly, we have no reason to believe the structure of empirical laws we see is forced by power of logic so these must be in any possible world. That’s why people can talk about multiverses. KF
How do we know all these things about worlds to which we have no access? It would seem impossible to confirm or refute any of these statements.
... The range of things locked would require a significant mechanism.
Again, how do we know this? Is there some experiment we can perform to test it?

daveS
December 22, 2016 at 12:58 PM PDT
It has been known for some time that sufficient basic elements needed for planet formation and life chemistry could not exist without supernova explosions occurring at the right frequencies, and even in the right places. It is hard for me to see how this frequency, which depends on a number of other parameters, could be conceived as necessary or “locked in” rather than contingent. Brian Koberlein, an astrophysicist and physics professor at Rochester Institute of Technology, explains the process:
For small stars, hydrogen is the only element they can fuse; when they run out, they go dark. But after the largest of the first stars transformed their hydrogen to helium, they burned on in another way. When these large stars stopped fusing hydrogen, their internal pressure went down, gravity began to collapse them again, and the temperature of their cores rose. As their cores reached a temperature of a hundred million Kelvin, helium began to fuse into beryllium (an atom with four protons), and beryllium and helium fused to produce carbon (six protons). The element central to life on Earth began to form in the blazing hearts of stars, though this carbon still had a long journey ahead before it would become a part of us. From carbon fusion comes nitrogen and oxygen (seven and eight protons, respectively), two more elements necessary for life, and from these comes a chain of fusion up to iron (26 protons). Fusing iron into heavier elements doesn’t produce more energy, as the fusion of lighter elements does—when iron fuses, it absorbs energy, which is actually a good thing. If elements always fused into heavier elements, then the first stars would have simply fused indefinitely, until they became neutron stars, enormous, undifferentiated orbs of nuclear material. But because the fusion of iron actually cools the core of a star, the chain of fusion shuts down. After their fusion stopped, the first big stars eventually collapsed under their own weight, which triggered supernova explosions. The outer layers of each star, rich in carbon, nitrogen, and oxygen, were cast into interstellar space, and only the cores of these stars collapsed, yet again, into neutron stars.
http://nautil.us/blog/how-the-universe-made-the-stuff-that-made-us

However, the natural synthesis of heavy elements, as Koberlein goes on to explain, is even more complex than that, because stellar nucleosynthesis is only capable of taking us up to iron. For example, it is believed that heavier elements like gold are the result of neutron star collisions. That is another thing that appears to me to be very contingent. Hugh Ross summarizes the situation:
Supernovae eruptions:
- if too close: life on the planet would be exterminated by radiation
- if too far: not enough heavy element ashes would exist for the formation of rocky planets
- if too infrequent: not enough heavy element ashes present for the formation of rocky planets
- if too frequent: life on the planet would be exterminated
- if too soon: heavy element ashes would be too dispersed for the formation of rocky planets at an early enough time in cosmic history
- if too late: life on the planet would be exterminated by radiation
http://www.reasons.org/articles/fine-tuning-for-life-on-earth-june-2004

Again, here is the fine-tuning argument that William Lane Craig likes to use:
1. The fine-tuning of the universe to support life is due either to law, chance or design.
2. It is not due to law [or necessity] or chance.
3. Therefore, the fine-tuning is due to design.
It appears that DaveS wants to take the law or necessity option. However, if you are going to argue that all the apparent fine-tuning we see is necessary you are going to have to explain from where it originated. If it was there “in the beginning,” how did it get there? In other words, you’ve avoided the chance option but what does that really get you?

john_a_designer
December 22, 2016 at 12:09 PM PDT
MW, the estimated scale and age of the observed cosmos trace to empirical evidence. I simply spoke in that context. Come up with better evidence and numbers and I would go with them. KF

kairosfocus
December 22, 2016 at 11:44 AM PDT
DS, constants of physics are not logically compelled in any possible world, such as would the value of e or that of pi or phi. And that is before we get to things like charge balance or the proportions of normal matter and anti-matter, etc. Similarly, we have no reason to believe the structure of empirical laws we see is forced by power of logic so these must be in any possible world. That's why people can talk about multiverses. KF

PS: Again, I spoke to the three alternatives: locked (which requires locking, as we are not speaking of things like pi); varying flat-randomly, taking any value; or distributions that make some values more, and some less, likely. The range of things locked would require a significant mechanism. Whether flat or biased, a distribution will in the Monte Carlo type context eventually sample all cells in the space. So, none of the three is able to remove fine tuning, as seen already.

kairosfocus
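KF's Monte Carlo claim about the non-locked alternatives can be played with numerically. The sketch below is a toy, in no way a cosmological model: it samples two made-up "parameters" either flatly (his maximal-freedom case) or from a distribution biased toward the life-permitting values (his preferred-range case), and estimates how often a run lands in a narrow zone. The ranges, zone width, and the Beta-distribution bias are all arbitrary assumptions chosen for illustration.

```python
import random

def hit_rate(sampler, in_zone, trials=200_000):
    """Monte Carlo estimate of how often sampled parameters land in a zone."""
    return sum(in_zone(sampler()) for _ in range(trials)) / trials

def in_zone(params):
    # Toy 'life-permitting' zone: a narrow interval for each parameter.
    return all(0.49 <= p <= 0.51 for p in params)

def flat():
    # Maximal freedom: uniform over [0, 1] for each of two parameters.
    return [random.uniform(0.0, 1.0) for _ in range(2)]

def biased():
    # Preferred range: a Beta(5, 5) hump centred on 0.5 for each parameter.
    return [random.betavariate(5.0, 5.0) for _ in range(2)]

print(hit_rate(flat, in_zone))    # ~0.0004: the zone is rarely hit
print(hit_rate(biased, in_zone))  # bias raises the rate, but it stays small
```

Both samplers do land in the zone given enough trials, which matches the point that neither flat nor biased variation removes the narrowness of the zone itself; the distribution only changes how long the search takes.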
December 22, 2016 at 11:40 AM PDT
mw, thanks for those fascinating historical and theological details. It's certainly enough to make us wonder about the various 'certainties' we are given.

Silver Asiatic
December 22, 2016 at 07:21 AM PDT
@ 119, john_a_designer makes a fair point, “how do you calculate the odds of coincidence?” And gives an example. In conjunction, kf twice has stated the cosmos is 90 billion light years wide and 13.8 billion years old.

However; coincidences. Carl Jung coined the term “synchronicity.” “Several psychoanalysts noted certain strange coincidences in which their patients received information about them by extra-sensorial ways, information that was not accessible to the general public.” “Jung writes a book on synchronicity together with Nobel laureate W. Pauli, ...” http://carl-jung.net/synchronicity.html

As for omens or coincidences embedded with some perceived physical or spiritual significance, Yahweh warns to keep clear, as Isaiah prophesied. It is God, “who frustrates the omens of liars, and makes fools of diviners; who turns back the wise, and makes their knowledge foolish;” (Isa 44:25). Perhaps the following is an example of an omen? http://www.catholicnewsagency.com/news/san-gennaros-blood-didnt-liquefy--so-pray-anyway-abbot-says-74307/

However, some may call the figures of 90 and 13.8 a synchronicity; a term to explain a meaningful coincidence. The danger lies in who or what is contributing to the meaning. It seems that, coincidentally, we have theoretically arrived at a scientific type of synchronicity, an omen; we have considered the figures, by coincidence, as true from the beginning, which coincidentally has ‘proved’ God created over 13.8 billion years. Such a coincidence we may have made full of sense to our liking, but without first applying clear divine law, the key to better our understanding. Therefore, on that basis, in my opinion, the figures are more of a meaningful coincidence, secondary to all being created in six days. As for the perceived 90 billion light year size of the cosmos, what is that to the unknown size and power of God?

Julian of Norwich and her manuscript, “Revelations of Divine Love,” featured on BBC 4 TV, 19th July, 16. Julian, an unlettered woman, received the revelations at death’s door in May (probably the 13th) 1373. She later became the first English woman to write a book (Revelations) in English. “She is called Blessed, although she was never formally beatified,” and “venerated in both the Catholic and Anglican Communion.” She received a vision on the size of the creation: ‘In this vision he showed me a little thing, the size of a hazelnut, and it was round as a ball. I looked at it with the eye of my understanding and thought, What may this be? And it was generally answered thus: “It is all that is made.” I marvelled how it might last, for it seemed it might suddenly have sunk into nothing because of its littleness. And I was answered in my understanding: “It lasts and ever shall, because God loves it.”’ http://www.cynthialarge.com/julian/hazelnutboxpoem.html

You may object and say, but such is not canonical. True, but the Ten Commandments are, and they hold the key to understanding life, the cosmos and everything, and it is not the number 42: or a googol. https://en.wikipedia.org/wiki/Googolplex

Is it not more accurate to say those figures, 90 and 13.8, give the impression of size and age? However, surely needed is a clearly provided divine key for better understanding, honest science and greater faith? In my opinion, we are continuing to pay the price for not keeping to divine law by pumping up exclusively the Big Bang Theory and Darwinism.

A Catholic example and ten year olds: "Those that are leaving for no religion - and a pretty big component of them saying they are atheist or agnostic - it turns out that when you probe a bit more deeply and you allow them to talk in their own words, that they are bringing up things that are related to science and a need for evidence and a need for proof," said Dr. Mark Gray, a senior research associate at the Center for Applied Research in the Apostolate at Georgetown University. http://www.catholicnewsagency.com/news/why-catholics-are-leaving-the-faith-by-age-10-and-what-parents-can-do-about-it-48918/

The problem is, it seems, that the answer is to give children more of the same: evolution theory and divine law are compatible when they are billions of years apart. Proof! If Moses resurrected from the dead, would we believe him?

mw
December 22, 2016 at 06:33 AM PDT
KF,

Yes, I do think we are talking about two different things then. I am specifically concerned with the possibility that the physical constants can only take one value, namely the values they hold in the existing universe. I guess you would say that in this case, it is physically impossible for the constants to have been otherwise. It's not possible to rule this out, is it? Or even to estimate its probability.

As to "going up a level", how can we actually know this makes sense? Again, this involves positing the existence of some sort of entities which determine the physical constants, with some degrees of freedom themselves. It seems even more speculative than simply supposing the constants actually are sampled from some distribution(s). It appears in your view that it cannot be that the constants each have only one possible value---you will always appeal to "higher levels" in order to maintain the position that they were somehow selected from a larger space.

daveS
December 22, 2016 at 06:10 AM PDT
PPS: A designer pondering alternatives in an effective simulation world, then effecting physically our observed cosmos based on choice, would constitute fine tuning, too. Indeed, such would make Monte Carlo driven sensitivity analysis of greater import than anything else! (It would also give some force to the old thought about how in science we think God's thoughts after him -- along the lines of Hoyle's super-calculating super-intellect.) And, inherently, such a designer is on the table to explain the origin of the world.

kairosfocus
December 22, 2016 at 12:02 AM PDT
DS, remember, we are discussing not just laws but parameters and sheer simple quantities such as how many positive and negatively charged particles exist, how much matter exists, the balance of matter vs anti-matter etc. In that context, I have considered what are the logical options, relative to the known mathematical format of the composition and dynamics of cosmology that models our cosmos; using fairly standard longstanding approaches for looking at phase or configuration spaces and well-known results:
. . . once we have systems that can freely wander and any cell is at all possible, a large enough ensemble given enough time will pass through every possible state -- for that matter a singleton given enough time [much more time] will do the same, especially if we ponder a randomising force that drives a random walk effect that superposes on any trajectory . . . such a space is being searched and the issue is search resources and a wandering mechanism that makes any particular cell in the space a possibly occupied one . . . if possible [= accessible], it eventually will be occupied by the system.
Notice Walker and Davies above and their arXiv paper. This then feeds an examination of the three alternatives, the three possibilities logically available:
(a) the lot are locked [by some force etc.],
(b) they vary with maximal freedom,
(c) they vary with a bias that makes some zones easier/harder than others,
and this applies, in effect, for a neighbourhood of our local observed cosmos' framework. Such an approach will apply to ANY quasi-physical super-laws/forces/mechanisms driving such, whatever we may later see. To do that Monte Carlo style exploration I do not need anything more than the fact that mathematics can reasonably be seen as [the study of] the logic of structure and quantity. This study patently includes sensitivity analysis of the structure of mathematical systems or model frameworks. It is that study that has highlighted the implications of fine tuning since 1953.

As for mechanisms that may drive the thing into lock [highly unlikely . . . and in regard to quantities of particles etc., almost utterly certainly not so] or give maximal freedom or freedom with a bias, that is of interest but not relevant to the point that we here have captured the three options for wandering around the abstract configuration space of the mathematical frame of a cosmos or sub-cosmos. The result is, fine tuning is still there as an inherent aspect of the mathematics, whether at the first or second or onward levels.

On the locked option, ponder a force [in the broad sense] that locks cosmic mass at bang and inflation to what, 1 in 10^60 or so, or the famous result on initiating entropy that BA77 likes to allude to; it is not plausible that such a framing force is itself locked . . . indeed what is quite plausible is that we have a design decision, with a bill of materials and properties etc., that leads to framing a world that very much looks set to a local resonance because that is just what is intended. Fine tuning is not going away, as Sir Fred Hoyle knew long since. The real issue is to explain it, and the options seriously on the table are multiverse or design. The evidence strongly favours the latter, but given the dominance of evolutionary materialism, that will be resisted, including by appealing to a multiverse for which there is but little physical evidence and certainly no observational evidence. KF

PS: Someone above commented on the span of laws per observations and general views. I have given the span per observations and typical contexts: 90 bn LY and 13.8 BY. To change those, simply provide strong enough observations to change the numbers. Science is not about observationally uncontrolled speculation.

kairosfocus
December 21, 2016 at 11:37 PM PDT
KF,
The three cases with “distributions” of values are going to be locked to value seen in all possible and relevant worlds, varying flatly without constraint, an intermediate which is not quite fixed and not quite freely variable, i.e. a somehow preferred range but room to move about.
Yes, and I am asking only about the "locked to value" case, in which the physical constants could have only been what they actually are.
It turns out that on locking, then we face something that forces a wide range of quite disparate things to hold their values as we see them in any possible, relevant world. Such a law of force would obviously be itself fine tuned.
Well, "obviously fine tuned" suggests to me that this law of force could have been different (otherwise no tuning is possible). But that means that the physical constants actually could have been otherwise, depending on the particular law of force. So I think we have different ideas of what "locked to value" means here.

daveS
December 21, 2016 at 07:49 PM PDT
DS, pardon me, but the issue is the issue. There is indisputably an ideological imposition of a priori evolutionary materialism on science of origins, and I spoke to that; particularly to the self-referential incoherence and the problem that such mechanisms as it allows cannot pass the Newton vera causa test. Tied to this, there is a problem of undue accommodation, which I have spoken of as becoming a fellow traveller in previous discussions. Perhaps using "you" was a poor word choice on my part, as I certainly did not mean "you" = WR, specifically. For that point of possible confusion, I am sorry; it was unintentional.

Next, I draw attention to the way I addressed the question of values, exhausting all three positions on variation of parameters, amounts, and structures of laws in the context of sensitivity analysis. And yes, I am treating cosmology as though it were a model and am asking, what happens if -- per mathematics -- things move about. The three cases with "distributions" of values are going to be locked to value seen in all possible and relevant worlds, varying flatly without constraint, an intermediate which is not quite fixed and not quite freely variable, i.e. a somehow preferred range but room to move about. This exhausts the possibilities, which was the intent.

It turns out that on locking, then we face something that forces a wide range of quite disparate things to hold their values as we see them in any possible, relevant world. Such a law of force would obviously be itself fine tuned.

For the second consideration, moving about freely in whatever configuration space is relevant, we readily see the local resonance peak and deeply isolated operating point put on the table since 1953. (This is the one being targeted by discussions about how you have to know the probability distribution, i.e. we see a rejection of the Bernoulli-Laplace principle of indifference as the default in probability analysis in the absence of specific reason to assign bias. I suspect this is actually selectively hyperskeptical, but am not yet defending that view.)

The third view is the most esoteric, and probably requires some familiarity with Gibbsian approaches to statistical mechanics to see its force. The approach is to take, as a thought exercise, a large ensemble of systems with similar components and starting conditions, then allow them to run for long enough, where of course randomness is a significant component of what is happening. The result is that the phase space will be fully explored across the ensemble as time goes on. If states are at all possible, they will be actual in some system at some point. (Just think: randomness is always disturbing trajectories, and in the classical case sensitive dependence on initial or intervening circumstances will cause divergence across time between initially similar systems, until after a time they are radically different. Eventually, with enough systems and enough time, every possible microstate will be actualised at some point.) In this case, we are in effect doing a Monte Carlo run across the space of configs of the math of the cosmos, and thus the laws, parameters and values. Bias, but not locking, obtains, so even though it is hard to reach certain possible states, eventually they WILL be reached. So the issue becomes, how big a collection and how much time to run to search out the relevant zone. The end result is that a biased distribution will only make it harder to explore the space fully; it will not block it, otherwise we are looking at some form of locking.

On surveying the three cases, the result is obvious. The sharp resonance that marks the laws and parameters we see will still be there. In the locked-parameters case, this is simply displaced to the next level -- what locks the lock, so to speak. In the other two cases, the exploration will happen and the result will be to expose the sharp resonance.

Coming back down to physical worlds: if the laws of physics, constants, values of quantities and so forth are locked to what we see, then there is a displacement of fine tuning to the locking force or mechanism, if you will. Something is setting up the cosmos bakery to consistently deliver well baked loaves of bread. In the other cases, the sharp resonance is evident right there. So, if a multiverse exists, we need to ask, why are we at an operating point on such a sharp resonance? (By statistical weight of clusters, we should be anywhere but here. This invites the inference that there is intentional fine tuning that put us here. Leslie's lone fly on a patch of wall, swatted by a bullet.) Fine tuning is there; it is not going away. KF

kairosfocus
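The Gibbsian ensemble point in the comment above, that randomly perturbed systems eventually occupy every accessible state, is the standard coverage behaviour of random walks on a finite state space. A minimal sketch over an arbitrary toy space (the state count, ensemble size, and step count are all made-up numbers chosen so coverage is essentially certain):

```python
import random

def ensemble_coverage(n_states=20, n_systems=50, steps=2_000):
    """Walk an ensemble of systems randomly on a ring of discrete states
    and return the set of states occupied by at least one system."""
    positions = [random.randrange(n_states) for _ in range(n_systems)]
    visited = set(positions)
    for _ in range(steps):
        for i in range(n_systems):
            # Random perturbation: each system steps left or right each tick.
            positions[i] = (positions[i] + random.choice((-1, 1))) % n_states
            visited.add(positions[i])
    return visited

# With enough systems and enough steps, every accessible state is occupied
# at some point (with overwhelming probability for these toy numbers).
print(len(ensemble_coverage()))  # 20, i.e. full coverage of the toy space
```

The sketch shows only the coverage property; it says nothing about how long a vastly larger space would take to traverse, which is where the "search resources" question in the comment comes in.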
December 21, 2016 at 06:34 PM PDT
KF,
Pardon but did you observe that a few times now, I took time to look at two extrema and a spectrum between, then showed that none of the three cases suffices to remove the fine tuning issue?
Yes, I observed that you have claimed such*, but I wonder why you felt it necessary to include language such as the following in a post addressed to WR:
Once you have done that, all the challenges of comparative difficulties analysis are on the table. And you cannot appeal to the holy lab coat to lock out unwelcome major worldview alternatives. On pain of grand ideologically motivated question-begging. Where, very rapidly, such evolutionary materialism finds itself unable to account for the minds required to actually reason as opposed to compute. Not to mention, adherents face extraordinary problems accounting for creating FSCO/I rich computational substrates and soundly programming them. Evolutionary materialism is self referentially absurd. It is not a serious option, though it is a common one and it is artificially propped up ideologically, by those who hope to gain from its widespread adoption. We come full circle, without such materialism being privileged [and with it under a cloud of self falsification], what best explains precise coordination, coupling and organisation of many elements forming a coherently functional whole?
It seemed a bit condescending to me, given his position and background. But that's none of my business, I suppose. *This statement (following on yesterday's discussion) piques my interest:
Namely, if the cluster of parameters, quantities and laws are locked by some super-force, that force will be fine tuned. The problem is simply displaced one level.
It seems you are positing a collection of more than one "superforce" here, which is roughly as questionable in my mind as positing that the fundamental physical constants are sampled randomly from some interval (of positive length) of real numbers. Why does there have to be more than one superforce (or any, for that matter) in order that the physical constants can each take only one value?
daveS
December 21, 2016, 05:27 PM PDT
There are many things in real life whose probability can’t be calculated. For example, how do you calculate the odds of a coincidence? There is a well-documented story from 1864 of a Booth saving a Lincoln before a Booth shot a Lincoln. A man by the name of Edwin Booth saved the life of a younger man named Robert Lincoln who had just fallen off a station platform next to a moving train in Jersey City, NJ. But the coincidence goes beyond their last names. Edwin Booth was the brother of John Wilkes Booth, who assassinated President Abraham Lincoln, while Robert Lincoln was the President’s oldest son. What are the odds of something like that happening? How would you ever begin to calculate the probability of something like that? http://www.historynet.com/edwin-booth Another coincidence: Thomas Jefferson and John Adams both died on July 4, 1826, exactly 50 years after the signing of the Declaration of Independence. Again, how would you ever begin to calculate the odds of something like that happening? People sometimes describe things like this as “just coincidence.” And the cases that I cited above may be just that. (My apologies to any Calvinists out there.) However, what do we say when we’re confronted with a string of coincidences? Suppose, for example, one day while you are out driving you notice a dark SUV carrying two men dressed in dark suits and wearing dark sunglasses. A few days later you see them again, then again a couple of days after that. This continues for several weeks, even after you deliberately decide to drive a route different from your normal one. Would you say that this was just coincidence, or that you were being purposely followed? What would the basis of your inference be? A rigorous calculation of the probabilities? Or an intuition? The point is that when confronted with a string of coincidences we naturally begin to suspect after a while that maybe this is not just coincidence.
I think that this is what is happening when we’re confronted with a string of finely tuned cosmic coincidences. Is it all just coincidence, or is there some other explanation? Ironically, both the theist and the atheist have the intuition that there must be some other explanation. It is just that they have different explanations. However, this leaves the atheist in the awkward situation that he has to believe in something he has no evidence for -- the multiverse -- by faith. But don’t atheists believe that faith is irrational? How ironic.
john_a_designer
December 21, 2016, 05:18 PM PDT
Certainly many of the fine-tuning arguments are false: either because the universal constants cannot have any value other than the one they have (the universe can only be or not be, not be tuned), or because the value they have could be tuned, but the one they have is a logical function of mathematics and so is the most likely to begin with.
mohammadnursyamsu
December 21, 2016, 04:51 PM PDT
This semi-related piece just hit my facebook feed:
How a Defense of Christianity Revolutionized Brain Science - JORDANA CEPELEWICZ ON DEC 20, 2016 Excerpt: in 1748, philosopher David Hume published 'An Enquiry Concerning Human Understanding', calling into question, among other things, the existence of miracles. According to Hume, the probability of people inaccurately claiming that they’d seen Jesus’ resurrection far outweighed the probability that the event had occurred in the first place. This did not sit well with the reverend. Inspired to prove Hume wrong, Bayes tried to quantify the probability of an event.,,, “The basic probabilistic point” of Price’s article, says statistician and historian Stephen Stigler, “was that Hume underestimated the impact of there being a number of independent witnesses to a miracle, and that Bayes’ results showed how the multiplication of even fallible evidence could overwhelm the great improbability of an event and establish it as fact.” The statistics that grew out of Bayes and Price’s work became powerful enough to account for wide ranges of uncertainties. In medicine, Bayes’ theorem helps measure the relationship between diseases and possible causes. In battle, it narrows the field to locate an enemy’s position. In information theory, it can be applied to decrypt messages. And in the brain, it helps make sense of sensory input processes. http://nautil.us/blog/how-a-defense-of-christianity-revolutionized-brain-science
bornagain77
December 21, 2016, 02:07 PM PDT
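The Bayes/Price point quoted above -- independent fallible witnesses multiplying the odds until a very improbable event becomes credible -- is easy to make concrete with the odds form of Bayes' theorem. The prior and the 99% witness-reliability figure are invented for illustration, and the calculation assumes the witnesses are independent:

```python
def posterior(prior, n_witnesses, reliability=0.99):
    """Posterior probability of an event attested by n independent
    witnesses, each of whom reports truly with probability `reliability`.

    Odds form of Bayes' theorem: each independent witness multiplies
    the prior odds by the likelihood ratio r / (1 - r).
    """
    likelihood_ratio = (reliability / (1 - reliability)) ** n_witnesses
    odds = (prior / (1 - prior)) * likelihood_ratio
    return odds / (1 + odds)

# A one-in-a-million prior is barely moved by a single witness,
# but a handful of independent witnesses overwhelm it.
print(posterior(1e-6, 1))
print(posterior(1e-6, 5))
```

This is Stigler's summary in miniature: the multiplication of even fallible evidence can overwhelm a great prior improbability, provided the witnesses really are independent.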
DS, Pardon, but did you observe that a few times now, I took time to look at two extrema and a spectrum between, then showed that none of the three cases suffices to remove the fine tuning issue? Namely, if the cluster of parameters, quantities and laws is locked by some super-force, that force will be fine tuned. The problem is simply displaced one level. Next, considering a reasonable neighbourhood of the observed operating point for the mathematical framework, we can use sensitivity analysis based on a flat random -- maximal freedom to move -- model, which immediately reveals the narrow resonance. That is, this highlights the fine tuning. Then, if we look at the model's configuration space and apply some bias that is intermediate between the two cases so far in a Monte Carlo style sensitivity analysis, so long as we have a large range of relevant cells in the space being possible, a sufficiently large ensemble of cases with sufficient time to develop will explore all possibilities, the issue being how big a collection and how long. But this case has also failed to escape fine tuning; it only tells us how big a search is needed to do such an exploration. (This is utterly unsurprising on longstanding stat mech results.) So, the side debates on specifying probability distribution functions are strictly irrelevant to the point at stake. Like many such debates it may indeed pull the argument into a side-point, but it does not actually remove the fine tuning challenge from the table. So, having first shown the irrelevancy, it is appropriate to draw attention back to the focal issue. KF
kairosfocus
December 21, 2016, 01:47 PM PDT
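KF's first case above -- a flat-random ("maximal freedom") sensitivity analysis over a neighbourhood of the observed operating point -- can be sketched in a few lines. The interval and the width of the "viable" band are arbitrary stand-ins, not physical values:

```python
import random

def viable_fraction(low, high, band_lo, band_hi, trials=100_000, seed=0):
    """Sample a parameter uniformly over [low, high] and return the
    fraction of samples landing inside the narrow band [band_lo, band_hi].

    Under a flat model this fraction is just the band width divided by
    the interval width, up to sampling noise.
    """
    random.seed(seed)
    hits = sum(band_lo <= random.uniform(low, high) <= band_hi
               for _ in range(trials))
    return hits / trials

# A band one part in a thousand wide is hit roughly 0.1% of the time.
print(viable_fraction(0.0, 1.0, 0.500, 0.501))
```

Of course, this is exactly the step WR questions: the flat model is an assumption, and the sketch only quantifies what follows *if* that assumption is granted.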
In my opinion, using fine tuning as a teleological argument works just fine based merely on the precision of the fundamental constants alone, even if we can’t rigorously derive any of their probabilities. John Leslie’s firing squad parable, where 50 or more trained marksmen all miss a man who has been condemned to die, is a good illustration. Leslie, who I don’t believe is a theist, used his parable to critique the weak anthropic principle, which says, in a question-begging way, that we shouldn’t be surprised to find ourselves existing in this universe because if it wasn’t fine-tuned we wouldn’t be here. If you were the person who survived the firing squad, would you conclude it was chance or luck, or some kind of conspiracy? What’s the best explanation? You would certainly have good reason to be surprised, wouldn’t you?
john_a_designer
December 21, 2016, 11:19 AM PDT
john_a_designer and daveS, I've apologized for my part in instigating his ad hominem towards me and moved on. Moreover, I have since then earnestly tried to see if his argument has any real merit to it, i.e. read his post from top to bottom, watched the back and forth here, mulled it over, and I can still find no real merit in his argument. He simply has offered no compelling reason, scientific or philosophical or otherwise, why anyone should not be 'surprised' by fine tuning. And like everyone else here, I can only offer my own opinion. But that is my opinion of his argument, for what it's worth. I simply, all personal issues aside, find his argument to be without any real merit.
bornagain77
December 21, 2016, 11:07 AM PDT
JAD, Yes, I'm sure he can. And to be explicit, you are one of those I referred to above who I believe is giving WR a fair hearing and even recognizing the points he makes.
daveS
December 21, 2016, 10:18 AM PDT
DaveS, I think Wayne can handle the back and forth. Go back and read what I said @ 59. The direct link is here: https://uncommondescent.com/fine-tuning/biology-prof-how-can-we-really-know-if-the-universe-is-fine-tuned/#comment-622376 And then his response at 61. Also read some of the legitimate concerns he raises @ 60.
john_a_designer
December 21, 2016, 09:36 AM PDT
KF and WR, Pardon my jumping in here, but I find this whole encounter quite baffling. AFAICS, WR has stated that unless someone can give pdfs for the fundamental physical constants [or perhaps supply some other justification], we should not make arguments based on the probability of those constants lying in a certain region. And for that he's being given a remedial course in ID. It appears to me that he simply wants to maintain a high level of rigor, and therefore should be praised (as some here have) rather than condescended to.
daveS
December 21, 2016, 07:52 AM PDT
JAD
Obviously we don’t know anything at all about other universe, indeed, we don’t know another universe exist or existed– let alone “10^500 possible universes.
Yes, this is the point which I haven't gotten wrossiter to acknowledge yet. You can't have it both ways. One cannot forbid speculations about the probable origin of physical constants while at the same time allowing speculations on the proposed count of additional (imaginary) universes that supposedly exist. On WR's standard (which I agree with, strictly speaking), all scientific talk of a multiverse must be silenced. We only know of one universe. Our data set is a population of one. That's it. But are we willing to never engage those conversations, even on an "even if your imaginary speculation made sense" basis? I agree with WR's rigor on the topic, but he's not being consistent. If we adopted his view, all talk of a multiverse from the ID perspective would be totally dead. It is supported by zero scientific evidence.
Silver Asiatic
December 21, 2016, 07:16 AM PDT
In all sincerity, a really good piece, kf. Just a little light-hearted comment on the following: “The 90 bn LY wide, 13.8 bn year cosmos testifies to the stability of the system in the long term and across an extraordinarily large span.” _______________________________________________________ Is space really “90 bn LY wide,” or is that version from the flat cosmos big bang society whereby we fall off the edge of space if we go any further? Seriously though, is not perceived design over billions of years just what you would expect if God created a stable mature universe in six days? According to any first impression in reading Judaeo-Christian scripture, God testifies he created in six days (Exod 31:18), words that God said are unalterable, and given with a warning (Deut 4:1-3). Today, we have gay people coming out of the closet seemingly replaced by those who believe God created by design in six days; words written in stone and placed in the ark of His Testimony, designed by God according to His plans. The mercy-seat (the lid on top of the ark) designed by God (Exod 25:17) had two golden images of cherubs (Exod 25:18). From above the mercy-seat God would speak personally to Moses in the Holy of Holies. Further, God even provided spirit-filled craftsmen “with ability, intelligence and knowledge,” including for the clothing of the high priest and furniture of the tabernacle (Exod 31:1-11). So much for the nonsense of theistic Darwinism and blind design in imperceptible steps. You cannot get just-when-needed spirit-filled craftsmen by theoretical common descent. Besides, Darwin rejected Jesus as the Son of God, and rejected miracles. Satan must have been rubbing his hands. However, God reminded Moses: “And see that you make them according to the pattern for them, which is being shown you on the mountain.” (Exod 25:40) Later, God provided the plans for the building of the first stone temple: “for the altar of incense made of refined gold, and its weight; also his plan for the golden chariot of the cherubim that spread their wings and covered the ark of the covenant of the LORD. ‘All this, in writing at the LORD’s direction, he made clear to me—the plan of all the works.’” (1 Chr 28:18-19 and 11:12) Of note: the designs given by Yahweh included a god disc, “a rosette of pure gold” (Exod 28:36) with four holes in it through which the disc was held by blue cord around the head-piece on the forehead of the high priest, with the words, “Holiness to Yahweh.” Strict were the rules God gave for carrying His holy words in stone. However, consider the six stone jars of water at Cana; the water instantly turned into mature wine. No test could prove it was not created instantly other than believing Jesus/God. According to untouchable divine law we could say we live in the matured wine of the cosmos. We have no means to decide how the cosmos arose because we cannot get outside the cosmos, nor have we the power to produce a test cosmos to verify theory or not. Ultimately, only a true testimony will suffice: a gold standard of truth. Tuned will be our judgement to the word of God from Sinai. Today, it seems God’s word needs fine tuning to get his word in line with the Big Bang Theory. Hence, six days really means approx 365 x 13.8 billion years. Or, am I the one in need of fine tuning? As for design, the ark God designed was placed in the first permanent temple God designed. Yahweh also gifted king Solomon, who had the temple built, with the greatest wisdom. Are we to say that with all the meticulous designs from God, over which the glory of God had spoken to Moses, built with spirit-filled craftsmen, that God had placed two wrong words in the Holy of Holies in the first stone temple? Seriously flawed the numbers six and seven, blemished and disfigured words in the Holy of Holies?
Does that sound like a pattern of sound words? Or, perhaps, the result of a powerful beguiling theory the God of this world has taken to his dark heart? Are we to say that the God of numbers, who can number the hairs on our heads, cannot number the days he took to create, or number above our heads how long he took to create the cosmos? I mean, to create a cosmos in six days is worthy of worship. That God took ages, and ages, and ages and ages seems very odd when he can create life instantly from a rock or stone. How long for a planetary rock: same time, surely? True divine given knowledge testifies God created the cosmos in six days. If we are truly honest, no human can check or test truthfully a divine law. “I did not speak in secret, in a land of darkness; I did not say to the offspring of Jacob, ‘Seek me in chaos.’ I the LORD speak the truth, I declare what is right.” (Isa 45:19) Ah well, back to the closet. Happy Christmas.
mw
December 21, 2016, 07:07 AM PDT
Excellent work, KF. Thank you.
Truth Will Set You Free
December 21, 2016, 12:13 AM PDT
WR, pardon, but I find it necessary to ask: have you ever designed and built anything that requires fine precision and tight coupling of multiple parts, say something mechanical with parts working together to 1 thou (of an inch), or an electronic circuit with a 1 - 10 parts per million crystal controlled oscillator that controls some process? Perhaps even a bit of carpentry that requires precision to fit and function. There is a world of experience of such, and the unity of purpose and mutual fit required for organisation and coupling based function to emerge is itself directly a strong sign of design. This is driven by a longstanding sense that multiple coincidences resulting in something of significantly precise fit, tight coupling and organisation to effect a function are not credibly driven by blind chance and/or equally blind mechanical necessity. Yes, I know, I know, in biology we have long been indoctrinated to believe such things come about by the magic of natural selection [as the inadvertently telling summary is put]. But in fact there is not ONE instance of actually observed emergence of functionally specific complex organisation and its associated information by known blind forces. There are trillions of cases observed by intelligently directed creative configuration. In this context, per the von Neumann kinematic self replicator, we know that the mechanism of reproduction is also a case of FSCO/I, starting with cellular self replication. So, the appeal to filtered chance variation is not actually credible for life, apart from an a priori imposition contrary to Newton's vera causa principle, that we should explain traces of things we have not seen the cause of only on causes actually observed to produce the like effect. Now, I assume you are familiar with Sir Fred Hoyle, holder of a Nobel-equivalent prize for his astrophysics.
It is he who led the process of identifying fine tuning in the physics and arrangements of the observed cosmos, its substance and underlying laws and parameters. It turns out that something remarkable has happened with the mathematics, that there is extraordinary coordination, precision and specificity of the set of key factors, and this not in a context that is anywhere near to the imagined magic of chance variation and differential reproductive success leading to equally imagined grand changes of body plan by incremental process. And yet, this is connected to biology, as it turns out the extraordinarily precise sensitivity locked into the mathematics is keyed to a cosmos in which C-chemistry, aqueous medium, cell based life is established. For me, just the result of the first four most abundant elements and the extraordinary properties locked into such, is enough to give me pause, as it did Sir Fred. H, gives us everything starting with stars. He, gets us to the rest of the table of elements. C and O in close balance dependent on resonances tied to Be is then extraordinary: water, the oxides at the core of terrestrial planet crusts, ozone shields, organic chemistry based on C as connector block element, water with its astonishing simplicity as an individual molecule and sophistication of function through a sort of polymerisation tied to its polar molecule, giving solvent and thermal properties etc. Add in N which is close by, and we have proteins. Remember, these are the four most common elements here. Fine tuning as a necessary and enabling condition of the sort of biological, C-chemistry, aqueous medium, cell based life we observe. Strong signs of intent. Now, I had to point out several times, that sensitivity analysis is a standard procedure in dealing with design or modelling, or systems. And this is what we have in analysing the physics that frames a cosmos in which life like we enjoy is possible. 
The 90 bn LY wide, 13.8 bn year cosmos testifies to the stability of the system in the long term and across an extraordinarily large span. Is there a stabilising force that locks these parameters and laws etc together in any possible world? If so, the fine tuning force is itself extraordinarily fine tuned, and this raises issues of design directly. (A theory of everything will NOT succeed in explaining away a cosmos; it only points to the sustaining power of the force that backs such a frame of laws, were it to be discovered. Not that a lot of nonsensical rhetoric would not be launched were such discovered. There are none so blind as those who are determined not to see.) Perhaps, then, it is variable instead, per branes and the like, with an extraordinary proposal of 10^500 sub-cosmi or some other ensemble . . . a standard move of statistical mechanics, BTW, is to analyse on a theoretical model of a large collection with closely similar initial conditions and independent unfolding. Whether there is maximal uncertainty in such an array -- thus, for all we know, flat random distributions of walks in phase space -- or else there is some bias that constrains the possibilities to some extent intermediate between fully free and fully locked, makes but little difference. In effect, a fundamentally random system with enough time and opportunity will walk throughout its phase space. A wide enough ensemble will therefore sample the full gamut of possibilities. Problem no. 1 is, we don't have quasi-infinite time and resources warranted by empirical evidence; we have about 10^80 atoms and 10^17 s. With organic chemistry that might get us up to 10^12 - 14 reactions per second. The underlying physics is seen to be extraordinarily sensitive in the abstract space of parameters. If we are looking at a brane or the like, we should not be here, at a tight, tight resonance as operating point.
We should have any one of a number of far easier to find points, the Boltzmann brain world being the most commonly discussed. That is, I point to the relative statistical weight of clusters of states. In this sense, Craig is well warranted to speak of improbability. The reason why we see certain standard thermodynamic properties and patterns is not that far different cases are impossible, but because accessibility of possible states is multiplied by utterly overwhelmingly dominant clusters of more or less neighbouring configurations. This drives us to the case of powerful stabilisation, to the point where in most cases fluctuations are simply below observability. Hence Craig's comments on Boltzmann brain worlds and the like. No, Craig has not made an embarrassing mis-step; he has alluded to a subtle but powerful pattern in large spaces of possibilities: utterly dominant clusters. In that higher order sense of probability, he is right to say it is utterly improbable to see us in this sort of cosmos, as opposed to clusters of possibilities in the abstract that would carry utterly overwhelming statistical weight. With blind forces and circumstances as the intended explanation, one is in fact constrained by such, and appeal to bare possibility becomes utterly incredible beyond a certain degree. Just the distribution of 1,000 coins tossed makes the point. Utterly overwhelmingly, you will find yourself by chance in the states close to 50-50, in no particular readily recognisable organised pattern -- unlike patterns that can be simply, independently described, such as "alternating H and T." In such a context, seeing such a pattern is thus sufficient grounds to infer design on a sign that is highly reliable. But we are not there in the utterly dominant macrostates for observers; we are here at an utterly isolated narrow resonance as operating point for our cosmos. What, on our experience, explains such a phenomenon, apart from expert craftsmanship? Ans (as a rule): silence and diversion.
Compound this by looking at the presence of CODE in the heart of cell based life, i.e. LANGUAGE and ALPHABETIC symbol systems, tied to ALGORITHMS and implementation engines using molecular nanotech. What best explains symbols, language, algorithms and implementation engines? What is the empirically warranted explanation of same? What happens if we transform the cosmos into a grand ensemble, giving each atom a tray of 1,000 coins or equivalent (say, a paramagnetic substance with a weak B field to impose directional order) and toss and observe 10^12 - 14 times/s for 10^17 s? Ans: we can only sample so small a portion of the space of possibilities that if we were to compare the effective zone of search to a needle, the haystack of possibilities would more than swallow up the observed cosmos. Notice, again: not probability, but search challenge in light of an extraordinary degree of functionally complex organisation, pointing to islands of function in a much larger space of possible configurations, leading to a needle in vast haystack search challenge. Fine tuning, again. With LANGUAGE, CODES, ALPHABETS and ALGORITHMS in play. Again, a strong sign pointing to design, in the core of cell based life and causally antecedent to there being cellular self replication on genetic information. Where, the chemistry used for this is rooted in the fine tuned cosmos. Mutually reinforcing inferences from several widely different sciences, exponentiating the explanatory challenge to come up with a serious alternative to design. Then, look at ourselves, as needing to be responsibly and rationally free to undertake such a study on logic. Such freedom of mind cannot be explained on GIGO-limited computational substrates, not digital ones, nor analogue ones, nor neural network ones. Computation is inherently blindly mechanical, not freely rational; it depends on the prior sound organisation of a programmer or designer to work.
And blind chance and/or mechanical necessity, in the face of the relevant functionally specific complex organisation and associated information (usually abbreviated hereabouts as FSCO/I), is not a credible explanation. Further convergent evidence. Now, this is not at all a demonstrative proof compelling agreement of all rational individuals. Not even Mathematics, on the whole, post Godel, can achieve that. Instead, we have warrant on evidence-led inference to the best current explanation, multiplied by the associated consequences of either global or selective hyperskepticism. And notice, we have not drawn on any precise probability models or frameworks to this point. We have simply pointed out that once there is a system that is subject to sensitivity analysis, we face lock-down to a given config or else some degree of freedom to wander in the relevant config space. To wander across the whole space, simply multiply the number of possible cases in a population of tests; once such becomes quasi-infinite, the all but impossible is going to be there in some particular case. That is certainly one reason why multiverse models are appealed to in answer to the discovery of fine tuning. But then it poses a cruel dilemma: abandon probability based reasoning as a block, or stand indicted as playing at selective hyperskepticism to dismiss what one does not wish to face in the only case we actually do observe. As in, we have now crossed over into philosophical speculation, as there is no actual observed basis for a quasi-infinite array of possible worlds. In that speculation, we face the point that by overwhelming odds we should observe a world with parameters that are anything but as we observe. That is, on a multiverse speculation, we are in an extraordinarily anomalous situation. We should not be seeing the sort of resonance point world we seem to be seeing. But there it is, all around us. So, we have a choice that speaks volumes on the inclinations of our hearts.
Especially in a world where -- we are now in phil, and have been for some time, since we are looking at unobserved multiverses etc -- we find ourselves subject to moral government and underlying principles of natural law. That, too, must be explained. But coming back full circle, the issue of possible different degrees of likelihood of different configurations makes but little difference in a quasi-infinite ensemble. Save, to tell us just how big it needs to be to make something in principle observable. That is where things fall apart for the materialist and those who travel with or unduly accommodate them: we are looking at vast realms of the unobserved, and so the materialist crosses over into philosophy unrecognised. Once you have done that, all the challenges of comparative difficulties analysis are on the table. And you cannot appeal to the holy lab coat to lock out unwelcome major worldview alternatives, on pain of grand ideologically motivated question-begging. Where, very rapidly, such evolutionary materialism finds itself unable to account for the minds required to actually reason as opposed to compute. Not to mention, adherents face extraordinary problems accounting for creating FSCO/I rich computational substrates and soundly programming them. Evolutionary materialism is self-referentially absurd. It is not a serious option, though it is a common one, and it is artificially propped up ideologically by those who hope to gain from its widespread adoption. We come full circle: without such materialism being privileged [and with it under a cloud of self-falsification], what best explains precise coordination, coupling and organisation of many elements forming a coherently functional whole? Ans: intelligently directed configuration, aka design. KF
kairosfocus
December 20, 2016, 11:06 PM PDT
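KF's 1,000-coin illustration above can be checked directly with exact binomial counts. The ±50 window taken to represent "near 50-50" is an arbitrary choice:

```python
from math import comb

N = 1000
total = 2 ** N  # number of distinct H/T sequences

# Statistical weight of the "near 50-50" macrostate: outcomes with
# between 450 and 550 heads inclusive.
near_even = sum(comb(N, k) for k in range(450, 551))

# Any one exactly specified sequence (e.g. strict alternation
# H,T,H,T,...) is a single microstate out of 2^1000.
print(near_even / total)  # overwhelmingly close to 1
print(1 / total)          # vanishingly small
```

The contrast between an utterly dominant cluster of states and a single specified configuration is the statistical-weight point being made; by itself it says nothing about which distribution governs the cosmos, which is the contested step in the thread.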
Origenes @ 97 quotes William Lane Craig:
“even though there may be a huge number of possible universes lying within the life-permitting region of the cosmic landscape, nevertheless that life-permitting region will be unfathomably tiny compared to the entire landscape, so that the existence of a life-permitting universe is fantastically improbable. Indeed, given the number of constants that require fine-tuning, it is far from clear that 10^500 possible universes is enough to guarantee that even one life-permitting world will appear by chance in the landscape!”
I agree with Wayne when he says that Craig overreaches when he tries to apply a probabilistic argument to the multiverse (“the existence of a life-permitting universe is fantastically improbable”). Obviously we don’t know anything at all about other universes; indeed, we don’t know that another universe exists or existed -- let alone “10^500 possible universes.” Craig commits what I call the “stepping in it” error. I learned about this error when I was growing up. As kids we liked to take walks in the fields of my uncle’s dairy farm. However, he warned us up front, “Don’t step in it.” (In case you’re wondering what it is, it rhymes with it.) On the other hand, atheistic naturalists and materialists are compelled to accept the idea of a multiverse because they apparently believe that our universe’s fine-tuning is a result of chance. If it is a result of “chance,” it is their responsibility to derive probabilities for each of the cosmological constants, is it not? As I have written before,
“one of the strongest arguments in favor of teleology (design or purpose) is the overwhelming evidence for what is commonly termed the fine-tuning of the universe. Theists like myself argue that an intelligent Creator (God) is the ultimate explanation behind this apparent teleology. Ironically, even some atheists are willing to concede that God is a possible explanation for the universe’s apparent fine-tuning. For example, in 2007, while making observations at the Keck observatory in Hawaii, Sandra Faber, a professor of astronomy at the University of California, Santa Cruz, told science writer Anil Ananthaswamy “that there were only two possible explanations for fine-tuning. ‘One is that there is a God and that God made it that way…’ But for Faber, an atheist, divine intervention is not the answer. “The only other approach that makes any sense is to argue that there really is an infinite, or a very big, ensemble of universes out there and we are in one,” she said. This ensemble would be the multiverse. In a multiverse, the laws of physics and the values of physical parameters like dark energy would be different in each universe, each the outcome of some random pull on the cosmic slot machine. We just happened to luck into a universe that is conducive to life. After all, if our corner of the multiverse were hostile to life, Faber and I wouldn’t be around to ponder these questions under the stars.” Other atheists agree that God counts as a rational explanation. In a debate with Christian philosopher William Lane Craig, California Institute of Technology physicist Sean Carroll said, “I’m very happy to admit right off the bat – [that God fine-tuning the universe] is the best argument that the theists have when it comes to cosmology.” However, Carroll then deftly takes away with the left hand what he had just offered with his right. “I am by no means convinced that there is a fine-tuning problem,” he told Craig. Oh? Is Carroll speaking for everyone?
Is an airy wave of the hand all that is needed to dismiss fine-tuning as a problem? Other prominent physicists and astrophysicists would disagree, among them Sir Martin Rees, Paul Davies, Roger Penrose, Stephen Hawking, Max Tegmark, Andrei Linde and Alexander Vilenkin, to name a few. All these men, as far as I know, reject traditional theism. Nevertheless, they see fine-tuning as a real problem in need of an explanation.
https://uncommondescent.com/intelligent-design/scientists-driven-to-teleological-view-of-the-cosmos/#comment-622191 Why is it a problem? Because they believe “chance” is the explanation for the universe’s fine-tuning. But chance just can’t start from nothing, so they have to dream up a way to kick the can down the road-- forever, if possible. Unfortunately, at least for the present, chance as they are using it is not a scientific explanation but a metaphysical one.
john_a_designer
December 20, 2016 at 09:52 PM PDT
