Uncommon Descent Serving The Intelligent Design Community

Biology prof: How can we really know if the universe is fine-tuned?


From Waynesburg U biology prof Wayne Rossiter, author of Shadow of Oz: Theistic Evolution and the Absent God, a question about claims for fine tuning of the universe:

My major concern with arguments from fine-tuning in cosmology is: how do we really get from observations of precision to statements of probability? To say that something is precise is not to say that it is improbable. Those are two different things.

As a third quick analogy, if we studied the fall patterns of icicles from the roof of my home, we might find that their placement is incredibly precise. Given the vast surface area a given icicle could fall on (my yard, the road, my neighbor’s yard, etc.), the fact that they consistently fall within a very narrow area directly below the edge of the roof (they more or less fall straight down) seems absurdly precise. Absurdly precise, if it was logical to entertain the possibility of icicles falling in ways other than straight down. But the presence of gravity and the lack of strong winds make this highly precise phenomenon highly probable. Said plainly, it would be absurd to treat the falling of an icicle straight down and the falling of it laterally into my neighbor’s yard as equally likely.

But, I think that’s the sort of assumption being made in the argument from cosmological fine-tuning. To say that such-and-such a physical parameter rests upon a razor’s edge does tell us something. It tells us that any small change in the setting of that parameter would lead to a universe drastically different from the one we live in, and likely one that could never even produce material objects (let alone life) as we understand it. Fair enough. I agree. What it doesn’t tell us is how likely any of those other settings are. More.

Thoughts?

See also: Copernicus, you are not going to believe who is using your name. Or how.

Follow UD News at Twitter!

Comments
'But the fact remains that we have no idea whether or not (or how much) a parameter could vary.' Just before writing that, you claimed that you were NOT arguing from ignorance. What on earth does 'We have no idea...' suggest to you, if not total ignorance? And to state your cluelessness in the very next sentence!!!! For crying out loud. Axel
mw
It is the lid of the ark with two cherubs; from above it God spoke, and face to face with Moses: and in plain speech.
Yes, as given by divine command the creation of the cherubs reflected the heavenly reality - thus the existence of angels. We don't know (as cited above regarding the Star of Bethlehem) to what extent the angels have been involved in creation, in the designs we observe. That is why ID cannot identify the intelligent designer directly. It may be that angels actually implemented designs. Regarding the philistines, it's interesting also that they were instructed to create golden mice to offer as a sacrifice. And that non-Jewish sacrifices had a reparative effect. Images of animals had a sacred function in that regard. Silver Asiatic
Silver Asiatic, @ 160. "Or perhaps it’s better to think of ID as just a core principle – a simple argument. Then that argument can be adopted and used by different beliefs." I like that viewpoint. Yes, today is the feast of St. John the Evangelist. At Patmos, in a vision he saw the ark of the covenant: the Testimony of God. Today, most probably do not even know what or where on the ark the mercy-seat is. It is the lid of the ark with two cherubs; from above it God spoke, and face to face with Moses: and in plain speech. No one as such had to look into the ark. When the ark was captured by the Philistines, the stone statue of their god Dagon lost its head, legs and arms. They did look in the ark, but eventually sent it back in fear as mice, haemorrhoids and tumours coincidentally came on them. https://en.m.wikipedia.org/wiki/Philistine_captivity_of_the_Ark What's more, God had given clear instructions on how the ark was to be carried. To aid its return, David decided to put the ark on a cart. It slipped, and someone tried to prevent its fall. The man died for breaking an ordinance. Harsh? The lesson: the stone word of God, the only scripture God has ever written by the Holy Trinity through Yahweh, is the Holy of Holy scripture to be carried with utmost respect. I believe it is there in heaven and we will face its contents some day and hopefully the mercy of the Creator Saviour. Thank you for your comments. mw
F/N: Joe Carter at First Things some years back:
https://www.firstthings.com/blogs/firstthoughts/2010/10/fine-tuning-an-argument-and-a-universe At least two dozen demandingly exact physical constants must be in place for carbon-based life to exist (see list at end of post); the slightest variation in any of these conditions—even to a minuscule degree—would have rendered the universe unfit for the existence of any kind of life. “At least on the face of it, these so–called “anthropic coincidences” would appear to support the idea that we were built–in from the beginning,” says physicist Stephen Barr. “Even some former atheists and agnostics have seen in them impressive evidence of a divine plan.” Indeed, as I hope to show, anthropic coincidences can form the basis of one of the most sound teleological arguments:

The apparent fine-tuning of the universe is due to either physical necessity, chance, or design.
The apparent fine-tuning is not due to physical necessity or chance.
Therefore, it is due to design.

The first option, physical necessity, is the easiest to dismiss. The idea that it was physically impossible for the universe to have been created in any way other than in a manner that would support life is neither logically necessary nor scientifically plausible. As Barr notes, “In the final analysis one cannot escape from two very basic facts: the laws of nature did not have to be as they are; and the laws of nature had to be very special in form if life were to be possible.” Our options, therefore, are between chance (the anthropic coincidences truly are coincidences) or design (the parameters needed for life were purposely arranged). While it cannot be established with absolute certainty, we can, I believe, determine that design is the most probable explanation. There is little dispute that the probability of this series of “coincidences” occurring is infinitesimally small. Still, it is often argued that since we exist, the probability must be 1.
In their book, The Anthropic Cosmological Principle, John Barrow and Frank Tipler contend that we ought not be surprised at observing the universe to be as it is and that therefore no explanation of its fine-tuning is needed. In other words, we can only observe the need for fine-tuning in universes that support life. Surprisingly, this dubious argument is often used as if it were a silver bullet that destroys the fine-tuning argument. But philosopher John Leslie (as told by William Lane Craig) provides an illustration of why such reasoning is faulty: Suppose you are dragged before a firing squad of 100 trained marksmen, all of them with rifles aimed at your heart, to be executed. The command is given; you hear the deafening sound of the guns. And you observe that you are still alive, that all of the 100 marksmen missed! Now while it is true that

5. You should not be surprised that you do not observe that you are dead,

nonetheless it is equally true that

6. You should be surprised that you do observe that you are alive.

Since the firing squad’s missing you altogether is extremely improbable, the surprise expressed in (6) is wholly appropriate, though you are not surprised that you do not observe that you are dead, since if you were dead you could not observe it. Similarly, while we should not be surprised that we do not observe features of the universe which are incompatible with our existence, it is nevertheless true that

7. We should be surprised that we do observe features of the universe which are compatible with our existence, in view of the enormous improbability that the universe should possess such features.

Barr also provides a helpful analogy: Suppose you were looking for a specific obscure recipe for, say, goulash. If the first book you took at random from the cooking shelf of the library happened to have exactly that recipe, you would regard it as a great coincidence.
If you then discovered that the book contained every recipe for goulash ever invented, you would cease to regard it as coincidental that it had the one of particular interest to you. But you would be surprised nonetheless, for one does not expect a cookbook to treat that particular category of food so comprehensively. The fact that it happened to be so comprehensive in its selection of goulash, when it was goulash that you needed, would itself count as a remarkable coincidence. Another problem I find with this line of thinking is that it implies that the probability of a stochastically independent event is determined by the existence of an observer. For example, imagine a universe that is exactly like ours yet contains no carbon-based life forms. We could determine the factors required for such an existence and calculate the probability of such constants appearing as they do. The result, of course, would be an infinitesimally small probability. The implication made by opponents of fine-tuning, though, is that the probability suddenly becomes 1 by the mere addition of a human observer. Such a conclusion is exceedingly absurd. Most critics of fine-tuning have begun to recognize that this approach is insufficient. Faced with scientific evidence that undermines their agnostic assumptions, they turn to metaphysical speculation in the form of the “multiple universes” theory. There is a distinction, however, between multiple domains within one universe and multiple independent universes. As Barr explains: In the version that physicists take seriously, the many “universes” are not really distinct and separate universes at all, but domains or regions of one all–encompassing Universe. The domains are far apart in space, or otherwise prevented from communicating with each other. Conditions are assumed to be so different from one domain to another that they appear superficially to have different physical laws.
However, at a deeper level all the domains are really controlled by one and the same set of fundamental laws. These laws also control what types of domains the universe has, and how many of each type. The other version of the idea posits the existence of a large number of universes that really are universes, distinct and unconnected in any way with each other. Each has its own set of physical laws. There is no overarching physical system of which each is a part. One can understand why this version is not discussed among scientists. At least in the many–domains version all the domains are part of the same universe as we, so that, even if we cannot in practice observe them directly, we might hope at least to infer their existence theoretically from a deep understanding of the laws of nature. In the many–universes version, this is not the case. Briefly stated, the multiple-universe theory is the hypothesis that if the universe contains an exhaustively infinite number of universes—all of which actually exist—then anything that can occur with non-vanishing probability will occur somewhere. While it might be true that the probability that our universe could develop in a way that supports life is incredibly small, these critics claim that in an infinite series of universes even the improbable is likely to happen quite often. Such a move, however, commits the inverse gambler’s fallacy, which states that an improbable event can be made less improbable by the hypothesis that many similar events exist, and that the hypothesis is thence confirmed by the improbable event. Even if multiple independent universes do exist, though, it does not change the probability that our universe would turn out as it did.
Again, to use an illustration by John Leslie: There is no need for us to ask whether very great alterations in these affairs would have rendered it fully possible once more, let alone whether physical worlds conforming to very different laws could have been observer-permitting without being in any way fine tuned. Here it can be useful to think of a fly on a wall, surrounded by an empty region. A bullet hits the fly. Two explanations suggest themselves. Perhaps many bullets are hitting the wall or perhaps a marksman fired the bullet. There is no need to ask whether distant areas of the wall, or other quite different walls, are covered with flies so that more or less any bullet striking there would have hit one. The important point is that the local area contains just the one fly. Having reduced the chance hypothesis to a virtual impossibility we are left with the obvious conclusion that the fine-tuning is not only apparent but actual. While this fine-tuning does not imply that the existence of a tuner is absolutely certain, it certainly makes it more probable than not. Unless one starts with the assumption that the Fine Tuner cannot or must not exist, it seems more probable (at least as a Bayesian inference) that such a Being actually does exist. Of course it must be noted that the uses of such teleological arguments are not likely to persuade the unbeliever of the existence of God. As I have said many times before, the unbeliever suffers from a form of invincible ignorance. There are no metaphysical and illogical knots the agnostically inclined will not twist themselves into in order to avoid having to admit that the existence of God is more reasonable and probable than its alternative.
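As a rough numerical gloss on the firing-squad illustration above: the per-shot miss probability used here (1%) is an assumed figure for illustration only, not from the original, but it shows how quickly 100 independent misses become effectively impossible.

```python
# Firing-squad illustration with an ASSUMED miss rate: suppose each
# of 100 trained marksmen independently misses a point-blank shot
# with probability 0.01 (1%). The chance that every one misses is
# the product of the individual miss probabilities.
p_miss = 0.01
n_marksmen = 100
p_all_miss = p_miss ** n_marksmen  # on the order of 10^-200

print(f"{p_all_miss:.3e}")
```

Even with a far more generous miss rate per shooter, the joint probability collapses exponentially in the number of marksmen, which is why surviving the volley rightly demands an explanation beyond chance.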
Just to keep wheels turning. KF kairosfocus
DS, I spoke to Mathematics as the logic of structure and quantity, and in that context to sensitivity analysis of the system that models our cosmos, as with any other complex system model. Your remarks above come perilously close to endorsing rejection of that. And your stuff on hypothetically possible cosmoses is no comfort, given that you seem to want to suggest -- utterly implausibly -- that the key constants of the cosmos are necessary and obtain in all possible worlds. KF PS: I sympathise on the snow, and remind one and all that the Caribbean's beaches are nice, much nicer than ice. kairosfocus
mw
However, it would seem Dr Rossiter, as yet, does not include a divine law with the key to our origins. The birth days of the Cosmos, delivered complete. However, to my understanding, he has pointed out an inconsistency in theistic belief. God regularly guides and intervenes, while in between God creates the brainless natural selection for those theistic Christians who have had to bring on board Darwin to save them from believing the ‘errors’ of divine law.
Thanks for your fascinating insights and this one is no exception. If we agree that there is this inconsistency in theistic evolution (as I do), then this is also a problem for ID since there couldn't be any reference to divine law in that context. As I see it, there are different flavors of ID - a theistic version, Christian ID or a secular ID. Or perhaps it's better to think of ID as just a core principle - a simple argument. Then that argument can be adopted and used by different beliefs. But as you point out, belief in divine law has important consequences that cannot be overlooked. Today is the feast of St. John the Evangelist, who taught that the Logos became flesh. The principle of rationality, the Word, the Logos. Those who deny Christ reject the Logos, and thus embrace irrationality. Silver Asiatic
daveS #157. The spontaneous generation of numbers, even unto 'imaginary numbers' such as in quadratic equations, may be likened to the spontaneous generation of the first life form, the spontaneous generation of the cosmos, and the spontaneous generations of the multiverse. It's like a spell, a formula/chant that works every time in the imagination. Happy New Year. https://www.mathsisfun.com/numbers/imaginary-numbers.html mw
KF, I don't understand what you are saying in the first two sentences of #150. I do think that mathematics would work the same in any hypothetical physically possible universe. I have a driveway full of snow that I need to deal with at the moment, so don't have much else to add now. daveS
mw@149, I don't think that numbers ever began to exist or evolve, although in what sense they do exist is a difficult question. I believe we can discuss them as if they have an objective existence of their own without running into too much trouble. daveS
KF @ 153: "MW, that famous star of Bethlehem is indeed a challenge to our presumptions." ______________________________________________________________ Indeed. So is Gabriel. Daniel pondered on what certain numbers meant. Daniel also was lamenting the loss of people’s belief in the Commandments and ordinances of God. Daniel was answered in a vision by the man Gabriel (Dan 9). Of course, by some synchronicity or meaningful coincidence Gabriel also approached the Mother of the God-Man to first await her consent before God the Holy Trinity generated himself as the son of man, God in part and God in whole. A planned event; nonsense said Darwin. It was Darwin who rejected Jesus as Son of God. Darwin rejected miracles. Darwin said in "Origin", Yahweh is "erroneous." Darwin created a Godless theory. A design-less theory. By that I mean, any God but the Judaeo-Christian God, and as long as he keeps his nose out of Darwin's business. Darwin gained the world etc… However, it was Dr Rossiter who pointed out in his book and elsewhere that, threaded out through scripture, God massively intervened to change the course of nature to his way: the Ten Plagues, Sinai, the Incarnation, the Virgin Birth, the Resurrection and the Ascension to name but a few. However, it would seem Dr Rossiter, as yet, does not include a divine law with the key to our origins. The birth days of the Cosmos, delivered complete. However, to my understanding, he has pointed out an inconsistency in theistic belief. God regularly guides and intervenes, while in between God creates the brainless natural selection for those theistic Christians who have had to bring on board Darwin to save them from believing the ‘errors’ of divine law. Today is St Stephen's day. The Father formed a holy people for himself, and Jesus said the Father is greater than he, though one God. Stephen would have worshipped Yahweh in the glory of remembering the real purpose of the Sabbath. He followed Jesus into the synagogue. 
Today, church doors are more or less shut to such. There is no room; it is full with evolutionism. The ass and the donkey are the only fit company for those who believe otherwise. In my opinion and belief, God keeps everything in constant tune. It cannot be otherwise. How he decided on which set of numbers to use to fine tune the cosmos I have not got a clue. He has tuned the end game to finish in his time. mw
H'mm: here's a term: Goldilocks zone, life-permitting narrow resonance in the space of cosmologically possible worlds. KF kairosfocus
NB: That captcha then a popup tab tracing to RU, before this worked. Mr Webmaster please check for invasion by de hackerz.ru! >>>>>>>>>>>>> F/N2: Here, from my always linked note, are notes on a typical range of objections to the fine tuning-design inference:
Multiple sub-universes: It is asserted that there is an at least quasi-infinite array of sub-universes that have popped up out of the underlying eternal universe as a whole, with randomly scattered parameters. So, we are in the one that just happened to get lucky: somebody will as a rule win a lottery! We should therefore not be surprised, and there is nothing more to "explain." (Of course, this first resorts to suggesting that there is/must be a vast, unobserved wider universe as a whole. So, right from the start it moves into the province of a worldview claim; it is not at all properly a scientific theory. It therefore cannot fairly exclude other worldview claims from the table of comparative difficulties analysis, nor can it stand apart from the other claims of the underlying worldview it attempts to save: that morally indefensible and factually inadequate and logically self-defeating naturalistic philosophical system that can be best described as evolutionary materialism. Moreover, following Koons, we may paraphrase Leslie tellingly: let us think of a miles-long wall, some of whose sections are actually carpeted with flies; but there is a 100-yard or so stretch with just one fly. Then, suddenly, a bullet hits it. Is it more credible to think that the fly is just monstrously unlucky, or do we celebrate the marksmanship of the hidden shooter? That is, in the end, a locally rare and finetuned possibility is just as wondrous as a globally finetuned one.) But, Science cannot think in terms of the supernatural: That is, "science" is here redefined in terms of so-called methodological naturalism, which in effect implies that a claim can only be deemed scientific if it explains in terms compatible with the materialist's sequence of postulated evolutions: cosmological, chemical, biological, socio-cultural. (That is not only demonstrably historically inaccurate, but it also reduces to: imposition of philosophical materialism by implication. 
In short, it reduces to philosophical materialism disguised as science. Nor is it fair: in fact the distinction the inference to design makes, strictly is to selecting intelligent agency from the three-way split: chance, regularity of nature [aka necessity], agency. If FSCI is a signature of intelligence, then its detection points us to intelligence, and so we should not resort to intellectual gerrymandering to rule out such possibilities.) "Chance" is good enough, we just plain got lucky: In effect, odds mean nothing as SOMEONE has to win a lottery, and there is probably much more universe out there than we happen to see just now. (First, not all lotteries are winnable, and cosmologically evolving a life-habitable universe that then forms life is not set up to deliver a winner, on pain of reducing to yet another design inference -- cf. Leslie's argument above on the point that the cosmos is designed to get to life, even through a random array of sub-cosmi. But, of course, the point of the fly on the wall analogy is that, a locally rare and finetuned possibility is just as wondrous as a globally rare one. More to the point, the argument self-refutes through its underlying inconsistency: routinely, in the face of the logical possibility that all apparent messages we have ever decoded are simply lucky noise, we infer to intent as the explanation of many things, once they exhibit FSCI: in effect, we take the "welcome to Wales sign" made out of arranged stones seriously, and do not dismiss it as a quirk of geology. In short, the selective resort to "chance" to explain some of the most complex and functionally specific entities we observe is driven by a worldview commitment, not a consistent pattern of reasoning. So, the objector first needs to stop being selectively hyperskeptical, and should fairly address the comparative difficulties problems of his/her own worldview.) 
The "probabilities"/"sensitivities" are not credible: usually, this is said by, say, challenging the fineness of the balance, perhaps by asserting that some of the parameters may be linked, or that they are driven by an underlying regularity, one that is not as yet discovered. It may even be asserted that the scope of the universe as a whole is such that the size swamps the probabilities in the "little" sub-cosmos we can see. (The first two of these face the problem that while, say, the Carbon-Oxygen balance is of the order of several percent, the ratio of electrons to protons is unity to within 10^-37, and other parameters that simply do not depend on the accident of how many electrons and protons exist, are even finer. An underlying regularity that drives cosmic values and parameters to such fine balances of course itself raises the issue of design. And, not only is the proposed wider universe concept not empirically controlled, thus strictly a philosophical issue; but also it is manifestly an after-the-fact ad hoc assertion driven by the discovery of the finetuning.) You can't objectively assign "probabilities": First, the argument strictly speaking turns on sensitivities, not probabilities -- we have dozens of parameters, which are locally quite sensitive in aggregate, i.e. slight [or modest in some cases] changes relative to the current values will trigger radical shifts away from the sort of life-habitable cosmos we observe. Further, as Leslie has noted, in some cases the Goldilocks zone values are such as meet converging constraints. That gives rise to the intuitions that we are looking at complex, co-adapted components of a harmonious, functional, information-rich whole. So we see Robin Collins observing, in the just linked: "Suppose we went on a mission to Mars, and found a domed structure in which everything was set up just right for life to exist . . . Would we draw the conclusion that it just happened to form by chance? Certainly not . . . .
The universe is analogous to such a "biosphere," according to recent findings in physics. Almost everything about the basic structure of the universe--for example, the fundamental laws and parameters of physics and the initial distribution of matter and energy--is balanced on a razor's edge for life to occur. As the eminent Princeton physicist Freeman Dyson notes, "There are many . . . lucky accidents in physics. Without such accidents, water could not exist as liquid, chains of carbon atoms could not form complex organic molecules, and hydrogen atoms could not form breakable bridges between molecules" (p. 251)--in short, life as we know it would be impossible." So, independent of whether or not we accept the probability estimates that are often made, the fine-tuning argument in the main has telling force. Can one assign reasonable probabilities? Yes. Where the value of a variable is not otherwise constrained across a relevant range, one may use the Laplace criterion of indifference to assign probabilities. In effect, since a die may take any one of six values, in absence of other constraints, the credible probability of each outcome is 1/6. Similarly, where we have no reason to assume otherwise, the fact that relevant cosmological parameters may for all we know vary across a given range may be converted into a reasonable (though of course provisional -- as with many things in science!) probability estimate. So, for instance, for the Cosmological Constant [considered to be a metric of the energy density of empty space, which triggers corresponding rates of expansion of space itself], there are good physical science reasons [i.e. inter alia Einsteinian General Relativity as applied to cosmology] to estimate that the credible possible range is 10^53 times the range that is life-accommodating, and there is no known constraint otherwise on the value.
Thus, it is reasonable to apply indifference to the provisionally known possible range to infer a probability of being in the Goldilocks zone of 1 in 10^53. Relative to basic principles of probability reasoning and to the general provisionality of science, it is therefore reasonable to infer that this is an identifiable, reasonably definable value. (Cf Collins' discussion, for more details.) There are/may be underlying forcing laws or circumstances: It is possible that as yet undiscovered physics may lead us to see that the values in question are more or less as they "have" to be. (However, such a "theory of everything" would itself imply exquisitely balanced functionally specific complexity in the cosmos as a whole, i.e. it is itself a prospect that would lead straight to the issue of design as its explanation.) What about radically different forms of life: We do not know for certain that life must be based on carbon chemistry, so perhaps there is some strange configuration of matter and/or energy (or perhaps, borrowing from the Avida experiments, information) that can be called "life" without being based on the chemistry of carbon and related atoms. (Indeed, theists would immediately agree: spirit is a way that life can exist without being tied down to atoms and molecules! They would also immediately agree that information and -- more fundamentally -- mind are key components of intelligent life. So, this point may lead in surprising directions. But more on the direct point, the proposal is again highly speculative and ad hoc, once it was seen that the cosmos seems designed for life as we know it.) Naturalistic Anthropic Principles: Perhaps, the most important version, the Weak form [WAP] asserts that intelligent life can only exist in a cosmos that has properties permitting their origin and existence. Then, it is inferred, if we are here, we should not be surprised that the parameters are so tight: if they were not met, we would not be here to wonder about it. 
(Now, of course, if the universe did not permit life like ours, we would not be here to see that we do not exist. But that still leaves open the implications of the point that the cosmos in which we do exist is exquisitely finely tuned for that existence, at least on a local basis. That is, we are simply back to the fly on the wall gets swatted by a bullet example; it is still wondrous and raises the question of marksmanship and intent as the best explanation.)
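The indifference reasoning above reduces to simple arithmetic. A minimal sketch of both cases mentioned — the die and the cosmological constant — where the 10^53 ratio is the figure cited in the text and the uniform prior is exactly the assumption under discussion:

```python
from fractions import Fraction

# Laplace's criterion of indifference: absent any constraint favoring
# one outcome over another, weight every possible outcome equally.

# Die example from the text: six faces, no other constraints.
p_die_face = Fraction(1, 6)

# Cosmological-constant example: the text cites a credible possible
# range ~10^53 times wider than the life-permitting band. Under a
# uniform prior, the chance of landing in the Goldilocks zone is the
# ratio of the life-permitting width to the full range.
RANGE_RATIO = 10 ** 53          # possible range / life-permitting range
p_goldilocks = Fraction(1, RANGE_RATIO)

print(p_die_face)               # 1/6
print(float(p_goldilocks))
```

The estimate stands or falls with the uniform-prior assumption; that is precisely the point the surrounding objections and replies contest.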
KF kairosfocus
MW, that famous star of Bethlehem is indeed a challenge to our presumptions. As are almost all things connected to that most famous of all births. However, just for argument, ask yourself how is it that when suggestions of an eternal world and the like are put on the table, the physics keeps on pointing to patterns that indicate design? KF kairosfocus
Folks, It may be helpful to refocus i/l/o Bradley's remarks (HT W/B machine) on in effect the bill of requisites for a cosmos that supports life such as we enjoy:
* Order to provide the stable environment that is conducive to the development of life, but with just enough chaotic behavior to provide a driving force for change.
* Sufficient chemical stability and elemental diversity to build the complex molecules necessary for essential life functions: processing energy, storing information, and replicating. A universe of just hydrogen and helium will not "work."
* Predictability in chemical reactions, allowing compounds to form from the various elements.
* A "universal connector," an element that is essential for the molecules of life. It must have the chemical property that permits it to react readily with almost all other elements, forming bonds that are stable, but not too stable, so disassembly is also possible. Carbon is the only element in our periodic chart that satisfies this requirement. [--> No, Si does not quite make the grade. And, do not overlook why Hoyle was so impressed with the enzymes; the range of choice created a search space that challenges available resources and time.]
* A "universal solvent" in which the chemistry of life can unfold. Since chemical reactions are too slow in the solid state, and complex life would not likely be sustained as a gas, there is a need for a liquid element or compound that readily dissolves both the reactants and the reaction products essential to living systems: namely, a liquid with the properties of water. [Added note: Water requires both hydrogen and oxygen.]
* A stable source of energy to sustain living systems, in which there must be photons from the sun [--> or a like star, stars being the only energy source in the cosmos of requisite duration and character] with sufficient energy to drive organic chemical reactions, but not so energetic as to destroy organic molecules (as in the case of highly energetic ultraviolet radiation). [--> this also implies the right kind of solar system, with terrestrial planets in the habitable zone, with protective giants and a good moon etc.; cf. Jay Richards, here, again.]
* A means of transporting the energy from the source (like our sun) to the place where chemical reactions occur in the solvent (like water on Earth) must be available. In the process, there must be minimal losses in transmission if the energy is to be utilized efficiently.
Robert C Koons argues, in skeletal form:
1] The physical constants of the cosmos take anthropic values [that is, those conducive to C-based, intelligent life].

2] This coincidence must have a causal explanation (we set aside for the moment the possibility of a chance explanation through the many-worlds hypothesis [cf. on this, the points raised by Leslie as cited above; noting too that such a wider "multiverse" is speculative rather than observationally anchored]).
________________________________________________
3] Therefore, the constants take the values that they do because these values are anthropic (i.e., because they cause the conditions needed for life).

4] Therefore, the purpose of the values of these constants is to permit the development of life (using the aetiological definition of purpose).

5] Therefore, the values of these constants are the purposive effects of an intelligent agent (using the minimalist conception of agency).

6] Therefore, the cosmos has been created.
Again, it is simply not plausible that sensitivity analysis -- note, not precise probability estimates, it having been shown that biased distributions only mean that bigger ensembles for longer times will be needed than in the flat random case* -- is conveniently inapplicable to frameworks for cosmological systems, nor that search challenge is not a relevant issue. _______________
*F/N: With such ensembles, given enough cases and time, ALL cells in the config space will be explored eventually [i.e. flat random is CONSERVATIVE in estimating search challenge] . . . the opposite of what is suggested by arguments that in effect imply that there is a golden search. Golden search in the context of large config spaces is self-defeating. For, a search is a subset of a space, and so the set of searches is in effect the power set, an exponentially harder search of order 2^N for a space of N cells. If direct search is challenging, search for a golden search is much more so. Likewise, given the cluster of constants, parameters, quantities etc., the effect of suggesting they are locked by a super-force is simply to promote the fine tuning one level up, as we need to ask pointed questions about how we get such a conveniently specific law that puts us down in such a convenient neighbourhood. And BTW, what empirical and logical evidence is there for such a law in action? [And no, I will not be intimidated by arguments that boil down to "don't you dare go there." There being no evidence of metaphysical necessity here, something that aptly configures has to explain such things.]
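The 2^N scaling in the footnote can be made concrete with a small, purely illustrative calculation (the value of N below is arbitrary, chosen only to show how fast "search for a search" outruns direct search):

```python
from math import log10

# Illustrative only: model a "search" as a subset of a configuration
# space. For a space of N cells, the set of possible searches then
# corresponds to the power set, with 2^N members.
N = 10**6  # an arbitrary, and by these standards tiny, space of cells

# Work in base-10 logarithms, since 2^N itself is astronomically large.
log10_direct = log10(N)        # scale of the space itself: 10^6 cells
log10_searches = N * log10(2)  # log10 of 2^N, the number of searches

print(f"space size:        10^{log10_direct:.0f} cells")
print(f"possible searches: ~10^{log10_searches:.0f}")
```

Even for a million-cell space, the number of candidate searches has over three hundred thousand digits, which is the footnote's point that a "search on golden search" is exponentially harder than the direct search.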
Likewise, it is utterly implausible that the range of relevant parameters, constants, quantities, boundary conditions, circumstances, decision nodes in unfolding etc. will all be matters of metaphysical necessity: frame-working to any possible world and inevitably present in its foundations as a result. That is, they do not partake of the character of two-ness: once distinct identity holds, so that A and also NOT-A exist even as concepts, two exists and must exist in any possible world. Likewise, once a circle exists as a concept, by mathematical extension from numbers to reals to the complex plane and/or Cartesian plane [go along the axis Ox as reals, then use i*x as the orthogonal axis, then i*i*x as negatives, etc.; then simply go to (x,y)], functions that specify circles of form x^2 + y^2 = r^2 exist; then circumference and diameter exist, and their ratio as lengths can be recognised as pi, etc. (And yes, I am taking this numbers-and-algebra back door to classical Geometry as a realm of "logic of structure and quantity" necessity, by way of utter contrast with the empirically anchored phenomena of physics.)

We have come to this, God help us. When it comes to multiverse proposals and frameworks, the matter is this: sensitivity analysis suggests strongly that our observed cosmos sits on a narrow, isolated resonance in the config space of mathematical possibilities for world systems. This brings Leslie's lone fly on the wall and expert firing squad "fails" arguments to the table, to ground why we should be surprised to see such, and why it strongly points to fine tuning. And no, it is suspiciously special pleading to argue that in effect standard mathematical techniques -- here, sensitivity analysis and the ensembles approach of statistical mechanics as pioneered by Gibbs -- should not be applied to this particular system. Is a quasi-infinite multiverse the likely explanation?
We should first appreciate that a Brane of 10^500 sub-cosmi is a SMALL number relative to just the search space to get to a first life form with a genome of 100,000 base pairs . . . in the near-neighbourhood possible worlds cases. As 4^100,000 ~ 9.98*10^60,205, in short, search challenge is a real issue. The problem with multiverses, first, is that they are an observational challenge at the very least; this implies it is easy to wander over into philosophy, and to make the mistake of thinking the lab coat prevails by dint of the prestige of science. And it bears noting that it is an established challenge that there has been massive evolutionary materialist ideology imposition and indoctrination, in a context where that system is in fact self-falsifying, as has been shown umpteen times in and around UD now. Further to this, the pattern of dominant clusters of microstates from statistical mechanics prevails. We have a very narrow resonance to deal with. The overwhelming outcome -- this is what stabilises the second law of thermodynamics -- is that predominant clusters dominate observation and are overwhelmingly likely to be seen. Narrow, deeply isolated resonances are just too rare to be found readily, by contrast with the presumed overwhelming clusters. One argument on this line is that a Boltzmann brain world is far more likely as a mere fluctuation of underlying quasi-space-time than what we see. We face an inference to the best empirically and analytically warranted explanation, and the best explanation for the cluster of tight, converging specifications we see met is a plan, backed up by a force capable of being harnessed to effect same. In short, intelligently directed configuration, at the scale of creating a cosmos. Intelligent design. That is what is on the table, and it has been firmly there since the Be-O-C resonances and the credibility that the world we observe has a finitely remote beginning were recognised.
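The arithmetic behind the 10^500 vs 4^100,000 comparison in this comment can be checked directly (the base-pair count and the 10^500 sub-cosmi figure are taken from the comment itself, not independently sourced):

```python
from math import log10

BASES = 4             # nucleotide alphabet: A, C, G, T
GENOME_LEN = 100_000  # base pairs, per the comment

# log10 of the number of distinct sequences, i.e. log10(4^100000)
log10_space = GENOME_LEN * log10(BASES)
print(f"4^100000 is about 10^{log10_space:.0f}")  # matches ~9.98*10^60205

# Exponent by which this dwarfs the proposed 10^500 sub-cosmi
print(f"shortfall exponent: {log10_space - 500:.0f}")
```

So on these numbers the sequence space exceeds the proposed ensemble by a factor of roughly 10^59,706, which is the sense in which 10^500 is "SMALL" here.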
Finally, it is no accident that this debate has developed in the context of a year in which an issue on the table has been the proposal of an actually completed infinite causal succession to get to the present, presented as: at any time we are already here, so the infinite succession is already there and can be presumed until it is overthrown. The answer to that has been: no, no such presumption can be had until you show how an infinite succession of steps can bridge an endless transfinite span. No sound answer to this has been forthcoming. We can safely accept that the observed world and its onward antecedents of quasi-physical character have had a finite extension in the past and thus a beginning, requiring a begin-ner. From this, we then see that sensitivity analysis and search challenge point to the fine tuned nature of our observed cosmos, and that this is best explained on intelligently directed configuration. One of the best pointers to just how robust this explanation is, is that the objectors have had to resort to such extreme and implausible arguments (and too often ideological lockouts) to try to blunt its point. KF kairosfocus
"The heavens declare the glory of God," scripture says. Fine tuning took place in the heavens when God created a star to lead wise people. All sorts of explanations abound: a conjunction of planets, a supernova. It appeared on a certain day and vanished on a certain day. It did what stars cannot do: stop over some specified place. A meaningful coincidence, a divine omen. This was no ordinary star. That it was a light carried by an angel is the best mystical explanation that I have read. If it was a real star, it still would make no difference. It was light created at will. It did not take billions of years to evolve. It was precise: at the right time and in the right place. It was powerful. At the creation, on day one, God generated an unknown light, not from any star. That light was generated from Christ. A few days later, the sun was created, so it may be believed, but in line with divine law. mw
DS, do you see the issue that it is the force of the logic of structure and quantity that is flagging the fine tuning? To dismiss that strongly suggests that you imply that the Math works for physics at our operating point as a cosmos, but not significantly away from it. Which would be a form of fine tuning. Next, there is a considerable list of factors, quantities, laws etc that per the math are just so to set up our observed cosmos. It is maximally implausible that these are fixed per the metaphysics of being; which is what would be needed for them to be fixed in any possible world. (BTW, Barnes has a fairly technical discussion on related matters.) If that span of things -- not just what you want to talk about -- is "fixed," something else is doing it. Something that can be justly characterised as a super force or super law. And, something that sets things in cascade from it to the cumulative life resonance point our cosmos sits at [per the math and sensitivity analysis] -- recall it is not plausible these are like pi or 2, necessarily so in themselves in any world -- is going to itself be very specific to a configuration, not set here by some metaphysical necessity. That is, it is fine tuned as a set-point mechanism. One that specifies a whole panoply of values, like . . . a plan put into effect through a mechanism. As noted and explained, it itself would be fine tuned. Which is not so strange to see with a plan. Plans define targets as ends and specify often complex, organised means in order to hit such targets. We are evidently seeing co-ordinated, organised "bits and pieces" that work together to give us the basis for C-chemistry, aqueous medium, cell based life. That practically shouts, DESIGN. KF kairosfocus
Hello daveS # 147 and 148. You say, "PS: I don’t believe mathematics itself is fine-tuned, certainly." "When people make claims about any kind of unobservable entity (higher order “superforces” and the like), I’m skeptical." ___________________________________________________________________________ You are not the only one on the latter. No doubt a 'fine tune' was played by a superforce trumpet blast over Mount Sinai, so written records testify. Indeed, people were afraid. Darwin simply reduced such superforces to primitive thinking (a type of argumentum ad novitatem). However, first, how did numbers arise? Did numbers evolve? To me they must have always existed. Hence, in place before the beginning. Or did numbers evolve and become fine tuned to the certainty they are today, as the Big Bang theory progressed? mw
PS^2: I'll have to wait on the pdf until later. Regarding your PS(1): Well, since I am talking about physical constants, I guess there is a presumption that the possible worlds under consideration have some physical or material component. But I deny this has anything to do with the ideology of "evolutionary materialism", lab coats, &etc. When people make claims about other universes in the "multiverse", I'm skeptical. When people make claims about any kind of unobservable entity (higher order "superforces" and the like), I'm skeptical. daveS
PS: I don't believe mathematics itself is fine-tuned, certainly. daveS
KF, I was referring to the "second order" sensitivity analysis that you suggested somewhere above. Because my connection is so bad, I'll have to respond piece-by-piece. Starting from your P^4S: No, I don't believe the implications of mathematics are to be disbelieved simply because physical constants, laws, etc., have been changed. To be clear, I accept that virtually any change in the known physical constants will make life (as we know it, anyway) impossible. My question is how this tells us anything about the contingent vs necessary question. If I can, I will attempt to address the rest of your post. daveS
DS, the sensitivity analysis has been done from 1953 on -- Hoyle -- and its verdict is quite strong: fine tuning. The real issue is why? KF

PS: You offer reservations without substantiation of an extremely implausible claim, locking out other factors such as simple quantities etc. that also play a big part in the fine tuning of the cosmos picture. It seems to me there is likely an underlying context of thought that any possible world must be at least quasi-physical-material, as part of a framing of evolutionary materialist metaphysics. Showing such a claim is a challenge, at minimum. And certainly it should not be given an implicit metaphysical default by virtue of putting on a lab coat.

PPS: Barnes responding to Stenger, p. 7 on, here, is illuminating: https://arxiv.org/pdf/1112.4647.pdf

PPPS: Jay Richards' list of 22 points with brief explanations here will also help: http://www.discovery.org/f/11011 So will Collins' discussion here: http://www.discovery.org/a/91

P^4S: Do you intend it to be understood that while Mathematics is applicable to our particular circumstances, its implications are to be disbelieved if we slide the dials over a bit on the parameters, constants, frameworks, laws etc.? Is MATH -- the logic of structure and quantity -- peculiarly fine tuned on your view? kairosfocus
MW, I hear your point. KF kairosfocus
KF, I unexpectedly have (very poor) wifi access, so will briefly post. Referring to the last part of your post, I don't think there's anything wrong with investigating this problem using sensitivity analysis, so please do so if you feel moved to. Let us know what you find. I've already stated my reservations regarding your "levels" argument, and don't have much else to add, so unless pressing new questions arise, I will leave it at that. daveS
DS, it is obvious you have no reason for the assertion that the constants of physics as we have found them are "plausibly" locked, and it looks a lot like you imagine there is a mirror-image situation. Actually, not. First, what would have to be locked is much more than things like the permittivity of free space or the universal gravitation constant or Planck's constant, or the Boltzmann constant or the speed of light in vacuo etc.; you cannot just rule datum lines for argument. Second, the very fact of the "unreasonable effectiveness" of Mathematics in Physics should give pause before blanking out a result that comes from a standard Math procedure, sensitivity analysis: our observed cosmos stands at a narrow resonance that is life permitting. Third, these constants, by and large, are not forced by the logic of being or the like, though of course something like wave equation analysis ties electrical and magnetic properties of space to the wave speed in the medium, the speed of light. A great many constants do not stand in lock like the three just looked at; and for that one, there is no reason why we cannot ask, say, what would happen if space could be manipulated along lines of sensitivity analysis. Where of course, many other values tied to such fine tuning are not constants but simple quantities, etc. So, it is utterly reasonable to explore the mathematical possibilities and to examine the result; no inherent contradiction arises, unlike trying to pretend that 2 does not hold a fixed value, or pi etc. KF kairosfocus
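As a purely generic illustration of the one-at-a-time sensitivity analysis being discussed in this exchange (the function f and the nominal parameter values below are hypothetical toys, not a cosmological model or anyone's actual calculation):

```python
# Toy one-at-a-time sensitivity sketch: nudge each parameter by +1%
# and record the relative change in the output. Both f and the nominal
# values are arbitrary stand-ins chosen only to show the procedure.
def f(a, b, c):
    return a * b / c

nominal = {"a": 1.0, "b": 2.0, "c": 3.0}
base = f(**nominal)

for name, value in nominal.items():
    perturbed = dict(nominal)
    perturbed[name] = value * 1.01  # +1% nudge, one parameter at a time
    rel_change = (f(**perturbed) - base) / base
    print(f"{name}: {rel_change:+.4f}")
```

The same loop, pointed at a model whose output is "life-permitting or not," is the shape of the fine-tuning sensitivity arguments referenced in the thread: sweep each dial a little and see how far the output moves.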
KF, First, this will be my last comment for a while, since I'll be away for a few days.
we are dealing with the physics of building universes here, which turns out to be strongly shaped by the logic of structure and quantity, aka mathematics. But also it is required that this reflects the range of the facts on the ground — actually, in the sky. We do not get to pick and choose. There are constraints that will be so in every possible world, e.g. two-ness rooted in the existence of distinction A vs ~A, the number pi, the number e etc. Those do not get you anywhere near building a world in which you have intelligent observers based on C-Chemistry, aqueous medium, cell based, terrestrial planet, Galactic habitable zone life.
Sure, but I'm definitely not up for a comprehensive discussion on building universes. Rather, I'm only going to look at issues around the fundamental physical constants.
In the context of having explained and pointed out the sort of things that are quite plausibly contingent, I make fair comment: I find dismissive rhetoric on your part on terms like “facile” just now, out of order. I think you have some walking back to do, sir.
And I agree it is plausible they are contingent. Given what is known, it's plausible (in my view) that they are not contingent. Who knows anything about how (or even whether) the constants are "chosen"? And yes, "facile" does have insulting connotations that I did not intend.
The issue is, there is a wide range of things that would have to be metaphysically locked by the sort of structure of reality constraints that make two-ness or pi necessary beings. You have offered not a whit of support for any such, you are making an implied utterly implausible claim and are wishing to impose it as default.
I'm not trying to impose anything by "default". I'm just expressing what I believe to be an appropriate level of skepticism regarding your "levels" argument. We are, after all, talking about things which are apparently untestable, so I don't see how one could come to any firm conclusion one way or another. You are/should be just as skeptical if I claimed that the constants were indeed necessary in some sense. daveS
Further: for those who believe in the Judaeo-Christian God, crediting him as the brains behind heaven and earth and all that is, and as supreme judge and supreme witness: if he cannot write an intelligent, sound law for all time, he cannot expect to last for all time, let alone judge anybody. mw
kf @ 129: "MW, the estimated scale and age of the observed cosmos trace to empirical evidence. I simply spoke in that context. Come up with better evidence and numbers and I would go with them." ___________________________________________________________________ Hello kf. Please note, I did commend you on your fine piece of work. Nevertheless, in my opinion, there is a better explanation for figures indicating the cosmos is some 90 billion light years in size, from which many believe God created over 13.8 billion years. An explanation is that God set the key essential evidence in stone, as testified by witnesses and the witness statement of God at Sinai. However, from any plain reading of divine law from the Ten Commandments, devoid of any elasticating of scripture -- being that God changes not (Mal 3:6) and Jesus is the same for ever (Heb 13:8), which in terms of the belief in the Holy Trinity means Jesus spoke at Sinai as One God -- he said: "The Lord said to Moses: You yourself are to speak to the Israelites: ‘You shall keep my sabbaths, for this is a sign between me and you throughout your generations, given in order that you may know that I, the Lord, sanctify you. You shall keep the sabbath, because it is holy for you; everyone who profanes it shall be put to death; whoever does any work on it shall be cut off from among the people. For six days shall work be done, but the seventh day is a sabbath of solemn rest, holy to the Lord; whoever does any work on the sabbath day shall be put to death. Therefore the Israelites shall keep the sabbath, observing the sabbath throughout their generations, as a perpetual covenant. 
It is a sign for ever between me and the people of Israel that in six days the Lord made heaven and earth, and on the seventh day he rested, and was refreshed.’ When God finished speaking with Moses on Mount Sinai, he gave him the two tablets of the covenant, tablets of stone, written with the finger of God.” (Exodus 31:12-18) At one time, the Holy Trinity demanded the death penalty for disbelieving and breaking his law. It follows that if God actually created in 13.8 billion years, while condemning a man to death for disobedience to his clear law, we stretch God out into being an unjust murderer! Do we not make the child in a crib a murderer even before he can walk? Jesus said, “before Abraham was, I am” (Jn 8:58). Surely he could remember how long he took to create, when he could remember such! Of course, we may bring in other scripture to make people doubt the very accuracy of the word of God. Satan did that in Genesis. Indeed, Satan tried using scripture against the very word of God himself. In the wilderness, Jesus/God gave a swift rebuke: “But he answered, ‘It is written, “One does not live by bread alone, but by every word that comes from the mouth of God.”’ (Matt 4:4) Jesus asks Yahweh to sanctify us in His truth (Jn 17:17); that is, God sanctifies (makes us holy) through belief in the law of God, including that he created in six days, and through keeping to the teaching of Jesus (Rev 12:17). Today, for many, that has become the ‘worst’ of God’s teachings, when the power of faith could move a cosmos if that was the will of God. Today we do not much appreciate what speaking to God face to face and in plain language means (Num 12:1-16). Has anyone had that continued privilege over 40 years? As for Jesus, the God of Sinai said, “I declare what I have seen in the Father’s presence; as for you, you should do what you have heard from the Father” (Jn. 8:38). At the time of Sinai, all Israel heard the Father! Jesus said Moses would be their judge (Jn 5:45). 
Are we any different as spiritual heirs to Abraham? In my opinion, according to divine law, the numbers 90 and 13.8 must have come about in six days. No one can prove otherwise, and certainly not by using scripture against divine law, the only scripture ever written by the Holy Trinity—the Ten Commandments. If one law is deemed flawed or inaccurate, do we not contaminate the validity of the other nine? How accurate is the truth of God and all scripture when wide of the mark of theory? How many faiths do we need? mw
DS, we are dealing with the physics of building universes here, which turns out to be strongly shaped by the logic of structure and quantity, aka mathematics. But also it is required that this reflects the range of the facts on the ground -- actually, in the sky. We do not get to pick and choose. There are constraints that will be so in every possible world, e.g. two-ness rooted in the existence of distinction A vs ~A, the number pi, the number e etc. Those do not get you anywhere near building a world in which you have intelligent observers based on C-Chemistry, aqueous medium, cell based, terrestrial planet, Galactic habitable zone life.

In the context of having explained and pointed out the sort of things that are quite plausibly contingent, I make fair comment: I find dismissive rhetoric on your part on terms like "facile" just now, out of order. I think you have some walking back to do, sir.

The issue is, there is a wide range of things that would have to be metaphysically locked by the sort of structure of reality constraints that make two-ness or pi necessary beings. You have offered not a whit of support for any such; you are making an implied, utterly implausible claim and are wishing to impose it as default. Those are not responsible moves, and you are backing them by playing the burden-of-proof shift game so beloved of evolutionary materialism advocates. I am not buying such.

The evidence is, a lot of constraints have to be in a zone to get to a cosmos like ours with life like we see, and this zone is set up as a narrow and deeply isolated resonance in the space of mathematically grounded possibilities. Such has to be locked over a relevant zone of 90 Bn LY and 13.8 BY, on conventional estimates. That is, we have stable laws in a stable cosmos. But we have no reason whatsoever to imagine that these things are locked in any possible world per metaphysical necessity of being. 
And we do have a lot of sensitivity analysis that points in very different directions. The first objections were along the lines of "we don't think you get to probability." That has been answered by showing the sensitivity/search framework. Now there is a gambit of "maybe it is all locked up in any possible world" -- in short, ignore the sensitivity analysis and get back to our preferred game. I am not going there. KF

PS: Collins here may help with background (without necessarily endorsing everything said): http://www.discovery.org/a/91 Likewise Barnes here: https://arxiv.org/pdf/1112.4647.pdf

PPS: O/T, I notice some pretty aggressive pop-ups here at UD that are breaking through several layers of antivirus and popup blockers. Been so for a few weeks now. kairosfocus
Of supplemental note to this:
Job 26:10 He marks out the horizon on the face of the waters for a boundary between light and darkness. Proverbs 8:26-27 While as yet He had not made the earth or the fields, or the primeval dust of the world. When He prepared the heavens, I was there, when He drew a circle on the face of the deep, Planck satellite unveils the Universe — now and then (w/ Video showing the mapping of the ‘sphere’ of the Cosmic Microwave Background Radiation with the Planck satellite) – 2010 http://phys.org/news197534140.html#nRlv
It is interesting to note that the Bible predicted that 'He drew a circle on the face of the deep' thousands of years before the Cosmic Microwave Background Radiation (CMBR) was discovered by modern science. I would call that a rather stunning confirmation in science of a Theistic prediction, one that ranks right up there with the Theistic prediction that the universe had a beginning. Moreover, it is interesting to note that Atheistic Materialism, through its conjecture of inflation, is driven into catastrophic epistemological failure in trying to account for the 'homogeneity' of the universe in general and/or for the 'sphere' of the CMBR in particular.
Space is all the same temperature. Coincidence? Distant patches of the universe should never have come into contact. So how come they’re all just as hot as each other? - 26 October 2016 Excerpt: THE temperature of the cosmic microwave background – the radiation bathing all of space – is remarkably uniform. It varies by less than 0.001 degrees from a chilly 2.725 kelvin. But while that might seem natural enough, this consistency is a real puzzle. For two widely separated areas of the cosmos to reach thermal equilibrium, heat needs enough time to travel from one to the other. Even if this happens at the speed of light, the universe is just too young for this to have happened. Cosmologists try to explain this uniformity using the hypothesis known as inflation. It replaces the simple idea of a big bang with one in which there was also a moment of exponential expansion. This sudden, faster-than-light increase in the size of the universe allows it to have started off smaller than an atom, when it would have had plenty of time to equalise its temperature. “On the face of it, inflation is a totally bonkers idea – it replaces a coincidence with a completely nonsensical vision of what the early universe was like,” says Andrew Pontzen at University College London. https://www.newscientist.com/article/mg23230970-900-cosmic-coincidences-everythings-at-the-same-temperature/ Why I Still Doubt Inflation, in Spite of Gravitational Wave Findings By John Horgan - March 17, 2014 Excerpt: Indeed, inflation, like string theory, has always suffered from what is sometimes called the “Alice’s Restaurant Problem.” Like the diner eulogized in the iconic Arlo Guthrie song, inflation comes in so many different versions that it can give you “anything you want.” In other words, it cannot be falsified, and so–like psychoanalysis, Marxism and other overly flexible hypotheses (mmm Darwinism?)–it is not really a scientific theory. 
http://blogs.scientificamerican.com/cross-check/2014/03/17/why-i-still-doubt-inflation-in-spite-of-gravity-wave-findings/ Cosmic inflation is dead, long live cosmic inflation - 25 September 2014 Excerpt: (Inflation) theory, the most widely held of cosmological ideas about the growth of our universe after the big bang, explains a number of mysteries, including why the universe is surprisingly flat and so smoothly distributed, or homogeneous (i.e. why the universe is 'round').,,, Paul Steinhardt of Princeton University, who helped develop inflationary theory but is now scathing of it, says this is potentially a blow for the theory, but that it pales in significance with inflation's other problems. Meet the multiverse Steinhardt says the idea that inflationary theory produces any observable predictions at all – even those potentially tested by BICEP2 – is based on a simplification of the theory that simply does not hold true. "The deeper problem is that once inflation starts, it doesn't end the way these simplistic calculations suggest," he says. "Instead, due to quantum physics it leads to a multiverse where the universe breaks up into an infinite number of patches. The patches explore all conceivable properties as you go from patch to patch. So that means it doesn't make any sense to say what inflation predicts, except to say it predicts everything. If it's physically possible, then it happens in the (inflationary) multiverse someplace Steinhardt says the point of inflation was to explain a remarkably simple universe. "So the last thing in the world you should be doing is introducing a multiverse of possibilities to explain such a simple thing," he says. "I think it's telling us in the clearest possible terms that we should be able to understand this and when we understand it it's going to come in a model that is extremely simple and compelling. And we thought inflation was it – but it isn't." 
http://www.newscientist.com/article/dn26272-cosmic-inflation-is-dead-long-live-cosmic-inflation.html?page=1#.VCajrGl0y00 WHAT SCIENTIFIC IDEA IS READY FOR RETIREMENT? Infinity - Max Tegmark - January 2014 and Feb. 2015 Excerpt: Physics is all about predicting the future from the past, but inflation seems to sabotage this: when we try to predict the probability that something particular will happen, inflation always gives the same useless answer: infinity divided by infinity. The problem is that whatever experiment you make, inflation predicts that there will be infinitely many copies of you far away in our infinite space, obtaining each physically possible outcome, and despite years of tooth-grinding in the cosmology community, no consensus has emerged on how to extract sensible answers from these infinities. So strictly speaking, we physicists are no longer able to predict anything at all! This means that today’s best theories similarly need a major shakeup, by retiring an incorrect assumption. Which one? Here’s my prime suspect: infinity. MAX TEGMARK – Physicist (actually the ‘theory’ that needs to be retired from science is the philosophy of materialism in general) http://www.theguardian.com/science/2014/jan/12/what-scientific-idea-is-ready-for-retirement-edge-org A Matter of Considerable Gravity: On the Purported Detection of Gravitational Waves and Cosmic Inflation - Bruce Gordon - April 4, 2014 Excerpt: Thirdly, at least two paradoxes result from the inflationary multiverse proposal that suggest our place in such a multiverse must be very special: the "Boltzmann Brain Paradox" and the "Youngness Paradox." 
In brief, if the inflationary mechanism is autonomously operative in a way that generates a multiverse, then with probability indistinguishable from one (i.e., virtual necessity) the typical observer in such a multiverse is an evanescent thermal fluctuation with memories of a past that never existed (a Boltzmann brain) rather than an observer of the sort we take ourselves to be. Alternatively, by a second measure, post-inflationary universes should overwhelmingly have just been formed, which means that our existence in an old universe like our own has a probability that is effectively zero (i.e., it's nigh impossible). So if our universe existed as part of such a multiverse, it would not be at all typical, but rather infinitely improbable (fine-tuned) with respect to its age and compatibility with stable life-forms. http://www.evolutionnews.org/2014/04/a_matter_of_con084001.html
My main point in bringing this up is that this failure of Atheistic materialism to account for the 'fine-tuning' of the CMBR is post Big Bang. In other words, this fine-tuning that must be accounted for is after the creation of space-time matter-energy itself. Thus the 'surprise' we should have at the 'exorbitantly improbable' fine-tuning of the universe is just as surprising for us after the Big Bang, if not more so, as it is for any fine-tuning of the universe that must be accounted for prior to the Big Bang. Verse:
Hebrews 1:3 The Son is the radiance of God's glory and the exact representation of his being, sustaining all things by his powerful word. After he had provided purification for sins, he sat down at the right hand of the Majesty in heaven.
bornagain77
KF,
DS, again, look at the matter, we are talking about not just constants (which simply do not partake of the sort of necessity that pi etc do) but quantities, circumstances and the like.
Well, I know you're talking about a variety of things, but I'm talking almost exclusively about constants, particularly the point in the last paragraph of #132. How do we know they don't "partake of necessity"?
The suggestion maybe it is a metaphysical necessity that cannot be averted in any world — no more than a world can exist without two-ness in it — is not only utterly implausible but does not evade the point of fine tuning, were it to actually hold.
I don't know why it's utterly implausible, but once more, I'm not taking on fine tuning itself. Just this maneuver of "going up one level", which seems a little too facile. daveS
of supplemental note to post 133: The discovery of a 'Dark Age' for the early universe uncannily matches up with the Bible passage in Job 38:4-11.
Job 38:4-11 “Where were you when I laid the foundations of the earth? Tell me if you have understanding. Who determined its measurements? Surely you know! Or who stretched a line upon it? To what were its foundations fastened? Or who laid its cornerstone, When the morning stars sang together, and all the sons of God shouted for joy? Or who shut in the sea with doors, when it burst forth and issued from the womb; When I made the clouds its garment, and thick darkness its swaddling band; When I fixed my limit for it, and set bars and doors; When I said, ‘This far you may come but no farther, and here your proud waves must stop!" History of the Universe - Timeline Graph Image http://www.der-kosmos.de/pics/CMB_Timeline300_gr.jpg Job 26:10 He marks out the horizon on the face of the waters for a boundary between light and darkness. Proverbs 8:26-27 While as yet He had not made the earth or the fields, or the primeval dust of the world. When He prepared the heavens, I was there, when He drew a circle on the face of the deep, Planck satellite unveils the Universe -- now and then (w/ Video showing the mapping of the 'sphere' of the Cosmic Microwave Background Radiation with the Planck satellite) - 2010 http://phys.org/news197534140.html#nRlv
bornagain77
DS, again, look at the matter, we are talking about not just constants (which simply do not partake of the sort of necessity that pi etc do) but quantities, circumstances and the like. And, even more importantly, I am not arguing that such necessity must be or is the case. I am simply pointing out that if such a wide variety of things, not critically dependent on one another, must all be in a resonance zone for our kind of cell based life rooted in C-chemistry in aqueous medium to exist, then the locking is not in the phenomena. It lies elsewhere, in some force that locks. You can choose to deny or be super-skeptical about it, but that is not going to change the force of the point, which should be readily evident to anyone who looks into it. And, a force -- in the broad sense -- that locks up so many disparate things is going to have to be pretty carefully set up itself, which means fine tuning has been displaced up one level. Worse, if there is actually a law of metaphysical necessity that locks up all sorts of things to life permitting zones as we can list on and on, then that necessity is highly "suspicious," too. The suggestion maybe it is a metaphysical necessity that cannot be averted in any world -- no more than a world can exist without two-ness in it -- is not only utterly implausible but does not evade the point of fine tuning, were it to actually hold. So, I turn about the challenge: on what grounds do you wish to suggest that such an idea is anything but special pleading of a most implausible nature, and how does such evade the point that something that sets up reality to be necessarily much like the world we observe would not be fine tuned? Worse, how does it then address the force of the mathematics based on the observations of our cosmos, that allows sensitivity analysis and from that leads to the conclusion that we are in a narrow resonance? KF kairosfocus
JAD, you may appreciate this if you don't already have it:
How The Stars Were Born - Michael D. Lemonick) http://www.time.com/time/magazine/article/0,9171,1376229-2,00.html For the first 400,000 years of our universe’s expansion, the universe was a seething maelstrom of energy and sub-atomic particles. This maelstrom was so hot, that sub-atomic particles trying to form into atoms would have been blasted apart instantly, and so dense, light could not travel more than a short distance before being absorbed. If you could somehow live long enough to look around in such conditions, you would see nothing but brilliant white light in all directions. When the cosmos was about 400,000 years old, it had cooled to about the temperature of the surface of the sun. The last light from the "Big Bang" shone forth at that time. This "light" is still detectable today as the Cosmic Background Radiation. This 400,000 year old “baby” universe entered into a period of darkness. When the dark age of the universe began, the cosmos was a formless sea of particles. By the time the dark age ended, a couple of hundred million years later, the universe lit up again by the light of some of the galaxies and stars that had been formed during this dark era. It was during the dark age of the universe that the heavier chemical elements necessary for life, carbon, oxygen, nitrogen and most of the rest, were first forged, by nuclear fusion inside the stars, out of the universe’s primordial hydrogen and helium. It was also during this dark period of the universe the great structures of the modern universe were first forged. Super-clusters, of thousands of galaxies stretching across millions of light years, had their foundations laid in the dark age of the universe. During this time the infamous “missing dark matter”, was exerting more gravity in some areas than in other areas; drawing in hydrogen and helium gas, causing the formation of mega-stars. These mega-stars were massive, weighing in at 20 to more than 100 times the mass of the sun. 
The crushing pressure at their cores made them burn through their fuel in only a million years. It was here, in these short lived mega-stars under these crushing pressures, the chemical elements necessary for life were first forged out of the hydrogen and helium. The reason astronomers can’t see the light from these first mega-stars, during this dark era of the universe’s early history, is because the mega-stars were shrouded in thick clouds of hydrogen and helium gas. These thick clouds prevented the mega-stars from spreading their light through the cosmos as they forged the elements necessary for future life to exist on earth. After about 200 million years, the end of the dark age came to the cosmos. The universe was finally expansive enough to allow the dispersion of the thick hydrogen and helium “clouds”. With the continued expansion of the universe, the light, of normal stars and dwarf galaxies, was finally able to shine through the thick clouds of hydrogen and helium gas, bringing the dark age to a close. (adapted from How The Stars Were Born - Michael D. Lemonick) The Elements: Forged in Stars – video https://www.youtube.com/watch?v=B-LXUHJmzzc
bornagain77
JAD,
It appears that DaveS wants to take the law or necessity option. However, if you are going to argue that all the apparent fine-tuning we see is necessary you are going to have to explain from where it originated. If it was there “in the beginning,” how did it get there? In other words, you’ve avoided the chance option but what does that really get you?
Frankly, I'm not even thinking about explaining apparent fine tuning at the moment, so I don't expect my "argument", such as it is, will have much bearing on that. What I am thinking about is this particular step in KF's reasoning which says, more or less, that if our physical constants are necessary in some sense, then there must exist a higher-order realm of tuneable entities which generated those constants. On its face, that seems very similar to positing the existence of a multiverse---which is also unfalsifiable and unconfirmable. daveS
KF,
DS, constants of physics are not logically compelled in any possible world, such as would the value of e or that of pi or phi. And that is before we get to things like charge balance or the proportions of normal matter and anti-matter, etc. Similarly, we have no reason to believe the structure of empirical laws we see is forced by power of logic so these must be in any possible world. That’s why people can talk about multiverses. KF
How do we know all these things about worlds to which we have no access? It would seem impossible to confirm or refute any of these statements.
... The range of things locked would require a significant mechanism.
Again, how do we know this? Is there some experiment we can perform to test it? daveS
It has been known for some time that the basic elements needed for planet formation and life chemistry could not exist in sufficient quantities without supernova explosions occurring at the right frequencies, and even in the right places. It is hard for me to see how this frequency, which depends on a number of other parameters, could be conceived as necessary or “locked in” and not contingent. Brian Koberlein, an astrophysicist and physics professor at Rochester Institute of Technology, explains the process:
For small stars, hydrogen is the only element they can fuse; when they run out, they go dark. But after the largest of the first stars transformed their hydrogen to helium, they burned on in another way. When these large stars stopped fusing hydrogen, their internal pressure went down, gravity began to collapse them again, and the temperature of their cores rose. As their cores reached a temperature of a hundred million Kelvin, helium began to fuse into beryllium (an atom with four protons), and beryllium and helium fused to produce carbon (six protons). The element central to life on Earth began to form in the blazing hearts of stars, though this carbon still had a long journey ahead before it would become a part of us. From carbon fusion comes nitrogen and oxygen (seven and eight protons, respectively), two more elements necessary for life, and from these comes a chain of fusion up to iron (26 protons). Fusing iron into heavier elements doesn’t produce more energy, as the fusion of lighter elements does—when iron fuses, it absorbs energy, which is actually a good thing. If elements always fused into heavier elements, then the first stars would have simply fused indefinitely, until they became neutron stars, enormous, undifferentiated orbs of nuclear material. But because the fusion of iron actually cools the core of a star, the chain of fusion shuts down. After their fusion stopped, the first big stars eventually collapsed under their own weight, which triggered supernova explosions. The outer layers of each star, rich in carbon, nitrogen, and oxygen, were cast into interstellar space, and only the cores of these stars collapsed, yet again, into neutron stars.
http://nautil.us/blog/how-the-universe-made-the-stuff-that-made-us However, the natural synthesis of heavy elements, as Koberlein goes on to explain, is even more complex than that because stellar nucleosynthesis is only capable of taking us up to iron. For example, it is believed that heavier elements like gold are the result of neutron star collisions. That is another thing that appears to me to be very contingent. Hugh Ross summarizes the situation:
supernovae eruptions:
if too close: life on the planet would be exterminated by radiation
if too far: not enough heavy element ashes would exist for the formation of rocky planets
if too infrequent: not enough heavy element ashes present for the formation of rocky planets
if too frequent: life on the planet would be exterminated
if too soon: heavy element ashes would be too dispersed for the formation of rocky planets at an early enough time in cosmic history
if too late: life on the planet would be exterminated by radiation
http://www.reasons.org/articles/fine-tuning-for-life-on-earth-june-2004 Again, here is the fine-tuning argument that William Lane Craig likes to use:
1. The fine-tuning of the universe to support life is due to either law, chance, or design.
2. It is not due to law [or necessity] or chance.
3. Therefore, the fine-tuning is due to design.
It appears that DaveS wants to take the law or necessity option. However, if you are going to argue that all the apparent fine-tuning we see is necessary you are going to have to explain from where it originated. If it was there “in the beginning,” how did it get there? In other words, you’ve avoided the chance option but what does that really get you? john_a_designer
MW, the estimated scale and age of the observed cosmos trace to empirical evidence. I simply spoke in that context. Come up with better evidence and numbers and I would go with them. KF kairosfocus
DS, constants of physics are not logically compelled in any possible world, such as would the value of e or that of pi or phi. And that is before we get to things like charge balance or the proportions of normal matter and anti-matter, etc. Similarly, we have no reason to believe the structure of empirical laws we see is forced by power of logic so these must be in any possible world. That's why people can talk about multiverses. KF PS: again, I spoke to the three alternatives: locked (which requires a locking mechanism, as we are not speaking of things like pi); flat random, free to take any value; and distributions that make some values more, and some less, likely. The range of things locked would require a significant mechanism. Whether flat or biased, a distribution will in the Monte Carlo type context eventually sample all cells in the space. So none of the three is able to remove fine tuning, as seen already. kairosfocus
mw Thanks for those fascinating historical and theological details. It's certainly enough to make us wonder about the various 'certainties' we are given. Silver Asiatic
@ 119, john_a_designer makes a fair point, “how do you calculate the odds of coincidence?” And gives an example. In conjunction, kf twice has stated the cosmos is 90 billion light years wide and 13.8 billion years old. ___________________________________________________________________ However; coincidences. Carl Jung coined the term “synchronicity.” “Several psychoanalysts noted certain strange coincidences in which their patients received information about them by extra-sensorial ways, information that was not accessible to the general public.” “Jung writes a book on synchronicity together with Nobel laureate W. Pauli,...” http://carl-jung.net/synchronicity.html As for omens or coincidences embedded with some perceived physical or spiritual significance, Yahweh warns to keep clear, as Isaiah prophesied. It is God, “who frustrates the omens of liars, and makes fools of diviners; who turns back the wise, and makes their knowledge foolish;” (Isa 44:25). Perhaps the following is an example of an omen? http://www.catholicnewsagency.com/news/san-gennaros-blood-didnt-liquefy--so-pray-anyway-abbot-says-74307/ However, some may call the figures of 90 and 13.8 a synchronicity; a term to explain a meaningful coincidence. The danger lies in who or what is contributing to the meaning. It seems that, coincidently, we have theoretically arrived at a scientific type of synchronicity, an omen; we have considered the figures by coincidence, as true from the beginning, which coincidently is taken as ‘proof’ that God created over 13.8 billion years. Such a coincidence we may have made full of sense to our liking, but without first applying clear divine law, the key to better our understanding. Therefore, on that basis, in my opinion, the figures are more of a meaningful coincidence, secondary to all being created in six days. As for the perceived 90 billion light-year size of the cosmos, what is that to the unknown size and power of God?
Julian of Norwich and her manuscript, “Revelations of Divine Love,” featured on BBC 4 TV, 19th July 2016. Julian, an unlettered woman, received the revelations at death’s door in May (probably the 13th) 1373. She later became the first English woman to write a book (Revelations) in English. “She is called Blessed, although she was never formally beatified,” and “venerated in both the Catholic and Anglican Communion.” She received a vision on the size of the creation: ‘In this vision he showed me a little thing, the size of a hazelnut, and it was round as a ball. I looked at it with the eye of my understanding and thought What may this be? And it was generally answered thus: “It is all that is made.” I marvelled how it might last, for it seemed it might suddenly have sunk into nothing because of its littleness. And I was answered in my understanding: “It lasts and ever shall, because God loves it.”’ http://www.cynthialarge.com/julian/hazelnutboxpoem.html You may object and say, but such is not canonical. True, but the Ten Commandments are, and they hold the key to understanding life, the cosmos and everything, and it is not the number 42, or a googol. https://en.wikipedia.org/wiki/Googolplex Is it not more accurate to say, those figures 90 and 13.8, give the impression of size and age? However, surely needed is a clearly provided divine key for better understanding, honest science and greater faith? In my opinion, we are continuing to pay the price for not keeping to divine law by pumping up exclusively the Big Bang Theory and Darwinism. A Catholic example and ten year olds: "Those that are leaving for no religion - and a pretty big component of them saying they are atheist or agnostic - it turns out that when you probe a bit more deeply and you allow them to talk in their own words, that they are bringing up things that are related to science and a need for evidence and a need for proof," said Dr. 
Mark Gray, a senior research associate at the Center for Applied Research in the Apostolate at Georgetown University. http://www.catholicnewsagency.com/news/why-catholics-are-leaving-the-faith-by-age-10-and-what-parents-can-do-about-it-48918/ The problem is, it seems, that the answer is to give children more of the same: evolution theory and divine law are compatible when they are billions of years apart. Proof! If Moses resurrected from the dead, would we believe him? mw
KF, Yes, I do think we are talking about two different things then. I am specifically concerned with the possibility that the physical constants can only take one value, namely those values they hold in the existing universe. I guess you would say that in this case, it is physically impossible for the constants to have been otherwise. It's not possible to rule this out, is it? Or even to estimate its probability. As to "going up a level", how can we actually know this makes sense? Again, this involves positing the existence of some sort of entities which determine the physical constants, with some degrees of freedom themselves. It seems even more speculative than simply supposing the constants actually are sampled from some distribution(s). It appears in your view that it cannot be that the constants each have only one possible value---you will always appeal to "higher levels" in order to maintain the position that they were somehow selected from a larger space. daveS
PPS: A designer pondering alternatives in an effective simulation world then effecting physically our observed cosmos based on choice would constitute fine tuning, too. Indeed, such would make Monte Carlo driven sensitivity analysis of greater import than anything else! (It would also give some force to the old thought about how in science we think God's thoughts after him -- along the lines of Hoyle's super-calculating super-intellect.) And, inherently, such a designer is on the table to explain origin of the world. kairosfocus
DS, remember, we are discussing not just laws but parameters and sheer simple quantities such as how many positive and negatively charged particles exist, how much matter exists, the balance of matter vs anti-matter etc. In that context, I have considered what the logical options are, relative to the known mathematical format of the composition and dynamics of cosmology that models our cosmos; using fairly standard longstanding approaches for looking at phase or configuration spaces and well-known results:
. . . once we have systems that can freely wander and any cell is at all possible, a large enough ensemble given enough time will pass through every possible state -- for that matter a singleton given enough time [much more time] will do the same, especially if we ponder a randomising force that drives a random walk effect that superposes on any trajectory . . . such a space is being searched and the issue is search resources and a wandering mechanism that makes any particular cell in the space a possibly occupied one . . . if possible [= accessible], it eventually will be occupied by the system.
Notice Walker and Davies above and their Arxiv paper. This then feeds an examination of the three alternatives, the three possibilities logically available:
(a) the lot are locked [by some force etc], (b) they vary with maximal freedom, (c) they vary with a bias that makes some zones easier/harder than others,
and this applies for in effect a neighbourhood of our local observed cosmos' framework. Such an approach will apply to ANY quasi-physical super-laws/ forces/ mechanisms driving such, whatever we may later see. To do that Monte Carlo style exploration I do not need anything more than the fact that mathematics can reasonably be seen as [the study of] the logic of structure and quantity. This study patently includes sensitivity analysis of the structure of mathematical systems or model frameworks. It is that study that led to the implications of fine tuning being highlighted since 1953. As for mechanisms that may drive the thing into lock [highly unlikely . . . and in regard to quantities of particles etc, almost utterly certainly not so] or give maximal freedom or freedom with a bias, that is of interest but not relevant to the point that we here have captured the three options for wandering around the abstract configuration space of the mathematical frame of a cosmos or sub-cosmos. The result is, fine tuning is still there as an inherent aspect of the mathematics, whether at first or second level or onward levels. On the locked option, ponder a force [in the broad sense] that locks cosmic mass at bang and inflation to what, 1 in 10^60 or so, or the famous result on initiating entropy that BA77 likes to allude to; it is not plausible that such a framing force is itself locked . . . indeed what is quite plausible is that we have a design decision with a bill of materials and properties etc that leads to framing a world that very much looks set to a local resonance because that is just what is intended. Fine tuning is not going away, as Sir Fred Hoyle knew long since. The real issue is to explain it, and the options seriously on the table are, multiverse or design. 
The evidence strongly favours the latter, but given the dominance of evolutionary materialism, that will be resisted, including by appealing to a multiverse for which there is but little physical evidence and certainly no observational evidence. KF PS: Someone above commented on the span of laws per observations and general views. I have given the span per observations and typical contexts, 90 bn LY and 13.8 BY. To change those simply provide strong enough observations to change the numbers. Science is not about observationally uncontrolled speculation. kairosfocus
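KF's sensitivity-analysis argument above can be sketched numerically. The toy Monte Carlo below is purely illustrative, not a physical calculation: a single dimensionless "constant" is drawn either from a flat distribution (his case b) or a biased one (case c), and we count how often it lands in an assumed narrow "life-permitting" window. The window location and width, and the beta-distribution bias, are made-up assumptions chosen only to show the shape of the argument.

```python
import random

# Hypothetical toy model: one dimensionless "constant" sampled on [0, 1],
# with a narrow "life-permitting" window. The window and the bias are
# illustrative assumptions, not physical values.
LO, HI = 0.500, 0.501   # window of width 1e-3
N = 200_000             # number of Monte Carlo draws per case

random.seed(0)  # reproducible runs

def hits(draw):
    """Count draws that land inside the narrow window."""
    return sum(1 for _ in range(N) if LO <= draw() <= HI)

flat = hits(random.random)                       # case (b): flat, maximal freedom
biased = hits(lambda: random.betavariate(2, 5))  # case (c): biased distribution

print(flat / N)    # close to the window width, i.e. roughly 1e-3
print(biased / N)  # a different rate, but still nonzero
```

Both cases hit the window at a small nonzero rate: the bias changes how often the zone is reached but does not forbid it, which is the point at issue between the "flat" and "biased" alternatives.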
KF,
The three cases with “distributions” of values are going to be locked to value seen in all possible and relevant worlds, varying flatly without constraint, an intermediate which is not quite fixed and not quite freely variable, i.e. a somehow preferred range but room to move about.
Yes, and I am asking only about the "locked to value" case, in which the physical constants could have only been what they actually are.
It turns out that on locking, then we face something that forces a wide range of quite disparate things to hold their values as we see them in any possible, relevant world. Such a law of force would obviously be itself fine tuned.
Well, "obviously fine tuned" suggests to me that this law of force could have been different (otherwise no tuning is possible). But that means that the physical constants actually could have been otherwise, depending on the particular law of force. So I think we have different ideas of what "locked to value" means here. daveS
DS, pardon me but the issue is the issue. There is indisputably an ideological imposition of a priori evolutionary materialism on science of origins and I spoke to that; particularly to the self referential incoherence and the problem that such mechanisms as it allows cannot pass the Newton vera causa test. Tied, there is a problem of undue accommodation to this, which I have spoken of as becoming a fellow traveller in previous discussions. Perhaps using "you" was a poor word choice on my part, as I certainly did not mean "you" = WR, specifically. For that point of possible confusion, I am sorry; unintentional. Next, I draw attention to the way I addressed the question of values, exhausting all three positions on variation of parameters, amounts, and structures of laws in the context of sensitivity analysis. And yes, I am treating cosmology as though it were a model and am asking, what happens if -- per mathematics -- things move about. The three cases with "distributions" of values are going to be locked to value seen in all possible and relevant worlds, varying flatly without constraint, an intermediate which is not quite fixed and not quite freely variable, i.e. a somehow preferred range but room to move about. This exhausts the possibilities. Which was the intent. It turns out that on locking, then we face something that forces a wide range of quite disparate things to hold their values as we see them in any possible, relevant world. Such a law of force would obviously be itself fine tuned. For the consideration, moving about freely in whatever configuration space is relevant, we readily see the local resonance peak and deeply isolated operating point put on the table since 1953. (This is the one that is being targeted by discussions about "you have to know the probability distribution," i.e. we see a rejection of the Bernoulli-Laplace principle of indifference in probability analysis as default in absence of specific reason to assign bias. 
I suspect this is actually selectively hyperskeptical, but am not yet defending that view.) The third view is the one that is most esoteric, and probably requires some familiarity with Gibbsian approaches to statistical mechanics to see its force. The approach is to take as a thought exercise a large ensemble of systems with similar components and starting conditions, then allow it to run for long enough. Where of course randomness is a significant component of what is happening. The result is that the phase space will be fully explored across the ensemble, as time goes on. If states are at all possible, they will be actual in some system at some point. (Just think: randomness is always disturbing trajectories, and in the classical case sensitive dependence on initial or intervening circumstances will cause divergence across time between initially similar systems, until after a time they will be radically different.) Eventually, with enough systems and enough time, every possible microstate will be actualised at some point. In this case, we are in effect doing a Monte Carlo run across the space of configs of the math of the cosmos, and thus the laws, parameters and values. Bias, but not locking, obtains, so even though it is hard to reach certain possible states, eventually they WILL be reached. So, the issue becomes, how big a collection and how much time to run to search out the relevant zone. The end is, a biased distribution will only make it harder to explore the space fully; it will not block it, otherwise we are looking at some form of locking. On surveying the three cases, the result is obvious. The sharp resonance that marks the laws and parameters we see will still be there. In the locked parameters etc case, this is simply displaced to the next level -- what locks the lock, so to speak. In the other two cases, the exploration will happen and the result will be to expose the sharp resonance. 
Coming back down to physical worlds, if the laws of physics, constants, values of quantities and so forth are locked to what we see, then there is a displacement of fine tuning to the locking force or mechanism, if you will. Something is setting up the cosmos bakery to consistently deliver well baked loaves of bread. In the other cases, the sharp resonance is evident right there. So, if a multiverse exists, we need to ask, why are we in an operating point on such a sharp resonance? (By statistical weight of clusters, we should be anywhere but here. This invites the inference that there is intentional fine tuning that put us here. Leslie's lone fly on a patch of wall swatted by a bullet.) Fine tuning is there; it is not going away. KF kairosfocus
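The ensemble claim in the exchange above, that bias makes some states rare but does not keep any reachable state from eventually being occupied, can also be illustrated with a toy model. The 20-state ring and the 0.7 step bias below are arbitrary assumptions, not cosmology; the sketch only shows that a biased random walk still visits every state in finite time.

```python
import random

# Toy sketch: a biased random walk on a small finite state space (a ring).
# Bias makes leftward states harder to reach, but every state is still
# visited eventually. State count and bias are illustrative assumptions.
STATES = 20
P_RIGHT = 0.7  # biased: rightward steps preferred

random.seed(1)  # reproducible runs

def first_full_coverage(steps_limit=1_000_000):
    """Walk until every state on the ring has been visited; return the step count."""
    pos, seen = 0, {0}
    for step in range(1, steps_limit + 1):
        pos = (pos + (1 if random.random() < P_RIGHT else -1)) % STATES
        seen.add(pos)
        if len(seen) == STATES:
            return step
    return None  # not reached within the limit (effectively never happens here)

print(first_full_coverage())  # a finite step count: all 20 states get visited
```

A stronger bias or a larger state space raises the coverage time, sometimes enormously, but only an outright lock (a transition probability of zero) actually removes states from play, which mirrors the locked-versus-biased distinction being debated.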
KF,
Pardon but did you observe that a few times now, I took time to look at two extrema and a spectrum between, then showed that none of the three cases suffices to remove the fine tuning issue?
Yes, I observed that you have claimed such*, but I wonder why you felt it necessary to include language such as the following in a post addressed to WR:
Once you have done that, all the challenges of comparative difficulties analysis are on the table. And you cannot appeal to the holy lab coat to lock out unwelcome major worldview alternatives. On pain of grand ideologically motivated question-begging. Where, very rapidly, such evolutionary materialism finds itself unable to account for the minds required to actually reason as opposed to compute. Not to mention, adherents face extraordinary problems accounting for creating FSCO/I rich computational substrates and soundly programming them. Evolutionary materialism is self referentially absurd. It is not a serious option, though it is a common one and it is artificially propped up ideologically, by those who hope to gain from its widespread adoption. We come full circle, without such materialism being privileged [and with it under a cloud of self falsification], what best explains precise coordination, coupling and organisation of many elements forming a coherently functional whole?
It seemed a bit condescending to me, given his position and background. But that's none of my business, I suppose. *This statement (following on yesterday's discussion) piques my interest:
Namely, if the cluster of parameters, quantities and laws are locked by some super-force, that force will be fine tuned. The problem is simply displaced one level.
It seems you are positing a collection of more than one "superforce" here, which is roughly just as questionable in my mind as positing that the fundamental physical constants are sampled randomly from some interval (of positive length) of real numbers. Why must there be more than one superforce (or any, for that matter) in order for each physical constant to take only one value? daveS
There are many things in real life whose probability can't be calculated. For example, how do you calculate the odds of a coincidence? There is a well-documented story from 1864 of a Booth saving a Lincoln before a Booth shot a Lincoln. A man by the name of Edwin Booth saved the life of a younger man named Robert Lincoln, who had just fallen off a station platform next to a moving train in Jersey City, NJ. But the coincidence goes beyond their last names. Edwin Booth was the brother of John Wilkes Booth, who assassinated President Abraham Lincoln, while Robert Lincoln was the President's oldest son. What are the odds of something like that happening? How would you ever begin to calculate the probability of something like that? http://www.historynet.com/edwin-booth

Another coincidence: Thomas Jefferson and John Adams both died on July 4, 1826, exactly 50 years after the signing of the Declaration of Independence. Again, how would you ever begin to calculate the odds of something like that happening?

People sometimes describe things like this as "just coincidence." And the cases that I cited above may be just that. (My apologies to any Calvinists out there.) However, what do we say when we're confronted with a string of coincidences? Suppose, for example, that one day while you are out driving you notice a dark SUV carrying two men dressed in dark suits and wearing dark sunglasses. A few days later you see them again, then again a couple of days after that. This continues for several weeks, even after you deliberately decide to drive a different route from your normal one. Would you say that this was just coincidence, or that you were being purposely followed? What would be the basis of your inference? A rigorous calculation of the probabilities? Or an intuition? The point is that when confronted with a string of coincidences, we naturally begin to suspect after a while that maybe it is not just coincidence. 
I think that this is what is happening when we're confronted with a string of finely tuned cosmic coincidences. Is it all just coincidence, or is there some other explanation? Ironically, both the theist and the atheist have the intuition that there must be some other explanation; it is just that they have different explanations. However, this leaves the atheist in the awkward situation of having to believe in something he has no evidence for -- the multiverse -- by faith. But don't atheists believe that faith is irrational? How ironic. john_a_designer
Certainly many of the fine tuning arguments are false: either because the universal constants cannot have any value other than the one they have (the universe can only be or not be, not be tuned), or because, while the values could in principle be tuned, the ones they have are a logical function of mathematics, and so are the most likely to begin with. mohammadnursyamsu
This semi-related piece just hit my facebook feed:
How a Defense of Christianity Revolutionized Brain Science - JORDANA CEPELEWICZ ON DEC 20, 2016 Excerpt: In 1748, philosopher David Hume published 'An Enquiry Concerning Human Understanding', calling into question, among other things, the existence of miracles. According to Hume, the probability of people inaccurately claiming that they'd seen Jesus' resurrection far outweighed the probability that the event had occurred in the first place. This did not sit well with the reverend. Inspired to prove Hume wrong, Bayes tried to quantify the probability of an event... "The basic probabilistic point" of Price's article, says statistician and historian Stephen Stigler, "was that Hume underestimated the impact of there being a number of independent witnesses to a miracle, and that Bayes' results showed how the multiplication of even fallible evidence could overwhelm the great improbability of an event and establish it as fact." The statistics that grew out of Bayes and Price's work became powerful enough to account for wide ranges of uncertainties. In medicine, Bayes' theorem helps measure the relationship between diseases and possible causes. In battle, it narrows the field to locate an enemy's position. In information theory, it can be applied to decrypt messages. And in the brain, it helps make sense of sensory input processes. http://nautil.us/blog/how-a-defense-of-christianity-revolutionized-brain-science
bornagain77
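The multiplication-of-witnesses point in the excerpt above can be sketched with a small Bayes' theorem calculation. The prior and the witness reliability figure below are purely hypothetical, chosen only to show how independent fallible testimony can overwhelm a tiny prior:

```python
# Hypothetical illustration of the Bayes/Price point: n independent
# witnesses, each reporting correctly with probability 'reliability'.
from fractions import Fraction

def posterior(prior, reliability, n_witnesses):
    """P(event | n independent witnesses all affirm it), via Bayes' theorem."""
    p, r = Fraction(prior), Fraction(reliability)
    likelihood_true = r ** n_witnesses          # all n report truly
    likelihood_false = (1 - r) ** n_witnesses   # all n err the same way
    num = p * likelihood_true
    return num / (num + (1 - p) * likelihood_false)

# A 1-in-a-million prior with ten 90%-reliable independent witnesses
# yields a posterior very close to 1:
print(float(posterior(Fraction(1, 10**6), Fraction(9, 10), 10)))
```

The effect depends entirely on the independence assumption; correlated witnesses would not multiply this way.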
DS, Pardon but did you observe that a few times now, I took time to look at two extrema and a spectrum between, then showed that none of the three cases suffices to remove the fine tuning issue?

Namely, if the cluster of parameters, quantities and laws are locked by some super-force, that force will be fine tuned. The problem is simply displaced one level.

Next, considering a reasonable neighbourhood of the observed operating point for the mathematical framework, we can use sensitivity analysis based on a flat random -- maximal freedom to move -- model, which immediately reveals the narrow resonance. That is, this highlights the fine tuning.

Then, if we look at the model's configuration space and apply some bias intermediate between the two cases so far, in a Monte Carlo style sensitivity analysis, so long as a large range of relevant cells in the space remains possible, a sufficiently large ensemble of cases with sufficient time to develop will explore all possibilities. The issue is how big a collection and how long. But this case has also failed to escape fine tuning; it only tells us how big a search is needed to do such an exploration. (This is utterly unsurprising on longstanding stat mech results.)

So, the side debates on specifying probability distribution functions are strictly irrelevant to the point at stake. Like many such debates, they may indeed pull the argument onto a side-point, but they do not actually remove the fine tuning challenge from the table. So, having first shown the irrelevancy, it is appropriate to draw attention back to the focal issue. KF kairosfocus
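The flat-random sensitivity analysis described above can be sketched as a toy Monte Carlo. The single dimensionless "constant," its sampling range, and the narrow viable band below are all invented for illustration; the point is only that uniform sampling over a neighbourhood directly measures how narrow the band is:

```python
# Toy Monte Carlo sensitivity sketch (illustrative numbers only):
# sample one hypothetical "constant" uniformly over a neighbourhood of
# its observed value and count landings in a narrow viable band.
import random

random.seed(0)

NEIGHBOURHOOD = (0.0, 2.0)   # flat "maximal freedom to move" range
BAND = (0.999, 1.001)        # hypothetical narrow life-permitting band

N = 100_000
hits = sum(1 for _ in range(N)
           if BAND[0] <= random.uniform(*NEIGHBOURHOOD) <= BAND[1])

# Expected hit fraction is band width / range = 0.002 / 2.0 = 0.001
print(f"hit fraction ~ {hits / N:.5f}")
```

Under a flat model the hit fraction just equals the band's share of the range; a biased (non-flat) distribution would change the number but not the qualitative narrowness, which is KF's stated point.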
In my opinion, using fine tuning as a teleological argument works just fine based merely on the precision of the fundamental constants alone, even if we can't rigorously derive any of their probabilities. John Leslie's firing squad parable, where 50 or more trained marksmen all miss a man who has been condemned to die, is a good illustration. Leslie, who I don't believe is a theist, used his parable to critique the weak anthropic principle, which says, in a question-begging way, that we shouldn't be surprised to find ourselves existing in this universe because if it weren't fine-tuned we would not be here. If you were the person who survived the firing squad, would you conclude it was chance or luck, or some kind of conspiracy? What's the best explanation? You would certainly have good reason to be surprised, wouldn't you? john_a_designer
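Leslie's parable gives no numbers, but purely for illustration: if each marksman independently missed with, say, probability 0.01 (a hypothetical figure), the chance that all 50 miss collapses to something astronomically small:

```python
# Illustrative arithmetic for Leslie's firing squad (assumed figures):
# 50 independent marksmen, each missing with probability 0.01.
p_miss = 0.01
n_marksmen = 50

p_all_miss = p_miss ** n_marksmen   # on the order of 10^-100
print(p_all_miss)
```

The exact per-marksman figure is irrelevant to the parable's force: any realistic miss probability raised to the 50th power is negligible, which is why the survivor is entitled to suspect design rather than luck.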
john_a_designer and daveS, I've apologized for my part in instigating his ad hominem towards me and moved on. Moreover, I have since earnestly tried to see whether his argument has any real merit: I read his post from top to bottom, watched the back and forth here, mulled it over, and I can still find no real merit in it. He simply has offered no compelling reason -- scientific, philosophical or otherwise -- why anyone should not be 'surprised' by fine tuning. And like everyone else here, I can only offer my own opinion. But that is my opinion of his argument, for what it's worth. I simply, all personal issues aside, find his argument to be without any real merit. bornagain77
JAD, Yes, I'm sure he can. And to be explicit, you are one of those I referred to above who I believe is giving WR a fair hearing and even recognizing the points he makes. daveS
DaveS, I think Wayne can handle the back and forth. Go back and read what I said @ 59. A direct link is here: https://uncommondesc.wpengine.com/fine-tuning/biology-prof-how-can-we-really-know-if-the-universe-is-fine-tuned/#comment-622376 And then his response at 61. Also read some of the legitimate concerns he raises @ 60. john_a_designer
KF and WR, Pardon my jumping in here, but I find this whole encounter quite baffling. AFAICS, WR has stated that unless someone can supply probability density functions (pdfs) for the fundamental physical constants [or perhaps some other justification], we should not make arguments based on the probability of those constants lying in a certain region. And for that he's being given a remedial course in ID. It appears to me that he simply wants to maintain a high level of rigor, and therefore should be praised (as some here have) rather than condescended to. daveS
JAD
Obviously we don't know anything at all about other universes; indeed, we don't know that another universe exists or existed -- let alone "10^500 possible universes."
Yes, this is the point which I haven't gotten wrossiter to acknowledge yet. You can't have it both ways. One cannot forbid speculation about the probable origin of physical constants while at the same time allowing speculation about the proposed 'count of additional (imaginary) universes that supposedly exist'. On WR's standard (which I agree with, strictly speaking), all scientific talk of a multiverse must be silenced. We only know of one universe. Our data set is a population of one. That's it. But are we willing never to engage those conversations on an "even if your imaginary speculation made sense" basis? I agree with WR's rigor on the topic, but he's not being consistent. If we adopted his view, all talk of a multiverse from the ID perspective would be totally dead. It is supported by zero scientific evidence. Silver Asiatic
In all sincerity, a really good piece, kf. Just a little light-hearted comment on the following: "The 90 bn LY wide, 13.8 bn year cosmos testifies to the stability of the system in the long term and across an extraordinarily large span." _______________________________________________________ Is space really "90 bn LY wide," or is that version from the flat-cosmos big bang society whereby we fall off the edge of space if we go any further?

Seriously though, is not perceived design over billions of years just what you would expect if God created a stable, mature universe in six days? According to any first impression in reading Judaeo-Christian scripture, God testifies he created in six days (Exod 31:18), words that God said are unalterable, and given with a warning (Deut 4:1-3). Today, we have gay people coming out of the closet seemingly replaced by those who believe God created by design in six days; words written in stone and placed in the ark of His Testimony, designed by God according to His plans. The mercy-seat (the lid on top of the ark) designed by God (Exod 25:17) had two golden images of cherubs (Exod 25:18). From above the mercy-seat God would speak personally to Moses in the Holy of Holies. Further, God even provided spirit-filled craftsmen "with ability, intelligence and knowledge," including for the clothing of the high priest and the furniture of the tabernacle (Exod 31:1-11). So much for the nonsense of theistic Darwinism and blind design in imperceptible steps. You cannot get just-when-needed spirit-filled craftsmen by theoretical common descent. Besides, Darwin rejected Jesus as the Son of God, and rejected miracles. Satan must have been rubbing his hands. However, God reminded Moses: "And see that you make them according to the pattern for them, which is being shown you on the mountain." (Exod 25:40)

Later, God provided the plans for the building of the first stone temple: "for the altar of incense made of refined gold, and its weight; also his plan for the golden chariot of the cherubim that spread their wings and covered the ark of the covenant of the LORD. 'All this, in writing at the LORD's direction, he made clear to me -- the plan of all the works.'" (1 Chr 28:18-19 and 11:12) Of note, the designs given by Yahweh included a gold disc, "a rosette of pure gold" (Exod 28:36), with four holes in it through which the disc was held by blue cord around the head-piece on the forehead of the high priest, with the words, "Holiness to Yahweh." Strict were the rules God gave for carrying His holy words in stone.

However, consider the six stone jars of water at Cana: the water instantly created into mature wine. No test could prove it was not created instantly, other than believing Jesus/God. According to untouchable divine law, we could say we live in the matured wine of the cosmos. We have no means to decide how the cosmos arose, because we cannot step outside the cosmos, nor have we the power to produce a test cosmos to verify theory or not. Ultimately, only a true testimony will suffice: a gold standard of truth. Tuned will be our judgement to the word of God from Sinai. Today, it seems God's word needs fine tuning to get his word in line with the Big Bang Theory. Hence, six days really means approx 365 x 13.8 billion years. Or am I the one in need of fine tuning?

As for design, the ark God designed was placed in the first permanent temple God designed. Yahweh also gifted king Solomon, who had the temple built, with the greatest wisdom. Are we to say that, with all the meticulous designs from God, over which the glory of God had spoken to Moses, built with spirit-filled craftsmen, God had placed two wrong words in the Holy of Holies in the first stone temple? Seriously flawed the numbers six and seven, blemished and disfigured words in the Holy of Holies?

Does that sound like a pattern of sound words? Or perhaps the result of a powerful beguiling theory the God of this world has taken to his dark heart? Are we to say that the God of numbers, who can number the hairs on our heads, cannot number the days he took to create, or number above our heads how long he took to create the cosmos? I mean, to create a cosmos in six days is worthy of worship. That God took ages, and ages, and ages seems very odd when he can create life instantly from a rock or stone. How long for a planetary rock: the same time, surely? True divinely given knowledge testifies God created the cosmos in six days. If we are truly honest, no human can check or truthfully test a divine law. "I did not speak in secret, in a land of darkness; I did not say to the offspring of Jacob, 'Seek me in chaos.' I the LORD speak the truth, I declare what is right." (Isa 45:19) Ah well, back to the closet. Happy Christmas. mw
Excellent work, KF. Thank you. Truth Will Set You Free
WR, pardon, but I find it necessary to ask: have you ever designed and built anything that requires fine precision and tight coupling of multiple parts -- say, something mechanical with parts working together to 1 thou (of an inch), or an electronic circuit with a 1-10 parts per million crystal-controlled oscillator that controls some process? Perhaps even a bit of carpentry that requires precision to fit and function. There is a world of experience of such, and the unity of purpose and mutual fit required for organisation and coupling based function to emerge is itself directly a strong sign of design. This is driven by a longstanding sense that multiple coincidences resulting in something of significantly precise fit, tight coupling and organisation to effect a function are not credibly driven by blind chance and/or equally blind mechanical necessity. Yes, I know, I know: in biology we have long been indoctrinated to believe such things come about by the magic of natural selection [as the inadvertently telling summary is put]. But in fact there is not ONE instance of actually observed emergence of functionally specific complex organisation and its associated information by known blind forces. There are trillions of cases observed by intelligently directed creative configuration. In this context, per the von Neumann kinematic self-replicator, we know that the mechanism of reproduction is also a case of FSCO/I, starting with cellular self-replication. So, the appeal to filtered chance variation is not actually credible for life, apart from an a priori imposition contrary to Newton's vera causa principle: that we should explain traces of things we have not seen the cause of only by causes actually observed to produce the like effect. Now, I assume you are familiar with Sir Fred Hoyle, holder of a Nobel-equivalent prize for his astrophysics. 
It is he who led the process of identifying fine tuning in the physics and arrangements of the observed cosmos, its substance and underlying laws and parameters. It turns out that something remarkable has happened with the mathematics, that there is extraordinary coordination, precision and specificity of the set of key factors, and this not in a context that is anywhere near to the imagined magic of chance variation and differential reproductive success leading to equally imagined grand changes of body plan by incremental process. And yet, this is connected to biology, as it turns out the extraordinarily precise sensitivity locked into the mathematics is keyed to a cosmos in which C-chemistry, aqueous medium, cell based life is established. For me, just the result of the first four most abundant elements and the extraordinary properties locked into such, is enough to give me pause, as it did Sir Fred. H, gives us everything starting with stars. He, gets us to the rest of the table of elements. C and O in close balance dependent on resonances tied to Be is then extraordinary: water, the oxides at the core of terrestrial planet crusts, ozone shields, organic chemistry based on C as connector block element, water with its astonishing simplicity as an individual molecule and sophistication of function through a sort of polymerisation tied to its polar molecule, giving solvent and thermal properties etc. Add in N which is close by, and we have proteins. Remember, these are the four most common elements here. Fine tuning as a necessary and enabling condition of the sort of biological, C-chemistry, aqueous medium, cell based life we observe. Strong signs of intent. Now, I had to point out several times, that sensitivity analysis is a standard procedure in dealing with design or modelling, or systems. And this is what we have in analysing the physics that frames a cosmos in which life like we enjoy is possible. 
The 90 bn LY wide, 13.8 bn year cosmos testifies to the stability of the system in the long term and across an extraordinarily large span. Is there a stabilising force that locks these parameters, laws, etc. together in any possible world? If so, the fine-tuning force is itself extraordinarily fine tuned, and this raises issues of design directly. (A theory of everything will NOT succeed in explaining away a cosmos; it only points to the sustaining power of the force that backs such a frame of laws, were it to be discovered. Not that a lot of nonsensical rhetoric would not be launched were such discovered. There are none so blind as those who are determined not to see.) Perhaps, then, it is variable instead, per branes and the like, with an extraordinary proposal of 10^500 sub-cosmi or some other ensemble . . . a standard move of statistical mechanics, BTW, is to analyse on a theoretical model of a large collection with closely similar initial conditions and independent unfolding. Whether there is maximal uncertainty in such an array -- thus, for all we know, flat random distributions of walks in phase space -- or else some bias that constrains the possibilities to some extent intermediate between fully free and fully locked, makes but little difference. In effect, a fundamentally random system with enough time and opportunity will walk throughout its phase space. A wide enough ensemble will therefore sample the full gamut of possibilities. Problem no. 1 is, we don't have quasi-infinite time and resources warranted by empirical evidence; we have about 10^80 atoms and 10^17 s. With organic chemistry that might get us up to 10^12 - 14 reactions per second. The underlying physics is seen to be extraordinarily sensitive in the abstract space of parameters. If we are looking at a brane or the like, we should not be here, at a tight, tight resonance as operating point. 
We should have any one of a number of far easier to find points, the Boltzmann brain world being the most commonly discussed. That is, I point to the relative statistical weight of clusters of states. In this sense, Craig is well warranted to speak of 'improbable.' The reason why we see certain standard thermodynamic properties and patterns is not that far different cases are impossible, but that accessibility of possible states is multiplied by utterly overwhelmingly dominant clusters of more or less neighbouring configurations. This drives us to the case of powerful stabilisation, to the point where in most cases fluctuations are simply below observability. Hence Craig's comments on Boltzmann brain worlds and the like. No, Craig has not made an embarrassing mis-step; he has alluded to a subtle but powerful pattern in large spaces of possibilities: utterly dominant clusters. In that higher order sense of probability, he is right to say it is utterly improbable to see us in this sort of cosmos, as opposed to clusters of possibilities in the abstract that would carry utterly overwhelming statistical weight. With blind forces and circumstances as the intended explanation, one is in fact constrained by such, and appeal to bare possibility becomes utterly incredible beyond a certain degree. Just the distribution of 1,000 coins tossed makes the point. Utterly overwhelmingly, you will find yourself by chance in the states close to 50-50, in no particular readily recognisable organised pattern -- unlike patterns that can be simply and independently described, such as "alternating H and T." In such a context, seeing such a pattern is thus sufficient grounds to infer design on a sign that is highly reliable. But we are not there in the utterly dominant macrostates for observers; we are here at an utterly isolated narrow resonance as operating point for our cosmos. What, on our experience, explains such a phenomenon, apart from expert craftsmanship? Ans (as a rule): silence and diversion. 
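The 1,000-coin illustration can be checked directly. This sketch compares the statistical weight of the near-50-50 cluster of outcomes against the probability of any single fully specified sequence (such as strictly alternating H and T):

```python
# Statistical weight of the near-50-50 macrostate for 1,000 fair coins,
# versus the probability of one exact specified 1,000-flip sequence.
from math import comb

N = 1000
total_states = 2 ** N

# Probability the head count lands within 50 of 500 (the dominant cluster):
near_half = sum(comb(N, k) for k in range(450, 551)) / total_states
print(f"P(450 <= heads <= 550) ~ {near_half:.4f}")

# Any one fully specified sequence, e.g. alternating H and T:
print(f"P(one specific sequence) = 2^-1000 ~ {2.0 ** -1000:.3e}")
```

The near-50-50 band captures essentially all the probability mass, while any one named sequence sits at roughly 10^-301, which is the contrast the comment is drawing.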
Compound this by looking at the presence of CODE in the heart of cell based life, i.e. LANGUAGE and ALPHABETIC symbol systems tied to ALGORITHMS and implementation engines using molecular nanotech. What best explains symbols, language, algorithms and implementation engines? What is the empirically warranted explanation of same? What happens if we transform the cosmos into a grand ensemble, giving each atom a tray of 1,000 coins or equivalent [say a paramagnetic substance with a weak B field to impose directional order], and toss and observe 10^12 - 14 times/s for 10^17 s? Ans: we can only sample so small a portion of the space of possibilities that, if we were to compare the effective zone of search to a needle, the haystack of possibilities would more than swallow up the observed cosmos. Notice, again: not probability, but search challenge in light of an extraordinary degree of functionally complex organisation, pointing to islands of function in a much larger space of possible configurations, leading to a needle-in-vast-haystack search challenge. Fine tuning, again. With LANGUAGE, CODES, ALPHABETS and ALGORITHMS in play. Again, a strong sign pointing to design, in the core of cell based life and causally antecedent to there being cellular self-replication on genetic information. Where the chemistry used for this is rooted in the fine tuned cosmos. Mutually reinforcing inferences from several widely different sciences, exponentiating the explanatory challenge to come up with a serious alternative to design. Then, look at ourselves, as needing to be responsibly and rationally free to undertake such a study on logic. Such freedom of mind cannot be explained on GIGO-limited computational substrates -- not digital ones, nor analogue ones, nor neural network ones. Computation is inherently blindly mechanical, not freely rational; it depends on the prior sound organisation of a programmer or designer to work. 
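A back-of-envelope check on the sampling claim, using the figures quoted in the comment (about 10^80 atoms, 10^17 s, and up to 10^14 tosses per second per atom), with everything done in base-10 logarithms:

```python
# Back-of-envelope for the needle-in-haystack search claim,
# using the round figures quoted in the thread.
from math import log10

log_trials = 80 + 14 + 17       # log10(atoms * tosses/s * seconds) = 111
log_space = 1000 * log10(2)     # log10(2^1000), about 301

log_fraction = log_trials - log_space
print(f"trials ~ 10^{log_trials}, space ~ 10^{log_space:.0f}, "
      f"sampled fraction ~ 10^{log_fraction:.0f}")
```

Even with these generous per-atom rates, the cosmos-wide trial count covers about 1 part in 10^190 of a 1,000-coin configuration space, which is the "needle versus haystack" comparison being made.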
And blind chance and/or mechanical necessity, in the face of the relevant functionally specific complex organisation and associated information (usually abbreviated hereabouts as FSCO/I), is not a credible explanation. Further convergent evidence. Now, this is not at all a demonstrative proof compelling agreement of all rational individuals. Not even Mathematics, on the whole, post Godel, can achieve that. Instead, we have warrant on evidence-led inference to the best current explanation, multiplied by the associated consequences of either global or selective hyperskepticism. And notice, we have not drawn on any precise probability models or frameworks to this point. We have simply pointed out that once there is a system that is subject to sensitivity analysis, we face lock-down to a given config or else some degree of freedom to wander in the relevant config space. To wander across the whole space, simply multiply the number of possible cases in a population of tests; once such becomes quasi-infinite, the all-but-impossible is going to be there in some particular case. That is certainly one reason why multiverse models are appealed to in answer to the discovery of fine tuning. But then it poses a cruel dilemma: abandon probability based reasoning as a block, or stand indicted as playing at selective hyperskepticism to dismiss what one does not wish to face in the only case we actually do observe. As in, we have now crossed over into philosophical speculation, as there is no actual observed basis for a quasi-infinite array of possible worlds. In that speculation, we face the point that by overwhelming odds we should observe a world with parameters that are anything but as we observe. That is, on a multiverse speculation, we are in an extraordinarily anomalous situation. We should not be seeing the sort of resonance point world we seem to be seeing. But there it is, all around us. So, we have a choice that speaks volumes on the inclinations of our hearts. 
Especially, in a world where -- we are now in phil, have been for some time since we are looking at unobserved multiverses etc -- we find ourselves subject to moral government and underlying principles of natural law. That, too, must be explained. But coming back full circle, the issue of possible different degrees of likelihood of different configurations makes but little difference in a quasi-infinite ensemble. Save, to tell us just how big it needs to be to make something in principle observable. That is where things fall apart for the materialist and those who travel with or unduly accommodate them, we are looking at vast realms of the unobserved, and so the materialist crosses over into philosophy unrecognised. Once you have done that, all the challenges of comparative difficulties analysis are on the table. And you cannot appeal to the holy lab coat to lock out unwelcome major worldview alternatives. On pain of grand ideologically motivated question-begging. Where, very rapidly, such evolutionary materialism finds itself unable to account for the minds required to actually reason as opposed to compute. Not to mention, adherents face extraordinary problems accounting for creating FSCO/I rich computational substrates and soundly programming them. Evolutionary materialism is self referentially absurd. It is not a serious option, though it is a common one and it is artificially propped up ideologically, by those who hope to gain from its widespread adoption. We come full circle, without such materialism being privileged [and with it under a cloud of self falsification], what best explains precise coordination, coupling and organisation of many elements forming a coherently functional whole? Ans: intelligently directed configuration, aka design. KF kairosfocus
Origenes @ 97 quotes William Lane Craig:
“even though there may be a huge number of possible universes lying within the life-permitting region of the cosmic landscape, nevertheless that life-permitting region will be unfathomably tiny compared to the entire landscape, so that the existence of a life-permitting universe is fantastically improbable. Indeed, given the number of constants that require fine-tuning, it is far from clear that 10^500 possible universes is enough to guarantee that even one life-permitting world will appear by chance in the landscape!”
I agree with Wayne when he says that Craig overreaches when he tries to apply a probabilistic argument to the multiverse ("the existence of a life-permitting universe is fantastically improbable"). Obviously we don't know anything at all about other universes; indeed, we don't know that another universe exists or existed -- let alone "10^500 possible universes." Craig commits what I call the "stepping in it" error. I learned about this error when I was growing up. As kids we liked to take walks in the fields of my uncle's dairy farm. However, he warned us up front, "Don't step in it." (In case you're wondering what it is, it rhymes with it.) On the other hand, atheistic naturalists and materialists are compelled to accept the idea of a multiverse because they apparently believe that our universe's fine-tuning is a result of chance. If it is a result of "chance," it is their responsibility to derive probabilities for each of the cosmological constants, is it not? As I have written before,
"one of the strongest arguments in favor of teleology (design or purpose) is the overwhelming evidence for what is commonly termed the fine tuning of the universe. Theists like myself argue that an intelligent Creator (God) is the ultimate explanation behind this apparent teleology. Ironically, even some atheists are willing to concede that God is a possible explanation for the universe's apparent fine-tuning. For example, in 2007, while making observations at the Keck observatory in Hawaii, Sandra Faber, a professor of astronomy at the University of California, Santa Cruz, told science writer Anil Ananthaswamy "that there were only two possible explanations for fine-tuning. 'One is that there is a God and that God made it that way…' But for Faber, an atheist, divine intervention is not the answer. "The only other approach that makes any sense is to argue that there really is an infinite, or a very big, ensemble of universes out there and we are in one," she said. This ensemble would be the multiverse. In a multiverse, the laws of physics and the values of physical parameters like dark energy would be different in each universe, each the outcome of some random pull on the cosmic slot machine. We just happened to luck into a universe that is conducive to life. After all, if our corner of the multiverse were hostile to life, Faber and I wouldn't be around to ponder these questions under stars." Other atheists agree that God counts as a rational explanation. In a debate with Christian philosopher William Lane Craig, California Institute of Technology physicist Sean Carroll said, "I'm very happy to admit right off the bat -- [that God fine-tuning the universe] is the best argument that the theists have when it comes to cosmology." However, Carroll then deftly takes away with the left hand what he had just offered with his right. "I am by no means convinced that there is a fine-tuning problem," he told Craig. Oh? Is Carroll speaking for everyone? 
Is an airy wave of the hand all that is needed to dismiss fine tuning as a problem? Other prominent physicists and astrophysicists would disagree, among them Sir Martin Rees, Paul Davies, Roger Penrose, Stephen Hawking, Max Tegmark, Andrei Linde and Alexander Vilenkin, to name a few. All of these men, as far as I know, reject traditional theism. Nevertheless, they see fine-tuning as a real problem in need of an explanation.
https://uncommondesc.wpengine.com/intelligent-design/scientists-driven-to-teleological-view-of-the-cosmos/#comment-622191 Why is it a problem? Because they believe “chance” is the explanation for the universe’s fine tuning. But chance just can’t start from nothing, so they have to dream up a way to kick the can down the road-- forever, if possible. Unfortunately, at least for the present, chance, the way they are using it, is not a scientific explanation but a metaphysical one. john_a_designer
A few quick questions: After this discussion about the probability of the fine-tuning parameters is over, when the dust has settled, are we going to have a valid explanation for the origin of biological systems? Does fine-tuning alone resolve that problem? Does complex functional specified information have to be created? Can fine-tuning create it? Is fine-tuning a necessary condition? Is it sufficient? Thank you. Dionisio
bornagain77, Incidentally, the quantum Zeno effect apparently also suppresses quantum tunneling. http://phys.org/news/2015-10-zeno-effect-verifiedatoms-wont.html -Q Querius
The reason why I am very impressed with the Quantum Zeno effect is, to reiterate, that entropy is, by a wide margin, the most finely tuned of the initial conditions of the Big Bang; i.e. 1 in 10^10^123 is a number that can't be written down in long-hand notation even if every particle in the universe were used to try to denote it. Another reason I am very impressed with the Quantum Zeno Effect is because of how foundational entropy is in its explanatory power for the actions within the space-time of the universe:
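To make the "can't be written down" claim concrete, here is a quick back-of-the-envelope sketch (assuming the standard rough estimate of about 10^80 particles in the observable universe; the number 10^(10^123) has roughly 10^123 decimal digits):

```python
# Assumed figures: ~10^123 digits needed (the digit count of 10^(10^123))
# and ~10^80 particles in the observable universe (standard rough estimate).
digits_needed = 10**123
particles = 10**80

# Even inscribing one digit on every particle, we fall short by 10^43:
shortfall = digits_needed // particles
print(shortfall == 10**43)  # True
```

Python's arbitrary-precision integers make this exact; the odds themselves, 10^(10^123), are of course unbuildable.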
Shining Light on Dark Energy – October 21, 2012 Excerpt: It (Entropy) explains time; it explains every possible action in the universe;,, Even gravity, Vedral argued, can be expressed as a consequence of the law of entropy. ,,, The principles of thermodynamics are at their roots all to do with information theory. Information theory is simply an embodiment of how we interact with the universe —,,, http://crev.info/2012/10/shining-light-on-dark-energy/
And to make entropy even more personal, entropy is also the primary reason why our physical, temporal, bodies grow old and die,,,
Aging Process – 85 years in 40 seconds – video http://www.youtube.com/watch?v=A91Fwf_sMhk
* 3 new mutations every time a cell divides in your body
* Average cell of a 15-year-old has up to 6,000 mutations
* Average cell of a 60-year-old has 40,000 mutations
Reproductive cells are ‘designed’ so that, early on in development, they are ‘set aside’ and thus they do not accumulate mutations as the rest of the cells of our bodies do. Regardless of this protective barrier against the accumulation of slightly detrimental mutations, still we find that,,,
* 60-175 mutations are passed on to each new generation. (Per John Sanford)
Entropy Explains Aging, Genetic Determinism Explains Longevity, and Undefined Terminology Explains Misunderstanding Both - 2007 Excerpt: There is a huge body of knowledge supporting the belief that age changes are characterized by increasing entropy, which results in the random loss of molecular fidelity, and accumulates to slowly overwhelm maintenance systems [1–4].,,, http://www.plosgenetics.org/article/info%3Adoi/10.1371/journal.pgen.0030220
And yet, to repeat,,,
Quantum Zeno effect Excerpt: The quantum Zeno effect is,,, an unstable particle, if observed continuously, will never decay. per wiki
This is just fascinating! Why in blue blazes should conscious observation put a freeze on entropic decay, unless consciousness was/is more foundational to reality than the 1 in 10^10^123 entropy is? In fact, when including other lines of evidence from quantum mechanics, we have a compelling argument for God from consciousness. Putting all the lines of evidence together the argument for God from consciousness can now be framed like this:
1. Consciousness either preceded all of material reality or is an ‘epi-phenomenon’ of material reality.
2. If consciousness is an ‘epi-phenomenon’ of material reality, then consciousness will be found to have no special position within material reality. Whereas conversely, if consciousness precedes material reality, then consciousness will be found to have a special position within material reality.
3. Consciousness is found to have a special, even a central, position within material reality.
4. Therefore, consciousness is found to precede material reality.
Five intersecting lines of experimental evidence from quantum mechanics that show that consciousness precedes material reality (Double Slit, Wigner’s Quantum Symmetries, Wheeler’s Delayed Choice, Leggett’s Inequalities, Quantum Zeno effect): https://docs.google.com/document/d/1uLcJUgLm1vwFyjwcbwuYP0bK6k8mXy-of990HudzduI/edit
Verses, Video and Music:
Romans 8:18-21 I consider that our present sufferings are not worth comparing with the glory that will be revealed in us. The creation waits in eager expectation for the sons of God to be revealed. For the creation was subjected to frustration, not by its own choice, but by the will of the one who subjected it, in hope that the creation itself will be liberated from its bondage to decay and brought into the glorious freedom of the children of God.

Psalm 102:25-27 Of old You laid the foundation of the earth, And the heavens are the work of Your hands. They will perish, but You will endure; Yes, they will all grow old like a garment; Like a cloak You will change them, And they will be changed. But You are the same, And Your years will have no end.

"We have the sober scientific certainty that the heavens and earth shall ‘wax old as doth a garment’.... Dark indeed would be the prospects of the human race if unilluminated by that light which reveals ‘new heavens and a new earth.’" Sir William Thomson, Lord Kelvin (1824 – 1907) – pioneer in many different fields, particularly electromagnetism and thermodynamics.

The Resurrection of Jesus Christ as the 'Theory of Everything' (Entropic Concerns) - video https://www.youtube.com/watch?v=rqv4wVP_Fkc&index=2&list=PLtAP1KN7ahia8hmDlCYEKifQ8n65oNpQ5

Evanescence – The Other Side (Lyric Video) http://www.vevo.com/watch/evanescence/the-other-side-lyric-video/USWV41200024?source=instantsearch
bornagain77
Well, I find the 1 in 10^10^123 initial entropy of the universe to be devastating for atheistic metaphysics from two different angles. First, the 1 in 10^10^123 event is so ‘exorbitantly improbable’ that it drives atheistic materialism, via the merely ‘mediocrely improbable’ Boltzmann brains it implies, into catastrophic epistemological failure. Second, it also, via quantum mechanics, provides fairly compelling evidence that consciousness must precede material reality. As to the first point:
The Physics of the Small and Large: What is the Bridge Between Them? Roger Penrose Excerpt: "The time-asymmetry is fundamentally connected with the Second Law of Thermodynamics: indeed, the extraordinarily special nature of the Big Bang (to a greater precision than about 1 in 10^10^123, in terms of phase-space volume) can be identified as the "source" of the Second Law (Entropy)." http://irafs.org/irafs_1/cd_irafs02/texts/penrose.pdf

"The 'accuracy of the Creator's aim' would have had to be one part in 10^10^123" Hawking, S. and Penrose, R., The Nature of Space and Time, Princeton, Princeton University Press (1996), 34, 35.

Multiverse and the Design Argument - William Lane Craig Excerpt: Roger Penrose of Oxford University has calculated that the odds of our universe’s low entropy condition obtaining by chance alone are on the order of 1 in 10^10(123), an inconceivable number. If our universe were but one member of a multiverse of randomly ordered worlds, then it is vastly more probable that we should be observing a much smaller universe. For example, the odds of our solar system’s being formed instantly by the random collision of particles is about 1 in 10^10(60), a vast number, but inconceivably smaller than 1 in 10^10(123). (Penrose calls it “utter chicken feed” by comparison [The Road to Reality (Knopf, 2005), pp. 762-5]). Or again, if our universe is but one member of a multiverse, then we ought to be observing highly extraordinary events, like horses’ popping into and out of existence by random collisions, or perpetual motion machines, since these are vastly more probable than all of nature’s constants and quantities’ falling by chance into the virtually infinitesimal life-permitting range. Observable universes like those strange worlds are simply much more plenteous in the ensemble of universes than worlds like ours and, therefore, ought to be observed by us if the universe were but a random member of a multiverse of worlds.
Since we do not have such observations, that fact strongly disconfirms the multiverse hypothesis. On naturalism, at least, it is therefore highly probable that there is no multiverse. — Penrose puts it bluntly “these world ensemble hypothesis are worse than useless in explaining the anthropic fine-tuning of the universe”. http://www.reasonablefaith.org/multiverse-and-the-design-argument Does a Multiverse Explain the Fine Tuning of the Universe? - Dr. Craig (observer selection effect vs. Boltzmann Brains) - video https://www.youtube.com/watch?v=pb9aXduPfuA
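Craig's "utter chicken feed" remark can be checked with exact integer arithmetic. Working with the logarithms of the two odds (the odds themselves are far too large to construct), the ratio between 1 in 10^10^123 and 1 in 10^10^60 is 10^(10^123 - 10^60), and subtracting 10^60 barely dents the exponent. A minimal sketch:

```python
# log10 of each of the two odds Craig compares (exact Python integers):
log_universe_odds = 10**123  # log10 of Penrose's 10^(10^123)
log_solar_odds = 10**60      # log10 of the solar-system figure 10^(10^60)

# The ratio of the two odds is 10^(log_ratio):
log_ratio = log_universe_odds - log_solar_odds

# Subtracting 10^60 changes 10^123 by less than one part in 10^62,
# so the ratio is still, in effect, 10^(10^123).
print(len(str(log_ratio)))  # 123 (digit count of the exponent)
```

That is the sense in which the solar-system odds are negligible: the exponent of the ratio is essentially still 10^123.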
Thus, if you believe you live in a rational universe and are not a 'Boltzmann brain', then you are forced to believe it is 'exorbitantly' more likely that Theism is true. As to the second point, an unstable particle, if observed continuously, will never decay. This is known as the Quantum Zeno Effect:
Quantum Zeno Effect The quantum Zeno effect is,, an unstable particle, if observed continuously, will never decay. http://en.wikipedia.org/wiki/Quantum_Zeno_effect Interaction-free measurements by quantum Zeno stabilization of ultracold atoms – 14 April 2015 Excerpt: In our experiments, we employ an ultracold gas in an unstable spin configuration, which can undergo a rapid decay. The object—realized by a laser beam—prevents this decay because of the indirect quantum Zeno effect and thus, its presence can be detected without interacting with a single atom. http://www.nature.com/ncomms/2015/150414/ncomms7811/full/ncomms7811.html?WT.ec_id=NCOMMS-20150415 “It has been experimentally confirmed,, that unstable particles will not decay, or will decay less rapidly, if they are observed. Somehow, observation changes the quantum system. We’re talking pure observation, not interacting with the system in any way.” Douglas Ell – Counting to God – pg. 189 – 2014 – Douglas Ell graduated early from MIT, where he double majored in math and physics. He then obtained a masters in theoretical mathematics from the University of Maryland. After graduating from law school, magna cum laude, he became a prominent attorney. 'Zeno effect' verified: Atoms won't move while you watch - Oct. 22, 2015 Excerpt: One of the oddest predictions of quantum theory – that a system can’t change while you’re watching it,,, Graduate students Yogesh Patil and Srivatsan Chakram created and cooled a gas of about a billion Rubidium atoms inside a vacuum chamber and suspended the mass between laser beams. In that state the atoms arrange in an orderly lattice just as they would in a crystalline solid. But at such low temperatures the atoms can “tunnel” from place to place in the lattice. ,,, The researchers demonstrated that they were able to suppress quantum tunneling merely by observing the atoms. http://www.news.cornell.edu/stories/2015/10/zeno-effect-verified-atoms-wont-move-while-you-watch
bornagain77
Bornagain77: Would 1 in 10^10^123 initial entropy be considered ‘exorbitantly improbable’?
That depends. It is not at all improbable in the context of 'eternal inflation' — in fact nothing is. There may very well be compelling arguments against eternal inflation, but improbability of events is not one of them. Origenes
as to: "My only beef is with the assertion that the observed values are exorbitantly improbable. I simply want ID folks and apologists to stop making those statements. They seem almost entirely unfounded. That is all." Would 1 in 10^10^123 initial entropy be considered 'exorbitantly improbable'? :) I see no reason to stop arguing that 1 in 10^10^123 initial entropy is 'exorbitantly improbable'. It certainly is 'exorbitantly improbable'. After having spent a few days trying to see if there is any real meat to your criticism, I still think you are tilting at windmills. Perhaps more so. You simply offer no compelling reason. There is nothing that you have said that makes me question the extraordinary nature of the constants or the initial conditions of the universe as we find them. bornagain77
Or as my former boss, a Marine 2-star general, used to say: "That should be obvious even to a sea-going corporal." ayearningforpublius
I repeat my comments and observations from @5 above: ------------------------------------- Some of the fundamentals of Darwinian Evolution, as I understand it, are: The complexities of life we see all around us, and within us, are assembled from the bottom up in a Natural Selection process which chooses beneficial mutations among a long series of such changes, while allowing less beneficial changes to wither away, or perhaps to remain as flotsam or “junk.” The resulting “designs” we see from such a process are merely illusions, the appearance of design … not actual design as we see in all of the human artifacts we dwell among, such as automobiles and computers. Evolution is said to be without purpose, without direction and without goals. What we may see as purpose, direction and goals are simply the result of the workings of natural processes – simply illusions, the appearance of design. ___________________ So then why do we see purpose, direction and goals at every level of life – from the cellular level, to the systems level, to the completed body plan? We see purpose in the various machines and structures within each of the several trillion cells in our bodies. We see the kinesin motor transporting cargo from one place in the cell to another. We see the marvel of DNA which, coupled with other cellular components, represents not only a massive mass-storage capability, but also a type of blueprint package defining all aspects of the end-product body. This DNA package also contains what can be described as a complete set of “shop travelers” which, much like a manufacturing process, provide step-by-step instructions and bills of materials for the manufacture of the myriad parts making up the completed human body – bones, hair, brain, liver, eye, nose … and more. And each of these subunits exhibits purpose — specific purpose.
What is finally assembled as an arm and hand, for example, takes on a myriad of functional purposes, such as accurately throwing a baseball, playing a musical instrument such as a violin, and cradling a newborn baby. Each of our vital organs plays specific and necessary roles in keeping our body alive and functioning – there are goals and purpose expressed in each and every one of our body parts. What we see and experience in the finished goal-directed and purposeful human body is beautifully expressed in many ways, such as when we witness a magnificent choral and orchestral performance such as Handel’s Messiah. What we experience in that concert hall is not an illusion — it is real and is the culmination of a multitude of designs, both in the natural realm as well as the realm of human intelligence and ingenuity. _____________________________ It seems as simple as that! Why all this talking over the heads of the common man? ayearningforpublius
wrossite: ... I’ll ask you the same question I’ve asked others: What is the range of possible values for a given parameter (let’s say Newton’s gravitational constant) and what does the probability distribution look like? (and how did you derive it)?
Is it not obviously the case that the probability distribution depends on the proposed mechanism? And if this is indeed the case, then what proposed mechanism does Kairosfocus use as a context? As I understand it there are at least two contenders: 1) The 'cosmic landscape', which is related to string theory. If I understand it correctly: 10^500 different universes governed by the present laws of nature but with a uniform distribution of different values of the physical constants. There seems to be considerable room for discussion about probabilities. Craig writes:
"even though there may be a huge number of possible universes lying within the life-permitting region of the cosmic landscape, nevertheless that life-permitting region will be unfathomably tiny compared to the entire landscape, so that the existence of a life-permitting universe is fantastically improbable. Indeed, given the number of constants that require fine-tuning, it is far from clear that 10^500 possible universes is enough to guarantee that even one life-permitting world will appear by chance in the landscape!"
2) Eternal inflation.
Carrier: Everyone agrees multiverse theory refutes any fine tuning argument for God. Because on a standard multiverse theory (e.g. eternal inflation), all configurations of physical universes will be realized eventually, and therefore the improbability of any of them is negated. No matter how improbable an individual universe is, the probability that it exists if a multiverse exists is effectively 100%.
If Carrier is correct, then there is no sense in talking about probabilities in the context of 'eternal inflation'. Origenes
KF,
DS, kindly see just above. The relevant calcs are done in stat mech or in info theory, where systems are WLOG reducible to strings that describe them in some description language.
Ok, I take it this means your proposal is meant to actually be carried out. Please let me know if anyone publishes on this second-order sensitivity analysis. daveS
All, it seems that my point has been conceded over and over, and yet some of you want to misrepresent the extent of my argument. My argument is incredibly modest. From the get-go, I have acknowledged the precision (sensitivity) argument. I'm not taking exception with that point. My only beef is with the assertion that the observed values are exorbitantly improbable. I simply want ID folks and apologists to stop making those statements. They seem almost entirely unfounded. That is all. W wrossite
Wayne @ 81 wrote,
“My major concern with arguments from fine-tuning in cosmology is, how do we really get from observations of precision to statements of probability? To say that something is precise is not to say that it is improbable. Those are two different things…To say that such-and-such a physical parameter rests upon a razor’s edge does tell us something. It tells us that any small change in the setting of that parameter would lead to a universe drastically different from the one we live in, and likely one that could never even produce material objects (let alone life) as we understand it. Fair enough. I agree. What it doesn’t tell us is how likely any of those other settings are.” Okay, so “precision” is a way of speaking of sensitivity. Here we agree. Therefore what? To even talk about sensitivity as if it’s important, we must assume that it was possible for a parameter to fall outside of those narrow ranges of precision. We cannot… Bruce Gordon offers, “Suppose that the universe is about 30 billion light years across…and you stretch a tape measure across that. That comes out to about 10^28 inches. Peg Newton’s constant on one of those inches. Now, what would be the consequence, say, of moving one inch to the left or to the right?” These are explicit appeals to statements regarding parameter space (a range of possible values, all being equiprobable). If we can’t say such things, then we shouldn’t.
A couple of people commenting here have said that they think that you have made a good point here. I agree that we cannot at present derive probabilities for every one of the fine-tuned constants. However, I see a problem. Why can’t we say that? Is it logically impossible? Earlier @ 72 I wrote: “Assuming that they could be different (and not knowing if they could be, it is possible that they could) how would you derive a probability for each one?” Again, I’ll concede that we have a problem getting “from observations of precision to statements of probability”; however, I don’t see why that should keep us from speculating about what metaphysically could be. Here is the fine-tuning argument that William Lane Craig likes to use:
1. The fine-tuning of the universe to support life is either due to law, chance or design.
2. It is not due to law or chance.
3. Therefore, the fine-tuning is due to design.
And, of course, if the universe is designed it must have a designer. Craig does NOT claim that this is a scientific argument. Rather, he argues that it is a philosophical argument with a premise that is derived inductively from empirical observations of the universe. Philosophical arguments deal with what is logically possible. In other words, I don’t see how it is logically impossible for a given cosmological constant to have been different. Would you argue that, because we don’t know how to state these constants in strict probabilistic terms, the fine-tuning cannot possibly have been different? In other words, if it is logically impossible, can you explain how? john_a_designer
DS, kindly see just above. The relevant calcs are done in stat mech or in info theory, where systems are WLOG reducible to strings that describe them in some description language. In any case, long before we get to such, the spectrum from maximal flexibility to no flexibility is incapable of eliminating the import of the sensitivity analysis. The observed cosmos is at a sharp resonance point per the sensitivity analysis, with some values set in the same range in multiple contexts. Fine tuning is a serious issue. KF kairosfocus
WR, with all due respect, I have made no such decision. I have simply pointed out that we look here at sensitivity analysis, an absolutely standard mathematical procedure . . . and since 1953, we have increasingly seen that if parameters were just slightly different, we would not have a cell-based-life-permitting cosmos. If sensitivity by possibility of varied values does not apply and the values of quantities, parameters, structure of laws etc. are all locked -- not particularly credible, but let us consider it for argument -- then it points to a locking force, which would itself manifest awesome fine tuning across a span of at least 90 bn light years and some 13.8 bn years . . . big shoes to fill. Second, I point you to statistical mechanics, in which the flat distribution model is a first case, and there is a more general approach that allows for varying probabilities, e.g. through expressions of the form SUM pi log pi; consider the comparable case of equiprobable symbols in strings, and strings with diverse-probability symbols [which, necessarily, lowers the uncertainty involved]. In effect: maximum flexibility, no flexibility, and the spectrum between that blends the two. None of the three cases -- note, that covers the available ground -- is capable of eliminating fine tuning as a significant issue as manifested in sensitivity analysis. The issue is there, the issue of one value carrying significance in several contexts is there, and so forth. What is needed is not to raise clouds of dust regarding the phenomenon, but to address its import as a striking meta-observation. KF kairosfocus
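For readers wondering what expressions of the form SUM pi log pi are: with a minus sign, H = -SUM pi log2 pi is Shannon's uncertainty measure, and a flat (equiprobable) distribution maximizes it while any skew lowers it, just as KF notes for strings with diverse-probability symbols. A minimal sketch, assuming a four-symbol alphabet:

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2 p): average uncertainty per symbol, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

flat = [0.25, 0.25, 0.25, 0.25]  # equiprobable symbols: maximum uncertainty
skewed = [0.7, 0.1, 0.1, 0.1]    # diverse probabilities: lower uncertainty

h_flat = shannon_entropy(flat)      # exactly 2.0 bits for 4 symbols
h_skewed = shannon_entropy(skewed)  # about 1.36 bits
print(h_flat, h_skewed)
```

The relevant point is that this machinery covers any probability assignment, so a sensitivity analysis does not presuppose the equiprobable (flat) case.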
to reiterate wrossite's claim at 44,,,
KF, Thanks for the post. Notice that it assumes many of the things I’m suggesting we can’t. It talks as if there are in fact other universes. It talks as if we can know anything about them or their laws. It assumes a range and a distribution for parameters. None of this can be known. It is pure speculation. As I’ve repeatedly pointed out, you cannot calculate a probability given a sample size of one. W
wrossite is basically trying to lay down a mandate stating that 'we are not allowed to speculate on what might have been before time began at the big bang.' And as was pointed out in 51, and then further clarified in 70, wrossite is basically trying to do the impossible, in that he is trying to get man to act against the 'timeless' nature of his own thoughts. The overall gist of post 70 was,,
"since man thinks, speaks, and writes in terms of (immaterial) information, then this makes the nature of man’s thoughts, of necessity, ‘timeless’."
But to go further, wrossite's claim that we can't speculate as to what was before time began at the big bang strongly reminds me of Godel's incompleteness theorem. Godel, by using 'timeless' mathematics and logic, (specifically using the 'logic of infinity'), proved that “Anything you can draw a circle around cannot explain itself without referring to something outside the circle—something you have to assume but cannot prove” Stephen Hawking himself conceded this point to Godel's incompleteness theorem in his book 'The Grand Design':
"Gödel's incompleteness theorem (1931), proves that there are limits to what can be ascertained by mathematics. Kurt Gödel (ref. on cite), halted the achievement of a unifying all-encompassing theory of everything in his theorem that: “Anything you can draw a circle around cannot explain itself without referring to something outside the circle—something you have to assume but cannot prove”. Thus, based on the position that an equation cannot prove itself, the constructs are based on assumptions some of which will be unprovable." Cf., Stephen Hawking & Leonard Miodinow, The Grand Design (2010) @ 15-6
And although we may not be able to mathematically prove what is outside the circle of the universe, (or outside any circle we may draw around anything else), which is the main point that I believe wrossite is trying to drive at, nonetheless I hold that we can still at least logically know what is outside the circle of the universe. (In fact, Godel proved his incompleteness theorem using logic rather than proving it with math.)
Taking God Out of the Equation - Biblical Worldview - by Ron Tagliapietra - January 1, 2012 Excerpt: Kurt Gödel (1906–1978) proved that no logical systems (if they include the counting numbers) can have all three of the following properties. 1. Validity ... all conclusions are reached by valid reasoning. 2. Consistency ... no conclusions contradict any other conclusions. 3. Completeness ... all statements made in the system are either true or false. The details filled a book, but the basic concept was simple and elegant. He (Godel) summed it up this way: “Anything you can draw a circle around cannot explain itself without referring to something outside the circle—something you have to assume but cannot prove.” For this reason, his proof is also called the Incompleteness Theorem. Kurt Gödel had dropped a bomb on the foundations of mathematics. Math could not play the role of God as infinite and autonomous. It was shocking, though, that logic could prove that mathematics could not be its own ultimate foundation. Christians should not have been surprised. The first two conditions are true about math: it is valid and consistent. But only God fulfills the third condition. Only He is complete and therefore self-dependent (autonomous). God alone is “all in all” (1 Corinthians 15:28), “the beginning and the end” (Revelation 22:13). God is the ultimate authority (Hebrews 6:13), and in Christ are hidden all the treasures of wisdom and knowledge (Colossians 2:3). http://www.answersingenesis.org/articles/am/v7/n1/equation#
I think that it is fairly obvious that either God or random chance must be 'outside the circle' of the universe. Yet, through detailed analysis of Godel's incompleteness theorem, it is found that random chance, (i.e. anti-theism), cannot possibly ground mathematics. Therefore random chance cannot possibly be the assumption that we are forced to make for what is outside Godel's circle for mathematics or, more importantly, outside Godel's circle for the universe.
A BIBLICAL VIEW OF MATHEMATICS Vern Poythress - Doctorate in theology, PhD in Mathematics (Harvard) 15. Implications of Gödel’s proof B. Metaphysical problems of anti-theistic mathematics: unity and plurality Excerpt: Because of the above difficulties, anti-theistic philosophy of mathematics is condemned to oscillate, much as we have done in our argument, between the poles of a priori knowledge and a posteriori knowledge. Why? It will not acknowledge the true God, wise Creator of both the human mind with its mathematical intuition and the external world with its mathematical properties. In sections 22-23 we shall see how the Biblical view furnishes us with a real solution to the problem of “knowing” that 2 + 2 = 4 and knowing that S is true. http://www.frame-poythress.org/a-biblical-view-of-mathematics/
Therefore, via process of elimination, via Godel's incompleteness, God must be what, or more specifically, Who is outside the circle of the universe. bornagain77
Origenes, thank you for your comments on my unclear sentence: “What we cannot then really extrapolate is, was that by coincidence or by design, and how does the figures come about?” You say: “What figures? Where does the possibility of coincidence come from? For the umpteenth time: probabilities or improbabilities enter the arena only after some smartass proposes a random mechanism.” _________________________________________________________________ According to eyewitnesses, God wrote two figures in stone, the numbers six and seven, and stated them publicly with trumpet blast. It seems that the figure six in relation to divine law is like a God Mode, where all the laws, constants, physical laws and such like came about through one key stroke activated by voice. How or why did an all-seeing God choose to write that he created in six days: coincidence, or random choice for primitive misunderstanding convenience; or, by design and truth for our future benefit and good? Sure, we can in common sense perhaps understand irreducible complexity, but we still cannot prove the God Mode by which such came about, and certainly not time-wise. We can only believe. Or, perhaps dismissively say such questions are not scientific enough! Surely, therefore, there must also be irreducible complexity in the cosmos, fine tuning included? Just because we may think by consensus science we can theoretically detect some form of beginning does not disprove a matured beginning in six days. However, Darwin was a "smartass;" he simply dismissed miracles, saying the witnesses were unreliable. The Big Bang Theory does much the same. Perhaps inflation theory is the real "smartass" of the Big Bang, that and not having a verifiable cosmic theory from no space. Or, tying down Yahweh to the Big Bang and Darwinism, which will eventually flush him down some black hole in disbelief. Evolutionism is Satan's best means to divide and exorcise out Christianity.
It certainly is eclipsing the Judaeo-Christian God and making a mess of scripture. According to Genesis, he knows, in beguiling tones, how to fine tune disbelief in one and then for many to be affected. mw
KF. First, my point was that others (e.g. Gordon and Craig) DO TREAT the range of possible values as equiprobable. If that's wrong, then they need to know it. Now, since you've decided that they aren't equiprobable, I'll ask you the same question I've asked others: What is the range of possible values for a given parameter (let's say Newton's gravitational constant) and what does the probability distribution look like? (and how did you derive it)? W wrossite
KF, Yes, I understand the argument, but my question has to do with actually carrying out this sensitivity analysis, i.e., actually performing calculations in MATLAB or some other language (hopefully :P). I assume an early step would be to identify and parameterize the "second-level inputs", which I tentatively labeled H_i. These are "cosmos factories" which generate collections of fundamental constants of nature. Do you intend this to be more of a thought experiment rather than an analysis to be actually performed? daveS
WR, at no point whatsoever above have I argued that the values of parameters tied to our cosmos and its life-permitting constraints are required to be equiprobable. I have explicitly pointed to the opposite, identifying the implication of there being a locking force that sets the parameters. Namely, that the fine tuning goes up one level. I highlighted that the fine tuning challenge is hard to escape -- if one deals with it on the merits. KF PS: Can you show me where sensitivity analysis is not a normal or typical facet of analysis of frameworks for designs or models etc? I submit, you cannot. Such analysis is not inherently wedded to equiprobable possibilities, as say statistical thermodynamics readily shows. What do you think expressions of the form [SUM p_i log p_i] are about, but providing for cases where probabilities are not flat random? kairosfocus
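KF's [SUM pi log pi] expression is (up to sign) the Shannon entropy, which is maximal for a flat distribution and strictly smaller for any biased one, which is his point that such machinery is not wedded to equiprobability. A minimal sketch, with arbitrary illustrative distributions:

```python
import math

def entropy(ps):
    """Shannon entropy H = -sum(p_i * log2(p_i)) in bits; zero terms skipped."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

flat = [0.25, 0.25, 0.25, 0.25]   # equiprobable: four outcomes
biased = [0.7, 0.1, 0.1, 0.1]     # same outcomes, non-flat probabilities

print(entropy(flat))    # 2.0 bits, the maximum for four outcomes
print(entropy(biased))  # ~1.36 bits, less than the flat maximum
```

The formula handles both cases identically; flat randomness is simply the special case that maximizes it.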
DS, Kindly cf Robin Collins et al and the bread-baking factory discussion; as though there were need for some authority to say anything before it can be addressed on its patent merits. A system that consistently turns out well baked loaves of bread rather than a doughy mess or a burned hockey puck is very carefully calibrated so to do. Similarly, when we see a lone fly on a patch of wall swatted by a bullet, we are looking at a tack-driver rifle and a marksman able to use the capacity of the rifle -- if this were a world of chance-driven pursuit of life-permitting zones, we should expect to be in the equivalent of Leslie's fly-carpeted portion of the wall, for reasons quite similar to the driving logic behind the statistical form of the second law of thermodynamics. This is fine tuning. And the point was, if the system is "locked" to produce our cosmos, that implies a prior mechanism that does the locking. Further to this, in a multiverse type scenario, we are looking at the implication that we are dealing with a deeply isolated narrow "resonance," which so happens to bring together a great many factors in a context of mutual fit and constraints that enables what we see. That strongly points to unified purpose and to a powerful, intelligent, knowledgeable and skilled mind behind it. I suggest that, absent strong empirically grounded reason to hold otherwise, the evident fine tuning of the observed cosmos strongly points to a designer. KF kairosfocus
WR, re:
My major concern with arguments from fine-tuning in cosmology is, how do we really get from observations of precision to statements of probability?
Actually, no. As I have pointed out for some time now, the first issue is the sensitivity analysis in the framework of physics that undergirds the cosmos. As, for instance, Sir Fred Hoyle pointed out long ago now, as the first significant person to note a fine tuning result. Also, as Leslie highlighted with his lone fly on a section of wall argument. We have a system that evidently has closely co-adapted components, many of which are multiply constrained, and this is an integral part of a unified system that enables function, here, a life-permitting cosmos with C-chemistry, aqueous medium cell based life. That first needs to be faced. Yes, probability issues do come in, but they come in within that context. KF kairosfocus
WR
These are explicit appeals to statements regarding parameter space (a range of possible values, all being equiprobable).
Yes, but in those two quotes you offered, neither explicitly mentioned probabilities, strictly speaking. They're just referring to imaginary scenarios. As I said earlier, if we can't know if any other range of values is possible (and therefore cannot speak of them) then we can't know if any other universe is possible. So discussions on the possibility of a multiverse die right there. But again, what I thought was strange was that you didn't hesitate to consider the probability of a multiverse, and not only that, but accept that there is some sort of likelihood that there are 10^500 of them. That simply doesn't follow. What Craig and Gordon are doing is simply using imaginary concepts and drawing some common sense ideas from them. Multiverse proponents dress up their ideas with some mathematics, but they're doing the same thing. It's completely imaginary with zero directly observable evidence to support it. We don't know if any other universe is possible. Silver Asiatic
KF, PS: Regarding this second-level analysis, it seems to me that we would have to posit a family of super-force/principles of action regimes H_i such that the physical constants are logically "locked together" under some of the H_i but not the others. For example, perhaps under H_1, the physical constants could all range between 0 and infinity, each with some probability distribution. Perhaps under H_2, the physical constants are locked together logically, with the values we observe in our universe. There might be an H_3 under which the physical constants are again locked together, but with values different from those in our universe. And so forth. Does that sound about right? daveS
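daveS's two-level picture can be sketched as a toy Monte Carlo in which a regime H_i is drawn first and the constants are then generated under whatever rule that regime imposes. The regimes, distributions, and "life-permitting" criterion below are all invented placeholders for the structure of such an analysis, not anything proposed in the thread:

```python
import random

# Toy version of the "second-level inputs" H_i: each regime is a rule for
# generating a triple of constants. All numbers here are invented.
def sample_constants(regime, rng):
    if regime == "H1":                      # constants free, log-uniform over 10 decades
        return tuple(10 ** rng.uniform(-5, 5) for _ in range(3))
    elif regime == "H2":                    # constants logically locked at "our" values
        return (1.0, 1.0, 1.0)              # placeholder observed values
    else:                                   # "H3": locked together, but elsewhere
        return (2.0, 0.5, 3.0)

def life_permitting(c):
    # Invented criterion: all three constants within 10% of 1.0
    return all(0.9 <= x <= 1.1 for x in c)

rng = random.Random(0)
regimes = ["H1", "H2", "H3"]
hits = {h: 0 for h in regimes}
N = 10_000
for _ in range(N):
    h = rng.choice(regimes)
    if life_permitting(sample_constants(h, rng)):
        hits[h] += 1
print(hits)  # H2 "hits" every time it is drawn; H1 essentially never does
```

Under this toy structure the interesting question moves up a level, exactly as daveS suggests: the locked regimes trivially succeed or fail, and what would need analyzing is the distribution over the H_i themselves.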
KF,
We need to accept that sensitivity analysis is inherently part of analysing models or systems with parameters and structuring frameworks that are not locked by force of logical necessity. (And in this case, if such a range of entities is so locked to fit together, the implied super-force and principles of action yielding the structure would be a very interesting target for level 2 sensitivity analysis.)
Has anyone attempted this "level 2" sensitivity analysis in the case where the physical constants are indeed locked together by logical necessity? This reminds me a bit of the discussions we've had over Euler's Identity exp(iπ) = −1, which is logically necessary, I take it. Can a second order analysis be performed on this instance? daveS
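For what it is worth, the logically necessary identity daveS cites can at least be checked numerically, up to floating-point rounding in the imaginary part:

```python
import cmath

# Numerical check of Euler's identity exp(i*pi) = -1.
z = cmath.exp(1j * cmath.pi)
print(z)  # real part -1; imaginary part is ~1e-16, i.e. zero up to rounding
assert abs(z - (-1)) < 1e-15
```

The tiny nonzero imaginary part is only because cmath.pi is itself a rounded double, which rather underlines the contrast daveS is drawing: the identity itself admits no "sensitivity analysis", since there is no free parameter to perturb.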
From my blog: "My major concern with arguments from fine-tuning in cosmology is, how do we really get from observations of precision to statements of probability? To say that something is precise is not to say that it is improbable. Those are two different things...To say that such-and-such a physical parameter rests upon a razor’s edge does tell us something. It tells us that any small change in the setting of that parameter would lead to a universe drastically different from the one we live in, and likely one that could never even produce material objects (let alone life) as we understand it. Fair enough. I agree. What it doesn’t tell us is how likely any of those other settings are."

Okay, so "precision" is a way of speaking of sensitivity. Here we agree. Therefore what? To even talk about sensitivity as if it's important, we must assume that it was possible for a parameter to fall outside of those narrow ranges of precision. We cannot. And yet, so many do.

I offer two examples in my blog: Bill Craig saying, “[Fine-tuning] is like all the roulette wheels in Monte Carlo’s yielding simultaneously numbers within narrowly prescribed limits and those numbers bearing certain precise relations among themselves,” and Bruce Gordon offering, “Suppose that the universe is about 30 billion light years across…and you stretch a tape measure across that. That comes out to about 10^28 inches. Peg Newton’s constant on one of those inches. Now, what would be the consequence say, of moving one inch to the left or to the right?” These are explicit appeals to statements regarding parameter space (a range of possible values, all being equiprobable). If we can't say such things, then we shouldn't.

I'm not sure why we're suddenly discussing biology, the origins of life, etc. I haven't attacked any of these. In fact, I'd rather see us spend our time talking about these items, because they can be experimentally and empirically examined. Genuine probabilities can be expressed. W wrossite
PPS: Walker and Davies update Hoyle:
In physics, particularly in statistical mechanics, we base many of our calculations on the assumption of metric transitivity, which asserts that a system’s trajectory will eventually [--> given "enough time and search resources"] explore the entirety of its state space – thus everything that is physically possible will eventually happen. It should then be trivially true that one could choose an arbitrary “final state” (e.g., a living organism) and “explain” it by evolving the system backwards in time choosing an appropriate state at some ’start’ time t_0 (fine-tuning the initial state). In the case of a chaotic system the initial state must be specified to arbitrarily high precision. But this account amounts to no more than saying that the world is as it is because it was as it was, and our current narrative therefore scarcely constitutes an explanation in the true scientific sense. We are left in a bit of a conundrum with respect to the problem of specifying the initial conditions necessary to explain our world. A key point is that if we require specialness in our initial state (such that we observe the current state of the world and not any other state) metric transitivity cannot hold true, as it blurs any dependency on initial conditions – that is, it makes little sense for us to single out any particular state as special by calling it the ’initial’ state. If we instead relax the assumption of metric transitivity (which seems more realistic for many real world physical systems – including life), then our phase space will consist of isolated pocket regions and it is not necessarily possible to get to any other physically possible state (see e.g. Fig. 1 for a cellular automata example).
[--> or, there may not be "enough" time and/or resources for the relevant exploration, i.e. we see the 500 - 1,000 bit complexity threshold at work vs 10^57 - 10^80 atoms with fast rxn rates at about 10^-13 to 10^-15 s leading to inability to explore more than a vanishingly small fraction of the gamut of the Sol system or observed cosmos . . . the only actually, credibly observed cosmos]
Thus the initial state must be tuned to be in the region of phase space in which we find ourselves [--> notice, fine tuning], and there are regions of the configuration space our physical universe would be excluded from accessing, even if those states may be equally consistent and permissible under the microscopic laws of physics (starting from a different initial state). Thus according to the standard picture, we require special initial conditions to explain the complexity of the world, but also have a sense that we should not be on a particularly special trajectory to get here (or anywhere else) as it would be a sign of fine–tuning of the initial conditions. [ --> notice, the "loading"] Stated most simply, a potential problem with the way we currently formulate physics is that you can’t necessarily get everywhere from anywhere (see Walker [31] for discussion). ["The “Hard Problem” of Life," June 23, 2016, a discussion by Sara Imari Walker and Paul C.W. Davies at Arxiv.]
kairosfocus
Folks, Pardon a few prelim, personal notes, I am a bit less headachy and time-squeezed than I was yesterday, as parliament sits today. And, the creative mindstorm has reached critical mass and is over even as U3 is on the table as WIP -- here's lookin' at ya, St Helena GBP 285 mn airport controversy [and Wiki comes up for Kudos, News . . . by contrast with much of the UK press] -- with U1 and U2 initially complete as well as a scope-sequence with refs. Now to look at fine tuning, from the Math angle. Mathematics can aptly be understood as the [study of the] logic of structure and quantity. It is an inherently abstract discipline and it is inextricably deeply entangled with the physical sciences. Where, in order for such sciences to exist, we must live in a world where sufficiently rationally and responsibly free agents are possible -- and actual -- that such logic can be freely taken up and pursued. (BTW, this already constrains the nature of reality, but that is metaphysics.) Also, we must have a cosmos that is observer-permitting at local and cosmological level. Down that road lies the privileged planet discussion, which again we can set aside for another interesting day. Our present focus is the mathematics of a cosmos such as ours. Allow me to suggest that there are well-known results relevant to the constitution of a fine-tuned cosmos. Sir Fred Hoyle:
>>[Sir Fred Hoyle, In a talk at Caltech c 1981 (nb. this longstanding UD post):] From 1953 onward, Willy Fowler and I have always been intrigued by the remarkable relation of the 7.65 MeV energy level in the nucleus of 12 C to the 7.12 MeV level in 16 O. If you wanted to produce carbon and oxygen in roughly equal quantities by stellar nucleosynthesis, these are the two levels you would have to fix, and your fixing would have to be just where these levels are actually found to be. Another put-up job? . . . I am inclined to think so. A common sense interpretation of the facts suggests that a super intellect has "monkeyed" with the physics as well as the chemistry and biology, and there are no blind forces worth speaking about in nature. [F. Hoyle, Annual Review of Astronomy and Astrophysics, 20 (1982): 16.]>> . . . also, in the same talk at Caltech: >>The big problem in biology, as I see it, is to understand the origin of the information carried by the explicit structures of biomolecules. The issue isn't so much the rather crude fact that a protein consists of a chain of amino acids linked together in a certain way, but that the explicit ordering of the amino acids endows the chain with remarkable properties, which other orderings wouldn't give. The case of the enzymes is well known . . . If amino acids were linked at random, there would be a vast number of arrangements that would be useless in serving the purposes of a living cell. When you consider that a typical enzyme has a chain of perhaps 200 links and that there are 20 possibilities for each link, it's easy to see that the number of useless arrangements is enormous, more than the number of atoms in all the galaxies visible in the largest telescopes. [ --> 20^200 = 1.6 * 10^260] This is for one enzyme, and there are upwards of 2000 of them, mainly serving very different purposes. So how did the situation get to where we find it to be? This is, as I see it, the biological problem - the information problem . . . 
I was constantly plagued by the thought that the number of ways in which even a single enzyme could be wrongly constructed was greater than the number of all the atoms in the universe. So try as I would, I couldn't convince myself that even the whole universe would be sufficient to find life by random processes - by what are called the blind forces of nature . . . . By far the simplest way to arrive at the correct sequences of amino acids in the enzymes would be by thought, not by random processes . . . . Now imagine yourself as a superintellect working through possibilities in polymer chemistry. Would you not be astonished that polymers based on the carbon atom turned out in your calculations to have the remarkable properties of the enzymes and other biomolecules? Would you not be bowled over in surprise to find that a living cell was a feasible construct? Would you not say to yourself, in whatever language supercalculating intellects use: Some supercalculating intellect must have designed the properties of the carbon atom, otherwise the chance of my finding such an atom through the blind forces of nature would be utterly minuscule. Of course you would, and if you were a sensible superintellect you would conclude that the carbon atom is a fix. >> . . . and again: >> I do not believe that any physicist who examined the evidence could fail to draw the inference that the laws of nuclear physics have been deliberately designed with regard to the [--> nuclear synthesis] consequences they produce within stars. ["The Universe: Past and Present Reflections." Engineering and Science, November, 1981. pp. 8–12]>>
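As an aside, the bracketed annotation interjected into the Hoyle quote ([--> 20^200 = 1.6 * 10^260]) is easy to verify directly:

```python
import math

# Check the arithmetic: a 200-link chain with 20 choices per link
# gives 20^200 possible sequences.
n = 20 ** 200
exponent = math.floor(math.log10(n))
mantissa = n / 10 ** exponent
print(f"{mantissa:.2f}e{exponent}")  # 1.61e260, i.e. the quoted ~1.6 * 10^260
```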
Notice how his primary focus is on sensitivity analysis. Once Mathematics is in the door, inherently all of it is in the door; logic is free-ranging and Mathematics turns on logic. We have structure, we have quantities, what does logic have to say about this? Consequently, probability is a secondary matter. We need to accept that sensitivity analysis is inherently part of analysing models or systems with parameters and structuring frameworks that are not locked by force of logical necessity. (And in this case, if such a range of entities is so locked to fit together, the implied super-force and principles of action yielding the structure would be a very interesting target for level 2 sensitivity analysis.) Let me clip the Matlab-Simulink folks, world class experts on the math of systems and designs:
Sensitivity analysis is defined as the study of how uncertainty in the output of a model can be attributed to different sources of uncertainty in the model input[1]. In the context of using Simulink® Design Optimization™ software, sensitivity analysis refers to understanding how the parameters and states (optimization design variables) of a Simulink model influence the optimization cost function. Examples of using sensitivity analysis include: Before optimization — Determine the influence of the parameters of a Simulink model on the output. Use sensitivity analysis to rank parameters in order of influence, and obtain initial guesses for parameters for estimation or optimization. After optimization — Test how robust the cost function is to small changes in the values of optimized parameters. One approach to sensitivity analysis is local sensitivity analysis, which is derivative based (numerical or analytical). Mathematically, the sensitivity of the cost function with respect to certain parameters is equal to the partial derivative of the cost function with respect to those parameters. The term local refers to the fact that all derivatives are taken at a single point. For simple cost functions, this approach is efficient. However, this approach can be infeasible for complex models, where formulating the cost function (or the partial derivatives) is nontrivial. For example, models with discontinuities do not always have derivatives. Local sensitivity analysis is a one-at-a-time (OAT) technique. OAT techniques analyze the effect of one parameter on the cost function at a time, keeping the other parameters fixed. They explore only a small fraction of the design space, especially when there are many parameters. Also, they do not provide insight about how the interactions between parameters influence the cost function. Another approach to sensitivity analysis is global sensitivity analysis, often implemented using Monte Carlo techniques. 
This approach uses a representative (global) set of samples to explore the design space. Use Simulink Design Optimization software to perform global sensitivity analysis using the Sensitivity Analysis tool, or at the command line . . .
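The local, derivative-based, one-at-a-time (OAT) analysis the MathWorks excerpt describes can be sketched in a few lines without Simulink: estimate each partial derivative of a cost function with respect to its parameters by central differences at a single nominal point. The cost function below is an invented stand-in, not any model from the thread:

```python
# Minimal local OAT sensitivity sketch: sensitivity of cost J to each
# parameter p_i is dJ/dp_i, estimated by central differences at one point.

def cost(params):
    a, b, c = params
    return a ** 2 + 10 * b + 0.1 * c ** 3   # invented toy model

def local_sensitivities(f, p0, h=1e-6):
    """Central-difference estimate of dJ/dp_i at the nominal point p0."""
    sens = []
    for i in range(len(p0)):
        up = list(p0); up[i] += h
        dn = list(p0); dn[i] -= h
        sens.append((f(up) - f(dn)) / (2 * h))
    return sens

p0 = [1.0, 2.0, 3.0]                         # nominal operating point
print(local_sensitivities(cost, p0))         # ~[2.0, 10.0, 2.7]: b dominates locally
```

As the excerpt warns, this explores only the neighbourhood of one point; a global analysis would instead sample the whole design space (e.g. Monte Carlo), which is the approach the fine-tuning discussion implicitly leans on.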
They are telling us how to use their software, but that is just a tool. The point is they are speaking about sensitivity analysis. Now, what happens when such analysis delivers the result that we are at a "special" "resonance" where various components must be just so in a neighbourhood, N, in order for a recognisable distinction to obtain, is that we are looking at an island of function in a configuration space. A locally tight, narrow island of special, life-permitting function is just what we are looking at; hence, John Leslie again:
"One striking thing about the fine tuning is that a force strength or a particle mass often appears to require accurate tuning for several reasons at once. Look at electromagnetism. Electromagnetism seems to require tuning for there to be any clear-cut distinction between matter and radiation; for stars to burn neither too fast nor too slowly for life’s requirements; for protons to be stable; for complex chemistry to be possible; for chemical changes not to be extremely sluggish; and for carbon synthesis inside stars (carbon being quite probably crucial to life). Universes all obeying the same fundamental laws could still differ in the strengths of their physical forces, as was explained earlier, and random variations in electromagnetism from universe to universe might then ensure that it took on any particular strength sooner or later. Yet how could they possibly account for the fact that the same one strength satisfied many potentially conflicting requirements, each of them a requirement for impressively accurate tuning?" [Our Place in the Cosmos, The Royal Institute of Philosophy, 1998 (courtesy Wayback Machine) Emphases added.] AND: ". . . the need for such explanations does not depend on any estimate of how many universes would be observer-permitting, out of the entire field of possible universes. Claiming that our universe is ‘fine tuned for observers’, we base our claim on how life’s evolution would apparently have been rendered utterly impossible by comparatively minor alterations in physical force strengths, elementary particle masses and so forth. There is no need for us to ask whether very great alterations in these affairs would have rendered it fully possible once more, let alone whether physical worlds conforming to very different laws could have been observer-permitting without being in any way fine tuned. Here it can be useful to think of a fly on a wall, surrounded by an empty region. A bullet hits the fly. Two explanations suggest themselves. 
Perhaps many bullets are hitting the wall or perhaps a marksman fired the bullet. There is no need to ask whether distant areas of the wall, or other quite different walls, are covered with flies so that more or less any bullet striking there would have hit one. The important point is that the local area contains just the one fly." [Emphasis his.]
In short, the Math is speaking in the language of sensitivity analysis. We get to probabilities by two possible routes. One, we can look at indifference and think in terms of a simple random moving about leading to a Monte Carlo type analysis; and/or we may modify to look at biased distributions. Two, we can exploit the conceptual or quantitative duality between information and probability. But, we are not at all locked up to specific models and/or mechanisms of probability. Indeed, I add, probability in the murky middle is an index of ignorance and/or uncertainty. It maximises at flat randomness in a relevant range, local or global. (That is, we are least certain when any conceived possible outcome is in effect equi-possible so far as we know.) And, as noted, if a force is "locking" the system at a life permitting point, that takes the fine tuning issue up one level. Fine tuning is in the door and is not so easily got rid of. The realistic options are a designed, specifically functional world, or a quasi-infinite wider reality, a multiverse. In the latter case, we ought not to be looking at a world at a narrow resonance like this. And, at this point, we are not looking at empirical observation so we are looking at worldviews analysis in philosophy. Which means all serious options -- no "invisible friend" strawman tactics, please -- are on the table to be assessed per comparative difficulties. Where, simply to do serious Math, we must be responsibly, rationally free to significant degree. Post Hume's guillotine, this pins us to a challenge to find an IS that inherently grounds OUGHT at world root level. Without further elaborate argument, I note the balance of centuries of debate. There is just one serious candidate, the inherently good Creator God, a necessary and maximally great being worthy of loyalty and respectful, responsible reasonable service by doing the good in accord with our evident nature. 
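The duality between information and probability that KF mentions above is, in Shannon's terms, the surprisal relation I = -log2(p): improbability and information content interconvert directly. A minimal sketch, with arbitrary illustrative numbers (the 500-bit figure echoes the complexity threshold cited earlier in the thread):

```python
import math

# Surprisal: an outcome of probability p carries I = -log2(p) bits,
# and conversely p = 2**(-I). Numbers below are illustrative only.
def surprisal_bits(p):
    return -math.log2(p)

def prob_from_bits(i):
    return 2 ** (-i)

print(surprisal_bits(0.5))         # 1.0 bit: a fair coin flip
print(surprisal_bits(1 / 2**500))  # 500.0 bits
print(prob_from_bits(500))         # ~3.05e-151
```

Note that this conversion still presupposes some probability assignment to start from, which is exactly the contested step in the thread.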
If you doubt or dismiss, simply put up a feasible and comparably good candidate. Enjoy the Christmas season. KF PS: Regarding "Fiat lux," etc, I find Heb 1 instructive:
Heb 1:1 God, having spoken to the fathers long ago in [the voices and writings of] the prophets in many separate revelations [each of which set forth a portion of the truth], and in many ways, 2 has in these last days spoken [with finality] to us in [the person of One who is by His character and nature] His Son [namely Jesus], whom He appointed heir and lawful owner of all things, through whom also He created the universe [that is, the universe as a space-time-matter continuum]. 3 The Son is the radiance and only expression of the glory of [our awesome] God [reflecting God’s [a]Shekinah glory, the Light-being, the brilliant light of the divine], and the exact representation and perfect imprint of His [Father’s] essence, and upholding and maintaining and propelling all things [the entire physical and spiritual universe] by His powerful word [carrying the universe along to its predetermined goal]. When He [Himself and no other] had [by offering Himself on the cross as a sacrifice for sin] accomplished purification from sins and established our freedom from guilt, He sat down [revealing His completed work] at the right hand of the Majesty on high [revealing His Divine authority], 4 having become as much superior to angels, since He has inherited a more excellent and glorious [b]name than they [that is, Son—the name above all names] . . . [AMP]
kairosfocus
mw: I do think Prof W Rossiter has a point. It appears the cosmos is the result of fine tuning, that is design.
Indeed that is design! There is a complete alignment of various constants wrt the top-level function — harboring life. We see a staggering functional coherence. So, we infer design. Again, at this point there is no reference to probabilities or improbabilities of any mechanism. The unity of function is just evidence on its own. No probabilities involved.
mw: What we cannot then really extrapolate is, was that by coincidence or by design, and how does the figures come about?
What figures? Where does the possibility of coincidence come from? For the umpteenth time: probabilities or improbabilities enter the arena only after some smartass proposes a random mechanism.
Bartlett: I have often noticed something of a confusion on one of the major points of the Intelligent Design movement – whether or not the design inference is primarily based on the failure of Darwinism and/or mechanism. This is expressed in a recent thread by a commenter saying, “The arguments for this view [Intelligent Design] are largely based on the improbability of other mechanisms (e.g. evolution) producing the world we observe.” I’m not going to name the commenter because this is a common confusion that a lot of people have. ... The only reason for probabilities in the modern design argument is because Darwinites have said, “you can get that without design”, so we modeled NotDesign as well, to show that it can’t be done that way. ... the *only* reason we are talking about probabilities is to answer an objection. The original evidence *remains* the primary evidence that it was based on.
Origenes
john_a_designer you mention "Or, if the ratio of the electromagnetic force constant to the gravitational force constant had not been precisely balanced to 1 part in 10^40 then we would have no stars of the right size to support life. We need both fast burning large stars to produce the essential elements for life’s chemistry and planet formation as well as long burning small stars to burn long enough to provide planetary systems habitable for life." Thanks, that is what I was looking for. I will reference that below my 'sloppy' Gordon reference. bornagain77
I do think Prof W Rossiter has a point. It appears the cosmos is the result of fine tuning, that is design. What we cannot then really extrapolate is, was that by coincidence or by design, and how does the figures come about?

Though, his assertion that time and space began at the Big Bang appears not to take into account the space/time of God, out of which we came, live and have our being (Acts 17:27-28). God, the only person to witness the creation, cast into the vaults of heaven the material to create the cosmos, out of nothing created. Space is space, spiritual or material; otherwise, where does it end or where does it really begin; in eternity? As God is both beginning and the end (Rev 1:11), is space some form of an eternal wheel, or one of the characteristics of God? How can we fine-tune space? It is like saying we can fine-tune God. Or did space compress and roll into an infinitely hot ball, held in non-space and non-gravity? Or is such a powerful beguiling fudge of a theory, the best human nature can produce without guidance from superior knowledge?

While BA77 does sterling work in relation to giving excellent data against Darwinism, it seems, in this case, he appears to want to batter people into submission: fine tuning comes from the Big Bang theory; there is little room for other considerations, not even the word of God written in stone. In my case, what is missing is an understanding of miracles and their effects on data, when God in a divine law said he created in six days, and that what he said was easy to understand, and is unalterable. A maturing miracle we are clueless to produce. Neither can we incorporate a miracle into any calculations for the Big Bang theory. If we did, we would see God was true and the theory false (based on the word of God). However, that would prove God true. If we could prove God true, we would be greater than God! We have no understanding of how a spoken word can produce a cosmos. 
We have only the word of God, his historic and scientific word based on evidence at Sinai, when a whole nation publicly witnessed thunder, lightning, dark cloud and the word of God. Words cut in stone, and placed in the holy of holies. Carried with utmost respect and fear. The same God who, just before his crucifixion, worshipped in the synagogue remembering he created in six days. Or did he worship under his breath that God created by the Big Bang time scale? And hence, when Yahweh condemned a man stoned to death for working on the sabbath (Num 15:32-36), do we make Jesus, God in part and God in whole, a murderer and a liar in our disbelief relative to the Big Bang time scale and Darwinism? After the resurrection, in heaven, did Jesus change his worship?

If and when we go to heaven, as St John saw, there is the Ark of the Testimony of God (Rev 11:19), amidst thunder, lightning and hail. Does heaven contain a lie, stretched-out truth relative to what God wrote, the only scripture ever written by God, hence of utmost truth and importance for our protection and guidance?

“He has made everything suitable for its time; moreover, he has put a sense of past and future into their minds, yet they cannot find out what God has done from the beginning to the end.” (Eccl 3:11) And: “Thus says the LORD: If the heavens above can be measured, and the foundations of the earth below can be explored, then I will reject all the offspring of Israel because of all they have done, says the LORD.” (Jer 31:37) In other words, God is saying we cannot measure how God created. If we could, he would have to reject the offspring of Israel. His words, not mine. Has anyone seen God in the theoretical Big Bang with his finger on the trigger? No; are we perhaps not being beguiled all over again? 
‘God did not really mean six days; know the theory, and you will be equal to God in that knowledge!’ Jesus teaches that Satan makes war against the Holy Mother and the remnant of her seed; those who keep the Commandments of God without adding or subtracting to them and keep his teaching (Rev 12:17; Rev 22:18-19). It is worth noting at Christmas that the baby in the manger is the God of Sinai who wrote scripture at Sinai. mw
juwilker, if I had cared about what people who did not like me or my posts on UD told me to do, I would never comment on UD and would have killed myself a long time ago, since I have been told to do both those things by people who were opposing me. Thus, in response, I have developed a fairly hard edge against people who do not like me or my posts. Atheists are often irritated that I often include related scriptures in my posts, have openly mocked me for the practice, and have said they refuse to read my posts because of it. (Matzke and Myers come readily to mind.) Whatever. I include scripture anyway when I see fit. Thus, not caring what others think about me or my posts is more or less an attitude that was given to me by my opponents. I try to write solely for the sake of advancing knowledge and understanding, regardless of whether people may like what I write or not. Anyways, regardless of whether I was too 'rough' or not in defending my right to post as I best see fit on UD, I've already apologized for any part, real or imagined, that I played in provoking his ad hominem towards me, and have moved on. I sense that he has moved on also. Which is well and good. Holding grudges is counterproductive and in the end only hurts oneself.
Matthew 18:21-22 Then Peter came and said to Him, "Lord, how often shall my brother sin against me and I forgive him? Up to seven times?" Jesus said to him, "I do not say to you, up to seven times, but up to seventy times seven.
bornagain77
john_a_designer: Here is a question that I haven’t seen dealt with here yet. The last four parameters that I listed above (of course, there are many others) all have specific values or a range of values. Assuming that they could be different (and, not knowing whether they could be, it is possible that they could), how would you derive a probability for each one?
Allow me to repeat myself one more time: One can derive a probability only after someone comes along who hypothesizes:

1. The coming into existence of the universe and its constants is due to a mechanism X with a random output.
2. The random output of mechanism X is such and such.

IOWs, probabilities or improbabilities reference a mechanism (such as flipping a coin or throwing a die). Without a mechanism we cannot assign probabilities. This "inability" is no problem for the design inference, because the nature of the design inference is holistic. Origenes
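Origenes' point, that a probability statement is always relative to a hypothesized generating mechanism, can be made concrete with a short sketch. The coin example and the specific numbers below are purely illustrative (they are not from any comment above): the very same observed outcome receives different probabilities under different assumed mechanisms, so no number can be assigned until a mechanism is specified.

```python
from fractions import Fraction

def sequence_probability(seq, p_heads):
    """Probability of observing an exact heads/tails sequence,
    given an assumed coin-flipping mechanism with bias p_heads."""
    p = Fraction(1)
    for flip in seq:
        p *= p_heads if flip == "H" else 1 - p_heads
    return p

outcome = "HHHHHHHHHH"  # the observation: ten heads in a row

# The observation is fixed; the probability is not. It depends
# entirely on which mechanism we hypothesize produced it.
p_fair   = sequence_probability(outcome, Fraction(1, 2))   # fair coin: 1/1024
p_biased = sequence_probability(outcome, Fraction(9, 10))  # biased coin: (9/10)**10
```

Exact rational arithmetic (`Fraction`) is used so the two probabilities can be compared without floating-point rounding; with no `p_heads` supplied, the function simply cannot return a number, which is the point.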
BA77 @ 34, I am a casual observer and a great admirer of you and your posts. But I think you are being uncharitable to Dr Rossiter by saying you don't care about his comments or suggestions. Your words are technically not ad hominem attacks, but the tone is. It is kind of unfair to call him out when I think your posts initiated his ad hominem response. Also, I would like to point everyone to Dr. Dembski's take on the fine-tuning argument as it relates to assigning probabilities. It somewhat supports Dr Rossiter's objections. Chap. 16, "Contingency and Chance" (p. 128, Being as Communion): "It's worth pondering here what these difficulties of assigning probabilities to the entire universe mean for fine-tuning arguments...(he goes on to explain fine-tuning...). But since we are talking about features of the universe that need to be in place before the universe can be said to exist and operate, it's not clear where those probabilities that are applied to the universe as a whole are coming from or how they can be coherently grounded." I think that is the point Origenes is trying to make with the Da Vinci example. We have no way to assign probabilities to designed events. The events themselves are evidence of design, and we have no way to assign probabilities from a sample of one. juwilker
What is fine tuning? It is the empirically derived fact that if certain fundamental physical parameters or constants had been slightly different, life, and self-conscious life in particular, would not exist anywhere in the universe. Many prominent physicists agree. Stephen Hawking writes, “The laws of science, as we know them at present, contain many fundamental numbers, like the size of the electric charge of the electron and the ratio of the masses of the proton and the electron. ... The remarkable fact is that the values of these numbers seem to have been very finely adjusted to make possible the development of life.” But the fine-tuning is even more intricate than Hawking’s brief summary suggests. For example, if the ratio of the nuclear strong force to the electromagnetic force had differed by 1 part in 10^16, no stars would have formed… no stars… no life. Or, if the ratio of the electromagnetic force constant to the gravitational force constant had not been precisely balanced to 1 part in 10^40, then we would have no stars of the right size to support life. We need both fast-burning large stars, to produce the essential elements for life’s chemistry and planet formation, and long-burning small stars, which burn long enough to provide planetary systems habitable for life. Also, if the nuclear ground state energies for helium-4, beryllium-8, carbon-12, and oxygen-16 had not been fine-tuned so that they varied no more than 4% with respect to each other, there would not be sufficient oxygen or carbon for the development of life. Or, if the majority of the electromagnetic radiation emitted by the sun (or any equivalent star) were not within a very narrow band, one part in 10^25 (that’s a one followed by 25 zeros), life could not exist on earth. Here is a question that I haven’t seen dealt with here yet. The last four parameters that I listed above (of course, there are many others) all have specific values or a range of values.
Assuming that they could be different (and, not knowing whether they could be, it is possible that they could), how would you derive a probability for each one? Not being a mathematician or a physicist, I honestly don’t know how you would. Maybe someone here does. It seems to me, if I am understanding Wayne’s point correctly, that this question gets us to the heart of the problem. If you can give me some probabilities, then it seems to me you have answered his objection. If you can’t, then I think Wayne has a good point. john_a_designer
WR, I suggest the a posteriori probability that we validly experience a world is near unity. A priori, given what sensitivity analysis on the parameters and laws shows, it is a different matter. And that is embedded in even the term "fine tuning." The sensitivity analysis tells us something, and that something is clearly significant. Which is John Leslie's point. A further one is that even if the parameters and laws are "locked," that only moves the issue up a level. A super-force that locks a vast cluster of parameters, laws and just plain physical manifestations across a span of c. 90 bn LY, at minimum, is itself highly suggestive. KF PS: I note there are several senses of probability. In that context, it is meaningful to discuss the probability of an agent taking options 1 to n, per our experience and insight. And we must be aware that s/he may do something, say S, as a determination to act with surprise. This is a commonplace in strategy, where surprise is a crucial force multiplier. kairosfocus
wrossite, you state: "Frankly, I don't know what you even mean to say that 'human thought is endowed with an essential timeless element to it that cannot possibly be reduced to any within space-time, materialistic, explanation.'" IMHO, in the following short quote, David Berlinski has succinctly captured the timeless nature of information, specifically mathematical information, and thus the 'timeless nature' of human thought:
An Interview with David Berlinski - Jonathan Witt Berlinski: There is no argument against religion that is not also an argument against mathematics. Mathematicians are capable of grasping a world of objects that lies beyond space and time…. Interviewer:… Come again(?) … Berlinski: No need to come again: I got to where I was going the first time. The number four, after all, did not come into existence at a particular time, and it is not going to go out of existence at another time. It is neither here nor there. Nonetheless we are in some sense able to grasp the number by a faculty of our minds. Mathematical intuition is utterly mysterious. So for that matter is the fact that mathematical objects such as a Lie Group or a differentiable manifold have the power to interact with elementary particles or accelerating forces. But these are precisely the claims that theologians have always made as well – that human beings are capable by an exercise of their devotional abilities to come to some understanding of the deity; and the deity, although beyond space and time, is capable of interacting with material objects. http://tofspot.blogspot.com/2013/10/found-upon-web-and-reprinted-here.html
James Franklin adds his two cents here:
The mathematical world - James Franklin - 7 April 2014 Excerpt: the intellect (is) immaterial and immortal. If today’s naturalists do not wish to agree with that, there is a challenge for them. ‘Don’t tell me, show me’: build an artificial intelligence system that imitates genuine mathematical insight. There seem to be no promising plans on the drawing board.,,, James Franklin is professor of mathematics at the University of New South Wales in Sydney. http://aeon.co/magazine/world-views/what-is-left-for-mathematics-to-be-about/
To put it more basically, all abstract human thought is 'timeless' in its foundational nature. Alfred Wallace, co-discoverer of natural selection, broke with Charles Darwin over precisely this issue:
"Nothing in evolution can account for the soul of man. The difference between man and the other animals is unbridgeable. Mathematics is alone sufficient to prove in man the possession of a faculty unexistent in other creatures. Then you have music and the artistic faculty. No, the soul was a separate creation." Alfred Russel Wallace, New Thoughts on Evolution, 1910
In other words, all information is timeless in its basic nature and, since man thinks, speaks, and writes in terms of information, this makes the nature of man's thoughts, of necessity, 'timeless'. To further clarify this point, it is good to note that while information can be infused into an almost endless variety of material substrates, the meaning of the information nonetheless does not change. Dr. Stephen Meyer briefly touches on that issue in the following video:
“One of the things I do in my classes, to get this idea across to students, is I hold up two computer disks. One is loaded with software, and the other one is blank. And I ask them, ‘what is the difference in mass between these two computer disks, as a result of the difference in the information content that they possess’? And of course the answer is, ‘Zero! None! There is no difference as a result of the information.’ And that’s because information is a mass-less quantity. Now, if information is not a material entity, then how can any materialistic explanation account for its origin? How can any material cause explain its origin? And this is the real and fundamental problem that the presence of information in biology has posed. It creates a fundamental challenge to the materialistic, evolutionary scenarios because information is a different kind of entity that matter and energy cannot produce. In the nineteenth century we thought that there were two fundamental entities in science: matter, and energy. At the beginning of the twenty-first century, we now recognize that there’s a third fundamental entity, and it’s ‘information’. It’s not reducible to matter. It’s not reducible to energy. But it’s still a very important thing that is real; we buy it, we sell it, we send it down wires. Now, what do we make of the fact that information is present at the very root of all biological function? In biology, we have matter, we have energy, but we also have this third, very important entity: information. I think the biology of the information age poses a fundamental challenge to any materialistic approach to the origin of life.” -Dr. Stephen C. Meyer earned his Ph.D. in the History and Philosophy of Science from Cambridge University for a dissertation on the history of origin-of-life biology and the methodology of the historical sciences. Intelligent design: Why can't biological information originate through a materialistic process?
- video http://www.youtube.com/watch?v=wqiXNxyoof8
And here is some empirical evidence that establishes immaterial information as its own distinct physical entity that is separate from matter and energy:
A few notes on the physical reality of ‘immaterial’ information: (December. 2016) Thermodynamic Content, Erasing Classical Information with Quantum Information, Quantum Teleportation https://uncommondesc.wpengine.com/intelligent-design/digg-what-is-information-a-remarkably-unstupid-vid/#comment-622155
Moreover, it is important to reiterate the fact that humans, uniquely out of all God's creatures on earth, think, speak and write in terms of immaterial information. As highlighted in post 51, Tom Wolfe wrote a book precisely because leading Darwinists in the field of human language publicly confessed, in peer review, that they have no solid clue as to how human speech could have possibly evolved. Yet, as Tom Wolfe argued in his book The Kingdom of Speech, human speech is "95 percent plus" of what lifts man above the animals.
“Speech is 95 percent plus of what lifts man above animal! Physically, man is a sad case. His teeth, including his incisors, which he calls eyeteeth, are baby-size and can barely penetrate the skin of a too-green apple. His claws can’t do anything but scratch him where he itches. His stringy-ligament body makes him a weakling compared to all the animals his size. Animals his size? In hand-to-paw, hand-to-claw, or hand-to-incisor combat, any animal his size would have him for lunch. Yet man owns or controls them all, every animal that exists, thanks to his superpower: speech.” —Tom Wolfe, in the introduction to his book, The Kingdom of Speech
In other words, although humans are fairly defenseless creatures in the wild compared to other creatures, such as lions, bears, and sharks, humans have, completely contrary to Darwinian 'survival of the fittest' thinking, managed to become masters of the planet, not by brute force, but simply by our unique ability to communicate information and, more specifically, to infuse information into material substrates in order to create, i.e. intelligently design, objects that are extremely useful for our defense, for basic survival in procuring food, for the furtherance of our knowledge, and also for our pleasure. And although the 'top-down' infusion of immaterial information into material substrates that allowed humans to become 'masters of the planet' was rather crude to begin with (spears, arrows, plows, etc.), this top-down infusion has become much more impressive over the last half century or so. Specifically, the 'top-down' infusion of mathematical and/or logical information into material substrates lies at the very basis of many, if not all, of man's most stunning, almost miraculous, technological advances in recent decades. Here are a couple of articles which clearly get this 'top-down infusion of immaterial information' point across:
Here is one by Peter Tyson:

Describing Nature With Math – Peter Tyson, Nov. 2011 Excerpt: Mathematics underlies virtually all of our technology today. James Maxwell’s four equations summarizing electromagnetism led directly to radio and all other forms of telecommunication. E = mc^2 led directly to nuclear power and nuclear weapons. The equations of quantum mechanics made possible everything from transistors and semiconductors to electron microscopy and magnetic resonance imaging. Indeed, many of the technologies you and I enjoy every day simply would not work without mathematics. When you do a Google search, you’re relying on 19th-century algebra, on which the search engine’s algorithms are based. When you watch a movie, you may well be seeing mountains and other natural features that, while appearing as real as rock, arise entirely from mathematical models. When you play your iPod, you’re hearing a mathematical recreation of music that is stored digitally; your cell phone does the same in real time. “When you listen to a mobile phone, you’re not actually hearing the voice of the person speaking,” Devlin told me. “You’re hearing a mathematical recreation of that voice. That voice is reduced to mathematics.” http://www.pbs.org/wgbh/nova/physics/describing-nature-math.html

Recognising Top-Down Causation – George Ellis Excerpt: page 5: A: Causal Efficacy of Non Physical entities: Both the program and the data are non-physical entities, indeed so is all software. A program is not a physical thing you can point to, but by Definition 2 it certainly exists. You can point to a CD or flashdrive where it is stored, but that is not the thing in itself: it is a medium in which it is stored. The program itself is an abstract entity, shaped by abstract logic. Is the software “nothing but” its realisation through a specific set of stored electronic states in the computer memory banks?
No it is not because it is the precise pattern in those states that matters: a higher level relation that is not apparent at the scale of the electrons themselves. It’s a relational thing (and if you get the relations between the symbols wrong, so you have a syntax error, it will all come to a grinding halt). This abstract nature of software is realised in the concept of virtual machines, which occur at every level in the computer hierarchy except the bottom one [17]. But this tower of virtual machines causes physical effects in the real world, for example when a computer controls a robot in an assembly line to create physical artefacts. Excerpt page 7: The assumption that causation is bottom up only is wrong in biology, in computers, and even in many cases in physics, ,,, The mind is not a physical entity, but it certainly is causally effective: proof is the existence of the computer on which you are reading this text. It could not exist if it had not been designed and manufactured according to someone’s plans, thereby proving the causal efficacy of thoughts, which like computer programs and data are not physical entities. http://fqxi.org/data/essay-contest-files/Ellis_FQXI_Essay_Ellis_2012.pdf
What is more interesting still about the fact that humans have a unique ability to understand and create information, and have come to dominate the world through the ‘top-down’ infusion of information into material substrates, is the fact that, due to advances in science, both the universe and life itself, are now found to be ‘information theoretic’ in their foundational basis. Renowned physicist John Wheeler stated “in short all matter and all things physical are information-theoretic in origin and this is a participatory universe”.
"it from bit” Every “it”— every particle, every field of force, even the space-time continuum itself derives its function, its meaning, its very existence entirely—even if in some contexts indirectly—from the apparatus-elicited answers to yes-or-no questions, binary choices, bits. “It from bit” symbolizes the idea that every item of the physical world has a bottom—a very deep bottom, in most instances, an immaterial source and explanation, that which we call reality arises in the last analysis from the posing of yes-no questions and the registering of equipment—evoked responses, in short all matter and all things physical are information-theoretic in origin and this is a participatory universe." – Princeton University physicist John Wheeler (1911–2008) (Wheeler, John A. (1990), “Information, physics, quantum: The search for links”, in W. Zurek, Complexity, Entropy, and the Physics of Information (Redwood City, California: Addison-Wesley))
In the following article, Anton Zeilinger, a leading expert in quantum mechanics, stated that ‘it may very well be said that information is the irreducible kernel from which everything else flows.’
Why the Quantum? It from Bit? A Participatory Universe? Excerpt: In conclusion, it may very well be said that information is the irreducible kernel from which everything else flows. Thence the question why nature appears quantized is simply a consequence of the fact that information itself is quantized by necessity. It might even be fair to observe that the concept that information is fundamental is very old knowledge of humanity, witness for example the beginning of gospel according to John: "In the beginning was the Word." Anton Zeilinger - a leading expert in quantum mechanics:
In the following video at the 48:24 mark Zeilinger states that “It is operationally impossible to separate Reality and Information” and he goes on to note at the 49:45 mark the Theological significance of “In the Beginning was the Word” John 1:1
48:24 mark: “It is operationally impossible to separate Reality and Information” 49:45 mark: “In the Beginning was the Word” John 1:1 Prof Anton Zeilinger speaks on quantum physics. at UCT - video http://www.youtube.com/watch?v=s3ZPWW5NOrw
Vlatko Vedral, who is a Professor of Physics at the University of Oxford, and is also a recognized leader in the field of quantum mechanics, states
"The most fundamental definition of reality is not matter or energy, but information–and it is the processing of information that lies at the root of all physical, biological, economic, and social phenomena." Vlatko Vedral - Professor of Physics at the University of Oxford, and CQT (Centre for Quantum Technologies) at the National University of Singapore, and a Fellow of Wolfson College - a recognized leader in the field of quantum mechanics.
Moreover, besides being foundational to physical reality, information is also found to be ‘infused’ into biological life.
Information Enigma (Where did the information in life come from?) – Stephen Meyer, Doug Axe - video https://www.youtube.com/watch?v=aA-FcnLsF1g

Complex grammar of the genomic language – November 9, 2015 Excerpt: The ‘grammar’ of the human genetic code is more complex than that of even the most intricately constructed spoken languages in the world. The findings explain why the human genome is so difficult to decipher –,,, ,,, in their recent study in Nature, the Taipale team examines the binding preferences of pairs of transcription factors, and systematically maps the compound DNA words they bind to. Their analysis reveals that the grammar of the genetic code is much more complex than that of even the most complex human languages. Instead of simply joining two words together by deleting a space, the individual words that are joined together in compound DNA words are altered, leading to a large number of completely new words. - per sciencedaily

Biophysics – Information theory. Relation between information and entropy: - Setlow-Pollard, Ed. Addison Wesley Excerpt: Linschitz gave the figure 9.3 x 10^12 cal/deg or 9.3 x 10^12 x 4.2 joules/deg for the entropy of a bacterial cell. Using the relation H = S/(k ln 2), we find that the information content is 4 x 10^12 bits. Morowitz' deduction from the work of Bayne-Jones and Rhees gives the lower value of 5.6 x 10^11 bits, which is still in the neighborhood of 10^12 bits. Thus two quite different approaches give rather concordant figures. - per astro

“a one-celled bacterium, e. coli, is estimated to contain the equivalent of 100 million pages of Encyclopedia Britannica. Expressed in information-science jargon, this would be the same as 10^12 bits of information.
In comparison, the total writings from classical Greek civilization amount to only 10^9 bits, and the largest libraries in the world – The British Museum, Oxford Bodleian Library, New York Public Library, Harvard Widener Library, and the Moscow Lenin Library – have about 10 million volumes or 10^12 bits.” – R. C. Wysong "The information content of a simple cell has been estimated as around 10^12 bits, comparable to about a hundred million pages of the Encyclopedia Britannica." Carl Sagan, "Life" in Encyclopedia Britannica: Macropaedia (1974 ed.), pp. 893-894
It is hard to imagine a more convincing proof that we are made ‘in the image of God’ than finding that both the universe and life itself are ‘information theoretic’ in their foundational basis, and that we, of all the creatures on earth, uniquely possess an ability to understand and create information, and have come to ‘master the planet’ precisely because of our ability to infuse information into material substrates. I guess more convincing evidence would be if God Himself became a man, defeated death on a cross, and then rose from the dead to prove that He was God. But who has ever heard of such overwhelming evidence as that?
Shroud of Turin: From discovery of Photographic Negative, to 3D Information, to Quantum Hologram - video https://www.youtube.com/watch?v=F-TL4QOCiis&list=PLtAP1KN7ahia8hmDlCYEKifQ8n65oNpQ5&index=5
Verses:
Genesis 1:26 And God said, Let us make man in our image, after our likeness: and let them have dominion over the fish of the sea, and over the fowl of the air, and over the cattle, and over all the earth, and over every creeping thing that creepeth upon the earth. John 1:1-4 In the beginning was the Word, and the Word was with God, and the Word was God. The same was in the beginning with God. All things were made by Him, and without Him was not anything made that was made. In Him was life, and that life was the Light of men.
bornagain77
" However, to dismiss me because I hold a PhD in biology, instead of cosmology or physics is to commit the genetic fallacy" ,,, and then "As for amateur, are you saying it’s mean to suggest that someone’s training and skill set might cause us to be suspicious of their ability to discuss a topic?,,," You were saying? :) (Just some good natured ribbing buddy) bornagain77
I'm not emotionally bound to the argument. In fact, if I'm wrong, then that's good for everyone (because it would mean that the arguments we use in public are actually valid). However, to dismiss me because I hold a PhD in biology, instead of cosmology or physics, is to commit the genetic fallacy (http://www.logicalfallacies.info/relevance/genetic/). As for the timelessness of human nature, I guess I don't follow. Others can evaluate that statement. There was no time before the big bang (so far as we know), since time is bound to space and both coalesce at the singularity of the big bang. Speculations as to the causal mechanisms that might give rise to our space-time are just that: speculations. By definition, because they would be outside of our universe, they cannot be tested or verified. Frankly, I don't know what you even mean to say that "human thought is endowed with an essential timeless element to it that cannot possibly be reduced to any within space-time, materialistic, explanation." I mean, I get the idea of mind or consciousness that is not describable by materialism. But what does that have to do with the mechanism(s) responsible for creating a universe, or the assignment of a probability to a parameter? I just don't follow. W p.s. As I recall, you're asking me to apologize for calling you bitter, argumentative and amateur. Perhaps you're not bitter, and for that, I will apologize. You do seem to like to quarrel, so that's looking more and more to be a true statement (not an ad hominem). As for amateur, are you saying it's mean to suggest that someone's training and skill set might cause us to be suspicious of their ability to discuss a topic? :) (see above) You're welcome to go to the CSS website and search for the Q&A after my talk, in which Robin and I interact, and he makes this concession. wrossite
By the way, Dr. Rossiter - I think your exercise is a very good one. We do need to challenge our own views and arguments. I stumbled into that exercise here recently on another topic and it was a little difficult to walk away without some unnecessary friction. We've been somewhat of a war zone here, at least in past years (it has died down considerably) so we're quick to fight back. But thanks for your good work and it was great to learn about your background. Silver Asiatic
Wrossite: Basically, the ID community would be rejecting the argument that the parameter settings are highly improbable.
There is no rejection of any probability argument, because probabilities do not enter the design inference from the get-go. Let’s take a look at how Bartlett speaks about the design inference wrt biology:
It used to be that the arguments for design were very plain. Biology proceeded according to a holistic plan both in the organism and the environment. This plan indicated a clear teleology – that the organism did things that were *for* something. These organisms exhibited a unity of being. This is evidence of design. It has no reference to probabilities or improbabilities of any mechanism. It is just evidence on its own.
Wrossite: But, since the atheists/materialists want to try to put probabilities on these things (via a multiverse scenario), then the ID community–for the sake of argument–accepts those probabilities. That in itself strikes me as really odd.
But the same thing happened when Darwinians proposed a mechanism. ID theorists accept the mechanism–for the sake of argument–for critical examination. Bartlett again:
Then, in the 19th century, Darwin suggested that there was another possibility for the reason for this cohesion – natural selection. Unity of plan and teleological design, according to Darwin, could also happen due to selection. Thus, the original argument is: X, Y, and Z indicate design Darwin’s argument is: X, Y, and Z could also indicate natural selection So, therefore, we simply show that Darwin is wrong in this assertion. If Darwin is wrong, then the original evidence for design (which was not based on any probability) goes back to being evidence for design. The only reason for probabilities in the modern design argument is because Darwinites have said, “you can get that without design”, so we modeled NotDesign as well, to show that it can’t be done that way. So, the *only* reason we are talking about probabilities is to answer an objection. The original evidence *remains* the primary evidence that it was based on. Answering the objection simply removes the objection.
wrossite: But, even so, my point is that they don’t. The multiverse folks think there could be 10^500 universes (one mainstream example: https://www.scientificamerican.com/article/new-physics-complications-lend-support-to-multiverse-hypothesis/). So, if the ID folks are going to accept this view for the sake of argument, then those improbabilities aren’t so improbable (again, the gravitational constant is an often-used example, but it is only a 1 in 10^40 chance).
Given 10^500 universes, our gravitational constant may not be improbable. BTW why not assume 10^500 multiverses to get things settled once and for all? But surely, the fact that assumptions such as these need to be made in order to ground a naturalistic explanation for our universe is quite telling. Origenes
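The arithmetic behind that reply can be checked directly. Taking the two figures quoted above at face value (10^500 universes and a 1-in-10^40 per-universe chance of our gravitational constant — both, of course, assumptions from the discussion, not established facts), the expected number of "life-permitting" universes comes out enormous:

```python
from fractions import Fraction

N = 10**500              # hypothesized number of universes (figure quoted above)
p = Fraction(1, 10**40)  # assumed per-universe chance of our gravitational constant

expected_hits = N * p    # expected number of universes with our constant
# expected_hits == 10**460, so under these assumptions a universe with our
# constant is not merely probable but expected to occur astronomically often.

# Moreover, P(no universe has it) = (1 - p)**N; since ln(1 - p) ~ -p for tiny p,
# ln P(none) ~ -N*p = -(10**460), i.e. P(at least one) is effectively 1.
```

Exact integer arithmetic via `Fraction` and Python's arbitrary-precision ints avoids the floating-point overflow these magnitudes would otherwise cause; the sketch only shows how completely the conclusion hinges on the assumed values of N and p.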
wrossite, whatever! I've apologized, twice now, for my part, even though I did not attack you personally. You did not accept my apologies. And you are still unrepentant for your direct ad hominem against me. Whatever. The clean slate of friendship is still offered if you will only accept. Moreover, to find your argument severely wanting, even more so after having read your post from top to bottom, is not to insult you personally. You are confusing the rejection of your argument with an insult to you personally. It is not. Not even close. Might I suggest you are too emotionally attached to your argument? You even confess that cosmology and fine-tuning are not your areas of expertise. Through the years here on UD, we have seen all sorts of arguments against fine-tuning, by people, i.e. by and large atheists, much more qualified than you self-admittedly are, and their arguments were all eventually found to have fatal holes. As mentioned in post 51, after mulling all this over for a while, I found one particularly interesting fatal hole in your argument. Not the only hole, but the most interesting one. That hole is, to reiterate, that you are basically trying to make humans act contrary to the 'timeless' nature of their thoughts and to not even question what was before the big bang:
KF, Thanks for the post. Notice that it assumes many of the things I’m suggesting we can’t. It talks as if there are in fact other universes. It talks as if we can know anything about them or their laws. It assumes a range and a distribution for parameters. None of this can be known. It is pure speculation. As I’ve repeatedly pointed out, you cannot calculate a probability given a sample size of one. - W
wrossite, it has occurred to me that your wished for ‘mandate’, (a wished for ‘mandate’ that basically states, ‘we are not allowed to speculate on what was ‘before’ the big bang’), is equivalent to trying to make a new rule decreeing that water is not allowed to run downhill. Let me expand on that a bit. For you to try to mandate that humans cannot speculate on what happened before time began is to go against human nature itself. Human nature, especially human thought, is endowed with an essential timeless element to it that cannot possibly be reduced to any within-space-time, materialistic explanation. ,, etc.. etc..,, https://uncommondesc.wpengine.com/fine-tuning/biology-prof-how-can-we-really-know-if-the-universe-is-fine-tuned/#comment-622354
In other words, even if your argument were correct, you have scant hope of ever getting anyone besides perhaps a few to rigidly follow it, since it goes against the nature of human thought itself. Now I admit getting a nod from Robin Collins for 'sloppiness' in regards to Gordon's 40 orders of magnitude 'guesstimate' was interesting. But it still does not help you, since I do not know what Collins' exact context was when he said it. Does he have some other a priori knowledge of what was before the big bang that Gordon does not? Certainly not. Without such a priori knowledge it is hard to envision how Collins might possibly clean it up or make it more reasonable. And if anyone could, I think that he certainly could. Does he suggest cleaning the 'guesstimate' up in some other, more reasonable way? Or does he, like you, just try to say that we are forbidden from even thinking about it? I highly doubt that Collins would go that far. Perhaps you could get him (or Luke Barnes) to comment more fully on it? Perhaps develop the argument more fully, or even (unthinkably) perhaps have them show you more clearly where your holes are? bornagain77
WR Right, agreed. I wasn't trying to compare the probability arguments for both of those, but rather something like the generation of functional code from a non-intelligent source. ID can oppose that in principle, without even getting into the probability arguments. But then, for the sake of argument, we work as if it were possible and discuss from there. Silver Asiatic
SA, I would push back only on the idea that cosmology is like origins of life (OOL) biology. We have many empirical resources at our disposal when it comes to OOL science. We are aware of the mechanisms proposed as causes for first life, and we can empirically evaluate them. Further, we have many planets, moons, etc., to evaluate the rarity of abiogenesis. So there, the probability arguments rest on a firm foundation. Cosmology is nothing like that. We have no mechanism to empirically or experimentally evaluate. Thus, we can't really hang a probability on the things we discover about the cosmos at large. Otherwise, I think we agree. W wrossite
WR
I’ve honestly given some thought to this angle, since several have raised it. Basically, the ID community would be rejecting the argument that the parameter settings are highly improbable. But, since the atheists/materialists want to try to put probabilities on these things (via a multiverse scenario), then the ID community–for the sake of argument–accepts those probabilities. That in itself strikes me as really odd.
We do a similar thing with arguments for evolution or abiogenesis. It's not that we can't calculate probabilities; rather, the materialist arguments are absurd from the start, yet we accept their assumptions anyway. Actually, consider this - atheistic materialism undercuts the very foundation of rational argument. But we (IDists) set that aside and pretend we actually can have a discussion. So, it's a question of tactics here. And yes, there's room for disagreement on that, but it's certainly not a question of right versus wrong in the methodology one wants to use to convince someone.
So, if the ID folks are going to accept this view for the sake of argument, then those improbabilities aren’t so improbable (again, the gravitational constant is an often-used example, but is only a 1 in 10^40 chance).
The multiversers propose a speculative scenario. Ok, if IDists accept the 10^500 number, then it's not as improbable. However, I think most IDists do not accept that part of the multiverse argument. But we're chasing something amorphous. We're talking about IDists and Multiversers as if they both have unified views on this.
So, I don’t buy that the ID theorist doesn’t mean to say that the fine-tuning is highly improbable, but does so just to play ball with the materialists. None of their discussions appear to be operating that way. Hope that clears up my point.
Yes, fair enough. But your argument here is really on what ID theorists might mean or not in their views. In my experience, most ID theorists I've encountered actually think a multiverse is not only 'not subject to probability studies' but is simply irrational and illogical. It's something that cannot, in principle - by definition, be a source of empirical data. It transcends analysis. That's the most common view within the ID world. I'm open to correction here. If any of my fellow IDers disagree and think that the probability of any number of additional universes can be calculated or rationally comprehended, I would like to hear it. I think Origenes expressed my view well. We consider probabilities only as an "even if" scenario, which is already absurd before we started discussing it. Silver Asiatic
Thank you john_a_designer. I want to be clear, I'm interested in making the best ID arguments we can in the public sphere. I want good arguments that work and are both honest and logically sound. I was hoping this would be iron sharpening iron. And, to be fair, several on here have done so (Origines, Silver Asiatic, Dionisio and others). So, all-in-all, it hasn't been a bad discussion. W wrossite
bornagain77. What is childish is running to the administrator when somebody doesn't treat you nice. Can you honestly say that a statement like, "To be blunt, like the hundreds of atheists I’ve dealt on UD before who don’t like me or my posts, I don’t care what you personally think about the length of my posts or me and could care less if you like them or not or if you like me or not," is in any way charitable or amenable to conversation? I'm more insulted by the fact that you didn't actually read my blog before launching into your 50,000 word responses. But, I'm a big boy. I can handle it. I've seen it before. W wrossite
SA, I don't buy this statement "Yes, those don’t change so we have no empirical evidence of other data points from which to build probabilities. I don’t think you answered Origenes’ points in 49. The ID argument accepts the materialist assumption that our universe emerged from an unknown number of random, physical/material elements all like what is known." I've honestly given some thought to this angle, since several have raised it. Basically, the ID community would be rejecting the argument that the parameter settings are highly improbable. But, since the atheists/materialists want to try to put probabilities on these things (via a multiverse scenario), then the ID community--for the sake of argument--accepts those probabilities. That in itself strikes me as really odd. But, even so, my point is that they don't. The multiverse folks think there could be 10^500 universes (one mainstream example: https://www.scientificamerican.com/article/new-physics-complications-lend-support-to-multiverse-hypothesis/). So, if the ID folks are going to accept this view for the sake of argument, then those improbabilities aren't so improbable (again, the gravitational constant is an often-used example, but is only a 1 in 10^40 chance. The same is true for the others). So, I don't buy that the ID theorist doesn't mean to say that the fine-tuning is highly improbable, but does so just to play ball with the materialists. None of their discussions appear to be operating that way. Hope that clears up my point. W wrossite
I think it is shameful that Wayne is not getting a fair hearing here. I thought maybe UD had cleaned up its act. I guess not. Please read his blog post before you begin refuting his argument. I see little evidence that any of the other commenters have done that. We rightly criticize our atheist interlocutors for setting up straw-men. It’s hypocritical if we turn around and do the same. And, for those who are Christian, it is unchristian. Here is a link to Wayne’s blog, which was also given in the OP: https://shadowofoz.wordpress.com/2016/12/06/sock-drawers-and-cosmological-fine-tuning/ Casey Luskin has written about him at Evolution News and Views:
Rossiter tells some of his own personal story. He entered grad school as a "staunch and cantankerous atheist," studying under "an equally atheistic advisor who was of Dawkins's ilk." But soon he started having doubts about atheism, sparked in part by his increasing doubts about Darwin. As he puts it: “I started to read and listen to scientists and intellectuals who had found faith in God compelling. Just as I was converting, so too was the famed atheist Antony Flew (though never to Christianity). I started to realize that there were good reasons to doubt the metanarrative of naturalism (the centerpiece of which is Darwinian evolution), and that many secular thinkers in fields related to the topic had also come to doubt the entire enterprise (and Darwin in specific).” (p. 5) After going through a deconversion process, leaving behind atheism and Darwinism, and now with a doctorate in hand, he landed a job teaching biology at a Christian university. There, however, he saw that many Christian students were moving in the opposite direction. Under the influence of the Darwinian evolution they had been dogmatically taught they must believe, they were losing their religious faith…
http://www.evolutionnews.org/2015/12/in_shadow_of_oz101421.html Luskin also interviews at Dr. Rossiter at ID the Future. http://www.discovery.org/multimedia/audio/2016/01/shadow-of-oz-wayne-rossiter-on-theistic-evolution-pt-1/ And, he has been mentioned before here at UD. https://uncommondesc.wpengine.com/science/wayne-rossiter-conservatism-doomed-to-extinction/ Come on guys he is an ID success story not a closet atheist. john_a_designer
wrossite, instead of apologizing for his personal attack towards me, offers his own translation of what I wrote so as to introduce an ad hominem that I did not state: "Translation, “Wayne, you’re a nobody, I’m really not interested in your contributions to the ID enterprise, and I really don’t care what you or anybody else thinks.” Might I suggest that when you have to offer your own translation of something I wrote so as to produce evidence of an ad hominem that I did not write, perhaps you should have just apologized for your personal attack in the first place? :) As for my part, I already apologized for mistaking you for an atheist, (although that is not really a personal insult in itself) and now I also further apologize to you for anything, real or imagined, that I may have done to you to make you mad at me, and hope that we can rise above such pettiness in the future. It really all seems a bit childish. I'm certain that God has a better path for us. bornagain77
WR I'm sorry, what don't you buy? You're saying that it is, indeed, possible to calculate the probability of the existence of possible universes? I thought you said we only have a population of one to work from. Now you're saying we have a population of 10^500? Yes, I may misunderstand ... you. Silver Asiatic
SA, I'm sorry, but I don't buy that. Here's why: So far as I've seen, the multiverse theorists suppose 1 x 10^500 possible universes. Guys like Craig and Gordon acknowledge this. So, if that's what we're working from, then statements about c being 1 in 10^120, or G being 1 in 10^40 would be totally within the probabilistic power of such a multiverse scenario. Even there, these guys either misuse the argument, or you misunderstand it. Sorry. W wrossite
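The arithmetic behind this exchange can be made explicit. As a rough sketch (my own illustration, not from either commenter): if one grants, purely for the sake of argument, 10^500 independent universes each with a 1-in-10^40 chance of landing on the observed gravitational constant, the expected number of "hits" is N × p = 10^460, so at least one hit is a virtual certainty. The numbers are far too large for floating-point arithmetic, so the sketch works in base-10 logarithms:

```python
from math import log10

# Assumed inputs, for illustration only: the 10^500 universes figure and
# the 1-in-10^40 odds for the gravitational constant, both quoted in the
# thread. Neither number is empirically established.
log_N = 500    # log10 of the supposed number of universes
log_p = -40    # log10 of the single-universe probability (1 in 10^40)

# Expected number of universes hitting the observed value: N * p.
# In logs: log10(N * p) = log_N + log_p.
log_expected_hits = log_N + log_p
print(log_expected_hits)  # 460, i.e. 10^460 expected hits

# P(at least one hit) = 1 - (1 - p)^N, which approaches 1 - exp(-N*p);
# with N*p = 10^460 this is indistinguishable from certainty.
```

This is exactly the point being argued: granting the multiverser's own premises makes the 1-in-10^40 figure unimpressive, which is why the real dispute in the thread is over whether those premises can be granted at all.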
Dr. Rossiter
The probability of our universe, so far as we know, is one. There are no other universes that we know of, and we have no idea what they might be like if they do exist. I’m not sure what you’re getting at with this idea of changing relations over space. Could you expand on that?
I probably misunderstood the topic, since it appears to be limited to arguments citing finely-tuned 'constants' and the origin of the universe. Yes, those don't change, so we have no empirical evidence of other data points from which to build probabilities. I don't think you answered Origenes' points in 49. The ID argument accepts the materialist assumption that our universe emerged from an unknown number of random, physical/material elements all like what is known. So, we accept, for the sake of engaging the atheistic community, their assumptions, even though strictly speaking, they cannot be used for any probabilistic analysis. It's the same with any multiverse arguments. The discussion ends when someone proposes observable evidence for anything outside of the observable sphere. It's illogical and false from the beginning. But for the sake of it, we proceed "even if ..." Beyond that, regarding changeable elements - I was pointing to the fact that the universe is also finely-tuned for life on earth, and those parameters are not universal constants: the mass of the sun, distance of sun from earth, thickness of earth's crust, earth's tilt, etc. So, your point here - which you seem quite intense about - is one of tactics and not really of the science. It seems you're saying that ID proponents should not engage anyone on the topic of a multiverse, and that the only response to it is: "There can be no evidence. It's totally imaginary. Our understanding of universes is limited to a population of one. No probabilities can be built on that. Any talk of a multiverse is illogical and irrational." And things like that. I think we'd all agree with your conclusions (if I'm correct about them), but whether that is the best strategy for engaging our opponents - that's not something that can be solved scientifically or even philosophically. It's more in understanding what makes materialists tick - and what arguments will work best. 
Even those built on their false premises can be successful. Silver Asiatic
All, I don't think I'll be further discussing my blog on this forum. Many have had their say. I'm happy that several of you agree with my point (even if you disagree with tangents surrounding it). Just to wrap up what has been an awkward and unfruitful discussion with bornagain77, here are some things he/she has said to me: First, the continued accusation that I (as a Christian) am an atheist. “Again, this does not surprise me since Atheists have always retreated to ‘ignorance of the gaps’ arguments to try cover up embarrassing empirical findings.” “This is too funny. Who put you up to this?” “Thus wrossite, while you may pretend that the fine-tuned universal laws are just what they are and are of no big concern for the atheist, the fact of the matter is that your own atheistic metaphysics is what is contradicting you every step of the way.” Note that it's not just that I might be an atheist (which I'm not), but that I am doing this as a ruse, and that I'm trying to be dishonest by covering up facts and ignoring contradictions, perhaps even gladly ignorant. Then, upon being informed that I'm actually an ID proponent, a backhanded apology: “Frankly, I’ve never heard of you as an ID author but only as someone who has authored a book criticizing Theistic Evolution. As I don’t think much of TE anyway, I’ve not been interested in your book. It just is not that important of a topic for me. …To be blunt, like the hundreds of atheists I’ve dealt on UD before who don’t like me or my posts, I don’t care what you personally think about the length of my posts or me and could care less if you like them or not or if you like me or not.” Translation, "Wayne, you're a nobody, I'm really not interested in your contributions to the ID enterprise, and I really don't care what you or anybody else thinks." 
On top of this, at several points bornagain77 demonstrates that he/she hasn't even read my blog (referencing as proofs for his/her case things that I actually directly reference in my blog, as if I wasn't aware of those already). At any rate, call me mean if you like. I've made my case. I've seen nothing to make me think that I was mistaken in my appraisal of the situation. W wrossite
Moreover, when scrutinizing the details of quantum wave collapse, we find that the Christian Theist is well justified in holding that the infinite 'Mind of God' must be behind bringing reality into existence, i.e. must be behind collapsing the wave function upon conscious observation. First off, an ‘uncollapsed’ photon, in its quantum wave state, is mathematically defined as ‘infinite’ information:
Explaining Information Transfer in Quantum Teleportation: Armond Duwell †‡ University of Pittsburgh Excerpt: In contrast to a classical bit, the description of a (quantum) qubit requires an infinite amount of information. The amount of information is infinite because two real numbers are required in the expansion of the state vector of a two state quantum system (Jozsa 1997, 1) http://www.cas.umt.edu/phil/faculty/duwell/DuwellPSA2K.pdf Quantum Computing – Stanford Encyclopedia Excerpt: Theoretically, a single qubit can store an infinite amount of information, yet when measured (and thus collapsing the Quantum Wave state) it yields only the classical result (0 or 1),,, http://plato.stanford.edu/entries/qt-quantcomp/#2.1 Single photons to soak up data: Excerpt: the orbital angular momentum of a photon can take on an infinite number of values. Since a photon can also exist in a superposition of these states, it could – in principle – be encoded with an infinite amount of information. http://physicsworld.com/cws/article/news/7201
Moreover, this 'infinite information' quantum qubit is also mathematically defined as being in an 'infinite dimensional' state:
The Unreasonable Effectiveness of Mathematics in the Natural Sciences – Eugene Wigner – 1960 Excerpt: We now have, in physics, two theories of great power and interest: the theory of quantum phenomena and the theory of relativity.,,, The two theories operate with different mathematical concepts: the four dimensional Riemann space and the infinite dimensional Hilbert space, http://www.dartmouth.edu/~matc/MathDrama/reading/Wigner.html Wave function Excerpt "wave functions form an abstract vector space",,, This vector space is infinite-dimensional, because there is no finite set of functions which can be added together in various combinations to create every possible function. http://en.wikipedia.org/wiki/Wave_function#Wave_functions_as_an_abstract_vector_space Double Slit, Quantum-Electrodynamics, and Christian Theism - video https://www.facebook.com/philip.cunningham.73/videos/vb.100000088262100/1127450170601248/?type=2&theater
Since God is both omniscient and omnipresent, (possesses infinite knowledge and is everywhere present), then God is certainly a 'sufficient cause' to explain exactly how an infinite-information, infinite-dimensional quantum wave state can possibly collapse to a single bit state. Verses and Music:
Job 38:19-20 “What is the way to the abode of light? And where does darkness reside? Can you take them to their places? Do you know the paths to their dwellings?” Colossians 1:17 He is before all things, and in him all things hold together. Hebrews 11:3 By faith we understand that the worlds were prepared by the word of God, so that what is seen was made from things that are not visible. THE GREATEST GIFT – Yancy - music video https://www.youtube.com/watch?v=mHGVud2Qfa4
Quote:
"As a man who has devoted his whole life to the most clear headed science, to the study of matter, I can tell you as a result of my research about atoms this much: There is no matter as such. All matter originates and exists only by virtue of a force which brings the particle of an atom to vibration and holds this most minute solar system of the atom together. We must assume behind this force the existence of a conscious and intelligent mind. This mind is the matrix of all matter." Max Planck - The main originator of Quantum Theory - Das Wesen der Materie [The Nature of Matter], speech at Florence, Italy (1944) (from Archiv zur Geschichte der Max-Planck-Gesellschaft, Abt. Va, Rep. 11 Planck, Nr. 1797)
bornagain77
wrossite, at 44 you state:
KF, Thanks for the post. Notice that it assumes many of the things I’m suggesting we can’t. It talks as if there are in fact other universes. It talks as if we can know anything about them or their laws. It assumes a range and a distribution for parameters. None of this can be known. It is pure speculation. As I’ve repeatedly pointed out, you cannot calculate a probability given a sample size of one. W
wrossite, it has occurred to me that your wished for 'mandate', (a wished for 'mandate' that basically states, 'we are not allowed to speculate on what was 'before' the big bang'), is equivalent to trying to make a new rule decreeing that water is not allowed to run downhill. Let me expand on that a bit. For you to try to mandate that humans cannot speculate on what happened before time began is to go against human nature itself. Human nature, especially human thought, is endowed with an essential timeless element to it that cannot possibly be reduced to any within-space-time, materialistic explanation. In other words, for you to try to say that humans are not allowed to think 'timelessly', i.e. to think about what possibly might have been before time began, is for you to try to mandate that humans are not allowed to think 'timelessly' as humans 'naturally' think. You might as well say that water is not allowed to run downhill as to try to say humans are not allowed to think about what happened before time began. Indeed, you might as well say humans are not allowed to speak or write, since speaking and writing are a direct reflection of the 'timeless' immaterial attribute that man's mind is endowed with. Michael Egnor, professor of neurosurgery, puts the situation like this:
The Fundamental Difference Between Humans and Nonhuman Animals - Michael Egnor - November 5, 2015 Excerpt: Human beings think abstractly, and nonhuman animals do not. Human beings have the power to contemplate universals, which are concepts that have no material instantiation. Human beings think about mathematics, literature, art, language, justice, mercy, and an endless library of abstract concepts. Human beings are rational animals. Human rationality is not merely a highly evolved kind of animal perception. Human rationality is qualitatively different -- ontologically different -- from animal perception. Human rationality is different because it is immaterial. Contemplation of universals cannot have material instantiation, because universals themselves are not material and cannot be instantiated in matter.,,, It is a radical difference -- an immeasurable qualitative difference, not a quantitative difference. We are more different from apes than apes are from viruses.,,, http://www.evolutionnews.org/2015/11/the_fundamental_2100661.html Language Is a Rock Against Which Evolutionary Theory Wrecks Itself - Michael Egnor - September 19, 2016 Excerpt: Wolfe provides a précis of his argument: "Speech is not one of man's several unique attributes -- speech is the attribute of all attributes!" And yet, as Wolfe points out, Darwinists are at an utter loss to explain how language -- the salient characteristic of man -- "evolved.",,, I have argued before that the human mind is qualitatively different from the animal mind. The human mind has immaterial abilities -- the intellect's ability to grasp abstract universal concepts divorced from any particular thing -- and that this ability makes us more different from apes than apes are from viruses. We are ontologically different. We are a different kind of being from animals. We are not just animals who talk. 
Although we share much in our bodies with animals, our language -- a simulacrum of our abstract minds -- has no root in the animal world. Language is the tool by which we think abstractly. It is sui generis. It is a gift, a window into the human soul, something we are made with, and it did not evolve. Language is a rock against which evolutionary theory wrecks, one of the many rocks --,,, http://www.evolutionnews.org/2016/09/language_is_a_r103151.html
But this 'timeless' attribute of man is more than just a proof that Darwinian evolution cannot possibly be true. This 'timeless' attribute of man goes to the very heart of physics. In fact, this 'timeless' attribute of man, and Einstein's denial of the reality thereof, figured centrally in Einstein failing to receive a Nobel prize for relativity. Einstein had a heated encounter with the famous philosopher Henri Bergson over the proper definition of time, and the disagreement that ensued between the two men over that definition was one of the primary reasons the prize was never awarded for relativity:
Einstein vs Bergson, science vs philosophy and the meaning of time – Wednesday 24 June 2015 Excerpt: The meeting of April 6 was supposed to be a cordial affair, though it ended up being anything but. ‘I have to say that day exploded and it was referenced over and over again in the 20th century,’ says Canales. ‘The key sentence was something that Einstein said: “The time of the philosophers did not exist.”’ It’s hard to know whether Bergson was expecting such a sharp jab. In just one sentence, Bergson’s notion of duration—a major part of his thesis on time—was dealt a mortal blow. As Canales reads it, the line was carefully crafted for maximum impact. ‘What he meant was that philosophers frequently based their stories on a psychological approach and [new] physical knowledge showed that these philosophical approaches were nothing more than errors of the mind.’ The night would only get worse. ‘This was extremely scandalous,’ says Canales. ‘Einstein had been invited by philosophers to speak at their society, and you had this physicist say very clearly that their time did not exist.’ Bergson was outraged, but the philosopher did not take it lying down. A few months later Einstein was awarded the Nobel Prize for the discovery of the law of photoelectric effect, an area of science that Canales noted, ‘hardly jolted the public’s imagination’. In truth, Einstein coveted recognition for his work on relativity. Bergson inflicted some return humiliation of his own. By casting doubt on Einstein’s theoretical trajectory, Bergson dissuaded the committee from awarding the prize for relativity. In 1922, the jury was still out on the correct interpretation of time. So began a dispute that festered for years and played into the larger rift between physics and philosophy, science and the humanities. Bergson was fond of saying that time was the experience of waiting for a lump of sugar to dissolve in a glass of water. 
It was a declaration that one could not talk about time without reference to human consciousness and human perception. Einstein would say that time is what clocks measure. Bergson would no doubt ask why we build clocks in the first place. ‘He argued that if we didn’t have a prior sense of time we wouldn’t have been led to build clocks and we wouldn’t even use them … unless we wanted to go places and to events that mattered,’ says Canales. ‘You can see that their points of view were very different.’ In a theoretical nutshell this expressed perfectly the division between lived time and spacetime: subjective experience versus objective reality.,,, Just when Einstein thought he had it worked out, along came the discovery of quantum theory,,, Some supporters went as far as to say that Bergson’s earlier work anticipated the quantum revolution of Niels Bohr and Werner Heisenberg by four decades or more.,,, Was Bergson right after all? Time will tell. http://www.abc.net.au/radionational/programs/philosopherszone/science-vs-philosophy-and-the-meaning-of-time/6539568
After his heated encounter with Bergson, Einstein had an encounter with another philosopher. He was once asked by Rudolf Carnap:
“Can physics demonstrate the existence of ‘the now’ in order to make the notion of ‘now’ into a scientifically valid term?”
Einstein’s answer was categorical, he said:
“The experience of ‘the now’ cannot be turned into an object of physical measurement, it can never be a part of physics.”
Quote was taken from the last few minutes of this following video.
Stanley L. Jaki: “The Mind and Its Now” https://vimeo.com/10588094
And here is a bit more detail of the encounter:
The Mind and Its Now – May 22, 2008 – By Stanley L. Jaki Excerpt: ,,, Rudolf Carnap, and the only one among them who was bothered with the mind’s experience of its now. His concern for this is noteworthy because he went about it in the wrong way. He thought that physics was the only sound way to know and to know anything. It was therefore only logical on his part that he should approach, we are around 1935, Albert Einstein, the greatest physicist of the day, with the question whether it was possible to turn the experience of the now into a scientific knowledge. Such knowledge must of course be verified with measurement. We do not have the exact record of Carnap’s conversation with Einstein whom he went to visit in Princeton, at eighteen hours by train at that time from Chicago. But from Einstein’s reply which Carnap jotted down later, it is safe to assume that Carnap reasoned with him as outlined above. Einstein’s answer was categorical: The experience of the now cannot be turned into an object of physical measurement. It can never be part of physics. http://metanexus.net/essay/mind-and-its-now
The exact meaning of Carnap's question of ‘the Now’ can also be read in fuller context in the following article:
The Mind and Its Now – Stanley L. Jaki, May 2008 Excerpts: There can be no active mind without its sensing its existence in the moment called now.,,, Three quarters of a century ago Charles Sherrington, the greatest modern student of the brain, spoke memorably on the mind’s baffling independence of the brain. The mind lives in a self-continued now or rather in the now continued in the self. This life involves the entire brain, some parts of which overlap, others do not. ,,,There is no physical parallel to the mind’s ability to extend from its position in the momentary present to its past moments, or in its ability to imagine its future. The mind remains identical with itself while it lives through its momentary nows. ,,, the now is immensely richer an experience than any marvelous set of numbers, even if science could give an account of the set of numbers, in terms of energy levels. The now is not a number. It is rather a word, the most decisive of all words. It is through experiencing that word that the mind comes alive and registers all existence around and well beyond. ,,, All our moments, all our nows, flow into a personal continuum, of which the supreme form is the NOW which is uncreated, because it simply IS.,,,
Moreover, the statement Einstein made to Carnap on the train, that ‘the now’ cannot be turned into an object of physical measurement, was an interesting statement for Einstein to make to the philosopher, since ‘the now of the mind’ has, through many recent experiments in quantum mechanics, established itself as central to quantum theory, and has undermined the claim that the space-time of Einstein's General Relativity is the absolute frame of reference for reality.
LIVING IN A QUANTUM WORLD - Vlatko Vedral - 2011 Excerpt: Thus, the fact that quantum mechanics applies on all scales forces us to confront the theory’s deepest mysteries. We cannot simply write them off as mere details that matter only on the very smallest scales. For instance, space and time are two of the most fundamental classical concepts, but according to quantum mechanics they are secondary. The entanglements are primary. They interconnect quantum systems without reference to space and time. If there were a dividing line between the quantum and the classical worlds, we could use the space and time of the classical world to provide a framework for describing quantum processes. But without such a dividing line—and, indeed, without a truly classical world—we lose this framework. We must explain space and time (4D space-time) as somehow emerging from fundamentally spaceless and timeless physics. http://phy.ntnu.edu.tw/~chchang/Notes10b/0611038.pdf
New Mind-blowing Experiment Confirms That Reality Doesn’t Exist If You Are Not Looking at It – June 3, 2015 Excerpt: The results of the Australian scientists’ experiment, which were published in the journal Nature Physics, show that this choice is determined by the way the object is measured, which is in accordance with what quantum theory predicts. “It proves that measurement is everything. At the quantum level, reality does not exist if you are not looking at it,” said lead researcher Dr. Andrew Truscott in a press release.,,, “The atoms did not travel from A to B. It was only when they were measured at the end of the journey that their wave-like or particle-like behavior was brought into existence,” he said. Thus, this experiment adds to the validity of the quantum theory and provides new evidence to the idea that reality doesn’t exist without an observer.
http://themindunleashed.org/2015/06/new-mind-blowing-experiment-confirms-that-reality-doesnt-exist-if-you-are-not-looking-at-it.html “Reality is in the observations, not in the electron.” – Paul Davies “We have become participators in the existence of the universe. We have no right to say that the past exists independent of the act of observation.” – John Wheeler
i.e. ‘the Now’, as philosophers term it, contrary to what Einstein (and Jaki) thought possible for experimental physics, and according to advances in quantum mechanics, takes precedence over past events in time. Moreover, due to those advances in quantum mechanics, it would now be much more appropriate to phrase Einstein’s answer to the philosopher in this way:
“It is impossible for the experience of ‘the now of the mind’ to ever be divorced from physical measurement, it will always be a part of physics.”
bornagain77
wrossite @ 46: If you can't get along with BornAgain77 you have some real issues. Truth Will Set You Free
wrossite: But if it is meaningless to ask what the probability of getting a different parameter setting is, then my argument is correct, and we should stop doing it.
It is meaningful to ask what the probability of getting a different parameter setting is, only in the context of a proposed random mechanism, such as a multiverse (*). So again, my point is simple and obvious (and on this we agree): without mechanism X being proposed we cannot evaluate the probabilities and improbabilities of mechanism X. However, I'm not sure that you agree with me on the holistic nature of the design inference, and more importantly, that the absence or vagueness of a proposition of a random mechanism does not weaken the design inference wrt the universe. - - - - (*) I am not talking about unrestricted multiverses, in which anything that can possibly happen, actually happens in some universe. I don't think that probabilities apply here. Everything that can happen happens by definition — IOWs every event is 100% likely to happen. Instead, I’m talking about the more modest claim that our universe is just one of a vast number of universes with varying physical constants and different laws of nature, and that there is something like a “universe-generator” which churns out baby universes. Origenes
wrossite states,,
If the administrator cares to read your voluminous comments, he/she will see how unkindly you have treated me. Frankly, you are nasty to talk to,
You are projecting! Please provide one instance on this thread where I have 'been nasty' towards you personally. I admit, I have not been kind to your argument, and indeed I find it severely wanting. But showing you to be wrong in your argument, contrary to what you may believe, does not equate to me 'being nasty' towards you personally. The right thing for you to do would be to apologize for your ad hominem towards me and start over with a clean slate. I doubt you will, though. If you can't admit you are wrong in this relatively minor instance of insulting someone instead of forthrightly engaging them, then it is easy to see why you are having so much trouble admitting that you are wrong on the bigger issue of the truth of your claim. To reiterate KF's quote, which hit the nail on the head, or the fly with a bullet as it were:
“. . . the need for such explanations does not depend on any estimate of how many universes would be observer-permitting, out of the entire field of possible universes. Claiming that our universe is ‘fine tuned for observers’, we base our claim on how life’s evolution would apparently have been rendered utterly impossible by comparatively minor alterations in physical force strengths, elementary particle masses and so forth. There is no need for us to ask whether very great alterations in these affairs would have rendered it fully possible once more, let alone whether physical worlds conforming to very different laws could have been observer-permitting without being in any way fine tuned. Here it can be useful to think of a fly on a wall, surrounded by an empty region. A bullet hits the fly. Two explanations suggest themselves. Perhaps many bullets are hitting the wall or perhaps a marksman fired the bullet. There is no need to ask whether distant areas of the wall, or other quite different walls, are covered with flies so that more or less any bullet striking there would have hit one. The important point is that the local area contains just the one fly.” - John Leslie - Our Place in the Cosmos http://web.archive.org/web/20050308175505/http://www.royalinstitutephilosophy.org/articles/leslie_cosmos.htm In short, finding oneself at a locally isolated operating point for a cosmos is just as momentous as thinking in terms of whether a variable can go in magnitude from 0 to the transfinite. - KF
bornagain77
WR, nope, it is not so easy as plastering over the term "assumptions," move along nothing to see; much as with "god of the gaps"; this is about the architecture of the framework of the observed cosmos. What John Leslie did was to summarise what many others have noted about the structure of laws and parameters that we have discovered over the years, in effect applying sensitivity analysis to the laws and parameters -- a standard move. Where, mathematics is perhaps best understood as the logic of structure and quantity. The result is that in parameter space there are many converging zones yielding a deeply isolated operating point. That is significant, and well worth pondering. Nor, is it easily got rid of. If there are super-laws forcing the tightly convergent clustering Leslie summarises, then those super-laws would be fine tuned. And so forth. KF kairosfocus
Bornagain77, suck it up buttercup. If the administrator cares to read your voluminous comments, he/she will see how unkindly you have treated me. Frankly, you are nasty to talk to, and if you do treat atheists like this on these threads, you're hurting the cause. W wrossite
Origenes, I agree. But if it is meaningless to ask what the probability of getting a different parameter setting is, then my argument is correct, and we should stop doing it. W wrossite
KF, Thanks for the post. Notice that it assumes many of the things I'm suggesting we can't know. It talks as if there are in fact other universes. It talks as if we can know anything about them or their laws. It assumes a range and a distribution for parameters. None of this can be known. It is pure speculation. As I've repeatedly pointed out, you cannot calculate a probability given a sample size of one. W wrossite
Origenes @ 40: Excellent comment. Thank you. Truth Will Set You Free
BA @ 41: Well said. Let's hope Wayne Rossiter learns this lesson. Truth Will Set You Free
Wayne Rossiter, the administrator agrees with me that your comments are 'unseemly' but he has seen fit not to ban you on this thread since you are in fact 'the person the whole post is about.' I agree with him in this instance. But rest assured that if you continue to participate on UD, resorting to personal attack to try to support an argument is one of the quickest routes to being banned from UD. Years ago, I myself learned that lesson the hard way and was banned for several months for ad hominem arguments. Personally, I like the rule very much, since I have seen arguments on UD degenerate into endless bouts of ugly name calling before the rule was finally more strictly enforced. bornagain77
Here is a meaningless question: “what is the probability that Leonardo Da Vinci created the Mona Lisa?” Why is this a meaningless question? Because creating the Mona Lisa was not a random event. By the same token, if the universe was caused by intelligent design, then talking about probabilities is inappropriate — randomness is simply not involved. It is only after someone hypothesizes that the creation of our universe is a random event, by proposing a mechanism which produces universes with random constants, that we can discuss the probability of the coming into existence of the universe in that context. This post by Bartlett makes exactly this point. The design inference is not based on probabilities, but instead on holism. Only after Darwin suggested a random mechanism (natural selection) did probabilities enter the discussion. Prof Wayne Rossiter seems to say 'but wrt the origin of the universe we don’t know what a random mechanism would look like, so we don’t know the probabilities, and therefore we don’t know if the universe is designed.' We know that the universe is designed when we see fine-tuning of the natural constants. This is a clear indication of teleology – functional coherence. There is a complete alignment of various constants wrt the top-level function — harboring life. There is no reference here to probabilities or improbabilities of any mechanism. It is just evidence on its own. No probabilities involved. Probabilities enter the arena only after someone proposes a random mechanism a la natural selection. It does not help the atheist to say: 'but I don’t offer you such a mechanism, so you don’t know how likely it is that this unknown mechanism produces a fine-tuned universe.' Origenes
WR: Pardon a point or two from John Leslie:
"One striking thing about the fine tuning is that a force strength or a particle mass often appears to require accurate tuning for several reasons at once. Look at electromagnetism. Electromagnetism seems to require tuning for there to be any clear-cut distinction between matter and radiation; for stars to burn neither too fast nor too slowly for life’s requirements; for protons to be stable; for complex chemistry to be possible; for chemical changes not to be extremely sluggish; and for carbon synthesis inside stars (carbon being quite probably crucial to life). Universes all obeying the same fundamental laws could still differ in the strengths of their physical forces, as was explained earlier, and random variations in electromagnetism from universe to universe might then ensure that it took on any particular strength sooner or later. Yet how could they possibly account for the fact that the same one strength satisfied many potentially conflicting requirements, each of them a requirement for impressively accurate tuning?" [Our Place in the Cosmos, The Royal Institute of Philosophy, 1998 (courtesy Wayback Machine) Emphases added.] AND: ". . . the need for such explanations does not depend on any estimate of how many universes would be observer-permitting, out of the entire field of possible universes. Claiming that our universe is ‘fine tuned for observers’, we base our claim on how life’s evolution would apparently have been rendered utterly impossible by comparatively minor alterations in physical force strengths, elementary particle masses and so forth. There is no need for us to ask whether very great alterations in these affairs would have rendered it fully possible once more, let alone whether physical worlds conforming to very different laws could have been observer-permitting without being in any way fine tuned. Here it can be useful to think of a fly on a wall, surrounded by an empty region. A bullet hits the fly. Two explanations suggest themselves.
Perhaps many bullets are hitting the wall or perhaps a marksman fired the bullet. There is no need to ask whether distant areas of the wall, or other quite different walls, are covered with flies so that more or less any bullet striking there would have hit one. The important point is that the local area contains just the one fly." [Emphasis his.]
In short, finding oneself at a locally isolated operating point for a cosmos is just as momentous as thinking in terms of whether a variable can go in magnitude from 0 to the transfinite. KF kairosfocus
wrossite, you act as if you have a priori knowledge of what the parameters should be prior to the creation of the universe. But you, as you yourself have repeatedly admitted, don't have a clue what the variance could possibly be. Nobody has a clue. Therefore, of necessity, the variance of parameters could possibly be anywhere from zero to infinity. That IS the default assumption for an unknown quantity when you have no a priori knowledge as to what its actual quantity could possibly be. Calling what is the default assumption 'stupid' just because it disagrees with your position is not supporting your argument with evidence or logic but is trying to support your position by semi-personal attack. Claiming that we can assign no probability is wrong: Dr. Gordon assigned one, using the conservative but reasonable assumption of a variance, in orders of magnitude, based on what we actually empirically observe. Then, after calling what IS the default assumption with no a priori knowledge 'stupid', you go on to call me 'bitter and argumentative, and amateur'. The fact of the matter is that you are the one who is in the wrong on this matter. (Tilting at windmills comes to mind in that you are imagining a problem where none actually exists). But since you have now resorted to ad hominem to try to support your unsupportable position, I will ask that you be banned from UD for personal attack. bornagain77
Robin Collins did in fact use the same analogy, and I called him out on it, whereafter he conceded that it is 'sloppy'. To say that, since time and space emerge from the big bang, all parameters could vary up to infinity is stupid. That again would mean that we could assign no probability to the observed setting. The probability would then be 1 in 10^infinity, which nobody claims about any parameter. I think we're done here. Others are welcome to comment, but bornagain77 seems just bitter and argumentative, and amateur. The fact remains that no probability can be made from a sample size of 1. End of story. W wrossite
correction: "Both are ridiculous assertions conservative assumptions." There, all better.
Contemporary Physics and God Part 2 Dr Bruce Gordon – video https://youtu.be/ff_sNyGNSko?t=282
Again, since matter-energy, space-time were brought into being at the Big Bang,,
Steven Hawking, George Ellis, and Roger Penrose turned their attention to the Theory of Relativity and its implications regarding our notions of time. In 1968 and 1970, they published papers in which they extended Einstein’s Theory of General Relativity to include measurements of time and space.1, 2 According to their calculations, time and space had a finite beginning that corresponded to the origin of matter and energy.”3 Steven W. Hawking, George F.R. Ellis, “The Cosmic Black-Body Radiation and the Existence of Singularities in our Universe,” Astrophysical Journal, 152, (1968) pp. 25-36. Steven W. Hawking, Roger Penrose, “The Singularities of Gravitational Collapse and Cosmology,” Proceedings of the Royal Society of London, series A, 314 (1970) pp. 529-548.
Again, since matter-energy and space-time were brought into being at the Big Bang, the default assumption, on both Atheism and Theism, is that the constant could have varied from zero to infinity. The default assumption is not that the variance should be far more tightly constrained than what we empirically observe in terms of orders of magnitude. Repeatedly claiming that you just don't know what the variance could have been underscores this point, and certainly does not negate the conservative assumption that Dr. Gordon made. Here is Robin Collins, right after the tape-measure analogy: https://youtu.be/ajqH4y8G0MI?t=1714 bornagain77
I am aware of that post. If you had actually read my blog, you would know that I reference Gordon directly from that lecture on that point! I use him as an example of really bad logic in coming up with his tape measure analogy (something Robin Collins actually agreed with me on when we both presented at this year's Christian Scientific Society meeting). He simply observes that the four forces acting on matter span 40 orders of magnitude, and then supposes that we can therefore assume that any one of those forces could've taken a value along those 40 orders of magnitude. He further assumes that any value would be equiprobable. Both are ridiculous assertions. W wrossite
wrossite, given that you have been basically arguing for the atheistic position, and that I deal almost exclusively with atheists on this blog, I hope you can forgive me for confusing you with an atheist. Frankly, I've never heard of you as an ID author but only as someone who has authored a book criticizing Theistic Evolution. As I don't think much of TE anyway, I've not been interested in your book. It just is not that important of a topic for me. I'm very glad you are a Christian. As to your second point: To be blunt, like the hundreds of atheists I've dealt with on UD before who don't like me or my posts, I don't care what you personally think about the length of my posts or me, and couldn't care less if you like them or not or if you like me or not. Your third point on,, "the range of values that a parameter, say Newton's gravitational constant, could have been?" I answered that specific point in my very first post at 3 (which was a short post). Apparently you did not bother to look at the video or read that short post. I cited Dr. Gordon's 'conservative' estimate for a reasonable possible range of values.
,,, at the 4:45 minute mark of the following video, Dr. Bruce Gordon comments that varying the gravitational constant by just one inch, on an imaginary ruler that stretched across the entire universe, would either increase or decrease our weight by a trillion fold: Contemporary Physics and God Part 2 Dr Bruce Gordon – video https://www.youtube.com/watch?v=ff_sNyGNSko You can see a visualization of that imaginary ruler stretched across the universe in the following video,,, Finely Tuned Gravity (1 in 10^40 tolerance; which is just one inch of tolerance allowed on an imaginary ruler stretching across the diameter of the entire universe) – (27:32 minute mark) video https://www.youtube.com/watch?feature=player_detailpage&v=ajqH4y8G0MI#t=1652
Note that this is a conservative estimate of possible ranges for Gravity and any realistic estimate will be much wider. There is no reason for it not to scale from zero to infinity in both the Theistic and Atheistic worldview. Exactly what overriding rule of science or philosophy can you cite that would prevent G from scaling from Zero to Infinity at the beginning of the universe?
Steven Hawking, George Ellis, and Roger Penrose turned their attention to the Theory of Relativity and its implications regarding our notions of time. In 1968 and 1970, they published papers in which they extended Einstein's Theory of General Relativity to include measurements of time and space.1, 2 According to their calculations, time and space had a finite beginning that corresponded to the origin of matter and energy."3 Steven W. Hawking, George F.R. Ellis, "The Cosmic Black-Body Radiation and the Existence of Singularities in our Universe," Astrophysical Journal, 152, (1968) pp. 25-36. Steven W. Hawking, Roger Penrose, "The Singularities of Gravitational Collapse and Cosmology," Proceedings of the Royal Society of London, series A, 314 (1970) pp. 529-548.
God is free to choose any value he wants for G, from zero to infinity, and 'infinite random chance' couldn't care less what value it picks from zero to infinity. Seeing as you did not even bother to read my first short post at the beginning of this thread, where Dr. Gordon gave a very reasonable 'conservative estimate' for scaling, since you apparently still persist in thinking you have a valid point, and since you clearly do not respect anything I may say, I will respond to you no further. I have made my point clear. You have no objection against fine-tuning. bornagain77
Give me a number, and tell me how it was derived. wrossite
Two thoughts for bornagain77. First, wrossite is Wayne Rossiter, an ID author and Christian. No atheist metaphysics for me, thank you. Second, as a rule of thumb, I would advise against writing a novel in your responses. Nobody will read them, and most won't know where to begin in responding. I only point out that, as I outline in my blog, respected scientists and philosophers on both sides seem to concede my point. We simply don't know the range or probability distribution for any given parameter. To make this point, I'll ask you again: what is the range of values that a parameter, say Newton's gravitational constant, could have been? And what is the shape of that distribution? W wrossite
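As a side note on the probability dispute running through this thread, the disagreement can be made concrete with a few lines of arithmetic. The sketch below is illustrative only and comes from neither commenter: the 1-in-10^40 tolerance figure for gravity is the one cited in the thread, while the two candidate priors are hypothetical choices. It shows that the "probability" assigned to a fine-tuned constant depends entirely on the assumed range and distribution, and that a uniform prior over [0, infinity) is not even normalizable, so any uniform prior must assume some finite upper bound:

```python
# Illustrative sketch only, from neither commenter. The 1-in-10^40 tolerance
# for gravity is the figure cited in the thread; the two candidate priors
# below are hypothetical choices, not established physics. A uniform prior
# over [0, infinity) is not normalizable, so a finite upper bound is assumed.
import math

G = 6.674e-11           # observed gravitational constant (SI units)
tolerance = G / 1e40    # life-permitting window width, per the 1-in-10^40 claim

def p_uniform(lo, hi):
    """Chance of landing in the window under a uniform prior on [lo, hi]."""
    return tolerance / (hi - lo)

def p_loguniform(lo, hi):
    """Chance under a log-uniform prior (equal weight per order of magnitude)."""
    # log1p avoids underflow: the window is only ~1e-40 of G in relative terms
    window_decades = math.log1p(tolerance / G) / math.log(10)
    total_decades = math.log10(hi / lo)
    return window_decades / total_decades

# Same observation, same tolerance, wildly different "improbabilities":
print(p_uniform(0, 10 * G))               # ~1e-41 under uniform on [0, 10G]
print(p_loguniform(G * 1e-20, G * 1e20))  # ~1e-42 over 40 orders of magnitude
```

Widen or narrow the assumed range, or swap the distribution, and the answer moves by orders of magnitude, which is exactly why the two sides of this exchange cannot agree on a number without first agreeing on a prior.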
Of supplemental note: Although Darwinian evolution has no rigid mathematical basis to test against, mathematics, coupled with empirical observation, has nonetheless been very effective at demonstrating that Darwinian evolution is mathematically impossible. Here are a few notes in that regard:
"In light of Doug Axe's number, and other similar results,, (1 in 10^77), it is overwhelmingly more likely than not that the mutation, random selection, mechanism will fail to produce even one gene or protein given the whole multi-billion year history of life on earth. There is not enough opportunities in the whole history of life on earth to search but a tiny fraction of the space of 10^77 possible combinations that correspond to every functional combination. Why? Well just one little number will help you put this in perspective. There have been only 10^40 organisms living in the entire history of life on earth. So if every organism, when it replicated, produced a new sequence of DNA to search that (1 in 10^77) space of possibilities, you would have only searched 10^40th of them. 10^40 over 10^77 is 1 in 10^37. Which is 10 trillion, trillion, trillion. In other words, If every organism in the history of life would have been searching for one those (functional) gene sequences we need, you would have searched 1 in 10 trillion, trillion, trillionth of the haystack. Which makes it overwhelmingly more likely than not that the (Darwinian) mechanism will fail. And if it is overwhelmingly more likely than not that the (Darwinian) mechanism will fail should we believe that is the way that life arose?" Stephen Meyer - 46:19 minute mark - Darwin's Doubt - video https://www.youtube.com/watch?v=Vg8bqXGrRa0&feature=player_detailpage#t=2778 Waiting Longer for Two Mutations - Michael J. Behe Excerpt: Citing malaria literature sources (White 2004) I had noted that the de novo appearance of chloroquine resistance in Plasmodium falciparum was an event of probability of 1 in 10^20. I then wrote that 'for humans to achieve a mutation like this by chance, we would have to wait 100 million times 10 million years' (1 quadrillion years)(Behe 2007) (because that is the extrapolated time that it would take to produce 10^20 humans). Durrett and Schmidt (2008, p. 
1507) retort that my number ‘is 5 million times larger than the calculation we have just given’ using their model (which nonetheless "using their model" gives a prohibitively long waiting time of 216 million years). Their criticism compares apples to oranges. My figure of 10^20 is an empirical statistic from the literature; it is not, as their calculation is, a theoretical estimate from a population genetics model.,,, The difficulty with models such as Durrett and Schmidt’s is that their biological relevance is often uncertain, and unknown factors that are quite important to cellular evolution may be unintentionally left out of the model. That is why experimental or observational data on the evolution of microbes such as P. falciparum are invaluable,,, http://www.discovery.org/a/9461 The waiting time problem in a model hominin population - 2015 Sep 17 John Sanford, Wesley Brewer, Franzine Smith, and John Baumgardner Excerpt: The program Mendel’s Accountant realistically simulates the mutation/selection process,,, Given optimal settings, what is the longest nucleotide string that can arise within a reasonable waiting time within a hominin population of 10,000? Arguably, the waiting time for the fixation of a “string-of-one” is by itself problematic (Table 2). Waiting a minimum of 1.5 million years (realistically, much longer), for a single point mutation is not timely adaptation in the face of any type of pressing evolutionary challenge. This is especially problematic when we consider that it is estimated that it only took six million years for the chimp and human genomes to diverge by over 5 % [1]. This represents at least 75 million nucleotide changes in the human lineage, many of which must encode new information. While fixing one point mutation is problematic, our simulations show that the fixation of two co-dependent mutations is extremely problematic – requiring at least 84 million years (Table 2). 
This is ten-fold longer than the estimated time required for ape-to-man evolution. In this light, we suggest that a string of two specific mutations is a reasonable upper limit, in terms of the longest string length that is likely to evolve within a hominin population (at least in a way that is either timely or meaningful). Certainly the creation and fixation of a string of three (requiring at least 380 million years) would be extremely untimely (and trivial in effect), in terms of the evolution of modern man. It is widely thought that a larger population size can eliminate the waiting time problem. If that were true, then the waiting time problem would only be meaningful within small populations. While our simulations show that larger populations do help reduce waiting time, we see that the benefit of larger population size produces rapidly diminishing returns (Table 4 and Fig. 4). When we increase the hominin population from 10,000 to 1 million (our current upper limit for these types of experiments), the waiting time for creating a string of five is only reduced from two billion to 482 million years. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4573302/
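As a quick check on the arithmetic in the Meyer quote above, here is a two-line sketch. The 10^77 sequence-space and 10^40 organism-count figures are simply the numbers as quoted in the thread, not independently verified:

```python
# Reproduces the search-fraction arithmetic from the quoted Meyer transcript:
# roughly 10^40 organisms in the history of life, each searching one sequence
# in a space of roughly 10^77 possibilities (figures as quoted, not verified).
from fractions import Fraction

organisms = 10**40        # replication events available to "search" the space
sequence_space = 10**77   # possible sequence combinations, as quoted

fraction_searched = Fraction(organisms, sequence_space)
# 10^40 / 10^77 = 1 / 10^37, i.e. one part in 10 trillion, trillion, trillion
print(fraction_searched == Fraction(1, 10**37))  # True
```

The quoted follow-on phrase "10 trillion, trillion, trillion" for 10^37 also checks out, since 10^37 = 10 × (10^12)^3.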
Moreover, insofar as Darwinian evolution is dependent on the premises of reductive materialism (and I would argue that it is absolutely dependent on those materialistic premises), and regardless of whether Darwinists ever personally accept the empirical falsification, Darwinian evolution is now empirically falsified by advances in quantum biology:
Jim Al-Khalili, in the following video, states, “,,and Physicists and Chemists have had a long time to try and get use to it (Quantum Mechanics). Biologists, on the other hand have got off lightly in my view. They are very happy with their balls and sticks models of molecules. The balls are the atoms. The sticks are the bonds between the atoms. And when they can’t build them physically in the lab nowadays they have very powerful computers that will simulate a huge molecule.,, It doesn’t really require much in the way of quantum mechanics in the way to explain it.” ,,, Jim Al-Khalili goes on to state: “To paraphrase, (Erwin Schrödinger in his book “What Is Life”), he says at the molecular level living organisms have a certain order. A structure to them that’s very different from the random thermodynamic jostling of atoms and molecules in inanimate matter of the same complexity. In fact, living matter seems to behave in its order and its structure just like inanimate cooled down to near absolute zero. Where quantum effects play a very important role. There is something special about the structure, about the order, inside a living cell. So Schrodinger speculated that maybe quantum mechanics plays a role in life”. Jim Al-Khalili – Molecular Biology – 19th Century Materialism meets 21st Century Quantum Mechanics – video https://www.youtube.com/watch?v=rCs3WXHqOv8 Quantum correlations do not imply instant causation – August 12, 2016 Excerpt: A research team led by a Heriot-Watt scientist has shown that the universe is even weirder than had previously been thought. In 2015 the universe was officially proven to be weird. After many decades of research, a series of experiments showed that distant, entangled objects can seemingly interact with each other through what Albert Einstein famously dismissed as “Spooky action at a distance”. 
A new experiment by an international team led by Heriot-Watt’s Dr Alessandro Fedrizzi has now found that the universe is even weirder than that: entangled objects do not cause each other to behave the way they do. http://phys.org/news/2016-08-quantum-imply-instant-causation.html
Experimental test of nonlocal causality – August 10, 2016
Looking beyond space and time to cope with quantum theory – 29 October 2012 Excerpt: “Our result gives weight to the idea that quantum correlations somehow arise from outside spacetime, in the sense that no story in space and time can describe them,” http://www.quantumlah.org/highlight/121029_hidden_influences.php
Verses:
John 1:1-4 In the beginning was the Word, and the Word was with God, and the Word was God. He was in the beginning with God. All things were made through Him, and without Him nothing was made that was made. In Him was life, and the life was the light of men. Mark 8:37 “Is anything worth more than your soul?”
bornagain77
In fact, numerical simulations and empirical evidence alike tell us that “Genetic Entropy”, the tendency of biological systems to drift towards decreasing complexity and decreasing information content, holds true as the overriding rule for biology over long periods of time.
“The First Rule of Adaptive Evolution”: Break or blunt any functional coded element whose loss would yield a net fitness gain – Michael Behe – December 2010 Excerpt: In its most recent issue The Quarterly Review of Biology has published a review by myself of laboratory evolution experiments of microbes going back four decades.,,, The gist of the paper is that so far the overwhelming number of adaptive (that is, helpful) mutations seen in laboratory evolution experiments are either loss or modification of function. Of course we had already known that the great majority of mutations that have a visible effect on an organism are deleterious. Now, surprisingly, it seems that even the great majority of helpful mutations degrade the genome to a greater or lesser extent.,,, I dub it “The First Rule of Adaptive Evolution”: Break or blunt any functional coded element whose loss would yield a net fitness gain. http://behe.uncommondescent.com/2010/12/the-first-rule-of-adaptive-evolution/ Can Purifying Natural Selection Preserve Biological Information? – May 2013 – Paul Gibson, John R. Baumgardner, Wesley H. Brewer, John C. Sanford In conclusion, numerical simulation shows that realistic levels of biological noise result in a high selection threshold. This results in the ongoing accumulation of low-impact deleterious mutations, with deleterious mutation count per individual increasing linearly over time. Even in very long experiments (more than 100,000 generations), slightly deleterious alleles accumulate steadily, causing eventual extinction. These findings provide independent validation of previous analytical and simulation studies [2–13]. Previous concerns about the problem of accumulation of nearly neutral mutations are strongly supported by our analysis. 
Indeed, when numerical simulations incorporate realistic levels of biological noise, our analyses indicate that the problem is much more severe than has been acknowledged, and that the large majority of deleterious mutations become invisible to the selection process.,,, http://www.worldscientific.com/doi/pdf/10.1142/9789814508728_0010 Genetic Entropy – references to several peer reviewed numerical simulations analyzing and falsifying all flavors of Darwinian evolution,, (via John Sanford and company) http://www.geneticentropy.org/#!properties/ctzx
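The linear accumulation the Gibson/Sanford excerpt describes can be illustrated with a toy simulation. This is a deliberately stripped-down sketch, not the Mendel’s Accountant program the cited authors used; the function names and parameter values are illustrative only. The one assumption it encodes is the paper’s own: mutations whose fitness effects fall below the selection threshold accumulate as if neutral, so the mean count per individual grows linearly with generation number.

```python
import math
import random

def poisson(rng, lam):
    """Draw from Poisson(lam) via Knuth's multiplication method."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def simulate_mutation_accumulation(pop_size=1000, generations=100,
                                   mutation_rate=1.0, seed=42):
    """Toy model: each generation, every individual gains a Poisson-
    distributed number of mutations whose effects sit below the
    selection threshold.  Selection removes none of them, so the mean
    count per individual grows linearly (~ mutation_rate * generations)."""
    rng = random.Random(seed)
    counts = [0] * pop_size
    for _ in range(generations):
        for i in range(pop_size):
            counts[i] += poisson(rng, mutation_rate)
    return sum(counts) / pop_size
```

With the defaults, the mean comes out close to 100 mutations per individual after 100 generations, i.e. the linear, selection-invisible accumulation described above.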
And whereas Darwinian evolution has no known universal constant or law of nature to appeal to so as to mathematically establish itself as a proper, testable science, (in fact it almost directly contradicts the second law of thermodynamics, i.e. entropy), Intelligent Design does not suffer from such a disconnect from physical reality. Specifically, Intelligent Design can appeal directly to ‘the laws of conservation of information’ (Dembski, Marks, etc.) in order to establish itself as a proper, testable, and rigorous science.
Evolutionary Computing: The Invisible Hand of Intelligence – June 17, 2015 Excerpt: William Dembski and Robert Marks have shown that no evolutionary algorithm is superior to blind search — unless information is added from an intelligent cause, which means it is not, in the Darwinian sense, an evolutionary algorithm after all. This mathematically proven law, based on the accepted No Free Lunch Theorems, seems to be lost on the champions of evolutionary computing. Researchers keep confusing an evolutionary algorithm (a form of artificial selection) with “natural evolution.” ,,, Marks and Dembski account for the invisible hand required in evolutionary computing. The Lab’s website states, “The principal theme of the lab’s research is teasing apart the respective roles of internally generated and externally applied information in the performance of evolutionary systems.” So yes, systems can evolve, but when they appear to solve a problem (such as generating complex specified information or reaching a sufficiently narrow predefined target), intelligence can be shown to be active. Any internally generated information is conserved or degraded by the law of Conservation of Information.,,, What Marks and Dembski (mathematically) prove is as scientifically valid and relevant as Gödel’s Incompleteness Theorem in mathematics. You can’t prove a system of mathematics from within the system, and you can’t derive an information-rich pattern from within the pattern.,,, http://www.evolutionnews.org/2015/06/evolutionary_co_1096931.html
And since Intelligent Design is mathematically based on the ‘law of conservation of information’, that makes Intelligent Design testable and potentially falsifiable, and thus makes Intelligent Design, unlike Darwinism, a rigorous science instead of an unfalsifiable pseudo-science:
The Law of Physicodynamic Incompleteness – David L. Abel Excerpt: “If decision-node programming selections are made randomly or by law rather than with purposeful intent, no non-trivial (sophisticated) function will spontaneously arise.” If only one exception to this null hypothesis were published, the hypothesis would be falsified. Falsification would require an experiment devoid of behind-the-scenes steering. Any artificial selection hidden in the experimental design would disqualify the experimental falsification. After ten years of continual republication of the null hypothesis with appeals for falsification, no falsification has been provided. The time has come to extend this null hypothesis into a formal scientific prediction: “No non trivial algorithmic/computational utility will ever arise from chance and/or necessity alone.” https://www.academia.edu/Documents/in/The_Law_of_Physicodynamic_Incompleteness The Origin of Information: How to Solve It – Perry Marshall Where did the information in DNA come from? This is one of the most important and valuable questions in the history of science. Cosmic Fingerprints has issued a challenge to the scientific community: “Show an example of Information that doesn’t come from a mind. All you need is one.” “Information” is defined as digital communication between an encoder and a decoder, using agreed upon symbols. To date, no one has shown an example of a naturally occurring encoding / decoding system, i.e. one that has demonstrably come into existence without a designer. A private equity investment group is offering a technology prize for this discovery (up to 3 million dollars). We will financially reward and publicize the first person who can solve this;,,, To solve this problem is far more than an object of abstract religious or philosophical discussion. It would demonstrate a mechanism for producing coding systems, thus opening up new channels of scientific discovery. 
Such a find would have sweeping implications for Artificial Intelligence research. http://cosmicfingerprints.com/solve/
Verse:
1 Thessalonians 5:21 but test everything; hold fast what is good.
bornagain77
As a supplemental note on universal constants and natural laws, the universe scientifically opened up for Newton when he deduced the gravitational constant and applied it to Kepler’s mathematical equation for planetary motion:
Mathematics: The Beautiful Language of the Universe - 23 Dec. 2015 Excerpt: Newton recognized that Kepler’s mathematical equation for planetary motion, Kepler’s 3rd Law (P² = A³), was purely based on empirical observation, and was only meant to measure what we observed within our solar system. Newton’s mathematical brilliance was in realizing that this basic equation could be made universal by applying a gravitational constant to the equation, which gave birth to perhaps one of the most important equations to ever be derived by mankind; Newton’s Version of Kepler’s Third Law.,,, With his understanding of mathematics, Newton was able to derive the aforementioned gravitational constant for all objects in the universe (G = 6.672×10⁻¹¹ N m² kg⁻²). This constant allowed him to unify astronomy and physics which then permitted predictions about how things moved in the universe.,,, http://www.universetoday.com/120681/mathematics-the-beautiful-language-of-the-universe/
And the gravitational constant again appears in Einstein's mathematical description of gravity, i.e. General Relativity:
The gravitational constant (also known as "universal gravitational constant", or as "Newton's constant"), denoted by the letter G, is an empirical physical constant involved in the calculation of gravitational effects in Sir Isaac Newton's law of universal gravitation and in Albert Einstein's general theory of relativity. Its value is approximately 6.674×10⁻¹¹ N·m²/kg².[1] https://en.wikipedia.org/wiki/Gravitational_constant
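As a concrete illustration of the step the Universe Today excerpt describes, here is a minimal sketch of Newton’s version of Kepler’s third law. The function name is my own; the values of G, the solar mass, and Earth’s semi-major axis are standard textbook figures:

```python
import math

G = 6.674e-11      # gravitational constant, N·m²/kg²
M_SUN = 1.989e30   # mass of the Sun, kg

def orbital_period_days(semi_major_axis_m, central_mass_kg=M_SUN):
    """Newton's version of Kepler's third law:
    T² = 4π² a³ / (G M), hence T = 2π · sqrt(a³ / (G M))."""
    t_seconds = 2 * math.pi * math.sqrt(
        semi_major_axis_m ** 3 / (G * central_mass_kg))
    return t_seconds / 86400.0  # convert seconds to days

# Earth's semi-major axis is about 1.496e11 m.
```

Feeding in Earth’s semi-major axis recovers a period of about 365 days, and the same formula, with the same G, applies to any body orbiting any mass, which is precisely the ‘universal’ step Newton supplied to Kepler’s purely empirical relation.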
Universal constants and/or natural laws appear in all of the fundamental equations that describe the universe. Here is a list of those fundamental equations:
How the Recent Discoveries Support a Designed Universe - Dr. Walter L. Bradley - paper Excerpt: Only in the 20th century have we come to fully understand that the incredibly diverse phenomena that we observe in nature are the outworking of a very small number of physical laws, each of which may be described by a simple mathematical relationship. Indeed, so simple in mathematical form and small in number are these physical laws that they can all be written on one side of one sheet of paper, as seen in Table 1. 1. Mechanics (Hamilton's Equations) 2. Electrodynamics (Maxwell's Equations) 3. Statistical Mechanics (Boltzmann's Equations) 4. Quantum Mechanics (Schrödinger's Equations) 5. General Relativity (Einstein's Equation) http://www.leaderu.com/offices/bradley/docs/scievidence.html
It is interesting to point out that, whilst all the equations that accurately describe the universe are based on universal constants or natural laws of some sort, the math of Darwinian evolution is not based on any universal constant or natural law of any sort.
The Evolution of Ernst: Interview with Ernst Mayr – 2004 Excerpt: biology (Darwinian Evolution) differs from the physical sciences in that in the physical sciences, all theories, I don’t know exceptions so I think it’s probably a safe statement, all theories are based somehow or other on natural laws. In biology, as several other people have shown, and I totally agree with them, there are no natural laws in biology corresponding to the natural laws of the physical sciences. http://www.scientificamerican.com/article/the-evolution-of-ernst-in/ WHAT SCIENTIFIC IDEA IS READY FOR RETIREMENT? Evolution is True – Roger Highfield – January 2014 Excerpt:,,, Whatever the case, those universal truths—’laws’—that physicists and chemists all rely upon appear relatively absent from biology. Little seems to have changed from a decade ago when the late and great John Maynard Smith wrote a chapter on evolutionary game theory for a book on the most powerful equations of science: his contribution did not include a single equation. http://www.edge.org/response-detail/25468 “It is our contention that if ‘random’ is given a serious and crucial interpretation from a probabilistic point of view, the randomness postulate is highly implausible and that an adequate scientific theory of evolution must await the discovery and elucidation of new natural laws—physical, physico-chemical, and biological.” Murray Eden, “Inadequacies of Neo-Darwinian Evolution as a Scientific Theory,” Mathematical Challenges to the Neo-Darwinian Interpretation of Evolution, editors Paul S. Moorhead and Martin M. Kaplan, June 1967, p. 109.
Without a universal constant or natural law to base its math on, Darwinian evolution is not testable (i.e. not potentially falsifiable by direct experiment), and therefore Darwinian evolution does not qualify as a proper science in the first place, but is more realistically classified as an unfalsifiable pseudoscience:
“On the other hand, I disagree that Darwin’s theory is as ‘solid as any explanation in science.’ Disagree? I regard the claim as preposterous. Quantum electrodynamics is accurate to thirteen or so decimal places; so, too, general relativity. A leaf trembling in the wrong way would suffice to shatter either theory. What can Darwinian theory offer in comparison?” – Berlinski, D., “A Scientific Scandal?: David Berlinski & Critics,” Commentary, July 8, 2003 Deeper into the Royal Society Evolution Paradigm Shift Meeting – 02/08/2016 Suzan Mazur: Peter Saunders in his interview comments to me said that neo-Darwinism is not a theory, it’s a paradigm and the reason it’s not a theory is that it’s not falsifiable. http://www.huffingtonpost.com/suzan-mazur/john-dupre-interview-deep_b_9184812.html Peter Saunders is Co-Director, Institute of Science in Society, London; Emeritus professor of Applied Mathematics, King’s College London. Peter Saunders has been applying mathematics in biology for over 40 years, in microbiology and physiology as well as in development and evolution. He has been a critic of neo-Darwinism for almost as long. “In so far as a scientific statement speaks about reality, it must be falsifiable; and in so far as it is not falsifiable, it does not speak about reality.” Karl Popper – The Two Fundamental Problems of the Theory of Knowledge (2014 edition), Routledge
To highlight the fact that Darwinian evolution, at least as Darwinists have it set up, is not a testable theory in the way other overarching theories of science are, it is worth pointing out that Darwinists are notorious for their ability to avoid and/or ignore falsification from direct empirical observation.
"Being an evolutionist means there is no bad news. If new species appear abruptly in the fossil record, that just means evolution operates in spurts. If species then persist for eons with little modification, that just means evolution takes long breaks. If clever mechanisms are discovered in biology, that just means evolution is smarter than we imagined. If strikingly similar designs are found in distant species, that just means evolution repeats itself. If significant differences are found in allied species, that just means evolution sometimes introduces new designs rapidly. If no likely mechanism can be found for the large-scale change evolution requires, that just means evolution is mysterious. If adaptation responds to environmental signals, that just means evolution has more foresight than was thought. If major predictions of evolution are found to be false, that just means evolution is more complex than we thought." ~ Cornelius Hunter
In fact, not only does evolution lack any universal constant or natural law to appeal to in order to base its math on, as other overarching theories of science have, but entropy, a law with great mathematical explanatory power in science, almost directly contradicts Darwinian claims that increases in functional complexity and/or information can be easily had:
Why Tornadoes Running Backward do not Violate the Second Law – Granville Sewell – May 2012 Excerpt: So, how does the spontaneous rearrangement of matter on a rocky, barren, planet into human brains and spaceships and jet airplanes and nuclear power plants and libraries full of science texts and novels, and supercomputers running partial differential equation solving software , represent a less obvious or less spectacular violation of the second law—or at least of the fundamental natural principle behind this law—than tornados turning rubble into houses and cars? Can anyone even imagine a more spectacular violation? https://uncommondesc.wpengine.com/intelligent-design/why-tornados-running-backward-do-not-violate-the-second-law/
bornagain77
Moreover, if we rightly let the Agent causality of God ‘back’ into the picture of modern physics, as the Christian founders of modern science originally envisioned, (Newton, Maxwell, Faraday, and Planck, among others), then an empirically backed reconciliation between Quantum Mechanics and General Relativity is readily achieved for us in Christ’s resurrection from the dead. Specifically, we have evidence that both Gravity and Quantum Mechanics were dealt with in Christ’s resurrection from the dead:
Shroud of Turin: From discovery of Photographic Negative, to 3D Information, to Quantum Hologram https://youtu.be/F-TL4QOCiis THE EVENT HORIZON (Space-Time Singularity) OF THE SHROUD OF TURIN. – Isabel Piczek – Particle Physicist Excerpt: We have stated before that the images on the Shroud firmly indicate the total absence of Gravity. Yet they also firmly indicate the presence of the Event Horizon. These two seemingly contradict each other and they necessitate the past presence of something more powerful than Gravity that had the capacity to solve the above paradox. http://shroud3d.com/findings/isabel-piczek-image-formation Turin shroud – (Particle Physicist explains event horizon) – video https://www.youtube.com/watch?v=HHVUGK6UFK8 The absorbed energy in the Shroud body image formation appears as contributed by discrete (quantum) values – Giovanni Fazio, Giuseppe Mandaglio – 2008 Excerpt: This result means that the optical density distribution,, can not be attributed at the absorbed energy described in the framework of the classical physics model. It is, in fact, necessary to hypothesize a absorption by discrete values of the energy where the ‘quantum’ is equal to the one necessary to yellow one fibril. http://cab.unime.it/journals/index.php/AAPP/article/view/C1A0802004/271 Astonishing discovery at Christ's tomb supports Turin Shroud - NOV 26TH 2016 Excerpt: The first attempts made to reproduce the face on the Shroud by radiation, used a CO2 laser which produced an image on a linen fabric that is similar at a macroscopic level. However, microscopic analysis showed a coloring that is too deep and many charred linen threads, features that are incompatible with the Shroud image. 
Instead, the results of ENEA “show that a short and intense burst of VUV directional radiation can color a linen cloth so as to reproduce many of the peculiar characteristics of the body image on the Shroud of Turin, including shades of color, the surface color of the fibrils of the outer linen fabric, and the absence of fluorescence”. 'However, Enea scientists warn, "it should be noted that the total power of VUV radiations required to instantly color the surface of linen that corresponds to a human of average height, body surface area equal to 17000 cm² (2000 MW/cm² × 17000 cm² = 34 thousand billion watts), makes it impractical today to reproduce the entire Shroud image using a single laser excimer, since this power cannot be produced by any VUV light source built to date (the most powerful available on the market come to several billion watts)”. Comment The ENEA study of the Holy Shroud of Turin concluded that it would take 34 Thousand Billion Watts of VUV radiations to make the image on the shroud. This output of electromagnetic energy remains beyond human technology. https://www.ewtn.co.uk/news/latest/astonishing-discovery-at-christ-s-tomb-supports-turin-shroud (Centrality Concerns) The Resurrection of Jesus Christ from Death as the “Theory of Everything” – video https://www.youtube.com/watch?v=8uHST2uFPQY&list=PLtAP1KN7ahia8hmDlCYEKifQ8n65oNpQ5&index=4
Verse and Music:
Colossians 1:15-20 The Son is the image of the invisible God, the firstborn over all creation. For in him all things were created: things in heaven and on earth, visible and invisible, whether thrones or powers or rulers or authorities; all things have been created through him and for him. He is before all things, and in him all things hold together. And he is the head of the body, the church; he is the beginning and the firstborn from among the dead, so that in everything he might have the supremacy. For God was pleased to have all his fullness dwell in him, and through him to reconcile to himself all things, whether things on earth or things in heaven, by making peace through his blood, shed on the cross. Selah - Light of the Stable https://www.youtube.com/watch?v=Xy5y9SkiwcE
bornagain77
In 23 wrossite claims:
"There are no other universes that we know of, and we have no idea what they might be like if they do exist,,,"
While wrossite is right to claim that we have no evidence for the multiverse that atheists have conjectured, wrossite is quite wrong in claiming that we have no evidence for any other 'universes' whatsoever. To put it another way, whereas atheists have no empirical evidence whatsoever that the epistemologically self-defeating multiverse is real, Theists have very strong evidence for their belief in a higher heavenly dimension and in a hellish dimension. This evidence comes from two of our strongest, most verified theories in science, i.e. from Special and General Relativity respectively:
Special and General Relativity compared to Heavenly and Hellish Near Death Experiences – video https://www.youtube.com/watch?v=TbKELVHcvSI&index=1&list=PLtAP1KN7ahia8hmDlCYEKifQ8n65oNpQ5
And whereas, special relativity, by ‘brushing infinity under the rug’, has been successfully unified with quantum theory to produce Quantum Electrodynamics,,,
Theories of the Universe: Quantum Mechanics vs. General Relativity Excerpt: The first attempt at unifying relativity and quantum mechanics took place when special relativity was merged with electromagnetism. This created the theory of quantum electrodynamics, or QED. It is an example of what has come to be known as relativistic quantum field theory, or just quantum field theory. QED is considered by most physicists to be the most precise theory of natural phenomena ever developed. In the 1960s and ’70s, the success of QED prompted other physicists to try an analogous approach to unifying the weak, the strong, and the gravitational forces. Out of these discoveries came another set of theories that merged the strong and weak forces called quantum chromodynamics, or QCD, and quantum electroweak theory, or simply the electroweak theory, which you’ve already been introduced to. If you examine the forces and particles that have been combined in the theories we just covered, you’ll notice that the obvious force missing is that of gravity (i.e. General Relativity). http://www.infoplease.com/cig/theories-universe/quantum-mechanics-vs-general-relativity.html THE INFINITY PUZZLE: Quantum Field Theory and the Hunt for an Orderly Universe Excerpt: In quantum electrodynamics, which applies quantum mechanics to the electromagnetic field and its interactions with matter, the equations led to infinite results for the self-energy or mass of the electron. After nearly two decades of effort, this problem was solved after World War II by a procedure called renormalization, in which the infinities are rolled up into the electron’s observed mass and charge, and are thereafter conveniently ignored. 
Richard Feynman, who shared the 1965 Nobel Prize with Julian Schwinger and Sin-Itiro Tomonaga for this breakthrough, referred to this sleight of hand as “brushing infinity under the rug.” http://www.americanscientist.org/bookshelf/pub/tackling-infinity Double Slit, Quantum-Electrodynamics, and Christian Theism – video https://www.facebook.com/philip.cunningham.73/videos/vb.100000088262100/1127450170601248/?type=2&theater
Yet no such mathematical ‘sleight of hand’ exists for general relativity. General relativity simply refuses to be mathematically unified with quantum mechanics. String theory is one of several attempts to, by hook or by crook, mathematically unify the two theories,
Unified field theory Excerpt: Gravity has yet to be successfully included in a theory of everything. Simply trying to combine the graviton with the strong and electroweak interactions runs into fundamental difficulties since the resulting theory is not renormalizable. Theoretical physicists have not yet formulated a widely accepted, consistent theory that combines general relativity and quantum mechanics. The incompatibility of the two theories remains an outstanding problem in the field of physics. Some theoretical physicists currently believe that a quantum theory of general relativity may require frameworks other than field theory itself, such as string theory or loop quantum gravity. https://en.wikipedia.org/wiki/Unified_field_theory#Current_status
Some theoretical physicists have remarked that this failure to unify Quantum Mechanics and General Relativity is ‘the collapse of physics as we know it’:
Quantum Mechanics & Relativity – Michio Kaku – The Collapse Of Physics As We Know It ? – video https://www.facebook.com/philip.cunningham.73/videos/vb.100000088262100/1190432337636364/?type=2&theater
In fact, mathematically, it simply does not follow that there should be a mathematical 'theory of everything'. Simply put, the belief that there should even be a 'theory of everything' is a metaphysical belief that is only firmly grounded within Theistic presuppositions:
The Limits Of Reason – Gregory Chaitin – 2006 Excerpt: “what Gödel discovered (in his incompleteness theorem) was just the tip of the iceberg: an infinite number of true mathematical theorems exist that cannot be proved from any finite system of axioms.” http://www.umcs.maine.edu/~chaitin/sciamer3.pdf Stephen Hawking’s “God-Haunted” Quest – December 24, 2014 Excerpt: Why in the world would a scientist blithely assume that there is or is even likely to be one unifying rational form to all things, unless he assumed that there is a singular, overarching intelligence that has placed it there? Why shouldn’t the world be chaotic, utterly random, meaningless? Why should one presume that something as orderly and rational as an equation would describe the universe’s structure? I would argue that the only finally reasonable ground for that assumption is the belief in an intelligent Creator, who has already thought into the world the very mathematics that the patient scientist discovers. http://www.evolutionnews.org/2014/12/stephen_hawking092351.html “So you think of physics in search of a “Grand Unified Theory of Everything”, Why should we even think there is such a thing? Why should we think there is some ultimate level of resolution? Right? It is part, it is a consequence of believing in some kind of design. Right? And there is some sense in which that however multifarious and diverse the phenomena of nature are, they are ultimately unified by the minimal set of laws and principles possible. In so far as science continues to operate with that assumption, there is a presupposition of design that is motivating the scientific process. Because it would be perfectly easy,, to stop the pursuit of science at much lower levels. 
You know understand a certain range of phenomena in a way that is appropriate to deal with that phenomena and just stop there and not go any deeper or any farther.”,,, You see, there is a sense in which there is design at the ultimate level, the ultimate teleology you might say, which provides the ultimate closure,,” Professor of philosophy Steve Fuller discusses intelligent design in Cambridge – Video – quoted at the 17:34 minute mark https://uncommondesc.wpengine.com/news/in-cambridge-professor-steve-fuller-discusses-why-the-hypothesis-of-intelligent-design-is-not-more-popular-among-scientists-and-others/
bornagain77
As to wrossite at 23, I think I hear echoes of Stenger. Here is a devastating critique of atheist Victor Stenger's 'no fine-tuning' argument:
The Fine-Tuning of the Universe for Intelligent Life - Dr. Luke A. Barnes, a post-doctoral researcher at the Institute for Astronomy, ETH Zurich, Switzerland http://arxiv.org/PS_cache/arxiv/pdf/1112/1112.4647v1.pdf COSMOLOGIST LUKE BARNES ANSWERS 11 OBJECTIONS TO THE FINE-TUNING ARGUMENT – Sept. 2016 https://winteryknight.com/2016/09/11/cosmologist-luke-barnes-answers-11-objections-to-the-fine-tuning-argument-4/
further notes:
Bayesian considerations on the multiverse explanation of cosmic fine-tuning - V. Palonen Conclusions: ,,, The self-sampling assumption approach by Bostrom was shown to be inconsistent with probability theory. Several reasons were then given for favoring the ‘this universe’ (TU) approach and main criticisms against TU were answered. A formal argument for TU was given based on our present knowledge. The main result is that even under a multiverse we should use the proposition “this universe is fine-tuned” as data, even if we do not know the ‘true index’ 14 of our universe. It follows that because multiverse hypotheses do not predict fine-tuning for this particular universe any better than a single universe hypothesis, multiverse hypotheses are not adequate explanations for fine-tuning. Conversely, our data on cosmic fine-tuning does not lend support to the multiverse hypotheses. For physics in general, irrespective of whether there really is a multiverse or not, the common-sense result of the above discussion is that we should prefer those theories which best predict (for this or any universe) the phenomena we observe in our universe. http://arxiv.org/ftp/arxiv/papers/0802/0802.4013.pdf Infinitely wrong - Dr. Sheldon - November 2010 Excerpt: So you see, they gleefully cry, even [1 / 10^(10^123)] x infinity = 1! Even the most improbable events can be certain if you have an infinite number of tries.,,,Ahh, but does it? I mean, zero divided by zero is not one, nor is 1/infinity x infinity = 1. Why? Well for starters, it assumes that the two infinities have the same cardinality. https://web.archive.org/web/20140817050357/http://rbsp.info/PROCRUSTES/infinitely-wrong/
bornagain77
Moreover, most atheists do not seem to realize that if the universal constants were actually found to have even a small variance in them over the course of the universe's history, as is presupposed in the 'random' metaphysics of atheism, then this would destroy our ability to practice science, for it would undermine our ability to mathematically model the universe in a reliable fashion. For example, if the speed of light, or the strong nuclear force (the invisible glue that holds nuclei together), varied, then E = mc² would be totally useless to us as a reliable description of reality. Please note the chaos that would ensue if even a very small variance were found in the universal constants:
Scientists Question Nature’s Fundamental Laws – Michael Schirber – 2006 Excerpt: “There is absolutely no reason these constants should be constant,” says astronomer Michael Murphy of the University of Cambridge. “These are famous numbers in physics, but we have no real reason for why they are what they are.”,,, The observed differences are small-roughly a few parts in a million-but the implications are huge (if they hold up): The laws of physics would have to be rewritten, not to mention we might need to make room for six more spatial dimensions than the three that we are used to.”,,, The speed of light, for instance, might be measured one day with a ruler and a clock. If the next day the same measurement gave a different answer, no one could tell if the speed of light changed, the ruler length changed, or the clock ticking changed. http://www.space.com/2613-scientists-question-nature-fundamental-laws.html
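To make the preceding point concrete, here is a minimal sketch of why even a parts-per-million drift in a constant would matter: rest energy scales as c², so a fractional change d in the speed of light shifts every computed energy by roughly 2d. The function names are illustrative; the value of c is the standard defined figure, and the few-parts-per-million drift is the hypothetical raised in the excerpt above:

```python
C = 299_792_458.0  # speed of light in m/s (currently defined as exact)

def rest_energy(mass_kg, c=C):
    """E = m c^2"""
    return mass_kg * c ** 2

def relative_energy_shift(fractional_c_drift):
    """Fractional change in E = m c^2 if c drifts by the given fraction:
    (1 + d)^2 - 1 = 2d + d^2, which is ~2d for small d."""
    e0 = rest_energy(1.0)
    e1 = rest_energy(1.0, C * (1 + fractional_c_drift))
    return (e1 - e0) / e0

# A drift of a few parts per million in c would shift every
# mass-energy calculation by roughly twice that fraction.
```

The practical consequence is the one the Schirber excerpt draws: with no fixed c there is no fixed yardstick, and every measurement that leans on E = mc² becomes ambiguous.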
Furthermore, both Einstein and Wigner are on record regarding it as an epistemological miracle that we can even apply our mathematical intuition to the physics of the universe in the first place.
"You find it strange that I consider the comprehensibility of the world (to the extent that we are authorized to speak of such a comprehensibility) as a miracle or as an eternal mystery. Well, a priori, one should expect a chaotic world, which cannot be grasped by the mind in any way .. the kind of order created by Newton's theory of gravitation, for example, is wholly different. Even if a man proposes the axioms of the theory, the success of such a project presupposes a high degree of ordering of the objective world, and this could not be expected a priori. That is the 'miracle' which is constantly reinforced as our knowledge expands." Albert Einstein - Letters to Solovine - New York, Philosophical Library, 1987 The Unreasonable Effectiveness of Mathematics in the Natural Sciences - Eugene Wigner - 1960 Excerpt: ,,certainly it is hard to believe that our reasoning power was brought, by Darwin's process of natural selection, to the perfection which it seems to possess.,,, It is difficult to avoid the impression that a miracle confronts us here, quite comparable in its striking nature to the miracle that the human mind can string a thousand arguments together without getting itself into contradictions, or to the two miracles of the existence of laws of nature and of the human mind's capacity to divine them.,,, The miracle of the appropriateness of the language of mathematics for the formulation of the laws of physics is a wonderful gift which we neither understand nor deserve. We should be grateful for it and hope that it will remain valid in future research and that it will extend, for better or for worse, to our pleasure, even though perhaps also to our bafflement, to wide branches of learning. http://www.dartmouth.edu/~matc/MathDrama/reading/Wigner.html Mathematics and Physics – A Happy Coincidence? – William Lane Craig – video https://www.youtube.com/watch?v=BF25AA4dgGg 1. If God did not exist the applicability of mathematics would be a happy coincidence. 2. 
The applicability of mathematics is not a happy coincidence. 3. Therefore, God exists.
Moreover, the fact that man actually possesses a 'transcendent' mathematical intuition in the first place is actually fairly strong evidence that man possesses a mind/soul that is not reducible to any possible materialistic explanation.
An Interview with David Berlinski - Jonathan Witt Berlinski: There is no argument against religion that is not also an argument against mathematics. Mathematicians are capable of grasping a world of objects that lies beyond space and time…. Interviewer:… Come again(?) … Berlinski: No need to come again: I got to where I was going the first time. The number four, after all, did not come into existence at a particular time, and it is not going to go out of existence at another time. It is neither here nor there. Nonetheless we are in some sense able to grasp the number by a faculty of our minds. Mathematical intuition is utterly mysterious. So for that matter is the fact that mathematical objects such as a Lie Group or a differentiable manifold have the power to interact with elementary particles or accelerating forces. But these are precisely the claims that theologians have always made as well – that human beings are capable by an exercise of their devotional abilities to come to some understanding of the deity; and the deity, although beyond space and time, is capable of interacting with material objects. http://tofspot.blogspot.com/2013/10/found-upon-web-and-reprinted-here.html "Either mathematics is too big for the human mind, or the human mind is more than a machine." - Kurt Gödel As quoted in Topoi : The Categorial Analysis of Logic (1979) by Robert Goldblatt, p. 13 "the intellect (is) immaterial and immortal. If today’s naturalists do not wish to agree with that, there is a challenge for them. ‘Don’t tell me, show me’: build an artificial intelligence system that imitates genuine mathematical insight. There seem to be no promising plans on the drawing board.,,," James Franklin is professor of mathematics at the University of New South Wales in Sydney.
But to continue with the main point of the thread: universal constants that do not vary are a thoroughly Theistic presupposition, certainly not an atheistic one. In fact, the first major unification in science came when the Christian founders of modern science, particularly Newton, realized that the same force that caused an apple to fall at the Earth's surface, gravity, was also responsible for holding the Moon in orbit about the Earth, i.e. when Newton realized that the law of gravity was 'universal'.
"The first major unification in physics was Sir Isaac Newton’s realization that the same force that caused an apple to fall at the Earth’s surface—gravity—was also responsible for holding the Moon in orbit about the Earth. This universal force would also act between the planets and the Sun, providing a common explanation for both terrestrial and astronomical phenomena."
https://www.learner.org/courses/physics/unit/text.html?unit=3&secNum=3

The God Particle: Not the God of the Gaps, But the Whole Show – Aug. 2012
Excerpt: C. S. Lewis put it this way: “Men became scientific because they expected law in nature and they expected law in nature because they believed in a lawgiver.”
http://www.christianpost.com/news/the-god-particle-not-the-god-of-the-gaps-but-the-whole-show-80307/

The Genius and Faith of Faraday and Maxwell – Ian H. Hutchinson – 2014
Conclusion: Lawfulness was not, in their thinking, inert, abstract, logical necessity, or complete reducibility to Cartesian mechanism; rather, it was an expectation they attributed to the existence of a divine lawgiver. These men’s insights into physics were made possible by their religious commitments. For them, the coherence of nature resulted from its origin in the mind of its Creator.
http://www.thenewatlantis.com/publications/the-genius-and-faith-of-faraday-and-maxwell

“Our monotheistic traditions reinforce the assumption that the universe is at root a unity, that is not governed by different legislation in different places.” - John D. Barrow
At the 28:09 minute mark of the following video, Dr. Hugh Ross speaks of the seven places in the Bible that speak of unchanging universal constants.
Symposium 2015 : Scientific Evidence For God's Existence - Hugh Ross - video https://youtu.be/4mEKZRm1xXg?t=1689
Here is one verse
Psalm 119:89-91 Your eternal word, O Lord, stands firm in heaven. Your faithfulness extends to every generation, as enduring as the earth you created. Your regulations remain true to this day, for everything serves your plans.
In fact, 'random chance' has always lain at the heart of atheistic metaphysics. And 'random chance', as atheists usually use it in Darwinian evolution, is more or less synonymous with a miracle:
Pauli’s ideas on mind and matter in the context of contemporary science – Harald Atmanspacher Excerpt: “In discussions with biologists I met large difficulties when they apply the concept of ‘natural selection’ in a rather wide field, without being able to estimate the probability of the occurrence in a empirically given time of just those events, which have been important for the biological evolution. Treating the empirical time scale of the evolution theoretically as infinity they have then an easy game, apparently to avoid the concept of purposesiveness. While they pretend to stay in this way completely ‘scientific’ and ‘rational,’ they become actually very irrational, particularly because they use the word ‘chance’, not any longer combined with estimations of a mathematically defined probability, in its application to very rare single events more or less synonymous with the old word ‘miracle.’” Wolfgang Pauli (pp. 27-28) http://www.igpp.de/english/tda/pdf/paulijcs8.pdf
Although 'random chance', as atheists usually use it in Darwinian evolution, is more or less synonymous with a miracle, in physics 'random chance' shows up at the 'boundary conditions', 'piled up like a high wall and forming a boundary–a beginning of time–which we cannot climb over'
"The dilemma is this: Surveying our surroundings, we find them to be far from a “fortuitous concourse of atoms”. The picture of the world, as drawn in existing physical theories shows arrangements of the individual elements for which the odds are multillions to 1 against an origin by chance. Some people would like to call this non-random feature of the world purpose or design; but I will call it non-committally anti-chance. ... Accordingly, we sweep anti-chance out of the laws of physics–out of the differential equations. Naturally, therefore, it reappears in the boundary conditions, for it must be got into the scheme somewhere. By sweeping it far enough away from the sphere of our current physical problems, we fancy we have got rid of it. It is only when some of us are so misguided as to try to get back billions of years into the past that we find the sweepings all piled up like a high wall and forming a boundary–a beginning of time–which we cannot climb over.” – Arthur Eddington
Thus wrossite, while you may pretend that the fine-tuned universal laws are just what they are and of no big concern for the atheist, the fact of the matter is that your own atheistic metaphysics contradicts you every step of the way. You simply do not have a leg to stand on, metaphysically or scientifically, in presupposing that the constants are simply what they are. bornagain77
wrossite, you claim: "But the fact remains that we have no idea whether or not (or how much) a parameter could vary. We don’t know what variation is possible, and therefore we don’t know the probability of a thing varying." This is too funny. Who put you up to this? First, the constants are, as far as we can tell, arbitrary: they are not, as Weinberg himself admitted, mathematically necessary, nor, given Big Bang cosmology, can they be considered eternally self-existent.
“All the evidence we have says that the universe had a beginning.” - Cosmologist Alexander Vilenkin of Tufts University in Boston, in a paper delivered at atheist Stephen Hawking's 70th birthday party (characterized as 'Worst Birthday Present Ever') – January 2012

The Fine-Tuning of Nature’s Laws - Luke A. Barnes - Fall 2015
Excerpt: Today, our deepest understanding of the laws of nature is summarized in a set of equations. Using these equations, we can make very precise calculations of the most elementary physical phenomena, calculations that are confirmed by experimental evidence. But to make these predictions, we have to plug in some numbers that cannot themselves be calculated but are derived from measurements of some of the most basic features of the physical universe. These numbers specify such crucial quantities as the masses of fundamental particles and the strengths of their mutual interactions. After extensive experiments under all manner of conditions, physicists have found that these numbers appear not to change in different times and places, so they are called the fundamental constants of nature....
... The results of all our investigations into the fundamental building blocks of matter and energy are summarized in the Standard Model of particle physics, which is essentially one long, imposing equation. Within this equation, there are twenty-six constants, describing the masses of the fifteen fundamental particles, along with values needed for calculating the forces between them, and a few others....
... Compared to the range of possible masses that the particles described by the Standard Model could have, the range that avoids these kinds of complexity-obliterating disasters is extremely small. Imagine a huge chalkboard, with each point on the board representing a possible value for the up and down quark masses. If we wanted to color the parts of the board that support the chemistry that underpins life, and have our handiwork visible to the human eye, the chalkboard would have to be about ten light years (a hundred trillion kilometers) high....
http://www.thenewatlantis.com/publications/the-fine-tuning-of-natures-laws
Second, the atheistic presupposition IS that the universal constants would vary, not that they would hold fixed. Indeed, many prominent atheists try to escape the implications of the fine-tuning of the universe by invoking an infinity of random universes with vastly different universal laws, so as to 'explain away' the infinitesimal life-permitting range in which we find the constants. How infinite randomness supposedly generates unchanging universal laws for each universe in the hypothetical infinite multiverse ensemble is a miracle in its own right, a miracle kicked back to some imaginary universe-generating mechanism, one that requires even more extraordinary fine-tuning than the finely-tuned universal laws themselves did:
The Fine Tuning of the Universe (Gravity, Expansion, and Imaginary Universe Generating Machine) - animated video https://www.youtube.com/watch?v=5okFVrLdADk
To further highlight the fact that the finely-tuned universal constants are not necessary, atheists themselves have invoked 'inflation' to try to 'explain away' why the universe is as round and flat as it is.
BRUCE GORDON: Hawking’s irrational arguments – October 2010 Excerpt: ,,,The physical universe is causally incomplete and therefore neither self-originating nor self-sustaining. The world of space, time, matter and energy is dependent on a reality that transcends space, time, matter and energy. This transcendent reality cannot merely be a Platonic realm of mathematical descriptions, for such things are causally inert abstract entities that do not affect the material world,,, Rather, the transcendent reality on which our universe depends must be something that can exhibit agency – a mind that can choose among the infinite variety of mathematical descriptions and bring into existence a reality that corresponds to a consistent subset of them. This is what “breathes fire into the equations and makes a universe for them to describe.” Anything else invokes random miracles as an explanatory principle and spells the end of scientific rationality. Nowhere is this destructive consequence more evident than in the machinations of multiverse cosmology to “explain” cosmological fine-tuning. Cosmic inflation is invoked to “explain” why our universe is so flat and its background radiation so uniform. All possible solutions of string theory are invoked to “explain” the incredible fine-tuning of the cosmological constant. But the evidence for cosmic inflation is both thin and equivocal; the evidence for string theory and its extension, M-theory, is nonexistent; and the idea that conjoining them demonstrates that we live in a multiverse of bubble universes with different laws and constants is a mathematical fantasy. What is worse, multiplying without limit the opportunities for any event to happen in the context of a multiverse – where it is alleged that anything can spontaneously jump into existence without cause – produces a situation in which no absurdity is beyond the pale. 
For instance, we find multiverse cosmologists debating the “Boltzmann Brain” problem: In the most “reasonable” models for a multiverse, it is immeasurably more likely that our consciousness is associated with a brain that has spontaneously fluctuated into existence in the quantum vacuum than it is that we have parents and exist in an orderly universe with a 13.7 billion-year history. This is absurd. The multiverse hypothesis is therefore falsified because it renders false what we know to be true about ourselves. Clearly, embracing the multiverse idea entails a nihilistic irrationality that destroys the very possibility of science. Universes do not “spontaneously create” on the basis of abstract mathematical descriptions, nor does the fantasy of a limitless multiverse trump the explanatory power of transcendent intelligent design. What Mr. Hawking’s contrary assertions show is that mathematical savants can sometimes be metaphysical simpletons. Caveat emptor. http://www.washingtontimes.com/news/2010/oct/1/hawking-irrational-arguments/
As Dr. Gordon further highlights in the following article, the atheistic conjecture of inflation self destructs:
A Matter of Considerable Gravity: On the Purported Detection of Gravitational Waves and Cosmic Inflation - Bruce Gordon - April 4, 2014 Excerpt: Thirdly, at least two paradoxes result from the inflationary multiverse proposal that suggest our place in such a multiverse must be very special: the "Boltzmann Brain Paradox" and the "Youngness Paradox." In brief, if the inflationary mechanism is autonomously operative in a way that generates a multiverse, then with probability indistinguishable from one (i.e., virtual necessity) the typical observer in such a multiverse is an evanescent thermal fluctuation with memories of a past that never existed (a Boltzmann brain) rather than an observer of the sort we take ourselves to be. Alternatively, by a second measure, post-inflationary universes should overwhelmingly have just been formed, which means that our existence in an old universe like our own has a probability that is effectively zero (i.e., it's nigh impossible). So if our universe existed as part of such a multiverse, it would not be at all typical, but rather infinitely improbable (fine-tuned) with respect to its age and compatibility with stable life-forms. http://www.evolutionnews.org/2014/04/a_matter_of_con084001.html
Moreover, besides the catastrophic epistemological failure that atheists encounter when they try to 'explain away' with inflation why the universe is as round and as flat as it is, the Bible actually predicted, first, that the universe is expanding; second, that the CMBR is round; and third, that the universe is flat:
Hugh Ross PhD. - Scientific Evidence For Cosmological Constant (1 in 10^120 Expansion Of The Universe) video 23:12 minute mark https://youtu.be/fTP01yi-SSU?t=1392
Here are the verses from the Bible which Dr. Ross listed, which were written well over 2000 years before the discovery of the finely tuned expansion of the universe, that speak of God 'Stretching out the Heavens'; Job 9:8; Isaiah 40:22; Isaiah 44:24; Isaiah 48:13; Zechariah 12:1; Psalm 104:2; Isaiah 42:5; Isaiah 45:12; Isaiah 51:13; Jeremiah 51:15; Jeremiah 10:12. The following verse is my favorite out of the group of verses:
Job 9:8 He alone stretches out the heavens and treads on the waves of the sea.
With the discovery of the Cosmic Microwave Background Radiation (CMBR), the universe is found to actually be a circular sphere
Planck satellite unveils the Universe -- now and then (w/ Video showing the mapping of the 'sphere' of the Cosmic Microwave Background Radiation with the satellite) - 2010 http://phys.org/news197534140.html#nRlv
The Bible predicted the universe to be a circular sphere thousands of years before it was discovered by modern science:
Proverbs 8:26-27 While as yet He had not made the earth or the fields, or the primeval dust of the world. When He prepared the heavens, I was there, when He drew a circle on the face of the deep, Job 26:10 He has inscribed a circle on the face of the waters at the boundary between light and darkness.
Moreover, the universe is found to be exceptionally flat:
"The Universe today is actually very close to the most unlikely state of all, absolute flatness. And that means it must have been born in an even flatter state, as Dicke and Peebles, two of the Princeton astronomers involved in the discovery of the 3 K background radiation, pointed out in 1979. Finding the Universe in a state of even approximate flatness today is even less likely than finding a perfectly sharpened pencil balancing on its point for millions of years, for, as Dicke and Peebles pointed out, any deviation of the Universe from flatness in the Big Bang would have grown, and grown markedly, as the Universe expanded and aged. Like the pencil balanced on its point and given the tiniest nudges, the Universe soon shifts away from perfect flatness." ~ John Gribbin, In Search of the Big Bang

Yes, the world (universe) really is flat - December 8, 2016
Excerpt: The universe has all sorts of deformations in space-time where it varies from the perfectly flat. Any place where there’s mass or energy, there’s a corresponding bending of space-time — that’s General Relativity 101. So a couple light beams would naturally collide inside a wandering black hole, or bend along weird angles after encountering a galaxy or two. But average all those small-scale effects out and look at the big picture. When we examine very old light — say, the cosmic microwave background — that has been traveling the universe for more than 13.8 billion years, we get a true sense of the universe’s shape. And the answer, as far as we can tell, to within an incredibly small margin of uncertainty, is that the universe is flat... but there are also no laws of physics that predict or restrict the topology.
https://uncommondesc.wpengine.com/intelligent-design/yes-the-world-really-is-flat/
And again, the Bible predicted this exceptional flatness for the universe thousands of years before it was discovered by modern science:
Job 38:4-5 “Where were you when I laid the earth’s foundation? Tell me, if you understand. Who marked off its dimensions? Surely you know! Who stretched a measuring line across it?
To further highlight the fact that atheists presuppose variance in the universal constants, atheist Leonard Susskind, at the 7:19 minute mark of the following video, postulates that the laws within the 'unobservable' parts of this universe could very well be vastly different from what we measure for the constants 'locally'.
Leonard Susskind – Is the Universe Fine-Tuned for Life and Mind? https://www.youtube.com/watch?feature=player_detailpage&v=2cT4zZIHR3s#t=439
bornagain77
Silver Asiatic. Nope. I actually feel quite the opposite. The probability of our universe, so far as we know, is one. There are no other universes that we know of, and we have no idea what they might be like if they do exist. I'm not sure what you're getting at with this idea of changing relations over space. Could you expand on that? W wrossite
wrossite Ok, @21 you seem open to the possibility that there could be some other arrangements of matter than what exists at present. We do see change and movement in the universe. For the question of precise distance between objects, we know matter can change positions. Actually, it would be more extraordinary (asking the same question as SB in 19) to find all attributes in a fixed, unchanging (unchangeable) relationship. Silver Asiatic
StephenB "What about the macro relationship between the 20 constants, which requires almost unbelievable precision?" In my blog, I acknowledge this as an interesting and potentially valid form of the argument. We have no reason to think there should be coherence between those things, and yet there is a precise coherence. You'll get no beef from me on that right now. W wrossite
One more time. "We are arguing from what we do know (i.e. infinitesimal variance equals no life)." I've already agreed that an infinitesimal variance equals no life. But the fact remains that we have no idea whether or not (or how much) a parameter could vary. We don't know what variation is possible, and therefore we don't know the probability of a thing varying. W wrossite
Dr. Rossiter: What about the macro relationship between the 20 constants, which requires almost unbelievable precision? Even if I grant you the most conservative or least number of possibilities at the micro level that you would care to assume (the individual constants), the probability that all 20 of them will be perfectly balanced at the macro level is so low that a fine tuner is clearly indicated. StephenB
We are arguing from what we do know (i.e. infinitesimal variance equals no life). You are arguing from what you don't know (i.e. you don't know what a parameter could possibly be). Again, this does not surprise me, since atheists have always retreated to 'ignorance of the gaps' arguments to try to cover up embarrassing empirical findings. Darwinists today rely, as Darwin himself relied, on 'ignorance of the gaps' arguments:
The Evolutionary Argument from Ignorance - Cornelius Hunter - December 1, 2016 Excerpt: The authors argue that the underlying patterns of the genetic code are not likely to be due to "chance coupled with presumable evolutionary pathways" (P-value < 10^-13), and conclude that they are "essentially irreducible to any natural origin." A common response from evolutionists, when presented with evidence such as this, is that we still don't understand biology very well. This argument from ignorance goes all the way back to Darwin. http://www.evolutionnews.org/2016/12/the_evolutionar_1103329.html
To reiterate, it would be hard to fathom a more unscientific worldview than Darwinian evolution and atheistic materialism/naturalism in general have turned out to be. Of related note: at least Steven Weinberg himself was honest enough to admit 'the fix' that atheists are in:
Quote: “I don’t think one should underestimate the fix we are in. That in the end we will not be able to explain the world. That we will have some set of laws of nature (that) we will not be able to derive them on the grounds simply of mathematical consistency. Because we can already think of mathematically consistent laws that don’t describe the world as we know it. And we will always be left with a question ‘why are the laws of nature what they are rather than some other laws?’. And I don’t see any way out of that. The fact that the constants of nature are suitable for life, which is clearly true, we observe...” (Weinberg then comments on the multiverse conjecture of atheists) “No one has constructed a theory in which that is true. I mean, the (multiverse) theory would be speculative, but we don’t even have a theory in which that speculation is mathematically realized. But it is a possibility.” - Steven Weinberg, as stated to Richard Dawkins at the 8:15 minute mark of the following video
Leonard Susskind – Richard Dawkins and Steven Weinberg – 1 in 10^120 – Cosmological Constant points to intelligent design – video
https://youtu.be/z4E_bT4ecgk?t=495
Please note how Steven Weinberg uses 'the fix' to describe his predicament as an atheist in regards to fine-tuning. He uses it as if he is trying to escape God altogether. Such a term, 'the fix', hardly conveys the impression of an unbiased examination of the evidence at hand, but instead conveys an impression of a priori philosophical, even emotional, bias against God. Again, very unscientific. bornagain77
Bornagain77. "If you vary a certain parameter by just an infinitesimal fraction, in either direction, life as we know it would not exist." Compare that to what I actually wrote: "To say that such-and-such a physical parameter rests upon a razor’s edge does tell us something. It tells us that any small change in the setting of that parameter would lead to a universe drastically different from the one we live in, and likely one that could never even produce material objects (let alone life) as we understand it. Fair enough. I agree." My concern is that you haven't actually read my argument yet. I go on, "What it doesn’t tell us is how likely any of those other settings are." That is, to say that the settings for the parameters rest upon a razor's edge is a statement of precision and (in)flexibility. But it is not a statement about the probability of that thing, because we don't know what (or even that) a parameter could be. We only know what it is. And, in my blog, I demonstrate that I'm not alone in making this observation. W wrossite
Silver Asiatic. "Yes there is an assumption that the known number of possible cases is a representative sample of the whole." See, for the moment I reject that statement. What are the "known number of possible cases"? Do we KNOW of any other setting for a given parameter? Do we KNOW that any such setting exists, or even could exist? W wrossite
It is the non-theist, not the theist, who introduces the notions of probability and chance to explain the apparent design we observe in the universe around us. From a naturalistic perspective the universe's apparent design is an unsolved problem if not a conundrum. Theists, on the other hand, have an explanation. In other words, if the universe appears to be designed, maybe it is because it really is. I discuss some of the philosophical implications of fine tuning more here: https://uncommondesc.wpengine.com/intelligent-design/scientists-driven-to-teleological-view-of-the-cosmos/#comment-622191 john_a_designer
wrossite, I really don't know what you are on about. Fine-tuning is really straightforward. As the references I highlighted illustrated, if you vary a certain parameter by just an infinitesimal fraction, in either direction, life as we know it would not exist. Under atheistic materialism this fine-tuning is surprising, since atheists held that the universe did not have life in mind and that life was ultimately an accident of time and chance. Yet under Theism this is not surprising, since we hold that God created the universe with life, and particularly human life, in mind. Moreover it is found, when scrutinizing the details of physics and chemistry, that not only is the universe fine-tuned for carbon-based life, but it is specifically fine-tuned to be of particular benefit to life like human life (R. Collins, M. Denton). This is a further finding that is, again, expected under Theism and surprising under Atheism. It seems to me that, instead of dealing forthrightly with the evidence we have in hand, you are trying to retreat into a realm of imaginary possibilities. An atheist retreating into a realm of imaginary possibilities is not surprising to me. Atheists have a long history of claiming anything they can possibly imagine as being equivalent to an empirical finding of science. To make this point clear, I repeat a section of a post that I wrote earlier: Let us be VERY clear about the fact that ALL of science, every discipline within science, is dependent on basic Theistic presuppositions about the rational intelligibility of the universe and the ability of our mind to comprehend that rational intelligibility. Modern science was born from, and continues to be dependent on, those basic Theistic presuppositions:
Science and Theism: Concord, not Conflict* – Robert C. Koons IV. The Dependency of Science Upon Theism (Page 21) Excerpt: Far from undermining the credibility of theism, the remarkable success of science in modern times is a remarkable confirmation of the truth of theism. It was from the perspective of Judeo-Christian theism—and from the perspective alone—that it was predictable that science would have succeeded as it has. Without the faith in the rational intelligibility of the world and the divine vocation of human beings to master it, modern science would never have been possible, and, even today, the continued rationality of the enterprise of science depends on convictions that can be reasonably grounded only in theistic metaphysics. http://www.robkoons.net/media/69b0dd04a9d2fc6dffff80b3ffffd524.pdf
Moreover, if we cast aside those basic Theistic presuppositions about the rational intelligibility of the universe and the ability of our mind to comprehend that rational intelligibility, and instead try to use naturalism, i.e. methodological naturalism, as our basis for understanding the universe and for practicing science, then everything within that atheistic/naturalistic worldview (i.e. supposed evidence for Darwinian evolution, observations of reality, beliefs about reality, sense of self, free will, even reality itself) collapses into self-refuting, unrestrained flights of fantasy and imagination.
Darwinian evolution, and atheism/naturalism in general, are built entirely upon a framework of illusions and fantasy Excerpt: Thus, basically, without God, everything within the atheistic/naturalistic worldview, (i.e. supposed evidence for Darwinian evolution, observations of reality, beliefs about reality, sense of self, free will, even reality itself), collapses into self refuting, unrestrained, flights of fantasy and imagination. It would be hard to fathom a more unscientific worldview than Darwinian evolution and Atheistic materialism/naturalism in general have turned out to be. Scientists should definitely stick with the worldview that brought them to the dance! i.e Christianity! https://docs.google.com/document/d/1Q94y-QgZZGF0Q7HdcE-qdFcVGErhWxsVKP7GOmpKD6o/edit
To reiterate, it would be hard to fathom a more unscientific worldview than Darwinian evolution and atheistic materialism/naturalism in general have turned out to be. Verses:
2 Corinthians 10:5 Casting down imaginations, and every high thing that exalteth itself against the knowledge of God, and bringing into captivity every thought to the obedience of Christ; 2 Peter 1:16 For we have not followed cunningly devised fables, when we made known unto you the power and coming of our Lord Jesus Christ, but were eyewitnesses of his majesty.
bornagain77
Wayne
So then, I simply ask, what is the total number of equipossible cases for any given parameter used in the arguments for cosmological fine-tuning? From where will we conjure such a number or estimate?
We take what is known and extrapolate from that. Yes there is an assumption that the known number of possible cases is a representative sample of the whole, but that is the best knowledge we have at present. So, fine-tuning is "the best inference given the data we have". It's similar to the inference that the universe has an origin in time and is therefore finite in space and dimension. Estimates are built off of that. Yes, what is unknown could, perhaps, throw all those calculations out the window, but until then, we can arrive at "the best inference" available now. And that's fine-tuning. Silver Asiatic
"Thus, under this extended definition, to say that the probability of the parameters of physics falling into the life-permitting value is very improbable simply means that the ratio of life-permitting values to the range of possible values is very, very small." This is what I'm talking about. This assumes the ability to calculate probabilities for such-and-such a parameter. But to do so requires that we can define "probability in terms of the ratio of the number of 'favorable cases' to the total number of equipossible cases." So then, I simply ask, what is the total number of equipossible cases for any given parameter used in the arguments for cosmological fine-tuning? From where will we conjure such a number or estimate? That's really my entire beef with cosmological fine-tuning. All the best, Wayne. P.S. Bornagain77 posted a bunch of videos. Ironically, they make precisely the missteps I describe in my blog. Craig, in particular, has no problem assuming a random selection of parameters drawn from a field of possibilities defined by the precision of a given parameter... and then immediately mocks such a step when the multiverse folks do it. wrossite
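[Editor's note: the arithmetic both sides are debating above can be made concrete with a short sketch. The function and the numbers below are purely hypothetical, chosen only to illustrate the point at issue: under the "ratio of favorable cases to equipossible cases" definition of probability quoted in this exchange, the probability assigned to a life-permitting constant is wholly determined by the assumed range of possible values, which is precisely the quantity Rossiter says we do not know.]

```python
# Illustrative sketch only: all numbers are hypothetical, not measured
# physical values. Under the classical "ratio of favorable cases to
# equipossible cases" definition, the probability assigned to a
# life-permitting constant depends entirely on the assumed range of
# possible values -- the very quantity under dispute in this thread.

def fine_tuning_probability(life_permitting_width: float,
                            assumed_possible_range: float) -> float:
    """Ratio of the life-permitting interval to the assumed possible range."""
    return life_permitting_width / assumed_possible_range

# The same life-permitting window (arbitrary units) looks certain,
# merely unlikely, or astronomically improbable, depending solely on
# how wide we assume the space of "equipossible" values to be:
window = 1e-6
for assumed_range in (1e-6, 1.0, 1e6):
    print(assumed_range, fine_tuning_probability(window, assumed_range))
```

The three printed ratios span roughly twelve orders of magnitude even though the measured window never changes, which is the crux of the disagreement: the numerator is an empirical finding, while the denominator is a modeling assumption.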
We don’t know the circumstances and laws which preceded the universe. Perhaps those circumstances and laws were such that the coming into existence of the universe, as it is, was guaranteed. However, this only pushes the fine-tuning problem back one step: how do we explain the fine-tuning of circumstances and laws which produced our universe? One thing is for sure, “chance” is no explanation. BTW if the universe is all there is, then the universe is neither in time nor in space. If so, it is meaningless to say that the universe is “old” or “large”, since these are relative concepts and there is nothing out there as a comparison. Origenes
Of related note to biology and fine-tuning:
William Bialek: More Perfect Than We Imagined - March 23, 2013
Excerpt: photoreceptor cells that carpet the retinal tissue of the eye and respond to light, are not just good or great or phabulous at their job. They are not merely exceptionally impressive by the standards of biology, with whatever slop and wiggle room the animate category implies. Photoreceptors operate at the outermost boundary allowed by the laws of physics, which means they are as good as they can be, period. Each one is designed to detect and respond to single photons of light — the smallest possible packages in which light comes wrapped. “Light is quantized, and you can’t count half a photon,” said William Bialek, a professor of physics and integrative genomics at Princeton University. “This is as far as it goes.” … Scientists have identified and mathematically anatomized an array of cases where optimization has left its fastidious mark, among them;,, the precision response in a fruit fly embryo to contouring molecules that help distinguish tail from head;,,, In each instance, biophysicists have calculated, the system couldn’t get faster, more sensitive or more efficient without first relocating to an alternate universe with alternate physical constants.
http://darwins-god.blogspot.com/2013/03/william-bialek-more-perfect-than-we.html

Study suggests humans can detect even the smallest units of light - July 21, 2016
Excerpt: Research,, has shown that humans can detect the presence of a single photon, the smallest measurable unit of light. Previous studies had established that human subjects acclimated to the dark were capable only of reporting flashes of five to seven photons.,,, "it is remarkable: a photon, the smallest physical entity with quantum properties of which light consists, is interacting with a biological system consisting of billions of cells, all in a warm and wet environment," says Vaziri. "The response that the photon generates survives all the way to the level of our awareness despite the ubiquitous background noise. Any man-made detector would need to be cooled and isolated from noise to behave the same way.",,, The gathered data from more than 30,000 trials demonstrated that humans can indeed detect a single photon incident on their eye with a probability significantly above chance. "What we want to know next is how does a biological system achieve such sensitivity? How does it achieve this in the presence of noise?"
http://phys.org/news/2016-07-humans-smallest.html

The Fine-Tuning for Discoverability - Robin Collins - March 22, 2014
Excerpt: The most dramatic confirmation of the discoverability/livability optimality thesis (DLO) is the dependence of the Cosmic Microwave Background Radiation (CMB) on the baryon to photon ratio.,,, ...the intensity of CMB depends on the photon to baryon ratio, (η_b), which is the ratio of the average number of photons per unit volume of space to the average number of baryons (protons plus neutrons) per unit volume. At present this ratio is approximately a billion to one (10^9), but it could be anywhere from one to infinity; it traces back to the degree of asymmetry in matter and anti-matter right after the beginning of the universe – for approximately every billion particles of antimatter, there was a billion and one particles of matter.,,, The only livability effect this ratio has is on whether or not galaxies can form that have near-optimal livability zones. As long as this condition is met, the value of this ratio has no further effects on livability. Hence, the DLO predicts that within this range, the value of this ratio will be such as to maximize the intensity of the CMB as observed by typical observers. According to my calculations – which have been verified by three other physicists -- to within the margin of error of the experimentally determined parameters (~20%), the value of the photon to baryon ratio is such that it maximizes the CMB. This is shown in Figure 1 below. (pg. 13) It is easy to see that this prediction could have been disconfirmed. In fact, when I first made the calculations in the fall of 2011, I made a mistake and thought I had refuted this thesis since those calculations showed the intensity of the CMB maximizes at a value different than the photon-baryon ratio in our universe. So, not only does the DLO lead us to expect this ratio, but it provides an ultimate explanation for why it has this value,,, This is a case of a teleological thesis serving both a predictive and an ultimate explanatory role.,,,
http://home.messiah.edu/~rcollins/Fine-tuning/Greer-Heard%20Forum%20paper%20draft%20for%20posting.pdf

1 trillionth of a trillionth (10^-24) Fine tuning of Light, Atmosphere, and Water to Photosynthesis (etc..) - video (2016)
https://youtu.be/NIwZqDkrj9I

Water's quantum weirdness makes life possible - October 2011
Excerpt: WATER'S life-giving properties exist on a knife-edge. It turns out that life as we know it relies on a fortuitous, but incredibly delicate, balance of quantum forces.,,, They found that the hydrogen-oxygen bonds were slightly longer than the deuterium-oxygen ones, which is what you would expect if quantum uncertainty was affecting water’s structure. “No one has ever really measured that before,” says Benmore. We are used to the idea that the cosmos’s physical constants are fine-tuned for life. Now it seems water’s quantum forces can be added to this “just right” list.
http://www.newscientist.com/article/mg21228354.900-waters-quantum-weirdness-makes-life-possible.html

Protein Folding: One Picture Per Millisecond Illuminates The Process - 2008
Excerpt: The RUB-chemists initiated the folding process and then monitored the course of events. It turned out that within less than ten milliseconds, the motions of the water network were altered as well as the protein itself being restructured. “These two processes practically take place simultaneously,“ Prof. Havenith-Newen states, “they are strongly correlated.“ These observations support the yet controversial suggestion that water plays a fundamental role in protein folding, and thus in protein function, and does not stay passive.
http://www.sciencedaily.com/releases/2008/08/080805075610.htm

Water Is 'Designer Fluid' That Helps Proteins Change Shape - 2008
Excerpt: "When bound to proteins, water molecules participate in a carefully choreographed ballet that permits the proteins to fold into their functional, native states. This delicate dance is essential to life."
http://www.sciencedaily.com/releases/2008/08/080806113314.htm
bornagain77
A Fortunate Universe: Life in a Finely Tuned Cosmos Mung
The fine-tuning of the toilet paper has crossed my desk... How can we be sure it is "the most fine-tuned toilet paper" and the best it can be? Well, I have few complaints about the current popular toilet papers. They are shitty! J-Mac
From: http://www.discovery.org/a/91
iii. The Meaning of Probability

In the last section we used the principle of indifference to rigorously justify the claim that the fine-tuning is highly improbable under the atheistic single-universe hypothesis. We did not explain, however, what it could mean to say that it is improbable, especially given that the universe is a unique, unrepeatable event. To address this issue, we shall now show how the probability invoked in the fine-tuning argument can be straightforwardly understood either as what could be called classical probability or as what is known as epistemic probability.

Classical Probability

The classical conception of probability defines probability in terms of the ratio of the number of "favorable cases" to the total number of equipossible cases. (See Weatherford, chapter 2.) Thus, for instance, to say the probability of a die coming up "4" is 1/6 is simply to say that the number of ways a die could come up "4" is 1/6 the number of equipossible ways it could come up. Extending this definition to the continuous case, classical probability can be defined in terms of the relevant ratio of ranges, areas, or volumes over which the principle of indifference applies. Thus, under this extended definition, to say that the probability of the parameters of physics falling into the life-permitting value is very improbable simply means that the ratio of life-permitting values to the range of possible values is very, very small. Finally, notice that this definition of probability implies the principle of indifference, and thus we can be certain that the principle of indifference holds for classical probability.

Epistemic Probability

Epistemic probability is a widely-recognized type of probability that applies to claims, statements, and hypotheses--that is, what philosophers call propositions.
(12) Roughly, the epistemic probability of a proposition can be thought of as the degree of credence--that is, degree of confidence or belief--we rationally should have in the proposition. Put differently, epistemic probability is a measure of our rational degree of belief under a condition of ignorance concerning whether a proposition is true or false. For example, when one says that the special theory of relativity is probably true, one is making a statement of epistemic probability. After all, the theory is actually either true or false. But, we do not know for sure whether it is true or false, so we say it is probably true to indicate that we should put more confidence in its being true than in its being false. It is also commonly argued that the probability of a coin toss is best understood as a case of epistemic probability. Since the side the coin will land on is determined by the laws of physics, it is argued that our assignment of probability is simply a measure of our rational expectations concerning which side the coin will land on. Besides epistemic probability simpliciter, philosophers also speak of what is known as the conditional epistemic probability of one proposition on another. (A proposition is any claim, assertion, statement, or hypothesis about the world). The conditional epistemic probability of a proposition R on another proposition S--written as P(R/S)--can be defined as the degree to which the proposition S of itself should rationally lead us to expect that R is true. For example, there is a high conditional probability that it will rain today on the hypothesis that the weatherman has predicted a 100% chance of rain, whereas there is a low conditional probability that it will rain today on the hypothesis that the weatherman has predicted only a 2% chance of rain. 
That is, the hypothesis that the weatherman has predicted a 100% chance of rain today should strongly lead us to expect that it will rain, whereas the hypothesis that the weatherman has predicted a 2% chance should lead us to expect that it will not rain. Under the epistemic conception of probability, therefore, the statement that the fine-tuning of the Cosmos is very improbable under the atheistic single-universe hypothesis makes perfect sense: it is to be understood as making a statement about the degree to which the atheistic single-universe hypothesis would or should, of itself, rationally lead us to expect the cosmic fine-tuning.(13)

Conclusion

The above discussion shows that we have at least two ways of understanding the improbability invoked in our main argument: as classical probability or as epistemic probability. This undercuts the common atheist objection that it is meaningless to speak of the probability of the fine-tuning under the atheistic single-universe hypothesis, since under this hypothesis the universe is not a repeatable event.
Barry Arrington
Fine Tuning pulls the rug out from underneath the Blind Watchmaker. Whoops. Then it rearranges the furniture in his office. Ouch. Biology, especially Atheist Biology, is having a tough time dealing with Fine Tuning. Anathema. ppolish
Some of the fundamentals of Darwinian Evolution, as I understand them, are: The complexities of life we see all around us, and within us, are assembled from the bottom up in a Natural Selection process which chooses beneficial mutations among a long series of such changes, while allowing less beneficial changes to wither away, or perhaps to remain as flotsam or “junk.” The resulting “designs” we see from such a process are merely illusions, the appearance of design … not actual design as we see in all of the human artifacts we dwell among, such as the automobile and computers. Evolution is said to be without purpose, without direction and without goals. What we may see as purpose, direction and goals are simply the result of the workings of natural processes – simply illusions, the appearance of design.

___________________

So then why do we see purpose, direction and goals at every level of life – from the cellular level, to the systems level, to the completed body plan? We see purpose in the various machines and structures within each of the several trillion cells in our bodies. We see the Kinesin motor transporting cargo from one place in the cell to another. We see the marvel of DNA which, coupled with other cellular components, represents not only a massive mass-storage capability, but also a type of blueprint package defining all aspects of the end-product body. This DNA package also contains what can be described as a complete set of “shop travelers” which, much like a manufacturing process, provide step-by-step instructions and bills of materials for the manufacture of the myriad parts making up the completed human body – bones, hair, brain, liver, eye, nose … and more. And each of these subunits exhibits purpose -- specific purpose.

What is finally assembled as an arm and hand, for example, takes on a myriad of functional purposes, such as accurately throwing a baseball, playing a musical instrument such as a violin, and cradling a newborn baby. Each of our vital organs plays a specific and necessary role in keeping our body alive and functioning – there are goals and purpose expressed in each and every one of our body parts. What we see and experience in the finished goal-directed and purposeful human body is beautifully expressed in many ways, such as when we witness a magnificent choral and orchestral performance like Handel’s Messiah. What we experience in that concert hall is not an illusion -- it is real, and it is the culmination of a multitude of designs, both in the natural realm and in the realm of human intelligence and ingenuity. ayearningforpublius
Very interesting indeed:
"On the one hand, it’s amazing to think that any slight variation in this value would massively alter the structure of our universe. But, an equally important question is, what is the likelihood that this value could’ve been slightly different? We have no idea what the range of values for G (Newton’s constant) could be, nor the probability function for it. It seems to me that it’s simply a misstep to suppose that the precision (as in, “one part in 10^28”) is somehow a statement about the probability of a thing being other than it is. The precision tells us that changing the value slightly would be catastrophic, but it doesn’t tell us the likelihood of that value actually being other than it is."
This seems like a very logical conclusion:
"[...] what is the probability that any constant or variable setting for the universe could be other than it is? I don’t think science can give us an empirical answer. In other words, is the universe fine-tuned? Yes. Is the fine-tuning highly improbable? It seems we don’t (perhaps can’t) know."
Even I can see the point. :) Don't recall who said that we should refrain from over-interpreting or under-interpreting any statement. Or something like that. Does anybody remember the exact quote? Thank you. What I do remember is that we should test everything and hold only what is good. It's written in the main reference to the ultimate source of true wisdom that is available to us now. That's priceless. Other things you can buy with Visa or MasterCard. :) Dionisio
Many individual constants are of such a high degree of precision as to defy human imitation, vastly exceeding in precision the most precise man-made machine (which is approx. 1 in 10^22 for a gravity wave detector). For example, the cosmological constant (dark energy) is balanced to 1 part in 10^120,,,
The 2 most dangerous numbers in the universe are threatening the end of physics - Jan. 14, 2016 Excerpt: Dangerous No. 2: The strength of dark energy ,,, you should be able to sum up all the energy of empty space to get a value representing the strength of dark energy. And although theoretical physicists have done so, there's one gigantic problem with their answer: "Dark energy should be 10^120 times stronger than the value we observe from astronomy," Cliff said. "This is a number so mind-boggling huge that it's impossible to get your head around ... this number is bigger than any number in astronomy — it's a thousand-trillion-trillion-trillion times bigger than the number of atoms in the universe. That's a pretty bad prediction." On the bright side, we're lucky that dark energy is smaller than theorists predict. If it followed our theoretical models, then the repulsive force of dark energy would be so huge that it would literally rip our universe apart. The fundamental forces that bind atoms together would be powerless against it and nothing could ever form — galaxies, stars, planets, and life as we know it would not exist. http://finance.yahoo.com/news/two-most-dangerous-numbers-universe-194557366.html What is the cosmological constant paradox, and what is its significance? David H. Bailey – 1 Jan 2015 Excerpt: Curiously, this observation is in accord with a prediction made by physicist Steven Weinberg in 1987, who argued from basic principles that the cosmological constant must be zero to within one part in roughly 10^120, or else the universe either would have dispersed too fast for stars and galaxies to have formed, or else would have recollapsed upon itself long ago [Susskind2005, pg. 
80-82].,,, In short, the recent discovery of the accelerating expansion of the universe and the implied slightly positive value of the cosmological constant constitutes, in the words of physicist Leonard Susskind (who is an atheist), a “cataclysm,” a “stunning reversal of fortunes” [Susskind2005, pg., 22, 154]. It is literally shaking the entire field of theoretical physics, astronomy and cosmology to its foundations.,,, http://www.sciencemeetsreligion.org/physics/cosmo-constant.php
At the 8:15 minute mark of the following video, Richard Dawkins is set straight by Steven Weinberg, who is an atheist himself, on just how big the 'problem' of the 1 in 10^120 Cosmological Constant is:
Quote: “I don’t think one should underestimate the fix we are in. That in the end we will not be able to explain the world. That we will have some set of laws of nature (that) we will not be able to derive them on the grounds simply of mathematical consistency. Because we can already think of mathematically consistent laws that don’t describe the world as we know it. And we will always be left with a question ‘why are the laws nature what they are rather than some other laws?’. And I don’t see any way out of that. The fact that the constants of nature are suitable for life, which is clearly true, we observe,,,” (Weinberg then comments on the multiverse conjecture of atheists) “No one has constructed a theory in which that is true. I mean,, the (multiverse) theory would be speculative, but we don’t even have a theory in which that speculation is mathematically realized. But it is a possibility.” Steven Weinberg – as stated to Richard Dawkins at the 8:15 minute mark of the following video Leonard Susskind - Richard Dawkins and Steven Weinberg - 1 in 10^120 - Cosmological Constant points to intelligent design - video https://youtu.be/z4E_bT4ecgk?t=495
Whereas, the mass density of the universe is balanced to 'only' 1 part in 10^60. Nonetheless, 1 in 10^60 equates to just a single grain of sand out of the entire universe:
Evidence for Belief in God - Rich Deem Excerpt: Isn't the immense size of the universe evidence that humans are really insignificant, contradicting the idea that a God concerned with humanity created the universe? It turns out that the universe could not have been much smaller than it is in order for nuclear fusion to have occurred during the first 3 minutes after the Big Bang. Without this brief period of nucleosynthesis, the early universe would have consisted entirely of hydrogen. Likewise, the universe could not have been much larger than it is, or life would not have been possible. If the universe were just one part in 10^59 larger, the universe would have collapsed before life was possible. Since there are only 10^80 baryons in the universe, this means that an addition of just 10^21 baryons (about the mass of a grain of sand) would have made life impossible. The universe is exactly the size it must be for life to exist at all. http://www.godandscience.org/apologetics/atheismintro2.html
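The grain-of-sand figure in the Deem excerpt above is simple arithmetic and can be checked directly: one part in 10^59 of the universe's cited ~10^80 baryons is 10^21 baryons. A quick sanity check (the baryon count and tolerance are taken from the quote itself, not independently verified):

```python
# Sanity-check of the arithmetic in the Deem excerpt (figures from the quote).
baryons_in_universe = 10**80  # approximate baryon count cited in the excerpt
tolerance = 10**59            # "one part in 10^59" size tolerance cited

# One part in 10^59 of 10^80 baryons:
extra_baryons = baryons_in_universe // tolerance
print(extra_baryons == 10**21)  # True: ~10^21 baryons, a grain of sand's worth
```

So the internal arithmetic of the excerpt is consistent; whether that tolerance translates into an improbability is, as the OP argues, a separate question about the range of possible values.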
Whereas Gravity is balanced to 'only' 1 in 10^40. Nonetheless, at the 4:45 minute mark of the following video, Dr. Bruce Gordon comments that varying the gravitational constant by just one inch, on an imaginary ruler that stretched across the entire universe, would either increase or decrease our weight by a trillion fold:
Contemporary Physics and God Part 2 Dr Bruce Gordon - video https://www.youtube.com/watch?v=ff_sNyGNSko
You can see a visualization of that imaginary ruler stretched across the universe in the following video,,,
Finely Tuned Gravity (1 in 10^40 tolerance; which is just one inch of tolerance allowed on a imaginary ruler stretching across the diameter of the entire universe) – (27:32 minute mark) video https://www.youtube.com/watch?feature=player_detailpage&v=ajqH4y8G0MI#t=1652
And although many constants are fine-tuned to such a degree as to put to shame the highest tolerance of any man-made machine, (which is, again, approx. 1 in 10^22 for a gravity wave detector), one particular initial condition of the universe is fine-tuned to such an extraordinary degree that it actually drives Atheistic Materialism into complete epistemological failure. In the following article and video, William Lane Craig explains the epistemological failure that results for Atheistic Materialism from the initial 1 in 10^10^123 entropy of the universe:
Multiverse and the Design Argument - William Lane Craig Excerpt: Roger Penrose of Oxford University has calculated that the odds of our universe’s low entropy condition obtaining by chance alone are on the order of 1 in 10^10(123), an inconceivable number. If our universe were but one member of a multiverse of randomly ordered worlds, then it is vastly more probable that we should be observing a much smaller universe. For example, the odds of our solar system’s being formed instantly by the random collision of particles is about 1 in 10^10(60), a vast number, but inconceivably smaller than 1 in 10^10(123). (Penrose calls it “utter chicken feed” by comparison [The Road to Reality (Knopf, 2005), pp. 762-5]). Or again, if our universe is but one member of a multiverse, then we ought to be observing highly extraordinary events, like horses’ popping into and out of existence by random collisions, or perpetual motion machines, since these are vastly more probable than all of nature’s constants and quantities’ falling by chance into the virtually infinitesimal life-permitting range. Observable universes like those strange worlds are simply much more plenteous in the ensemble of universes than worlds like ours and, therefore, ought to be observed by us if the universe were but a random member of a multiverse of worlds. Since we do not have such observations, that fact strongly disconfirms the multiverse hypothesis. On naturalism, at least, it is therefore highly probable that there is no multiverse. — Penrose puts it bluntly “these world ensemble hypothesis are worse than useless in explaining the anthropic fine-tuning of the universe”. http://www.reasonablefaith.org/multiverse-and-the-design-argument Does a Multiverse Explain the Fine Tuning of the Universe? - Dr. Craig (observer selection effect vs. Boltzmann Brains) - video https://www.youtube.com/watch?v=pb9aXduPfuA
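The size comparison in the Penrose quote above can be made concrete. Numbers of the form 10^(10^123) are far too large to represent directly, so the comparison has to be made on their base-10 logarithms, which is the sense in which Penrose calls 1 in 10^10^60 "utter chicken feed" next to 1 in 10^10^123. A minimal sketch of that comparison:

```python
# Comparing the double exponentials in the Penrose/Craig excerpt.
# 10^(10^123) overflows any float, so compare base-10 logarithms instead,
# using Python's arbitrary-precision integers.
log10_universe_odds = 10**123  # log10 of 1 in 10^10^123 (low-entropy universe)
log10_solar_odds = 10**60      # log10 of 1 in 10^10^60 (solar system by chance)

# The solar-system odds are less improbable by a factor of 10^(10^123 - 10^60);
# subtracting 10^60 barely dents 10^123, so the gap is essentially the
# whole 10^10^123 figure itself.
diff = log10_universe_odds - log10_solar_odds
print(diff > 10**122)  # True
```

This is only an illustration of the relative scale of the two cited numbers, not an endorsement of either probability assignment; as the OP notes, both presuppose a defined space of possible values.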
In the following video, Dr Bruce Gordon discusses the initial entropy of the universe in greater detail.
The Multiverse confirms the Ontological Argument for God - video https://youtu.be/MgDn_k11ups
To reiterate, the epistemological failure that the 1 in 10^10^123 initial entropy of the universe drives Atheistic Materialism into is also touched upon in the preceding video. Thus, in conclusion to the OP's question, as long as you don't mind forsaking rationality altogether, then I guess you can go ahead and hold onto your Atheistic belief that God did not fine-tune the universe. But, if you are a bit hesitant to give up sanity, as I am, then you are forced to accept the fact that God fine-tuned the universe to an extraordinary degree when He created it. Personally, I don't know why Atheists battle so hard to deny what is so obvious, so good, and so right. The fact that God really exists, and cares deeply for humanity, as is made evident through Jesus Christ, is simply the single most wonderful fact that a mortal human can possibly know. Verse:
2 Peter 1:16 For we have not followed cunningly devised fables, when we made known unto you the power and coming of our Lord Jesus Christ, but were eyewitnesses of his majesty.
bornagain77
I think it's very much the same thing as functional specified complexity. The precision fine tuning of many different initial conditions, constants, and natural laws necessary for life is strong evidence this is not merely chance. To use the icicle analogy, this is rather like dozens of them spelling out "Merry Christmas" on the side of your house. That didn't happen by accident, even if each individual shape can be explained as the product of natural law. anthropic
Well, it's the same kind of question as: "How can we really know that we know anything?" J-Mac
