Uncommon Descent Serving The Intelligent Design Community

Biology prof: How can we really know if the universe is fine-tuned?


From Waynesburg U biology prof Wayne Rossiter, author of Shadow of Oz: Theistic Evolution and the Absent God, a question about claims for fine tuning of the universe:

My major concern with arguments from fine-tuning in cosmology is, how do we really get from observations of precision to statements of probability? To say that something is precise is not to say that it is improbable. Those are two different things.

As a third quick analogy, if we studied the fall patterns of icicles from the roof of my home, we might find that their placement is incredibly precise. Given the vast surface area a given icicle could fall on (my yard, the road, my neighbor’s yard, etc.), the fact that they consistently fall within a very narrow area directly below the edge of the roof (they more or less fall straight down) seems absurdly precise. Absurdly precise, if it was logical to entertain the possibility of icicles falling in ways other than straight down. But the presence of gravity and the lack of strong winds make this highly precise phenomenon highly probable. Said plainly, it would be absurd to treat the falling of an icicle straight down and the falling of it laterally into my neighbor’s yard as equally likely.

But, I think that’s the sort of assumption being made in the argument from cosmological fine-tuning. To say that such-and-such a physical parameter rests upon a razor’s edge does tell us something. It tells us that any small change in the setting of that parameter would lead to a universe drastically different from the one we live in, and likely one that could never even produce material objects (let alone life) as we understand it. Fair enough. I agree. What it doesn’t tell us is how likely any of those other settings are. More.

Thoughts?

See also: Copernicus, you are not going to believe who is using your name. Or how.


Comments
A few quick questions: after this discussion about the probability of the fine-tuning parameters is over, when the dust has settled, are we going to have a valid explanation for the origin of biological systems? Does the fine-tuning alone resolve that problem? Must complex functional specified information be created? Can the fine-tuning create it? Is the fine-tuning a necessary condition? Is it sufficient? Thank you.
Dionisio
December 20, 2016 at 08:58 PM PDT
bornagain77,

Incidentally, the quantum Zeno effect apparently also suppresses quantum tunneling. http://phys.org/news/2015-10-zeno-effect-verifiedatoms-wont.html -Q
Querius
December 20, 2016 at 08:01 PM PDT
The reason why I am very impressed with the quantum Zeno effect is, to reiterate, that entropy is by a wide margin the most finely tuned of the initial conditions of the Big Bang: the odds of 1 in 10^10^123 are so extreme that the number could not be written out in longhand notation even if every particle in the universe were used to denote a digit of it. Another reason I am very impressed with the quantum Zeno effect is how foundational entropy is in its explanatory power for the actions within the space-time of the universe:
Shining Light on Dark Energy – October 21, 2012 Excerpt: It (Entropy) explains time; it explains every possible action in the universe;,, Even gravity, Vedral argued, can be expressed as a consequence of the law of entropy. ,,, The principles of thermodynamics are at their roots all to do with information theory. Information theory is simply an embodiment of how we interact with the universe —,,, http://crev.info/2012/10/shining-light-on-dark-energy/
And to make entropy even more personal, entropy is also the primary reason why our physical, temporal, bodies grow old and die,,,
Aging Process – 85 years in 40 seconds – video http://www.youtube.com/watch?v=A91Fwf_sMhk
* 3 new mutations every time a cell divides in your body
* The average cell of a 15-year-old has up to 6,000 mutations
* The average cell of a 60-year-old has 40,000 mutations
Reproductive cells are ‘designed’ so that, early on in development, they are ‘set aside’ and thus do not accumulate mutations as the rest of the cells of our bodies do. Regardless of this protective barrier against the accumulation of slightly detrimental mutations, still we find that
* 60–175 mutations are passed on to each new generation. (Per John Sanford)
Entropy Explains Aging, Genetic Determinism Explains Longevity, and Undefined Terminology Explains Misunderstanding Both - 2007 Excerpt: There is a huge body of knowledge supporting the belief that age changes are characterized by increasing entropy, which results in the random loss of molecular fidelity, and accumulates to slowly overwhelm maintenance systems [1–4].,,, http://www.plosgenetics.org/article/info%3Adoi/10.1371/journal.pgen.0030220
And yet, to repeat,,,
Quantum Zeno effect Excerpt: The quantum Zeno effect is,,, an unstable particle, if observed continuously, will never decay. per wiki
This is just fascinating! Why in blue blazes should conscious observation put a freeze on entropic decay, unless consciousness was/is more foundational to reality than the 1 in 10^10^123 entropy is? In fact, when including other lines of evidence from quantum mechanics, we have a compelling argument for God from consciousness. Putting all the lines of evidence together the argument for God from consciousness can now be framed like this:
1. Consciousness either preceded all of material reality or is an ‘epiphenomenon’ of material reality.
2. If consciousness is an ‘epiphenomenon’ of material reality, then consciousness will be found to have no special position within material reality. Conversely, if consciousness precedes material reality, then consciousness will be found to have a special position within material reality.
3. Consciousness is found to have a special, even a central, position within material reality.
4. Therefore, consciousness is found to precede material reality.
Five intersecting lines of experimental evidence from quantum mechanics that show that consciousness precedes material reality (Double Slit, Wigner’s Quantum Symmetries, Wheeler’s Delayed Choice, Leggett’s Inequalities, Quantum Zeno effect): https://docs.google.com/document/d/1uLcJUgLm1vwFyjwcbwuYP0bK6k8mXy-of990HudzduI/edit
Verses, Video and Music:
Romans 8:18-21 I consider that our present sufferings are not worth comparing with the glory that will be revealed in us. The creation waits in eager expectation for the sons of God to be revealed. For the creation was subjected to frustration, not by its own choice, but by the will of the one who subjected it, in hope that the creation itself will be liberated from its bondage to decay and brought into the glorious freedom of the children of God.
Psalm 102:25-27 Of old You laid the foundation of the earth, And the heavens are the work of Your hands. They will perish, but You will endure; Yes, they will all grow old like a garment; Like a cloak You will change them, And they will be changed. But You are the same, And Your years will have no end.
"We have the sober scientific certainty that the heavens and earth shall ‘wax old as doth a garment’.... Dark indeed would be the prospects of the human race if unilluminated by that light which reveals ‘new heavens and a new earth.’" Sir William Thomson, Lord Kelvin (1824 – 1907) – pioneer in many different fields, particularly electromagnetism and thermodynamics.
The Resurrection of Jesus Christ as the 'Theory of Everything' (Entropic Concerns) - video https://www.youtube.com/watch?v=rqv4wVP_Fkc&index=2&list=PLtAP1KN7ahia8hmDlCYEKifQ8n65oNpQ5
Evanescence – The Other Side (Lyric Video) http://www.vevo.com/watch/evanescence/the-other-side-lyric-video/USWV41200024?source=instantsearch
bornagain77
December 20, 2016 at 07:31 PM PDT
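As an aside on the notation: the claim that 1 in 10^10^123 could never be written out in longhand is easy to check with elementary arithmetic. A minimal sketch, assuming the commonly cited estimate of roughly 10^80 particles in the observable universe:

```python
# The number 10^(10^123) has 10^123 + 1 decimal digits, while the observable
# universe is commonly estimated to contain only about 10^80 particles.
# Python's arbitrary-precision integers make the comparison exact.

digits_needed = 10**123 + 1   # digits required to write 10^(10^123) in longhand
particles = 10**80            # commonly cited particle-count estimate

# Even at one digit per particle, we are short by a factor of about 10^43.
shortfall = digits_needed // particles
print(shortfall == 10**43)  # True
```

So even conscripting every particle as a writing surface leaves the longhand expansion short by dozens of orders of magnitude.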
Well, I find the 1 in 10^10^123 initial entropy of the universe to be devastating for atheistic metaphysics from two different angles. First, the 1 in 10^10^123 event is so ‘exorbitantly improbable’ that it drives, (via only 'mediocre improbable' Boltzmann Brains), atheistic materialism into catastrophic epistemological failure. Second, it also, via quantum mechanics, provides fairly compelling evidence that consciousness must precede material reality. As to the first point:
The Physics of the Small and Large: What is the Bridge Between Them? Roger Penrose Excerpt: "The time-asymmetry is fundamentally connected with the Second Law of Thermodynamics: indeed, the extraordinarily special nature (to a greater precision than about 1 in 10^10^123, in terms of phase-space volume) can be identified as the "source" of the Second Law (Entropy)." http://irafs.org/irafs_1/cd_irafs02/texts/penrose.pdf
"The 'accuracy of the Creator's aim' would have had to be 1 in 10^10^123." Hawking, S. and Penrose, R., The Nature of Space and Time, Princeton, Princeton University Press (1996), 34, 35.
Multiverse and the Design Argument - William Lane Craig Excerpt: Roger Penrose of Oxford University has calculated that the odds of our universe’s low entropy condition obtaining by chance alone are on the order of 1 in 10^10(123), an inconceivable number. If our universe were but one member of a multiverse of randomly ordered worlds, then it is vastly more probable that we should be observing a much smaller universe. For example, the odds of our solar system’s being formed instantly by the random collision of particles is about 1 in 10^10(60), a vast number, but inconceivably smaller than 1 in 10^10(123). (Penrose calls it “utter chicken feed” by comparison [The Road to Reality (Knopf, 2005), pp. 762-5]). Or again, if our universe is but one member of a multiverse, then we ought to be observing highly extraordinary events, like horses’ popping into and out of existence by random collisions, or perpetual motion machines, since these are vastly more probable than all of nature’s constants and quantities’ falling by chance into the virtually infinitesimal life-permitting range. Observable universes like those strange worlds are simply much more plenteous in the ensemble of universes than worlds like ours and, therefore, ought to be observed by us if the universe were but a random member of a multiverse of worlds.
Since we do not have such observations, that fact strongly disconfirms the multiverse hypothesis. On naturalism, at least, it is therefore highly probable that there is no multiverse. — Penrose puts it bluntly: “these world ensemble hypotheses are worse than useless in explaining the anthropic fine-tuning of the universe”. http://www.reasonablefaith.org/multiverse-and-the-design-argument
Does a Multiverse Explain the Fine Tuning of the Universe? - Dr. Craig (observer selection effect vs. Boltzmann Brains) - video https://www.youtube.com/watch?v=pb9aXduPfuA
Thus, if you believe you live in a rational universe and are not a 'Boltzmann brain', then you are forced to believe it is 'exorbitantly' more likely that Theism is true. As to the second point, an unstable particle, if observed continuously, will never decay. This is known as the Quantum Zeno Effect:
Quantum Zeno Effect The quantum Zeno effect is,, an unstable particle, if observed continuously, will never decay. http://en.wikipedia.org/wiki/Quantum_Zeno_effect
Interaction-free measurements by quantum Zeno stabilization of ultracold atoms – 14 April 2015 Excerpt: In our experiments, we employ an ultracold gas in an unstable spin configuration, which can undergo a rapid decay. The object—realized by a laser beam—prevents this decay because of the indirect quantum Zeno effect and thus, its presence can be detected without interacting with a single atom. http://www.nature.com/ncomms/2015/150414/ncomms7811/full/ncomms7811.html?WT.ec_id=NCOMMS-20150415
“It has been experimentally confirmed,, that unstable particles will not decay, or will decay less rapidly, if they are observed. Somehow, observation changes the quantum system. We’re talking pure observation, not interacting with the system in any way.” Douglas Ell – Counting to God – pg. 189 – 2014 – Douglas Ell graduated early from MIT, where he double majored in math and physics. He then obtained a masters in theoretical mathematics from the University of Maryland. After graduating from law school, magna cum laude, he became a prominent attorney.
'Zeno effect' verified: Atoms won't move while you watch - Oct. 22, 2015 Excerpt: One of the oddest predictions of quantum theory – that a system can’t change while you’re watching it,,, Graduate students Yogesh Patil and Srivatsan Chakram created and cooled a gas of about a billion Rubidium atoms inside a vacuum chamber and suspended the mass between laser beams. In that state the atoms arrange in an orderly lattice just as they would in a crystalline solid. But at such low temperatures the atoms can “tunnel” from place to place in the lattice. ,,, The researchers demonstrated that they were able to suppress quantum tunneling merely by observing the atoms. http://www.news.cornell.edu/stories/2015/10/zeno-effect-verified-atoms-wont-move-while-you-watch
bornagain77
December 20, 2016 at 07:31 PM PDT
Bornagain77: Would 1 in 10^10^123 initial entropy be considered ‘exorbitantly improbable’?
That depends. It is not at all improbable in the context of 'eternal inflation' — in fact nothing is. There may very well be compelling arguments against eternal inflation, but improbability of events is not one of them.
Origenes
December 20, 2016 at 05:50 PM PDT
as to: "My only beef is with the assertion that the observed values are exorbitantly improbable. I simply want ID folks and apologists to stop making those statements. They seem almost entirely unfounded. That is all."

Would 1 in 10^10^123 initial entropy be considered 'exorbitantly improbable'? :) I see no reason to stop arguing that 1 in 10^10^123 initial entropy is 'exorbitantly improbable'. It certainly is. After having spent a few days trying to see if there is any real meat to your criticism, I still think you are tilting at windmills. Perhaps more so. You simply offer no compelling reason. Nothing you have said makes me question the extraordinary nature of the constants or the initial conditions of the universe as we find them.
bornagain77
December 20, 2016 at 05:09 PM PDT
Or as my former boss, a Marine two-star general, used to say: "That should be obvious even to a sea-going corporal."
ayearningforpublius
December 20, 2016 at 04:55 PM PDT
I repeat my comments and observations from @5 above:
-------------------------------------
Some of the fundamentals of Darwinian Evolution, as I understand it, are: The complexities of life we see all around us, and within us, are assembled from the bottom up in a Natural Selection process which chooses beneficial mutations among a long series of such changes, while allowing less beneficial changes to wither away, or perhaps remain as flotsam or “junk.” The resulting “designs” we see from such a process are merely illusions, the appearance of design … not actual design as we see in all of the human artifacts we dwell among, such as automobiles and computers. Evolution is said to be without purpose, without direction and without goals. What we may see as purpose, direction and goals are simply the result of the workings of natural processes – simply illusions of and the appearance of design.
___________________
So then why do we see purpose, direction and goals at every level of life – from the cellular level, to the systems level, to the completed body plan? We see purpose in the various machines and structures within each of the several trillion cells in our bodies. We see the kinesin motor transporting cargo from one place in the cell to another. We see the marvel of DNA which, coupled with other cellular components, represents not only a massive mass-storage capability, but also a type of blueprint package defining all aspects of the end-product body. This DNA package also contains what can be described as a complete set of “shop travelers” which, much like a manufacturing process, provides step-by-step instructions and bills of materials for the manufacture of the myriad parts making up the completed human body – bones, hair, brain, liver, eye, nose … and more. And each of these subunits exhibits purpose — specific purpose.
What is finally assembled as an arm and hand, for example, takes on a myriad of functional purposes, such as accurately throwing a baseball, playing a musical instrument such as a violin, or cradling a newborn baby. Each of our vital organs plays a specific and necessary role in keeping our body alive and functioning – there are goals and purpose expressed in each and every one of our body parts. What we see and experience in the finished goal-directed and purposeful human body is beautifully expressed in many ways, such as when we witness a magnificent choral and orchestral performance like Handel’s Messiah. What we experience in that concert hall is not an illusion — it is real, and is the culmination of a multitude of designs, both in the natural realm as well as the realm of human intelligence and ingenuity.
_____________________________
It seems as simple as that! Why all this talking over the heads of the common man?
ayearningforpublius
December 20, 2016 at 04:54 PM PDT
wrossite: ... I’ll ask you the same question I’ve asked others: What is the range of possible values for a given parameter (let’s say Newton’s gravitational constant) and what does the probability distribution look like? (and how did you derive it)?
Is it not obviously the case that the probability distribution depends on the proposed mechanism? And if this is indeed the case, then what proposed mechanism does Kairosfocus use as a context? As I understand it there are at least two contenders: 1) The 'cosmic landscape', which is related to string theory. If I understand it correctly, it posits 10^500 different universes governed by the present laws of nature but with a uniform distribution of different values of the physical constants. There seems to be considerable room for discussion about probabilities. Craig writes:
"even though there may be a huge number of possible universes lying within the life-permitting region of the cosmic landscape, nevertheless that life-permitting region will be unfathomably tiny compared to the entire landscape, so that the existence of a life-permitting universe is fantastically improbable. Indeed, given the number of constants that require fine-tuning, it is far from clear that 10^500 possible universes is enough to guarantee that even one life-permitting world will appear by chance in the landscape!"
2) Eternal inflation.
Carrier: Everyone agrees multiverse theory refutes any fine tuning argument for God. Because on a standard multiverse theory (e.g. eternal inflation), all configurations of physical universes will be realized eventually, and therefore the improbability of any of them is negated. No matter how improbable an individual universe is, the probability that it exists if a multiverse exists is effectively 100%.
If Carrier is correct, then there is no sense in talking about probabilities in the context of 'eternal inflation'.
Origenes
December 20, 2016 at 04:38 PM PDT
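Craig's worry that even 10^500 universes may not be enough can be illustrated with back-of-the-envelope log arithmetic. The window widths and the number of tuned constants below are invented purely for illustration, not physical estimates:

```python
# Work in log10 throughout to avoid floating-point underflow.
k = 10                # number of independently tuned constants (invented)
log10_window = -60    # each life-permitting window: 1 part in 10^60 (invented)
log10_p_single = k * log10_window   # per-universe probability: 10^-600
log10_n = 500                       # 10^500 universes in the landscape

# For tiny p, P(at least one success in N tries) is approximately N * p.
log10_p_any = log10_n + log10_p_single
print(log10_p_any)  # -100
```

Under these invented numbers, N·p is about 10^-100, so the landscape's 10^500 tries would still leave a life-permitting universe wildly unexpected; with gentler windows or fewer constants the conclusion flips, which is exactly why the assumed distribution and ranges matter to the argument.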
KF,
DS, kindly see just above. The relevant calcs are done in stat mech or in info theory, where systems are WLOG reducible to strings that describe in some description language.
OK, I take it this means your proposal is meant to actually be carried out. Please let me know if anyone publishes on this second-order sensitivity analysis.
daveS
December 20, 2016 at 04:28 PM PDT
All, it seems that my point has been conceded over and over, and yet some of you want to misrepresent the extent of my argument. My argument is incredibly modest. From the get-go, I have acknowledged the precision (sensitivity) argument. I'm not taking exception to that point. My only beef is with the assertion that the observed values are exorbitantly improbable. I simply want ID folks and apologists to stop making those statements. They seem almost entirely unfounded. That is all. W
wrossite
December 20, 2016 at 04:26 PM PDT
Wayne @ 81 wrote,
“My major concern with arguments from fine-tuning in cosmology is, how do we really get from observations of precision to statements of probability? To say that something is precise is not to say that it is improbable. Those are two different things…To say that such-and-such a physical parameter rests upon a razor’s edge does tell us something. It tells us that any small change in the setting of that parameter would lead to a universe drastically different from the one we live in, and likely one that could never even produce material objects (let alone life) as we understand it. Fair enough. I agree. What it doesn’t tell us is how likely any of those other settings are.” Okay, so “precision” is a way of speaking of sensitivity. Here we agree. Therefore what? To even talk about sensitivity as if it’s important, we must assume that it was possible for a parameter to fall outside of those narrow ranges of precision. We cannot… Bruce Gordon offering, “Suppose that the universe is about 30 billion light years across…and you stretch a tape measure across that. That comes out to about 10^28 inches. Peg Newton’s constant on one of those inches. Now, what would be the consequence say, of moving one inch to the left or to the right?” These are explicit appeals to statements regarding parameter space (a range of possible values, all being equiprobable). If we can’t say such things, then we shouldn’t.
A couple of people commenting here have said that they think you have made a good point. I agree that we cannot at present derive probabilities for every one of the fine-tuned constants. However, I see a problem. Why can’t we say that? Is it logically impossible? Earlier @72 I wrote: “Assuming that they could be different (and not knowing if they could be, it is possible that they could) how would you derive probability for each one?” Again, I’ll concede that we have a problem getting “from observations of precision to statements of probability”; however, I don’t see why that should limit us from speculating about what metaphysically could be. Here is the fine-tuning argument that William Lane Craig likes to use:
1. The fine-tuning of the universe to support life is either due to law, chance or design.
2. It is not due to law or chance.
3. Therefore, the fine-tuning is due to design.
And, of course, if the universe is designed it must have a designer. Craig does NOT claim that this is a scientific argument. Rather, he argues that it is a philosophical argument with a premise that is derived inductively from empirical observations of the universe. Philosophical arguments deal with what is logically possible. In other words, I don’t see how it is logically impossible that a given cosmological constant could have been different. Would you argue that because we don’t know how to state these constants in strict probabilistic terms, the fine-tuning cannot possibly have been different? In other words, if it is logically impossible, can you explain how?
john_a_designer
December 20, 2016 at 04:16 PM PDT
DS, kindly see just above. The relevant calcs are done in stat mech or in info theory, where systems are WLOG reducible to strings that describe them in some description language. In any case, long before we get to such, the spectrum from maximal flexibility to no flexibility is incapable of eliminating the import of the sensitivity analysis. The observed cosmos is at a sharp resonance point per the sensitivity analysis, with some values set in the same range in multiple contexts. Fine tuning is a serious issue. KF
kairosfocus
December 20, 2016 at 03:58 PM PDT
WR, with all due respect, I have made no such decision. I have simply pointed out that we look here at sensitivity analysis, an absolutely standard mathematical procedure . . . and since 1953, we have increasingly seen that if parameters were just slightly different, we would not have a cell-based life-permitting cosmos. If sensitivity by possibility of varied values does not apply and the values of quantities, parameters, structure of laws etc are all locked -- not particularly credible, but we consider it for argument -- then it points to a locking force, which would itself manifest awesome fine tuning across a span of at least 90 bn light years and some 13.8 bn years . . . big shoes to fill. Second, I point you to statistical mechanics, in which the flat distribution model is a first case, and there is a more general approach that allows for varying probabilities, e.g. through expressions of form SUM pi log pi; consider the comparable case of equiprobable symbols in strings, and strings with diverse probability symbols [which, necessarily, lowers the uncertainty involved]. In effect: maximum flexibility, no flexibility, and the spectrum that blends the two between. None of the three cases -- note, that covers the available ground -- is capable of eliminating fine tuning as a significant issue as manifested in sensitivity analysis. The issue is there, the issue of one value carrying significance in several contexts is there, and so forth. What is needed is not to raise clouds of dust regarding the phenomenon, but to address its import as a striking meta-observation. KF
kairosfocus
December 20, 2016 at 03:38 PM PDT
to reiterate wrossite's claim at 44,,,
KF, Thanks for the post. Notice that it assumes many of the things I’m suggesting we can’t. It talks as if there are in fact other universes. It talks as if we can know anything about them or their laws. It assumes a range and a distribution for parameters. None of this can be known. It is pure speculation. As I’ve repeatedly pointed out, you cannot calculate a probability given a sample size of one. W
wrossite is basically trying to lay down a mandate that states 'we are not allowed to speculate on what might have been before time began at the big bang'. And as was pointed out in 51, and then further clarified in 70, wrossite is basically trying to do the impossible in that he is trying to get man to act against the 'timeless' nature of his own thoughts. The overall gist of post 70 was,,
"since man thinks, speaks, and writes in terms of (immaterial) information, then this makes the nature of man’s thoughts, of necessity, ‘timeless’."
But to go further, wrossite's claim that we can't speculate as to what was before time began at the big bang strongly reminds me of Godel's incompleteness theorem. Godel, by using 'timeless' mathematics and logic, (specifically using the 'logic of infinity'), proved that “Anything you can draw a circle around cannot explain itself without referring to something outside the circle—something you have to assume but cannot prove” Stephen Hawking himself conceded this point to Godel's incompleteness theorem in his book 'The Grand Design':
"Gödel's incompleteness theorem (1931), proves that there are limits to what can be ascertained by mathematics. Kurt Gödel (ref. on cite), halted the achievement of a unifying all-encompassing theory of everything in his theorem that: “Anything you can draw a circle around cannot explain itself without referring to something outside the circle—something you have to assume but cannot prove”. Thus, based on the position that an equation cannot prove itself, the constructs are based on assumptions some of which will be unprovable." Cf., Stephen Hawking & Leonard Miodinow, The Grand Design (2010) @ 15-6
And although we may not be able to mathematically prove what is outside the circle of the universe (or outside any circle we may draw around anything else), which is the main point that I believe wrossite is trying to drive at, nonetheless I hold that we can still at least logically know what is outside the circle of the universe. (In fact, Godel proved his incompleteness theorem using logic instead of proving it with math.)
Taking God Out of the Equation - Biblical Worldview - by Ron Tagliapietra - January 1, 2012 Excerpt: Kurt Gödel (1906–1978) proved that no logical systems (if they include the counting numbers) can have all three of the following properties. 1. Validity ... all conclusions are reached by valid reasoning. 2. Consistency ... no conclusions contradict any other conclusions. 3. Completeness ... all statements made in the system are either true or false. The details filled a book, but the basic concept was simple and elegant. He (Godel) summed it up this way: “Anything you can draw a circle around cannot explain itself without referring to something outside the circle—something you have to assume but cannot prove.” For this reason, his proof is also called the Incompleteness Theorem. Kurt Gödel had dropped a bomb on the foundations of mathematics. Math could not play the role of God as infinite and autonomous. It was shocking, though, that logic could prove that mathematics could not be its own ultimate foundation. Christians should not have been surprised. The first two conditions are true about math: it is valid and consistent. But only God fulfills the third condition. Only He is complete and therefore self-dependent (autonomous). God alone is “all in all” (1 Corinthians 15:28), “the beginning and the end” (Revelation 22:13). God is the ultimate authority (Hebrews 6:13), and in Christ are hidden all the treasures of wisdom and knowledge (Colossians 2:3). http://www.answersingenesis.org/articles/am/v7/n1/equation#
I think that it is fairly obvious that either God or random chance must be 'outside the circle' of the universe. Yet, through detailed analysis of Godel's incompleteness theorem, it is found that random chance (i.e. anti-theism) cannot possibly ground mathematics. Therefore random chance cannot possibly be the assumption that we are forced to make for what is outside Godel's circle for mathematics or, more importantly, outside Godel's circle for the universe.
A BIBLICAL VIEW OF MATHEMATICS Vern Poythress - Doctorate in theology, PhD in Mathematics (Harvard) 15. Implications of Gödel’s proof B. Metaphysical problems of anti-theistic mathematics: unity and plurality Excerpt: Because of the above difficulties, anti-theistic philosophy of mathematics is condemned to oscillate, much as we have done in our argument, between the poles of a priori knowledge and a posteriori knowledge. Why? It will not acknowledge the true God, wise Creator of both the human mind with its mathematical intuition and the external world with its mathematical properties. In sections 22-23 we shall see how the Biblical view furnishes us with a real solution to the problem of “knowing” that 2 + 2 = 4 and knowing that S is true. http://www.frame-poythress.org/a-biblical-view-of-mathematics/
Therefore, via process of elimination, via Godel's incompleteness, God must be what, or more specifically Who, is outside the circle of the universe.
bornagain77
December 20, 2016 at 01:45 PM PDT
Origenes, thank you for your comments to my unclear sentence; “What we cannot then really extrapolate is, was that by coincidence or by design, and how does the figures come about?” You say: “What figures? Where does the possibility of coincidence come from? For the upteenth time: probabilities or improbabilities enter the arena only after some smartass proposes a random mechanism. _________________________________________________________________ According to eyewitnesses, God wrote two figures in stone, the numbers six and seven, and stated them publicly with trumpet blast. It seems that the figure six in relation to divine law is like a God Mode, where all the laws, constants, physical laws and such like came about through one key stroke activated by voice. How or why did an all seeing God choose to write that he created in six days; coincidence, or random choice for primitive misunderstanding convenience; or, by design and truth for our future benefit and good? Sure, we can in common sense perhaps understand irreducible complexity, but we still cannot prove the God Mode by which such came about, and certainly not time wise. We can only believe. Or, perhaps dismissively say such questions are not scientific enough! Surely, therefore, there must also be irreducible complexity in the cosmos, fine tuning included? Just because we may think by consensus science we can theoretically detect some form of beginning, does not disprove a matured beginning in six days. However, Darwin was a "smartass;" he simple dismissed miracles saying the witnesses were unreliable. The Big Bang Theory does similar. Perhaps inflation theory is the real "smartass" of the Big Bang, that and not having a verifiable cosmic theory from no space. Or, tying down Yahweh to the Big Bang and Darwinism which will eventually flush him down some black hole in disbelief. Evolutionism is Satan's best means to divide and exorcise out Christianity. 
It certainly is eclipsing the Judaeo-Christian God and making a mess of scripture. According to Genesis, he knows, in beguiling tones, how to fine-tune disbelief in one, and then for many to be affected.
mw
December 20, 2016 at 12:06 PM PDT
KF. First, my point was that others (e.g. Gordon and Craig) DO TREAT the range of possible values as equiprobable. If that's wrong, then they need to know it. Now, since you've decided that they aren't equiprobable, I'll ask you the same question I've asked others: What is the range of possible values for a given parameter (let's say Newton's gravitational constant), what does the probability distribution look like, and how did you derive it?
Wwrossite
December 20, 2016 at 11:10 AM PDT
KF, Yes, I understand the argument, but my question has to do with actually carrying out this sensitivity analysis, i.e., actually performing calculations in MATLAB or some other language (hopefully :P). I assume an early step would be to identify and parameterize the "second-level inputs", which I tentatively labeled H_i. These are "cosmos factories" which generate collections of fundamental constants of nature. Do you intend this to be more of a thought experiment rather than an analysis to be actually performed?
daveS
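[As an aside on daveS's question about actually running such calculations: here is a minimal sketch of a one-at-a-time (OAT) local sensitivity pass, in Python rather than MATLAB. The cost function and the parameter names "alpha" and "beta" are invented stand-ins for illustration, not a real cosmological model.]

```python
def local_sensitivity(cost, params, h=1e-6):
    """One-at-a-time (OAT) local sensitivity: numerically estimate the
    partial derivative of `cost` with respect to each parameter,
    holding the others fixed at their nominal values."""
    base = cost(params)
    sensitivities = {}
    for name, value in params.items():
        perturbed = dict(params)
        perturbed[name] = value + h
        sensitivities[name] = (cost(perturbed) - base) / h
    return sensitivities

# A toy "model output"; the parameter names are purely illustrative.
def cost(p):
    return p["alpha"] ** 2 + 10.0 * p["beta"]

sens = local_sensitivity(cost, {"alpha": 3.0, "beta": 1.0})
print(sens)  # alpha's sensitivity is ~6, beta's is ~10
```

[A large derivative flags a parameter whose small changes swing the output hard; as the thread notes, this by itself says nothing about how probable any alternative value is.]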
December 20, 2016 at 10:52 AM PDT
WR, at no point whatsoever above have I argued that the values of parameters tied to our cosmos and its life-permitting constraints are required to be equiprobable. I have explicitly pointed to the opposite, identifying the implication of there being a locking force that sets the parameters. Namely, that the fine tuning goes up one level. I highlighted that the fine tuning challenge is hard to escape -- if one deals with it on the merits. KF PS: Can you show me where sensitivity analysis is not a normal or typical facet of analysis of frameworks for designs or models etc.? I submit, you cannot. Such analysis is not inherently wedded to equiprobable possibilities, as say statistical thermodynamics readily shows. What do you think expressions of the form [SUM p_i log p_i] are about, but providing for cases where probabilities are not flat random?
kairosfocus
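[For readers unfamiliar with the expression KF mentions: with a minus sign in front, -SUM p_i log p_i is Shannon's entropy, and it does indeed handle non-flat distributions, reaching its maximum only when the outcomes are equiprobable. A quick Python illustration; the example distributions are arbitrary.]

```python
import math

def shannon_entropy(probs):
    """H = -SUM p_i * log2(p_i), in bits; zero-probability terms drop out."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

flat = [0.25, 0.25, 0.25, 0.25]    # equiprobable: maximum uncertainty
biased = [0.7, 0.1, 0.1, 0.1]      # non-flat: less uncertainty

print(shannon_entropy(flat))    # 2.0 bits, the maximum for four outcomes
print(shannon_entropy(biased))  # about 1.36 bits
```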
December 20, 2016 at 09:37 AM PDT
DS, Kindly cf. Robin Collins et al and the bread-baking factory discussion; as though there was need for some authority to say anything before it can be addressed on its patent merits. A system that consistently turns out well-baked loaves of bread rather than a doughy mess or a burned hockey puck is very carefully calibrated so to do. Similarly, when we see a lone fly on a patch of wall swatted by a bullet, we are looking at a tack-driver rifle and a marksman able to use the capacity of the rifle -- if this were a world of chance-driven pursuit of life-permitting zones, we should expect to be in the equivalent of Leslie's fly-carpeted portion of the wall, for reasons quite similar to the driving logic behind the statistical form of the second law of thermodynamics. This is fine tuning. And the point was, if the system is "locked" to produce our cosmos, that implies a prior mechanism that does the locking. Further to this, in a multiverse type scenario, we are looking at the implication that we are dealing with a deeply isolated narrow "resonance," which so happens to bring together a great many factors in a context of mutual fit and constraints that enables what we see. That strongly points to unified purpose and to a powerful, intelligent, knowledgeable and skilled mind behind it. I suggest that, absent strong empirically grounded reason to hold otherwise, the evident fine tuning of the observed cosmos strongly points to a designer. KF
kairosfocus
December 20, 2016 at 09:29 AM PDT
WR, re:
My major concern with arguments from fine-tuning in cosmology is, how do we really get from observations of precision to statements of probability?
Actually, no. As I have pointed out for some time now, the first issue is the sensitivity analysis in the framework of physics that undergirds the cosmos. As, for instance, Sir Fred Hoyle pointed out long ago now, as the first significant person to note a fine tuning result. Also, as Leslie highlighted with his lone-fly-on-a-section-of-wall argument. We have a system that evidently has closely co-adapted components, many of which are multiply constrained, and this as an integral part of a unified system that enables function: here, a life-permitting cosmos with C-chemistry, aqueous-medium, cell-based life. That first needs to be faced. Yes, probability issues do come in, but they come in within that context. KF
kairosfocus
December 20, 2016 at 09:18 AM PDT
WR
These are explicit appeals to statements regarding parameter space (a range of possible values, all being equiprobable).
Yes, but in those two quotes you offered, neither explicitly mentioned probabilities, strictly speaking. They're just referring to imaginary scenarios. As I said earlier, if we can't know whether any other range of values is possible (and therefore cannot speak of them), then we can't know if any other universe is possible. So discussions on the possibility of a multiverse die right there. But again, what I thought was strange was that you didn't hesitate to consider the probability of a multiverse, and not only that, but accept that there is some sort of likelihood that there are 10^500 of them. That simply doesn't follow. What Craig and Gordon are doing is simply using imaginary concepts and drawing some common-sense ideas from them. Multiverse proponents dress up their ideas with some mathematics, but they're doing the same thing. It's completely imaginary, with zero directly observable evidence to support it. We don't know if any other universe is possible.
Silver Asiatic
December 20, 2016 at 09:09 AM PDT
KF, PS: Regarding this second-level analysis, it seems to me that we would have to posit a family of super-force/principles-of-action regimes H_i such that the physical constants are logically "locked together" under some of the H_i but not the others. For example, perhaps under H_1, the physical constants could all range between 0 and infinity, each with some probability distribution. Perhaps under H_2, the physical constants are locked together logically, with the values we observe in our universe. There might be an H_3 under which the physical constants are again locked together, but with values different from those in our universe. And so forth. Does that sound about right?
daveS
December 20, 2016 at 08:13 AM PDT
KF,
We need to accept that sensitivity analysis is inherently part of analysing models or systems with parameters and structuring frameworks that are not locked by force of logical necessity. (And in this case, if such a range of entities is so locked to fit together, the implied super-force and principles of action yielding the structure would be a very interesting target for level 2 sensitivity analysis.)
Has anyone attempted this "level 2" sensitivity analysis in the case where the physical constants are indeed locked together by logical necessity? This reminds me a bit of the discussions we've had over Euler's identity, exp(iπ) = −1, which is logically necessary, I take it. Can a second-order analysis be performed on this instance?
daveS
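[A numerical aside on daveS's example, offered only as an illustration, not as an answer to the level-2 question: the identity holds to machine precision, and while one can mechanically "perturb" π and watch the result drift away from −1, π is not a free parameter of the identity, so there is nothing to tune.]

```python
import cmath
import math

# Euler's identity: exp(i*pi) equals -1, up to floating-point rounding.
z = cmath.exp(1j * math.pi)
print(abs(z - (-1)))  # on the order of 1e-16, i.e. machine precision

# Mechanically "perturbing" pi moves the result off -1, but unlike a
# physical constant, pi has no alternative possible values.
z_perturbed = cmath.exp(1j * (math.pi + 0.01))
print(abs(z_perturbed - (-1)))  # about 0.01
```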
December 20, 2016 at 07:54 AM PDT
From my blog: "My major concern with arguments from fine-tuning in cosmology is, how do we really get from observations of precision to statements of probability? To say that something is precise is not to say that it is improbable. Those are two different things...To say that such-and-such a physical parameter rests upon a razor’s edge does tell us something. It tells us that any small change in the setting of that parameter would lead to a universe drastically different from the one we live in, and likely one that could never even produce material objects (let alone life) as we understand it. Fair enough. I agree. What it doesn’t tell us is how likely any of those other settings are." Okay, so "precision" is a way of speaking of sensitivity. Here we agree. Therefore what? To even talk about sensitivity as if it's important, we must assume that it was possible for a parameter to fall outside of those narrow ranges of precision. We cannot. And yet, so many do. I offer two examples in my blog (Bill Craig saying, “[Fine-tuning] is like all the roulette wheels in Monte Carlo’s yielding simultaneously numbers within narrowly prescribed limits and those numbers bearing certain precise relations among themselves,” and Bruce Gordon offering, “Suppose that the universe is about 30 billion light years across…and you stretch a tape measure across that. That comes out to about 10^28 inches. Peg Newton’s constant on one of those inches. Now, what would be the consequence say, of moving one inch to the left or to the right?”). These are explicit appeals to statements regarding parameter space (a range of possible values, all being equiprobable). If we can't say such things, then we shouldn't. I'm not sure why we're suddenly discussing biology, the origins of life, etc. I haven't attacked any of these. In fact, I'd rather see us spend our time talking about these items, because they can be experimentally and empirically examined. Genuine probabilities can be expressed.
Wwrossite
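[Rossiter's distinction between precision and improbability can be made concrete: the "probability" of landing in a narrow band depends entirely on the measure assumed over the range. A hedged toy calculation in Python; the range and band are invented for illustration and correspond to no actual physical constant.]

```python
import math

# Invented numbers, purely for illustration.
lo, hi = 1e-5, 1e5            # assumed range of possible values
band_lo, band_hi = 1.0, 1.1   # a narrow "life-permitting" band

# Uniform (equiprobable) measure over [lo, hi]:
p_uniform = (band_hi - band_lo) / (hi - lo)

# Log-uniform measure (uniform over orders of magnitude):
p_loguniform = math.log(band_hi / band_lo) / math.log(hi / lo)

print(p_uniform)     # about 1e-6
print(p_loguniform)  # about 4e-3, thousands of times larger
```

[Same band, same range, wildly different "improbabilities"; without a principled distribution over the parameter, the fine-tuning number is not defined.]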
December 20, 2016 at 07:11 AM PDT
PPS: Walker and Davies update Hoyle:
In physics, particularly in statistical mechanics, we base many of our calculations on the assumption of metric transitivity, which asserts that a system’s trajectory will eventually [--> given "enough time and search resources"] explore the entirety of its state space – thus everything that is physically possible will eventually happen. It should then be trivially true that one could choose an arbitrary “final state” (e.g., a living organism) and “explain” it by evolving the system backwards in time choosing an appropriate state at some ’start’ time t_0 (fine-tuning the initial state). In the case of a chaotic system the initial state must be specified to arbitrarily high precision. But this account amounts to no more than saying that the world is as it is because it was as it was, and our current narrative therefore scarcely constitutes an explanation in the true scientific sense. We are left in a bit of a conundrum with respect to the problem of specifying the initial conditions necessary to explain our world. A key point is that if we require specialness in our initial state (such that we observe the current state of the world and not any other state) metric transitivity cannot hold true, as it blurs any dependency on initial conditions – that is, it makes little sense for us to single out any particular state as special by calling it the ’initial’ state. If we instead relax the assumption of metric transitivity (which seems more realistic for many real world physical systems – including life), then our phase space will consist of isolated pocket regions and it is not necessarily possible to get to any other physically possible state (see e.g. Fig. 1 for a cellular automata example).
[--> or, there may not be "enough" time and/or resources for the relevant exploration, i.e. we see the 500 - 1,000 bit complexity threshold at work vs 10^57 - 10^80 atoms with fast rxn rates at about 10^-13 to 10^-15 s leading to inability to explore more than a vanishingly small fraction on the gamut of Sol system or observed cosmos . . . the only actually, credibly observed cosmos]
Thus the initial state must be tuned to be in the region of phase space in which we find ourselves [--> notice, fine tuning], and there are regions of the configuration space our physical universe would be excluded from accessing, even if those states may be equally consistent and permissible under the microscopic laws of physics (starting from a different initial state). Thus according to the standard picture, we require special initial conditions to explain the complexity of the world, but also have a sense that we should not be on a particularly special trajectory to get here (or anywhere else) as it would be a sign of fine–tuning of the initial conditions. [ --> notice, the "loading"] Stated most simply, a potential problem with the way we currently formulate physics is that you can’t necessarily get everywhere from anywhere (see Walker [31] for discussion). ["The “Hard Problem” of Life," June 23, 2016, a discussion by Sara Imari Walker and Paul C.W. Davies at Arxiv.]
kairosfocus
December 20, 2016 at 04:40 AM PDT
Folks, Pardon a few prelim, personal notes, I am a bit less headachy and time-squeezed than I was yesterday, as parliament sits today. And, the creative mindstorm has reached critical mass and is over even as U3 is on the table as WIP -- here's lookin' at ya, St Helena GBP 285 mn airport controversy [and Wiki comes up for Kudos, News . . . by contrast with much of the UK press] -- with U1 and U2 initially complete as well as a scope-sequence with refs. Now to look at fine tuning, from the Math angle. Mathematics can aptly be understood as the [study of the] logic of structure and quantity. It is an inherently abstract discipline and it is inextricably deeply entangled with the physical sciences. Where, in order for such sciences to exist, we must live in a world where sufficiently rationally and responsibly free agents are possible -- and actual -- that such logic can be freely taken up and pursued. (BTW, this already constrains the nature of reality, but that is metaphysics.) Also, we must have a cosmos that is observer-permitting at local and cosmological level. Down that road lies the privileged planet discussion, which again we can set aside for another interesting day. Our present focus is the mathematics of a cosmos such as ours. Allow me to suggest that there are well-known results relevant to the constitution of a fine-tuned cosmos. Sir Fred Hoyle:
>>[Sir Fred Hoyle, In a talk at Caltech c 1981 (nb. this longstanding UD post):] From 1953 onward, Willy Fowler and I have always been intrigued by the remarkable relation of the 7.65 MeV energy level in the nucleus of 12 C to the 7.12 MeV level in 16 O. If you wanted to produce carbon and oxygen in roughly equal quantities by stellar nucleosynthesis, these are the two levels you would have to fix, and your fixing would have to be just where these levels are actually found to be. Another put-up job? . . . I am inclined to think so. A common sense interpretation of the facts suggests that a super intellect has "monkeyed" with the physics as well as the chemistry and biology, and there are no blind forces worth speaking about in nature. [F. Hoyle, Annual Review of Astronomy and Astrophysics, 20 (1982): 16.]>> . . . also, in the same talk at Caltech: >>The big problem in biology, as I see it, is to understand the origin of the information carried by the explicit structures of biomolecules. The issue isn't so much the rather crude fact that a protein consists of a chain of amino acids linked together in a certain way, but that the explicit ordering of the amino acids endows the chain with remarkable properties, which other orderings wouldn't give. The case of the enzymes is well known . . . If amino acids were linked at random, there would be a vast number of arrangements that would be useless in serving the purposes of a living cell. When you consider that a typical enzyme has a chain of perhaps 200 links and that there are 20 possibilities for each link, it's easy to see that the number of useless arrangements is enormous, more than the number of atoms in all the galaxies visible in the largest telescopes. [ --> 20^200 = 1.6 * 10^260] This is for one enzyme, and there are upwards of 2000 of them, mainly serving very different purposes. So how did the situation get to where we find it to be? This is, as I see it, the biological problem - the information problem . . . 
. I was constantly plagued by the thought that the number of ways in which even a single enzyme could be wrongly constructed was greater than the number of all the atoms in the universe. So try as I would, I couldn't convince myself that even the whole universe would be sufficient to find life by random processes - by what are called the blind forces of nature . . . . By far the simplest way to arrive at the correct sequences of amino acids in the enzymes would be by thought, not by random processes . . . . Now imagine yourself as a superintellect working through possibilities in polymer chemistry. Would you not be astonished that polymers based on the carbon atom turned out in your calculations to have the remarkable properties of the enzymes and other biomolecules? Would you not be bowled over in surprise to find that a living cell was a feasible construct? Would you not say to yourself, in whatever language supercalculating intellects use: Some supercalculating intellect must have designed the properties of the carbon atom, otherwise the chance of my finding such an atom through the blind forces of nature would be utterly minuscule. Of course you would, and if you were a sensible superintellect you would conclude that the carbon atom is a fix. >> . . . and again: >> I do not believe that any physicist who examined the evidence could fail to draw the inference that the laws of nuclear physics have been deliberately designed with regard to the [--> nuclear synthesis] consequences they produce within stars. ["The Universe: Past and Present Reflections." Engineering and Science, November, 1981. pp. 8–12]>>
Notice, how his primary focus is on sensitivity analysis? Once Mathematics is in the door, inherently all of it is in the door; logic is free-ranging and Mathematics turns on logic. We have structure, we have quantities, what does logic have to say about this? Consequently, probability is a secondary matter. We need to accept that sensitivity analysis is inherently part of analysing models or systems with parameters and structuring frameworks that are not locked by force of logical necessity. (And in this case, if such a range of entities is so locked to fit together, the implied super-force and principles of action yielding the structure would be a very interesting target for level 2 sensitivity analysis.) Let me clip the Matlab-Simulink folks, world class experts on the math of systems and designs:
Sensitivity analysis is defined as the study of how uncertainty in the output of a model can be attributed to different sources of uncertainty in the model input[1]. In the context of using Simulink® Design Optimization™ software, sensitivity analysis refers to understanding how the parameters and states (optimization design variables) of a Simulink model influence the optimization cost function. Examples of using sensitivity analysis include: Before optimization — Determine the influence of the parameters of a Simulink model on the output. Use sensitivity analysis to rank parameters in order of influence, and obtain initial guesses for parameters for estimation or optimization. After optimization — Test how robust the cost function is to small changes in the values of optimized parameters. One approach to sensitivity analysis is local sensitivity analysis, which is derivative based (numerical or analytical). Mathematically, the sensitivity of the cost function with respect to certain parameters is equal to the partial derivative of the cost function with respect to those parameters. The term local refers to the fact that all derivatives are taken at a single point. For simple cost functions, this approach is efficient. However, this approach can be infeasible for complex models, where formulating the cost function (or the partial derivatives) is nontrivial. For example, models with discontinuities do not always have derivatives. Local sensitivity analysis is a one-at-a-time (OAT) technique. OAT techniques analyze the effect of one parameter on the cost function at a time, keeping the other parameters fixed. They explore only a small fraction of the design space, especially when there are many parameters. Also, they do not provide insight about how the interactions between parameters influence the cost function. Another approach to sensitivity analysis is global sensitivity analysis, often implemented using Monte Carlo techniques. 
This approach uses a representative (global) set of samples to explore the design space. Use Simulink Design Optimization software to perform global sensitivity analysis using the Sensitivity Analysis tool, or at the command line . . .
They are telling how to use their software, but that is just a tool. The point is they are speaking about sensitivity analysis. Now, what happens when such analysis delivers the result that we are at a "special" "resonance" where various components must be just so in a neighbourhood, N, in order for a recognisable distinction to obtain, is that we are looking at an island of function in a configuration space. A locally tight, narrow island of special, life-permitting function, is just what we are looking at; hence, John Leslie again:
"One striking thing about the fine tuning is that a force strength or a particle mass often appears to require accurate tuning for several reasons at once. Look at electromagnetism. Electromagnetism seems to require tuning for there to be any clear-cut distinction between matter and radiation; for stars to burn neither too fast nor too slowly for life’s requirements; for protons to be stable; for complex chemistry to be possible; for chemical changes not to be extremely sluggish; and for carbon synthesis inside stars (carbon being quite probably crucial to life). Universes all obeying the same fundamental laws could still differ in the strengths of their physical forces, as was explained earlier, and random variations in electromagnetism from universe to universe might then ensure that it took on any particular strength sooner or later. Yet how could they possibly account for the fact that the same one strength satisfied many potentially conflicting requirements, each of them a requirement for impressively accurate tuning?" [Our Place in the Cosmos, The Royal Institute of Philosophy, 1998 (courtesy Wayback Machine) Emphases added.] AND: ". . . the need for such explanations does not depend on any estimate of how many universes would be observer-permitting, out of the entire field of possible universes. Claiming that our universe is ‘fine tuned for observers’, we base our claim on how life’s evolution would apparently have been rendered utterly impossible by comparatively minor alterations in physical force strengths, elementary particle masses and so forth. There is no need for us to ask whether very great alterations in these affairs would have rendered it fully possible once more, let alone whether physical worlds conforming to very different laws could have been observer-permitting without being in any way fine tuned. Here it can be useful to think of a fly on a wall, surrounded by an empty region. A bullet hits the fly. Two explanations suggest themselves. 
Perhaps many bullets are hitting the wall or perhaps a marksman fired the bullet. There is no need to ask whether distant areas of the wall, or other quite different walls, are covered with flies so that more or less any bullet striking there would have hit one. The important point is that the local area contains just the one fly." [Emphasis his.]
In short, the Math is speaking in the language of sensitivity analysis. We get to probabilities by two possible routes. One, we can look at indifference and think in terms of a simple random moving about leading to a Monte Carlo type analysis; and/or we may modify to look at biased distributions. Two, we can exploit the conceptual or quantitative duality between information and probability. But, we are not at all locked up to specific models and/or mechanisms of probability. Indeed, I add, probability in the murky middle is an index of ignorance and/or uncertainty. It maximises at flat randomness in a relevant range, local or global. (That is, we are least certain when any conceived possible outcome is in effect equi-possible so far as we know.) And, as noted, if a force is "locking" the system at a life permitting point, that takes the fine tuning issue up one level. Fine tuning is in the door and is not so easily got rid of. The realistic options are a designed, specifically functional world, or a quasi-infinite wider reality, a multiverse. In the latter case, we ought not to be looking at a world at a narrow resonance like this. And, at this point, we are not looking at empirical observation so we are looking at worldviews analysis in philosophy. Which means all serious options -- no "invisible friend" strawman tactics, please -- are on the table to be assessed per comparative difficulties. Where, simply to do serious Math, we must be responsibly, rationally free to significant degree. Post Hume's guillotine, this pins us to a challenge to find an IS that inherently grounds OUGHT at world root level. Without further elaborate argument, I note the balance of centuries of debate. There is just one serious candidate, the inherently good Creator God, a necessary and maximally great being worthy of loyalty and respectful, responsible reasonable service by doing the good in accord with our evident nature. 
If you doubt or dismiss, simply put up a feasible and comparably good candidate. Enjoy the Christmas season. KF PS: Regarding "Fiat lux," etc, I find Heb 1 instructive:
Heb 1:1 God, having spoken to the fathers long ago in [the voices and writings of] the prophets in many separate revelations [each of which set forth a portion of the truth], and in many ways, 2 has in these last days spoken [with finality] to us in [the person of One who is by His character and nature] His Son [namely Jesus], whom He appointed heir and lawful owner of all things, through whom also He created the universe [that is, the universe as a space-time-matter continuum]. 3 The Son is the radiance and only expression of the glory of [our awesome] God [reflecting God’s [a]Shekinah glory, the Light-being, the brilliant light of the divine], and the exact representation and perfect imprint of His [Father’s] essence, and upholding and maintaining and propelling all things [the entire physical and spiritual universe] by His powerful word [carrying the universe along to its predetermined goal]. When He [Himself and no other] had [by offering Himself on the cross as a sacrifice for sin] accomplished purification from sins and established our freedom from guilt, He sat down [revealing His completed work] at the right hand of the Majesty on high [revealing His Divine authority], 4 having become as much superior to angels, since He has inherited a more excellent and glorious [b]name than they [that is, Son—the name above all names] . . . [AMP]
kairosfocus
December 20, 2016 at 04:33 AM PDT
mw: I do think Prof W Rossiter has a point. It appears the cosmos is the result of fine tuning, that is design.
Indeed that is design! There is a complete alignment of various constants wrt the top-level function — harboring life. We see a staggering functional coherence. So, we infer design. Again, at this point there is no reference to probabilities or improbabilities of any mechanism. The unity of function is just evidence on its own. No probabilities involved.
mw: What we cannot then really extrapolate is, was that by coincidence or by design, and how does the figures come about?
What figures? Where does the possibility of coincidence come from? For the umpteenth time: probabilities or improbabilities enter the arena only after some smartass proposes a random mechanism.
Bartlett: I have often noticed something of a confusion on one of the major points of the Intelligent Design movement – whether or not the design inference is primarily based on the failure of Darwinism and/or mechanism. This is expressed in a recent thread by a commenter saying, “The arguments for this view [Intelligent Design] are largely based on the improbability of other mechanisms (e.g. evolution) producing the world we observe.” I’m not going to name the commenter because this is a common confusion that a lot of people have. ... The only reason for probabilities in the modern design argument is because Darwinites have said, “you can get that without design”, so we modeled NotDesign as well, to show that it can’t be done that way. ... the *only* reason we are talking about probabilities is to answer an objection. The original evidence *remains* the primary evidence that it was based on.
Origenes
December 20, 2016 at 04:17 AM PDT
john_a_designer, you mention: "Or, if the ratio of the electromagnetic force constant to the gravitational force constant had not been precisely balanced to 1 part in 10^40, then we would have no stars of the right size to support life. We need both fast-burning large stars to produce the essential elements for life’s chemistry and planet formation, as well as long-burning small stars to burn long enough to provide planetary systems habitable for life." Thanks, that is what I was looking for. I will reference that below my 'sloppy' Gordon reference.
bornagain77
December 20, 2016 at 03:29 AM PDT
I do think Prof W Rossiter has a point. It appears the cosmos is the result of fine tuning, that is design. What we cannot then really extrapolate is, was that by coincidence or by design, and how does the figures come about? Though, his assertion that time and space began at the Big Bang appears not to take into account the space/time of God and from out of which we came, live and have our being (Acts 17:27-28). God, the only person to witness the creation, cast into the vaults of heaven the material to create the cosmos out of nothing created. Space is space, spiritual or material; otherwise, where does it end or where does it really begin; in eternity? As God is both beginning and the end (Rev 1:11), is space some form of an eternal wheel, or one of the characteristics of God? How can we fine-tune space? It is like saying we can fine-tune God. Or did space compress and roll into an infinitely hot ball, held in non-space and non-gravity? Or is such a powerful beguiling fudge of a theory the best human nature can produce without guidance from superior knowledge? While BA77 does sterling work in relation to giving excellent data against Darwinism, it seems, in this case, he appears to want to batter people into submission—fine tuning comes from the Big Bang theory; there is little room for other considerations, not even the word of God written in stone. In my case, what is missing is an understanding of miracles and their effects on data, when God in a divine law said he created in six days, and that what he said was easy to understand, and is unalterable. A maturing miracle we are clueless to produce. Neither can we incorporate a miracle into any calculations for the Big Bang theory. If we did, we would see God was true and the theory false (based on the word of God). However, that would prove God true. If we could prove God true, we would be greater than God! We have no understanding of how a spoken word can produce a cosmos. 
We have only the word of God, his historic and scientific word based on evidence at Sinai, when a whole nation publicly witnessed thunder, lightning, dark cloud and the word of God. Words cut in stone, and placed in the holy of holies. Carried with utmost respect and fear. The same God who, just before his crucifixion, worshipped in the synagogue remembering he created in six days. Or did he worship under his breath that God created by the Big Bang time scale? And hence, when Yahweh condemned a man stoned to death for working on the sabbath (Num 15:32-36), we make Jesus, God in part and God in whole, a murderer and a liar in our disbelief relative to the Big Bang time scale and Darwinism. After the resurrection, in heaven, did Jesus change his worship? If and when we go to heaven, as St John saw, there is the Ark of the Testimony of God (Rev 11:19), amidst thunder, lightning and hail. Does heaven contain a lie, stretched-out truth relative to what God wrote, the only scripture ever written by God, hence of utmost truth and importance for our protection and guidance? “He has made everything suitable for its time; moreover, he has put a sense of past and future into their minds, yet they cannot find out what God has done from the beginning to the end.” (Eccl 3:11) And: “Thus says the LORD: If the heavens above can be measured, and the foundations of the earth below can be explored, then I will reject all the offspring of Israel because of all they have done, says the LORD.” (Jer 31:37) In other words, God is saying we cannot measure how God created. If we could, he would have to reject the offspring of Israel. His words, not mine. Has anyone seen God in the theoretical Big Bang with his finger on the trigger? No; are we perhaps not being beguiled all over again? 
‘God did not really mean six days; know the theory and you will be equal to God in that knowledge!’ Jesus teaches that Satan makes war against the Holy Mother and the remnant of her seed; those who keep the Commandments of God without adding or subtracting to them and keep his teaching (Rev 12:17) (Rev 22:18-19). It is worth noting at Christmas that the baby in the manger is the God of Sinai who wrote scripture at Sinai.
mw
December 20, 2016 at 03:18 AM PDT