Uncommon Descent Serving The Intelligent Design Community

The Multiverse Gods, final part


We’ve been looking at Victor Stenger’s claim that fine-tuning is a fallacy. In part one, we examined the two fundamental metaphysical theories of the universe–materialist and theist–and saw how materialists have been losing ground by being forced to admit to a creation. Multiverse theory is a rear-guard action covering their retreat, an attempt to turn the unwanted creator into an impersonal force.

In part two, we discussed the Widow’s Mite fallacy, in which Stenger uses physical units for a metaphysical property and, like Jesus’ disciples, mistakes a physical quantity for a metaphysical one. The most obvious difference between the two is that physical quantities have units, whereas metaphysical ones are unitless. In addition, metaphysical quantities are percentages or integrals; they involve a comparison of areas or volumes, as when in Bayesian hypothesis testing we take the ratio of the range of a fit variable to its domain.
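That range-to-domain ratio can be made concrete with a short sketch. This is a minimal illustration of the idea, not anything from Stenger's book; the function name and all the numbers (the interval and domain) are hypothetical:

```python
def tuning_fraction(fit_range, domain):
    """Unitless 'fine-tuning' ratio: the width of the range of a fit
    variable compatible with the data, divided by the width of its
    allowed domain. No physical units survive the division."""
    lo, hi = fit_range
    dlo, dhi = domain
    return (hi - lo) / (dhi - dlo)

# Hypothetical example: a coupling that could sit anywhere in [0, 100]
# (arbitrary units) but is only life-permitting in [39.9, 40.1].
fraction = tuning_fraction((39.9, 40.1), (0.0, 100.0))
print(fraction)  # ~0.002, a unitless, percentage-like quantity
```

Whatever units the coupling carries cancel in the division, which is the sense in which the resulting number is "metaphysical" rather than physical.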

Read more…

Comments
Glad to have amused you, ba77 :) However, when science doesn't make sense to a scientist, it's occasionally worth checking whether it makes sense at all :) Just sayin' :p
Elizabeth Liddle
July 1, 2011 04:15 AM PDT
Elizabeth: 'but it’s not making any scientific sense at all to me!' HA HA HA,,, Thanks for the laugh!!!
bornagain77
July 1, 2011 04:14 AM PDT
I have to say, Robert Sheldon, I simply do not understand your last two articles at all. You seem, oddly to me, to be rubbishing traditional frequentist statistics ("We were taught to show restraint in our extra parameters with some gobbledy-gook about an F-test"), and recommending Bayesian approaches instead. Well, I'm all for Bayesian approaches, but they too come with problems, not least being the determination of your priors. To quote William Dembski:
The approach to design detection that I propose eliminates chance hypotheses when the probability of matching a suitable pattern (i.e., specification) on the basis of these chance hypotheses is small and yet an event matching the pattern still happens (i.e., the arrow lands in a small target). This eliminative approach to statistical rationality, as epitomized in Fisher’s approach to significance testing, is the one most widely employed in scientific applications. Nevertheless, there is an alternative approach to statistical rationality that is at odds with this eliminative approach. This is the Bayesian approach, which is essentially comparative rather than eliminative, comparing the probability of an event conditional on a chance hypothesis to its probability conditional on a design hypothesis, and preferring the hypothesis that confers the greater probability. I’ve argued at length elsewhere that Bayesian methods are inadequate for drawing design inferences.48 Among the reasons I’ve given is the need to assess prior probabilities in employing these methods, the concomitant problem of rationally grounding these priors, and the lack of empirical grounding in estimating probabilities conditional on design hypotheses.
http://www.designinference.com/documents/2005.06.Specification.pdf

The problem with Fisherian statistics is not that you can fit as many parameters as you want (you can't - you lose a degree of freedom for each one, and the supply of degrees of freedom is finite), but that specifying the null adequately is often question-beggingly difficult. The good thing about Bayesian approaches is that your competing hypotheses are much more explicit.

And then, about those "unitless quantities": yes, they are literally meta-physical in the sense that they are a way of describing the relationship between quantities rather than the quantities themselves, but so what? We use unitless quantities in quantitative analyses all the time, as well as normalized quantities, which are also meta-physical in the same sense: standard deviations, for instance. What relevance does this have to your argument? You write:
Let me recap the multiverse-theory logic again. In order to explain a most unusual distribution of matter and energy in our own universe, we hypothesize an infinite array of universes arranged as on a giant bingo card in the Grand Casino, where every possible solution is available somewhere. It is a theory that says everyone's a winner, and therefore none are special. But if the destination isn't so special any more, then the journey is. In such a reality, we have merely traded our special position for speciality of speed, traded x for dx/dt, replaced first order with second order. We've allowed all modes to exist, and therefore the fastest growing mode dominates. And what is the fastest growing mode? Boltzmann brains are an example of a mode that is more probable than intact humans popping out of the void, but it isn't the fastest growing mode. If energy is the limited resource, then energy-gobbling modes will be the fastest. If information is a limited resource, then information-sequestering modes will be the fastest. If the Grand Casino were an African savannah, then these modes would be the predators, and naive eco-sensitive conservationists the prey. If indeed the multiverse-theory be correct, then the vast array of worlds would be dominated by the Destroyers, the Devourers, the Shiva of universes, finding ever more ruthless ways to consume our energy, our matter, our information into the Great 11-dimensional Borg.
You seem to be conveying the picture of an infinite field of non-contingent events in which monkeys typing Shakespeare must occur in one of them, and even Shakespeare writing Shakespeare must occur in one of them. And even Shakespeare typing Shakespeare, I guess. But that is a strawman. What at least one version of multiverse theory does is to propose that there are large numbers (perhaps an infinite number) of universes, each with an arbitrary set of parameters (randomly drawn from some celestial distribution), and that in only a tiny subset of those will the parameters be such that heavy elements form, and that only in this tiny subset is life possible, and with life, intelligent lifeforms capable of asking the question: "is ours the only universe?" That is quite different from the idea that there are universes that don't have laws governing contingencies, and that bizarre things like brains without bodies will occur with greater probability than brains with bodies. Your prose is exhilarating to read, but it's not making any scientific sense at all to me!
Elizabeth Liddle
July 1, 2011 03:47 AM PDT
But in addition, metaphysical quantities are percentages, integrals, they involve a comparison of areas or volumes, as in Bayesian hypothesis testing we are making a ratio of the range to the domain of a fit variable.
I don't follow this; can you explain, please?
Heinrich
July 1, 2011 03:04 AM PDT
Yes, Dr. Sheldon, your insight has applications throughout this blog in the ongoing discussions, so thanks.
CannuckianYankee
June 30, 2011 07:44 PM PDT
Rob Sheldon, I always look forward to your posts. This one is no exception and is perhaps one of your pinnacle contributions. Thank you again!
MedsRex
June 30, 2011 06:56 PM PDT
Dr. Sheldon, simply poetic!
bornagain77
June 30, 2011 04:33 PM PDT
From the article:

Even before infinite time, will it not become smart enough that it is indistinguishable from the Biblical God? And could not such a super-intelligent being use the same physics of 11 dimensions to make as many "special" universes as it wanted, including ours? And would not this planned creation of universes be ever-so-much more likely than the Boltzmann brain version of our universe? Therefore doesn't multiverse-theory necessarily make a planned and purposeful creation not just probable but necessary?

And there we have it. Fantastic post, Rob Sheldon. I think you can go a lot further with this than you do, but man, you went far enough as is.
nullasalus
June 30, 2011 02:08 PM PDT
Many on here are using the supernatural as a strawman tactic. However, our very universe is the product of a supernatural origin, and that supernatural origin is recognized by mainstream science.

Supernatural: (of a manifestation or event) attributed to some force beyond scientific understanding or the laws of nature ("a supernatural being"); unnaturally or extraordinarily great ("a woman of supernatural beauty").

How did the universe originate? "There was only one Big Bang singularity, and it contained the whole Universe." http://www.physlink.com/education/askexperts/ae649.cfm

So what is the definition of a singularity? Singularity (plural: singularities): the state, fact, quality, or condition of being singular ("he believed in the singularity of all cultures"); a peculiarity or odd trait; a point at which a function takes an infinite value, especially in space-time when matter is infinitely dense, as at the center of a black hole; a point in the future (often set at or around 2030 A.D.) beyond which overwhelming technical changes (especially the development of superhuman artificial intelligence) make reliable predictions impossible.

Was the singularity of nature? "In short then, a singularity represents an infinity and we generally don't think nature is infinite." If it is not of nature, then it is some force beyond the scientific understanding of the laws of nature [see above]. http://www.physlink.com/education/askexperts/ae251.cfm

If nature is not infinite, and the beginning of the universe [the singularity] was infinite, then the beginning of the universe was not of nature. And if supernatural is defined as "some force beyond scientific understanding or the laws of nature," then the beginning of the universe is scientifically definable, and can be recognized as supernatural.
junkdnaforlife
June 30, 2011 02:03 PM PDT
