Uncommon Descent Serving The Intelligent Design Community

Dawkins on arguments pointing to God


Ran across this clip at Christian Post:

Atheist author and evolutionary biologist Richard Dawkins says the best argument for God he’s ever heard has to do with a deistic God as the fine-tuner of the universe . . . .

Dawkins prefaced his answer by making it clear that he is not “in any sense admitting that there is a good argument,” and insisted that “there is no decent argument for the existence of deities.” . . . .

“It’s still a very, very bad argument, but it’s the best one going,” he added, noting that a major problem with the argument is that it leaves unexplained where the fine tuner came from.

As for evolution, however, he said there is simply no argument at all that he can consider.

“There are reasons why people don’t get it, such as the time scale involved is so huge. People find it difficult to grasp how long a time has been available for the changes that are talked about,” the evolutionary biologist asserted . . .

What do we think, why? END

PS: Kindly cf the discussion below, particularly 24 (also 42) and 64. What we see here is a rhetorical attempt to push ethical theism beyond the BATNA windows of the conventional wisdom on what counts as acceptable thinking, by posing in a confident manner as a celebrity intellectual (without having to account for the — on fair comment — puerile, strawmannish character of his own anti-theistic arguments):

[Image: Overton window / PC cave diagram]

Comments
F/N: Putting the worldviews challenge posed by Dawkins in claiming there are no good arguments to God (though fine tuning seems to give him pause) back on the table:
http://research.avondale.edu.au/cgi/viewcontent.cgi?article=1029&context=teach
http://www.angelfire.com/pro/kairosfocus/resources/Intro_phil/toolkit.htm#intro --> Note tips here: http://www.angelfire.com/pro/kairosfocus/resources/Intro_phil/toolkit.htm#tips
http://www.kingsgrantbaptist.com/documents/WorldviewGeisler.pdf
http://www3.dbu.edu/naugle/pdf/WV-HistyTheolImplications.pdf
We are dealing with a worldviews issue, and we need to ponder the debate and talking points on arguments pointing to God in that light. Let me again point to Plantinga on two dozen or so theistic arguments: https://www.calvin.edu/academic/philosophy/virtual_library/articles/plantinga_alvin/two_dozen_or_so_theistic_arguments.pdf Notice where he begins these lecture notes:
I've been arguing that theistic belief does not (in general) need argument either for deontological justification, or for positive epistemic status, (or for Foley rationality or Alstonian justification); belief in God is properly basic.
[--> a reasonable worldview start-point, especially on one's experience of reality or one's community's experience of reality and of encounter with God . . . here the millions transformed by encounter with God in the face of the risen Christ i/l/o the witness of the 500 to the birth, life, ministry, passion and resurrection in fulfillment of 100's of OT prophecies etc is very much in the background]
But doesn't follow, of course that there aren't any good arguments. Are there some? At least a couple of dozen or so. Swinburne: good argument one that has premises that everyone knows. Maybe aren't any such arguments: and if there are some, maybe none of them would be good arguments for anyone. (Note again the possibility that a person might, when confronted with an arg he sees to be valid for a conclusion he deeply disbelieves from premises he know to be true, give up (some of) those premises: in this way you can reduce someone from knowledge to ignorance by giving him an argument he sees to be valid from premises he knows to be true.) These arguments are not coercive in the sense that every person is obliged to accept their premises on pain of irrationality. Maybe just that some or many sensible people do accept their premises (oneself). What are these arguments like, and what role do they play? They are probabilistic, either with respect to the premises, or with respect to the connection between the premises and conclusion, or both. They can serve to bolster and confirm ('helps' a la John Calvin); perhaps to convince. Distinguish two considerations here: (1) you or someone else might just find yourself with these beliefs; so using them as premises get an effective theistic arg for the person in question. (2) The other question has to do with warrant, with conditional probability in epistemic sense: perhaps in at least some of these cases if our faculties are functioning properly and we consider the premises we are inclined to accept them; and (under those conditions) the conclusion has considerable epistemic probability (in the explained sense) on the premises.
Food for thought. KF
kairosfocus
January 17, 2016, 03:53 AM PDT
And so* . . . ? *[As in, on what grounds is it irrelevant, useless or not a reasonable physicist's step to explore the sensitivity of the relevant expansion equations to the value of mass density, and to find the extreme sensitivity of one part in 10^24 or so significant? Or that, earlier, the sensitivity was apparently even greater, at 1 part in 10^59 or so? Why would a reasonable person reject that this is a classic example of what is meant by the "natural" emergence of the fine tuning issue during scientific investigations, and of its SCIENTIFIC -- cosmological -- significance?]
kairosfocus
January 17, 2016, 03:32 AM PDT
Mass of an aggregate, typically, is a very contingent matter.
Yes, but 1 nanosecond after the Big Bang is not quite typical.
daveS
January 16, 2016, 06:25 PM PDT
PPS: Let me note this is not about changing laws of physics but about the mass per unit volume, to a factor of order 1 part in 10^24. Mass of an aggregate, typically, is a very contingent matter.
kairosfocus
January 16, 2016, 05:48 PM PDT
DS, the case illustrates fine tuning in the context of a cosmos hosting life such as ours at this point on its timeline, with actually two levels of sensitivity. At t0 + 1 ns, a single grain of fairly coarse sand's weight relative to the cosmos density is the difference between over-expansion and early collapse, about 1 part in 2 × 10^24. This speaks to fine tuning and uses the sensitivity analysis approach. At an earlier point we are looking at 1 part in 10^59, which is far beyond that degree. KF
PS: This illustrates a specific case of fine tuning on its own terms, at the other end from 4%.
kairosfocus
January 16, 2016, 12:19 PM PDT
KF, I'm well aware of these examples of course, but they don't address any of my questions in post #41. Incidentally, I do find it intriguing that you quote a datapoint referring to the state of the universe 1 nanosecond after the Big Bang, yet steadfastly hold to a neutral position on the standard vs. Bible-based (YEC) timelines you discuss on your website. Yet I'm the selectively hyperskeptical one.
daveS
January 16, 2016, 11:07 AM PDT
DS, there is a concrete example just above, showing sensitivity analysis and fine tuning, a case that would meet any of the descriptive responses you seem to wish to deride by snatching out of context while obviously arising during a scientific investigation. KF
kairosfocus
January 16, 2016, 10:51 AM PDT
KF,
eye opening
We can add this to "intriguing", "impressive", and "suggestive".
daveS
January 16, 2016, 09:09 AM PDT
PPS: An almost at random illustrative case: http://www.astro.ucla.edu/~wright/cosmo_03.htm#FO
>>The figure above shows a(t) for three models with three different densities at a time 1 nanosecond after the Big Bang. The black curve shows a critical density case that matches the WMAP-based concordance model, which has density = 447,225,917,218,507,401,284,016 gm/cc at 1 ns after the Big Bang [--> the margin noted next is about 1 grain of sand's weight in a then very small volume]. Adding only 0.2 gm/cc to this 447 sextillion gm/cc causes the Big Crunch to be right now! Taking away 0.2 gm/cc gives a model with a matter density Ω_M that is too low for our observations. Thus the density 1 ns after the Big Bang was set to an accuracy of better than 1 part in 2235 sextillion. Even earlier it was set to an accuracy better than 1 part in 10^59! Since if the density is slightly high, the Universe will die in an early Big Crunch, this is called the "oldness" problem in cosmology. And since the critical density Universe has flat spatial geometry, it is also called the "flatness" problem -- or the "flatness-oldness" problem. Whatever the mechanism for setting the density to equal the critical density, it works extremely well, and it would be a remarkable coincidence if Ω_o were close to 1 but not exactly 1. >> (Look at the image at the linked page, please.)
--> Does this put a handle on one case, just one among dozens? A grain of sand's difference in the universe's density 1 ns after the singularity [when the observable universe was, let me clarify, comparatively tiny] spells the difference between runaway expansion and an already crunched-down cosmos.
kairosfocus
January 16, 2016, 08:39 AM PDT
KF,
DS, The set C and/or R, i.e. the complex numbers and the reals, especially — given issues of ranges of values — the near neighbourhoods of the values standardised in our models.
Pardon, but that's very nonspecific. Furthermore, how do you justify taking "near" neighborhoods of measured constants in our universe? How do you test this assumption? I can reel off many potential configurations that don't lie in the set you described. For example, why can't we have differing numbers of constants in some universes? And how do we know the configuration space is infinite? Now let me stress again: I don't reject fine-tuning out of hand, it's just that it leads to questions which even you must admit are currently unanswerable.
daveS
January 16, 2016, 08:39 AM PDT
DS, the set C and/or R, i.e. the complex numbers and the reals, especially -- given issues of ranges of values -- the near neighbourhoods of the values standardised in our models. Way back in 6th form, I remember discussions on the r^2 term in inverse square law forces. Yesterday, I was looking at evaluations and tests of the rest mass of the photon, which is theoretically zero but is open to being evaluated. As for the origin of the cosmos, that is a significant scientific focus for cosmology.
I found it eye opening to see discussion of how the universe we inhabit came about and has the structures and patterns it has, including getting to the periodic table of elements and the significance of cosmological parameters for C-Chemistry, aqueous medium, cell based life. Seeing the big picture that unifies, or at least asking about it, is a key scientific issue. And a worldviews issue -- including cleansing ourselves from Plato's cave message dominance and policy dominance ideological agendas and games. Games that may well lead us into marches of folly.
Yes, a lot of people are less familiar with such broad and deep topics, but I can vividly recall the impact of standing in the dark night with a child to discuss a lunar eclipse in progress and the significance of the Milky Way running roughly N-S as a galactic disk, with centre in Sagittarius. Sol system, galaxy, galaxies, stars and more -- these point to big and interesting issues, as well as to how such appear in the celestial sphere, especially something as familiar as the night sky . . . literally a nightly, full colour universe show courtesy your friendly local Creator. The Heavens declare . . . Understanding our world and its roots is no small, insignificant thing. KF
PS: The interested may want to look here: http://iose-gen.blogspot.com/2010/06/cosmology-and-timelines-of-world.html#cosmointro
kairosfocus
January 16, 2016, 08:01 AM PDT
KF, You use this and similar phrases several times:
... the space of the relevant variables etc ...
This is part of what I asked for earlier. Can you specify mathematically the space of the relevant variables? Edit: I'll echo what I stated way back in #8: I don't reject the fine-tuning argument altogether, but in your previous post you touch on several "linked" issues, including the decidedly nonmundane problem of the origin of the universe(!) That's a perfect illustration of my point.
daveS
January 16, 2016, 07:13 AM PDT
DS, I again point out what you seem utterly unwilling to acknowledge exists. Sensitivity analysis of models etc is STANDARD practice, to see how things behave if the variables, parameters etc move away from whatever values were standardised. When that was done, it soon became evident that the physics of the observed cosmos is at a deeply isolated operating point in the space of the relevant variables etc. That is striking as a finding on the various equations etc used to construct the explanatory models. And it is relevant directly to their use, given the facts on precision and accuracy. Where, relevant mathematical topics, techniques and praxis, as well as those of modelling, are always relevant to and inextricably intertwined with science. It also raises questions as to how such came about, and those questions embrace both science and philosophy. Such also obtains for string theory and multiverse models, both of which are major foci in science.
The other point I indicate is a strictly logical one. When any system of propositions entails a conclusion that people find unacceptable, a fairly common move is to infer ~U and so deny one or more of the Ps. Sometimes it is reasonable, other times it is unreasonable. In either case, the issue then becomes, what implied commitments are now on the table, and how do they stand up to the alternatives per comparative difficulties. With the question of selective hyperskepticism a legitimate part of that, if a double standard is being exerted on what one will accept vs reject. And, when New Atheism, so called, is explicitly on the table -- as in Dawkins et al -- motive is always going to be an important and legitimate question.
Let me put up how Plantinga discusses the same point:
Note again the possibility that a person might, when confronted with an arg he sees to be valid for a conclusion he deeply disbelieves from premises he know to be true, give up (some of) those premises: in this way you can reduce someone from knowledge to ignorance by giving him an argument he sees to be valid from premises he knows to be true.
That sort of induced ignorance will always be open to question and needs explaining, especially if the rejected premises are such that they would not be questioned in another reasonably related context.
On the fine tuning issue, the issue on the table is that this matter did not come about from Hoyle and Fowler doing metaphysics, but astrophysics. The two neighbouring resonances for C and O turn out to be crucial in making these two elements no. 4 and no. 3 in cosmic abundance. Sensitivity assessment on the resonances shows that 4% one way, 1/2% another way, and bingo, the circumstance that supports a water based, C-Chemistry life architecture is undermined. Sensitivity analysis. Not esoteric metaphysical speculation. Standard praxis in ever so many fields of analysis, tied closely to analysis of observational or manufacturing errors and error propagation. Red flag issue, if suddenly dismissed or studiously ignored as relevant to the discussion.
Quite similar to how, suddenly [well, for years now among design inference objectors in the circle around UD], pointing out that a sol system of 10^57 atoms or an observed cosmos of 10^80 atoms is not going to be able to credibly search blindly in a config space of 10^150 or 10^301 respectively meets with attempts to dismiss or worse. Never mind the easily established point that there is such a thing as functionally specific, information rich configuration that can have high complexity beyond that sort of threshold. Or, that such Wicken wiring diagram nodes and arcs meshes, constrained by requisites of functionality rooted in tightly coupled interactions of parts, will confine us to very narrow zones in the field of possible clumped and/or scattered configs. What I followed GP, and likely Dembski, in referring to as islands of function [in wide seas of non-function].
Never mind, further, that the idea of some golden search being hit on blindly, one that makes all of this irrelevant in a Darwin's pond or the like, runs into the problem that a blind search somehow selects a subset of a config space. Instantly, we know that the set of subsets of a space of cardinality C is of exponentially higher cardinality, 2^C. Dembski was right to highlight that search for a good search, S4S, faces a much harder challenge than direct search. (He of course used a more elaborate argument; I am simplifying to draw out the problem.)
Then, the implication of there being a proposed iconic tree of life in effect demands that we have a vast continent of incrementally accessible functional forms, never mind what just the protein fold domain in AA sequence space data tells us about a large number of small, structurally isolated domains, so prevalent that we have a lot of such divergences between neighbouring species, including, I gather, us and chimps. Body plans, credibly, come in islands of function, and the fossil evidence of commonplace stasis, gaps, sudden appearances etc supports that.
Coming back full circle, the first point on the table is that fine tuning -- once we start from plain vanilla things like sensitivity analysis -- is credibly real, is generally accepted as real, and the issue that most have therefore addressed is where it comes from. Multiverse vs design is the main debate. The challenge that the multiverse faces is that a multiverse generator is itself fine tuned, including, BTW, a forcing law that would lock the constants in our sub-cosmos. Next, our cosmos is at a deeply isolated operating point in the relevant parameter space.
Which is there once the math is on the table and we have no grounds to lock the numbers in the way pi is locked. (Think about how Einstein put in a term [the cosmological constant -- in crude terms the yeast bubbler term that pushes the expansion rate of the "loaf" that has galaxies in it like raisins] to get rid of expansion in, was it, 1915-16 for GTR cosmological models, only to have this term become a part of the overall discussion after red shift was put on the table, and then again as fine tuning came up, as to what value and why.) That brings up how expansion points to a finitely remote beginning and begs the question: begin-ner. Yes, there are converging issues here. Multiple pieces need to fit together coherently.
Beyond that, we see the point that in a quasi-infinite multiverse a Boltzmann brain popping up in a local domain would be more likely to be observed than what we observe. And yes, that is a debate out there. Clipping Albrecht and Sorbo:
A century ago Boltzmann considered a “cosmology” where the observed universe should be regarded as a rare fluctuation out of some equilibrium state. The prediction of this point of view, quite generically, is that we live in a universe which maximizes the total entropy of the system consistent with existing observations. Other universes simply occur as much more rare fluctuations. This means as much as possible of the system should be found in equilibrium as often as possible. From this point of view, it is very surprising that we find the universe around us in such a low entropy state. In fact, the logical conclusion of this line of reasoning is utterly solipsistic. The most likely fluctuation consistent with everything you know is simply your brain (complete with “memories” of the Hubble Deep fields, WMAP data, etc) fluctuating briefly out of chaos and then immediately equilibrating back into chaos again. This is sometimes called the “Boltzmann’s Brain” paradox.
The linked issue of inflation is also fine tuned to get to the sort of cosmos we see. Leslie's carpet of flies on the wall is likewise far more likely to be hit than the locally lone fly that gets swatted by a bullet . . . and tack-driver rifles and marksmen able to use them are incredibly fine tuned. We are in a zone with a swatted lone fly. That is not going to just go away, and it is appropriate to speak of that as fine tuning. And, a very simple and powerful candidate explanation for fine tuning is a marksman who knows how to swat it with a bullet, armed with a tack driver of a rifle. Which, per the OP, is something that the likes of Dawkins will find extremely unwelcome. And in this thread, I always have the OP in mind as the main focus. KF
kairosfocus
January 16, 2016, 07:01 AM PDT
KF: And as I have repeatedly pointed out, no one is stopping you from using these arguments. Would you characterize Hoyle's conclusion that his results were impressive and suggestive as "science"?
But then, if one used to accept P1, P2 . . . Pn and discovers that they jointly entail U an unwelcome conclusion, some will reason ~ U so relevant Pj is rejected. The issue then becomes, what price rejection.
This looks to fall in the category of motive mongering to me. Do I regularly claim to be able to read your mind? I don't think so.
daveS
January 15, 2016, 07:12 PM PDT
DS, sensitivity analysis -- as has been repeatedly pointed out -- is standard procedure, not something exotic or dubious. That is exactly what appears in Hoyle's discussion of the C and O resonances tied to the abundance of these elements, which he found quite impressive and suggestive. But then, if one used to accept P1, P2 . . . Pn and discovers that they jointly entail an unwelcome conclusion U, some will reason ~U, so the relevant Pj is rejected. The issue then becomes, what price rejection. KF
kairosfocus
January 15, 2016, 02:08 PM PDT
KF,
DS, pardon but there you go again. The fine tuning issue is a mathematical one, with broad implications. It needs to be addressed.
I'm not saying that you shouldn't address it. You can explore the impacts of constants having whatever values you like. I am saying some of us are not likely to find the results especially persuasive, at least relative to more mundane evidence for design. Maybe you do, but you are not me. I originally posted in this thread because at first I was slightly surprised that Dawkins himself said fine-tuning is the best argument for the existence of God that he's heard.
daveS
January 15, 2016, 09:20 AM PDT
DS, pardon, but there you go again. The fine tuning issue is a mathematical one, with broad implications. It needs to be addressed. So does the related point that the observed cosmos had a beginning a finite time ago, pointing to its contingency and need for a begin-ner. Next, the concept in question has long since been discussed. As for the search challenge implied by a large space of possibilities, that has been on record since Dembski in NFL. Let me quote him:
p. 148:“The great myth of contemporary evolutionary biology is that the information needed to explain complex biological structures can be purchased without intelligence. My aim throughout this book is to dispel that myth . . . . Eigen and his colleagues must have something else in mind besides information simpliciter when they describe the origin of information as the central problem of biology. I submit that what they have in mind is specified complexity [[cf. here below], or what equivalently we have been calling in this Chapter Complex Specified information or CSI . . . . Biological specification always refers to function. An organism is a functional system comprising many functional subsystems. . . . In virtue of their function [[a living organism's subsystems] embody patterns that are objectively given and can be identified independently of the systems that embody them. Hence these systems are specified in the sense required by the complexity-specificity criterion . . . the specification can be cashed out in any number of ways [[through observing the requisites of functional organisation within the cell, or in organs and tissues or at the level of the organism as a whole.
[NB: Dembski cites: Wouters, p. 148: "globally in terms of the viability of whole organisms," Behe, p. 148: "minimal function of biochemical systems," Dawkins, pp. 148 - 9: "Complicated things have some quality, specifiable in advance, that is highly unlikely to have been acquired by random chance alone. In the case of living things, the quality that is specified in advance is . . . the ability to propagate genes in reproduction." On p. 149, he roughly cites Orgel's famous remark from 1973, which exactly cited reads: In brief, living organisms are distinguished by their specified complexity. Crystals are usually taken as the prototypes of simple well-specified structures, because they consist of a very large number of identical molecules packed together in a uniform way. Lumps of granite or random mixtures of polymers are examples of structures that are complex but not specified. The crystals fail to qualify as living because they lack complexity; the mixtures of polymers fail to qualify because they lack specificity . . . And, p. 149, he highlights Paul Davies in The Fifth Miracle: "Living organisms are mysterious not for their complexity per se, but for their tightly specified complexity."] . . .”
p. 144: [[Specified complexity can be more formally defined:] “. . . since a universal probability bound of 1 [[chance] in 10^150 corresponds to a universal complexity bound of 500 bits of information, [[the cluster] (T, E) constitutes CSI because T [[ effectively the target hot zone in the field of possibilities] subsumes E [[ effectively the observed event from that field], T is detachable from E, and T measures at least 500 bits of information . . . ”
2^500 = 3.27 × 10^150, 2^1,000 = 1.07 × 10^301, where, per his universal bound estimate, the number of observable states of 10^80 atoms in 10^25 s acting as observers at a rate of 10^45/s is something like 10^150. This is similar to bounds estimated by Abel. In short, 1,000 bits will exhaust the universe with a search of less than 1 part in 10^150 of the space of 1,000 bits. The sol system of 10^57 atoms, and at a more reasonable rate of 10^12 - 14 acts/s, will be exhausted by 500 bits.
The common implicit notion is that there is a golden search out there. Dembski discussed search for a good search, S4S; a simplified view is that a search is a subset of a set. So, the cardinality of a set being C, that of the set of searches will be 2^C, the power set. Search for a good search is exponentially harder than the already hard direct search. This is the context in which it is fair to conclude that cosmos or sol system scope resources would be swamped by the scope to be searched. For 1,000 bits, comparing -- over 10^17 s, at 10^12 - 14 tries per atom per second -- the scope of search to that of the config space, the ratio would be as one straw to a haystack that dwarfs the observed cosmos. Blind search is not a good tool for anything of complexity beyond 143 ASCII 7-bit characters, or 125 bytes.
This is a key part of the context where FSCO/I is only seen to come about by intelligence acting based on insight and imagination. As we see from simply composing text for comments here. KF
PS: You can label sensitivity analysis as speculative all you want. It is not. Without good reason to hold that parameters and constants, values of things like mass of the cosmos etc, are locked as the value of pi is, it is reasonable to explore the impacts of their having held differing values. And, the presence of an undetected super law that locks them would imply a higher level of fine tuning for that law, at least as exacting as the cumulative exactness of the laws it locks. Which is so, whether or not you are interested in it.
kairosfocus
January 15, 2016, 08:35 AM PDT
KF,
DS, First, if you simply state a preference, why enter a debate on a broad issue and act as though that very broadness is a flaw?
Two things: 1) As a common man/layperson, I'm not the only one who finds arguments concerning fine-tuning of the fundamental constants unconvincing. If you show me something a little more concrete that fits in a petri dish, you would have a better chance of convincing me. This is just an FYI from a member of your target audience---some of us prefer this sort of evidence. 2) It's not the broadness of topic that I'm talking about. Rather, I'm referring to arguments which are by their very nature speculative (IMHO), which I referred to above.
Five years back, in a discussion initiated by the pseudonymous Mathgrrl, I went with VJT and others and used a threshold metric form that is self explanatory and simple in showing how at 500 or 1000 bits, relevant search resources are exhausted.
This seems like something worthy of publication. Have any of the heavy-hitters, actual working researchers, acknowledged your calculation?
daveS
January 15, 2016, 07:52 AM PDT
DS, First, if you simply state a preference, why enter a debate on a broad issue and act as though that very broadness is a flaw? Second, maybe you need to see Mr Ewert's further remarks on discussion. The problem with your characterisation of a descriptive phrase is that it pretends it is an idiosyncrasy introduced by some dubious bloggist who needs to be put in his place by being demanded to publish an abbreviation in peer reviewed literature deeply hostile to design thought. Neat dismissive talking point. Not so fast.
I am certain you have been present when it has been repeatedly pointed out that the concept in fact is 40 years old as something used in this context and is due to Orgel and Wicken, who spoke to specified complexity in a bio-functional context and to how wiring diagram based, functionally specific organisation is information rich. This concept was picked up by Thaxton et al in the early 80s in drafting TMLO and antedates Dembski et al. In NFL, in discussing specified complexity, Dembski clearly states and cites as to how in the biological context such is cashed out in terms of function. Meyer speaks to specified function in the context of being information rich specified complexity. Durston et al speak to random, ordered and functional sequence complexity and provide a quantification. Five years back, in a discussion initiated by the pseudonymous Mathgrrl, I went with VJT and others and used a threshold metric form that is self explanatory and simple in showing how at 500 or 1000 bits, relevant search resources are exhausted.
So, as far as I am concerned, any need for the concept to be in the literature has long since been passed, and the insistence on a dismissive talking point in the teeth of repeated correction is an indicator of selective hyperskepticism rather than any serious objection. I use an abbreviation that is convenient, FSCO/I, with reasonable basis. Deal with it on a reasonable basis please, not with a selectively hyperskeptical talking point dismissal in the teeth of correction and concrete demonstrations of the reality the phrase describes. Starting with the exploded view of an Abu 6500 3C reel, the node-arc process flow network of an oil refinery, the nature of text strings in English etc or machine readable code, and the process flow network involved in cellular metabolism, or just the code based synthesis of proteins in the ribosome.
With all due respect, the bottom line is that insistent dismissive denial of evident and manifest reality because one does not like a descriptive abbreviation for a summary phrase is a sign of selective hyperskepticism, not serious discussion. KF
kairosfocus
January 15, 2016, 07:13 AM PDT
KF,
Saying you have a preference elsewhere does not change what the case is.
Absolutely. It's just my preference.
FSCO/I is real [etc]
Maybe so, but I share Winston Ewert's position on this concept. Have you tried submitting a paper on FSCO/I to BIO-Complexity or some other journal? Has the acronym ever appeared in print?
Are you at least willing to accept that serious thinkers may have good reason for thinking that way?
I do understand that serious thinkers, even skeptics such as Steven Weinberg, acknowledge that fine-tuning is an argument to be reckoned with.
daveS
January 15, 2016, 06:48 AM PDT
F/N: In the face of a worldviews challenge from Dawkins, some may find the discussion here helpful: http://nicenesystheol.blogspot.com/2010/11/unit-2-gospel-on-mars-hill-foundations.html#u2_bld_wvu KF
kairosfocus
January 15, 2016, 06:47 AM PDT
DS: The issue raised in the OP and the relevant evidence are what they are. Dawkins has spoken to fine tuning (and -- let me add -- to a grand worldviews assertion that there are no good arguments pointing to the reality of God), you have raised several points, and I have summarised the case in response to the context. Saying you have a preference elsewhere does not change what the case is. Though, it is significant to note that there is a very specific point I have made regarding sensitivity analysis, and a wider point about complex interactive multi-component entities that require the parts to be co-adapted and properly organised to work together.
Beyond all this, there is a longstanding, fairly specific point long made: FSCO/I is real and on a trillion member basis reliably points to design as cause. This is backed by a needle in haystack, blind search challenge which supports why the observations are as they are. Cell based life, and major body plans, are full of FSCO/I. Is that not, at minimum, reason to be open to the possibility that cell based life is designed? Likewise, a significant number of eminent thinkers in the field see fine tuning as a reality of the cosmos that supports C chem, aqueous medium, cell based life. There is a debate on what that points to. Some think, selection effects in a multiverse -- others point to Boltzmann brains and the lone fly vs carpet of flies issue. Others are at minimum open to the point that such fine tuning points to design. Are you at least willing to accept that serious thinkers may have good reason for thinking that way? That then feeds the rope vs chain issue. KF
PS: A worldviews level case is generally a cumulative, convergent, mutually supportive evidence argument by comparative difficulties. In that context the range of evidence relevant to the reality of God and the reasonableness of believing in God will all be relevant. So will be what Dawkins et al seem to typically fail to recognise: there are implied worldview commitments and implications in rejecting the cluster of evidences and linked arguments that are held to point to God. The cumulative commitment in their new atheism is open to very serious challenge. And that needs to also come to the table, for every tub must stand on its own worldview bottom. That includes addressing, for instance, the necessity of our having responsible rational freedom as a basis for rational discussion, and it includes a coherent accounting for our being under moral government. It also includes a major historical and current case in point on which millions claim life transforming experience of God: Jesus of Nazareth, risen from death as vindicated Messiah, with 500 unbreakable witnesses and an eyewitness lifetime record, as well as the resulting global movement. In this context, issues of implying mass -- even grand, beyond Plato's cave type -- delusion and deception for human rationality are very relevant.
PPS: Such worldviews comparative difficulties issues include factual adequacy across a worldview range, coherence [logical and dynamical], and balanced explanatory power in a grand discussion of best explanations across competing alternatives. Where Dawkins' confident-manner assertion that there are no good arguments pointing to God is as grand a worldview assertion as one gets. I hardly need to underscore that his God Delusion was widely rebutted from various informed directions as grossly inadequate philosophically. Indeed, as in effect sophomoric or even strawmannish.
kairosfocus
January 15, 2016, 06:18 AM PDT
KF, Your post #25 illustrates the point I am trying to make. As I stated in my post to EA above, I would be most persuaded by a minimal example of design. In #25 (and elsewhere in this thread), you bring in every topic under the sun. If we're talking about scientific evidence for design, do we need to discuss objective moral law and Jesus??
daveS
January 15, 2016, 05:59 AM PDT
EA,
daveS: I would agree that evidence for design in biology is stronger than evidence for design in cosmology. However, not because it is on “a more modest scale.” Rather because it includes additional factors or evidences.
Well, I'm saying that I would find evidence on a more modest scale more persuasive. Not necessarily that there is very little biological evidence. Experiments which could be done in a lab in the context of the current known laws of physics. A minimal example of design, if you will. That's what I would like to see.
daveS
January 15, 2016, 05:41 AM PDT
It’s still a very, very bad argument, but it’s the best one going,” he added, noting that a major problem with the argument is that it leaves unexplained where the fine tuner came from.
Well, I find it silly when people want to squeeze reality into their small heads. Especially divine reality. He contradicts himself, because with regard to evolution he admits that people cannot comprehend the scale of things (while he can, of course). How then can he not apply the same to his own absurd claim that there must be an explanation of God? Who told him that must be the case? It's the same old distinction between scientists and poets noted by Chesterton. The professional illnesses of scientists are paranoid hyperskepticism and the inability to see the differences between reality and a scientific model.
EugeneS
January 15, 2016, 02:47 AM PDT
F/N: There is a principle, that one should not take counsel of one's opponents, should not fight on ground of their choosing (i.e. if ambushed a first task is to break out of the kill zone), and should not allow such to frame the terms of a discussion or debate. In applying that principle, it leads me to the contrast between a chain and a rope: a chain is no stronger than its weakest link, but because a rope depends on the interaction of its components starting with individual fibres, the strength of the whole is far more than the strength of each fibre. This is often forgotten in a world of shadow show games, dismissive talking point tactics, deep ideologically driven polarisation and closed minded selective hyperskepticism. Accordingly, let me put on the table a useful skeletal summary by Wallace:
(1) The Temporal Nature of the Cosmos (Cosmological)
(a) The Universe began to exist
(b) Anything that begins to exist must have a cause
(c) Therefore, the Universe must have a cause
(d) This cause must be eternal (uncaused), non-spatial, immaterial, atemporal, and personal (having the ability to willfully cause the beginning of the universe)
(e) The cause fits the description we typically assign to God
(2) The Appearance of Design (Teleological)
(a) Human artifacts (like watches [--> I add, don't forget Paley's thought exercise of the self-replicating time keeping watch in Ch 2 of his nat theol]) are products of intelligent design
(b) Many aspects and elements of our universe resemble human artifacts
(c) Like effects typically have like causes
(d) Therefore, it is highly probable the appearance of design in the Universe is simply the reflection of an intelligent designer
(e) Given the complexity and expansive nature of the Universe, this designer must be incredibly intelligent and powerful (God)
(3) The Existence of Objective Moral Truth (Axiological)
(a) There is an objective (transcendent) moral law
(b) Every law has a law giver
(c) Therefore, there is an objective (transcendent) moral law giver
(d) The best explanation for this objective (transcendent) law giver is God
(4) The Existence of Absolute Laws of Logic (Transcendent)
(a) The laws of logic exist
i. The laws of logic are conceptual laws
ii. The laws of logic are transcendent
iii. The laws of logic pre-existed humans
(b) All conceptual laws reflect the mind of a law giver
(c) The best and most reasonable explanation for the kind of mind necessary for the existence of the transcendent, objective, conceptual laws of logic is a transcendent, objective, eternal Being (God)
(5) The Unique Nature of Our World and Universe (Anthropic)
(a) Our universe appears uniquely designed so:
i. Life can exist
ii. This same life can examine the universe
(b) This unique design cannot be the result of random chance or unguided probabilities
(c) There is, therefore, a God who designed the universe to support human life and reveal His existence as creator of the Cosmos
In addition, we have the fact of millions of people who attest to having encountered the living God in life transforming ways through the experience of believing the gospel of Christ rendered credible through the testimony of the 500 witnesses. If just one is right, God encountered in the face of Jesus is real. But if one is willing to argue to mass delusion, that then raises severe questions about the reliability of the deliverances of human thought and conscious experiences. A road no wise person will go down. And, there are more facets, many more. KF
kairosfocus
January 15, 2016, 02:28 AM PDT
EA & DS: I would actually highlight that both the cosmological and the biological design inferences are examples of arguments to design as causal process. This, in light of the phenomenon of functionally specific complex organisation of many parts to form a coherent whole. A whole that a little sensitivity analysis or playing with tolerances will soon show (mathematically for the cosmos, empirically for biosystems) is at a deeply isolated operating point in a configuration space. Both arguments are quite strong to the mind not bent on not seeing their force.
(For my part, I am haunted by Jesus' warning to certain people: "BECAUSE I tell the truth, you cannot understand/acknowledge/receive what I am saying." There is such a thing as the fallacy of the closed, selectively hyperskeptical, en-darkened mind. Too often, such will only be inclined to change through existential crisis where, struck down off their high horses on the road to a Damascus of their agenda, they are shocked by pain and light. But we have no general promise of such an encounter, only a warning that refusal to heed the evidence of the world without and the mind, heart and conscience within leaves us without excuse and liable to a debased, en-darkened mind that feeds a benumbed conscience and a life out of moral control. We must address our responsibility before evident truth we do or should know, not the hopeless task of appealing to and persuading those determined not to see things other than in terms of the shadow shows of some Plato's cave or other.)
The biological and cosmological domains are indeed epistemologically independent, but interacting. To get to observed cell based life, you have to get to circumstellar, habitable zone terrestrial planets in spiral galaxy habitable zones, in a cosmos where H, He, O and C as well as N are sufficiently abundant and where you meet dozens and dozens of varied criteria. Then you have to get through a Darwin's pond or the like to ground OOL. As in, origin of C-Chemistry, aqueous medium, metabolic cells with integral communication, control and regulatory systems embracing the additional factor of codes and the further factor of an integrated von Neumann kinematic self-replicator facility. Onward, you have to account for multicellular, far more informationally complex body plan based life forms with plans that have to be embryologically AND ecologically viable from the outset . . . or they never get started. For these, materialist just so stories told while dressed up in a lab coat are not good enough.
But, the cosmological side is not to be despised. DS, you full well know that sensitivity analysis is a commonplace feature of design studies, as happens routinely in, say, electronics . . . as in, a red band on a resistor means 2%, likely expensive metal oxide; a gold band, 5%; silver, 10%; none, 20%. Where, you had better design with sensitivity and drift in mind, without over-much dependence on a technician to keep things in whack (especially if selling to consumers). And, when one sees multiple components that are mutually coupled in interactive ways and that one is dealing with a system at a tight operating point, that is a strong sign of co-ordinating intent. Let me again clip Kreeft from his argument no 8, adapting Norris Clarke:
Starting point. This world is given to us as a dynamic, ordered system of many active component elements. Their natures (natural properties) are ordered to interact with each other in stable, reciprocal relationships which we call physical laws. For example, every hydrogen atom in our universe is ordered to combine with every oxygen atom in the proportion of 2:1 (which implies that every oxygen atom is reciprocally ordered to combine with every hydrogen atom in the proportion of 1:2). So it is with the chemical valences of all the basic elements. So too all particles with mass are ordered to move toward every other according to the fixed proportions of the law of gravity. In such an interconnected, interlocking, dynamic system, the active nature of each component is defined by its relation with others, and so presupposes the others for its own intelligibility and ability to act. Contemporary science reveals to us that our world-system is not merely an aggregate of many separate, unrelated laws, but rather a tightly interlocking whole, where relationship to the whole structures and determines the parts. The parts can no longer be understood apart from the whole; its influence permeates them all. Argument. In any such system as the above (like our world) no component part or active element can be self-sufficient or self-explanatory. For any part presupposes all the other parts—the whole system already in place—to match its own relational properties. It can’t act unless the others are there to interact reciprocally with it. Any one part could be self-sufficient only if it were the cause of the whole rest of the system—which is impossible, since no part can act except in collaboration with the others. Nor can the system as a whole explain its own existence, since it is made up of the component parts and is not a separate being, on its own, independent of them. So neither the parts nor the whole are self-sufficient; neither can explain the actual existence of this dynamically interactive system . . .
John Leslie points out:
One striking thing about the fine tuning is that a force strength or a particle mass often appears to require accurate tuning for several reasons at once. Look at electromagnetism. Electromagnetism seems to require tuning for there to be any clear-cut distinction between matter and radiation; for stars to burn neither too fast nor too slowly for life’s requirements; for protons to be stable; for complex chemistry to be possible; for chemical changes not to be extremely sluggish; and for carbon synthesis inside stars (carbon being quite probably crucial to life). Universes all obeying the same fundamental laws could still differ in the strengths of their physical forces, as was explained earlier, and random variations in electromagnetism from universe to universe might then ensure that it took on any particular strength sooner or later. Yet how could they possibly account for the fact that the same one strength satisfied many potentially conflicting requirements, each of them a requirement for impressively accurate tuning? [Our Place in the Cosmos, 1998 (courtesy Wayback Machine)]
Robin Collins:
Suppose we went on a mission to Mars, and found a domed structure in which everything was set up just right for life to exist. The temperature, for example, was set around 70 °F and the humidity was at 50%; moreover, there was an oxygen recycling system, an energy gathering system, and a whole system for the production of food. Put simply, the domed structure appeared to be a fully functioning biosphere. What conclusion would we draw from finding this structure? Would we draw the conclusion that it just happened to form by chance? Certainly not. Instead, we would unanimously conclude that it was designed by some intelligent being. Why would we draw this conclusion? Because an intelligent designer appears to be the only plausible explanation for the existence of the structure. That is, the only alternative explanation we can think of–that the structure was formed by some natural process–seems extremely unlikely. Of course, it is possible that, for example, through some volcanic eruption various metals and other compounds could have formed, and then separated out in just the right way to produce the “biosphere,” but such a scenario strikes us as extraordinarily unlikely, thus making this alternative explanation unbelievable.
The universe is analogous to such a “biosphere,” according to recent findings in physics . . . . Scientists call this extraordinary balancing of the parameters of physics and the initial conditions of the universe the “fine-tuning of the cosmos” . . . For example, theoretical physicist and popular science writer Paul Davies–whose early writings were not particularly sympathetic to theism–claims that with regard to basic structure of the universe, “the impression of design is overwhelming” (Davies, 1988, p. 203) . . . A few examples of this fine-tuning are listed below:
1. If the initial explosion of the big bang had differed in strength by as little as 1 part in 10^60, the universe would have either quickly collapsed back on itself, or expanded too rapidly for stars to form. In either case, life would be impossible. [See Davies, 1982, pp. 90-91. (As John Jefferson Davis points out (p. 140), an accuracy of one part in 10^60 can be compared to firing a bullet at a one-inch target on the other side of the observable universe, twenty billion light years away, and hitting the target.)]
2. Calculations indicate that if the strong nuclear force, the force that binds protons and neutrons together in an atom, had been stronger or weaker by as little as 5%, life would be impossible. (Leslie, 1989, pp. 4, 35; Barrow and Tipler, p. 322.)
3. Calculations by Brandon Carter show that if gravity had been stronger or weaker by 1 part in 10 to the 40th power, then life-sustaining stars like the sun could not exist. This would most likely make life impossible. (Davies, 1984, p. 242.)
4. If the neutron were not about 1.001 times the mass of the proton, all protons would have decayed into neutrons or all neutrons would have decayed into protons, and thus life would not be possible. (Leslie, 1989, pp. 39-40)
5. If the electromagnetic force were slightly stronger or weaker, life would be impossible, for a variety of different reasons. (Leslie, 1988, p. 299.) . . .
Sir Fred Hoyle, on the impact of the first fine tuning case in point:
From 1953 onward, Willy Fowler and I have always been intrigued by the remarkable relation of the 7.65 MeV energy level in the nucleus of 12 C to the 7.12 MeV level in 16 O. If you wanted to produce carbon and oxygen in roughly equal quantities by stellar nucleosynthesis, these are the two levels you would have to fix, and your fixing would have to be just where these levels are actually found to be. Another put-up job? . . . I am inclined to think so. A common sense interpretation of the facts suggests that a super intellect has “monkeyed” with the physics as well as the chemistry and biology, and there are no blind forces worth speaking about in nature.[F. Hoyle, Annual Review of Astronomy and Astrophysics, 20 (1982): 16.]
Hugh Ross, Canadian Astrophysicist and champion of Old Earth Creationism, aptly uses the picture of tuning the resonance circuits of a radio to give a bit more context:
As you tune your radio, there are certain frequencies where the circuit has just the right resonance and you lock onto a station. The internal structure of an atomic nucleus is something like that, with specific energy or resonance levels. If two nuclear fragments collide with a resulting energy that just matches a resonance level, they will tend to stick and form a stable nucleus. Behold! Cosmic alchemy will occur! In the carbon atom, the resonance just happens to match the combined energy of the beryllium atom and a colliding helium nucleus. Without it, there would be relatively few carbon atoms. Similarly, the internal details of the oxygen nucleus play a critical role. Oxygen can be formed by combining helium and carbon nuclei, but the corresponding resonance level in the oxygen nucleus is half a percent too low for the combination to stay together easily. Had the resonance level in the carbon been 4 percent lower, there would be essentially no carbon. Had that level in the oxygen been only half a percent higher, virtually all the carbon would have been converted to oxygen. Without that carbon abundance, neither you nor I would be here. [[Beyond the Cosmos (Colorado Springs, Colo.: NavPress Publishing Group, 1996), pg. 32.]
And, as for the notion that it is just one or two oddballs out there -- BTW, Hoyle, as holder of a Nobel-equivalent prize, carries enough weight that even his lone voice should give pause -- here is Luke Barnes:
There are a great many scientists, of varying religious persuasions, who accept that the universe is fine-tuned for life, e.g. Barrow, Carr, Carter, Davies, Dawkins, Deutsch, Ellis, Greene, Guth, Harrison, Hawking, Linde, Page, Penrose, Polkinghorne, Rees, Sandage, Smolin, Susskind, Tegmark, Tipler, Vilenkin, Weinberg, Wheeler, Wilczek. They differ, of course, on what conclusion we should draw from this fact . . .
Note his context:
The fine-tuning of the universe for intelligent life has received much attention in recent times. Beginning with the classic papers of Carter (1974) and Carr & Rees (1979), and the extensive discussion of Barrow & Tipler (1986), a number of authors have noticed that very small changes in the laws, parameters and initial conditions of physics would result in a universe unable to evolve and support intelligent life . . . .
The claim that the universe is fine-tuned can be formulated as: FT: In the set of possible physics, the subset that permit the evolution of life is very small. [--> notice, this is not an "anti-evolution" argument, though I would suggest the broader form, life-permitting configurations in the field of possibilities] FT can be understood as a counterfactual claim, that is, a claim about what would have been. Such claims are not uncommon in everyday life. For example, we can formulate the claim that Roger Federer would almost certainly defeat me in a game of tennis as: "in the set of possible games of tennis between myself and Roger Federer, the set in which I win is extremely small”. This claim is undoubtedly true, even though none of the infinitely-many possible games has been played.
Our formulation of FT, however, is in obvious need of refinement. What determines the set of possible physics? Where exactly do we draw the line between "universes”? How is "smallness” being measured? Are we considering only cases where the evolution of life is physically impossible or just extremely improbable? What is life? We will press on with our formulation of FT as it stands, pausing to note its inadequacies when appropriate.
As it stands, FT is precise enough to distinguish itself from a number of other claims for which it is often mistaken. FT is not the claim that this universe is optimal for life, that it contains the maximum amount of life per unit volume or per baryon, that carbon-based life is the only possible type of life, or that the only kinds of universes that support life are minor variations on this universe. These claims, true or false, are simply beside the point. The reason why FT is an interesting claim is that it makes the existence of life in this universe appear to be something remarkable, something in need of explanation. The intuition here is that, if ours were the only universe, and if the causes that established the physics of our universe were indifferent to whether it would evolve life, then the chances of hitting upon a life-permitting universe are very small. As Leslie (1989, pg. 121) notes, “[a] chief reason for thinking that something stands in special need of explanation is that we actually glimpse some tidy way in which it might be explained” . . .
Walter Bradley gives the wider context, by laying out some general "engineering requisites" for a life-habitable universe; design specifications, so to speak:
- Order to provide the stable environment that is conducive to the development of life, but with just enough chaotic behavior to provide a driving force for change.
- Sufficient chemical stability and elemental diversity to build the complex molecules necessary for essential life functions: processing energy, storing information, and replicating. A universe of just hydrogen and helium will not "work."
- Predictability in chemical reactions, allowing compounds to form from the various elements.
- A "universal connector," an element that is essential for the molecules of life. It must have the chemical property that permits it to react readily with almost all other elements, forming bonds that are stable, but not too stable, so disassembly is also possible. Carbon is the only element in our periodic chart that satisfies this requirement.
- A "universal solvent" in which the chemistry of life can unfold. Since chemical reactions are too slow in the solid state, and complex life would not likely be sustained as a gas, there is a need for a liquid element or compound that readily dissolves both the reactants and the reaction products essential to living systems: namely, a liquid with the properties of water. [[Added note: Water requires both hydrogen and oxygen.]
- A stable source of energy to sustain living systems in which there must be photons from the sun with sufficient energy to drive organic, chemical reactions, but not so energetic as to destroy organic molecules (as in the case of highly energetic ultraviolet radiation).
Such requisites, met in the context of a finely tuned observed cosmos, plainly make design of the cosmos a plausible view, even if it is in the context of what has been termed a multiverse. As D. Halsmer, J. Asper, N. Roman and T. Todd observe of water -- a seemingly simple wonder molecule whose properties go back to the roots of the physics of the cosmos and then spread out into a wide range of factors setting the solvent context for cell-based life, to the point that NASA hunts first for water:
The remarkable properties of water are numerous. Its very high specific heat maintains relatively stable temperatures both in oceans and organisms. As a liquid, its thermal conductivity is four times any other common liquid, which makes it possible for cells to efficiently distribute heat. On the other hand, ice has a low thermal conductivity, making it a good thermal shield in high latitudes. A latent heat of fusion only surpassed by that of ammonia tends to keep water in liquid form and creates a natural thermostat at 0°C. Likewise, the highest latent heat of vaporization of any substance - more than five times the energy required to heat the same amount of water from 0°C-100°C - allows water vapor to store large amounts of heat in the atmosphere. This very high latent heat of vaporization is also vital biologically because at body temperature or above, the only way for a person to dissipate heat is to sweat it off. Water's remarkable capabilities are definitely not only thermal. A high vapor tension allows air to hold more moisture, which enables precipitation. Water's great surface tension is necessary for good capillary effect for tall plants, and it allows soil to hold more water. Water's low viscosity makes it possible for blood to flow through small capillaries. A very well documented anomaly is that water expands into the solid state, which keeps ice on the surface of the oceans instead of accumulating on the ocean floor. Possibly the most important trait of water is its unrivaled solvency abilities, which allow it to transport great amounts of minerals to immobile organisms and also hold all of the contents of blood. It is also only mildly reactive, which keeps it from harmfully reacting as it dissolves substances. Recent research has revealed how water acts as an efficient lubricator in many biological systems from snails to human digestion. By itself, water is not very effective in this role, but it works well with certain additives, such as some glycoproteins. The sum of these traits makes water an ideal medium for life. Literally, every property of water is suited for supporting life. It is no wonder why liquid water is the first requirement in the search for extraterrestrial intelligence. All these traits are contained in a simple molecule of only three atoms. One of the most difficult tasks for an engineer is to design for multiple criteria at once. ... Satisfying all these criteria in one simple design is an engineering marvel. Also, the design process goes very deep since many characteristics would necessarily be changed if one were to alter fundamental physical properties such as the strong nuclear force or the size of the electron. [["The Coherence of an Engineered World," International Journal of Design & Nature and Ecodynamics, Vol. 4(1):47-65 (2009). HT: ENV.]
In short, the elegantly simple water molecule is set to a finely balanced, life-facilitating operating point, based on fundamental forces and parameters of the cosmos -- forces that had to be built in from the formation of the cosmos itself. Such fine-tuning from the outset strongly suggests a purpose to create life in the cosmos from its beginning. Moreover, the authors also note how C, H and O just happen to be the fourth, first and third most abundant atoms in the cosmos, helium -- the first noble gas -- being number two. This -- again on fundamental parameters and laws of our cosmos -- does not suggest a mere accident of happy coincidence:
The explanation has to do with fusion within stars. Early [[stellar, nuclear fusion] reactions start with hydrogen atoms and then produce deuterium (mass 2), tritium (mass 3), and alpha particles (mass 4), but no stable mass 5 exists. This limits the creation of heavy elements and was considered one of "God's mistakes" until further investigation. In actuality, the lack of a stable mass 5 necessitates bigger jumps of four which lead to carbon (mass 12) and oxygen (mass 16). Otherwise, the reactions would have climbed right up the periodic table in mass steps of one (until iron, which is the cutoff above which fusion requires energy rather than creating it). The process would have left oxygen and carbon no more abundant than any other element.
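To make the "jumps of four" concrete, the helium-burning chain usually cited in standard accounts (sketched here for illustration, not drawn from the quoted article) is:

$$
\begin{aligned}
{}^{4}\mathrm{He} + {}^{4}\mathrm{He} &\rightleftharpoons {}^{8}\mathrm{Be} \quad (\text{unbound, lifetime} \sim 10^{-16}\ \mathrm{s}),\\
{}^{8}\mathrm{Be} + {}^{4}\mathrm{He} &\rightarrow {}^{12}\mathrm{C} + \gamma \quad (\text{via the Hoyle resonance}),\\
{}^{12}\mathrm{C} + {}^{4}\mathrm{He} &\rightarrow {}^{16}\mathrm{O} + \gamma.
\end{aligned}
$$

Because there is no stable mass-5 (or mass-8) nucleus, the chain is forced through these alpha-capture steps of four mass units at a time, which is why carbon and oxygen end up far more abundant than the elements lying between helium and carbon.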
In short, sensitivity analysis has a power all of its own, and has caught serious and widespread attention among quite weighty thinkers. Indeed, that is a context in which multiverse suggestions have gained wide currency. There are, though, multiple problems with such suggestions. For one, a Boltzmann Brain popping up in a limited domain and fading away would be far more "likely"; similarly, so would finding ourselves in John Leslie's portion of the wall carpeted with flies. Instead, pretty commonplace sensitivity analysis on the math -- commonplace enough that it is refusal to do it that would flag itself as odd -- shows that we are at a complex and, locally at least, deeply isolated operating point in the domain of the physics that sets up a cosmos. That is one part of a blazing light from the heavens that has knocked us off our high horse.

The other part is, we inhabit a cosmos that credibly had a finitely remote beginning, usually estimated at 10 - 20 BYA on several grounds, and refined to about 14 BYA -- more exactly, 13.75 BYA on cosmological expansion grounds. That which begins to exist is patently contingent, and it is therefore subject to cause beyond itself; not least, it points to causally necessary, enabling on/off factors that must be "on" for a beginning. In simple terms, when we hear a little bang, we instinctively turn around and look for the source. (Especially here in M/rat -- 20 years ago I remember being "spotted" by an airport worker at ANU as coming from here. Why? When a noise happens you all jump and look around. Our friend off to the S makes noises!) When we face a Big Bang, it is very reasonable to go looking for a big bang-er.

More broadly yet, we know atomic matter to be contingent. Contingency like that points to a necessary-being root, where -- were there ever an utter nothing -- such non-being would forever obtain. And yes, this is a wide-ranging worldviews case to be evaluated on comparative difficulties. Such is the nature of the beast. KF
kairosfocus
January 15, 2016 at 01:27 AM PDT
JC, have you studied the tyrannies of C20 -- especially Stalin, Hitler, Mao and Pol Pot? Or, the American and global Abortion holocaust? Or the power of the Marlboro man advertising campaign and its health consequences? Or more recent campaigns? The appeal to common experience, the common sense of being morally governed -- oops, where does that come from, and what is the implication of treating it as ultimately illusory and/or driven by blind forces of chance and/or necessity? -- and social consensus too often boils down to the amoral, nihilist credo: might and manipulation make 'right' and 'truth' etc. (It also implies that the morally based lone voice dissident [who generally has little power beyond his voice and courage to stand "contra mundum"] is always wrong by definition. The powerful and influential -- of course -- get away with pushing things because they have enough clout to play astro-turfing games and/or to manipulate sufficient numbers emotionally.)

That is, we are at a sadly familiar point of seeing the absurdity of socio-cultural relativism. That is, the door to manipulative marches of folly lies open, complete with the shadow shows of a Plato's cave -- ask yourself, who was buying those shows, and why? Likewise, who is paying for and who is controlling today's mass media equivalent and educational equivalent? Do you really want to leave morality in those hands? Where, further, as the case in Ac 27 highlights, such clearly includes the case where money/power interests and their bought and paid for technicos manipulate a "democratic" majority into a march of folly and injustice. Where too, the issue now is not so much "the majority is right," but "my bought and paid for majority/consensus (especially when that includes influential institutions) is right . . . for me." At least, until things blow up when the march of folly goes over the cliff.

Well did Plato warn us about intellectually fashionable evolutionary materialism 2350 years ago:
Ath. . . .[The avant garde philosophers and poets, c. 360 BC] say that fire and water, and earth and air [i.e the classical "material" elements of the cosmos], all exist by nature and chance, and none of them by art . . . [such that] all that is in the heaven, as well as animals and all plants, and all the seasons come from these elements, not by the action of mind, as they say, or of any God, or from art, but as I was saying, by nature and chance only [ --> that is, evolutionary materialism is ancient and would trace all things to blind chance and mechanical necessity] . . . . [Thus, they hold] that the principles of justice have no existence at all in nature, but that mankind are always disputing about them and altering them; and that the alterations which are made by art and by law have no basis in nature, but are of authority for the moment and at the time at which they are made.-
[ --> Relativism, too, is not new; complete with its radical amorality rooted in a worldview that has no foundational IS that can ground OUGHT, leading to an effectively arbitrary foundation only for morality, ethics and law: accident of personal preference, the ebbs and flows of power politics, accidents of history and the shifting sands of manipulated community opinion driven by "winds and waves of doctrine and the cunning craftiness of men in their deceitful scheming . . . " cf a video on Plato's parable of the cave; from the perspective of pondering who set up the manipulative shadow-shows, why.]
These, my friends, are the sayings of wise men, poets and prose writers, which find a way into the minds of youth. They are told by them that the highest right is might,
[ --> Evolutionary materialism -- having no IS that can properly ground OUGHT -- leads to the promotion of amorality on which the only basis for "OUGHT" is seen to be might (and manipulation: might in "spin") . . . ]
and in this way the young fall into impieties, under the idea that the Gods are not such as the law bids them imagine; and hence arise factions [ --> Evolutionary materialism-motivated amorality "naturally" leads to continual contentions and power struggles influenced by that amorality at the hands of ruthless power hungry nihilistic agendas], these philosophers inviting them to lead a true life according to nature, that is, to live in real dominion over others [ --> such amoral and/or nihilistic factions, if they gain power, "naturally" tend towards ruthless abuse and arbitrariness . . . they have not learned the habits nor accepted the principles of mutual respect, justice, fairness and keeping the civil peace of justice, so they will want to deceive, manipulate and crush -- as the consistent history of radical revolutions over the past 250 years so plainly shows again and again], and not in legal subjection to them.
The lessons of long, sad history -- bought and paid for with blood and tears -- speak, but are we listening? KF
kairosfocus
January 15, 2016 at 12:14 AM PDT
daveS: I would agree that evidence for design in biology is stronger than evidence for design in cosmology. However, not because it is on "a more modest scale." Rather because it includes additional factors or evidences. In addition to fine tuning generally, some of the additional strengths of the argument in biology just off the top of my head would be:
- Many billions of examples, versus a sample size of 1
- Design in biology is more experimentally supported, rather than just theoretically
- Biology is closer to our experiential realm (note that even ardent materialists like Dawkins admit biological systems "appear to be designed")
- Biological systems can be directly compared to systems that we know in fact were designed
- The resulting systems in biology are more complex and specified than the resulting systems in cosmology (e.g., a cell compared to a solar system)
- Perhaps most key: biological systems contain and operate on the basis of digitally coded information
- Finally, it is stronger because design in biology holds, even if the cosmos were formed by accident
----
That said, the argument for fine tuning in the cosmos is quite strong, and has compelled a fair number of formerly materialistic scientists to question their materialistic assumptions and consider the possibility of design, or at least admit the lack of a good materialistic explanation for the origin of the cosmos.
Eric Anderson
January 14, 2016 at 10:33 PM PDT
KF: "JC, propose another IS that can ground ought at world foundation level..." I did. For society to survive, individuals in the society must agree on a set of rules. You haven't explained why these can only be the result of an inherently good god. I assert that these rules are the result of human reasoning. Can you provide a reason why this can't workJonas Crump
January 14, 2016 at 05:50 PM PDT