Uncommon Descent Serving The Intelligent Design Community

Shermer vs. Nelson, Northern Arizona University, 16 November 2010


Michael Shermer and Paul Nelson

Michael Shermer and I are taking our ID versus Darwinian Evolution show back on the road, this time at Northern Arizona University in Flagstaff. The date is Tuesday, November 16, and the venue is Prochnow Auditorium; here are some details:

Debate on Evolution vs. Intelligent Design with Michael Shermer and Paul Nelson. This event is only open to NAU students, faculty and staff and is free with a ticket and ID. Tickets can be picked up at the NAU Central Ticket Office starting October 26. A limited number of tickets will be available at the door. Please… bring NAU ID with you to the event. This event is part of SUN Entertainment’s Lecture and Debate Series.

Here’s the Facebook entry for the event. If you’re NAU-connected, I’ll see you there!

Comments
Paul, bring up Shermer's "patternicity" if he talks about the pattern of universal common descent. Do these events evah take place in New England?

Joseph
November 6, 2010, 06:05 AM PDT
Dr. Nelson, I would like to see Shermer forced to defend himself from a "presuppositional" apologetic viewpoint, i.e., just how does he justify his reasoning from an atheistic worldview? The following site is an easy-to-use and easy-to-understand interactive website that takes the user through what is termed "presuppositional apologetics." The website clearly shows that our use of the laws of logic, mathematics, science and morality cannot be accounted for unless we believe in a God who guarantees our perceptions and reasoning are trustworthy in the first place.

Proof That God Exists - easy-to-use interactive website
http://www.proofthatgodexists.org/index.php

Materialism simply dissolves into absurdity when pushed to extremes, and certainly offers no guarantee that our perceptions and reasoning within science are trustworthy in the first place:

Dr. Bruce Gordon - The Absurdity Of The Multiverse & Materialism in General - video
http://www.metacafe.com/watch/5318486/

Here is an old favorite of mine that I just loaded on Vimeo:

The Christian Founders Of Science - Henry F. Schaefer III
http://vimeo.com/16523153

Of related note:

Science, Christianity & Richard Dawkins - John Lennox
http://www.metacafe.com/watch/4702666/

bornagain77
November 5, 2010, 03:44 AM PDT
Gp, thanks. Nice to see we are on the same page. What makes ID unique is that it involves forward-planned, top-down organization. I agree with you about functionality. That is what I am trying to get at when there is more than one specified feature or target working in tandem with another. That functionality marks a place where even higher specificity is inferred, and of course more complexity. At higher levels we may have irreducible complexity. This is a lot like what Meyer is trying to get at with his cellular arguments regarding hierarchical specificity in the design of the cell, and beyond into the actual body plans. But yeah, in those cases of functionality we are definitely seeing very complex specificity, which implies premeditated targets. That is what specificity is about: targets.

Frost122585
November 5, 2010, 01:24 AM PDT
Frost122585: Interesting contribution. I agree with you that defining specification is one of the important points in ID. But maybe it is not so difficult, if we stick to an empirical level and avoid philosophical or mathematical approaches. I have many times suggested a simple operative definition of functional specification for digital strings in my definition of dFSCI: "Any string for which a conscious observer can explicitly define a function, and give an objective way to measure it."

The important point here is that, although a conscious observer is needed to recognize the function, he has to be able to objectively define and measure it, so that everybody can share the definition and the measure. For instance, I have often suggested that an objective way to define a function for an enzyme is: "Any molecule which can accelerate such-and-such a reaction by at least such-and-such an amount in a lab setting."

Another important point is that there is no necessity that the defined function be really designed: any functional definition will do. In a recent discussion, my interlocutor suggested that his old computer could be used as a doorstop. Well, that's fine. That is a legitimate functional definition. That illustrates a final important point in the definition: more than one function can be defined for the same object. But the computation of the functional complexity (dFSCI) is specific to each defined function. So, an old computer is able to act as a doorstop, or to compute, but the complexity required for each of the two defined functions is different.

That brings us to the final definition of dFSCI: any string which can be read as a series of digital values, for which a function can be objectively defined and measured, and for which the functional complexity (the ratio of the search space to the functional space) is higher than an appropriate threshold, exhibits dFSCI. (For brevity, I have not mentioned the problem of compressibility, which can be well defined too.)

dFSCI is seen only in designed things. I am satisfied with this definition. It is completely empirical, and it does not rely on any complex philosophical assumption.

gpuccio
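The ratio-based quantity in the comment above can be sketched numerically. This is only an illustration of the stated arithmetic, not gpuccio's actual procedure: the function names, the 150-bit threshold, and the example space sizes are all hypothetical placeholders, since the comment leaves the threshold and the spaces unspecified.

```python
import math

def functional_complexity_bits(search_space_size, functional_space_size):
    """Functional complexity as the ratio of the search space to the
    functional space, expressed in bits: log2(search / functional).
    Computed as a difference of logs so huge integer sizes stay exact."""
    if functional_space_size <= 0 or search_space_size < functional_space_size:
        raise ValueError("functional space must be a nonempty subset of the search space")
    return math.log2(search_space_size) - math.log2(functional_space_size)

def exhibits_dfsci(search_space_size, functional_space_size, threshold_bits=150):
    """True when functional complexity exceeds the chosen threshold.
    The 150-bit default is an arbitrary placeholder for the
    'appropriate threshold' left open in the definition."""
    return functional_complexity_bits(search_space_size, functional_space_size) > threshold_bits

# Hypothetical example: a 100-residue protein over a 20-letter alphabet
# (search space 20**100), with an assumed 10**20 functional sequences.
bits = functional_complexity_bits(20**100, 10**20)  # roughly 366 bits
```

Note that the same object gets a different complexity value per defined function (the computer-as-doorstop point above): each function definition fixes its own functional space, and therefore its own ratio.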
November 5, 2010, 01:07 AM PDT
Paul, when you debate Shermer he will be sure to bring up the fossil record and simple Darwinian mechanisms as supposed proof that ID is not a scientific alternative theory of origins. But the one thing I find that Darwinists cannot handle is the fact that the improbability of all the necessary, super-complex biological features integral to the design of complex life cannot be explained by mere chance (improbability), nor by chance combined with the natural, predictable laws of physics alone. And that is because of the problem of "specificity."

I have been thinking about specificity for years now, as the fundamental term and component of the theory of ID, which either makes ID valid and cogent, or uncogent and ambiguous. It is not good enough just to say that something is specified; specificity must be scientifically defined. We all know the term specified complexity, which describes the two components of the biological information problem: first, its total improbability of arising on its own, due to its many features (complexity); and second, the fact that symmetries (evenly fit relationships) and functions of biological features just "seem" like they were designed, because they work so incredibly well and are often extremely necessary to the tasks of a biological entity (eyes, immune system, etc.).

So the key for ID has become to come up with a hard scientific definition of specificity which makes it readily detectable. Specificity, to me, is clearly when you have features which appear to be the end result of targets, and synergistic relationships which appear to be "a target between targets." Inter-working relationships within biological entities are the hallmark of specified complexity, because they show not just some complex specific features but, even more, a system of interconnection which clearly implies a design architecture.

Intelligent beings "premeditate" what it is they are going to design; hence, when we see a system that appears to be premeditated, we must consider that possibility compared to the alternative (undesigned natural law and chance). Design is a forward-looking, top-down process before it is a bottom-up one. That is what specificity is: the apparent target regions and synergistic relationships within structures that imply, or should be inferred as, the result of intelligent causation. Neo-Darwinism is the opposite assertion: that biological complexity originated by chance (which is a metaphysical assertion), from the bottom up alone.

So that is where I would rest the crux of the ID argument: chance alone cannot explain the origin of DNA. On the other hand, DNA not only has intense complexity beyond what one would expect nature to produce by chance, it also has specificity, which is the apparent synergistic relationship among its parts working together as one united nexus for the various purposes that DNA serves. Those implied target regions, structures designed by an intelligent agent from the top down, constitute a real alternative explanation to the current neo-Darwinian or evolutionary models.

To further illustrate the point: what we have within biological things are features that are specific to real-life purposes associated with those biological things. Those specific features clearly are the end result of a targeted process, especially given the intricate, synergistic mechanical role and relationship they play. The end result is not only improbable, and not only what you would not expect from Darwinian processes, but actually exactly what you would expect from a designing intelligence.

The only objection to the ID inference is the proposed mere "possibility" that, even though life forms appear designed, that appearance is not enough to infer that they are designed, because it is always "possible" that chance and natural laws somehow did it alone. This, however, is a fallacy: if in a court of law you were to convict someone on DNA evidence, you could always argue that it was somehow "possible" that the person was framed. Yet we know, as rational agents ourselves, that we must go where the evidence leads, and not reach for mere other possibilities as a substitute just because we may not "like" the answer we have found. Specificity, therefore, is the key tenet of the theory of ID. Once you make the case that design by intelligent agency is inferred from targeted features, you have won the debate with the modern scientific watchmaker argument.

Frost122585
November 4, 2010, 09:55 PM PDT
Shermer loves his strawman arguments, that's for sure. In his Nov 2010 SciAm article he writes admiringly of Hitchens because Hitchens asked, "How do Creationists explain blind salamanders?" Well, Mike, Creationists explain blind salamanders as the result of adaptation; that is, God provided His creations with the ability to adapt to their environment. Not one Creationist would doubt that traits can be lost. It is going the other way, getting sight to a population that never had it, that Creationists doubt. IOW, Paul, what you need is an assistant to count his strawman arguments.

Joseph
November 3, 2010, 06:09 PM PDT
