Informatics Information News

How is Bill Dembski’s Being as Communion doing?

Spread the love

Currently (9:00 am EST) in the top 100 in the Kindle store, despite the sweetheart deals offered this summer for buying the book.

For a thing to be real, it must be able to communicate with other things. If this is so, then the problem of being receives a straightforward resolution: to be is to be in communion. So the fundamental science, indeed the science that needs to underwrite all other sciences, is a theory of communication. Within such a theory of communication the proper object of study becomes not isolated particles but the information that passes between entities. In Being as Communion philosopher and mathematician William Dembski provides a non-technical overview of his work on information. Dembski attempts to make good on the promise of John Wheeler, Paul Davies, and others that information is poised to replace matter as the primary stuff of reality. With profound implications for theology and metaphysics, Being as Communion develops a relational ontology that is at once congenial to science and open to teleology in nature. All those interested in the intersections of theology, philosophy and science should read this book.

Here’s part of a review a reader sent:

Dembski leaves nothing to chance, not even chance itself. He is also a mathematician, so he looks at chance from the perspective of probability theory. He sees chance events through the law of large numbers and probability distribution. When looking at any event, we may prematurely assume—taken in isolation—that the event is (strictly speaking) random; however, in looking at all events aggregately, the probability distribution of those events will begin to show a pattern. He writes:

“For instance, as a coin is tossed repeatedly, the proportion of heads will tend to ½. This stable pattern to coin tossing is justified both theoretically (various probabilistic laws of “large numbers” confirm it) and practically (when people flip coins a large number of times, they tend to see roughly the same proportion of heads and tails).”
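Dembski’s claim here is just the law of large numbers, and it is easy to check numerically. Below is an illustrative simulation (not from the book; the fair-coin probability of ½ and the fixed seed are assumptions of the example):

```python
import random

def heads_proportion(n_tosses, seed=0):
    """Flip a simulated fair coin n_tosses times; return the proportion of heads."""
    rng = random.Random(seed)  # fixed seed so the run is reproducible
    heads = sum(rng.random() < 0.5 for _ in range(n_tosses))
    return heads / n_tosses

# The proportion drifts toward 1/2 as the number of tosses grows.
for n in (10, 1_000, 100_000):
    print(n, heads_proportion(n))
```

Running it shows the proportion of heads settling near ½ as the toss count increases, exactly the stable pattern the quote describes.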

Information, Dembski writes, “is produced when certain possibilities are realized to the exclusion of others within a matrix of possibility…. It follows that information can be measured.”

See also: Brief excerpt from Being as Communion

Also How is Steve Meyer’s Darwin’s Doubt doing? (Continues to lead, and Christians defending Darwin continue to detract.)

Thought: Will Christians defending Darwin actually read Being as Communion first? Detract later?

Follow UD News at Twitter!

101 Replies to “How is Bill Dembski’s Being as Communion doing?”

  1. 1
    Tamara Knight says:

    Information, Dembski writes, “is produced when certain possibilities are realized to the exclusion of others within a matrix of possibility…. It follows that information can be measured.”

    Do you have a page number for where he tells us how to measure it?

  2. 2
    DiEb says:

    Congratulations – though it is just #262,411 in the Kindle store now…

    It ranks #77 in “Kindle Store > Kindle eBooks > Nonfiction > Politics & Social Sciences > Philosophy > Logic & Language”…

    Either there is a hell of a lot of fluctuation, or this is just another example of Denyse O’Leary’s reading skills.

  3. 3
    News says:

    Yes, DiEb at 1, that’s the metric referred to. Just like Meyer is #2 in a similarly specific area. My guess is Dembski’s book will probably climb.

  4. 4
    kairosfocus says:

    TK, please start with the 101 here, in my always linked. Have you done any info theory? If so, you should instantly recognise what Dembski was summarising in what you clipped to imply dismissal. KF

  5. 5
    tragic mishap says:

    Still working my way through this one. I would like to see more conversation on it.

    He says it’s the third book in a trilogy starting with The Design Inference and No Free Lunch, but I’m finding it quite different from those books. It’s less of a hardcore, logical argument and more of an appeal. It feels like he’s appealing to common values shared by IDists, TEs and atheists who have doubts about materialism.

    I’m not sure he has made his argument very straightforward or clear, but I’m only halfway through. He’s arguing that we should think about reducing reality to information instead of matter, while making it clear he believes information is reducible to God and/or free will. But he wants to argue for information first so as to find common ground with people who don’t believe in God. Basically it sounds like someone arguing for something he doesn’t quite believe, and so it doesn’t contain the crisp, clear reasoning I’m used to from his books.

    But again, I haven’t finished it.

  6. 6
    bornagain77 says:

    There are very good ‘scientific’ reasons for believing that the entire universe is ultimately reducible to information in its foundational basis.

    “Physics is the only real science. The rest are just stamp collecting.”
    — Ernest Rutherford

    From the best scientific evidence we now have, from multiple intersecting lines of evidence, we now have very good reason to believe that the entire universe came instantaneously into origination at the Big Bang. Not only was all mass-energy brought into being, but space-time itself was also instantaneously brought into being at the Big Bang,,,

    “Every solution to the equations of general relativity guarantees the existence of a singular boundary for space and time in the past.”
    (Hawking, Penrose, Ellis) – 1970

    “All the evidence we have says that the universe had a beginning.” –
    (Paper announced at Hawking’s 70th birthday party – characterized as the “Worst birthday present ever”)
    Cosmologist Alexander Vilenkin of Tufts University in Boston – January 2012

    Thus it logically follows that whatever brought the universe into being had to be transcendent of space-time, mass-energy. Yet the only thing that we know of that is completely transcendent of space-time, matter-energy is information.

    “One of the things I do in my classes, to get this idea across to students, is I hold up two computer disks. One is loaded with software, and the other one is blank. And I ask them, ‘what is the difference in mass between these two computer disks, as a result of the difference in the information content that they possess’? And of course the answer is, ‘Zero! None! There is no difference as a result of the information. And that’s because information is a mass-less quantity. Now, if information is not a material entity, then how can any materialistic explanation account for its origin? How can any material cause explain its origin?
    And this is the real and fundamental problem that the presence of information in biology has posed. It creates a fundamental challenge to the materialistic, evolutionary scenarios because information is a different kind of entity that matter and energy cannot produce.
    In the nineteenth century we thought that there were two fundamental entities in science: matter and energy. At the beginning of the twenty-first century, we now recognize that there’s a third fundamental entity, and it’s ‘information’. It’s not reducible to matter. It’s not reducible to energy. But it’s still a very important thing that is real; we buy it, we sell it, we send it down wires.
    Now, what do we make of the fact, that information is present at the very root of all biological function? In biology, we have matter, we have energy, but we also have this third, very important entity; information. I think the biology of the information age, poses a fundamental challenge to any materialistic approach to the origin of life.”
    -Dr. Stephen C. Meyer earned his Ph.D. in the History and Philosophy of science from Cambridge University for a dissertation on the history of origin-of-life biology and the methodology of the historical sciences.

    Intelligent design: Why can’t biological information originate through a materialistic process? – Stephen Meyer – video
    http://www.youtube.com/watch?v=wqiXNxyoof8

    Thus the question becomes ‘was information used to bring space-time, mass-energy into being?’,,, Simple enough question, but how do we prove it? It turns out that quantum mechanics and quantum teleportation have shed light directly on this question!,,,
    Both Wheeler and Zeilinger, from their work in quantum mechanics, have made strong claims that everything in the universe is ultimately reducible to information,,,

    “it from bit” Every “it”— every particle, every field of force, even the space-time continuum itself derives its function, its meaning, its very existence entirely—even if in some contexts indirectly—from the apparatus-elicited answers to yes-or-no questions, binary choices, bits. “It from bit” symbolizes the idea that every item of the physical world has a bottom—a very deep bottom, in most instances, an immaterial source and explanation, that which we call reality arises in the last analysis from the posing of yes-no questions and the registering of equipment—evoked responses, in short all matter and all things physical are information-theoretic in origin and this is a participatory universe.”
    – Princeton University physicist John Wheeler (1911–2008) (Wheeler, John A. (1990), “Information, physics, quantum: The search for links”, in W. Zurek, Complexity, Entropy, and the Physics of Information (Redwood City, California: Addison-Wesley))

    Why the Quantum? It from Bit? A Participatory Universe?
    Excerpt: In conclusion, it may very well be said that information is the irreducible kernel from which everything else flows. Thence the question why nature appears quantized is simply a consequence of the fact that information itself is quantized by necessity. It might even be fair to observe that the concept that information is fundamental is very old knowledge of humanity, witness for example the beginning of gospel according to John: “In the beginning was the Word.”
    Anton Zeilinger – a leading expert in quantum teleportation:
    http://www.metanexus.net/archi.....linger.pdf

    Breakthroughs in quantum teleportation have gone one step further and have actually reduced photons to quantum information and ‘teleported’ them to another location in the universe.

    How Teleportation Will Work –
    Excerpt: In 1993, the idea of teleportation moved out of the realm of science fiction and into the world of theoretical possibility. It was then that physicist Charles Bennett and a team of researchers at IBM confirmed that quantum teleportation was possible, but only if the original object being teleported was destroyed. — As predicted, the original photon no longer existed once the replica was made.
    http://science.howstuffworks.c.....ation1.htm

    Quantum Teleportation – IBM Research Page
    Excerpt: “it would destroy the original (photon) in the process,,”
    http://www.research.ibm.com/qu.....portation/

    In fact an entire human can, theoretically, be reduced to quantum information and teleported to another location in the universe:

    Quantum Teleportation Of A Human? – video
    https://vimeo.com/75163272

    Will Human Teleportation Ever Be Possible?
    As experiments in relocating particles advance, will we be able to say, “Beam me up, Scotty” one day soon? By Corey S. Powell|Monday, June 16, 2014
    Excerpt: Note a fascinating common thread through all these possibilities. Whether you regard yourself as a pile of atoms, a DNA sequence, a series of sensory inputs or an elaborate computer file, in all of these interpretations you are nothing but a stack of data. According to the principle of unitarity, quantum information is never lost. Put them together, and those two statements lead to a staggering corollary: At the most fundamental level, the laws of physics say you are immortal.
    http://discovermagazine.com/20.....eportation

    Thus not only is information not reducible to an energy-matter basis, as is presupposed in Darwinism, but in actuality both energy and matter ultimately reduce to an information basis, as is presupposed in Christian Theism (John 1).

    Verse and Music:

    John 1:1-4
    In the beginning was the Word, and the Word was with God, and the Word was God. He was in the beginning with God. All things were made through Him, and without Him nothing was made that was made. In Him was life, and the life was the light of men.

    Moreover, photons are also shown to now be dependent on a ‘non-local’, beyond space and time, cause for their actions in the universe,,,

    What Does Quantum Physics Have to Do with Free Will? – By Antoine Suarez – July 22, 2013
    Excerpt: What is more, recent experiments are bringing to light that the experimenter’s free will and consciousness should be considered axioms (founding principles) of standard quantum physics theory. So for instance, in experiments involving “entanglement” (the phenomenon Einstein called “spooky action at a distance”), to conclude that quantum correlations of two particles are nonlocal (i.e. cannot be explained by signals traveling at velocity less than or equal to the speed of light), it is crucial to assume that the experimenter can make free choices, and is not constrained in what orientation he/she sets the measuring devices.
    To understand these implications it is crucial to be aware that quantum physics is not only a description of the material and visible world around us, but also speaks about non-material influences coming from outside the space-time.,,,
    https://www.bigquestionsonline.com/content/what-does-quantum-physics-have-do-free-will

    Needless to say, these findings are not compatible with atheistic materialism

    Verse and Music:

    Hebrews 1:3
    The Son is the radiance of God’s glory and the exact representation of his being, sustaining all things by his powerful word.

    Goo Goo Dolls: All That You Are
    https://www.youtube.com/watch?v=9uQu5FTZ668

  7. 7
    Joe says:

    Tamara Knight:

    Do you have a page number for where he tells us how to measure it?

    Here you go Tamara. Enjoy!

  8. 8
    Tamara Knight says:

    Have you done any info theory? If so, you should instantly recognise what Dembski was summarising in what you clipped to imply dismissal.

    In the Shannon sense most certainly. I’ve been in the electronics business since a 300 baud modem was a desirable piece of equipment. But the clip I commented on seems to bear no relationship to the Shannon sense. How much information is in a human genome? How much more in the sum of everybody’s genomes?

  9. 9
    Adapa says:

    Joe

    Here you go Tamara. Enjoy!

    Interesting. You contend Dembski defines “information” by the Shannon metric which considers message length only and specifically excludes any meaning the message may carry.

    Where did Dembski say that?

  10. 10
    tragic mishap says:

    I forgot one really big question I would ask Dembski if I could.

    Do you think that there is any real difference between natural law and what Nagel calls “teleological law”?

    If the basic stuff of the physical universe is random and all order is imposed by laws which resolve the randomness, then isn’t it the case that all laws are teleological?

  11. 11
    keith s says:

    Denyse:

    Yes, DiEb at 1, that’s the metric referred to. Just like Meyer is #2 in a similarly specific area. My guess is Dembski’s book will probably climb.

    No, Denyse. You wrote this:

    Currently (9:00 am EST) in the top 100 in the Kindle store,

    That is not correct, as DiEb points out.

    Will you add a postscript to the OP explaining your error and correcting it?

  12. 12
    Joe says:

    Tamara Knight:

    How much information is in a human genome?

    More than blind and unguided processes can account for.

  13. 13
    Joe says:

    Adapa:

    You contend Dembski defines “information” by the Shannon metric

    No, I was talking about how to MEASURE it. Please try again.

  14. 14
    Tamara Knight says:

    Joe

    Adapa:
    You contend Dembski defines “information” by the Shannon metric

    No, I was talking about how to MEASURE it. Please try again.

    But only in response to my request for instructions about how to measure it in the Dembski sense. Why bring up Shannon at all?

    Joe

    Tamara Knight:
    How much information is in a human genome?

    More than blind and unguided processes can account for.

    More than a little irony in that response, Joe.
    In the Shannon sense it can be calculated very easily. Two bits per base pair for 3 billion base pairs is 750Mbytes of raw information, but in the real world that can be compressed. However, the structured sequence in a genome will typically compress further than a totally random string of the same length.

    In the Dembski sense I await some figure on which to comment.
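Tamara’s back-of-the-envelope figure checks out. A minimal sketch of the arithmetic (assuming 2 bits per base, i.e. four equiprobable bases — the upper bound she is using):

```python
def genome_shannon_bytes(base_pairs, bits_per_base=2):
    """Upper-bound raw Shannon information of a genome, in bytes.

    Assumes each base is one of four equiprobable symbols (log2(4) = 2 bits).
    """
    return base_pairs * bits_per_base / 8  # 8 bits per byte

print(genome_shannon_bytes(3_000_000_000))  # 3 billion base pairs -> 750,000,000 bytes (750 MB)
```

This is only a ceiling; real genomes have statistical structure, which is why compression shrinks them below it.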

  15. 15
    Adapa says:

    Joe

    No, I was talking about how to MEASURE it. Please try again.

    The human genome has about 3.2 billion base pairs.

    The marbled lungfish (Protopterus aethiopicus) genome has 130 billion base pairs.

    You say that makes the fish have 40X more information than a human?

  16. 16
    Edward says:

    Yes, and the lungfish only awaits the correct set of circumstances to activate that information. Are you prepared for your lungfish overlords, Adapa?

  17. 17

    TK,

    Dembski has provided a measure of information in his previous work. This book leans more philosophical, thus my reason for designating it a prequel.

    You can read some of it here: Being As Communion – A Metaphysics of Information

    Although, I would recommend you also read the work of Luciano Floridi to get a good idea of how their views differ. You can find his work here: Luciano Floridi

    I also recommend you read Szostak and Hazen’s work, if you haven’t already. You can find that here: Functional information and the emergence of biocomplexity

    Enjoy,
    -M

  18. 18
    centrestream says:

    Adapa: “The human genome has about 3.2 billion base pairs.
    The marbled lungfish (Protopterus aethiopicus) genome has 130 billion base pairs.”

    Does that mean that the lungfish is 40x more likely to be designed than humans? It seems like a logical conclusion, given that ID’s entire argument is based on probability.

  19. 19
    centrestream says:

    Can someone explain to me why this cutting-edge science book is being published as one of the Ashgate Science and Religion series? Maybe it is because the editors are well recognized in the field of science.

    “Roger Trigg, Emeritus Professor of Philosophy, University of Warwick and Kellogg College, University of Oxford, UK and J. Wentzel van Huyssteen, Princeton Theological Seminary, USA”

    OK, maybe not.

  20. 20
    Mung says:

    centrestream:

    Can someone explain to me why this cutting-edge science book is being published as one of the Ashgate Science and Religion series?

    From the Preface:

    By contrast, Being as Communion attempts to paint a metaphysical picture of what the world must be like for intelligent design to be credible…

    Next stupid question?

  21. 21
    Mung says:

    Tamara Knight:

    But only in response to my request for instructions about how to measure it in the Dembski sense. Why bring up Shannon at all?

    Because you asked about measuring information.

    Next?

  22. 22
    Mung says:

    Adapa:

    Interesting. You contend Dembski defines “information” by the Shannon metric which considers message length only and specifically excludes any meaning the message may carry.

    That is incorrect. Try again.

  23. 23
    centrestream says:

    Mung: “Next stupid question?”

    Well, it is nice that ID and UD are willing to be honest enough to admit that this book is not about science. My respect for both has increased.

  24. 24
    Dick says:

    centrestream:

    I’m not sure why you think it noteworthy that people “admit” that BAC is not science. Why does it have to be? The book is written as a critique of materialism – a metaphysical hypothesis – and as such it is itself a metaphysical argument.

  25. 25
    Mung says:

    Tamara Knight:

    But the clip I commented on seems to bear no relationship to the Shannon sense.

    Really? That’s simply bizarre. Here’s what you clipped;

    Information, Dembski writes, “is produced when certain possibilities are realized to the exclusion of others within a matrix of possibility…. It follows that information can be measured.”

    How is that not a description of Shannon’s metric?

    Is it the absence of realization? Is it the absence of a matrix of possibilities? Did you read the paper Joe linked?

    Seriously people. You’re like the Kansas City Royals trying to get a win against Madison Bumgarner.

  26. 26
    Mung says:

    Some of us have actually cracked open the book and peeked inside before mouthing off about it. Imagine that.

  27. 27
    centrestream says:

    Dick: “I’m not sure why you think it noteworthy that people “admit” that BAC is not science. Why does it have to be? The book is written as a critique of materialism – a metaphysical hypothesis – and as such it is itself a metaphysical argument.”

    Dick, I was just confused because the book is being flogged on UD, a blog that “supports the intelligent design community”. Given the assertion that ID is a science, and that Dembski is one of its founders, I naturally (and erroneously) jumped to the conclusion that this book was about science. But knowing that this is a book about metaphysics and religion, and not about science, will temper any future comment I have about it. I offer my most sincere mea culpa.

  28. 28
    ppolish says:

    “Without readers, a book is a bundle of cellulose sheets with meaningless ink stains. Likewise, a text in the metabolic library needs to be read to reveal its meaning: the metabolic phenotype that determines which fuels an organism can use, and which molecule it can manufacture”
    Arrival of the Fittest- Andreas Wagner pg 83, my current page.

    After reading Being as Communion, I’m guessing Dr Dembski would agree with Dr Wagner. Although I suspect Dr Wagner will soon careen off the path and end up in the materialist “chance and necessity” rut.

  29. 29
    Reality says:

    Mung, have you and Joe ever been seen in the same room at the same time?

    Have you read “the book”? If so, will you please provide some of Dembski’s best points in it and on what pages those points can be found?

  30. 30
    Mung says:

    Reality:

    Have you read “the book”?

    Given your obvious interest in this book it’s a shame that you’re not aware of the threads here at UD about it in which I have stated my progress in reading it.

    Reality:

    If so, will you please provide some of Dembski’s best points in it and on what pages those points can be found?

    Why?

  31. 31
    sparc says:

    I’ve added both editions to novelrank.com. You may check the development of sales there:
    paperback edition
    kindle edition
    (to my best knowledge counting only starts from today)

  32. 32
    keith s says:

    Denyse,

    A reminder.

  33. 33
    Tamara Knight says:

    Mung

    Why bring up Shannon at all?

    Because you asked about measuring information.

    No, I asked about measuring information in the Dembski sense. I’ve been earning a living for forty years working with information in the Shannon sense, so I understand its concepts quite well. But I was asking for clarification in the Dembski sense. I don’t have a problem if you consider these to be overlapping sets, but discussion gets nowhere if we use the word “information” without elaboration to mean either or both. So let’s agree to use the terms “Dembski information” and “Shannon information” to avoid confusion. The maximum Shannon information in a human genome is about 750Mbytes. How much Dembski information is there?

    Mung

    Interesting. You contend Dembski defines “information” by the Shannon metric which considers message length only and specifically excludes any meaning the message may carry.

    That is incorrect.

    Which bit are you contesting Mung? The (correct) definition of the Shannon metric, or your view on Dembski’s opinion?

    Mung

    Information, Dembski writes, “is produced when certain possibilities are realized to the exclusion of others within a matrix of possibility…. It follows that information can be measured.”

    How is that not a description of Shannon’s metric?

    Because Shannon precisely defined a way of measuring information, the fact that it can be measured does not follow from anything Dembski (or Dawkins, I’m not taking sides here) defines. If Dembski is postulating something beyond Shannon information, that is fine, but what he is actually saying is “It follows that Dembski information can be measured.” I was simply asking how to do this.

    Simple thought experiment. Take an ovum and let it split into three identical cells, as would normally produce identical triplets. We have three identical cells, let’s call them Mung, Tamara and Barry. Genetically engineer Tamara by duplicating a gene 256k base pairs long, but insert a 256k base-pair random sequence at the same locus in Mung. Leave Barry untouched as a disinterested moderator. Mung will gain 64kbytes of Shannon information, but Tamara gets virtually no extra Shannon information. (Think change-in-size of their respective ZIP files in memory as a pretty good measure of the change of their Shannon information.) How much does their Dembski information change? Or can you even arrange the three of them in order, highest to lowest?

    And suppose these three cells are allowed to grow into mature organisms. Mung and Barry would effectively be identical twins, but Tamara would be a mutant with an extra copy of a gene. I’m not a biologist, but my understanding is that this extra copy could do one of three things:
    a) Make Tamara “better” than her siblings because her cells produce more of a good thing
    b) Make Tamara “worse” than her siblings because her cells produce too much of a good thing
    c) Give Tamara a redundant spare copy of a gene (which can of course subsequently mutate without detriment).
    How does Tamara’s Dembski information change in these three scenarios?
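Tamara’s ZIP-file intuition from the thought experiment can be sketched directly. This is illustrative only: it models each base as one byte rather than two bits, and uses LZMA instead of ZIP so the compressor’s dictionary is large enough to span both copies of the 256k-base gene:

```python
import lzma
import random

rng = random.Random(42)  # fixed seed for reproducibility
GENE_LEN = 256_000       # "base pairs", one byte per base in this toy model

gene = bytes(rng.randrange(4) for _ in range(GENE_LEN))
random_insert = bytes(rng.randrange(4) for _ in range(GENE_LEN))

tamara = gene + gene           # duplicated gene: almost no new Shannon information
mung = gene + random_insert    # random insert of the same length: lots of new information

# The duplicate compresses away almost entirely; the random insert does not.
print(len(lzma.compress(tamara)), len(lzma.compress(mung)))
```

The compressed size of the duplicated sequence comes out close to that of a single copy, while the randomized one needs roughly twice the space — matching the claim that Tamara gains virtually no Shannon information and Mung gains a lot.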

  34. 34
  35. 35
    sparc says:

    According to Novelrank Amazon sold one (1) copy of the paperback and two (2) copies of the kindle edition of Dembski’s “Being as Communion” since yesterday.

  36. 36
    Learned Hand says:

    sparc,

    That’s an interesting site, thanks for the information. So much more detailed and significant information than simply, and falsely, claiming that it’s “in the top 100 in the Kindle store.”

  37. 37
    ppolish says:

    Bill Nye’s new book is an Amazon best seller. Not a single original thought in that one I bet.

    Hardly a page goes by in Dr Dembski’s new book without an original thought or profound insight. But I doubt it will ever be a Kindle Top 100 ever again:) Although it may tie for Top Book for a second…

  38. 38
    roding says:

    But I doubt it will ever be a Kindle Top 100 ever again:)

    It never actually was in the proper Top 100, just in this category: “Kindle Store > Kindle eBooks > Nonfiction > Politics & Social Sciences > Philosophy > Logic & Language”…

    As Dieb pointed out in #2. The News Desk does not appear to want to correct this factual error. As KF might say, that seems rather telling.

  39. 39
    centrestream says:

    Sparc: “According to Novelrank Amazon sold one (1) copy of the paperback and two (2) copies of the kindle edition of Dembski’s “Being as Communion” since yesterday.”

    I hope that Dembski wasn’t relying on the sales of this book to supplement his income.

  40. 40
    ppolish says:

    Dr Dembski up to #26 (Logic & Language)
    Bill Nye now at #12 (Religion & Spirituality)

    Per current Official Kindle stats. Proper Kindle stats.

  41. 41
    ppolish says:

    Descartes currently in #9 spot in Logic & Language. Don’t turn around Rene, Dr D is breathing down your neck. He’s already passed kindle Plato.

  42. 42
    ppolish says:

    Novelrank is clearly bogus Centrestream. Amazon knows their data. Data Gathering Pros. Audited & Accurate.

  43. 43
    keith s says:

    Denyse,

    You claim to be a journalist. Ethical journalists correct their errors.

    From the Society of Professional Journalists Code of Ethics:

    BE ACCOUNTABLE

    Journalists are accountable to their readers, listeners, viewers and each other.

    Journalists should…

    – Admit mistakes and correct them promptly.

    You have been shown, and reminded several times, that a crucial claim in your OP is false.

    The ethical things to do is to admit your error and correct it. Why haven’t you done that?

  44. 44
    ppolish says:

    Barnes & Noble has the $110 hardcover edition for sale. Although back order only. But Nook Book available now;
    http://m.barnesandnoble.com/w/.....0754638575

    Some people don’t like to shop Amazon.

  45. 45
    ppolish says:

    Keith, Being as Communion WAS Top 100 as stated in OP. Heck, it was #77. Now it is #26.

    http://www.amazon.com/Best-Sel.....57430011#2

  46. 46
    sparc says:

    Barnes & Noble sales rank: 942,042

  47. 47
    keith s says:

    ppolish,

    Denyse made this claim:

    Currently (9:00 am EST) in the top 100 in the Kindle store, despite the sweetheart deals offered this summer, for buying the book.

    Here are the top 100. Being as Communion is not among them, of course.

    Denyse screwed up, but she is refusing to do the ethical thing by admitting and correcting her mistake.

  48. 48
    ppolish says:

    Sparc, do you have a link to that ranking? I want to watch it rise towards top half million:)

  49. 49
    ppolish says:

    Moved up from #26 to #25 for Kindle “Logic & Language” category in last hour. I have to get back to my Bicycle Shop duties now but I’ll check again later:)

  50. 50
    keith s says:

    ppolish,

    Denyse said it was in the top 100. It’s not even in the top 100,000:

    Amazon Best Sellers Rank: #128,325 Paid in Kindle Store

    I’d say she owes us a correction, wouldn’t you?

    It would be the ethical thing to do.

  51. 51
    ppolish says:

    Keith, you and I both bought the book pre-release. I read it once and will read it again after I finish Arrival of the Fittest. I’m really enjoying that book too – a 5000 dimension Universal Library? Irreducible Complexity from Inconceivable Complexity. I like it:)

    But Dr Dembski focuses on information. Like the message sent in the OP:
    “Currently (9:00 am EST) in the top 100 in the Kindle store, despite the sweetheart deals offered this summer, for buying the book.”

    But sticking with Dembski, how would a Material Naturalist evaluate the information sent and received in that message? And the requirement for apology? Dembski argues a Material Naturalist could not answer those questions definitively.

  52. 52
    keith s says:

    ppolish,

    But sticking with Dembski, how would a Material Naturalist evaluate the information sent and received in that message?

    Why should it be a problem for a materialist? The transfer of information doesn’t require any non-materialist assumptions.

    And the requirement for apology?

    Not an apology — an admission of error and a correction. It’s required by Denyse’s own journalistic ethics — assuming she subscribes to the same ethical system as her fellow journalists. And if not, she should make that very clear.

    Dembski argues a Material Naturalist could not answer those questions definitively.

    I’ve read most of Dembski’s previous work, and it hasn’t been very well argued. We’ll see if he does any better this time.

  53. 53
    Mung says:

    Just so it’s perfectly clear, yet again, the sort of people we are dealing with:

    Tamara: Why bring up Shannon at all?

    Mung: Because you asked about measuring information.

    Tamara: No, I asked about measuring information…

    No? You asked about measuring information but you did not ask about measuring information? How does that work?

    Tamara: I asked about measuring information in the Dembski sense. … I was asking for clarification in the Dembski sense.

    How is “the Dembski sense” different?

    Tamara: So let’s agree to use the terms “Dembski information” and “Shannon information” to avoid confusion.

    Let’s not. Unless and until you can establish that there is a meaningful difference.

    Adapa: Interesting. You contend Dembski defines “information” by the Shannon metric which considers message length only and specifically excludes any meaning the message may carry.

    Mung: That is incorrect. Try again.

    Tamara: Which bit are you contesting Mung? The (correct) definition of the Shannon metric, or your view on Dembski’s opinion?

    Can you explain how the entropy of the source depends on the length of the message?

    Can you explain how the average information per symbol depends on the length of the message?

    the Shannon metric … considers message length only

    You agree with that. After all your years in information theory. Why?

  54. 54
    keith s says:

    Mung quote mines Tamara Knight:

    Mung: Because you asked about measuring information.

    Tamara: No, I asked about measuring information…

    No? You asked about measuring information but you did not ask about measuring information? How does that work?

    Here’s what Tamara actually wrote:

    No, I asked about measuring information in the Dembski sense.

    You’re a real asset to the ID movement, Mung. Glad you’re not on my side.

  55. 55
    sparc says:

    ppolish:

    Sparc, do you have a link to that ranking? I want to watch it rise towards top half million:)

    Just look under “Product Details” on the book’s Barnes & Noble page. Please be patient: it still ranks at 942,042. It would be interesting, though, how much a single deal would change this number.

  56. 56
    ppolish says:

    Thank you for the link, Sparc. But that ranking is for the $100 hardcover only. Nobody is going to buy that except Dembski’s mom. I preordered and already received/read the paperback. Good stuff.

  57. 57
    Tamara Knight says:

    Mung, as keith s pointed out, you have provided us with a textbook example of something called “quote mining”, a new name to me for a familiar concept. On another thread this was posted:

    Silver Asiatic
    With a few very minor exceptions, the effects of natural selection are not estimated or predicted. It can’t even be predicted which mutations confer an adaptive advantage because so much depends on environmental factors, resources, competitive pressures and even the effect of other mutations in the same organism.

    I think I will quote that in future elsewhere, but it doesn’t need any “mining”. In context, it was meant as a dismissal of evolution, but as a stand-alone statement I think it is a concise and even elegant summary of what evolution is. Would the practicing evolutionary biologists here agree?

    How is “the Dembski sense” different?

    That is what I was asking you. Give me any string of information and I can tell you the Shannon length. Are you now claiming the Dembski length would be the same? Because the Shannon information in a random string of base pairs will never be less than that in a designed string. When I give you some simple examples to illustrate the problem with your position, you ignore them and instead throw up a smoke screen.

    Can you explain how the entropy of the source depends on the length of the message?

    Who has told you it does? It has nothing to do with Shannon, but if you illustrate by example what precisely you would like me to explain, I’ll try.

    Can you explain how the average information per symbol depends on the length of the message?

    Not without clarification about what you mean by “symbol”. Shannon sets a minimum length for a message to send a particular set of data. If you choose to send genome data using the symbols A, C, G, T, then you will send two bits per symbol. Apart from a small overhead to mark the beginning and end, this applies regardless of the length of the message.

    “the Shannon metric … considers message length only”

    You agree with that. After all your years in information theory. Why?

    Well, apart from the fact that that is how Shannon defined it, I see no reason at all.

  58. 58
    Joe says:

    Tamara Knight:

    But only in response to my request for instructions about how to measure it in the Dembski sense. Why bring up Shannon at all?

    Because it is relevant and Shannon provided a metric pertaining to measuring information.

    Two bits per base pair for 3 billion base pairs is 750Mbytes of raw information,

    Wrong- 2 bits per nucleotide.

  59. 59
    Joe says:

    keith s:

    I’ve read most of Dembski’s previous work, and it hasn’t been very well argued.

    That’s an opinion from someone who doesn’t know how to argue.

  60. 60
    roding says:

    Keith S at #50

    It would be the ethical thing to do.

    It’s very disappointing, isn’t it? It does not appear that normal and accepted journalistic standards are in use here. A little humility to admit what was probably an honest mistake would have gone a long way.

    For this reason I’m not sure I want to bother reading UD anymore until there is some change in the editorial staff. Better things to do with my time.

  61. 61
    Me_Think says:

    Mung @ 53

    Can you explain how the entropy of the source depends on the length of the message?

    Can you explain how the average information per symbol depends on the length of the message?

    I can answer though I have no idea why you need it:

    Entropy of “Can you explain how”:
    3/19 Log[19/3]+6/19 Log[19/2]+(10 Log[19])/19 = 2.55209

    Entropy of “Can you explain how the entropy of the source depends on the length of the message”
    15/82 Log[82/15]+6/41 Log[41/6]+7/82 Log[82/7]+9/41 Log[41/3]+2/41 Log[41/2]+3/41 Log[82/3]+(7 Log[41])/41+(3 Log[82])/41 = 2.72192
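Me_Think’s figures can be reproduced in a few lines of Python. This is my own sketch, assuming the Log[] in the formulas above is the natural log, i.e. the result is per-character entropy in nats:

```python
import math
from collections import Counter

def entropy_nats(s):
    """Per-character Shannon entropy of s in nats (natural log),
    i.e. the sum over distinct symbols of p * ln(1/p)."""
    n = len(s)
    return sum((c / n) * math.log(n / c) for c in Counter(s).values())

print(round(entropy_nats("Can you explain how"), 5))  # → 2.55209
```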

  62. 62
    Tamara Knight says:

    Because it is relevant and Shannon provided a metric pertaining to measuring information.

    He did indeed, and such measurements give a numerical answer. And if that metric could be applied to Dembski information, you would have no problem coming up with a numerical answer to the questions I posed Mung @post 33

    Two bits per base pair for 3 billion base pairs is 750Mbytes of raw information,

    Wrong- 2 bits per nucleotide.

    Really. So what are you going to claim next, that cell division doubles the amount of “information”? Because all the Shannon information is present in each strand of DNA.

  63. 63
    Joe says:

    Tamara:

    And if that metric could be applied to Dembski information, you would have no problem coming up with a numerical answer to the questions I posed Mung @post 33.

    No one cares about your questions.

    As for 2 bits per nucleotide, well that is the math. There are 4 possible nucleotides. 4 = 2^2 = 2 bits per nucleotide. And no having two copies of the same thing doesn’t increase the information.
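Joe’s arithmetic, and his duplication point, are easy to illustrate. This is a toy sketch of per-symbol Shannon entropy (my own illustration, not anyone’s official CSI metric):

```python
import math
from collections import Counter

# Four equally likely nucleotides -> log2(4) = 2 bits per nucleotide.
bits_per_nucleotide = math.log2(4)  # 2.0

def entropy_bits_per_symbol(s):
    """Per-symbol Shannon entropy of s, in bits."""
    n = len(s)
    return sum((c / n) * math.log2(n / c) for c in Counter(s).values())

# Duplicating a sequence leaves the per-symbol entropy unchanged,
# because the symbol frequencies in s and s + s are identical.
seq = "ACGTACGTGGTA"
assert entropy_bits_per_symbol(seq) == entropy_bits_per_symbol(seq + seq)
```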

  64. 64
    Tamara Knight says:

    As for 2 bits per nucleotide, well that is the math. There are 4 possible nucleotides. 4 = 2^2 = 2 bits per nucleotide. And no having two copies of the same thing doesn’t increase the information.

    Is that the closest you ever get to conceding an error, or do you simply not realise there are only four possible base pairs too?

    No one cares about your questions.

    Well, not here apparently. You just leave the difficult stuff to real scientists, I suppose. But I do wonder how you might react to a question that blows one of your cherished ideas out of the water, if you were interested.

  65. 65
    Adapa says:

    Joe

    And no having two copies of the same thing doesn’t increase the information.

    It certainly increases the Shannon information which you told us was how Dembski measures information.

    Which string has more information?

    “I saw a can”

    or

    “I saw a cancan.”

    According to Joe they both have the same. That’s what happens when you make up this crap as you go.

  66. 66
    Joe says:

    Tamara:

    Is that the closest you ever get to conceding an error, or do you simply not realise there are only four possible base pairs too?

    What error?

    But I do wonder though how you might react to a question that blows one of your cherished ideas out of the water if you were interested.

    If I ever encounter one I will let you know.

  67. 67
    Joe says:

    Adapa:

    It certainly increases the Shannon information which you told us was how Dembski measures information.

    It doesn’t increase the CSI, which is what we are discussing.

    Which string has more information?

    “I saw a can”

    or

    “I saw a cancan.”

    There is different information there. However neither exhibits CSI.

  68. 68
    Mung says:

    keiths can’t read, and neither can Tamara.

    I was accused of leaving this bit in bold out:

    Tamara: I asked about measuring information in the Dembski sense. … I was asking for clarification in the Dembski sense.

    And yet there it is.

    see for yourself

  69. 69
    Mung says:

    Tamara: So let’s agree to use the terms “Dembski information” and “Shannon information” to avoid confusion.

    Sorry but I searched in vain in Dembski’s new book for “Dembski information.” It appears to be something you just made up. How is that supposed to avoid confusion?

  70. 70
    Joe says:

    Mung, Thank you, belatedly, for alerting Barry to the timeline mix-up wrt my banning.

    😎

  71. 71
    Tamara Knight says:

    And yet there it is.
    see for yourself

    Mung, stop digging. If anybody follows that link it will take them to the top of your post, not to where you use the full quote. You have to have your little “joke” first.

    Mung: Because you asked about measuring information.

    Tamara: No, I asked about measuring information…

    Mung: No? You asked about measuring information but you did not ask about measuring information? How does that work?

    Sorry but I searched in vain in Dembski’s new book for “Dembski information.” It appears to be something you just made up.

    Well done. What a hero, you must be a very fast reader. But where did I claim otherwise? I made it up as part of a proposed way to ensure there was no ambiguity as to whether we were discussing information “in the Dembski sense” or “in the Shannon sense”. If you prefer, I’m happy to use the term “information” to mean information in its broadest everyday sense, and “Shannon information” only when making an appeal to the properties of Shannon information.

  72. 72
    Joe says:

    Shannon’s methodology for measuring information works with CSI. CSI is just complex Shannon information that has meaning or function.

  73. 73
    Tamara Knight says:

    What error?

    Joe, when you said:

    Wrong- 2 bits per nucleotide.

    I thought you might have meant:

    Right- I might have described it as 2 bits per nucleotide myself, but I can see that is exactly the same as two bits per base pair.

    and I can see giving you the benefit of the doubt is a futile gesture. I’ll just accept you’re never wrong.

    If I ever encounter one I will let you know.

    I don’t think you ever will, Joe. Some years ago I remember the concept of something called Morton’s Demon* being introduced to a debate about Creationists. I can see it applies to some ID proponents too.

    I don’t suppose any of us likes to be seen to lack knowledge and understanding. We all try to cover up the gaps to some degree, present speculation as fact to seem knowledgeable and keep a conversation going, etc, etc. But I’ve never understood why some people feel the need to use their obvious intelligence to build mental barricades to protect their ignorance, rather than to continually try and reduce it.

    *If you are not familiar, it is based on the concept of Maxwell’s Demon, a hypothetical entity guarding a portal between two chambers of gas and blocking entry to slow-moving (cold) molecules. If such an entity were possible, it would allow a perpetual motion machine to be constructed. Szilárd and later Landauer explained why it cannot work. Morton’s Demon performs a similar function and prevents uncomfortable information from ever reaching Morton’s conscious brain.

  74. 74
    Tamara Knight says:

    Shannon’s methodology for measuring information works with CSI. CSI is just complex Shannon information that has meaning or function.

    Then you should be able to calculate it Joe. Try the example cases at post 33. Just the direction of change would be a start.

  75. 75
    Joe says:

    You should be able to calculate it, Tamara.

  76. 76
    Joe says:

    Tamara, it is two bits per nucleotide for the reason I presented. As for lacking knowledge and understanding, that would be you with respect to natural selection.

    Strange how you can never make a point but instead rely on innuendos and misrepresentations.

  77. 77
    Joe says:

    Morton’s Demon pertains to evidence. And if there is ever any evidence that supports blind watchmaker evolution I will definitely take a look and consider it with all the respect science deserves.

    OTOH I see evos utilize Morton’s Demon on a daily basis…

  78. 78
    Tamara Knight says:

    You should be able to calculate it, Tamara.

    And I can, of course, based on your understanding of CSI:

    Generally:

    |             | Control | 256k bp gene duplication | 256k bp random insertion |
    | Information |    X    |           X+T            |           X+M            |

    And specifically for the 256k base pair gene duplication in the proposed scenarios:

    |             | Scenario a) | Scenario b) | Scenario c) |
    | Information |    X+Ta     |    X+Tb     |    X+Tc     |

    For any particular example:

    For Shannon information M is positive and nominally 64 Kbytes (256k base pairs at two bits each), T is positive and small, and T=Ta=Tb=Tc

    For Dembski/CSI M is zero, T is indeterminate, but Ta>Tc>Tb

    Oh, and every Shannon variable is equal to the equivalent CSI variable. I can’t get that bit to work, perhaps you can explain it?

    And incidentally, if you think anybody, ever, “utilizes” Morton’s Demon, then the concept has gone totally over your head.

  79. 79
    tragic mishap says:

    Has anyone on this thread actually read “Being as Communion”, or is at least reading it? lol.

    I want to discuss that book. 🙁

    I am on chapter 16 now and I’m finding that chapter very interesting. He makes a point about fine-tuning that now seems obvious to me but I had never realized it before. Fine-tuning can’t be defined mathematically because there’s no way to define the number of possibilities.

  80. 80
    Mung says:

    Tamara: I made it up as part of a proposed way to ensure there was no ambiguity as to whether we were discussing information “in the Dembski sense” or “in the Shannon sense”.

    And you tell me to stop digging?

    Tamara: Why bring up Shannon at all?

    Mung: Because you asked about measuring information.

    Tamara: No, I asked about measuring information [in the Dembski sense (something I just made up without giving it any meaning)]

    No? You asked about measuring information.

    Let me bold that for you:

    Tamara: No, I asked about measuring information

    Right. You asked about measuring information.

    And somehow I am supposed to understand that you meant “Dembski information” something that apparently only you know what it is? And that you were not talking about measuring information at all, or in some special sense known only to you?

    Here’s Tamara quoting from the OP:

    Information, Dembski writes, “is produced when certain possibilities are realized to the exclusion of others within a matrix of possibility…. It follows that information can be measured.”

    She then follows that with a question:

    Do you have a page number for where he tells us how to measure it?

    IT. How to measure WHAT? Information?

    And I am the villain for pointing out the obvious? Please.

    Dembski was clearly talking about measuring information.

    You were clearly asking about how to measure [it] information.

    Now, is there something else we can help you with?

  81. 81
    Mung says:

    tragic m, there are probably a number of us who have read it or are reading it. I had put it down for a while but am now on Chapter 11.

  82. 82
    Mung says:

    “As we saw in Chapter 6, information, as a numerical measure, is usually defined as the negative logarithm to the base two of a probability (or some logarithmic average of probabilities, often referred to as entropy). This has the effect of transforming probabilities into bits and of allowing them to be added (like money) rather than multiplied (like probabilities)…Such a logarithmic transformation of probabilities is useful in communication theory …”

    – Dembski, p. 159

    Shannon is rolling over in his grave!
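The transformation in the passage Mung quotes is the standard one from communication theory; here is a minimal sketch (my own illustration, not from the book) of probabilities turning into additive bits:

```python
import math

def info_bits(p):
    """Information of an event with probability p, in bits: -log2(p)."""
    return -math.log2(p)

# Probabilities multiply; the corresponding bit values add (like money).
p1, p2 = 0.5, 0.25
assert math.isclose(info_bits(p1 * p2), info_bits(p1) + info_bits(p2))
print(info_bits(p1), info_bits(p2), info_bits(p1 * p2))  # 1.0 2.0 3.0
```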

  83. 83
    Mung says:

    Tamara:

    I made it up as part of a proposed way to ensure there was no ambiguity as to whether we were discussing information “in the Dembski sense” or “in the Shannon sense”. If you prefer, I’m happy to use the term “information” to mean information in its broadest everyday sense, and “Shannon information” only when making an appeal to the properties of Shannon information.

    Actually, I’d prefer you say information when you mean information. If what you mean is “Shannon’s Measure of Information” then say that, or SMI for short.

    Now, in an attempt to move things along: do you believe that SMI can be applied to ANY probability distribution?

    Did Shannon limit his measure to only certain probability distributions?

    Or am I just not making sense to you with these questions?

    So how might Shannon and Dembski differ? That’s hard to say, because while it’s clear that Shannon was interested in the engineering problem and not the “meaning”, it doesn’t follow that he thought there was such a thing as information that could exist independently of any matrix of probability.

    Dembski otoh thinks that is the very defining feature of information. If so, there is no such thing as information that cannot be measured by SMI. (Assuming you agree that SMI applies to any probability distribution.)

    Let’s just say that, for now, this is the case in theory.

    We may not know the relevant probability distribution, but it does not follow that information can exist without the actualization of a possibility within a matrix of possibility.

    Information, Dembski writes, “is produced when certain possibilities are realized to the exclusion of others within a matrix of possibility…. It follows that information can be measured.”

  84. 84
    Mung says:

    p.s. If, as Dembski claims, “Information is produced when certain possibilities are realized to the exclusion of others within a matrix of possibility” and if, as I claim, SMI applies to ANY probability distribution, then it does in fact follow that information can be measured.

    Dembski is correct, and there’s nothing at all controversial here.

  85. 85
    Mung says:

    p.p.s. And it’s not the ID proponent who has to resort to making things up.

  86. 86
    Tamara Knight says:

    Mung, you leave me in a bit of a quandary. Either I accuse you of being incapable of grasping a most basic concept of communication, or of being a contrarian through and through. I’m not sure which is ruder, but I can see no other explanation. Do you honestly not understand the difference between making up (i.e. defining) a label and making up the underlying concept? Lavoisier made up “hydrogen” to describe the chemical element that burns to produce water. It is a made-up word. I suggested we use the phrase “Dembski information” to discuss the case where information in the Dembski sense differs from that in the Shannon sense.

    p.s. If, as Dembski claims, “Information is produced when certain possibilities are realized to the exclusion of others within a matrix of possibility” and if, as I claim, SMI applies to ANY probability distribution, then it does in fact follow that information can be measured.

    Dembski is correct, and there’s nothing at all controversial here.

    Of course there isn’t, because both claims are statements of the obvious. The whole purpose of Shannon’s work was to measure the amount of information. Think of SMI as a precise measure of how much you would need to tell me (or how much I would need to tell you) such that we both knew the same. The non sequitur comes when you then try to postulate some additional property of this information that does not affect the SMI. If you go back to my post 78, from my understanding of what you think Dembski is claiming, added CSI can either increase or decrease SMI, but please feel free to comment on my assertion that “Dembski/CSI M is zero, T is indeterminate, but Ta>Tc>Tb” if you know differently. Equating CSI to SMI is as useless as equating Shannon entropy to thermodynamic entropy. It’s apples and pears.

  87. 87
    sparc says:

    updates on Amazon sales
    paperback edition 9 copies in the US, 1 copy in Canada since Nov. 11
    kindle edition 11 copies in the US, 1 copy in Canada since Nov. 11
    I.e., a total of 21 copies within roughly a week. However, Amazon is not the only distributor.

  88. 88
    Tamara Knight says:

    And sparc, somebody finally bought a copy on amazon.co.uk. Its ranking shot up nearly 200,000 places as a result: it was ranked at 400,579 yesterday; now it’s at 201,079.

  89. 89
    Mung says:

    Tamara, you’re the one who has admitted to making up the term “Dembski Information” to cover up your ignorance of Dembski’s actual position in the book. What’s so hard to grasp?

    Can we move on now?

    Seriously, I thought I was saying I was willing to move on.

    Tamara:

    Do you honestly not understand the difference between making up (i.e. defining) a label and making up the underlying concept?

    You made up the label [Dembski Information] but had no underlying concept to attach to the label. Yes, I understand that. I don’t say you made up the underlying concept. I say you made up the label. You admit to making up the label. What was the underlying concept?

    Tamara:

    Lavoisier made up “hydrogen” to describe the chemical element that burns to produce water. It is a made-up word. I suggested we use the phrase “Dembski information” to discuss the case where information in the Dembski sense differs from that in the Shannon sense.

    A neologism then. That’s what you have created?

    Let’s see how this works:

    [Lavoisier] hydrogen – the chemical element that burns to produce water.

    [Tamara] dembski information – information in the Dembski sense that differs from that in the Shannon sense.

    How are the two even remotely analogous?

  90. 90
    Mung says:

    Let’s recap:

    Tamara Knight @ 71:

    I made it [Dembski Information] up as part of a proposed way to ensure there was no ambiguity as to whether we were discussing information “in the Dembski sense” or “in the Shannon sense”.

    Where’s the quandary, Tamara?

    Tamara:

    The whole purpose of Shannon’s work was to measure the amount of information.

    Not really.

  91. 91
    Mung says:

    Is there a reason ID critics ought to be allowed to just make stuff up and get away with it?

    Tamara made up the term “Dembski Information.”

    The putative reason given is that it would allow “Dembski Information” to be distinguished from “Shannon Information.”

    And the difference between the two is?

    If there’s no difference, then there is no need to introduce a new term, a new term that adds nothing of substance to the discussion.

    I’m thinking this one needs to be added to the Darwinian Debating Devices.

    Suggestion on a name for it?

  92. 92
    Adapa says:

    Mung

    Suggestion on a name for it?

    Falsely accusing someone of making up things by making up your own porkies about them should be called “Mungtrolling”.

  93. 93
    Mung says:

    Adapa, your judgment about what constitutes trolling is suspect, and Tamara already admitted to making up the term “Dembski Information.”

  94. 94
    Tamara Knight says:

    Tamara: The whole purpose of Shannon’s work was to measure the amount of information.

    Mung: Not really.

    I was trying to keep it simple, Mung, but Shannon (as I understand and use his work) devised a way of quantifying information. Essentially, a measure of how difficult it is for a sender of information to convey his exact and unambiguous meaning to a receiver under real-world conditions. If you think you have a deeper understanding of his work than I do, I assure you I am always keen to learn from others with a better understanding than I have. Please expand on where your understanding differs from mine, but use your own words so we can narrow down and debate our differences. Having to use a cut-and-paste of somebody else’s understanding gets us nowhere unless that somebody is prepared to join in the debate.

    Mung poses a question:
    The putative reason given is that it would allow “Dembski Information” to be distinguished from “Shannon Information.” And the difference between the two is?

    and then claims:
    If there’s no difference, then there is no need to introduce a new term, a new term that adds nothing of substance to the discussion.

    Indeed, if there is no difference. But it is the “if” we are debating. If your assertion is true, then one of these two statements from my post at 78 must be wrong, because they are mutually exclusive.

    For Shannon information M is positive and nominally 64 Kbytes, T is positive and small, and T=Ta=Tb=Tc

    For Dembski/CSI M is zero, T is indeterminate, but Ta>Tc>Tb

    Which one do you think is wrong and why? I feel pretty confident on the Shannon one, but accept I may be wrong on the Dembski/CSI one.

    However, if you claim they are the same, then it must follow that the quantity of Shannon information is totally uncorrelated with the quality of the product resulting from the “use” of that information. Whatever Dembski/CSI is, it has to have the property that it can be enhanced or degraded by any change (in fact, even without any change) in the amount of Shannon information describing it. One measure is totally objective; the other, by its very nature, has to be subjective. How can there possibly be no difference between them, Mung?

  95. 95
    Joe says:

    CSI is just complex Shannon “information” with meaning or function. IOW it is information in the everyday and normal use of the word. That is, if one insists on saying Shannon “information” isn’t really information…

  96. 96
    Tamara Knight says:

    Joe:
    CSI is just complex Shannon “information” with meaning or function.

    Isn’t that what I just postulated above?

    The problem is in the “just …. with meaning or function”: a qualitative assessment of meaning that makes the underlying quantitative Shannon metric irrelevant. We know that minor changes to a few bits of a genome can massively affect the quality of what you define as CSI, whilst keeping the quantity of Shannon information (and also presumably the quantity of CSI) essentially unchanged. Being as you continually claim to accept the standard definitions, I presume you will agree that all such minor changes have a minimal effect on the Shannon information. Most will have a neutral effect on the owner of the genome and leave its CSI unchanged. Some will have a detrimental effect, degrading its CSI, and a few will have a beneficial effect which improves its CSI.

    Where I suspect we differ is that you will claim that the beneficial change can only occur where the new genome has been sprinkled with the designers’ magic dust, and a similar point mutation which produces the identical change in the quality of the CSI can never, under any circumstances whatsoever, result from a random copying error. And for reasons only you know, and are not prepared to disclose to anybody.

  97. 97
    Joe says:

    A qualitative assessment of meaning that makes the underlying quantitative Shannon metric irrelevant.

    What? Please explain

    We know that minor changes to a few bits of a genome can massively affect the quality of what you define as CSI

    Actually Crick defined biological information and yes minor changes can ruin a coding sequence.

    whilst keeping the quantity of Shannon information (and also presumably the quantity of CSI) essentially unchanged.

    That is the difference between CSI and CSI capacity.

    Where I suspect we differ is that you will claim that the beneficial change can only occur where the new genome has been sprinkled with the designers’ magic dust,

    And computers are sprinkled with the programmers’ magic dust.

    and a similar point mutation which produces the identical change in the quality of the CSI can never, under any circumstances whatsoever, result from a random copying error.

    Really? No one says that all mutations are directed, Tamara.

  98. 98
    Tamara Knight says:

    Really? No one says that all mutations are directed, Tamara.

    Indeed, but that is not the opposite of “some undirected mutations are beneficial” is it? And in the past you have refused to accept that a random mutation can ever be beneficial. Are you just trying to hide what you really think from scrutiny*, or just not being clear about your changed position on the issue?

    *I use that word in its dictionary sense, “searching examination or investigation; minute inquiry.” A previous comment of yours suggests you take the term to mean something different.

  99. 99
    Joe says:

    Tamara Knight:

    And in the past you have refused to accept that a random mutation can ever be beneficial.

    I don’t recall ever doing such a thing and I am very sure that I have said the opposite.

    Tamara’s scrutiny amounts to not reading or understanding what I post and then badgering me with her twisted interpretation.

  100. 100
    Mung says:

    I thought for a moment that Tamara had something interesting to say. Then I discovered that Tamara was a liar.

    http://www.uncommondescent.com.....ent-532169

  101. 101
    Mung says:

    Let’s try again.

    Tamara:

    I made it [Dembski Information] up as part of a proposed way to ensure there was no ambiguity as to whether we were discussing information “in the Dembski sense” or “in the Shannon sense”.

    And the difference is?
