
Eric Holloway: Dembski’s filter is critical for internet communication


It turns out that legions of critics of the explanatory filter use it all the time, without noticing:

William Dembski created quite a stir in the world of information theory with his book The Design Inference, which for the first time outlined a rigorous method for identifying design, a method he called the explanatory filter. Since then many critics have claimed that Dembski’s proposed filter is without merit due to its lack of application in the couple of decades since its invention. But are the critics right, or are they wrong—wrong in the way that a fish doesn’t recognize water because water is the very atmosphere of the fish’s existence?…

It turns out, Dembski’s filter is the bedrock of our modern information technology. The ability to eliminate random chance and then infer an independent pattern is the fundamental principle behind communication, cryptography, and authentication.

Eric Holloway, “Is William Dembski’s Explanatory Filter the most widely used theory ever?” at Mind Matters News
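To make the quoted claim concrete, here is a minimal Python sketch of authentication as “eliminate chance, then match an independent pattern.” This is an editorial illustration, not anything from Dembski’s or Holloway’s own work; the key, message, and function names are invented for the example. The receiver accepts a message only if its HMAC tag matches the independently specified pattern, a match that chance alone would produce with probability about 2^-256.

    import hmac
    import hashlib
    import secrets

    # Hypothetical shared secret; in practice it would be provisioned out of band.
    KEY = secrets.token_bytes(32)

    def tag(message: bytes) -> bytes:
        """Compute an HMAC-SHA256 tag: the independently specified pattern."""
        return hmac.new(KEY, message, hashlib.sha256).digest()

    def verify(message: bytes, received_tag: bytes) -> bool:
        """Accept only if the recomputed tag matches the received one.
        A random 256-bit tag matches with probability 2**-256, so a match
        effectively rules out chance."""
        return hmac.compare_digest(tag(message), received_tag)

    msg = b"transfer 100 units to account 42"
    t = tag(msg)
    assert verify(msg, t)                                  # genuine message accepted
    assert not verify(b"transfer 900 units elsewhere", t)  # altered message rejected

hmac.compare_digest is used instead of == so that the comparison runs in constant time and does not leak timing information.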

You may also enjoy:

Does information theory support design in nature? William Dembski makes a convincing case, using accepted information theory principles relevant to computer science. (Eric Holloway)

and

Study shows: Eating raisins causes plantar warts. Sure. Because, if you torture a Big Data enough, it will confess to anything. (Robert J. Marks)
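The raisins-and-warts quip is about multiple comparisons: scan enough purely random variables against one another and some pair will look strongly correlated by chance. A small synthetic-data sketch (variable counts, sample size, and seed are arbitrary; requires Python 3.10+ for statistics.correlation):

    import random
    import statistics

    random.seed(0)

    # 200 purely random "variables", each measured on 30 "subjects".
    n_vars, n_subjects = 200, 30
    data = [[random.gauss(0.0, 1.0) for _ in range(n_subjects)] for _ in range(n_vars)]

    # Scan all ~20,000 pairs and keep the most correlated one.
    best = max(
        ((i, j, statistics.correlation(data[i], data[j]))
         for i in range(n_vars) for j in range(i + 1, n_vars)),
        key=lambda t: abs(t[2]),
    )
    print("strongest spurious correlation: r = %.2f (variables %d and %d)"
          % (best[2], best[0], best[1]))
    # With this many comparisons, an |r| of roughly 0.6 is unsurprising,
    # even though every variable is pure noise.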

6 Replies to “Eric Holloway: Dembski’s filter is critical for internet communication”

  1. JVL says:

    The ability to eliminate random chance . . .

    Ah yes, but how do you do that?

    Also, many systems that are under contention are NOT due to random chance alone. That makes the determination a bit harder, don’t you think?

  2. ET says:

    It starts with knowledge of cause-and-effect relationships.

    Also, many systems that are under contention are NOT due to random chance alone.

    How do you know that? Natural selection is nothing more than contingent serendipity. And it always starts with that which needs explaining in the first place.

    Then there is the paper “Waiting for TWO Mutations”, which pretty much puts it all down to random chance.

  3. EDTA says:

    JVL,

    If you observe a law-like regularity in something, say the formation of a crystal, then you know randomness was not playing a role. In the example, physical laws were conspiring to make the crystal uniform (i.e., non-random) in structure. (A small worked example of this kind of chance elimination appears after the replies.)

  4. Querius says:

    EDTA,

    Exactly. When a non-random process is combined with a random one, the overall result is no longer random. This is where we start looking for correlations and emergent behavior, which can indeed be tricky to identify.

    The elephant in the room is identifying the source of information.

    There’s a sort of Maxwell’s Daemon or Information Karma® explanation that resorts to overall increases in entropy, but it fails when you think it through.

    To exaggerate . . . you can’t explain the spontaneous emergence of a Tesla Model X by referencing an explosion in Beirut. Perhaps the overall entropy increases, but there’s no correlation between a high level of design engineering and an “offsetting” (tragic) destruction of part of a city.

    -Q

  5. kairosfocus says:

    Folks, simply ponder a pivotal telecommunications quantity, the signal-to-noise ratio. It implies the ability to differentiate signal from noise on empirical characteristics, and then to separate them well enough that one can measure the two and deduce a key quantity. (A worked example appears after the replies.) KF

  6. Querius says:

    I was glad to see that Dembski mentions cryptography. Modern ciphertext appears similar to noise, and an observer can have difficulty distinguishing where a signal begins and where it ends within the stream of data. DNA and (apparently overlapping) epigenetic data have additional structure that’s not well understood. So naturally, it’s called “junk” (or noise).

    -Q
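Reply 1 asks how random chance gets eliminated, and reply 3 answers in terms of observed regularity. The usual statistical move behind the chance-elimination step is a tail-probability calculation: under the chance hypothesis, how probable is an outcome at least this extreme? A minimal Python sketch follows; the observed counts and the rejection bound are made up for illustration, and this covers only the chance-elimination step, not the rest of the filter.

    import math

    def binomial_tail(n: int, k: int, p: float = 0.5) -> float:
        """P(X >= k) for X ~ Binomial(n, p): the tail probability under chance."""
        return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    # Hypothetical observation: a supposedly random bit source emits 95 ones in 100 bits.
    n, k = 100, 95
    p_chance = binomial_tail(n, k)
    print(f"P(at least {k} ones in {n} fair bits) = {p_chance:.3e}")

    # Reject the chance hypothesis when the tail probability falls below a small
    # bound (10**-9 here is an arbitrary illustrative cutoff).
    BOUND = 1e-9
    print("chance rejected" if p_chance < BOUND else "chance not rejected")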
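Reply 5’s signal-to-noise ratio comes down to one line of arithmetic: SNR in decibels is 10 · log10(P_signal / P_noise). A short Python sketch with a synthetic sine-plus-noise channel (the signal shape and noise level are arbitrary):

    import math
    import random

    random.seed(1)

    def power(samples):
        """Mean squared amplitude of a sample sequence."""
        return sum(x * x for x in samples) / len(samples)

    # Synthetic example: a low-frequency sinusoid plus Gaussian noise.
    N = 10_000
    signal = [math.sin(2 * math.pi * 5 * t / N) for t in range(N)]
    noise = [random.gauss(0.0, 0.1) for _ in range(N)]
    received = [s + n for s, n in zip(signal, noise)]  # what the receiver actually sees

    # If signal and noise can be characterised separately (KF's point), the
    # ratio of their powers gives the SNR, usually quoted in decibels.
    snr_db = 10 * math.log10(power(signal) / power(noise))
    print(f"SNR ~= {snr_db:.1f} dB")  # about 17 dB for these settings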
