Uncommon Descent Serving The Intelligent Design Community

How is Bill Dembski’s Being as Communion doing?


Currently (9:00 am EST) the book is in the top 100 in the Kindle store, despite the sweetheart deals for buying it that were offered this summer.

For a thing to be real, it must be able to communicate with other things. If this is so, then the problem of being receives a straightforward resolution: to be is to be in communion. So the fundamental science, indeed the science that needs to underwrite all other sciences, is a theory of communication. Within such a theory of communication the proper object of study becomes not isolated particles but the information that passes between entities. In Being as Communion philosopher and mathematician William Dembski provides a non-technical overview of his work on information. Dembski attempts to make good on the promise of John Wheeler, Paul Davies, and others that information is poised to replace matter as the primary stuff of reality. With profound implications for theology and metaphysics, Being as Communion develops a relational ontology that is at once congenial to science and open to teleology in nature. All those interested in the intersections of theology, philosophy and science should read this book.

Here’s part of a review a reader sent:

Dembski leaves nothing to chance, not even chance itself. He is also a mathematician, so he looks at chance from the perspective of probability theory. He sees chance events through the law of large numbers and probability distributions. When looking at any event taken in isolation, we may prematurely assume that the event is (strictly speaking) random; however, when all events are looked at in aggregate, their probability distribution will begin to show a pattern. He writes:

“For instance, as a coin is tossed repeatedly, the proportion of heads will tend to ½. This stable pattern to coin tossing is justified both theoretically (various probabilistic laws of “large numbers” confirm it) and practically (when people flip coins a large number of times, they tend to see roughly the same proportion of heads and tails).”

Information, Dembski writes, “is produced when certain possibilities are realized to the exclusion of others within a matrix of possibility…. It follows that information can be measured.”
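For readers who want to see those two claims concretely, here is a minimal Python sketch (not from the book; the only assumption is a fair coin with probability 1/2): the proportion of heads settles toward 1/2 as tosses accumulate, and realizing one of two equally likely possibilities measures out to one bit.

import math
import random

random.seed(0)  # fixed seed so the run is repeatable

# Claim 1: the proportion of heads tends to 1/2 as the number of tosses grows.
for n in (100, 10_000, 1_000_000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(f"{n:>9} tosses: proportion of heads = {heads / n:.4f}")

# Claim 2: realizing one possibility (heads) out of a matrix of two equally
# likely possibilities yields -log2(1/2) = 1 bit of information.
print(f"information in one fair toss = {-math.log2(0.5):.1f} bit")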

See also: Brief excerpt from Being as Communion

Also How is Steve Meyer’s Darwin’s Doubt doing? (Continues to lead, and Christians defending Darwin continue to detract.)

Thought: Will Christians defending Darwin actually read Being as Communion first? Detract later?

Follow UD News at Twitter!

Comments
Let's try again. Tamara:
I made it [Dembski Information] up as part of a proposed way to ensure there was no ambiguity as to whether we were discussing information “in the Dembski sense” or “in the Shannon sense”.
And the difference is?
Mung
November 25, 2014 at 9:38 PM PDT
I thought for a moment that Tamara had something interesting to say. Then I discovered that Tamara was a liar. https://uncommondescent.com/intelligent-design/on-specified-complexity-orgel-and-dembski/#comment-532169
Mung
November 25, 2014 at 9:27 PM PDT
Tamara Knight:
And in the past you have refused to accept that a random mutation can ever be beneficial.
I don't recall ever doing such a thing and I am very sure that I have said the opposite. Tamara's scrutiny amounts to not reading or understanding what I post and then badgering me with her twisted interpretation.
Joe
November 25, 2014 at 10:40 AM PDT
Really? No one says that all mutations are directed, Tamara.
Indeed, but that is not the opposite of "some undirected mutations are beneficial", is it? And in the past you have refused to accept that a random mutation can ever be beneficial. Are you just trying to hide what you really think from scrutiny*, or just not being clear about your changed position on the issue?
*I use that word in its dictionary sense: "searching examination or investigation; minute inquiry." A previous comment of yours suggests you take the term to mean something different.
Tamara Knight
November 25, 2014 at 10:10 AM PDT
A qualitative assessment of meaning that makes the underlying quantitative Shannon metric irrelevant.
What? Please explain.
We know that minor changes to a few bits of a genome can massively affect the quality of what you define as CSI
Actually, Crick defined biological information, and yes, minor changes can ruin a coding sequence.
whilst keeping the quantity of Shannon information (and also presumably the quantity of CSI) essentially unchanged.
That is the difference between CSI and CSI capacity.
Where I suspect we differ is that you will claim that the beneficial change can only occur where the new genome has been sprinkled with the designers’ magic dust,
And computers are sprinkled with the programmers' magic dust.
and a similar point mutation which produces the identical change in the quality of the CSI can never, under any circumstances whatsoever, result from a random copying error.
Really? No one says that all mutations are directed, Tamara.
Joe
November 25, 2014 at 7:43 AM PDT
Joe: CSI is just complex Shannon “information” with meaning or function.
Isn't that what I just postulated above? The problem is in the "just ... with meaning or function": a qualitative assessment of meaning that makes the underlying quantitative Shannon metric irrelevant. We know that minor changes to a few bits of a genome can massively affect the quality of what you define as CSI, whilst keeping the quantity of Shannon information (and also presumably the quantity of CSI) essentially unchanged. Being as you continually claim to accept the standard definitions, I presume you will agree that all such minor changes have a minimal effect on the Shannon information. Most will have a neutral effect on the owner of the genome and leave its CSI unchanged. Some will have a detrimental effect, degrading its CSI, and a few will have a beneficial effect which improves its CSI.
Where I suspect we differ is that you will claim that the beneficial change can only occur where the new genome has been sprinkled with the designers' magic dust, and that a similar point mutation which produces the identical change in the quality of the CSI can never, under any circumstances whatsoever, result from a random copying error. And for reasons only you know, and are not prepared to disclose to anybody.
Tamara Knight
November 25, 2014 at 6:52 AM PDT
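A minimal sketch of the point being argued here, assuming the usual uniform model over the four bases (the sequences below are made up for illustration): a point substitution changes the sequence but leaves the Shannon-style count of 2 bits per base untouched, which is exactly why that count says nothing about function.

import math

def shannon_bits_uniform(seq: str) -> float:
    # Assumes each of the four bases {A, C, G, T} is equally likely,
    # so every position contributes log2(4) = 2 bits.
    return len(seq) * math.log2(4)

original = "ATGGCCATTGTAATGGGCCGC"                 # hypothetical stretch of sequence
mutated  = original[:10] + "A" + original[11:]    # one point substitution

print(shannon_bits_uniform(original))  # 42.0 bits
print(shannon_bits_uniform(mutated))   # 42.0 bits -- the count is unchanged,
                                       # whatever the change does to function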
CSI is just complex Shannon "information" with meaning or function. IOW it is information in the everyday and normal use of the word. That is, if one insists on saying Shannon "information" isn't really information...
Joe
November 25, 2014 at 6:00 AM PDT
Tamara: The whole purpose of Shannon’s work was to measure the amount of information. Mung: Not really.
I was trying to keep it simple, Mung, but Shannon (as I understand and use his work) devised a way of quantifying information. Essentially it is a measure of how difficult it is for a sender of information to convey his exact and unambiguous meaning to a receiver under real-world conditions. If you think you have a deeper understanding of his work than I do, I assure you I am always keen to learn from others with a better understanding than I have. Please expand on where your understanding is different to mine, but use your own words so we can narrow down and debate our differences. Having to use a cut and paste of somebody else's understanding gets us nowhere unless that somebody is prepared to join in the debate.
Mung poses a question: "The putative reason given is that it would allow 'Dembski Information' to be distinguished from 'Shannon Information.' And the difference between the two is?" and then claims: "If there's no difference, then there is no need to introduce a new term, a new term that adds nothing of substance to the discussion."
Indeed, if there is no difference. But it is the "if" we are debating. If your assertion is true, then one of these two statements from my post at 78 must be wrong, because they are mutually exclusive:
For Shannon information, M is positive and nominally 32 Kbytes, T is positive and small, and T = Ta = Tb = Tc.
For Dembski/CSI, M is zero, T is indeterminate, but Ta > Tc > Tb.
Which one do you think is wrong and why? I feel pretty confident on the Shannon one, but accept I may be wrong on the Dembski/CSI one. However, if you claim they are the same, then it must follow that the quantity of Shannon information is totally uncorrelated with the quality of the product resulting from the "use" of that information. Whatever Dembski/CSI is, it has to have the property that it can be enhanced or degraded by any change (in fact even without any change) in the amount of Shannon information describing it. One measure is totally objective; the other by its very nature has to be subjective. How can there possibly be no difference between them, Mung?
Tamara Knight
November 25, 2014 at 4:48 AM PDT
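For reference, a minimal sketch of Shannon's measure as the quantity described above, using two made-up source distributions: the entropy is the average number of bits per symbol a sender needs in order to convey the outcome exactly.

import math

def entropy_bits(dist: dict) -> float:
    # Shannon entropy H = -sum(p * log2(p)), in bits per symbol.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

fair_coin   = {"H": 0.5, "T": 0.5}
biased_coin = {"H": 0.9, "T": 0.1}

print(entropy_bits(fair_coin))    # 1.0 bit per toss
print(entropy_bits(biased_coin))  # ~0.47 bits per toss: easier to predict,
                                  # so less needs to be conveyed on average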
Adapa, your judgment about what constitutes trolling is suspect, and Tamara already admitted to making up the term "Dembski Information."
Mung
November 22, 2014 at 8:32 PM PDT
Mung: "Suggestion on a name for it?" Falsely accusing someone of making up things by making up your own porkies about them should be called "Mungtrolling".
Adapa
November 22, 2014 at 8:05 PM PDT
Is there a reason ID critics ought to be allowed to just make stuff up and get away with it? Tamara made up the term "Dembski Information." The putative reason given is that it would allow "Dembski Information" to be distinguished from "Shannon Information." And the difference between the two is? If there's no difference, then there is no need to introduce a new term, a new term that adds nothing of substance to the discussion. I'm thinking this one needs to be added to the Darwinian Debating Devices. Suggestion on a name for it?
Mung
November 22, 2014 at 7:53 PM PDT
Let's recap: Tamara Knight @ 71:
I made it [Dembski Information] up as part of a proposed way to ensure there was no ambiguity as to whether we were discussing information “in the Dembski sense” or “in the Shannon sense”.
Where's the quandary, Tamara? Tamara:
The whole purpose of Shannon’s work was to measure the amount of information.
Not really.
Mung
November 21, 2014 at 7:12 PM PDT
Tamara, you're the one who has admitted to making up the term "Dembski Information" to cover up your ignorance of Dembski's actual position in the book. What's so hard to grasp? Can we move on now? Seriously, I thought I was saying I was willing to move on. Tamara:
Do you honestly not understand the difference between making up (i.e. defining) a label and making up the underlying concept?
You made up the label [Dembski Information] but had no underlying concept to attach to the label. Yes, I understand that. I don't say you made up the underlying concept. I say you made up the label. You admit to making up the label. What was the underlying concept? Tamara:
Lavoisier made up "hydrogen" to describe the chemical element that burns to produce water. It is a made-up word. I suggested we use the phrase "Dembski information" to discuss the case where information in the Dembski sense differs from that in the Shannon sense.
A neologism then. That's what you have created? Let's see how this works:
[Lavoisier] hydrogen - the chemical element that burns to produce water.
[Tamara] Dembski information - information in the Dembski sense that differs from that in the Shannon sense.
How are the two even remotely analogous?
Mung
November 21, 2014 at 5:56 PM PDT
And sparc, somebody finally bought a copy on amazon.co.uk. Its ranking shot up nearly 200,000 places as a result: it was ranked at 400,579 yesterday, and now it's at 201,079.
Tamara Knight
November 20, 2014 at 9:30 AM PDT
Updates on Amazon sales since Nov. 11:
Paperback edition: 9 copies in the US, 1 copy in Canada.
Kindle edition: 11 copies in the US, 1 copy in Canada.
I.e., a total of 21 copies within roughly a week. However, Amazon is not the only distributor.
sparc
November 18, 2014 at 7:59 PM PDT
Mung, you leave me in a bit of a quandary. Either I accuse you of being incapable of grasping a most basic concept of communication, or of being a contrarian through and through. I'm not sure which is ruder, but I can see no other explanation. Do you honestly not understand the difference between making up (i.e. defining) a label and making up the underlying concept? Lavoisier made up "hydrogen" to describe the chemical element that burns to produce water. It is a made-up word. I suggested we use the phrase "Dembski information" to discuss the case where information in the Dembski sense differs from that in the Shannon sense.
p.s. If, as Dembski claims, "Information is produced when certain possibilities are realized to the exclusion of others within a matrix of possibility" and if, as I claim, SMI applies to ANY probability distribution, then it does in fact follow that information can be measured. Dembski is correct, and there's nothing at all controversial here.
Of course there isn't, because both claims are statements of the obvious. The whole purpose of Shannon's work was to measure the amount of information. Think of SMI as a precise measure of how much you would need to tell me (or how much I would need to tell you) such that we both knew the same. The non sequitur comes when you then try and postulate some additional property to this information that does not affect the SMI. If you go back to my post 78, from my understanding of what you think Dembski is claiming, added CSI can either increase or decrease SMI, but please feel free to comment on my assertion that "Dembski/CSI M is zero, T is indeterminate, but Ta>Tc>Tb" if you know differently. Equating CSI to SMI is as useless as equating Shannon entropy to thermodynamic entropy. It's apples and pears.
Tamara Knight
November 17, 2014 at 10:33 AM PDT
p.p.s. And it's not the ID proponent who has to resort to making things up.
Mung
November 16, 2014 at 8:22 PM PDT
p.s. If, as Dembski claims, "Information is produced when certain possibilities are realized to the exclusion of others within a matrix of possibility" and if, as I claim, SMI applies to ANY probability distribution, then it does in fact follow that information can be measured. Dembski is correct, and there's nothing at all controversial here.
Mung
November 14, 2014 at 4:58 PM PDT
Tamara:
I made it up as part of a proposed way to ensure there was no ambiguity as to whether we were discussing information “in the Dembski sense” or “in the Shannon sense”. If you prefer, I’m happy to use the term “information” to mean information in its broadest everyday sense, and “Shannon information” only when making an appeal to the properties of Shannon information.
Actually, I'd prefer you say information when you mean information. If what you mean is "Shannon's Measure of Information" then say that, or SMI for short. Now, in an attempt to move things along: do you believe that SMI can be applied to ANY probability distribution? Did Shannon limit his measure to only certain probability distributions? Or am I just not making sense to you with these questions?
So how might Shannon and Dembski differ? That's hard to say, because while it's clear that Shannon was interested in the engineering problem and not the "meaning", it doesn't follow that he thought that there was such a thing as information that could exist independently of any matrix of probability. Dembski, OTOH, thinks that is the very defining feature of information. If so, there is no such thing as information that cannot be measured by SMI. (Assuming you agree that SMI applies to any probability distribution.) Let's just say that, for now, this is the case in theory. We may not know the relevant probability distribution, but it does not follow that information can exist without the actualization of a possibility within a matrix of possibility.
Information, Dembski writes, “is produced when certain possibilities are realized to the exclusion of others within a matrix of possibility…. It follows that information can be measured.”
Mung
November 14, 2014 at 4:51 PM PDT
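A short sketch of the claim above that the measure applies to any probability distribution (the distributions below are illustrative, not from the book): the same -log2(p) formula assigns bits to a realized outcome regardless of what the possibilities mean.

import math

def surprisal_bits(p: float) -> float:
    # Information in realizing an outcome of probability p.
    return -math.log2(p)

realized_outcomes = {
    "fair coin comes up heads":      1 / 2,
    "fair die comes up six":         1 / 6,
    "a nucleotide position reads A": 1 / 4,
}
for outcome, p in realized_outcomes.items():
    print(f"{outcome}: {surprisal_bits(p):.2f} bits")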
"As we saw in Chapter 6, information, as a numerical measure, is usually defined as the negative logarithm to the base two of a probability (or some logarithmic average of probabilities, often referred to as entropy). This has the effect of transforming probabilities into bits and of allowing them to be added (like money) rather than multiplied (like probabilities)...Such a logarithmic transformation of probabilities is useful in communication theory ..." - Dembski. p. 159 Shannon is rolling over in his grave!Mung
November 14, 2014 at 4:33 PM PDT
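A quick check of the "added like money rather than multiplied like probabilities" point in the passage quoted above, using two made-up independent events:

import math

p1, p2 = 1 / 4, 1 / 8          # two hypothetical independent events
joint = p1 * p2                # probabilities multiply: 1/32

bits1 = -math.log2(p1)         # 2.0 bits
bits2 = -math.log2(p2)         # 3.0 bits

print(-math.log2(joint))       # 5.0 bits for the joint event...
print(bits1 + bits2)           # ...which is just the two bit-counts added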
tragic m, there are probably a number of us who have read it or are reading it. I had put it down for a while but am now on Chapter 11.
Mung
November 14, 2014 at 4:26 PM PDT
Tamara: I made it up as part of a proposed way to ensure there was no ambiguity as to whether we were discussing information "in the Dembski sense" or "in the Shannon sense".
And you tell me to stop digging?
Tamara: Why bring up Shannon at all?
Mung: Because you asked about measuring information.
Tamara: No, I asked about measuring information [in the Dembski sense (something I just made up without giving it any meaning)]
No? You asked about measuring information. Let me bold that for you:
Tamara: No, I asked about measuring information
Right. You asked about measuring information. And somehow I am supposed to understand that you meant "Dembski information", something whose meaning apparently only you know? And that you were not talking about measuring information at all, or were talking about it in some special sense known only to you? Here's Tamara quoting from the OP:
Information, Dembski writes, “is produced when certain possibilities are realized to the exclusion of others within a matrix of possibility…. It follows that information can be measured.
She then follows that with a question:
Do you have a page number for where he tells us how to measure it?
IT. How to measure WHAT? Information? And I am the villain for pointing out the obvious? Please. Dembski was clearly talking about measuring information. You were clearly asking about how to measure [it] information. Now, is there something else we can help you with?
Mung
November 14, 2014 at 4:24 PM PDT
Has anyone on this thread actually read "Being as Communion", or is at least reading it? lol. I want to discuss that book. :( I am on chapter 16 now and I'm finding that chapter very interesting. He makes a point about fine-tuning that now seems obvious to me but I had never realized it before. Fine-tuning can't be defined mathematically because there's no way to define the number of possibilities.
tragic mishap
November 14, 2014 at 6:44 AM PDT
You should be able to calculate it, Tamara.
And I can, of course, based on your understanding of CSI. Generally:

            | Control | 256k base pair gene duplication | 256k base pair random insertion |
Information |    X    |              X+T                |              X+M                |

And specifically for the 256k base pair gene duplication in the proposed scenarios:

            | Scenario a) | Scenario b) | Scenario c) |
Information |    X+Ta     |    X+Tb     |    X+Tc     |

For any particular example:
For Shannon information, M is positive and nominally 32 Kbytes, T is positive and small, and T = Ta = Tb = Tc.
For Dembski/CSI, M is zero, T is indeterminate, but Ta > Tc > Tb.
Oh, and every Shannon variable is equal to the equivalent CSI variable. I can't get that bit to work, perhaps you can explain it? And incidentally, if you think anybody, ever, "utilizes" Morton's Demon, then the concept has gone totally over your head.
Tamara Knight
November 14, 2014 at 6:09 AM PDT
Morton's Demon pertains to evidence. And if there is ever any evidence that supports blind watchmaker evolution I will definitely take a look and consider it with all the respect science deserves. OTOH I see evos utilize Morton's Demon on a daily basis...
Joe
November 14, 2014 at 4:30 AM PDT
Tamara, it is two bits per nucleotide for the reason I presented. As for lacking knowledge and understanding, that would be you with respect to natural selection. Strange how you can never make a point but instead rely on innuendos and misrepresentations.
Joe
November 14, 2014 at 4:23 AM PDT
You should be able to calculate it, Tamara.
Joe
November 14, 2014 at 4:20 AM PDT
Shannon’s methodology for measuring information works with CSI. CSI is just complex Shannon information that has meaning or function.
Then you should be able to calculate it, Joe. Try the example cases at post 33. Just the direction of change would be a start.
Tamara Knight
November 14, 2014 at 4:14 AM PDT
What error?
Joe, when you said:
Wrong- 2 bits per nucleotide.
I thought you might have meant:
Right- I might have described it as 2 bits per nucleotide myself, but I can see that is exactly the same as two bits per base pair.
and I can see giving you the benefit of the doubt is a futile gesture. I'll just accept you're never wrong.
If I ever encounter one I will let you know.
I don't think you ever will, Joe. Some years ago I remember the concept of something called Morton's Demon* being introduced to a debate about Creationists. I can see it applies to elements of ID proponents too. I don't suppose any of us like to be seen to lack knowledge and understanding. We all try to cover up the gaps to some degree, present speculation as fact to seem knowledgeable and keep a conversation going, etc, etc. But I've never understood why some people feel the need to use their obvious intelligence to build mental barricades to protect their ignorance, rather than to continually try and reduce it.
*If you are not familiar, it is based on the concept of Maxwell's Demon, a hypothetical entity guarding a portal between two chambers of gas and blocking the entry of cold atoms. If such an entity were possible, it would allow a perpetual motion machine to be constructed. Heisenberg explained why it cannot exist. Morton's Demon performs a similar function and prevents uncomfortable information from ever reaching Morton's conscious brain.
Tamara Knight
November 14, 2014 at 4:11 AM PDT
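A minimal sketch of why the two phrasings discussed above come to the same thing, on the usual assumption of four equally likely bases: once one strand is given, base pairing fixes the other, so the complementary strand adds no further Shannon information.

COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement(strand: str) -> str:
    # The second strand is fully determined by the first.
    return "".join(COMPLEMENT[base] for base in strand)

strand = "ATGGCCATT"           # hypothetical single strand
print(complement(strand))      # TACCGGTAA -- no free choices remain
print(2 * len(strand))         # 18 bits either way: log2(4) = 2 bits per
                               # position, whether counted per nucleotide on
                               # one strand or per base pair of the duplex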
Shannon's methodology for measuring information works with CSI. CSI is just complex Shannon information that has meaning or function.
Joe
November 14, 2014 at 3:34 AM PDT
