Uncommon Descent Serving The Intelligent Design Community

Bad math: Why Larry Moran’s “I’m not a Darwinian” isn’t a valid reply to Meyer’s argument

Categories: Intelligent Design

Professor Larry Moran has written a response to my post, A succinct case for Intelligent Design. Unfortunately, Professor Moran gets his facts wrong from the get-go. He writes:

It seems to me that the [Intelligent Design creationist] movement concentrates on criticizing evolution (and materialism) and doesn’t really present much of a case for believing that the history of life was directed by gods.

Now, it’s no skin off my nose if Professor Moran wants to call us creationists. Frankly, I couldn’t care less. But the Intelligent Design movement has never claimed to have scientific evidence that the history of life was “directed by gods.” What we claim is that certain highly specific, functional systems which are found in living things were designed by some intelligent agent or agents. By “intelligent,” I don’t mean “humanlike”; rather, what I mean is: capable of engaging in abstract reasoning when selecting suitable means to achieve one’s goals. In the most clear-cut Intelligent Design cases, the agent has to engage in mathematical reasoning – whether it be about squares (in the case of the monolith on the Moon in the movie 2001, whose sides are in the ratio 1:4:9) or about digital code (in the case of the DNA we find in living things), or about which complex geometrical arrangements of amino acid chains will prove to be capable of performing a biologically useful task (in the case of protein design).

When I speak of the agent’s “goals,” I don’t mean the agent’s personal motives for doing something, which we have no way of inferring from the products they design; rather, I simply mean the task that the agent was attempting to perform, or the problem that they were trying to solve. Beyond that, there is nothing more that we could possibly infer about the agent, unless we were acquainted with them or with other members of their species. For instance, we cannot infer that the designer of an artifact was a sentient being (since the ability to design doesn’t imply the ability to feel), or a material being (whatever that vague term means), or a physical entity (since there’s no reason why a designer needs to exhibit law-governed behavior), or even a complex or composite entity. To be sure, all the agents that we are familiar with possess these characteristics, but we cannot infer them from the products designed by an agent. Finally, the fact that an agent is capable of performing a variety of functions does not necessarily imply that the agent is composed of multiple detachable parts. We simply don’t know that. In short: the scientific inferences we can make about non-human designers are extremely modest.

Moran’s verdict: “No case for Intelligent Design”

After quoting the 123-word passage from Meyer’s book which I highlighted in my original post, summarizing the four fundamental problems with unguided evolution, Professor Moran accuses Dr. Meyer of claiming that Intelligent Design must be true because Darwinism is false:

This passage merely affirms what we all know to be true; namely that there is no case for Intelligent Design Creationism. It’s just a bunch of whining about the inadequacies of the IDiot version of evolution. That version assumes that all of evolution is due to natural selection acting on random mutations and this gives rise to the appearance of design.

I don’t believe in that version of evolution and I don’t think that most species look as though they were designed. Does that mean that I’m an Intelligent Design Creationist? Of course not. Meyers (and Torley) have fallen for the trap of the false dichotomy.

Even if all four of Stephen Meyer’s critiques were correct, he still isn’t offering an alternative explanation and he still isn’t showing us evidence for an intelligent designer—or any other kind of designer.

As anyone who has read Darwin’s Doubt knows, this is a complete travesty of Meyer’s argument. Professor Moran is displaying his ignorance here.

The evidence for an intelligent designer, in a nutshell

Dr. Meyer’s case for an intelligent designer is spelt out with admirable lucidity in an Evolution News and Views post titled, Does Darwin’s Doubt Commit the God-of-the-Gaps Fallacy? (October 16, 2013). The argument proceeds as follows:

Premise One: Despite a thorough search and evaluation, no materialistic causes or evolutionary mechanisms have demonstrated the power to produce large amounts of specified or functional information (or integrated circuitry).

Premise Two: Intelligent causes have demonstrated the power to produce large amounts of specified/functional information (and integrated circuitry).

Conclusion: Intelligent design constitutes the best, most causally adequate, explanation for the specified/functional information (and circuitry) that was necessary to produce the Cambrian animals…

In fact, the argument for intelligent design developed in Darwin’s Doubt constitutes an “inference to the best explanation” based upon our best available knowledge….[A]n inference to the best explanation …asserts the superior explanatory power of a proposed cause based upon its established — its known — causal adequacy, and based upon a lack of demonstrated efficacy, despite a thorough search, of any other adequate cause. The inference to design, therefore, depends on present knowledge of the causal powers of various materialistic entities and processes (inadequate) and intelligent agents (adequate).

Meyer’s argument can also be found in chapters 17 and 18 of his book, Darwin’s Doubt. Sadly, Professor Moran evinces no sign of having read those chapters. One wonders whether he merely skimmed Dr. Meyer’s book.

Why the neutral theory of evolution won’t remedy the deficiencies of neo-Darwinism

But let us return to Professor Moran’s remarks about natural selection. In his introduction to The Origin of Species, Charles Darwin wrote: “I am convinced that Natural Selection has been the main but not the exclusive means of modification.” In a similar vein, Richard Dawkins famously declared: “Evolution by natural selection is the only workable theory ever proposed that is capable of explaining life, and it does so brilliantly.”

Professor Moran does not share these views. He rejects the view that “evolution is due to natural selection acting on random mutations and this gives rise to the appearance of design,” forthrightly asserting: “I don’t believe in that version of evolution.” He maintains that “a huge number of mutations are neutral and there are far more neutral mutations fixed by random genetic drift that there are beneficial mutations fixed by natural selection.” This, he declares, is what modern-day evolutionists believe. In an earlier post, he complains that “you have to read very carefully to find any mention of modern evolutionary theory in Meyer’s book – he prefers to focus his attack on mutation + natural selection.”

What Professor Moran does not tell us here is that Dr. Stephen Meyer wrote a detailed and extensive critique of the neutral theory of evolution in his book, Darwin’s Doubt. In his critique, Dr. Meyer focuses on the ground-breaking work of Dr. Michael Lynch, a geneticist who espouses the neutral theory of evolution. Meyer argues that this theory is incapable of accounting for the origin of new animal body plans, because it is built on faulty mathematical assumptions (bolding mine – VJT):

Michael Lynch, a geneticist at Indiana University, … proposes a neutral or “non-adaptive” theory of evolution in which natural selection plays a largely insignificant role…

Lynch argues that in small populations, animal genomes will inevitably grow over time as nonprotein-coding sections of DNA (as well as gene duplicates) accumulate due to the weakness of natural selection. He thinks that these neutral mutations drive the evolution of animals.

… [F]or Lynch’s theory to explain the origin of new and functional genes and proteins (and the anatomical complexities that depend on them), his theory would have to solve the problem of combinatorial inflation… He would have to show that random mutations could efficiently search the relevant combinatorial space of possible sequences corresponding to a given novel functional gene or protein.

Nevertheless, Lynch does not even address the problem of combinatorial inflation or the closely related problem of the rarity of genes and proteins in sequence space…

Lynch does argue in one paper that neutral evolutionary processes can generate new complex adaptations – adaptations requiring multiple coordinated mutations – within realistic waiting times. In particular, writing in a recent paper with colleague Adam Abegg of St. Louis University, he argues that “conventional population genetic mechanisms” such as random mutation and genetic drift can cause the “relatively rapid emergence of specific complex adaptations.” …

But some things are just too good to be true, and it turns out that Lynch and Abegg made a subtle but fundamental mathematical error in coming to their conclusion. Appropriately, perhaps, the first person to demonstrate that Lynch’s incredible claim was problematic was Douglas Axe… In the end, he traced Lynch and Abegg’s claims to two erroneous equations, both of which were based on erroneous assumptions. In essence, Lynch and Abegg assumed that organisms will acquire a given complex adaptation by traversing a direct path to the new anatomical structure. Each mutation would build on the previous one in the most efficient manner possible – with no setbacks, false starts, aimless wandering, or genetic degradation – until the desired structure or system (or gene) is constructed. Thus, they formulated an undirected model of evolutionary change, and one that assumes, moreover, that there is no mechanism available (such as natural selection) that can lock in potentially favorable mutational changes on the way to some complex advantageous structure….

Yet nothing in Lynch’s neutral model ensures that potentially advantageous mutations will remain in place while other mutations accrue. As Axe explains, “Productive changes cannot be ‘banked,’ whereas Equation 2 [one of Lynch’s equations] presupposes that they can.” Instead, Axe shows, mathematically, that degradation (the fixation of mutational changes that make the complex adaptation less likely to arise) will occur much more rapidly than constructive mutations, causing the expected waiting time to increase exponentially.
(2013, pp. 321, 322, 326, 327-328)
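Axe’s “banking” point can be illustrated with a toy calculation (my own sketch, not Lynch and Abegg’s or Axe’s actual equations): if each favourable change is locked in once it occurs, the expected number of trials grows only linearly with the length of the target, whereas if nothing preserves partial progress, the whole target must turn up in a single configuration and the expected waiting time grows exponentially.

```python
def trials_with_banking(length: int, alphabet: int) -> int:
    # If each correct "letter" is locked in once found, the sites are
    # searched independently: about `alphabet` trials per site on average.
    return length * alphabet

def trials_without_banking(length: int, alphabet: int) -> int:
    # If nothing preserves partial progress, the whole target must turn
    # up in a single configuration: alphabet**length trials on average.
    return alphabet ** length

# Toy target: a 10-"residue" sequence over the 20 standard amino acids.
print(trials_with_banking(10, 20))     # 200
print(trials_without_banking(10, 20))  # 10240000000000 (about 10^13)
```

Even in this toy case, removing the locking-in mechanism turns a 200-trial search into one of roughly ten trillion trials, and the gap widens exponentially with target length.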

Quoting Marshall – but missing the big picture

In another post, Professor Moran quotes with relish from a critical review of Dr. Meyer’s book by the eminent UC paleontologist, Professor Charles Marshall:

…when it comes to explaining the Cambrian explosion, Darwin’s Doubt is compromised by Meyer’s lack of scientific knowledge, his “god of the gaps” approach, and selective scholarship that appears driven by his deep belief in an explicit role of an intelligent designer in the history of life.

However, Dr. Meyer has responded at length to Professor Marshall’s criticisms, in a four-part series. Meyer’s most telling points can be found in his second post, which is titled, To Build New Animals, No New Genetic Information Needed? More in Reply to Charles Marshall. I’ll quote a few brief excerpts (bolding mine – VJT):

…Marshall simply assumes that most of the genetic information necessary to build the Cambrian animals already existed before the Cambrian explosion. In fact, he seems to presuppose the existence of what Susumu Ohno called a “pananimalian genome,”16 a nearly complete set of the genes necessary to build Cambrian animals within some phenotypically simpler, ur-metazoan ancestor. Thus, he states the new animal phyla “emerged through the rewiring of the gene regulatory networks (GRNs) of already existing genes.”17 …

Nevertheless, this question-begging assumption does not solve the central problem posed by Darwin’s Doubt — that of the origin of the genetic (and epigenetic) information necessary to produce the Cambrian animals. It merely pushes the problem back several tens or hundreds of millions of years, assuming that such a universal genetic toolkit ever existed.

Readers of the book will recall my discussion, in Chapters 9 and 10, of recent mutagenesis experiments. These experiments have established the extreme rarity of functional genes and proteins among the many (combinatorially) possible ways of arranging nucleotide bases or amino acids within their corresponding “sequence spaces.” … This extreme rarity also helps to explain why mathematical biologists, using standard population genetics models, are calculating exceedingly long waiting times (well in excess of available evolutionary time) for the production of new genes and proteins when producing such genes or proteins requires even a few coordinated mutations.20

For these reasons, defining the Cambrian explosion as a 25 million year event, as Marshall does, instead of a 10 million year event, as many other Cambrian experts do (and as I do in Darwin’s Doubt), makes no appreciable difference in solving the problem of the origin of genetic information — such is the extreme rarity of functional bio-macromolecules within their relevant sequence spaces. Nor, for that matter, does positing the origin of a complete set of genes (that is, many more than just one) for building all the Cambrian animals 100 million years before the Cambrian explosion. That merely pushes the problem back…In any case, the experimentally based calculations in Darwin’s Doubt show that neither ten million, nor several hundred million years would afford enough opportunities to produce the genetic information necessary to build even a single novel gene or protein, let alone all the new genes and proteins needed to produce new animal forms.
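The scale of the numbers Meyer is invoking can be checked with back-of-the-envelope arithmetic. In the sketch below, the functional fraction is Douglas Axe’s published estimate (roughly 1 in 10^77 of 150-residue sequences yielding a functional fold); the trial budget is my own deliberately generous illustrative assumption, not a figure taken from the book:

```python
import math

RESIDUES = 150               # length of a modest protein domain
ALPHABET = 20                # the standard amino acids
FUNCTIONAL_FRACTION = 1e-77  # Axe's estimate for 150-residue functional folds

# Size of the sequence space for a 150-residue chain: 20**150 ~ 10^195.
log10_space = RESIDUES * math.log10(ALPHABET)
print(round(log10_space))    # 195

# A deliberately generous trial budget (illustrative assumption):
# 10^40 replications per second for 10^17 seconds.
trials = 10 ** (40 + 17)

# Expected number of functional sequences a blind search would hit:
expected_hits = trials * FUNCTIONAL_FRACTION
print(f"{expected_hits:.0e}")  # far fewer than one expected hit
```

On these (hedged) numbers the expected hit count is on the order of 10^-20, which is the arithmetic behind Meyer’s claim that stretching the window from ten to twenty-five million years makes no appreciable difference.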

Nobody would question Professor Marshall’s expertise in paleontology, but the argument in Dr. Stephen Meyer’s book, Darwin’s Doubt, is ultimately a mathematical one. Until evolutionists demonstrate that they can grapple with the mathematics in Meyer’s argument, their criticisms of his book will continue to miss the mark.

Reading the critical reviews of Meyer’s book reinforced my conviction that many contemporary biologists fail to grasp that the scientific case for unguided evolution is built on a foundation of faulty math. As a philosopher of science, Dr. Meyer is to be congratulated for having the courage to publicly declare that the emperor has no clothes.

Finally, here’s what Harvard geneticist George Church (who is by no means an Intelligent Design theorist) said about Darwin’s Doubt:

Stephen Meyer’s new book Darwin’s Doubt represents an opportunity for bridge-building, rather than dismissive polarization — bridges across cultural divides in great need of professional, respectful dialog — and bridges to span evolutionary gaps.

Readers can find many more comments on this Web page by highly qualified scientists praising Darwin’s Doubt. Professor Moran is welcome to call them all “idiots” if he likes. But somehow I don’t think he’ll do that. Or will he?

Comments
Upright BiPed:
I gave you a material definition of my terms at the point I used the word. You have ignored that definition and set out to obfuscate the material issue behind the word. I won’t be enabling you. It’s an embarrassment.
Why would you use an established term such as "protocol" and redefine it to mean "convention"? There is no reason to do that. Simply use terms as they are already defined. Kairosfocus's description of "protocol" does not match yours.
Carpathian
May 30, 2015 at 12:52 PM PDT
kairosfocus:
Carpathian, have you looked at the diagrams I have put up? If not, kindly cf Yockey’s mapping of protein synthesis. Note the encoding process to RNA code (including snipping and splicing) then transfer through nuclear ports to the ribosome, where there is a decoding driven, 3-letter codon framing process that is used to create the protein chain, terminating of course with a stop codon. KF
Kairosfocus, I accept your description of a mapping which causes a problem for Upright BiPed. What I'd like to see from you is an example of a protocol in the cell just as you defined it with your layered communications protocol. Where in the cell does this occur? I don't see two intelligent agents in the cell. I don't see even one.
Carpathian
May 30, 2015 at 12:48 PM PDT
Carpathian, have you looked at the diagrams I have put up? If not, kindly cf Yockey's mapping of protein synthesis. Note the encoding process to RNA code (including snipping and splicing) then transfer through nuclear ports to the ribosome, where there is a decoding driven, 3-letter codon framing process that is used to create the protein chain, terminating of course with a stop codon. KF
kairosfocus
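For readers following the codon-framing description, here is a minimal Python sketch of a fixed three-letter reading frame driven by a look-up table, terminating at a stop codon (a tiny illustrative subset of the genetic code, not the full table):

```python
# A tiny subset of the standard genetic code (RNA codons -> amino acids).
CODON_TABLE = {
    "AUG": "Met",  # start codon; also codes for methionine
    "UUU": "Phe",
    "GGC": "Gly",
    "UAA": None,   # stop codon: terminates the chain
    "UAG": None,   # stop codon
}

def translate(mrna: str) -> list:
    """Read the message in a fixed 3-letter frame until a stop codon."""
    chain = []
    for i in range(0, len(mrna) - 2, 3):
        codon = mrna[i:i + 3]
        residue = CODON_TABLE.get(codon)
        if residue is None:    # stop codon (or codon outside this subset)
            break
        chain.append(residue)
    return chain

print(translate("AUGUUUGGCUAAUUU"))  # ['Met', 'Phe', 'Gly'] - stops at UAA
```

Whether such a framed, table-driven mapping counts as a "protocol" or merely a "code" is exactly the terminological dispute running through this thread; the sketch only shows the mechanics being described.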
May 30, 2015 at 12:32 PM PDT
Carp, please hold a record album up to your ear. Hear anything? The reason you don't hear the album is because there are no vibrations in air pressure coming from it. You need vibrations in air pressure, right around your head, and that's your big problem. How do you get from grooves in a vinyl disc to vibrations in air pressure around your head? What on earth would be required for that? My best guess is that you might need something very particular. Most probably, something very particular to the way the grooves are arranged on the disc in the first place. It seems to me that the method for retrieving the information off the disc would have to be in some sort of substantial prior agreement with the way the disc was made.
you have failed to meet kairosfocus’s definition of what a protocol is
I gave you a material definition of my terms at the point I used the word. You have ignored that definition and set out to obfuscate the material issue behind the word. I won't be enabling you. It's an embarrassment. As always, it doesn't matter what you call it, and that's the real point here. Bye.
Upright BiPed
May 30, 2015 at 12:14 PM PDT
mike1962:
The protocol exists by the fact that codons are always composed of three nucleotides each and the existence of two stop codons. This protocol is more fundamental than the data that is conveyed. That’s enough to qualify as a protocol.
No it's not. Convince kairosfocus that it is enough to qualify as a protocol as he described it.
Carpathian
May 30, 2015 at 11:55 AM PDT
Upright BiPed:
No one can be this dense. Seven days of it.
I would never call you dense, but you have failed to meet kairosfocus's definition of what a protocol is. There is a "look-up table", not a protocol. Match up what happens in a cell with kairosfocus's protocol. Shut me up by showing that. You can't, because there is no equivalent in a cell. Your semiotic argument is also wrong. Peirce, the originator of the term, says a footprint in the mud is semiotic as it is a "sign" of human activity. Putting a footprint in the mud to signify that there was human activity is not semiotic. It's because of you that I have looked at Peirce. What he says does not agree with your interpretation. Semiotic codes are not used to produce a result as you claim. Show me where Peirce claims that semiosis is a mechanism for producing a result.
Carpathian
May 30, 2015 at 11:50 AM PDT
No one can be this dense. Seven days of it.
Upright BiPed
May 30, 2015 at 11:23 AM PDT
Carpathian: Protocol has already been defined outside of ID, and in information technology, can be defined as the rules of communication between at least two intelligent agents. Where are the intelligent agents in the cell?
If I write a program to generate random data and send it to unused IP addresses using a pre-defined UDP protocol, where is the intelligent receiver? There is none, even though the UDP protocol is employed, refuting your demand that a protocol necessarily requires a receiver in any given instance of utilization. But to answer your question with regards to cells, the sending agent is the cell that built the DNA strand, the receiving agent is the cell that interprets it and builds proteins. All utilizing a predetermined protocol defined as tri-nucleotide codons with two stop codons.
That is not a protocol, it is a translation.
The format of the transference of information is not arbitrary. The protocol exists by the fact that codons are always composed of three nucleotides each and the existence of two stop codons. This protocol is more fundamental than the data that is conveyed. That's enough to qualify as a protocol. But by all means keep on denying the obvious.
mike1962
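mike1962's UDP point is easy to verify in a few lines: a datagram goes out with no handshake and no acknowledgement, whether or not anyone is listening on the far end (the loopback address and port below are arbitrary placeholders):

```python
import socket

# UDP is connectionless: no handshake, no acknowledgement, and the
# send succeeds whether or not anything is listening at the destination.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
payload = b"random data nobody is listening for"
sent = sock.sendto(payload, ("127.0.0.1", 49999))  # arbitrary unused port
sock.close()
print(sent)  # number of bytes handed to the network stack
```

The `sendto` call returns normally even with no receiver, which is the sense in which UDP remains a protocol without any per-message two-way exchange.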
May 30, 2015 at 11:14 AM PDT
groovamos:
Making the argument dependent on a software engineering point of view is obfuscation.
That is exactly what ID is doing. The claim is that "information" is being processed. Upright BiPed claims a "protocol" is at work along with a "code". kairosfocus talks about a "von Neumann" replicator. Biology is about chemistry, not software engineering.
Carpathian
May 30, 2015 at 10:10 AM PDT
kairosfocus:
Carpathian, in any communicative situation, there naturally are two sides, transmission and reception.
But where are they in the cell? That is the question that requires an answer in order for Upright BiPed to be correct that there is a protocol when it comes to cellular activity. I am easily corrected when I make a mistake. I don't see two intelligent nodes in the cell. Show me where they are and I will agree with Upright BiPed that there is a protocol at work in the cell. From what you've written about layered communications, I think you agree with me that a "look-up" table does not qualify as a protocol.
Carpathian
May 30, 2015 at 09:14 AM PDT
Mung:
I could go on and on.
And you probably will.
Carpathian
May 30, 2015 at 09:07 AM PDT
mike1962:
And if you don’t think the codon/ribosome protein synthesis system utilizes a protocol (e.g, why precisely three nucleotides to a codon, and what the hell does a stop-codon do?) then that’s your problem.
That is not a protocol, it is a translation. A protocol requires two intelligent nodes. Show me two intelligent nodes exchanging information in a cell. If by "protocol", ID means "translation", then use the term "translation". If by "protocol", ID means "convention", then use the term "convention". Protocol has already been defined outside of ID, and in information technology, can be defined as the rules of communication between at least two intelligent agents. Where are the intelligent agents in the cell?
Carpathian
May 30, 2015 at 09:04 AM PDT
Carpathian, in any communicative situation, there naturally are two sides, transmission and reception. That such an irreducibly complex entity should arise by blind chance and mechanical necessity is utterly implausible indeed, and such would have to be acting on the TX side and the Rx side without correlation, given the high contingency and lack of unifying purpose. However, that a protocol correlating the two sides can be the product of an individual designer, or a team, or a negotiated standard for research or industry, is a well-known matter of fact. In short, you were looking at the design inference in action through the wrong end of the telescope. KF
kairosfocus
May 30, 2015 at 08:56 AM PDT
mike1962:
What does your fallible memory have to do with that fact that the point you were trying to make was patently wrong. You didn’t merely get the name wrong, you claimed UDP is not a protocol. Your memory of the name of the protocol has nothing to do with the fact that you denied UDP is a protocol.
I got the acronym wrong. In x.25 and other protocols, packets can remain unacknowledged for a period of time, i.e. the sending end will not require an ACK for a packet before it sends the next, but will send a block of up to n packets before the receiving end has to start ACKing.
Carpathian
May 30, 2015 at 08:54 AM PDT
kairosfocus:
7 –> The dividing line here is that – codes address content that uses discrete state elements [e.g. alphanumeric characters, codons for genes, binary digits], – protocols are concerned with setting up co-ordinated communication with due regard to the natural layer-cake effect, and – mod/demod is concerned with encapsulating, sending, propagating and receiving then recovering signals in the midst of noise (and having regard to bandwidth and channel capacity issues).
Where are the two sides of this co-ordinated communication in the cell? Can you give me a cellular equivalent to the link layer? Can you give me a cellular equivalent to the transport layer?
Carpathian
May 30, 2015 at 08:48 AM PDT
kairosfocus:
The protocol context is that of interaction and a standardised programme of correct behaviour.
The interaction is from at least two intelligent parties. A man walks into a room, bows before a king, who then says a few words, and then taps the man on the shoulders with a sword. That is an example of protocol. Two intelligent agents are required for a protocol. Where are there at least two intelligent agents in the cell? Show me the intelligent agents required for a protocol that exist in the cell.
Carpathian
May 30, 2015 at 08:41 AM PDT
Andre:
But yes, in a nutshell, protocols in communication systems in their simplest form are a set of rules. Do our interlocutors agree with the meaning?
I agree but you don't go far enough. A protocol defines rules for at least two ends in a communications system. In a multi-drop environment, you usually have one master and many slaves, with every single one of them being an intelligent node. In this scenario, the protocol also defines who is allowed to transmit messages at any given time. None of this multiple intelligent communications happens in the cell. Everything anyone has presented so far has been at the physical layer. Mung mentions during his Morse code example something he calls "discontinuity". I think what he means is "independence" of data. In other words, the data on the telegraph wires is time and position independent. Any char could at any time occupy a place on the data stream. This does not happen with DNA. It is fixed like the groove on a vinyl record and it is copied the same way.
Carpathian
May 30, 2015 at 08:30 AM PDT
groovamos:
You made the statement that, vague as it reads, seems to say that a phonograph record and playback apparatus is not used in information transfer. Which if that is what is being said, is bogus.
What I said, as vague as it reads, is that there is no protocol, i.e., the bi-directional transfer of information that is required for a protocol. As an example, you cannot send a NAK to the phonograph record, nor can you send a NAK to the cartridge. The vinyl record analogy is exactly what we see in regards to the workings of a cell. The DNA is for all intents and purposes static. Like the grooves in a vinyl record, the DNA simply gets copied. There is no other high-level "IT-like" communications happening. I handle correction well. Show me that a protocol exists as Upright BiPed claims. There is nothing even close to the high-level two-way communications that is required for a protocol.
Carpathian
May 30, 2015 at 08:19 AM PDT
Andre, the underlying logic of standardised peer-peer formatting at multiple levels within a communication system -- let's just call it the layer-cake protocol effect [LCPE] -- is patent, and links to what happens in royal courts etc just beg to be used: e.g. peers. A necessity if it is to work. Such can properly be described as a protocol. Once we extend beyond one-off level to cases where we want networks of interacting units, broadcast, simplex, half duplex, full duplex modes, the natural tendency is for standards to spread out, and the need for documentation then becomes obvious. But already at the one-off level, the FSCO/I and IC pointing strongly to design are present. KF PS: I am beginning to think part of what we are facing is the Wikipedia, domineering selectively hyperskeptical, rhetorically manipulative ideological faction subculture spreading out across the web. Complete, with conclusion- and- sentence- already- in- hand- just- find- a- plausible- peg- to- hang- on witch-hunts for heretics such as we here at UD represent. Cyber-stalking and even on the ground stalking naturally follow from such a mentality.
kairosfocus
May 30, 2015 at 02:28 AM PDT
Andre, I added the above exchange to the headlined FYI-FTR: https://uncommondescent.com/irreducible-complexity/fyi-ftr-communication-system-framework-model/ KF PS: I see Evolve seems to be absent, and has been since the FYI-FTR was put up. Let's see if there is a return on the weekend.
kairosfocus
May 30, 2015 at 02:20 AM PDT
KF, thank you for that. The question was probably directed incorrectly; this is more for the hyper-sceptics. But yes, in a nutshell, protocols in communication systems in their simplest form are a set of rules. Do our interlocutors agree with the meaning?
Andre
May 30, 2015 at 01:48 AM PDT
Andre, the general and relevant meanings are readily accessible and clips have been given. Relevant here is a Wiki clip:
In telecommunications, a communication protocol is a system of rules that allow two or more entities of a communication system to communicate between them to transmit information via any kind of variation of a physical quantity. [--> notice, generality, which goes beyond particular discrete-state cases, albeit such cases are particularly important in creating the concept behind the term] These are the rules or standard that defines the syntax, semantics and synchronization of communication and possible error recovery methods. Protocols may be implemented by hardware, software, or a combination of both.[1] Communicating systems use well-defined formats (protocol) for exchanging messages. [--> Format, of course extends beyond discrete state, and in fact all real world signals are analogue [consider impacts of power supply glitches and the role of decoupling capacitors if you doubt me], and certain imposed standards and thresholds.] Each message has an exact meaning intended to elicit a response from a range of possible responses pre-determined for that particular situation. The specified behavior is typically independent of how it is to be implemented. Communication protocols have to be agreed upon by the parties involved.[2] To reach agreement, a protocol may be developed into a technical standard. A programming language describes the same for computations, so there is a close analogy between protocols and programming languages: protocols are to communications as programming languages are to computations.[3]
In recent times digital comms has dominated and handshaking too, so I think there is a tendency of overly specific focus. The blunder above by an objector, on User Datagram PROTOCOL (it's right there in the name) is emblematic. Wiki is again instructive, speaking against known ideological bias:
The User Datagram Protocol (UDP) is one of the core members of the Internet protocol suite. The protocol was designed by David P. Reed in 1980 and formally defined in RFC 768. UDP uses a simple connectionless transmission model with a minimum of protocol mechanism. It has no handshaking dialogues, and thus exposes any unreliability of the underlying network protocol to the user's program. There is no guarantee of delivery, ordering, or duplicate protection. UDP provides checksums for data integrity, and port numbers for addressing different functions at the source and destination of the datagram. With UDP, computer applications can send messages, in this case referred to as datagrams, to other hosts on an Internet Protocol (IP) network without prior communications to set up special transmission channels or data paths. UDP is suitable for purposes where error checking and correction is either not necessary or is performed in the application, avoiding the overhead of such processing at the network interface level. Time-sensitive applications often use UDP because dropping packets is preferable to waiting for delayed packets, which may not be an option in a real-time system.[1] If error correction facilities are needed at the network interface level, an application may use the Transmission Control Protocol (TCP) or Stream Control Transmission Protocol (SCTP) which are designed for this purpose.
The protocol context is that of interaction and a standardised programme of correct behaviour. That's why the term was borrowed from the context of diplomacy, courts of law and royal courts. There is no reason to confine it to discrete state cases -- which in reality are also analogue anyway. As my response to your diagram request shows, the logic of what is going on is primary, attached labels, secondary:
1 --> To communicate there must be a co-ordinated in-commonness of corresponding elements . . . hence
2 --> the natural emergence of layered peer units in the comms system. As well,
3 --> we naturally have start point and destination and wish to send an understandable message.
4 --> This leads to standards, specifications and co-ordination regarding:
- transduction,
- modulation and/or encoding,
- ports/interfaces [at all sorts of levels, well do I recall incoming/outgoing specs for TTL, CMOS & ECL logic and for UARTs],
- power amplification and coupling to a channel/medium,
- detection of a message at the receiving end,
- demod and decoding,
- presentation to the sink.
5 --> All of this naturally leads to a need for standards within a comms system, and standardisation naturally tends to spread where there is an incentive to be in mutual communication, e.g. the spreading of AM radio and stereophonic records [the fate of quadraphonic records is instructive on failure to meet reasonable accord].
6 --> Terms for such standards, such as codes, modulation systems and protocols, are secondary to the underlying realities they describe.
7 --> The dividing line here is that
- codes address content that uses discrete state elements [e.g. alphanumeric characters, codons for genes, binary digits],
- protocols are concerned with setting up co-ordinated communication with due regard to the natural layer-cake effect, and
- mod/demod is concerned with encapsulating, sending, propagating and receiving then recovering signals in the midst of noise (and having regard to bandwidth and channel capacity issues).
8 --> The upshot is that, once communication becomes a significant, non-trivial task, standards and a complex framework of specific rules embedded in the organisation of functional elements lead to a system that is in itself information-rich.
9 --> That is, any complex communication system implies functionally specific complex [irreducibly so in fact -- all of the core has to be there and has to be right for the whole to work] organisation and associated information, FSCO/I.
10 --> FSCO/I, per trillions of observed cases, has just one empirically known source, design; a point backed up by the needle-in-haystack blind chance and necessity search challenge, regardless of dismissive rhetoric to the contrary.
11 --> And in the discrete signal case, we further deal with code, which is a manifestation of a phenomenon that in itself strongly points to verbalising intelligence as root cause.
12 --> Where, when we come across entities that manifest FSCO/I like this, the underlying FSCO/I is embedded in the organisation of the system, and
13 --> it can therefore in principle be retrieved and measured by analysing the system and subsystems on node-arc networks and devising a reasonable structured chain of Y/N questions to specify the description.
14 --> The chain length of Y/N questions is then an index, in functionally specific bits, of the info content of the organisation.
15 --> Such holds for hardware, and for software insofar as the latter is embedded in moving or stored signals.
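Points 13 and 14 above can be put as a toy calculation. This is only a sketch, on the simplifying assumption that the distinct configurations can be enumerated; the numbers are purely illustrative:

```python
import math

def yn_chain_length(num_configs):
    """Length of the shortest structured chain of yes/no questions that
    singles out one configuration among num_configs possibilities, i.e.
    an information measure in (functionally specific) bits."""
    return math.ceil(math.log2(num_configs))

# Illustrative toy: 10 arc positions on a node-arc network, each taking
# one of 4 node types, gives 4**10 possible configurations.
print(yn_chain_length(4 ** 10))  # 20 bits
```

Each yes/no answer halves the remaining possibilities, so the chain length is the base-2 logarithm of the configuration count, rounded up.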
So, we see how the underlying logic of communication systems of any significant complexity points to design as credible source, due to the embedded FSCO/I. In the world of the cell, as Yockey summarised in his diagram (cf. the just linked) and as we can see in action in other diagrams and a video there, the protein synthesis system embeds a communication system pivoting on D/RNA as string data structure coded elements that hold regulatory and assembly instructions as well as the content of such.

Where, too, the proteins, functional RNAs etc that come from the code further show FSCO/I that is remote from the physical-chemical action steps involved in the communication and assembly process. (Think, string chaining --> folding --> agglomeration and activation --> biofunction.) Next, for proteins and proteinaceous enzymes [I here distinguish ribozymes], the functionally relevant configs are deeply isolated in AA sequence space, and for that matter AA-AA peptide bonds are themselves not the only chemically relevant possibilities in play.

Now, it is quite evident that cumulatively such strongly points to intelligently directed, highly skilled configuration as cause, i.e. design. But it is predictable that such a conclusion will be stoutly resisted by all sorts of rhetorical artifices, due to a priori commitment to evolutionary materialism. If you doubt me, note Lewontin in his notorious 1997 NYRB remark:
It is not that the methods and institutions of science somehow compel us to accept a material explanation of the phenomenal world, but, on the contrary, that we are forced by our a priori adherence to material causes [[--> another major begging of the question . . . ] to create an apparatus of investigation and a set of concepts that produce material explanations, no matter how counter-intuitive, no matter how mystifying to the uninitiated. Moreover, that materialism is absolute [[--> i.e. here we see the fallacious, indoctrinated, ideological, closed mind . . . ], for we cannot allow a Divine Foot in the door. [Those tempted to cry the accusation, quote-mining should see the annotated fuller cite as linked]
Until that ideological bewitchment is exposed, highlighted as a crude fallacy, and becomes utterly untenable as a violation of the vision of seeking empirically warranted truth that gave science the credibility it had, there will be no willingness to receive anything counter to such ideological closed-mindedness, no matter how compelling. Hence my emphasis on showing the facts and inviting the reasonable onlooker to see for him- or her- self what is going on. KF

kairosfocus
May 30, 2015, 01:26 AM PDT
Hi Andre, We're trying to define a protocol for how we will go about defining the meaning of protocol. Any suggestions?

Mung
May 29, 2015, 08:08 PM PDT
Do people know what protocol actually means, or are we going to differ on its meaning?

Andre
May 29, 2015, 06:59 PM PDT
the stereochemical hypothesis
...the first models of the genetic code were all based on the stereochemical hypothesis, the idea that the coding rules are dictated by chemical relationships in three dimensions, whereas language is made of arbitrary conventions. Eventually, however, the stereochemical hypothesis had to be abandoned because it became clear that the rules of the genetic code are not the result of chemical necessity. In this sense they are as arbitrary as the rules of language, and this makes us realize that at the molecular level there is not only recursion but also arbitrariness. - Marcello Barbieri, Code Biology
Mung
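Barbieri's point about arbitrariness can be illustrated with a toy decoder. The "standard" entries below are real codon assignments, but the "alternative" table is purely hypothetical, made up here only to show that a different consistent mapping would function just as well as a mapping:

```python
# A code is a lookup table, not a chemical necessity: swapping the
# assignments yields a different but equally workable mapping.
standard    = {"AUG": "Met", "UGG": "Trp", "UUU": "Phe", "UAA": "STOP"}
alternative = {"AUG": "Phe", "UGG": "Met", "UUU": "Trp", "UAA": "STOP"}

def decode(mrna, table):
    """Read the message three letters at a time until a stop codon."""
    out = []
    for i in range(0, len(mrna) - 2, 3):
        aa = table[mrna[i:i + 3]]
        if aa == "STOP":
            break
        out.append(aa)
    return out

seq = "AUGUGGUUUUAA"
print(decode(seq, standard))     # ['Met', 'Trp', 'Phe']
print(decode(seq, alternative))  # ['Phe', 'Met', 'Trp']
```

Nothing in the decoding machinery privileges one table over the other; the rules sit in the table, which is precisely the sense in which the assignments are arbitrary rather than chemically forced.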
May 29, 2015, 05:30 PM PDT
Carpathian: There are actually no exceptions at all but the data carried by the medium, is completely unrelated to how it is delivered.

No exceptions to non-baseband digital communications links such as ethernet exist? How so? Point us to some knowledgeable source on that, please.

Carpathian: This again in no way suggests that a protocol is present.

????

Carpathian: The presence of information is not related in any way to the protocol.

Why the hang-up on protocols? This is nothing but deflection. I made a wide-angle-view statement that all machinery around the exchange of information is itself built on information. Whether that machinery is built on software/hardware or hardware only should present no either/or scenario to grapple with. This is all obfuscation based around software engineering, which can be dispensed with. You made a statement that, vague as it reads, seems to say that a phonograph record and playback apparatus is not used in information transfer. If that is what is being said, it is bogus. Making the argument dependent on a software-engineering point of view is obfuscation.

groovamos
May 29, 2015, 03:04 PM PDT
I bet there's no C compiler in the cell either. So therefore no code. No RAM chips, so no memory in the cell. And no CPU in the cell either. Therefore no processing. I could go on and on.

Mung
May 29, 2015, 10:11 AM PDT
Carpathian: Every now and then my fallible memory shows itself. It’s been years since I’ve done any work with this stuff.
What does your fallible memory have to do with the fact that the point you were trying to make was patently wrong? You didn't merely get the name wrong; you claimed UDP is not a protocol. Your memory of the name of the protocol has nothing to do with the fact that you denied UDP is a protocol.

mike1962
May 29, 2015, 09:54 AM PDT
Carpathian: Nothing like this happens in the cell.
Nothing like a subject of the Crown bowing before the Queen or a servant pouring wine in a glass happens in a cell either. So what? Protocols can be implemented in numerous ways, in numerous media. What makes a protocol a protocol is a previously agreed-upon set of rules governing communication. What's going on in a cell with protein synthesis is wildly more ingenious and interesting than your C code above. And if you don't think the codon/ribosome protein synthesis system utilizes a protocol (e.g., why precisely three nucleotides to a codon, and what does a stop codon do?) then that's your problem.

mike1962
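On the parenthetical question of why precisely three nucleotides, a simple capacity count answers it. A sketch (the count of 20 standard amino acids is real; the function is my own illustration):

```python
# With a 4-letter alphabet (A, C, G, U), a codon of length n provides
# 4**n distinct codewords. Coding 20 amino acids plus at least one stop
# signal needs more than 16 codewords, so n = 2 is too short and n = 3
# is the minimum that suffices (64 codewords, leaving room for the
# redundancy actually observed in the genetic code).
def min_codon_length(alphabet_size=4, symbols_needed=20 + 1):
    n = 1
    while alphabet_size ** n < symbols_needed:
        n += 1
    return n

print(min_codon_length())  # 3
```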
May 29, 2015, 09:41 AM PDT
Mung: Here is a part of what it takes to implement a protocol driver.
#define HDLC_EscapeBit 0x20

unsigned HDLC_State;
enum {
    HDLC_State_AwaitingOpeningFlag,
    HDLC_State_AwaitingClosingFlag,
    HDLC_State_ReceivedFrame
};

unsigned PPP_State;
enum {
    PPP_State_LinkDead,
    PPP_State_LinkEstablishment,
    PPP_State_LCP_LocalOptions,
    PPP_State_LCP_LocalOptions_AwaitResp,
    PPP_State_LCP_LocalOptions_AwaitResp_2,
    PPP_State_LCP_LocalOptions_AwaitResp_3,
    PPP_State_LCP_LocalOptions_AwaitResp_4,
    PPP_State_AuthenticationPhase,
    PPP_State_NetworkLayerProtocolPhase,
    PPP_State_LinkTerminationPhase
};
Here's more:
HDLC_fcs = PPPINITFCS16;
if( uiLen && pSrc && pDest && (uiLen > 8) )
{
    HDLC_fcs = (HDLC_fcs >> 8) ^ fcstab[(HDLC_fcs ^ HDLC_AllStationsAddress) & 0xff];
    pDest[TxCount++] = HDLC_Control_Esc;
    pDest[TxCount++] = HDLC_Control_Field ^ HDLC_EscapeBit;
    HDLC_fcs = (HDLC_fcs >> 8) ^ fcstab[(HDLC_fcs ^ HDLC_Control_Field) & 0xff];
..and more:
while( pFrame->State != HDLC_State_ReceivedFrame )
{
    if( SerialRead( pConnection->uiHandle, &ucHDLC_Octet, 1 ) == 0 )
    {
        break;
    }
    switch( pFrame->State )
    {
    case HDLC_State_AwaitingOpeningFlag:
        pFrame->EscapeFlag = 0;
        pFrame->Count = 0;
        pFrame->State = HDLC_State_AwaitingClosingFlag;
        pFrame->fcs = PPPINITFCS16;
        if( ucHDLC_Octet == HDLC_FlagSequence )
        {
            break;
        }
    case HDLC_State_AwaitingClosingFlag:
        if( ucHDLC_Octet == HDLC_FlagSequence )
        {
            if( pFrame->Count == 0 )
            {
                pFrame->EscapeFlag = 0;
                pFrame->fcs = PPPINITFCS16;
                break;
            }
Nothing like this happens in the cell.

Carpathian
May 29, 2015, 09:35 AM PDT