Uncommon Descent Serving The Intelligent Design Community

Is the term “biological information” meaningful?


Some say that it’s not clear that the term is useful. A friend writes, “Information is always ‘about’ something. How does one quantify ‘aboutness’?” He considers the term too vague to be helpful.

He suggests the work of Christoph Adami, for example, this piece at Quanta:

The polymath Christoph Adami is investigating life’s origins by reimagining living things as self-perpetuating information strings.

Once you start thinking about life as information, how does it change the way you think about the conditions under which life might have arisen?

Life is information stored in a symbolic language. It’s self-referential, which is necessary because any piece of information is rare, and the only way you make it stop being rare is by copying the sequence with instructions given within the sequence. The secret of all life is that through the copying process, we take something that is extraordinarily rare and make it extraordinarily abundant.

But where did that first bit of self-referential information come from?

We of course know that all life on Earth has enormous amounts of information that comes from evolution, which allows information to grow slowly. Before evolution, you couldn’t have this process. As a consequence, the first piece of information has to have arisen by chance.

A lot of your work has been in figuring out just that probability, that life would have arisen by chance.

On the one hand, the problem is easy; on the other, it’s difficult. We don’t know what that symbolic language was at the origins of life. It could have been RNA or any other set of molecules. But it has to have been an alphabet. The easy part is asking simply what the likelihood of life is, given absolutely no knowledge of the distribution of the letters of the alphabet. In other words, each letter of the alphabet is at your disposal with equal frequency. More.
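Adami’s “easy part” has a standard closed form: if every letter of a k-letter alphabet is equally likely at each position, one specific sequence of length L has probability k^-L, equivalently L·log2(k) bits. A minimal sketch (the function names are mine, purely for illustration):

```python
import math

def sequence_probability(length: int, alphabet_size: int) -> float:
    """Probability of one specific sequence when each letter of a
    k-letter alphabet is equally likely at every position: k**-length."""
    return alphabet_size ** -length

def sequence_information_bits(length: int, alphabet_size: int) -> float:
    """The same quantity expressed in bits: length * log2(k)."""
    return length * math.log2(alphabet_size)

# A specific 100-letter RNA string (4-letter alphabet) under the
# uniform assumption: probability 4**-100, i.e. 200 bits.
p = sequence_probability(100, 4)
bits = sequence_information_bits(100, 4)  # 200.0
```

This is only the uniform-frequency baseline Adami describes as easy; the hard part he points to is that the real letter distribution at the origin of life is unknown.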

So Chance wrote an alphabet? And created “aboutness”? It’s not the same type of chance we live with today.

See also: Does the universe have a “most basic ingredient” that isn’t information?

New Scientist astounds: Information is physical

and

Data basic: An introduction to information theory

Follow UD News at Twitter!

Comments
Eric Anderson: "Bits can be calculated with math. Function is recognized by our cognition." I am not sure I understand the difference. Math and calculations are a form of cognition too. Everything we do has a cognitive foundation and aspect. Have you read my post #26? Attributing a binary value (male or female, functional or not functional) to what we observe is a form of assessment which is the foundation of all scientific reasoning, as much as measuring variables by continuous numbers. Both procedures are strictly scientific, both procedures are a form of cognition. The concept of measure is a very refined form of cognition. So, where is the problem? Measuring functional information is not different from measuring weight in people according to their sex. "I don’t think we can throw out the entire concept of common descent." And I agree with you. Designed descent by designed modifications is a perfectly reasonable way to implement a new design. That's what many programmers do when they implement new features in existing software.gpuccio
February 17, 2017 09:49 AM PDT
Mung: "But if someone were to ask you, how many steps does it take to create ‘x’ amount of function what would you be able to say, meaningfully? In what units is function measured?" I am not sure I understand the question. Let's say that a program which is n bits long can do something. Then you decide that you want to add some new functionality, so you define the new functionality and then you implement it changing the program code. Let's say that both the original code and the new addition are extremely efficient in terms of programming. Now, your new program is n+x bits long. So, the functional complexity linked to the added function is x bits. Where is the problem?gpuccio
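gpuccio’s bookkeeping above can be sketched in a few lines (the function name and byte strings are mine, purely for illustration; the load-bearing assumption, as he notes, is that both programs are maximally efficient implementations):

```python
def added_functional_bits(original_program: bytes, extended_program: bytes) -> int:
    """If a minimal program of n bits gains a new function and the minimal
    extended program is n + x bits, the functional complexity attributed
    to the added function is x. Assumes both inputs are already minimal."""
    n = len(original_program) * 8
    n_plus_x = len(extended_program) * 8
    return n_plus_x - n

# Hypothetical example: a 1,024-byte program grows to 1,280 bytes
# when a new feature is added, so x = 256 bytes = 2,048 bits.
x = added_functional_bits(b"\x00" * 1024, b"\x00" * 1280)  # 2048
```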
February 17, 2017 09:42 AM PDT
bill cole @29: Thanks for your further thoughts.
The question in my mind is if the theory is completely broken and not just Neo-Darwinism but all ideas of common descent.
I think you have to be more specific. Not all ideas of common descent are broken. After all, presumably you accept the fact that you descended from your parents, grandparents, great-grandparents and so on. And presumably you accept the fact that you are slightly different from your great-great-grandfather. Now I know that isn't what you are talking about, which is why we need to be more specific. Some design proponents would propose a type of "front loading" that would play out the various forms of life over time. There is scant evidence for such a scenario and it might turn out to be wrong, but at least it would be consistent with design principles -- and would also be a form of common descent. Some design proponents are also willing to accept limited common descent, in the sense that new species could flow from an original form. So, for example, we could end up with various species of finch that relate back to a common ancestor. There is some decent evidence for this kind of common descent. Whether we have new species or variations on a single species often turns on our definitions and particulars of categorization, but the idea that an ancestral population can eventually split into two or more slightly different populations over time has merit. So, no. I don't think we can throw out the entire concept of common descent. The key, as I said, is the claim that all this can come about through a long series of accidents. That is the preposterous claim. But that is also the key claim. If that fails, then the overall theory fails. And it fails even if there are some other biological aspects that make sense (limited common descent, genetics, etc.) independent of evolutionary theory. Many aspects of biology don't depend on evolutionary theory, and don't need it to be true for them to be true.Eric Anderson
February 17, 2017 09:22 AM PDT
Origines @25:
Gpuccio studies protein sequences which are functional and are preserved in many species. Both properties are strong indicators for not being ‘bits of nonsense’.
gpuccio @27:
Thank you! That’s exactly what I am trying to say.
----- How does gpuccio know that they are not bits of nonsense? Because he ran a calculation of the bits and came up with some number of bits? No! Because, as you say, he recognized that there is an indicator of information: function. Bits can be calculated with math. Function is recognized by our cognition.Eric Anderson
February 17, 2017 09:07 AM PDT
gpuccio:
Functional complexity is a simple and powerful concept, and it is objective and measurable.
But that wasn't really the question, or at least it wasn't my question. :) One can always create an operational definition, which is what I think you have done. But if someone were to ask you, how many steps does it take to create 'x' amount of function, what would you be able to say, meaningfully? In what units is function measured?Mung
February 17, 2017 08:12 AM PDT
Eric gpuccio
That we can build a functionally-integrated, complex, highly-scalable, massively-parallel system architecture, that operates on the basis of a 4-bit digital code, with storage, retrieval and translation mechanisms, implementing concatenation and error correction algorithms — that all this and more can be built by introducing random errors into a database.
Yes, this is absurd. The question in my mind is if the theory is completely broken and not just Neo-Darwinism but all ideas of common descent. If we look at the design concept of multicellular life, it appears to be optimized to reduce variation. The theory is founded on the concept of variation. As gpuccio properly states, mutations do occur, but the evidence is mostly loss of function with an occasional serendipitous adaptive advantage. So the design concept of the genome is a sequence which creates diversity as long as the sequence is known or designed. Any random change is usually problematic. Without very accurate replicating and error correction mechanisms, I don't believe multicellular life is possible. This design flies in the face of any evolutionary concept of animal a and b evolving from animal c. Is Genesis the most accurate description we have?bill cole
February 17, 2017 07:57 AM PDT
bill cole: I am not in particular an expert about DNA repair. It is certainly a very complex and fascinating issue, and we can assume that it is extremely efficient in correcting random errors. It's enough to look at this Wikipedia page to understand what happens if that complex repair system cannot work well: https://en.wikipedia.org/wiki/DNA_repair-deficiency_disorder However, we certainly know that random mutations do happen, even in the presence of some normally functioning DNA repair mechanism. Indeed, there are different ways to measure mutations. It is also reasonable to believe that random mutations are the cause of genetic diseases in humans, including those genetic diseases that cause defects in the DNA repair system. For example, look at the following Wikipedia page about xeroderma pigmentosum: https://en.wikipedia.org/wiki/Xeroderma_pigmentosum In particular: "One of the most frequent defects in xeroderma pigmentosum is an autosomal recessive genetic defect in which nucleotide excision repair (NER) enzymes are mutated, leading to a reduction in or elimination of NER.[6] If left unchecked, damage caused by ultraviolet light can cause mutations in individual cell's DNA." And the table in that page lists the main known mutations that cause the disease. All that should not be a surprise. We know well, from software programming, that even the best error correction system can never prevent all possible errors, especially in very complex systems. And biological systems are extremely complex.gpuccio
February 17, 2017 12:38 AM PDT
Origenes: "Gpuccio studies protein sequences which are functional and are preserved in many species. Both are strong indicators for not being ‘bits of nonsense’." Thank you! :) That's exactly what I am trying to say. I have been doing exactly that in these recent days, exploring whole genome comparisons using the metrics I have proposed in my posts here, and I get real results, real numbers that measure objective things. So, it's really difficult for me to read about the difficulty of measuring functional complexity, because I know from personal experience that it can be measured. It can be easy or very difficult, in different cases, but there is no absolute obstacle, in principle. Functional complexity is a simple and powerful concept, and it is objective and measurable.gpuccio
February 17, 2017 12:26 AM PDT
Eric Anderson @22: Well, I agree that we agree in essence. However, I am too, in this discussion, "focusing on an important nuance that is too often glossed over". When you say that "The latter (specification, function) is not so amenable to calculation", I think you are not really considering that, for a binary variable, establishing if what we observe can be categorized in one or the other level of the variable is a quantitative assessment. You may not want to call it a "calculation", but that's exactly part of the quantitative analysis we daily perform in science. So, if I measure weight in a number of people, I have a numerical variable. But if I group my subjects according to their sex, I have a binary variable. According to your language, I am "calculating" weight (I am measuring it, and I can derive quantitative parameters, like mean, standard deviation, and so on), and I am "recognizing" if the subjects are male or female. Therefore, function is essentially not different from sex, or any other binary (or, in general, categorical) variable that we "recognize" according to some explicit definition that allows a categorization. There is no special elusive quality in function, any more than there is in sex, or any other categorization. But there is more: the recognition of function is a procedure that requires two important steps: a) An explicit definition of the function, and how to assess its presence or absence: that takes the whole issue out of subjectivity, and makes it wholly objective. b) The application of the above procedure to each individual case, that is a quantitative assessment. For example, if we define the function of a watch as the ability to measure time with at least a specific level of precision, we have to check how well each object in our sample measures time, to decide if we can categorize it as a watch, or not. That is a calculation, a calculation that gives us the final binary value for the variable: "is this a watch?".
So, function is not some mystic property. It can be observed, measured, and expressed as a binary value. So, the bits necessary to implement a function are no mystical concept. They can be measured as the minimum amount of specific configuration that can objectively implement the defined function.gpuccio
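gpuccio's two-step procedure (an explicit definition, then a measurement yielding a binary value) can be sketched in one small function; the threshold value here is mine, chosen only to make the example concrete:

```python
def is_watch(seconds_error_per_day: float, tolerance: float = 60.0) -> bool:
    """Step (a): the explicit definition -- 'keeps time to within
    `tolerance` seconds per day'. Step (b): apply the measurement to
    each object, yielding a binary (categorical) value."""
    return seconds_error_per_day <= tolerance

# Two measured objects: one drifts 5 s/day, the other 300 s/day.
print(is_watch(5.0))    # True  -> categorized as a watch
print(is_watch(300.0))  # False -> not a watch, by the stated definition
```

The point of the sketch is only that the categorization is objective once the definition and threshold are stated, which is the claim being made in the comment above.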
February 17, 2017 12:20 AM PDT
Eric Anderson:
GPuccio: We can measure the amount of specific bits of information that are necessary to implement some well defined function.
Bits of what? What makes you think you are calculating bits of information? For all we know, you might be calculating bits of nonsense.
Gpuccio studies protein sequences which are functional and are preserved in many species. Both properties are strong indicators for not being 'bits of nonsense'.Origenes
February 17, 2017 12:17 AM PDT
Eric Anderson @23:
If we stop and think about what is actually being claimed, we start to realize what an absolute, utter, complete, preposterous absurdity it is.
"If we stop and think..." That's a fundamental requirement that unfortunately we can't guarantee. People are free to think seriously or not. Once people start thinking seriously, we see things like the website "a third way of evolution" created. And still they may not be thinking with a totally open mind, outside preconceived boxes.Dionisio
February 16, 2017 08:57 PM PDT
bill cole @21:
Could this whole idea of evolution be completely wrong?
What is the fundamental, foundational, basic claim of evolution? There are lots of claims, but at the most basic level the NeoDarwinian claim is this: That we can build a functionally-integrated, complex, highly-scalable, massively-parallel system architecture, that operates on the basis of a 4-bit digital code, with storage, retrieval and translation mechanisms, implementing concatenation and error correction algorithms -- that all this and more can be built by introducing random errors into a database. If we stop and think about what is actually being claimed, we start to realize what an absolute, utter, complete, preposterous absurdity it is.Eric Anderson
February 16, 2017 05:24 PM PDT
gpuccio @20:
We can measure the amount of specific bits of information that are necessary to implement some well defined function.
Bits of what? What makes you think you are calculating bits of information? For all we know, you might be calculating bits of nonsense. The number of bits (per Shannon) can be precisely the same in each case and the calculation tells us precisely nothing about whether we are dealing with information or not. Rather, it is our recognition of meaning, purpose, function that tells us we are dealing with information, not the fact that we have calculated some particular number of bits. That is the point. We can measure complexity. But we recognize information -- because it informs.
That is the measure of the functional complexity of that function, as defined.
Not quite. It is the measure of the complexity only. The functional part (or the information) must be determined separately and apart from the complexity. And it is not a bits calculation. It is a recognition of, as you say, function, or meaning or purpose.
The important point is: complex functions, those that require a high number of specific bits to be implemented (for example, more than 500 – 1000 bits), are only detected in designed objects. That kind of specific configuration requires the intervention of a conscious being, capable of understanding and purpose and will, to exist in a material object. That’s why, if we observe a material object that implements a complex function, we can infer design for its origin.
Absolutely agree. We are on the same page generally. I am, in this discussion, focusing on an important nuance that is too often glossed over. Namely, that complex specified information consists both of (a) complexity, and (b) specification (or function, if you prefer). The first can be calculated in various ways, including, per Shannon, in bits. The latter is not so amenable to calculation. It is a recognition -- an awareness of the fact that we are dealing with specification, meaning, function, purpose, something that informs. When we look at some functional system and say that it exceeds 500 bits, there are two parts to the analysis: a recognition of meaning/function, and a calculation of complexity. Both go together. And both are necessary in order to infer design. But we don't calculate 'CSI' in toto. We calculate the 'C'. We recognize the 'SI'.Eric Anderson
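Eric's split between the calculable 'C' and the recognized 'SI' can be illustrated directly: under Shannon's measure with a uniform source model, a functional string and a scramble of the very same letters score identically. (The DNA snippet below is hypothetical, chosen only for illustration.)

```python
import math
import random

def shannon_bits_uniform(s: str, alphabet_size: int) -> float:
    """Shannon measure under a uniform source model: every symbol
    contributes log2(k) bits, regardless of what the string means."""
    return len(s) * math.log2(alphabet_size)

# A hypothetical 24-letter coding snippet and a scramble of its letters:
functional = "ATGGCCATTGTAATGGGCCGCTGA"
scrambled = "".join(random.sample(functional, len(functional)))

# Same length, same 4-letter alphabet: identical bit counts (48.0 each),
# even though at most one of the two strings codes for anything.
print(shannon_bits_uniform(functional, 4) == shannon_bits_uniform(scrambled, 4))
```

This is exactly the sense in which the bit count measures complexity while the specification must be recognized separately.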
February 16, 2017 05:13 PM PDT
gpuccio A slight change of subject. What is your level of knowledge of DNA repair? I understand it is a process that occurs during the cell cycle but also when transcription takes place. It is accurate to about 1 error in 10^10, but when a human is built there are 22 billion potential mutations across the animal. There are, however, only about 1 or so per every few cells. If there is another pass (DNA repair) when the gene is transcribed, then the 1 gets corrected during transcription. If this is true, what are the real raw materials of evolutionary theory, since mutations are continuously corrected with very rare exceptions? Could this whole idea of evolution be completely wrong? BTW, cancers often occur when this machinery breaks.bill cole
February 16, 2017 04:23 PM PDT
Eric Anderson: I insist. We can measure the amount of specific bits of information that are necessary to implement some well defined function. That is the measure of the functional complexity of that function, as defined. So, functional complexity refers to a specific function, not to the object itself. However, the basic idea is: if the object can perform the defined function, because its configuration exhibits all the necessary bits, then we can say that the object exhibits functional information, in that amount and for that function. There is no absolute information in an object. In a sense, the highest "information" can be found in random strings, but that "information" is not specially functional. A random string is not a computer program. It cannot implement highly specific functions, like ordering numbers, or solving mathematical equations, and so on. The important point is: complex functions, those that require a high number of specific bits to be implemented (for example, more than 500 - 1000 bits), are only detected in designed objects. That kind of specific configuration requires the intervention of a conscious being, capable of understanding and purpose and will, to exist in a material object. That's why, if we observe a material object that implements a complex function, we can infer design for its origin.gpuccio
February 16, 2017 03:59 PM PDT
groovamos @16: Some good thoughts. This jumped out at me though:
Shannon showed that information can be measured.
Shannon specifically said he was not measuring information -- certainly not in any substantive sense that might be relevant. So-called Shannon "information" would be much better described as the Shannon "measure". Measurement of what? Essentially, channel carrying capacity. If Shannon's result had been termed the "Shannon Measure", rather than "Shannon Information", we would all be a lot better off and no small amount of confusion would have been avoided.Eric Anderson
February 16, 2017 03:10 PM PDT
Mung:
I agree with other posters that information is always information about something. But when it comes to “biological information” what does that term mean, and can the concept of “biological information” be quantified such that it can be measured?
We quantify and measure complexity, not information per se.
Can “aboutness” be quantified and measured?
Not really. But see my last comment below.
Would people agree or disagree that using the concept of Shannon information isn't helpful because Shannon information isn't about aboutness.
So-called "Shannon information" isn't helpful to the task, because it isn't measuring information -- as Shannon himself explained.
If some proposed measure of “biological information” isn’t measuring aboutness, what is it measuring?
Good question. I personally don't think information lends itself very well to a precise mathematical measurement -- as though information were some kind of physical object that could be measured, weighed, and quantified precisely. We could assign values to certain pieces of information and then find out whether an artifact contains those pieces of information. Then we could come up with a measurement value and declare that the artifact has such-and-such "quantity" of information. However, such an approach would simply be an exercise in categorizing, assigning values, and then comparing artifacts against our assigned values. I suppose that could constitute "measuring" and "quantifying" information in some loose sense. Might even be useful in some cases. But I suspect we would have so many exceptions and corner cases that any rules would be swallowed up by the exceptions.Eric Anderson
February 16, 2017 03:04 PM PDT
groovamos: "I've said it on here before, things as extant don't have information without an involved mind." But machines can do things, and they can do those things because of the information that a mind implemented in them. We, as conscious observers, can witness what a machine can do, and define that as a function. The simple fact remains that a function, once defined, needs some amount of information to be implemented. That is the functional information for that function.gpuccio
February 16, 2017 11:42 AM PDT
Shannon showed that information can be measured. I would like to know from Mr. Adami, that if information from the "environment" resides in the genes, as he apparently thinks, how much "information" resides in the environment? How much information from my genes matches up somehow with the environment? And BTW Mr. Adami, where in the genes is the information that makes your face appear the way it does, and your voice to sound the way it does? Where exactly? Did that information start out in the "environment" and if so, where is it in the "environment"?

I've said it on here before: things as extant don't have information without an involved mind. A rock doesn't have information; only the descriptions, the measurements with tolerances, etc. have information. Since there can be arbitrarily large numbers of descriptions of a rock, including an almost infinite number of measurements with associated tolerances which can in themselves be almost infinitely precise, a rock can be said to represent infinite information, which is to say that for humans it really has zero information, but our description of the rock can encompass an arbitrarily large amount of information. Just ONE measurement of the rock can contain information approaching infinity if it is estimated to a precision approaching infinity. So this brings us back to a requirement for information measure to make any sense: a mind or minds are somehow involved.

Shannon avoided philosophical implications of his work, and I think this has caused some confusion and misunderstanding in the ID community. For example, Tom Bethell in his wonderful "Darwin's House of Cards" quotes Stephen Meyer: "The problem is Shannon's theory 'cannot distinguish between functional or message bearing signals from random or useless ones' as Stephen Meyer said...." Big problem for Bethell and Meyer in saying this.

It takes a mind to decide if a "random" signal is useless, and a mind might not have a clue as to what "random" means, and whether measurements of randomness can be undertaken (actually yes), which is why I put the word in quotes elsewhere. "Random" signals are generated by processes. Such processes, or classes of them, may be useful for a particular mind and useless to everyone else. A weather station encoding wind direction and speed over a comm link is sending randomly generated numbers, useless to almost everyone except a few people. A person trying to type out random letters may think they are random, and to him they might be useless (not really: there has to be a motivation if the person is sane). But his method in doing so will have bias(es). We have an example of that on this thread: https://uncommondescent.com/design-inference/intelligent-design-basics-information-part-iii-shannon/ On that link we have the OP actually posting a "random" string of all caps @39. Having a mind, I can estimate a probability of the caps lock being on during the typing. The more letters are in the string, the higher the probability. At 100% probability (or at 0%), the caps-lock state carries zero bits of Shannon information. But this is why Shannon bowed out of the philosophical aspects of meaning as it applies to the measurement of information. Even he showed that the surprisal value of a message or character transmitted/stored is a measure of information content. But how much information is in a message like "The sun shines on the earth" based on surprisal? You would have to come up with some arcane scenarios to show that a class of minds would find the information value there.

The bottom line is this: the ID community thinkers and Mr. Adami should be more careful in their discussion of Shannon information.groovamos
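The surprisal measure groovamos invokes has a simple closed form, -log2(p): a certain event carries zero bits, a rare one carries many. A minimal sketch (function name is mine):

```python
import math

def surprisal_bits(p: float) -> float:
    """Shannon surprisal of an event with probability p: -log2(p).
    A certain event (p = 1.0) carries zero bits; rare events carry many."""
    if not 0.0 < p <= 1.0:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

# "The sun shines on the earth": for almost any reader p is close to 1,
# so the surprisal is close to zero, whatever the sentence *means*.
print(surprisal_bits(1.0))      # 0.0 bits -- no surprise, no information
print(surprisal_bits(0.5))      # 1.0 bit  -- a fair coin flip
print(surprisal_bits(1 / 256))  # 8.0 bits -- a rare event
```

Note the measure is silent about meaning, which is exactly the limitation being discussed.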
February 16, 2017 11:31 AM PDT
Mung: "But are you really quantifying the amount of information if you're not quantifying the amount of aboutness?". Yes. I am quantifying the amount of information necessary to implement the function as defined. There are quantitative variables and qualitative (categorical) variables. There is nothing special in that. "Is there a way to measure "biological information" and the situations in which that measure can properly be used." Of course there is. I have done exactly that many times. "Is there more than one way to measure "biological information"?" Of course. You can change the measure unit, the methodology, and so on. Again, there is nothing special in that. "For example, can the amount of "biological information" inherited by a child from each parent be quantified?" Of course. In general, for most genes, children inherit a similar amount of information (one allele from each parent). Of course, the functionality of each allele can be different. For example, one allele can be non functional, like in genetic diseases. Or it can be imprinted, and therefore not expressed in the child. "How about the amount of "biological information" gained or lost during the course of evolution in a specific lineage." That's exactly what I am working on. You could have a look at some of my previous posts: https://uncommondescent.com/intelligent-design/homologies-differences-and-information-jumps/ https://uncommondescent.com/intelligent-design/information-jumps-again-some-more-facts-and-thoughts-about-prickle-1-and-taxonomically-restricted-genes/ https://uncommondescent.com/intelligent-design/the-highly-engineered-transition-to-vertebrates-an-example-of-functional-information-analysis/gpuccio
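The kind of quantity gpuccio describes resembles (on my reading, an assumption rather than his stated formula) Szostak-style functional information: I = -log2(M/N), where M is the number of sequences that perform the defined function and N is the total number of sequences of that length. A sketch:

```python
import math

def functional_information_bits(functional_sequences: int, total_sequences: int) -> float:
    """Szostak-style functional information: -log2(M/N). M counts the
    sequences that meet the explicit functional definition; N counts all
    sequences of that length. The hard empirical problem is estimating M."""
    if not 0 < functional_sequences <= total_sequences:
        raise ValueError("need 0 < M <= N")
    return -math.log2(functional_sequences / total_sequences)

# Hypothetical: if 1 in every 2**150 sequences of some length performs
# the defined function, that function is specified by 150 bits.
print(functional_information_bits(1, 2 ** 150))  # 150.0
```

The formula is well defined once the function is explicitly specified; everything contentious lives in estimating M, which is where the sequence-conservation arguments in this thread come in.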
February 16, 2017 08:43 AM PDT
gpuccio writes:
There is no need to quantify it [aboutness], because it can be treated as a categorical, binary variable. What we quantify is the information linked to some functional specification.
But are you really quantifying the amount of information if you're not quantifying the amount of aboutness? Is there a way to measure "biological information" and the situations in which that measure can properly be used. Is there more than one way to measure "biological information"? For example, can the amount of "biological information" inherited by a child from each parent be quantified? How about the amount of "biological information" gained or lost during the course of evolution in a specific lineage.Mung
February 16, 2017 07:21 AM PDT
I agree with other posters that information is always information about something. But when it comes to "biological information" what does that term mean, and can the concept of "biological information" be quantified such that it can be measured? Can "aboutness" be quantified and measured? Would people agree or disagree that using the concept of Shannon information isn't helpful because Shannon information isn't about aboutness. :) If some proposed measure of "biological information" isn't measuring aboutness, what is it measuring?Mung
February 16, 2017 07:09 AM PDT
gpuccio @8:
Strange that such simple and important concepts are so obstinately ignored.
You pressed the right button as usual: it's about "functionally specified information". (Perhaps "cooked" with a "complex complexity" flavor?)Dionisio
February 16, 2017 04:10 AM PDT
BA77 @1, @2 & @6: Very interesting references. Thank you.Dionisio
February 16, 2017 03:56 AM PDT
News, Thank you for opening this discussion thread.Dionisio
February 16, 2017 03:54 AM PDT
Some say that it’s not clear that the term is useful. A friend writes, “Information is always ‘about’ something. How does one quantify ‘aboutness’?” He considers the term too vague to be helpful.
Information is obviously a useful term. How can anyone in his right mind doubt this? Well, we can be pretty sure that a particular metaphysical bias is in play when someone does. Materialism cannot ground information, because it originates from a level over and beyond fermions and bosons. Whenever we speak of information we are referring to the alignment of low-level functions in support of the top-level function. Every post on this forum uses letters as the basic building blocks at the bottom level. These letters are arranged in accord with influencing conventions of spelling to form words one level up. To reach the next higher level, words are chosen for the purpose of expressing a thought and arranged in accord with influencing grammatical conventions in order for that thought to be intelligibly conveyed. The point is: the parts are subservient to the whole and it is exactly this subservience of the parts that materialism cannot explain. Materialistic 'explanations' always boil down to 'it must be blind luck'. The materialist would sleep much better if only the term 'information' was indeed 'too vague to be helpful'.Origenes
February 16, 2017 03:30 AM PDT
"Information is always 'about' something."

Yes. That's why ID has developed the concept of specified information, and in particular of functionally specified information. The "about" is of course the specification, and in biology the specification is a function.

"How does one quantify 'aboutness'?"

There is no need to quantify it, because it can be treated as a categorical, binary variable. What we quantify is the information linked to some functional specification.

Strange that such simple and important concepts are so obstinately ignored.

gpuccio
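The idea of a binary specification whose associated information is then quantified resembles the "functional information" measure proposed by Hazen and colleagues: take minus the log of the fraction of all possible sequences that satisfy the (yes/no) functional criterion. A minimal sketch, with purely hypothetical sequence counts chosen for illustration:

```python
import math

def functional_information(n_functional, n_total):
    """Functional information in bits: -log2 of the fraction of all
    possible sequences meeting a binary functional specification."""
    if n_functional <= 0:
        raise ValueError("no functional sequences: information undefined")
    return -math.log2(n_functional / n_total)

# Hypothetical example: a 10-residue peptide over a 20-letter alphabet,
# where we assume 1,000 distinct sequences perform the function.
total = 20 ** 10
print(round(functional_information(1_000, total), 1))  # 33.3 bits
```

The rarer the function among all possible sequences, the more bits of functional information it represents; the specification itself stays binary, exactly as described above.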
February 16, 2017 at 01:06 AM PDT
A friend writes, "Information is always 'about' something. How does one quantify 'aboutness'?"

Yes! Information is about something, and DNA is information about something: it is the technical description of how to make a living organism. Each element in the DNA plays its own role in that description; each element has its own unique "aboutness".

bFast
February 15, 2017 at 09:18 PM PDT
Atheist's logic 101 (cartoon): "If I can only create life here in the lab (or in my computer), it will prove that no intelligence was necessary to create life in the beginning"
http://dl0.creation.com/articles/p073/c07370/Scientist-synthesize-life-machine.jpg

Panda’s Thumb: Richard Hoppe forgot about Humpty Zombie - April 15, 2014
Excerpt: I discovered that if you crank up Avida’s cosmic-radiation parameter to maximum and have the Avida genomes utterly scrambled, the Avidian organisms still keep reproducing. If I recall correctly, they died if the radiation was moderate, but just crank it to the max and the creatures come back to life! This would be like putting dogs in a microwave oven for three days, running it at full blast, and then demanding they reproduce. And guess what, the little Avida critters reproduced. This little discovery in Avida 1.6 was unfortunately not reported in Nature. Why? It was a far more stupendous discovery! Do you think it’s too late for Richard Hoppe and me to co-author a submission? Hoppe eventually capitulated that there was indeed this feature of Avida. To his credit, he sent a letter to Dr. Adami to inform him of the discovery. Dr. Adami sent Evan Dorn to the Access Research Network forum, and Evan confirmed the feature by posting a reply there.
http://www.creationevolutionuniversity.com/idcs/?p=90

Avida, when run with realistic biological parameters instead of the highly unrealistic defaults it currently uses, actually supports Genetic Entropy rather than Darwinian evolution:

Biological Information - Mendel's Accountant and Avida - 1-31-2015 - by Paul Giem
https://www.youtube.com/watch?v=cGd0pznOh0A&list=PLHDSWJBW3DNUUhiC9VwPnhl-ymuObyTWJ&index=14

Computational Evolution Experiments Reveal a Net Loss of Genetic Information Despite Selection - Chase W. Nelson and John C. Sanford
http://www.worldscientific.com/doi/pdf/10.1142/9789814508728_0014

Further notes:

Podcast: Winston Ewert on computer simulation of evolution (AVIDA) that sneaks in information
https://uncommondescent.com/intelligent-design/podcast-winston-ewert-on-computer-simulation-of-evolution-avida-that-sneaks-in-information/

LIFE’S CONSERVATION LAW - William Dembski and Robert Marks - pg. 13
Excerpt: (Computer) simulations such as Dawkins’s WEASEL, Adami’s AVIDA, Ray’s Tierra, and Schneider’s ev appear to support Darwinian evolution, but only for lack of clear accounting practices that track the information smuggled into them.,,, Information does not magically materialize. It can be created by intelligence or it can be shunted around by natural forces. But natural forces, and Darwinian processes in particular, do not create information. Active information enables us to see why this is the case.
http://evoinfo.org/publications/lifes-conservation-law/

Main Publications - Evolutionary Informatics
http://evoinfo.org/publications/
bornagain77
February 15, 2017 at 07:36 PM PDT
AD, thanks, will watch it later.

bornagain77
February 15, 2017 at 07:23 PM PDT