From Casey Luskin’s talk at Evolution News & Views:
Information is not always easy to define, but it often involves a measure of the degree of randomness. The fundamental intuition behind information is a reduction in possibilities: the more possibilities you rule out, the more information you have conveyed.
Nature can produce “information” under certain definitions. Intelligent agents also produce information (certain types, at least). As Henry Quastler observed, “The creation of new information is habitually associated with conscious activity.”
To put it another way: the reduction in uncertainty could be brought about by an intelligent agent or by a physical process. A rough numerical sketch of the "reduction in possibilities" intuition appears below.
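As a rough illustration of that intuition (our sketch, not Luskin's), Shannon's measure counts how many equally likely possibilities get ruled out, in bits. The function below assumes all possibilities are equally probable, which is the simplest case.

```python
import math

def bits_of_information(possibilities_before: int, possibilities_after: int) -> float:
    """Information gained when equally likely possibilities are narrowed down.

    Under the equal-probability assumption, information (in bits) is
    log2(possibilities_before) - log2(possibilities_after): the more
    possibilities ruled out, the more bits conveyed.
    """
    return math.log2(possibilities_before) - math.log2(possibilities_after)

# A fair coin flip narrows 2 possibilities to 1: one bit of information.
print(bits_of_information(2, 1))   # 1.0

# Learning which of 64 equally likely codons occupies a site: 6 bits.
print(bits_of_information(64, 1))  # 6.0
```

Note that nothing in this calculation says *how* the possibilities were narrowed, which is the point of the distinction above: the same reduction in uncertainty could come from an agent's choice or from a physical event.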
Types of information and examples follow.
Conclusion
To summarize, information can be understood and defined in different ways. Some definitions are useful for detecting design and/or measuring bioinformation; some are not. Semantic information is useful for detecting design, but it is not the only way to detect design; it is a special case, not the general case. Semantic information as a class falls within complex specified information, which is a more general mode of design detection. The genetic code is a form of syntactic, semantic, and complex specified information. More.
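As a toy illustration of why the conclusion distinguishes these kinds of information (our sketch, not Luskin's), the purely syntactic (Shannon) measure assigns the same bit count to any two DNA strings of equal length under a uniform model, whether or not either string specifies anything; semantic or specified content has to be assessed separately.

```python
import math

def shannon_bits(sequence: str, alphabet_size: int = 4) -> float:
    """Syntactic (Shannon) information of a sequence drawn from an alphabet
    of equally likely symbols: log2(alphabet_size) bits per symbol."""
    return len(sequence) * math.log2(alphabet_size)

# Two 9-base DNA strings carry the same syntactic content (18 bits each);
# the syntactic measure by itself does not distinguish them.
print(shannon_bits("ATGGCCAAA"))  # 18.0
print(shannon_bits("TTTTTTTTT"))  # 18.0
```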
See also: Life as “self-perpetuating information strings”? At least Adami is on the right track in focusing on understanding information, not chemistry, as the key driver.