Uncommon Descent Serving The Intelligent Design Community

Uh Oh! Is He Going To Get Gould-ed?


The word “borked” entered our lexicon as a result of the treatment Judge Robert Bork received during his confirmation hearings for the U.S. Supreme Court. To put it mildly, he was not treated very well. When Stephen Jay Gould came out with his theory of punctuated equilibria, he, too, was not treated well by the Darwinian establishment, until such time as he made clear that his theory was firmly a part of Darwinian thought.

Now another geologist, Michael Rampino, has just set himself up for equal treatment. In a PhysOrg entry, Rampino points out what has been obvious for a very long time: evolution is NOT gradual! It is episodic. He also reaches further back in time, to Patrick Matthew, who anticipated Darwin and his notion of natural selection by thirty years or so. I think Michael has been reading far too much here at UD for his own good health (academic, anyway). It’ll be interesting to see how quickly he is gobbled up by the Darwinian thought police.

Comments
Proof of principle that information can overcome the local limits of the second law at the molecular level: Maxwell’s ID Demon Converts Info to Energy — Excerpt: Sano said “the experiment does demonstrate that information can be used as a medium for transferring energy.” http://www.creationsafaris.com/crev201011.htm#20101116a

bornagain77
November 17, 2010 at 7:05 AM PDT
markf, if you care, I tracked down the entropic/information measurement. Measuring from a thermodynamic perspective (a more accurate measure for ascertaining a 'true' total information content), the information content of a 'simple' bacterium is found to be 10^12 bits, comparable to about 100 million pages of the Encyclopaedia Britannica:

Molecular Biophysics - Information theory. Relation between information and entropy. http://www.astroscu.unam.mx/~angel/tsb/molecular.htm

Carl Sagan, Cornell: "The information content of a simple cell has been estimated as around 10^12 bits, comparable to about a hundred million pages of the Encyclopaedia Britannica." (Life) http://www.bible.ca/tracks/dp-lawsScience.htm

bornagain77
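As a rough sanity check on the 10^12-bit figure quoted above, here is a short back-of-the-envelope script; the characters-per-page and bits-per-character values are my own illustrative assumptions, not taken from the cited sources:

```python
import math

# Rough sanity check of the "10^12 bits ~ 100 million Britannica pages" figure.
# Assumed (not from the quoted sources): ~2,000 characters per printed page,
# ~5 bits of information per character (log2 of a ~32-symbol alphabet).
CHARS_PER_PAGE = 2_000
BITS_PER_CHAR = math.log2(32)                    # = 5 bits per character

bits_per_page = CHARS_PER_PAGE * BITS_PER_CHAR   # ~1e4 bits per page
cell_bits = 1e12                                 # figure quoted by Sagan
pages = cell_bits / bits_per_page

print(f"{pages:.1e} pages")  # ~1e8, i.e. on the order of 100 million pages
```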
November 16, 2010 at 6:57 PM PDT
Mark: "Suppose we find an incident where a gene mutates 30 base pairs simultaneously to create a totally new function. What has that told us about how it happened?"

I suppose that a single occurrence would not say anything definitive, but if that were the constant observation in thousands of cases, an explanation would certainly be necessary. ID would obviously remain loyal to the design explanation, and would try to extract more details from the facts about possible models of implementation (guided mutations, artificial selection, or others). Non-IDists would be free to try some other explanation.

gpuccio
November 16, 2010 at 2:35 PM PDT
Thanks, it does, and it's properly collected and filed. :)

bornagain77
November 16, 2010 at 1:44 PM PDT
BA: I've pulled out Hoyle's book. On pages 99-101, he deals with a straightforward calculation of probabilities concerning selectable mutations. He writes: "The chance of setting a particular base pair in a particular gene in G generations is ~10^-9 G, and the chance that two base pairs are set right in the same gene is ~(10^-9 G)^2. For a total of 2N genes in a population of N individuals the probability of one emerging in a repaired condition after G generations is therefore ~2N(10^-9 G)^2, which to be of order unity requires G ~ 10^9/(2N)^(1/2). A mammalian population with 2N = 10^6 would require G ~ 10^6 generations, which is so long that further errors would accumulate in every individual before the two base pairs were corrected in any individual. . . . From this example we can say that for any discarded gene properly to be recovered in a practical situation, it is necessary that the genes in question shall not differ from a working condition by more than one or two base-pair errors. Once genes drift by more than this from a working condition they can be considered to have gone permanently dead, thereby explaining an otherwise mysterious conclusion of classical biology, that once species become highly specialized they tend to become extinct." Hope this helps your collection! ;)

PaV
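A quick numerical restatement of Hoyle's estimate as quoted above — the 10^-9 per-base-pair rate and the 2N = 10^6 population size are his figures; the script itself is only an illustration of the arithmetic:

```python
import math

# Hoyle's back-of-the-envelope estimate, as quoted above.
mu = 1e-9        # chance per generation of setting one particular base pair
two_N = 1e6      # number of gene copies (2N) in the population

# Chance that some copy has both base pairs set right after G generations:
#   P(G) ~ 2N * (mu * G)^2.  Setting P(G) ~ 1 and solving for G gives
#   G ~ 10^9 / sqrt(2N).
G = 1 / (mu * math.sqrt(two_N))
print(f"G ~ {G:.0e} generations")   # ~1e6 generations for 2N = 1e6
```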
November 16, 2010 at 1:25 PM PDT
#85 BA: Thanks - I think we have taken this subject as far as it is going to go.

markf
November 16, 2010 at 12:35 PM PDT
markf: (a) what specification do you use when measuring the information in an organism The 'incomplete' measure which is currently used, which I'm not well versed in, is: Functional information and the emergence of bio-complexity: Robert M. Hazen, Patrick L. Griffin, James M. Carothers, and Jack W. Szostak: Abstract: Complex emergent systems of many interacting components, including complex biological systems, have the potential to perform quantifiable functions. Accordingly, we define 'functional information,' I(Ex), as a measure of system complexity. For a given system and function, x (e.g., a folded RNA sequence that binds to GTP), and degree of function, Ex (e.g., the RNA-GTP binding energy), I(Ex)= -log2 [F(Ex)], where F(Ex) is the fraction of all possible configurations of the system that possess a degree of function > Ex. Functional information, which we illustrate with letter sequences, artificial life, and biopolymers, thus represents the probability that an arbitrary configuration of a system will achieve a specific function to a specified degree. In each case we observe evidence for several distinct solutions with different maximum degrees of function, features that lead to steps in plots of information versus degree of functions. http://genetics.mgh.harvard.edu/szostakweb/publications/Szostak_pdfs/Hazen_etal_PNAS_2007.pdf Mathematically Defining Functional Information In Molecular Biology - Kirk Durston - short video http://www.metacafe.com/watch/3995236 Entire video: http://vimeo.com/1775160 and this paper: Measuring the functional sequence complexity of proteins - Kirk K Durston, David KY Chiu, David L Abel and Jack T Trevors - 2007 Excerpt: We have extended Shannon uncertainty by incorporating the data variable with a functionality variable. The resulting measured unit, which we call Functional bit (Fit), is calculated from the sequence data jointly with the defined functionality variable. To demonstrate the relevance to functional bioinformatics, a method to measure functional sequence complexity was developed and applied to 35 protein families.,,, http://www.tbiomed.com/content/4/1/47 The reason I say incomplete is that it is not a true precise measure of the transcendent information present within a lifeform that would be gained by say perhaps measuring a bacterium's complete molecular offset from thermodynamic equilibrium and calculating the functional information bits present since,,, "Gain in entropy always means loss of information, and nothing more." Gilbert Newton Lewis as well another precise measure may be possible in that: Notes on Landauer’s principle, reversible computation, and Maxwell’s Demon - Charles H. Bennett Excerpt: Of course, in practice, almost all data processing is done on macroscopic apparatus, dissipating macroscopic amounts of energy far in excess of what would be required by Landauer’s principle. 
Nevertheless, some stages of biomolecular information processing, such as transcription of DNA to RNA, appear to be accomplished by chemical reactions that are reversible not only in principle but in practice.,,,, http://www.hep.princeton.edu/~mcdonald/examples/QM/bennett_shpmp_34_501_03.pdf and,, Landauer's principle Of Note: "any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase ,,, Specifically, each bit of lost information will lead to the release of an (specific) amount (at least kT ln 2) of heat.,, http://en.wikipedia.org/wiki/Landauer%27s_principle yet there are problems in extending the thermodynamic measure of Landauer to biology in regards to getting a precise measure of exactly when information is lost in a cell,,, Landauer’s Principle and Divergenceless Dynamical Systems - 2009 The profound links between Landauer’s principle and the second law of thermodynamics [21] suggest that the present results may help to explore analogues of the second law in non-standard contexts, like the biological ones discussed in [26, 27]. The lack of sub-additivity exhibited by some important non-logarithmic information or entropic functionals seems to be a serious difficulty for deriving generalizations of Landauer’s principle in terms of the non-standard maxent formalisms that are nowadays popular for the study of non-equilibrium, meta-stable states. On the other hand, the Beck-Cohen approach allows for the extension of Landauer’s principle to some of those scenarios. This important issue, however, needs further exploration. In this regard, any new developments towards a valid formulation of Landauer-like principles directly based upon generalized, non-standard entropic measures are very welcome. http://www.up.ac.za/dspace/bitstream/2263/14606/1/Zander_Landauer(2009).pdf Myself I feel fairly confident that if these problems can be worked out, and a precise enough measurement device could be built to measure such tiny fluctuations of temperature, if it is not already built, say some precise laser instrument,,, then a true measure of the 'physicality' of information in life could be carried out. Until then, for us to kick around functional information bits (FITS), which are derived from a incomplete knowledge of probability is to miss the mark for a true measure of the transcendent information present in life. you then ask: (b) what prior knowledge do you take into account when calculating the information,,, The prior knowledge (presupposition) I take into account, is that I hold that the transcendent information in life is physically dominate of the matter and energy, therefore I should never expect that which is lesser in quality of its existence to produce that which is greater in its existence. etc... etc.. etc..bornagain77
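To make the two quantities discussed above concrete — the Hazen/Szostak functional information I(Ex) = -log2[F(Ex)] and the Landauer bound of kT ln 2 per erased bit — here is a minimal sketch; the sequence counts and the temperature are illustrative assumptions, not measured values:

```python
import math

# Hazen et al.: functional information I(Ex) = -log2 F(Ex), where F(Ex) is the
# fraction of all possible configurations achieving function to degree >= Ex.
def functional_information(n_functional: int, n_total: int) -> float:
    return -math.log2(n_functional / n_total)

# Made-up example: suppose 1,000 of the 4^20 possible 20-nucleotide RNA
# sequences bind GTP above some threshold degree of function Ex.
I_Ex = functional_information(1_000, 4**20)
print(f"I(Ex) = {I_Ex:.1f} bits")          # ~30 bits

# Landauer's principle: erasing one bit dissipates at least kT*ln(2) of heat.
k_B = 1.380649e-23    # Boltzmann constant, J/K
T = 300               # assumed temperature in kelvin (roughly room temperature)
print(f"kT ln 2 = {k_B * T * math.log(2):.2e} J per erased bit")
```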
November 16, 2010 at 11:37 AM PDT
#83 BA77: I will gladly admit I don't know of any examples of a material process generating information above and beyond what was already present in the parent species, because (a) I wouldn't know how to measure the information (if you would answer my questions I might have a hope) and (b) I am not a biologist or biochemist. Now I have answered your question as fully and honestly as I can. There is nothing hypothetical about my question and nothing needs analyzing. All I am asking is: (a) what specification do you use when measuring the information in an organism, and (b) what prior knowledge do you take into account when calculating the information? If you don't know the answer that is fine as well - just say so.

markf
November 16, 2010 at 10:39 AM PDT
and markf, all I am asking is that you provide JUST ONE example of material processes generating information above and beyond what was already present in the parent species. I care not to argue hypotheticals until you present ANYTHING that we may analyze.

bornagain77
November 16, 2010 at 9:50 AM PDT
#81 BA77: I am not asking you to talk of something you know to be impossible. I am simply asking two questions about how you measure the information in an object - given the clear statement from the ID community that information is measured through a probability. Are you saying it is impossible to measure the information in an object?

markf
November 16, 2010 at 9:16 AM PDT
markf, information is shown to be its own unique entity, completely separate from, and dominant over, time-space and matter-energy, in quantum teleportation experiments, and it is further solidified as a unique and independent entity by the refutation of Einstein's hidden-variable argument. Measuring transcendent information on the quantum scale is fairly routine nowadays. Further work is needed to get a proper 'physical' measure of the transcendent information encoded in life. As to answering your questions? I was clear that you need to provide just one REAL example of information increasing over and above what is already present in a parent species, by passing the fitness test or by falsifying ID as laid out by Behe. i.e. markf, of what concern is it to me to talk about what hypothetically could be a validation of Darwinism if you in fact have no real examples to offer as proof? i.e. Why should I talk of something that I know to be impossible?

bornagain77
November 16, 2010 at 9:10 AM PDT
#78: "markf it is no 'long debate', you have no evidence for an increase of information leading to a speciation event, period. If you disagree, SHOW THE EVIDENCE!!!"

I thought we were talking about natural selection leading to the diversity of life - you seem to have changed the subject again! I am sorry - I just can't keep up.

"As for what I do know and don't know about properly measuring information content, do you even acknowledge that information is shown to be its own unique independent entity, separate from matter and energy by quantum teleportation, and with the refutation of the hidden variable argument? If you don't agree, why not? and please tell me in which parallel universe that you do happen to agree with the findings."

I freely admit I have no idea what on earth you are talking about. Now will you answer my questions?

markf
November 16, 2010 at 8:50 AM PDT
#76 Gpuccio: I can see how investigation can reveal unexplained jumps in evolution at the molecular level. This might be interpreted as answering the question of "when" design took place (although it might simply be interpreted as "unexplained jumps"). I can see nothing in your examples about "how". Suppose we find an incident where a gene mutates 30 base pairs simultaneously to create a totally new function. What has that told us about how it happened?

markf
November 16, 2010 at 8:45 AM PDT
markf, it is no 'long debate'; you have no evidence for an increase of information leading to a speciation event, period. If you disagree, SHOW THE EVIDENCE!!! As for what I do know and don't know about properly measuring information content, do you even acknowledge that information is shown to be its own unique independent entity, separate from matter and energy, by quantum teleportation and by the refutation of the hidden variable argument? If you don't agree, why not? And please tell me in which parallel universe you do happen to agree with the findings. :)

bornagain77
November 16, 2010 at 8:39 AM PDT
#74 BA77: "markf, pass any of the tests I listed and then will talk."

Neat way of changing the subject and avoiding answering the question! I kind of guessed you would not attempt an answer. I guess the "tests" are these questions in #72?

1) What evidence, whatsoever, do you have that information has been generated, by purely material processes, above and beyond what was in the parent species in the first place? i.e. Have Darwinists passed the fitness test?
2) Have Darwinists falsified Abel's null hypothesis for the generation of functional prescriptive information by purely material processes?
3) Have Darwinists ever shown that prescriptive information increases with any mutation?
4) Have Darwinists ever shown that a sub-speciation was wrought by an increase in information and not by a loss in information?
5) Have Darwinists even justified using Natural Selection to explain the diversity of all the life on earth?

I am sure that whatever answer I give to any of these you will not rate as a pass. So I guess you have successfully evaded answering my questions. As it happens, 1 to 4 all refer to the information in an object. My question challenges the very idea that something has an amount of information - it requires further definition - so I can't understand the questions, much less pass the test. The answer to 5 is that almost every textbook on evolution makes the case that natural selection explains the diversity of life on earth. Of course you will disagree, but that is a very long debate - meanwhile I have asked you a couple of really quite simple questions (which I strongly suspect you have no idea how to answer).

markf
November 16, 2010 at 8:24 AM PDT
Mark: Can you give me an example of how this would work? And why has no one even attempted it? Surely that is the way for a scientific community to the hypothesis seriously? There are many possible lines of investigation. I believe that most data will come from a "higher resolution" understanding of molecular natural history, as we go on sequencing genomes and analyzing proteomes. We are just at the beginning. And the data we already have are interpeted one way only, starting from the false assumptions of darwinian model. The way we can make design hypotheses could be the following: We must look for the first emergence of new functional domains in the proteome, and try to restrict as much as possible the chronological windows for their emergence. we must refine our "tree of life" and our "molecular clocks". It is fundamental that we may reconstruct natural history as precisely and objectively as possible. Each time we witness the emergence of new complex function in a short window of time, for instance as the emergence of a new species, with new proteins, unrelated to the ancestor species, we have to postulate a design input. The analysis of the design input must be based on the basic biochemical level, and can then proceed to more general levels (regulation, complex systems). If a new species gets new proteins and new networks and new regulations, we have to try to connect those features to understand the higher functional purpose of all of them. That can lead us to a better understanding of the design strategy we are observing. Anothert iimportant level of enquiry is that examplified in the recent work about mutations in ribosome proteins. We must have definite experimental data on how mutation work in a random system, how many of them are negative, neutyral or positive as regards fitness in an objective model. To study design, it is essential to be able to define correctly the role of RV and of NS, if any. You ask: "Why has no one even attempted it?" First of all, it's not true. The few biological researchers we have in our field (Behe, Axe, Durston) have given great contributions. If it were not for them, many important issues would still be completely obscure. But many non ID researchers are contributing greatly to this lines of enquiry. They are gathering facts. Some of them are well aware of the problem of functional complexity, and are trying to elucidate it better (in the hope that a darwinist compatible explanation may be found, probably, but it's fine just the same). Whatever darwinist propaganda may say, the problem of the origin of functional information is still completely unresolved, and it is crucial to our understanding of the living world. While scientistic philosophers try to deny that functional information exists, or that DNA is a code, or that proteins are highly isolated and unlikely islands of functionality, or that a random system must obey the laws or porbability, or that consciousness exists, or whatever other undeniable fact or concept they feel like denying from time to time, serious researchers are well aware of the problem of functional information, and try to solve it. They may be prejudices, they often are, but if they are honest enough in their pursuit of truth (not necessarily the general condition) they are helping. From those who study the real nature of mutations, to those who explore rugged landscapes, to those who falsify wrong comfortable theories like the frameshift emergence of nylonase. They are helping. They are working for scientific truth. 
For me, they are working for ID.

gpuccio
November 16, 2010 at 7:21 AM PDT
we'll

bornagain77
November 16, 2010 at 7:19 AM PDT
markf, pass any of the tests I listed and then will talk.

bornagain77
November 16, 2010 at 7:18 AM PDT
BA77: "The point being markf is that you have no scientific basis in the first place as to dictate to me what you think should be a proper measure or not for ascertaining Genetic Entropy i.e. information"

I am using the measure of information that the ID community provides! Look at the glossary or any of Dembski's publications if you don't believe me. They all define the measure of information as a probability. Even the papers you refer to in your comments, such as Dembski's on the law of conservation of information, measure it as a probability. All I am asking you is, when you talk of the information in a genome: (a) what specification are you using? (b) what is the knowledge on which the probability is based? I am not dictating anything. I am just asking you a question.

markf
November 16, 2010 at 7:00 AM PDT
markf, the principle of Genetic Entropy lines up with ALL available evidence. The real question you should be asking is what evidence, whatsoever, do you have that information has been generated, by purely material processes, above and beyond what was in the parent species in the first place? i.e. Have Darwinists passed the fitness test? Is Antibiotic Resistance evidence for evolution? - 'The Fitness Test' - video http://www.metacafe.com/watch/3995248 Have Darwinists falsified ID? Michael Behe on Falsifying Intelligent Design - video http://www.youtube.com/watch?v=N8jXXJN4o_A Have Darwinists falsified Abel's null hypothesis for the generation of functional prescriptive information by purely material processes? The Capabilities of Chaos and Complexity: David L. Abel - Null Hypothesis For Information Generation - 2009 To focus the scientific community’s attention on its own tendencies toward overzealous metaphysical imagination bordering on “wish-fulfillment,” we propose the following readily falsifiable null hypothesis, and invite rigorous experimental attempts to falsify it: "Physicodynamics cannot spontaneously traverse The Cybernetic Cut: physicodynamics alone cannot organize itself into formally functional systems requiring algorithmic optimization, computational halting, and circuit integration." A single exception of non trivial, unaided spontaneous optimization of formal function by truly natural process would falsify this null hypothesis. http://www.mdpi.com/1422-0067/10/1/247/pdf Can We Falsify Any Of The Following Null Hypothesis (For Information Generation) 1) Mathematical Logic 2) Algorithmic Optimization 3) Cybernetic Programming 4) Computational Halting 5) Integrated Circuits 6) Organization (e.g. homeostatic optimization far from equilibrium) 7) Material Symbol Systems (e.g. genetics) 8) Any Goal Oriented bona fide system 9) Language 10) Formal function of any kind 11) Utilitarian work http://mdpi.com/1422-0067/10/1/247/ag Have Darwinists ever shown that prescriptive information increases with any mutation? The GS (genetic selection) Principle – David L. Abel – 2009 Excerpt: Stunningly, information has been shown not to increase in the coding regions of DNA with evolution. Mutations do not produce increased information. Mira et al (65) showed that the amount of coding in DNA actually decreases with evolution of bacterial genomes, not increases. This paper parallels Petrov’s papers starting with (66) showing a net DNA loss with Drosophila evolution (67). Konopka (68) found strong evidence against the contention of Subba Rao et al (69, 70) that information increases with mutations. The information content of the coding regions in DNA does not tend to increase with evolution as hypothesized. Konopka also found Shannon complexity not to be a suitable indicator of evolutionary progress over a wide range of evolving genes. Konopka’s work applies Shannon theory to known functional text. Kok et al. (71) also found that information does not increase in DNA with evolution. As with Konopka, this finding is in the context of the change in mere Shannon uncertainty. The latter is a far more forgiving definition of information than that required for prescriptive information (PI) (21, 22, 33, 72). It is all the more significant that mutations do not program increased PI. Prescriptive information either instructs or directly produces formal function. No increase in Shannon or Prescriptive information occurs in duplication. 
What the above papers show is that not even variation of the duplication produces new information, not even Shannon “information.” http://bioscience.bio-mirror.cn/2009/v14/af/3426/3426.pdf Have Darwinists ever shown that a sub-speciation was wrought by an increase in information and not by a loss in information? EXPELLED - Natural Selection And Genetic Mutations - video http://www.metacafe.com/watch/4036840 "...but Natural Selection reduces genetic information and we know this from all the Genetic Population studies that we have..." Maciej Marian Giertych - Population Geneticist - member of the European Parliament - EXPELLED Have Darwinists even justified using Natural Selection as to explaining the diversity of all the life on earth? This following study is very interesting for the researcher surveyed 130 DNA-based evolutionary trees to see if the results matched what 'natural selection' predicted for speciation and found: Accidental origins: Where species come from - March 2010 Excerpt: If speciation results from natural selection via many small changes, you would expect the branch lengths to fit a bell-shaped curve.,,, Instead, Pagel's team found that in 78 per cent of the trees, the best fit for the branch length distribution was another familiar curve, known as the exponential distribution. Like the bell curve, the exponential has a straightforward explanation - but it is a disquieting one for evolutionary biologists. The exponential is the pattern you get when you are waiting for some single, infrequent event to happen.,,,To Pagel, the implications for speciation are clear: "It isn't the accumulation of events that causes a speciation, it's single, rare events falling out of the sky, so to speak." http://www.newscientist.com/article/mg20527511.400-accidental-origins-where-species-come-from.html?page=2 etc.. etc.. etc.. The point being markf is that you have no scientific basis in the first place as to dictate to me what you think should be a proper measure or not for ascertaining Genetic Entropy i.e. information, since you actually have no firm hypothesis to work from in the first place in which to counter me!!! All you can hope to do to protect your delusions of scientific integrity within neo-Darwinism is to obfuscate with smoke and mirrors of rhetoric just how hopelessly pathetic the case is for neo-Darwinism!!!bornagain77
November 16, 2010 at 6:25 AM PDT
#70 BA: "All beneficial adaptations away from a parent species for a sub-species, which increase fitness to a particular environment, will always come at a loss of the optimal functional information that was originally created in the parent species genome."

Functional information is measured as a probability - agreed? The probability of any outcome is relative to two things:
1) How that outcome is specified. For example, the probability of a die coming down as an even number, as a six, or as a six angled in a certain way.
2) What evidence you have available. Do you know anything about the manufacture of the die or the way it is to be thrown? Any past history of the die, or of dice in general? etc.
Therefore the information in an outcome is dependent on (a) how that outcome is specified and (b) what evidence you have available. So it is nonsense to talk about the "information created in the parent genome". Information relative to what specification and what evidence?

markf
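The dependence on specification can be made concrete with the die example above, since under the definition used in this thread the information in an outcome is I = -log2(P). A minimal sketch; the 1-in-24 figure for "a six angled a certain way" is a made-up probability, used only to show a finer specification:

```python
import math

# Same outcome of a fair six-sided die, three different specifications:
# the stated information content changes because the probability changes.
specifications = {
    "an even number":             3 / 6,
    "a six":                      1 / 6,
    "a six angled a certain way": 1 / 24,   # hypothetical finer specification
}

for spec, p in specifications.items():
    print(f"{spec:28s} P = {p:.4f}  I = {-math.log2(p):.2f} bits")
```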
November 16, 2010 at 4:22 AM PDT
Markf you ask about the correct hypothesis: The foundational rule for the diversity of all life on earth, of Genetic Entropy, which can draw its foundation in science from the twin pillars of the Second Law of Thermodynamics and from the Law of Conservation of Information (Dembski, Marks, Abel), can be stated something like this: "All beneficial adaptations away from a parent species for a sub-species, which increase fitness to a particular environment, will always come at a loss of the optimal functional information that was originally created in the parent species genome." Markf the fossil record also shows loss of information: In fact, the loss of morphological traits over time, for all organisms found in the fossil record, was/is so consistent that it was made into a 'scientific law': Dollo's law and the death and resurrection of genes: Excerpt: "As the history of animal life was traced in the fossil record during the 19th century, it was observed that once an anatomical feature was lost in the course of evolution it never staged a return. This observation became canonized as Dollo's law, after its propounder, and is taken as a general statement that evolution is irreversible." http://www.pnas.org/content/91/25/12283.full.pdf+html A general rule of thumb for the 'Deterioration/Genetic Entropy' of Dollo's Law as it applies to the fossil record is found here: Dollo's law and the death and resurrection of genes ABSTRACT: Dollo's law, the concept that evolution is not substantively reversible, implies that the degradation of genetic information is sufficiently fast that genes or developmental pathways released from selective pressure will rapidly become nonfunctional. Using empirical data to assess the rate of loss of coding information in genes for proteins with varying degrees of tolerance to mutational change, we show that, in fact, there is a significant probability over evolutionary time scales of 0.5-6 million years for successful reactivation of silenced genes or "lost" developmental programs. Conversely, the reactivation of long (>10 million years)-unexpressed genes and dormant developmental pathways is not possible unless function is maintained by other selective constraints; http://www.pnas.org/content/91/25/12283.full.pdf+html Dollo's Law was further verified to the molecular level here: Dollo’s law, the symmetry of time, and the edge of evolution - Michael Behe Excerpt: We predict that future investigations, like ours, will support a molecular version of Dollo's law:,,, Dr. Behe comments on the finding of the study, "The old, organismal, time-asymmetric Dollo’s law supposedly blocked off just the past to Darwinian processes, for arbitrary reasons. A Dollo’s law in the molecular sense of Bridgham et al (2009), however, is time-symmetric. A time-symmetric law will substantially block both the past and the future. http://www.evolutionnews.org/2009/10/dollos_law_the_symmetry_of_tim.html This following tidbit of Genetic Entropy evidence came to me from Rude on the Uncommon Descent blog: At one of the few petrified forests that sports ginkgo wood, I was told by the naturalist that ginkgos are old in the fossil record—they date from the Permian back when trees were first “invented”. She said that there are many species of fossilized Ginkgoaceae, but Ginkgo biloba, is the only living species left. - Rude - Uncommon Descent The following site points out that there is a fairly constant, and unexplained, 'background extinction rate'. 
My expectation for extinctions, at least for the majority of extinctions not brought about by catastrophes, is for the fairly constant rate of 'background extinctions' to be attributable directly to Genetic Entropy: The Current Mass Extinction Excerpt: The background level of extinction known from the fossil record is about one species per million species per year, or between 10 and 100 species per year (counting all organisms such as insects, bacteria, and fungi, not just the large vertebrates we are most familiar with). In contrast, estimates based on the rate at which the area of tropical forests is being reduced, and their large numbers of specialized species, are that we may now be losing 27,000 species per year to extinction from those habitats alone. The typical rate of extinction differs for different groups of organisms. Mammals, for instance, have an average species “lifespan” from origination to extinction of about 1 million years, although some species persist for as long as 10 million years. http://www.pbs.org/wgbh/evolution/library/03/2/l_032_04.html Psalm 104: 29-30 You hide Your face, they are dismayed; You take away their spirit, they expire And return to their dust. You send forth Your Spirit, they are created; And You renew the face of the ground. One persistent misrepresentation, that evolutionists continually portray of the fossil record, is that +99.9% of all species that have ever existed on earth are now extinct because of 'necessary evolutionary transitions'. Yet the fact is that 40 to 80% of all current living species found on the earth are represented fairly deeply in the fossil record. In fact, some estimates put the number around 230,000 species living today, whereas, we only have about a quarter of a million different species collected in our museums. Moreover, Darwin predicts we should have millions of transitional fossil forms. These following videos, quotes, and articles clearly point this fact out: The Fossil Record - The Myth Of +99.9% Extinct Species - Dr. Arthur Jones - video http://www.metacafe.com/watch/4028115 "Stasis in the Fossil Record: 40-80% of living forms today are represented in the fossil record, despite being told in many text books that only about 0.1% are in this category. The rocks testify that no macro-evolutionary change has ever occurred. With the Cambrian Explosion complex fish, trilobites and other creatures appear suddenly without any precursors. Evidence of any transitional forms in the fossil record is highly contentious." Paul James-Griffiths via Dr. Arthur Jones http://edinburghcreationgroup.org/studentpaper1.php The following studies show that the number of species that are currently alive is well below the 'millions of species' that are commonly believed to be alive: Marine Species Census - Nov. 2009 Excerpt: The researchers have found about 5,600 new species on top of the 230,000 known. They hope to add several thousand more by October 2010, when the census will be done. http://news.yahoo.com/s/ap/20091122/ap_on_sc/us_marine_census Scientists finish first sea census - October 2010 Excerpt: The raw numbers behind the $650 million Census of Marine Life are impressive enough: Almost 30 million observations by 2,700 scientists from more than 80 nations spent 9,000 days at sea, producing 2,600 academic papers and documenting 120,000 species for a freely available online database. 
http://cosmiclog.msnbc.msn.com/_news/2010/10/03/5224377-scientists-finish-first-sea-census Census of Marine Life Publishes Historic Roll Call of Species in 25 Key World Areas - August 2010 Excerpt: In October, the Census will release its latest estimate of all marine species known to science, including those still to be added to WoRMS and OBIS. This is likely to exceed 230,000. (Please note how far off the 230,000 estimated number for species to be found is from the actual 120,000 number for species that were actually found in the census) http://www.sciencedaily.com/releases/2010/08/100802173704.htm etc... etc... I dunno markf, the fossil evidence shows, and has always shown, sudden appearance, rapid diversity, with long term stability following, as well as loss of morphological variability over long terms, and yet, despite the fact that evolutionists have never shown a gain of functional information above what was already present in the parent species, they refuse to honestly report on the evidence, and continue to try to intimidate anyone who questions the neo-Darwinian paradigm. Mark you act as if the evidence is just reported on honestly and that 'scientists' will all of the sudden start to be fair if the correct model for classification is in place. If you truly do believe that, you are very naive in this matter for this 'discrepancy of evidence', that evolutionists have falsely subjected the public to, turns out to be very much a atheistic/materialistic religious dogma that is maintained to be true by the reigning priesthood of Darwinism, no matter how crushing the evidence is against neo-Darwinism.bornagain77
November 16, 2010 at 3:02 AM PDT
Thanks PaV, I would like to get all this two step stuff in one place, as well remember that Seelke has done work on the two step limit: Response from Ralph Seelke to David Hillis Regarding Testimony on Bacterial Evolution Before Texas State Board of Education, January 21, 2009 Excerpt: He has done excellent work showing the capabilities of evolution when it can take one step at a time. I have used a different approach to show the difficulties that evolution encounters when it must take two steps at a time. So while similar, our work has important differences, and Dr. Bull’s research has not contradicted or refuted my own. http://www.discovery.org/a/9951 Reductive Evolution Can Prevent Populations from Taking Simple Adaptive Paths to High Fitness - Ann K. Gauger, Stephanie Ebnet, Pamela F. Fahey, and Ralph Seelke – 2010 Excerpt: In experimental evolution, the best way to permit various evolutionary alternatives, and assess their relative likelihood, is to avoid conditions that rule them out. Our experiments, like others (e.g. [40]), used populations of cells growing slowly under limiting nutrient conditions, thereby allowing a number of paths to be taken to higher fitness. We engineered the cells to have a two-step adaptive path to high fitness, but they were not limited to that option. Cells could reduce expression of the non-functional trpAE49V,D60N allele in a variety of ways, or they could acquire a weakly functional tryptophan synthase subunit by a single site reversion to trpAD60N, bringing them within one step of full reversion (Figure 6). When all of these possibilities are left open by the experimental design, the populations consistently take paths that reduce expression of trpAE49V,D60N, making the path to new (restored) function virtually inaccessible. This demonstrates that the cost of expressing genes that provide weak new functions is a significant constraint on the emergence of new functions. In particular, populations with multiple adaptive paths open to them may be much less likely to take an adaptive path to high fitness if that path requires over-expression. http://bio-complexity.org/ojs/index.php/main/article/view/BIO-C.2010.2/BIO-C.2010.2bornagain77
November 16, 2010 at 2:36 AM PDT
BA: Hoyle, "The Mathematics of Evolution". He takes a 'path-integral' approach to the Darwinian model, and, based on the distribution of genomic variants in the population---this is my way of categorizing his maths---concludes that at most one can expect evolution to move two steps forward or backward from the current distribution. It's been well over a year since I was beefing up on all of this, so I would have to go back and pull out the math and the quotes that go with it. From memory, I would say that he reaches these conclusions in Chapters 3 and 4.

PaV
November 15, 2010 at 11:48 PM PDT
#62 Gpuccio: There is a danger of conducting the same debate in two places, but I want to pick up one point here. You write:

"the same techniques used by darwinists (comparisons of the genomes and proteomes) can be used to try to understand when and how design inputs have happened. It will come. We just need a scientific community which, at least in part, takes seriously the design hypothesis."

Can you give me an example of how this would work? And why has no one even attempted it? Surely that is the way for a scientific community to take the hypothesis seriously?

markf
November 15, 2010 at 10:59 PM PDT
Mark: Thank you for your entry, I will go there soon.

Regarding sdic, the situation is complex. If you look at figure 3 in the paper, you will see that the main sequence in all 4 sdic genes is the same as cdic. It is true, however, that sdic1, the one which seems to be expressed, is truncated in the final part, because it differs in that part from cdic, in part as a consequence of frameshift mutations (but not only). The problem is that the protein coded by sdic is not really known, nor are its structure and function known. So any hypothesis that a frameshift mutation has created a new functional sequence here is completely out of order. Apparently, the only effect of the mutation seems to be a contribution to the loss of part of the functional molecule.

The nylonase model was completely different. Ohno, while wrong, had the courage to make a strong hypothesis: he started from an existing, functional protein, nylonase, and hypothesized in molecular detail that it had originated from an existing protein-coding gene through a specific mutation which had created a new ORF (frameshifted). Such a detailed model has a great advantage: it can easily be falsified. And yet we needed decades to falsify it. But it was not difficult: I have personally extracted the sequence of the supposed precursor protein from Ohno's paper, and blasted it. The result: no homology with any existing protein. IOWs, the supposed precursor protein has never existed. Nylonase, instead, has very strong homology with penicillinase, from which it derives. The truth, as always, is simple. We must just look for it in the right place.

gpuccio
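For readers wanting to repeat that kind of homology check, here is a minimal sketch using Biopython's NCBI BLAST interface; this is not the tool or sequence gpuccio actually used, the query string is only a placeholder, and Biopython plus network access are assumed:

```python
from Bio.Blast import NCBIWWW, NCBIXML

# Sketch of a protein homology check: BLAST a query sequence against NCBI's
# non-redundant protein database and inspect the top hits (if any).
# The sequence below is a placeholder, not Ohno's hypothesized precursor.
query = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQVKVKALPDAQFEVVHSLAKWKR"

result_handle = NCBIWWW.qblast("blastp", "nr", query)
record = NCBIXML.read(result_handle)

if not record.alignments:
    print("No homology found with any existing protein.")
else:
    for alignment in record.alignments[:5]:
        # Report the hit description and the E-value of its best HSP.
        print(alignment.title, alignment.hsps[0].expect)
```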
November 15, 2010 at 12:27 PM PDT
OK - I created an entry on my blog that I hope will be a neutral and friendly venue for this discussion based on mutual respect. Gpuccio - the paper on the sdic gene I linked to says: "It is a chimeric gene formed by duplication of two other genes followed by multiple deletions and other sequence rearrangements." Are deletions not examples of frameshifts?

markf
November 15, 2010 at 9:02 AM PDT
PaV, could you reference the Hoyle calculation for me please?

bornagain77
November 15, 2010 at 7:44 AM PDT
markf:
. . . a brief skim of the literature suggests that (1) Different populations of mosquitoes gained their resistance through different mutations (2) We don’t know precisely what mutations were necessary in any of these cases (3) No one has investigated to see whether intermediaries do exist.
Certainly (1) is true. Certainly (2) is true. But doesn't (2) simply imply that we, like NS, don't know what the 'target' is when it comes to the development of resistance? Which then would mean, doesn't it, that (3) is impossible to determine? IOW, how can you know what an intermediate looks like if you don't know what the ultimate goal looks like? Isn't this a real limitation of Darwinian theory? And doesn't it suggest that if we talk about intermediates, we roam in a very speculative area?

But, for our purposes here, I would point out these three things: (1) Behe and Snoke set up a theoretical model that included pseudogenes and gene duplication, with the result that within reasonable numbers of replications only TWO a.a. substitutions were available to Darwinian search; (2) Sir Fred Hoyle, via his own mathematical model (one that at various points reached the same standard values as population genetics), arrived at the conclusion that NS can only view mutations up to TWO evolutionary steps in either direction; and (3) Behe, in EoE, reaches the same conclusion, but based on empirical evidence, that it is extremely difficult for NS to take two steps involving a two a.a. substitution for each step.

It seems to me that from three different angles we get the same result: NS is limited to two a.a. steps in any direction. I would presume then that within the genome we would have a plethora of 'intermediates' that are 'one' a.a. shifted, and probably not so many that are 'two' a.a. shifted. The implication is that if a protein gets shifted by more than two a.a.'s, then NS eliminates this life form. Thus, taking a random walk for more than a few a.a.'s seems very highly improbable. This is the dilemma here, is it not? Why? Exactly because there don't seem to be that many available intermediates, and positive selection (= directional selection) will take way too much time to arrive at any major change all by itself.

PaV
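To give a feel for why two coordinated steps are so much harder to reach than one, here is a rough illustration using the ~10^-9 per-site per-generation rate from the Hoyle passage quoted earlier in the thread; the population sizes are arbitrary round numbers chosen only for illustration:

```python
# Expected waiting time for one individual carrying two specific simultaneous
# substitutions, assuming each specific change arises at ~1e-9 per site per
# generation (the rate quoted from Hoyle above) and the changes are independent.
single_rate = 1e-9
double_rate = single_rate ** 2        # ~1e-18 per individual per generation

for population in (1e6, 1e9, 1e12):
    generations = 1 / (double_rate * population)
    print(f"N = {population:.0e}: ~{generations:.0e} generations expected "
          "before one double mutant appears")
```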
November 15, 2010 at 7:29 AM PDT
Mark: the same techniques used by darwinists (comparisons of the genomes and proteomes) can be used to try to understand when and how design inputs have happened. It will come. We just need a scientific community which, at least in part, takes seriously the design hypothesis.

gpuccio
November 15, 2010 at 5:10 AM PDT
