Uncommon Descent Serving The Intelligent Design Community

Stephen Meyer’s Book Ranked #1 in Science/Physics/Cosmology at Amazon


Over at Amazon, in the Physics/Cosmology section, Dr. Meyer's book has surprisingly ranked ahead of Stephen Hawking's book, A Brief History of Time.

There is a section on cosmology and the origin of life in Signature in the Cell.

Here are the Amazon Stats:

Signature in the Cell: DNA and the Evidence for Intelligent Design
Hardcover: 624 pages
Publisher: HarperOne (June 23, 2009)

Amazon.com Sales Rank: #799 in Books

Popular in these categories:

#1 in Books > Science > Astronomy > Cosmology
#1 in Books > Religion & Spirituality > Christianity > Theology > Creationism
#1 in Books > Science > Physics > Cosmology

Congratulations, Dr. Meyer!

Comments
Kairosfocus: So, we are justified in reworking the Boltzmann expression to separate clumping/thermal and configurational components: S = k ln (W_clump × W_config) = k ln (W_th × W_c) . . . [Eqn A.11, cf. TBO 8.2a] or, S = k ln W_th + k ln W_c = S_th + S_c . . . [Eqn A.11.1]
S_th = thermal entropy, S_c = configurational entropy, S = total entropy. Configurational entropy is a deep subject, and so is amending the notion of "S" to include S_c (configurational entropy). But this seems the most insightful approach, because many creationists still invoke the second law without understanding the important distinction between S_th (thermal entropy) and S_c (configurational entropy).
scordova
August 9, 2009 at 01:53 PM PDT
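A minimal sketch of the additivity invoked here: if the microstate count factors as W = W_th × W_c, then S = k ln W splits exactly into S_th + S_c. The Python fragment below uses made-up microstate counts purely for illustration:

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

# Hypothetical microstate counts, chosen only for illustration:
W_th = 1e12  # thermal (energy-distribution) microstates
W_c = 1e6    # configurational (arrangement) microstates

# If W = W_th * W_c, then S = k ln W is additive in the two parts:
S_total = k * math.log(W_th * W_c)
S_split = k * math.log(W_th) + k * math.log(W_c)

print(S_total, S_split)  # equal up to floating-point rounding
```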
Dave Whisker "Actually, mad doc, the authors speculated on the cause, but only in the discussion section" I am aware of that and what I was trying to draw attention to was the illogicalities in their speculations: "The authors point out that compensatory epistatic mutations are not the same as “beneficial mutations” (even though they obviously benefit the organism). They postulate these mutations benefit the worst performing organisms the most. (It appears If the organism is mutated then the mutations are good and if the organism is not mutated, mutations are bad)." From their results, this is all supposed to have happened within 10 generations in several strains. This is a remarkable achievement for a random process and it puts Dawkin's "weasel" program to shame. Basically I think random mutations are unable to do this in 10 generations. Either their experiment is flawed or there is another explaination. In any event their explaination in the discussion makes no sense for the reasons I have outlined before. I am glad that they are doing further investigations. About time too, as that paper is 6 years old. I would be interested to find out the results. I expect the answer will only be found when full genetic mapping of the mutated and non mutated strains can be done and I think you will find a highly efficient data recovery mechanism has been activated in the organism to recover the "lost" information.mad doc
August 8, 2009 at 04:24 PM PDT
"You oppose Sanford’s thesis, and you may be right to do so. But the fact of the matter is when ID takes charge of science as opposed to being continually locked in debate with recalcitrant Darwinists, there will be new standards by which we judge whether or not genetic entropy is occurring." I am not sure what you are trying to say. I have no opposition to any type of research at all and am open to any supported conclusions. I just find Sanford's conclusions not in sync with the world. If all our genomes are crap, I believe like a rusted out car is how Sanford described it, and we still function just fine, then that is a most interesting finding. But I see no indication that the organisms of this world are not functioning well when they have dysfunctional genomes. Some individual members may have problems or the occasional species but in general life is "peachy keen."jerry
August 8, 2009 at 09:10 AM PDT
PS: Cf Shapiro's actual remarks: ______________ RNA's building blocks, nucleotides, are complex substances as organic molecules go. They each contain a sugar, a phosphate and one of four nitrogen-containing bases as sub-subunits. Thus, each RNA nucleotide contains 9 or 10 carbon atoms, numerous nitrogen and oxygen atoms and the phosphate group, all connected in a precise three-dimensional pattern. Many alternative ways exist for making those connections, yielding thousands of plausible nucleotides that could readily join in place of the standard ones but that are not represented in RNA. That number is itself dwarfed by the hundreds of thousands to millions of stable organic molecules of similar size that are not nucleotides . . . . The RNA nucleotides are familiar to chemists because of their abundance in life and their resulting commercial availability. In a form of molecular vitalism, some scientists have presumed that nature has an innate tendency to produce life's building blocks preferentially, rather than the hordes of other molecules that can also be derived from the rules of organic chemistry. This idea drew inspiration from . . . Stanley Miller. He applied a spark discharge to a mixture of simple gases that were then thought to represent the atmosphere of the early Earth. [My NB: Subsequent research has sharply undercut this idea, a point that is unfortunately not accurately reflected in Sci Am's caption on a picture of the Miller-Urey apparatus, which in part misleadingly reads, over six years after Jonathan Wells' Icons of Evolution was published: The famous Miller-Urey experiment showed how inanimate nature could have produced amino acids in Earth's primordial atmosphere . . .] Two amino acids of the set of 20 used to construct proteins were formed in significant quantities, with others from that set present in small amounts . . . more than 80 different amino acids . . . have been identified as components of the Murchison meteorite, which fell in Australia in 1969 . . . By extrapolation of these results, some writers have presumed that all of life's building blocks could be formed with ease in Miller-type experiments and were present in meteorites and other extraterrestrial bodies. This is not the case. A careful examination of the results of the analysis of several meteorites led the scientists who conducted the work to a different conclusion: inanimate nature has a bias toward the formation of molecules made of fewer rather than greater numbers of carbon atoms, and thus shows no partiality in favor of creating the building blocks of our kind of life . . . I have observed a similar pattern in the results of many spark discharge experiments . . . . no nucleotides of any kind have been reported as products of spark discharge experiments or in studies of meteorites, nor have the smaller units (nucleosides) that contain a sugar and base but lack the phosphate. To rescue the RNA-first concept from this otherwise lethal defect, its advocates have created a discipline called prebiotic synthesis. They have attempted to show that RNA and its components can be prepared in their laboratories in a sequence of carefully controlled reactions, normally carried out in water at temperatures observed on Earth . . . . Unfortunately, neither chemists nor laboratories were present on the early Earth to produce RNA . . . .
The analogy that comes to mind is that of a golfer, who having played a golf ball through an 18-hole course, then assumed that the ball could also play itself around the course in his absence. He had demonstrated the possibility of the event; it was only necessary to presume that some combination of natural forces (earthquakes, winds, tornadoes and floods, for example) could produce the same result, given enough time. No physical law need be broken for spontaneous RNA formation to happen, but the chances against it are so immense, that the suggestion implies that the non-living world had an innate desire to generate RNA. The majority of origin-of-life scientists who still support the RNA-first theory either accept this concept (implicitly, if not explicitly) or feel that the immensely unfavorable odds were simply overcome by good luck. ______________ That's not in the ballpark of 25% support. It's in the context of lethal defects of speculative theories. And, of course, Orgel's posthumous rebuttal was just as devastating to the metabolism-first model: >> It must be recognized that assessment of the feasibility of any particular proposed prebiotic cycle must depend on arguments about chemical plausibility, rather than on a decision about logical possibility . . . . Theories of the origin of life based on metabolic cycles cannot be justified by the inadequacy of competing theories: they must stand on their own . . . >> That's why evolutionary materialist models and speculations on OOL are in crisis.
kairosfocus
August 8, 2009 at 07:09 AM PDT
KF-san, (I find it slightly annoying to see you resurrecting an answered matter as though it were not answered; when you did not respond on the merits when I pointed out what was wrong with the latest case being put up.) Are you referring to this? And then, explain how lucky noise plus blind mechanical necessity somehow spontaneously rearranged dilute racemic organic molecules in a plausible prebiotic soup into a homochiral, functional information storing and processing system. Including inventing along the way: digital information stored in codes [thus also, computer language -- which thus precedes speech, perhaps by 3.8 BY on the conventional timelines], algorithms, data structures, programs, and the executing machinery to implement the hard and software system. I do have a proof of materialistic OOL, but it is too long to fit in the margin! ;) Really, if you take each of the adjectives in your description of the problem, you know that different researchers are working on that aspect of the problem. None of them is a showstopper. Dilute solutions can be concentrated, racemic mixtures sorted, organic molecules created, plausible atmospheres calculated, and prebiotic environments tested. If you think one of these areas is a showstopper, point it out. If the last paper published in a field was done a few years ago, and it was a review of all the current work up until then which summarized why results had been negative, and the labs had closed and funding withered away, and no PhD theses were being written in that area, I would certainly agree that the outlook was moribund and pessimistic, similar to Dr Shapiro's. However, I don't see that to be the case.
Nakashima
August 8, 2009 at 07:04 AM PDT
Nakashima-san: Evoloops is an intelligently designed and controlled simulation in a computer, not chemistry. I note my remarks on how the analogy breaks down: ______________ >> As genes circulate in the loop counterclockwise [note the non-random specificity], copies of them are made [just how, kindly sir?] and sent [again, just how, and how will these copies just happen to program functional protein chains etc.?] toward the tip of the arm. The arm grows through repetition of straight growth and left turning. [How is such "folding" initiated and controlled? Is not right turning just as probable inherently, in a presumed prebiotic soup, and how are the "right" monomers going to be available in step-by-step sequence to carry out the info storage and onward metabolic function etc.?] When the tip reaches its own root after three left turns, they bond together to form a new offspring loop. [An algorithm writes itself out of lucky noise -- or is that by an intelligent programmer's input?] Then the connection between parent and offspring disappears [that is, we have termination here, a nontrivial issue in algorithm design]. (In such a way, the loop reproduces its offspring, which has a structure identical to its parent's in the right area, in 151 updates.) >> ___________________ In short, the functionality of evoloops is crucially dependent on their specific design and coding. GEM of TKI
kairosfocus
August 8, 2009 at 06:58 AM PDT
KF-san, "Mr Shapiro's point was about the problem of the chemistry not just the statistics, save insofar as statistical thermodynamics issues are implied." Indeed. That was why my reference to Evoloops was apropos. Evoloops is a different chemistry that does support the creation of evolving life forms directly from random soups. I apologize for not following up your reply on that other thread when it was posted. I agree that Evoloops is intelligently designed. I said so twice on the other thread. That is not why I bring it up. It is an existence proof, nothing more. What it proves is that a Creator can walk away from Its creation after the "Big Bang" and life does form without subsequent assistance - in that chemistry. It might easily be objected that this particular chemistry is finely tuned to allow life to form. I agree again, and note that the fine-tuning argument has been used for our chemistry also. An appropriate research question would be: are life-supporting chemistries common? If 25% of chemistries support life, is it accurate to use the term fine-tuned? (I chose 25% as a rough parallel to the proportion of universes that support stars.)
Nakashima
August 8, 2009 at 06:45 AM PDT
Mad doc, Another comment. Suzanne Estes's lab has gone on to explore the genetics of adaptation, especially the role of compensatory mutations in C. elegans. In other words, she is taking the questions on the causes of the fitness recovery discussed in the original paper and developing a research program to explore them.
Dave Wisker
August 8, 2009 at 04:18 AM PDT
mad doc: "Regarding the Lynch paper on C. elegans, much of the recovery of the mutated lines had occurred in only 10 generations. In some of the lines it appears almost all of the recovery had occurred in that time (i.e. they were almost as productive as the non-mutated original strain). The authors attribute recovery to 'compensatory epistatic mutations' by dismissing other alternatives, e.g. 'beneficial mutations' and back mutations (because they were too improbable)." Actually, mad doc, the authors speculated on the cause, but only in the discussion section (where that kind of thing is common and perfectly appropriate). The actual scientific question they addressed in the experiments (and which is reflected in their design) was fitness recovery of populations that have been mutationally degraded. What we have then is a case of the authors keeping the speculation out of the main body of the paper (methods and results) and restricting any speculation to the discussion section, where it belongs. So their choice of the title of the paper is fine as is.
Dave Wisker
August 8, 2009 at 03:39 AM PDT
6 --> So, as I follow the Thaxton et al argument in my app 1, point 8: ______________ >> degree of confinement in space constrains the degree of disorder/"freedom" that masses may have. And, of course, confinement to particular portions of a linear polymer is no less a case of volumetric confinement (relative to being free to take up any location at random along the chain of monomers) than is confinement of gas molecules to one part of an apparatus. And, degree of such confinement may appropriately be termed, degree of "concentration." Diffusion is a similar case: infusing a drop of dye into a glass of water -- the particles spread out across the volume and we see an increase of entropy there. (The micro-jets case of course is effectively diffusion in reverse, so we see the reduction in entropy on clumping and then also the further reduction in entropy on configuring to form a flyable microjet.) So, we are justified in reworking the Boltzmann expression to separate clumping/thermal and configurational components: S = k ln (W_clump × W_config) = k ln (W_th × W_c) . . . [Eqn A.11, cf. TBO 8.2a] or, S = k ln W_th + k ln W_c = S_th + S_c . . . [Eqn A.11.1] We now focus on the configurational component, the clumping/thermal one being in effect the same for at-random or specifically configured DNA or polypeptide macromolecules of the same length and proportions of the relevant monomers, as it is essentially energy of the bonds in the chain, which are the same in number and type for the two cases . . . >> _______________ 7 --> In simpler terms, the fact of additivity of thermal and configurational entropy terms implies their common nature: you can only add apples, oranges and bananas together by finding a common nature as fruit. 8 --> In this case, we see an increasing confinement of spatial location -- hence my term "configuration space" for this cut-down version of phase space -- which more and more sharply reduces the number of ways mass and energy at micro level may be arranged consistent with observable macro level constraints. (Thus, the example of the microjets in point 6 of the same appendix.) 9 --> And so we have arrived at islands of configurations that yield a macro-observable function in a sea of initially possible configs. 10 --> That is, we now see that we are at the issue of functionally specific complex information and the organising work that is required to get to it from a presumed initial at-random arrangement in a prebiotic soup, or else a clumped-together but randomly sequenced polymer chain. 11 --> And, of course, there is an informed sequence of constraints that must apply force to confine relevant components to a restricted zone, then arrange them into a functional sequence to make one of the macromolecules of life. 12 --> Then, to arrange the resulting macromolecules into a functionally organised whole: including the info storage, reading and implementing systems required for the von Neumann self-replicator that is so central to cell-based life. 13 --> Such directed organised forces are physical work, and are further specified as to more or less macro-observable functional outcomes to be obtained. 14 --> And so, we see how there is indeed an analytical bridge between the energy and mass distribution issues of statistical thermodynamics, and the creation of functionally specific complex information lying at the heart of cell-based life. GEM of TKI
kairosfocus
August 7, 2009 at 10:40 PM PDT
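The confinement point in item 8 above can be made concrete with a toy lattice count. A sketch under the simplifying assumption of N identical particles on V discrete sites, so that W_config = C(V, N), in Python:

```python
from math import comb, log

def config_entropy(V, N):
    """ln W for N identical particles on V lattice sites,
    i.e. S_c / k = ln C(V, N) (dimensionless, units of k)."""
    return log(comb(V, N))

N = 50
print(config_entropy(1000, N))  # free to roam 1000 sites: ~196
print(config_entropy(100, N))   # confined to 100 sites: ~67, much lower
```

Shrinking the accessible volume (fewer sites) sharply reduces ln W, which is the sense in which confinement lowers configurational entropy.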
Sal: Re: "ID is focused on the configurational entropy; 'configurational entropy' was the term used in the founding book of the modern ID movement, Mystery of Life's Origin." 1 --> You are of course correct that the argument of Thaxton et al in ch. 8 of TMLO turns on splitting the TdS equation into thermal and configurational parts, constituting entropy at micro level as requiring, for the case of an informational polymer, [a] agglomeration of monomers from a scattered condition, then [b] the arrangement of the sequence of monomers in the resulting chain into a useful pattern. 2 --> In turn, this rests on seeing that S = k ln W is based on the number of ways particles and energy may be arranged at micro levels, consistent with prevailing macro conditions. 3 --> As Brillouin (whom Thaxton et al are following explicitly) put it in analysing Maxwell's Demon, who is envisioned as using his [acquired . . .] information on the speed and direction of molecules moving between two connected containers to force their separation, driving the relevant temperatures into disequilibrium and apparently breaking the 2nd law of thermodynamics:
Every physical system is incompletely defined. We only know the values of some macroscopic variables, and we are unable to specify the exact positions and velocities of all the molecules contained in a system. We have only scanty, partial information on the system, and most of the information on the detailed structure is missing. Entropy measures the lack of information; it gives us the total amount of missing information on the ultramicroscopic structure of the system. This point of view is defined as the negentropy principle of information . . . , and it leads directly to a generalization of the second principle of thermodynamics, since entropy and information must be discussed together and cannot be treated separately . . . any observation or experiment made on a physical system automatically results in an increase of the entropy of the laboratory. It is then possible to compare the loss of negentropy (increase of entropy) with the amount of information obtained. The efficiency of an experiment can be defined as the ratio of information obtained to the associated increase in entropy. This efficiency is always smaller than unity, according to the generalized Carnot principle. [Sci and info theory, 2nd Edn; Cf point 3 and point 8, App 1 my always linked]
4 --> Thus, Brillouin has opened an analytical door from thermal agitation and its randomised distribution to configurations, and highly informationally constrained configurations. 5 --> Gary L. Bertrand of the University of Missouri-Rolla argues similarly:
The freedom [i.e. micro-level uncertainty] within a part of the universe may take two major forms: the freedom of the mass and the freedom of the energy. The amount of freedom is related to the number of different ways the mass or the energy in that part of the universe may be arranged while not gaining or losing any mass or energy. We will concentrate on a specific part of the universe, perhaps within a closed container. If the mass within the container is distributed into a lot of tiny little balls (atoms) flying blindly about, running into each other and anything else (like walls) that may be in their way, there is a huge number of different ways the atoms could be arranged at any one time. Each atom could at different times occupy any place within the container that was not already occupied by another atom, but on average the atoms will be uniformly distributed throughout the container. If we can mathematically estimate the number of different ways the atoms may be arranged, we can quantify the freedom of the mass. If somehow we increase the size of the container, each atom can move around in a greater amount of space, and the number of ways the mass may be arranged will increase . . . . The thermodynamic term for quantifying freedom is entropy, and it is given the symbol S. Like freedom, the entropy of a system increases with the temperature and with volume . . . the entropy of a system increases as the concentrations of the components decrease. The part of entropy which is determined by energetic freedom is called thermal entropy, and the part that is determined by concentration is called configurational entropy.
[ . . . ]
kairosfocus
August 7, 2009 at 10:39 PM PDT
Dave Wisker, Regarding the Lynch paper on C. elegans, much of the recovery of the mutated lines had occurred in only 10 generations. In some of the lines it appears almost all of the recovery had occurred in that time (i.e. they were almost as productive as the non-mutated original strain). The authors attribute recovery to "compensatory epistatic mutations" by dismissing other alternatives, e.g. "beneficial mutations" and back mutations (because they were too improbable). The authors point out that compensatory epistatic mutations are not the same as "beneficial mutations" (even though they obviously benefit the organism). They postulate these mutations benefit the worst performing organisms the most. (It appears that if the organism is mutated then the mutations are good, and if the organism is not mutated, mutations are bad.) They also state this, which I find revealing because it moves from supposition to fact in the space of one sentence: "... compensatory mutation appears to be the only viable explanation for the observed results. Given that the observed fitness restoration did in fact result from compensatory mutation accumulation, our findings suggest that a surprisingly high fraction of deleterious mutations can be compensated." The title of their paper should have been: "Heavily Mutated Roundworms Rejuvenate in 10 Generations and We Don't Know How."
mad doc
August 7, 2009 at 07:21 PM PDT
There are various kinds of entropy: 1. thermal, 2. configurational, 3. informational. ID is focused on the configurational entropy; "configurational entropy" was the term used in the founding book of the modern ID movement, Mystery of Life's Origin (link provided above). Informational entropy may refer in some cases to the compactness of information (the inability to compress a file, for example, may indicate high informational entropy). That is because, from an informational standpoint, the lack of an algorithmically orderly structure is suggestive of maximum packing of information in every bit. If thermal entropy goes way up (like, say, bringing objects into a plasma state), then yes, configurational entropy is maximized. They are tied at extreme points of the temperature scale. But there is a region where thermal and configurational entropy may not be in sync. For example, I can have a scrabble pattern that says, "me thinks like a weasel". The configurational entropy won't change much despite the fact that the room temperature (and hence thermal entropy) may change... It is conceivable configurational entropy can go up while thermal entropy goes down. A house of cards has low configurational entropy; it is in an ordered state with respect to what we consider designed patterns. When it collapses it is in a high state of configurational entropy, and it is also in configurational equilibrium. There is an analogous case for configurational entropy in OOL, and that case was made in Mystery, a book highly influential on Dr. Meyer.
scordova
August 7, 2009 at 06:36 PM PDT
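The compressibility remark above can be tried directly. As a rough, informal proxy (a sketch, not a rigorous entropy measure), the ratio achieved by a general-purpose compressor distinguishes algorithmically ordered data from near-random data in Python:

```python
import os
import zlib

def compression_ratio(data: bytes) -> float:
    """Compressed size over original size: near 1.0 suggests high
    informational entropy; near 0 suggests algorithmic order."""
    return len(zlib.compress(data, 9)) / len(data)

ordered = b"AB" * 5000       # highly ordered, highly compressible
random_ = os.urandom(10000)  # near-maximal informational entropy
print(compression_ratio(ordered))  # very small
print(compression_ratio(random_))  # ~1.0 (or slightly above, with header overhead)
```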
PPS: Sigh: Link seems messed up; try again for no. 347 in the Eye thread of July 11, 2009. URL: https://uncommondescent.com/intelligent-design/an-eye-into-the-materialist-assault-on-lifes-origins/#comment-328014
kairosfocus
August 7, 2009 at 04:24 PM PDT
Nak, http://www.youtube.com/watch?v=_O-QqC9yM28
bornagain77
August 7, 2009 at 04:21 PM PDT
Onlookers: Re Mr Wisker @ 212: "Informational and thermodynamic entropy are analogous, not identical, which was my point." It is sadly clear that DW has not even seriously read, much less interacted with, even the brief excerpts in 210 above. Let us observe again how the mathematics is drawn out from the context of the informational challenge of the macro- vs micro-state views of systems, by Jaynes et al. In particular, let us look at: _____________ >> A remarkably simple and clear analysis by Shannon [1948] has provided us with a quantitative measure of the uncertainty, or missing pertinent information, inherent in a set of probabilities [NB: i.e. a probability should be seen as, in part, an index of ignorance] . . . . [Deriving . . . ] . . . . S({p_i}) = -C [SUM over i] p_i ln p_i, where [SUM over i] p_i = 1, and we can define also parameters alpha and beta such that: (1) p_i = e^-(alpha + beta*y_i); (2) e^alpha = [SUM over i] e^(-beta*y_i) = Z [Z being in effect the partition function across microstates, the "Holy Grail" of statistical thermodynamics] . . . [pp. 3-6] S, called the information entropy, . . . correspond[s] to the thermodynamic entropy, with C = k, the Boltzmann constant, and y_i an energy level, usually e_i, while beta becomes 1/kT, with T the thermodynamic temperature [this APPLICATION of a general analysis to a particular instance is what is being taken out of context to assert "analogy" without addressing the context that for instance is reflected in what follows] . . . A thermodynamic system is characterized by a microscopic structure [= microstates] that is not observed in detail [= uncertainty about the specific state] . . . We attempt to develop a theoretical description of the macroscopic properties in terms of its underlying microscopic properties, which are not precisely known. We attempt to assign probabilities to the various microscopic states [which represents information] . . . based on a few . . . macroscopic observations that can be related to averages of microscopic parameters. Evidently the problem that we attempt to solve in statistical thermophysics is exactly the one just treated in terms of information theory. [In short, we see not mere analogy but an analytical bridge.] It should not be surprising, then, that the uncertainty of information theory becomes a thermodynamic variable when used in proper context [i.e. we see an application of the analysis and associated theory]. [p. 7] . . . . >> ________________ That is what Mr Wisker needed to speak to cogently, and it is what he has decided to duck with a dismissive claim about analogies. So, I draw Mr Wisker's attention to the following from pp. vii-viii of Robertson:
. . . the standard assertion that molecular chaos exists is nothing more than a poorly disguised admission of ignorance, or lack of detailed information about the dynamic state of a system . . . . If I am able to perceive order, I may be able to use it to extract work from the system, but if I am unaware of internal correlations, I cannot use them for macroscopic dynamical purposes. On this basis, I shall distinguish heat from work, and thermal energy from other forms . . .
Now, of course there are debates in physics around such claims [and one need not agree with the sort of reasoning above to arrive at a design theory conclusion], but they are not to be settled by mere dismissive rhetorical assertions about "analogies." GEM of TKI PS: Nakashima-san: Mr Shapiro's point was about the problem of the chemistry, not just the statistics, save insofar as statistical thermodynamics issues are implied. And, I long since answered on evoloops [cf. 347 in the Eye into materialist assault thread], pointing out how they are ever so carefully constructed algorithmically to loop and to replicate; such entities in the real world would simply illustrate design at work, even as they express an intelligently designed and implemented algorithm in the simulated world that is presented. (I find it slightly annoying to see you resurrecting an answered matter as though it were not answered, when you did not respond on the merits when I pointed out what was wrong with the latest case being put up.)
kairosfocus
August 7, 2009 at 03:57 PM PDT
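The Jaynes-style correspondence quoted above can be checked numerically: maximizing S = -[SUM] p_i ln p_i under a fixed average energy yields p_i = e^(-beta*e_i)/Z. A small sketch, with arbitrary illustrative energy levels, in Python:

```python
import math

def boltzmann(energies, beta):
    """Maximum-entropy distribution for a fixed mean energy:
    p_i = exp(-beta * e_i) / Z, with Z the partition function."""
    weights = [math.exp(-beta * e) for e in energies]
    Z = sum(weights)
    return [w / Z for w in weights], Z

def shannon_entropy(ps):
    """S/C = -sum p_i ln p_i (in nats; multiply by k for J/K)."""
    return -sum(p * math.log(p) for p in ps if p > 0)

# Arbitrary energy levels (illustration only), with beta = 1/kT:
ps, Z = boltzmann([0.0, 1.0, 2.0, 3.0], beta=1.0)
print(Z)                    # partition function
print(shannon_entropy(ps))  # information entropy of the distribution
```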
tragic mishap, The DNA is the hardware that carries out the instructions of the software, the software written by the intelligent designer(s): "the" genetic algorithm(s), if you will. I am saying there is software telling the DNA what to do and when to do it, etc. It is the software (unseeable, whereas DNA can be seen) that runs the show. The sequence specificity is just to carry out the commands. When an RNA is formed, software gets downloaded to it. When DNA replicates, the software from the parent strands gets passed on, downloaded from parent to daughter. The ribosomes are programmed genetic compilers. BTW, DNA does not replicate itself. It gets replicated as part of the cell when the cell replicates. And that is when the software from the parent cell gets downloaded into the daughter.
Joseph
August 7, 2009 at 02:42 PM PDT
Mr BA^77, You are mistaken. 4^(10^9) is the number of genomes of length one gigabase. We were discussing how many C. elegans (not bacteria) it would take to cover a single base change at every one of the bases in their 100-megabase genome. Reverting mutations does not require searching all possible genomes. It requires looking at nearby genomes.
Nakashima
August 7, 2009 at 02:24 PM PDT
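The arithmetic behind this reply is easy to check: the space of all 100-megabase genomes is 4^(10^8), but the one-mutation neighborhood of a given genome has only 3 alternatives per site. A back-of-envelope sketch in Python:

```python
from math import log10

L = 100_000_000  # ~100-megabase C. elegans genome length

# All possible genomes of this length: 4^L, an astronomically large count.
print(f"4^{L} ~ 10^{L * log10(4):.3g}")  # about 10^(6.02e7)

# One-step (single-base-substitution) neighbors of any given genome:
print(f"{3 * L:,} single-base mutants")  # 300,000,000
```

Reverting a handful of specific point mutations is a search over the 3L neighborhood, not over 4^L genomes, which is the distinction being drawn.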
Nak, you stated: "If DNA miscopies every 10^-9 base, simple mutation will sample every possible choice in the genome eventually." When I responded: "4^1000000000 possible combinations Nak,,, How long will it take to search all of those possible combinations? Even with a population of say 10^40 bacteria?" you then stated: "What problem do you think that is relevant to? Nothing we have discussed recently." ---- Nak, why do you deny something you had made a direct claim to in the previous post? I believe they call the audacity of what you just did,,,, DENIALISM. Denialism "is the refusal to accept an empirically verifiable reality." http://en.wikipedia.org/wiki/Denialism As they say in AA recovery programs, Nak,,, "Keep Comin' Back."
bornagain77
August 7, 2009 at 02:01 PM PDT
DATCG, Well, as I am not in contact with Dr. Sanford I cannot comment on his current work. I think Sal can help you there. As you may well already know, he has developed a computer program called Mendel's Accountant: Using Computer Simulation to Understand Mutation Accumulation Dynamics and Genetic Load: excerpt: "We apply a biologically realistic forward-time population genetics program to study human mutation accumulation under a wide range of circumstances. Using realistic estimates for the relevant biological parameters, we investigate the rate of mutation accumulation, the distribution of the fitness effects of the accumulating mutations, and the overall effect on mean genotypic fitness. Our numerical simulations consistently show that deleterious mutations accumulate linearly across a large portion of the relevant parameter space." http://bioinformatics.cau.edu.cn/lecture/chinaproof.pdf MENDEL'S ACCOUNTANT: J. SANFORD, J. BAUMGARDNER, W. BREWER, P. GIBSON, AND W. REMINE http://mendelsaccount.sourceforge.net http://www.scpe.org/vols/vol08/no2/SCPE_8_2_02.pdf But as Dave has persistently pointed out on this thread, there is a compensation mechanism in operation that Dr. Sanford has apparently not taken full appraisal of yet (IMHO, the mechanism is far from being the "source" of functional information that Dave claims it is, which is what evolutionists would need to falsify the principle of GE). Thus, in my very limited knowledge of the matter, I would have to say, from all the evidence I have been able to examine thus far, that if Dr. Sanford is able to solidify his baseline for GE, from improved and rigorous measurement of the loss of functional information/complexity/fitness in current "compensation" studies, and extrapolate that baseline in a meaningful way to several sources of observed GE from ancient bacteria, he will have made the first move to meaningfully establish GE as the principle for biology, and will have provided a basis to elucidate it further mathematically in his MA program; that is to say, he will have done so at least for the bacterial populations studied. I also know, from my nosebleed view of the entire spectrum of current evidence, that Dr. Sanford can take a fairly large amount of confidence in one of the prime predictions of GE, in that I firmly believe all ancient bacteria will be found to have more functional complexity than their current lineages (Sal told me Sanford is looking for ancient bacteria in salt mines). If the loss is somewhat consistent across all populations, from ancient to current, then it will give strong indication of an overriding law,,, and even if the loss is not consistent across the bacterial populations,,, it will at least be a very interesting area of investigation for Dr. Sanford and whoever is working with him. Other than that I really have no clue what Dr. Sanford is up to.
bornagain77
August 7, 2009 at 01:43 PM PDT
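For readers wanting to see the shape of such a forward-time simulation, here is a deliberately tiny toy model (not Mendel's Accountant, whose parameters and machinery are far richer): individuals accumulate deleterious mutations, each multiplying fitness by (1 - s), and parents are drawn in proportion to fitness.

```python
import random

def simulate(generations=200, pop=500, mu=0.1, s=0.01, seed=1):
    """Toy mutation-accumulation sketch (NOT Mendel's Accountant).
    mu: per-individual, per-generation chance of one new deleterious
    mutation (a crude stand-in for a Poisson mutation count);
    s: fitness cost per mutation, applied multiplicatively."""
    rng = random.Random(seed)
    counts = [0] * pop  # deleterious mutations carried by each individual
    for _ in range(generations):
        weights = [(1 - s) ** c for c in counts]
        parents = rng.choices(counts, weights=weights, k=pop)
        counts = [c + (1 if rng.random() < mu else 0) for c in parents]
    return sum((1 - s) ** c for c in counts) / pop

print(simulate())  # mean population fitness after 200 generations
```

Varying mu, s, and pop shows the tug-of-war between mutation pressure and selection that the debate above is about; this sketch makes no claim about which regime real genomes occupy.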
Mr DATCG, "Why Mr. Nakashima? Why didn't he rule it out? Why did he leave that possibility open? And if anything is possible according to Darwinians, why not intelligence?" I think he left it open out of intellectual honesty. The problem is detecting the signature of intelligence, the subject of Dr Meyer's book. The space aliens, like God, have done a good job of covering their tracks.
Nakashima
August 7, 2009 at 01:26 PM PDT
Mr BA^77, "4^1000000000 possible combinations Nak,,, How long will it take to search all of those possible combinations? Even with a population of say 10^40 bacteria?" What problem do you think that is relevant to? Nothing we have discussed recently. "As well, I think you are trying to say there is absolutely no complex algorithmic information in the genome whatsoever and that we should look totally to 'non-information causes' in this recovery of information,, at least that is what it seems like from your naive appeal to totally 'brute material forces' to recover the information that was lost." Yes, it was lost by a brute materialist force and can be recovered by the same force. Is that surprising? "As well, Nak just because you have somewhat deluded yourself into thinking Genetic Entropy is falsified, on such piddling evidence of 'equilibrium' in Dave's paper,,, and proudly proclaim GE is falsified over and over,,, does not detract from the truth that GE is not rigorously falsified within the scientific method [...]" Inasmuch as GE is formulated as necessarily true for all genomes, any single counter-example falsifies it. Any recovery in function is a falsification. At least it would be if GE were a scientific theory, published in a peer-reviewed journal article. As a bunch of hand-waving and opinion, it can squirm out of any uncomfortable situation. What is your definition of GE now? "Genetic Entropy is a one-way decline in function that is inevitable due to accumulating deleterious mutations, except when function is recovered, but not too much." You might want to check with Dr Sanford on that. "Here is yet another study that I am afraid you will completely ignore:" I'm actually very interested in Abel's three classes of sequence complexity. Abel doesn't make any strong claims for the three classes, though you might be tempted to think he does by his writings. He doesn't claim they are universal or that they are distinct. Is there possibly a fourth class? Abel doesn't say yes or no. Do his three classes overlap? He doesn't say yes or no. If they are universal and distinct, as he would like you to believe, is there a definite decision procedure that lets you distinguish which of the three classes a sequence belongs to? No; how could there be without resolving the first two issues? It is also interesting that Abel never references Wolfram and his classification system, even though that is the most relevant similar research. That bothered me, because it seemed to me that Abel was ducking some important questions about CAs with simple rules and simple inputs that generated quite complex sequences.
Nakashima
August 7, 2009 at 01:16 PM PDT
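Since Wolfram's classification of cellular automata comes up here, a minimal elementary CA can be sketched in a few lines (rule 110 is Wolfram's standard example of complex behavior arising from a trivially simple rule). A sketch in Python:

```python
def step(cells, rule=110):
    """One update of a one-dimensional, binary, radius-1 cellular
    automaton on a ring; `rule` is Wolfram's 8-bit rule number."""
    n = len(cells)
    def new(i):
        # Encode the (left, center, right) neighborhood as a 3-bit index.
        idx = 4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n]
        return (rule >> idx) & 1
    return [new(i) for i in range(n)]

cells = [0] * 40 + [1] + [0] * 40  # start from a single live cell
for _ in range(20):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```

Running it prints the familiar irregular rule-110 pattern; changing `rule` to, say, 250 or 90 shows Wolfram's simpler classes from the same machinery.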
DATCG, You ask why life is a means of local entropy reduction. I would answer: because it converts energy from the sun into carbon structures. But I don't think that's what you really want as an answer. That's because you are asking a philosophical rather than a scientific question. Science answers what and how types of questions. Ask a philosopher for the why, as well as why the ozone layer is important. My answers are bound to be uninteresting to you.
Dave Wisker
August 7, 2009 at 01:09 PM PDT
Mr DATCG, Come now, you must be aware that I do discuss origins all the time on UD! I just linked to an interesting paper today... You are conflating OOL and C. elegans reverting some deleterious mutations. I don't think that is a good idea.
Nakashima
August 7, 2009 at 12:17 PM PDT
BA77 and maybe Scordova? What is Dr. Sanford's current thinking on GE? Is he still tracking an eventual mutational meltdown over time? If so, how long? Has he changed the timeline? I'm very curious to hear what Scordova discovers in future discussions with Dr. Sanford. I'm also curious about future events and conversations with Dr. Meyer.
DATCG
August 7, 2009 at 12:14 PM PDT
Nak states: "If DNA miscopies every 10^-9 base, simple mutation will sample every possible choice in the genome eventually." 4^1000000000 possible combinations Nak,,, How long will it take to search all of those possible combinations? Even with a population of say 10^40 bacteria? As well, I think you are trying to say there is absolutely no complex algorithmic information in the genome whatsoever and that we should look totally to "non-information causes" in this recovery of information,, at least that is what it seems like from your naive appeal to totally "brute material forces" to recover the information that was lost. Here is yet another study that I am afraid you will completely ignore: "No man-made program comes close to the technical brilliance of even Mycoplasmal genetic algorithms. Mycoplasmas are the simplest known organism with the smallest known genome, to date. How was its genome and other living organisms' genomes programmed?" - David L. Abel and Jack T. Trevors, "Three Subsets of Sequence Complexity and Their Relevance to Biopolymeric Information," Theoretical Biology & Medical Modelling, Vol. 2, 11 August 2005, page 8. Mycoplasma Genitalium - The "Simplest" Life On Earth - video http://www.youtube.com/watch?v=eRoMxpZWR7c As well, Nak, just because you have somewhat deluded yourself into thinking Genetic Entropy is falsified, on such piddling evidence of "equilibrium" in Dave's paper,,, and proudly proclaim GE is falsified over and over,,, does not detract from the truth that GE is not rigorously falsified within the scientific method,,, and in fact once again shows your unscientific bias as an uncritical Darwinian cheerleader... Until the matter is fully ironed out with further rigorous experimentation, that is the true state of how the GE falsification matter sits... In fact, in all fairness to the current state of the evidence, I feel the evidence looks very good for GE being further solidified as the main principle for biology.
bornagain77
August 7, 2009 at 11:58 AM PDT
Nakashima, You never explain how such a system came about in the first place. So far, origins is still a mystery. After 200 years, billions spent, the many different conjectures, ideas, and imaginings have failed to produce any evidence of origins. You talk about sophisticated programs as they are today and state no intelligence is required, yet you cannot recreate these intelligent bio-programs yourself even though you have great experience in life. Yet you say: easy. But why can you not create such an easy bio-program, Mr. Nakashima? With all due respect to your background, which I do recognize, I'm not as bold as you. I admit there is much we do not know, too much to rule out a guided evolution possibility. Even Richard Dawkins refused to rule out possible seedings from advanced civilizations as responsible for the Code of Life. Why, Mr. Nakashima? Why didn't he rule it out? Why did he leave that possibility open? And if anything is possible according to Darwinians, why not intelligence?
DATCG
August 7, 2009 at 11:31 AM PDT
DaveW, First, to correct any possible misunderstanding from my previous post about looking through a Designer's eye ("design to counteract mutations..."): I should make it clear that I believe mutations are allowed to a certain extent for variation, if that was not obvious. But for survival purposes under stress, etc., the design must be flexible and recoverable, plus maintain a core for distribution and storage of blueprints and the unfolding of life under different ecozones. Discovering the error-correction programs to edit, splice, copy, destroy, disband, etc., is evidence of design. Now, to your response: "Interesting. You could also look at life as a means of locally reducing entropy (albeit temporarily) in the face of a Universe ultimately winding down." You could, but why? Please expand. Why would "life" do anything, from an unguided perspective? There's always an unmentioned ghost in the machine of unguided evolutionary logic which seems illogical to me. It's like saying mother nature did it. But why? Without purpose, goals, or reason for being, why? Why does "life" provide local zones of temporary protection? I asked, "Do you think the Ozone layer is important?" You said, "Yes." Because why, Dave? What does the Ozone layer provide protection from, regarding "life"? You stated above that "life" provides temporary security from entropy, but that is simply not true if you take the Ozone layer and atmospheric layers away. Whatever you consider "life" to be cannot survive as we know it today without protection. What do we need protection from that the Ozone layer provides?
DATCG
August 7, 2009 at 11:19 AM PDT
PPS - An interesting example of this brute-force problem solving might be happening in the AiG story that BA^77 referenced earlier. If you sequenced the genome of the wild-type and/or antibiotic-resistant bacteria, you might find that the bacteria after ten hours (20 generations?) had shifted the genome in response to the famine media. Might be a fun experiment! :)
Nakashima
August 7, 2009 at 10:31 AM PDT
As a corollary to my last comment, the history of cryptography is one of elegant ciphers falling to brute-force attacks. The Enigma-breaking 'bombes' of Alan Turing were checking every possible combination as fast as they could. Botnets can be used to crack keys as easily as launch denial-of-service attacks.
Nakashima
August 7, 2009 at 10:18 AM PDT
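Scaled all the way down, the brute-force point looks like this: a Caesar cipher's 26-key space falls instantly to exhaustive search, the same attack the bombes ran against Enigma's vastly larger (but still searchable) key space. A toy sketch in Python:

```python
def caesar(text: str, shift: int) -> str:
    """Shift each letter by `shift` positions (mod 26)."""
    return "".join(
        chr((ord(c) - 65 + shift) % 26 + 65) if c.isalpha() else c
        for c in text.upper()
    )

ciphertext = caesar("ATTACK AT DAWN", 7)

# Brute force: try every possible key and eyeball the output.
for key in range(26):
    print(key, caesar(ciphertext, -key))
```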
