Uncommon Descent: Serving the Intelligent Design Community

Too much biology to fit in the data pockets?


That’s what some experts say:

Despite the need for new analytical tools, a number of biologists said that the computational infrastructure continues to be underfunded. “Often in biology, a lot of money goes into generating data but a much smaller amount goes to analyzing it,” said Nathan Price, associate director of the Institute for Systems Biology in Seattle. While physicists have free access to university-sponsored supercomputers, most biologists don’t have the right training to use them. Even if they did, the existing computers aren’t optimized for biological problems. “Very frequently, national-scale supercomputers, especially those set up for physics workflows, are not useful for life sciences,” said Rob Knight, a microbiologist at the University of Colorado Boulder and the Howard Hughes Medical Institute involved in both the Earth Microbiome Project and the Human Microbiome Project. “Increased funding for infrastructure would be a huge benefit to the field.”

Comments
A prediction of creationism would be that evolutionism doesn't make biological origins plausible. Maybe evolutionary biologists would do best to avoid supercomputers.
Robert Byers
October 12, 2013 at 12:14 AM PDT
As to:
"Even if they did, the existing computers aren’t optimized for biological problems."
nor, to model biology realistically, will computers ever be:
"Complexity Brake" Defies Evolution - August 2012 Excerpt: "This is bad news. Consider a neuronal synapse -- the presynaptic terminal has an estimated 1000 distinct proteins. Fully analyzing their possible interactions would take about 2000 years. Or consider the task of fully characterizing the visual cortex of the mouse -- about 2 million neurons. Under the extreme assumption that the neurons in these systems can all interact with each other, analyzing the various combinations will take about 10 million years..., even though it is assumed that the underlying technology speeds up by an order of magnitude each year.",,, Even with shortcuts like averaging, "any possible technological advance is overwhelmed by the relentless growth of interactions among all components of the system," Koch said. "It is not feasible to understand evolved organisms by exhaustively cataloging all interactions in a comprehensive, bottom-up manner." He described the concept of the Complexity Brake:,,, "Allen and Greaves recently introduced the metaphor of a "complexity brake" for the observation that fields as diverse as neuroscience and cancer biology have proven resistant to facile predictions about imminent practical applications. Improved technologies for observing and probing biological systems has only led to discoveries of further levels of complexity that need to be dealt with. This process has not yet run its course. We are far away from understanding cell biology, genomes, or brains, and turning this understanding into practical knowledge.",,, Why can't we use the same principles that describe technological systems? Koch explained that in an airplane or computer, the parts are "purposefully built in such a manner to limit the interactions among the parts to a small number." The limited interactome of human-designed systems avoids the complexity brake. "None of this is true for nervous systems.",,, to read more go here: http://www.evolutionnews.org/2012/08/complexity_brak062961.html To Model the Simplest Microbe in the World, You Need 128 Computers - July 2012 Excerpt: Mycoplasma genitalium has one of the smallest genomes of any free-living organism in the world, clocking in at a mere 525 genes. That's a fraction of the size of even another bacterium like E. coli, which has 4,288 genes.,,, The bioengineers, led by Stanford's Markus Covert, succeeded in modeling the bacterium, and published their work last week in the journal Cell. What's fascinating is how much horsepower they needed to partially simulate this simple organism. It took a cluster of 128 computers running for 9 to 10 hours to actually generate the data on the 25 categories of molecules that are involved in the cell's lifecycle processes.,,, ,,the depth and breadth of cellular complexity has turned out to be nearly unbelievable, and difficult to manage, even given Moore's Law. The M. genitalium model required 28 subsystems to be individually modeled and integrated, and many critics of the work have been complaining on Twitter that's only a fraction of what will eventually be required to consider the simulation realistic.,,, http://www.theatlantic.com/technology/archive/2012/07/to-model-the-simplest-microbe-in-the-world-you-need-128-computers/260198/
Also of note, quantum entanglement is found in DNA on a massive scale:
Quantum Information/Entanglement In DNA – short video http://www.metacafe.com/watch/5936605/
And this quantum entanglement is implicated in quantum computation:
Is DNA a quantum computer? - Stuart Hameroff
Excerpt: DNA could function as a quantum computer, with superpositions of base pair dipoles acting as qubits. Entanglement among the qubits, necessary in quantum computation, is accounted for through quantum coherence in the pi stack where the quantum information is shared. ...
http://www.quantumconsciousness.org/dnaquantumcomputer1.htm
Here is a clear example of ‘quantum computation’ in the cell:
Quantum Dots Spotlight DNA-Repair Proteins in Motion - March 2010
Excerpt: "How this system works is an important unanswered question in this field," he said. "It has to be able to identify very small mistakes in a 3-dimensional morass of gene strands. It's akin to spotting potholes on every street all over the country and getting them fixed before the next rush hour." - Dr. Bennett Van Houten
Of note: A bacterium has about 40 team members on its pothole crew. That allows its entire genome to be scanned for errors in 20 minutes, the typical doubling time. These smart machines can apparently also interact with other damage control teams if they cannot fix the problem on the spot.
http://www.sciencedaily.com/releases/2010/03/100311123522.htm
Of note: the task facing these DNA repair machines, 'fixing every pothole in America before the next rush hour,' is analogous to the traveling salesman problem. The traveling salesman problem is an NP-hard (read: very hard) problem in computer science; it asks for the shortest possible route that visits each city exactly once. Traveling salesman problems are notorious for keeping supercomputers busy for days.
NP-hard problem - Examples
http://en.wikipedia.org/wiki/NP-hard#Examples

Speed Test of Quantum Versus Conventional Computing: Quantum Computer Wins - May 8, 2013
Excerpt: quantum computing is, "in some cases, really, really fast." McGeoch says the calculations the D-Wave excels at involve a specific combinatorial optimization problem, comparable in difficulty to the more famous "travelling salesperson" problem that's been a foundation of theoretical computing for decades. "This type of computer is not intended for surfing the internet, but it does solve this narrow but important type of problem really, really fast," McGeoch says. "There are degrees of what it can do. If you want it to solve the exact problem it's built to solve, at the problem sizes I tested, it's thousands of times faster than anything I'm aware of. If you want it to solve more general problems of that size, I would say it competes -- it does as well as some of the best things I've looked at. At this point it's merely above average but shows a promising scaling trajectory."
http://www.sciencedaily.com/releases/2013/05/130508122828.htm
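To make the difficulty concrete, here is a minimal brute-force traveling-salesman sketch in Python. It is an illustration only, not the method used by any system cited above: it simply tries every tour, and the factorial growth in the number of tours is exactly why exhaustive search becomes hopeless at even modest problem sizes.

from itertools import permutations
from math import dist, factorial
import random

def brute_force_tsp(points):
    """Exhaustively check every closed tour starting at points[0].
    There are (n-1)! such tours -- feasible only for very small n."""
    start, *rest = range(len(points))
    best_tour, best_len = None, float("inf")
    for perm in permutations(rest):
        tour = (start, *perm, start)
        length = sum(dist(points[a], points[b])
                     for a, b in zip(tour, tour[1:]))
        if length < best_len:
            best_tour, best_len = tour, length
    return best_tour, best_len

random.seed(0)
cities = [(random.random(), random.random()) for _ in range(9)]
tour, length = brute_force_tsp(cities)
print(f"best tour {tour}, length {length:.3f}")
print(f"tours checked for 9 cities : {factorial(8):,}")
print(f"tours to check for 25 cities: {factorial(24):.2e}")

Real solvers rely on heuristics and branch-and-bound precisely because this exhaustive approach is infeasible beyond a couple of dozen cities.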
Since there is obviously no material CPU (central processing unit) in DNA, or in the cell, busily crunching bits to compute answers to this monster logistics problem in a purely 'material' fashion, it is readily apparent that this monster 'traveling salesman problem' of DNA repair is somehow being computed by 'non-local' quantum computation within the cell and/or within DNA. Moreover, even though quantum computation is implicated in biology on a massive scale, scientists are having an extremely difficult time achieving even the first tiny steps of quantum computation in machines, even though the payoff, and the investment, is huge:
Quantum Computing Promises New Insights, Not Just Supermachines - Scott Aaronson - December 2011 Excerpt: Unfortunately, while small quantum computations have already been demonstrated in the lab, they typically fall apart after only a few dozen operations. That’s why one of the most-celebrated quantum computations to date has been to factor 15 into 3 times 5 — with high statistical confidence! (With a lab full of equipment). The problem is decoherence: basically, stray interactions that intrude prematurely on the computer’s fragile quantum state, “collapsing” it like a soufflé. In theory, it ought to be possible to reduce decoherence to a level where error-correction techniques could render its remaining effects insignificant. But experimentalists seem nowhere near that critical level yet. (Of note: Now they have factored 143 into 13 times 11) http://www.nytimes.com/2011/12/06/science/scott-aaronson-quantum-computing-promises-new-insights.html?pagewanted=2&_r=1&ref=science
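For contrast, the factorizations cited above as quantum-computing milestones (15 = 3 x 5 and 143 = 13 x 11) are trivial for any classical machine; the naive trial-division sketch below (my own illustration, not Shor's algorithm) dispatches them instantly, which underlines how far laboratory quantum computation still is from classically hard problem sizes.

def trial_division(n):
    """Return the prime factors of n by naive trial division."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

for n in (15, 143):
    print(n, "=", " x ".join(map(str, trial_division(n))))
# prints: 15 = 3 x 5, then 143 = 11 x 13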
Also of interest: the integrated coding between the DNA, RNA, and proteins of the cell appears to be ingeniously programmed along the very stringent guidelines laid out by Landauer's principle, and by Charles Bennett of IBM (of quantum teleportation fame), for 'reversible computation', in order to achieve such amazing energy efficiency.
Notes on Landauer's principle, reversible computation, and Maxwell's Demon - Charles H. Bennett
Excerpt: Of course, in practice, almost all data processing is done on macroscopic apparatus, dissipating macroscopic amounts of energy far in excess of what would be required by Landauer's principle. Nevertheless, some stages of biomolecular information processing, such as transcription of DNA to RNA, appear to be accomplished by chemical reactions that are reversible not only in principle but in practice. ...
http://www.hep.princeton.edu/~mcdonald/examples/QM/bennett_shpmp_34_501_03.pdf
The amazing energy efficiency possible with 'reversible computation' has been known about since Charles Bennett laid out the principles of reversible computing in 1973, but, as far as I know, due to the extreme level of complexity involved in achieving such ingenious 'reversible coding', it has yet to be accomplished in any meaningful way in our computer programs even to this day:
Reversible computing
Excerpt: Reversible computing is a model of computing where the computational process to some extent is reversible, i.e., time-invertible. ... Although achieving this goal presents a significant challenge for the design, manufacturing, and characterization of ultra-precise new physical mechanisms for computing, there is at present no fundamental reason to think that this goal cannot eventually be accomplished, allowing us to someday build computers that generate much less than 1 bit's worth of physical entropy (and dissipate much less than kT ln 2 energy to heat) for each useful logical operation that they carry out internally. - per Wikipedia
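For reference, the "kT ln 2" figure in the excerpt above is straightforward to evaluate: it is the Landauer bound, the minimum heat dissipated per irreversibly erased bit at temperature T. A quick back-of-the-envelope sketch (the choice of temperatures is my own and purely illustrative):

from math import log

K_BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K (exact, 2019 SI definition)

def landauer_limit(temperature_kelvin):
    """Minimum energy dissipated per irreversibly erased bit: k_B * T * ln 2."""
    return K_BOLTZMANN * temperature_kelvin * log(2)

for label, temp in (("room temperature", 298.0), ("physiological", 310.0)):
    print(f"{label} ({temp} K): {landauer_limit(temp):.2e} J per erased bit")
# roughly 3e-21 joules per erased bit in both cases

Conventional digital logic dissipates many orders of magnitude more than this per operation, which is what makes the practical reversibility of some biomolecular steps noted by Bennett so striking.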
bornagain77
October 11, 2013 at 04:20 PM PDT
