
Zettabytes – by Chance or Design?


A new measure of information has come into use – the zettabyte = 1,000,000,000,000,000,000,000 bytes, or 10^21 bytes.

"Zettabytes overtake petabytes as largest unit of digital measurement", Heidi Blake, The Telegraph (UK), 4 May 2010: "The size of the 'digital universe' will swell so rapidly this year that a new unit – the zettabyte – has been invented to measure it."

Humanity’s total digital output currently stands at 8,000,000 petabytes – which each represent a million gigabytes – but is expected to pass 1.2 zettabytes this year.
One zettabyte is equal to one million petabytes, or 1,000,000,000,000,000,000,000 individual bytes. . . .
“A huge increase in video and digital photography – in the old days people would take one photograph, now they can knock off 20 photos and rather than store just one, people store all 20. Then there is the fact that the number of devices where information can be generated and stored has also increased.”
As a result the digital universe is forecast to expand by a factor of 44 over the next decade, according to the survey.
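
As a quick sanity check on those conversions, here is a minimal Python sketch using the decimal (SI) definitions quoted above; the only figures used are the ones in the article.

```python
# Decimal (SI) storage units referenced in the article above.
GIGABYTE = 10**9
PETABYTE = 10**15    # "each represent a million gigabytes"
ZETTABYTE = 10**21   # one million petabytes

print(PETABYTE // GIGABYTE)    # 1,000,000 gigabytes per petabyte
print(ZETTABYTE // PETABYTE)   # 1,000,000 petabytes per zettabyte
print(1.2 * ZETTABYTE)         # 1.2e+21 bytes, the 1.2 ZB forecast for this year
```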

Werner Gitt observes that the storage capacity of

"1 cubic cm of DNA is 10^21 bits. (DNA – deoxyribonucleic acid.)"

Comparison of different quantities of information, p. 188, In the Beginning Was Information, Werner Gitt, CLV, 2000, ISBN 3-89397-255-2. See also the 2006 edition, ISBN 0890514615.

In today's terminology, DNA has a storage capacity of roughly 0.1 zettabyte per millilitre.

By comparison, Hitachi classes "very large capacity" drives as those holding 2 TB (2 * 10^12 bytes).
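
A rough check of that comparison, as a sketch using only the figures above (Gitt's 10^21 bits per cubic centimetre and a 2 TB drive):

```python
# DNA packing density (Gitt's figure) versus a 2 TB "very large capacity" drive.
DNA_BITS_PER_CM3 = 10**21          # Gitt: 10^21 bits per cubic cm of DNA
ZETTABYTE = 10**21                 # bytes
DRIVE_BYTES = 2 * 10**12           # 2 TB

dna_bytes_per_ml = DNA_BITS_PER_CM3 / 8      # 1 cm^3 = 1 ml -> 1.25e20 bytes
print(dna_bytes_per_ml / ZETTABYTE)          # ~0.125, i.e. roughly 0.1 zettabyte per ml
print(dna_bytes_per_ml / DRIVE_BYTES)        # ~6.25e7 such drives per millilitre of DNA
```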

Werner Gitt further estimated the

"Daily information flows in the human body" as about 3.4 * 10^24 bits per day, or 3.9 * 10^19 bits/s.

Figure 33, The Information Spiral, ibid., p. 189 – i.e., about 430 zettabytes/day.
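
A quick check of that conversion (a sketch; both input figures are Gitt's, as quoted above):

```python
# Gitt's estimate of the daily information flow in the human body.
BITS_PER_DAY = 3.4e24
SECONDS_PER_DAY = 86_400
ZETTABYTE = 1e21                        # bytes

print(BITS_PER_DAY / SECONDS_PER_DAY)   # ~3.9e19 bits/s, matching the quoted rate
print(BITS_PER_DAY / 8 / ZETTABYTE)     # ~425, i.e. about 430 zettabytes per day
```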

Now what are the probabilities of zettabytes of complex specified information (CSI) being generated by natural stochastic processes ("chance") as compared to Design?

For extra credit, how would one objectively evaluate the probability of Daniel's prediction being fulfilled? See:
"many will go here and there to increase knowledge." – Daniel 12:4
——————
Corrected “zetabyte” to “zettabyte”. See: Zettabyte

Comments
Just when you think you've seen it all: Wingsuit flying with rocketboots - CRAZY! http://www.youtube.com/watch?v=8HMdioj6kng
BASE wingsuit mountain swooping http://www.youtube.com/watch?v=p1p3pS2XV4A
bornagain77
May 6, 2010 at 04:56 PM PDT
Much better. Off topic: The hand of God... or Darwin. HDV starling footage cut to Pachelbel http://www.youtube.com/watch?v=D36ujD-FUl4
bornagain77
May 6, 2010 at 03:59 PM PDT
Off topic: Immense murmuration of Spectacular Starlings http://www.youtube.com/watch?v=8vhE8ScWe7w
bornagain77
May 6, 2010 at 03:46 PM PDT
William Dembski re:

"Zettabytes appear explicitly in my 2004 THE DESIGN REVOLUTION (p. 120)"

Now why didn't I remember that? Good reminder to dig deeper before going along with reporter enthusiasm.
DLH
May 6, 2010 at 10:41 AM PDT
andrewjg: some more "useless" knowledge:

MapReduce: simplified data processing on large clusters
Excerpt: ... an average of one hundred thousand MapReduce jobs are executed on Google's clusters every day, processing a total of more than twenty petabytes of data per day.
http://portal.acm.org/citation.cfm?doid=1327452.1327492

i.e. Google processes 20 petabytes of data per day. That is, 20,000,000,000,000,000 bytes every day. This works out to around 231,000,000,000 bytes per second!!!

which reminds me of these articles:

Cells Are Like Robust Computational Systems - June 2009
Excerpt: Gene regulatory networks in cell nuclei are similar to cloud computing networks, such as Google or Yahoo!, researchers report today in the online journal Molecular Systems Biology. The similarity is that each system keeps working despite the failure of individual components, whether they are master genes or computer processors. ... "We now have reason to think of cells as robust computational devices, employing redundancy in the same way that enables large computing systems, such as Amazon, to keep operating despite the fact that servers routinely fail."
http://www.sciencedaily.com/releases/2009/06/090616103205.htm

Nanoelectronic Transistor Combined With Biological Machine Could Lead To Better Electronics - Aug. 2009
Excerpt: While modern communication devices rely on electric fields and currents to carry the flow of information, biological systems are much more complex. They use an arsenal of membrane receptors, channels and pumps to control signal transduction that is unmatched by even the most powerful computers.
http://www.sciencedaily.com/releases/2009/08/090811091834.htm
bornagain77
May 6, 2010 at 08:17 AM PDT
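
For reference, a quick check of the throughput arithmetic in the comment above, as a small sketch; the 20 PB/day figure is the one quoted from the MapReduce paper.

```python
# Google's reported MapReduce volume: ~20 petabytes processed per day.
PETABYTE = 10**15                       # bytes
SECONDS_PER_DAY = 86_400

daily_bytes = 20 * PETABYTE             # 2.0e16 bytes per day
print(daily_bytes / SECONDS_PER_DAY)    # ~2.31e11, i.e. ~231,000,000,000 bytes per second
```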
100 gigabytes per second:
1 x 10^38 bytes / 1 x 10^11 bytes per second = 1 x 10^27 seconds
Did anybody bring a lunch? I think this is going to take a while.
bornagain77
May 5, 2010 at 01:01 PM PDT
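
To put 10^27 seconds in perspective, a small sketch; it assumes the 2^128-byte ZFS limit discussed elsewhere in this thread (about 3.4 x 10^38 bytes, versus the rounded 10^38 above) and the same 100 GB/s link.

```python
# Time to fill a 2^128-byte ZFS pool at 100 gigabytes per second.
capacity_bytes = 2**128                      # ~3.4e38 bytes
rate_bytes_per_s = 100 * 10**9               # 100 GB/s = 1e11 bytes/s
SECONDS_PER_YEAR = 365.25 * 86_400

seconds = capacity_bytes / rate_bytes_per_s  # ~3.4e27 seconds
print(seconds / SECONDS_PER_YEAR)            # ~1.1e20 years, vastly longer than the age of the universe
```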
DLH, I just stumbled across a couple of Gitt's quotes:

"DNA molecules contain the highest known packing density of information. This exceedingly brilliant storage method reaches the limit of the physically possible, namely down to the level of single molecules. At this level the information density is more than 10^21 bits/cm^3." W. Gitt, In The Beginning Was Information, pg 195.

"Man is undoubtedly the most complex information processing system existing on the earth. The total number of bits handled daily in all information processing events occurring in the human body, is 3 x 10^24. The number of bits being processed daily in the human body is more than a million times the total amount of human knowledge stored in all the libraries of the world, which is about 10^18 bits." W. Gitt, In The Beginning Was Information, pg 88.

http://mywebpages.comcast.net/mkent595/MolecularBiology.html
bornagain77
May 5, 2010 at 12:56 PM PDT
Zettabytes appear explicitly in my 2004 THE DESIGN REVOLUTION (p. 120):
http://books.google.com/books?id=sKVqpXqE0VwC&printsec=frontcover&dq=DEMBSKI+DESIGN+REVOLUTION+ZETTABYTES&source=bl&ots=0tyndT2U3i&sig=gnIcdDRNskLSEEHsbxA2dVQPhBE&hl=en&ei=RXfhS7POL4uE8wT32ayPAw&sa=X&oi=book_result&ct=result&resnum=1&ved=0CAYQ6AEwAA#v=onepage&q=ZETTABYTES&f=false
William Dembski
May 5, 2010 at 06:50 AM PDT
andrewjg, you've probably seen this number before, yet if you haven't, the largest "material" number I've ever seen semi-directly associated with "information" is this one:

According to esteemed British mathematical physicist Roger Penrose (1931-present), the odds of one particular individual constant, the "original phase-space volume" of the universe, required such precision that the "Creator's aim must have been to an accuracy of 1 part in 10^10^123". This number is gargantuan. If this number were written out in its entirety, 1 with 10^123 zeros to the right, it could not be written on a piece of paper the size of the entire visible universe, even if a number were written down on each sub-atomic particle in the entire universe, since the universe only has 10^80 sub-atomic particles in it.

Roger Penrose discusses initial entropy of the universe - video
http://www.youtube.com/watch?v=WhGdVMBk6Zo

The Physics of the Small and Large: What is the Bridge Between Them? Roger Penrose
Excerpt: "The time-asymmetry is fundamentally connected with the Second Law of Thermodynamics: indeed, the extraordinarily special nature (to a greater precision than about 1 in 10^10^123, in terms of phase-space volume) can be identified as the 'source' of the Second Law (Entropy)."
http://www.pul.it/irafs/CD%20IRAFS%2702/texts/Penrose.pdf

How special was the big bang? - Roger Penrose
Excerpt: This now tells us how precise the Creator's aim must have been: namely to an accuracy of one part in 10^10^123. (from The Emperor's New Mind, Penrose, pp. 339-345, 1989)
http://www.ws5.com/Penrose/

This 1 in 10^10^123 number, for the time-asymmetry of the initial state of entropy for the universe, also lends strong support for "highly specified infinite information" creating the universe, since "Gain in entropy always means loss of information, and nothing more." Gilbert Newton Lewis - 1923. Authored Thermodynamics and the Free Energy of Chemical Substances with M. Randall.
http://www.woodrow.org/teachers/ci/1992/Lewis.html
bornagain77
May 5, 2010 at 04:18 AM PDT
@bornagain Me too. I am just not very good at cataloging it. I often find your links interesting, e.g. the Euler one recently.
andrewjg
May 5, 2010 at 02:48 AM PDT
andrewjg, Thanks for the link. I must have a special place in my heart for useless information such as boiling the ocean and consuming 1000 moons, because I will definitely try to find a place to use this "useless" knowledge somehow.
bornagain77
May 5, 2010 at 02:33 AM PDT
@bornagain I heard the "boiling the oceans" line when watching a presentation by Bill Moore, one of the primary architects behind ZFS. But I found some interesting but useless "facts" regarding ZFS's theoretical storage capacity. http://blogs.sun.com/dcb/entry/zfs_boils_the_ocean_consumes
andrewjg
May 4, 2010 at 10:51 PM PDT
Delving further into the archives, it appears zetta, as (1000)^7 or 10^21, was established by the 19th Conférence Générale des Poids et Mesures (CGPM). The BIPM also established the prefix for 10^24 as yotta (Y), i.e. 1,000,000,000,000,000,000,000,000. See NIST on SI prefixes.
DLH
May 4, 2010 at 09:15 PM PDT
andrewjg, I love the boil the ocean quote with 2^128; 3.4 x 10^38 bits, but I can't seem to find the maximum speed at which it can operate for data transfer. Just for fun, there are 4.3 x 10^46 molecules of water in the ocean: http://www.answerbag.com/q_view/220062
bornagain77
May 4, 2010 at 07:30 PM PDT
Great post! Point of fact, however, the highest unit of data so far is the yottabyte, which is 10^24 bytes. Additionally, the highest link connection in use today is 100 Gigabytes per second. However, as big as those numbers are, the biggest problem posed by large amounts of data is the processing power required to actually do anything with it, like error correction, sorting, and looking up the data you need. It's only very recently that the terabyte sort was extended to the 100 terabyte sort, with the current winner able to crunch 100 terabytes in 173 minutes (which is blazingly fast). I can only imagine the processing power it would take to crunch 430 zettabytes (or 0.43 yottabytes) per day.
kai5263499
May 4, 2010 at 07:18 PM PDT
andrewjg, Thanks for the info.

Solaris ZFS — A better, safer way to manage your data: "Mind-boggling scalability: 128-bit file system, 16 billion billion times the capacity of 32- or 64-bit file systems".

The future-proof file system: Solaris ZFS offers a dramatic advance in data management with an innovative approach to data integrity, near zero administration, and a welcome integration of file system and volume management capabilities. The centerpiece of this new architecture is the concept of a virtual storage pool which decouples the file system from physical storage in the same way that virtual memory abstracts the address space from physical memory, allowing for much more efficient use of storage devices. In Solaris ZFS, space is shared dynamically between multiple file systems from a single storage pool, and is parceled out of the pool as file systems request it. Physical storage can be added to or removed from storage pools dynamically, without interrupting services, providing new levels of flexibility, availability, and performance. And in terms of scalability, Solaris ZFS is a 128-bit file system. Its theoretical limits are truly mind-boggling — 2^128 bytes of storage, and 2^64 for everything else such as file systems, snapshots, directory entries, devices, and more. To make it possible to recover corrupted data, ZFS implements RAID-Z, an implementation of RAID designed for ZFS. RAID-Z improves upon previous RAID schemes with parity or even double-parity, striping, and atomic operations. These features make ZFS ideally suited for managing industry standard storage servers like the Sun Fire 4500.
DLH
May 4, 2010 at 07:01 PM PDT
andrewjg, just to clarify my last post: Samuel Braunstein was talking about actually changing a human being into pure transcendent information so as to teleport them to another physical spot. The 10^32 bits roughly represents only the amount of information needed to "decode" the teleportation event, and not the actual amount of information involved in the teleportation. The actual amount of information involved would be "infinite specified information" for each quantum wave/particle teleported.

Also of note: the 10^32 "decoding" bits must travel by speed of light considerations so as to maintain the No-communication theorem http://en.wikipedia.org/wiki/No-communication_theorem , which simply means it is impossible for two "3-D material" entities in this universe to communicate faster than the speed of light, i.e. this decoding speed-of-light "caveat" is the "price" we pay for living in the ordered structure of this space-time continuum.
bornagain77
May 4, 2010 at 06:33 PM PDT
andrewjg, not to argue, in fact I agree with much of what you said, but Samuel Braunstein was talking about actually physically teleporting a human being to another spot entirely, not just representing a human as information, and was also talking of the best fiber optic conceivable to the limits of physics.

Professor Samuel L. Braunstein, Computer Science, University of York
Journal: Quantum Information & Computation
Books: Quantum Computing; Scalable Quantum Computers; Quantum Information with Continuous Variables
http://www-users.cs.york.ac.uk/schmuel/
bornagain77
May 4, 2010 at 04:18 PM PDT
@bornagain77 A couple of observations relating to your post follow. But I am not trying to argue it. Just thought you may be interested.

1. Sun Microsystems already defined what a zettabyte is. When they developed their 128-bit file system they invented the term. A 128-bit file system can reference 3.4 x 10^38 bytes, so it can actually reference a lot more than a zettabyte. They wanted to make their new file system absolutely future proof. They calculated that to flip each bit on such a system would require enough energy to boil all the water on the planet.

2. Data can be transmitted much faster in 2010 than in 1995. Although I imagine it would still take a long time.

3. The imagery is likely highly compressed; I doubt it is stored in an uncompressed format, so actually it is much worse than you imply. I would imagine the data on the internet is made up of text, imagery, video and audio. Now again I imagine that video is likely the bulk of all data on the internet. I believe video on average has a much higher compression ratio than imagery, so that means the information content would be understated. But having said that, I am very certain that there is a lot of duplication on the internet. So I think that would then mean the information content on the internet is vastly overstated. Maybe by as much as an order of magnitude.

4. But since imagery, i.e. the colour spectrum, is continuous, there are actually an infinite number of colours. It is likely that the imagery referenced is either 16 or 24 bit, which will have an impact on storage size.

5. Bitmap-based imagery is likely a very poor information store for the human body. Just as vector-based imagery is far more suitable than bitmaps, there could actually be better ways of storing information about biological life that can be used to generate accurate imagery. In fact you could probably just use vectors rather than bitmaps and store the data on a personal computer - although your computer would die a horrible death trying to render it.
andrewjg
May 4, 2010 at 02:52 PM PDT
DLH, that reminds me of this article:

A fun talk on the teleportation (of a single human) - Braunstein
Excerpt: Just how much information are we talking about anyway? Well the visible human project by the American National Institute of Health requires about 10 Gigabytes (that's about 10^11 = 100,000,000,000 "bits," or yes/no answers, this is about ten CD ROMs) to give the full three dimensional details of a human down to one millimeter resolution in each direction. If we forget about recognizing atoms and measuring their velocities and just scale that to a resolution of one-atomic length in each direction that's about 10^32 bits (a one followed by thirty two zeros). This is so much information that even with the best optical fibers conceivable it would take over one hundred million centuries to transmit all that information! It would be easier to walk! If we packed all that information into CD ROMs it would fit into a cube almost 1000 kilometers on a side!
http://www.research.ibm.com/quantuminfo/teleportation/braunstein.html

Thus it seems the entire digital output of the entire world doesn't even equal the base level measure of location information in one human body.
bornagain77
May 4, 2010 at 11:21 AM PDT
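
A rough check of the transmission-time claim in the excerpt above, as a sketch; the fiber rate below is an illustrative assumption, not a figure from Braunstein's article.

```python
# Transmitting ~1e32 bits (the atomic-resolution human-body figure quoted above).
BITS_TO_SEND = 1e32
FIBER_RATE_BPS = 3e14                   # assumed ~300 Tbit/s "best conceivable" fiber (illustrative)
SECONDS_PER_YEAR = 365.25 * 86_400

seconds = BITS_TO_SEND / FIBER_RATE_BPS
centuries = seconds / SECONDS_PER_YEAR / 100
print(centuries)                        # ~1.1e8, i.e. on the order of one hundred million centuries
```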
