There is a current wave of attempts in and around UD to cloud, strawmannise, obfuscate, twist into pretzels and dismiss the observed (and measurable) phenomenon, functionally specific, complex organisation and/or associated information, FSCO/I.
Accordingly, let us first note the root of the concept in the work of leading OOL — origin of life — researchers in the 1970s:
. . . In brief, living organisms are distinguished by their specified complexity. Crystals are usually taken as the prototypes of simple well-specified structures, because they consist of a very large number of identical molecules packed together in a uniform way. Lumps of granite or random mixtures of polymers are examples of structures that are complex but not specified. The crystals fail to qualify as living because they lack complexity; the mixtures of polymers fail to qualify because they lack specificity. [The Origins of Life (John Wiley, 1973), p. 189.]
Okay, that should be enough to highlight that FSCO/I — actually, a commonplace phenomenon, as familiar as the files on your computer, and as quantifiable as the file sizes you see routinely reported by your operating system — is not a dubious and ill-conceived concept fraudulently foisted on the world by “IDiots” who are “Creationists in cheap tuxedos.”
But a bit more can and should be said, and as a for-record supplement, I here clip and augment a comment just put up in a UD discussion thread:
>>. . . common garden variety files all over the Internet, in MP3 players, cams, phones etc all show examples of what FSCO/I is about.
U/D, Oct 28 — an illustration, ranging from text and image files to tapes and readers . . . paper and ribosomal . . . and protein AA strings:
Where, such familiar items demonstrate beyond reasonable — operative word — doubt that it is observable and readily measurable.
At simplest level, by determining that there is functionally specific info there [dependent on particular config], and counting the chain length of y/n q’s required or used to specify the actual config used from the field of possible configs, i.e. in binary . . . two state . . . digits, aka bits — or, more descriptively, functionally specific bits. In technical communicative contexts, somewhat more elaborate metrics can be used that account for redundancies, e.g. as exploited in lossless data compression.
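In rough code terms, that chain of y/n questions is just a log2 measure of the config space. A minimal Python sketch (the function names are mine, for illustration only):

```python
import math

def config_space_size(chain_length: int, states_per_position: int = 2) -> int:
    """Number of possible configurations for a chain of discrete elements."""
    return states_per_position ** chain_length

def bits_to_specify(chain_length: int, states_per_position: int = 2) -> float:
    """Chain length of y/n questions needed to single out one config from the space."""
    return chain_length * math.log2(states_per_position)

# A 500-element, two-state (H/T coin) chain:
print(config_space_size(500))  # 2**500, about 3.27*10^150 possibilities
print(bits_to_specify(500))    # 500.0 bits
```

For a two-state chain the bit count equals the chain length; the same functions cover, say, 4-state DNA bases or 20-state amino acid positions by changing `states_per_position`.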
a: as highly contingent things can be accounted for on blind chance or intelligently directed configuration (aka design — and to address notorious tendencies to twist language, INTELLIGENT design for emphasis),
b: it makes sense to identify a needle in haystack threshold beyond which,
c: for functionally specific — thus highly constrained by the need to have many, well-matched, correctly arranged and coupled parts or facets etc — organisation of at least that complexity,
d: it is maximally implausible that such comes about by blind chance and/or mechanical necessity. Where,
e: a useful such threshold for our solar system of ~ 10^57 atoms, is 500 bits, implying a config space of 3.27*10^150 possibilities.
f: So, we may measure complexity in bits, using standard info metrics, I, thus
g: we may also observe functional specificity (most easily by vulnerability to random perturbation), and
h: assign what economists and statisticians call a dummy variable, S that
i: is zero as default (implying not credibly functionally specific), but
j: on observing such specificity can be set to 1. Then
k: by subtracting the threshold level, 500, from the product I*S, we have a design threshold metric:
Chi_500 = I*S – 500, in bits beyond the solar system threshold,
l: in a context where the 10^57 atoms of our solar system, each observing a tray of 500 coins flipped every 10^-14 s, for the 10^17 s that is a generous estimate of sol system lifespan to date, would yield at most 10^88 observations. (This is also a measure of the number of possible chemical level interactions of the atoms of the sol system to date; 10^-14 s is a fast chem rxn time.)
m: Which, set against the 3.27*10^150 possible configs of 500 H/T coins, is comparable to a sample of one straw from a cubical haystack 1,000 light years across — comparably thick as our barred spiral galaxy at its central bulge. So also
n: were such a haystack superposed on our galactic neighbourhood, and were such a sample taken blindly, by chance and/or blind mechanical necessity, with practical certainty, it would pick up only straw.
o: For, while a star may be ~ 1/2 million miles across, stellar separations are on the order of several light years. (E.g. our sun’s nearest stellar neighbour is ~4 LY away.)
p: Thus, any blind search of a config space of 500 bits on the gamut of the sol system, is maximally unlikely to detect FSCO/I.
U/D Oct 31: this boils down to saying that the solar system can be turned into 10^57 interacting real world Monte Carlo sims that explore the config space. (I often use the idea of giving each atom a tray of 500 coins that are flipped and observed every 10^-14 s for 10^17 s, which turns out to explore about one straw’s worth of a cubical haystack 1,000 light years across, relative to the possibilities for 500 bits.) In block diagram form, here is what such a dynamic-statistical-informational model would look like for our solar system as a search engine; there would be 10^57 of them, coupled as appropriate:
q: Where, too, FSCO/I can come in the form of digitally . . . discrete state . . . coded strings, or in specific configs such as we often see in engineering diagrams, particularly exploded views that show us nodes and arcs networks.
r: Such a network or diagram is now often rendered in AutoCAD or similar software, reducing it to coded strings of bits, i.e. Y/N questions chained. That is,
s: discussion on FSCO/I rich strings is without loss of generality, WLOG.
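The Chi_500 metric of steps f through k can be sketched as a small Python function (the names, and the 7-bits-per-ASCII-character example, are mine for illustration):

```python
SOLAR_SYSTEM_THRESHOLD_BITS = 500

def chi_500(info_bits: float, functionally_specific: bool) -> float:
    """Chi_500 = I*S - 500, in bits beyond the solar-system threshold.

    info_bits: the standard info metric I, in functionally specific bits.
    functionally_specific: the observation that sets the dummy variable S.
    """
    s = 1 if functionally_specific else 0  # dummy variable S, default 0
    return info_bits * s - SOLAR_SYSTEM_THRESHOLD_BITS

# A functionally specific 143-character ASCII string at 7 bits per character:
print(chi_500(143 * 7, True))   # 501.0 bits beyond the threshold
# The same complexity without observed functional specificity:
print(chi_500(143 * 7, False))  # -500: does not pass the design threshold
```

A positive Chi_500 flags a config that is both beyond the 500-bit complexity bound and observed to be functionally specific; either condition failing leaves the metric at or below zero.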
Thus, we have in hand an observed phenomenon, FSCO/I.
It is quantifiable and is relevant to proposals to create FSCO/I rich entities by blind chance and mechanical necessity. On the gamut of our sol system, 500 bits is a threshold beyond which that is maximally implausible. For the observed cosmos as a whole, 1,000 bits is an even more generous limit. Or, in more familiar textual terms, 72 or 143 ASCII characters respectively — roughly 10 or 20 typical English words. ASCII characters are also relevant to computer code, which is often written as raw text files.
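The arithmetic behind these thresholds can be checked directly; a Python sketch (assuming 7 bits per ASCII character, as above):

```python
import math

# Threshold complexity expressed in 7-bit ASCII characters:
for threshold_bits in (500, 1000):
    chars = math.ceil(threshold_bits / 7)
    print(threshold_bits, "bits ~", chars, "ASCII characters")
# 500 bits ~ 72 ASCII characters; 1000 bits ~ 143 ASCII characters

# Upper bound on solar-system observations vs. the 500-bit config space:
observations = 10**57 * 10**14 * 10**17  # atoms x flips per second x seconds = 1e88
configs = 2**500                         # ~3.27*10^150 possible coin-tray states
print(f"fraction of space sampled: {observations / configs:.2e}")  # ~3.06e-63
```

That sampled fraction is the "one straw to a haystack" ratio of the argument above, made explicit.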
On trillions of known cases — start with the files all over the Internet — FSCO/I is empirically reliable as an index of design; on long experience, attempts to provide claimed counter-examples reduce to design in the background.
So, design thinkers are entitled to hold, on best-current-explanation inductive grounds, that FSCO/I is a highly reliable index of design.
Of course cell based life, the dozens of observed body plans and the underlying physics of the observed cosmos are all chock full of FSCO/I.
And thereby hangs a huge debate, as that points like a spear, straight at the heart of a major worldview narrative, evolutionary materialism that purports to explain everything from hydrogen to humans on blind chance and mechanical necessity. >>
Okay, that should be enough for a for-record comment.
This is FYI/FTR, so discussion will be undertaken in threads, not below. END