An Information-Theoretic Formalism for Multiscale Structure in Complex Systems (Open access)

Here’s a write-up in *Science News*:

Many attempts at describing complexity mathematically have been made. Some have been moderately successful. Network math, for example, illuminates some systems by identifying links between the system’s parts (followers on Twitter, for instance, or being in a movie with Kevin Bacon). Cellular automata, shifting grids of white and black pixels, can reproduce many complex patterns found in nature by growing according to simple rules. Even the standard equations of calculus are useful in certain complex contexts. Statistical physics, for instance, applies standard math to averages of behavior in complex systems. Often this approach provides excellent insights, such as into how the changing relationships between parts of a material can suddenly alter its properties, say when a metal switches from magnetic to nonmagnetic or vice versa (an example of a phase transition).
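To make "growing according to simple rules" concrete, here is a minimal sketch of an elementary cellular automaton. Rule 110 is a standard textbook example (chosen here for illustration; it is not taken from the paper): each cell's next state depends only on itself and its two neighbors, yet the global pattern can be remarkably intricate.

```python
def step(cells, rule=110):
    """Advance one generation: each cell's new state depends only on
    itself and its two neighbors (wrapping around at the edges)."""
    n = len(cells)
    nxt = []
    for i in range(n):
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right  # value 0..7
        nxt.append((rule >> neighborhood) & 1)  # look up that bit of the rule number
    return nxt

def run(width=31, generations=15, rule=110):
    """Start from a single live cell and return the full history."""
    cells = [0] * width
    cells[width // 2] = 1
    history = [cells]
    for _ in range(generations):
        cells = step(cells, rule)
        history.append(cells)
    return history

for row in run():
    print("".join("#" if c else "." for c in row))
```

The width, number of generations, and rule number are all arbitrary knobs; swapping in other rule numbers (0 to 255) produces anything from trivial to chaotic behavior from the same three-cell lookup.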

But these approaches don’t always work. They’re useful for some systems but not others. And all methods so far suffer from an inability to quantify both complexity and structure meaningfully. By most measures, a box containing gas molecules bouncing around at random would be highly complex. But such a system totally lacks structure. A crystal, on the other hand, has a rigorous, well-defined structure, but is utterly simple in the sense that if you know the locations of a few atoms, you know the arrangement of the entire object.
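A toy illustration of the gas-versus-crystal point (my sketch, not the paper's formalism): if we estimate the entropy rate of a symbol sequence, i.e. how unpredictable the next symbol is given the current one, a random "gas-like" sequence scores near its maximum while a perfectly periodic "crystal-like" sequence scores near zero, because a few symbols determine all the rest.

```python
import math
import random

def block_entropy(seq, k):
    """Shannon entropy (bits) of the empirical distribution of length-k blocks."""
    counts = {}
    for i in range(len(seq) - k + 1):
        block = tuple(seq[i:i + k])
        counts[block] = counts.get(block, 0) + 1
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def entropy_rate(seq):
    """Estimate H(next symbol | current symbol) = H(pairs) - H(singles)."""
    return block_entropy(seq, 2) - block_entropy(seq, 1)

random.seed(0)
gas = [random.choice("ABCD") for _ in range(10000)]  # disordered, gas-like
crystal = list("ABCD" * 2500)                        # periodic, crystal-like

print(f"gas entropy rate:     {entropy_rate(gas):.3f} bits/symbol")
print(f"crystal entropy rate: {entropy_rate(crystal):.3f} bits/symbol")
```

Both sequences use the same four symbols with the same frequencies, so a naive single-symbol entropy cannot tell them apart; only conditioning on context exposes the crystal's structure. Measures like the one above still conflate "complex" with "random", which is exactly the gap the paper is aiming at.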

Here’s the abstract:

We develop a general formalism for representing and understanding structure in complex systems. In our view, structure is the totality of relationships among a system’s components, and these relationships can be quantified using information theory. In the interest of flexibility we allow information to be quantified using any function, including Shannon entropy and Kolmogorov complexity, that satisfies certain fundamental axioms. Using these axioms, we formalize the notion of a dependency among components, and show how a system’s structure is revealed in the amount of information assigned to each dependency. We explore quantitative indices that summarize system structure, providing a new formal basis for the complexity profile and introducing a new index, the “marginal utility of information”. Using simple examples, we show how these indices capture intuitive ideas about structure in a quantitative way. Our formalism also sheds light on a longstanding mystery: that the mutual information of three or more variables can be negative. We discuss applications to complex networks, gene regulation, the kinetic theory of fluids and multiscale cybernetic thermodynamics.
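The abstract's remark about negative multivariate mutual information can be checked numerically with the standard XOR example (a textbook construction, not code from the paper): take X and Y to be independent fair bits and Z = X XOR Y. Any two of the three variables are independent, yet any two jointly determine the third, and the three-way mutual information comes out to −1 bit.

```python
import itertools
import math

def entropy(dist):
    """Shannon entropy (bits) of a dict mapping outcomes to probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(joint, axes):
    """Marginalize a joint distribution onto the given coordinate axes."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in axes)
        out[key] = out.get(key, 0.0) + p
    return out

# Joint distribution of (X, Y, Z) with X, Y independent fair bits, Z = X XOR Y.
joint = {}
for x, y in itertools.product([0, 1], repeat=2):
    joint[(x, y, x ^ y)] = 0.25

H = lambda axes: entropy(marginal(joint, axes))

# Inclusion-exclusion form of the three-way mutual information:
I3 = (H([0]) + H([1]) + H([2])
      - H([0, 1]) - H([0, 2]) - H([1, 2])
      + H([0, 1, 2]))
print(f"I(X;Y;Z) = {I3:.1f} bits")  # negative: the triple carries synergy
```

Equivalently, I(X;Y;Z) = I(X;Y) − I(X;Y|Z) = 0 − 1 = −1 bit: learning Z creates dependence between X and Y that was absent marginally, which is the "synergy" the formalism sets out to account for.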


The paper, with thanks: http://arxiv.org/pdf/1409.4708v1.pdf

Just remember, folks: if you can’t compute this information metric’s value for any system I choose (and I mean compute it to the nth decimal place), then you have to declare the whole thing bunk. No excuses now. No claiming that this is only initial research that will be refined over time. No excusing it because it might depend on solving the halting problem (which, it appears, it actually can). No claiming that “real” systems are too complex to actually enumerate all the possible states/configurations/whatevers. You must be able to calculate it for any system I point to, and do so quickly and accurately. Otherwise, it must be rejected as some worthless creationist fabrication.

I choose the following: a red blood cell of a rhesus monkey, an Intel Pentium (TM) processor, and the last episode of *Seinfeld*. You have until tomorrow. I will check back in 24 hours to get the values to 6 significant figures. You critics of CSI, you have your work cut out for you, so get to it.