An Information-Theoretic Formalism for Multiscale Structure in Complex Systems (Open access)
Here’s a writeup in Science News:
Many attempts at describing complexity mathematically have been made. Some have been moderately successful. Network math, for example, illuminates some systems by identifying links between the system’s parts (followers on Twitter, for instance, or being in a movie with Kevin Bacon). Cellular automata, shifting grids of white and black pixels, can reproduce many complex patterns found in nature by growing according to simple rules. Even the standard equations of calculus are useful in certain complex contexts. Statistical physics, for instance, applies standard math to averages of behavior in complex systems. Often this approach provides excellent insights, such as into how the changing relationships between parts of a material can suddenly alter its properties, say when a metal switches from magnetic to nonmagnetic or vice versa (an example of a phase transition).
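(A quick illustration from us, not from the Science News writeup: the cellular automata mentioned above really are that simple. The sketch below implements Wolfram's elementary Rule 30 in Python, a rule we picked purely for illustration, and prints the intricate triangle it grows from a single black cell.)

```python
# Illustrative sketch (ours, not from the paper): an elementary cellular
# automaton, Wolfram's Rule 30, showing how a simple local update rule
# generates complex-looking patterns from a single seeded cell.

RULE = 30  # the rule is 8 bits: neighborhood (left, center, right) -> new cell

def step(row):
    """Apply the rule to every cell, using fixed (zero) boundaries."""
    padded = [0] + row + [0]
    return [
        (RULE >> (padded[i - 1] * 4 + padded[i] * 2 + padded[i + 1])) & 1
        for i in range(1, len(padded) - 1)
    ]

width, steps = 31, 15
row = [0] * width
row[width // 2] = 1  # single black cell in the middle

for _ in range(steps):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```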
But these approaches don’t always work; they’re useful for some systems but not others. And all methods so far suffer from an inability to quantify both complexity and structure meaningfully. By most measures, a box containing gas molecules bouncing around at random would be highly complex, yet such a system totally lacks structure. A crystal, on the other hand, has a rigorous, well-defined structure, but is utterly simple in the sense that if you know the locations of a few atoms, you know the arrangement of the entire object.
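To put numbers on that gas-versus-crystal contrast, here is a toy sketch of our own (not anything from the paper): it estimates per-symbol Shannon entropy from block frequencies and shows the measure is near its maximum for a structureless random string (the “gas”) and exactly zero for a perfectly periodic one (the “crystal”). Entropy alone, in other words, rates randomness as maximally “complex” while giving structure no credit at all.

```python
# Toy illustration (ours, not from the paper) of the gas-vs-crystal point:
# per-symbol Shannon entropy is maximal for a structureless random string
# (the "gas") and zero for a perfectly ordered one (the "crystal"),
# so entropy alone conflates randomness with complexity.

import math
import random
from collections import Counter

def entropy_per_symbol(seq, block=4):
    """Estimate Shannon entropy per symbol (in bits) from non-overlapping block frequencies."""
    blocks = [tuple(seq[i:i + block]) for i in range(0, len(seq) - block + 1, block)]
    counts = Counter(blocks)
    total = sum(counts.values())
    h_block = -sum((n / total) * math.log2(n / total) for n in counts.values())
    return h_block / block

random.seed(0)
gas = [random.randint(0, 1) for _ in range(4000)]   # random "gas"
crystal = [i % 2 for i in range(4000)]              # periodic "crystal"

print(f"gas:     {entropy_per_symbol(gas):.3f} bits/symbol")      # ~1.0 (maximal)
print(f"crystal: {entropy_per_symbol(crystal):.3f} bits/symbol")  # 0.0
```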
Here’s the abstract:
We develop a general formalism for representing and understanding structure in complex systems. In our view, structure is the totality of relationships among a system’s components, and these relationships can be quantified using information theory. In the interest of flexibility we allow information to be quantified using any function, including Shannon entropy and Kolmogorov complexity, that satisfies certain fundamental axioms. Using these axioms, we formalize the notion of a dependency among components, and show how a system’s structure is revealed in the amount of information assigned to each dependency. We explore quantitative indices that summarize system structure, providing a new formal basis for the complexity profile and introducing a new index, the “marginal utility of information”. Using simple examples, we show how these indices capture intuitive ideas about structure in a quantitative way. Our formalism also sheds light on a longstanding mystery: that the mutual information of three or more variables can be negative. We discuss applications to complex networks, gene regulation, the kinetic theory of fluids and multiscale cybernetic thermodynamics.
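On that “longstanding mystery”: the textbook case is two independent fair bits X and Y together with Z = X XOR Y. Every pair of these variables is independent, yet any one of them is fully determined by the other two, and the three-way mutual information comes out to I(X;Y;Z) = I(X;Y) - I(X;Y|Z) = 0 - 1 = -1 bit. The short check below (ours, not code from the paper) verifies this from the inclusion-exclusion formula for three variables.

```python
# A standard example (not taken from the paper) of the phenomenon the abstract
# mentions: the three-way mutual information I(X;Y;Z) is -1 bit when X and Y
# are independent fair bits and Z = X XOR Y.

import math
from collections import Counter
from itertools import product

# Joint distribution over (x, y, z): uniform over the four XOR-consistent triples.
joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

def H(variables):
    """Shannon entropy (bits) of the marginal over the given variable indices."""
    marg = Counter()
    for outcome, p in joint.items():
        marg[tuple(outcome[i] for i in variables)] += p
    return -sum(p * math.log2(p) for p in marg.values() if p > 0)

# Inclusion-exclusion: I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(X,Y)-H(X,Z)-H(Y,Z) + H(X,Y,Z)
i_xyz = (H([0]) + H([1]) + H([2])
         - H([0, 1]) - H([0, 2]) - H([1, 2])
         + H([0, 1, 2]))
print(f"I(X;Y;Z) = {i_xyz:+.1f} bits")  # -1.0: every pair is independent,
                                        # yet each bit is fixed by the other two
```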
Follow UD News at Twitter!