Especially to conservation of information theory:
This brings us to a more general result known as the conservation of information. Design theorists William Dembski and Robert J. Marks defined the law of conservation of information in their 2009 paper “Conservation of Information in Search” and then proved the result in their follow-on 2010 paper “The Search for a Search”. The conservation of information (COI) says the expected active information produced by any combination of random and deterministic processes is guaranteed to be zero or less. Active information is itself the difference between two probability distributions.
We can see the conservation of information is a generalization of Bell’s no-go theorem in quantum mechanics. It contrasts two probability distributions, then takes the expectation to get a hard limit. Finally, we measure whether this limit is met by averaging a large number of physical measurements.
Eric Holloway, “Why is Bell’s Theorem Important for Conservation of Information?” at Mind Matters News
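The quoted passage above can be made concrete with a small sketch. In Dembski and Marks’s framework, active information is the difference between the endogenous information of a blind search and the exogenous information of an assisted search. The numbers below are purely illustrative assumptions, not figures from their papers:

```python
import math

# Active information (Dembski & Marks) compares a baseline (blind) search
# with an assisted search for the same target:
#   I_omega = -log2(p)   endogenous information (difficulty of the blind search)
#   I_s     = -log2(q)   exogenous information (difficulty of the assisted search)
#   I_plus  = I_omega - I_s = log2(q / p)   active information supplied by the assist

def active_information(p: float, q: float) -> float:
    """Active information in bits, given baseline success probability p
    and assisted-search success probability q."""
    return math.log2(q / p)

# Hypothetical numbers: a blind search finds the target with probability
# 1/1024 (10 bits), an assisted search with probability 1/8 (3 bits).
p = 1 / 1024
q = 1 / 8
print(active_information(p, q))  # 7.0 bits of active information
```

COI then says that, averaged over all the ways an assist could be chosen without prior knowledge of the target, the expected value of this quantity cannot exceed zero: an assist that helps on one target must hurt on others.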
Further reading on information theory:
But is determinism true? Does science show that we are fated to want whatever we want? (Michael Egnor)
At the movies: can AI restore blurred images? Working with pixels, we can do remarkable things—as long as we are not asking for magic (Robert J. Marks)
Why information theory is like a good run. Information theory can help us understand a wide range of fields besides computers. (Eric Holloway)
and
COVID-19: When 900 bytes shut down the world. A great physicist warned us, information precedes matter and energy: Bit before it.