Cosmology, Information, Intelligent Design, Physics

At Mind Matters News: Does information have mass? An experimental physicist weighs in


Physicist Melvin Vopson argues that information has mass; Eric Holloway replies that, if so, the mass must come from outside the universe.

Meanwhile, our physics color commentator Rob Sheldon offers,

For numbers less than 4, geometrical series win, but for 4 and greater the exponential series win out.

So as soon as we are talking about real world information, the more information we have, the less the bits weigh, until at very large amounts of information they weigh almost nothing.

The only way Vopson could be vindicated is if we store all that information serially on magnetic tape. But if the bits are not indistinguishable serial QM bits, then in the infinite limit they weigh nothing at all.

News, “Does information have mass? An experimental physicist weighs in” at Mind Matters News

(You need to visit the OP to see the mathematical demo.)

Takehome: Rob Sheldon notes that the more real-world information we have, the less the bits weigh until, at very large amounts of information, they weigh almost nothing.


Here are Eric Holloway’s reflections on whether information has mass:

Does information weigh something after all? What if it does? At the rate we create information today, one physicist computes that in 350 years, the energy will outweigh the atoms of Earth. Vopson’s idea that creating information also creates mass and energy is fascinating — and it promises even bigger mysteries than the ones we address now.

9 Replies to “At Mind Matters News: Does information have mass? An experimental physicist weighs in”

  1.
    JVL says:

    Too bad Rob Sheldon got exponential series wrong. Kind of undercuts his argument if he doesn’t understand the mathematical terms.

    Leaving that aside, using his example of the factorial, what does it mean to have a series with a term of 4 or greater?

    Me thinks he just made something up.

  2.
    Querius says:

    It’s currently believed that information is conserved, neither created nor destroyed. But I think the problem is that “Shannon Information” is not really information, but a measure of data compressibility.

    To make my point, please tell me whether each of the following has a greater, equal, or lesser amount of information:

    a. 21
    b. XXI
    c. Twenty one
    d. 10101
    e. IIIIIIIIIIIIIIIIIIIII

    Second question. Modern encrypted communication is ideally indistinguishable from noise. Is the information contained in an encrypted communication equal in information content to the same number of bits of noise?

    Third question. The word NOISE and the letters ESION and SINEO are different only by arrangement. The difference in information is only your choice in the order in which you read the letters. So, does choice also involve a change in mass?

    -Q

  3.
    Seversky says:

    Querius/2

    It’s currently believed that information is conserved, neither created nor destroyed.

    If that is the case then the Universe – or, at least, the information that describes it – must have always existed. It is eternal and there is no need to posit a designer or creator to account for it.

  4.
    jerry says:

    It is eternal

    You just made an airtight case for a creator.

    If something is eternal, explain why everything that is possible has not existed, or why it doesn’t exist now. Because such things don’t exist, the claim of eternity is negated.

    People like to spout things without understanding the implications of what they spout. Eternal means literally everything possible.

    In this case someone just proved the opposite of what they thought they proved.

  5.
    EDTA says:

    Querius,
    >Second question. Modern encrypted communication is ideally indistinguishable from noise. Is the information contained in an encrypted communication equal in information content to the same number of bits of noise?

    As with all of this discussion, it depends on how information is being measured. Are we talking about the bits that have to be transmitted while encrypted, or the amount of information the recipient learns after decrypting the message? If I’m the channel over which the encrypted message is sent, and I don’t have the decryption key, then yes, the amount of information is the number of bits I have to transport. If I’m the recipient, the message may effectively contain any number of bits from 0 up to the encrypted number of bits.

    Are you trying to point to a contradiction or paradox in the above?

  6.
    Bob O'H says:

    Leaving that aside, using his example of the factorial what does it mean to have a series with a term 4 or greater?

    I think his point was that 2^x > x! for x < 4, but x! > 2^x for x ≥ 4. Quite what this has to do with information is beyond me, but I’m not a colour physicist.

    BTW, in case anyone is interested, the comment about factorials being an exponential series is just weird. In fact, factorials increase more quickly than exponentially.
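The crossover Bob O’H describes is easy to check numerically. A short Python sketch comparing 2^n with n! term by term:

```python
from math import factorial

# Compare the exponential 2^n with the factorial n! term by term.
# For n = 1..3 the exponential is larger; from n = 4 on, n! overtakes it.
for n in range(1, 7):
    winner = "2^n" if 2**n > factorial(n) else "n!"
    print(f"n={n}: 2^n={2**n:3}  n!={factorial(n):4}  larger: {winner}")
```

The factorial wins from n = 4 onward, and the gap widens without bound, which is the sense in which factorials grow faster than any exponential.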

  7.
    Querius says:

    Jerry @4,

    You just made an airtight case for a creator.

    LOL. And with his own fingers . . . But the problem of information, existence, and design persists while scientists struggle to explain the history of entropy and why there’s something rather than nothing.

    EDTA @5,
    My point is that “Shannon information” is a misnomer that confuses data with information and is relevant primarily to data compression and transfer. As in my examples, information can be encoded or encrypted with differing efficiency. The Biblical man, Job, wished that his words could be chiseled into stone and filled with lead (Pb). We certainly can compute the caloric energy required to accomplish this task, but the same information survived on parchment and ink.

    The word NOISE and the letters ESION and SINEO are different only by arrangement. The difference in information is only your choice in the order in which you read the letters. So, does choice also involve a change in mass?

    One can choose to read the three sets of letters above right to left, left to right, or in a different order. Choice is involved in all three instances, but meaningful information is conveyed by only one of the arrangements.

    -Q

  8.
    EDTA says:

    Querius,

    Right. I think we were saying the same thing in two different ways.

    In regards to the post, and also Landauer’s principle (https://en.wikipedia.org/wiki/Landauer%27s_principle), it seems clear to me anyway (and I think you are saying this also) that the amount of energy required to erase a bit of data depends entirely on how it is encoded in matter (or energy). If Landauer was trying to specify a minimum, then it’s not clear to me (even after reading the Wiki article on L’s Prin.) how that bit is supposedly encoded. The marrying of the thermodynamic and info-theoretic definitions of “entropy” doesn’t specify that. If you or anyone else here can shed light on that question, it would be appreciated.
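For reference, the Landauer bound itself is stated independently of encoding: erasing one bit dissipates at least k_B · T · ln 2 of heat, a function of temperature alone. A minimal sketch of the arithmetic, assuming room temperature:

```python
import math

# Landauer's principle: erasing one bit dissipates at least k_B * T * ln(2)
# of heat, regardless of how the bit is physically encoded.
K_B = 1.380649e-23  # Boltzmann constant, J/K (exact, 2019 SI)
T = 300.0           # assumed room temperature, K

e_min = K_B * T * math.log(2)
print(f"Landauer limit at {T} K: {e_min:.3e} J per bit")  # ~2.87e-21 J
```

Real devices dissipate many orders of magnitude more than this, so the encoding-dependence EDTA asks about shows up in that engineering overhead rather than in the bound itself.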

  9.
    Querius says:

    EDTA,
    The first problem is

    Data is not information, information is not knowledge, knowledge is not understanding, understanding is not wisdom. – Clifford Stoll

    So, we’re talking about data–specifically quantized data, since any measurement contains a huge amount of data, down to Planck distances. I agree that the energy required to encode one binary digit depends on the medium–carving in stone versus charge on a RAM chip for example. Let’s consider computer memory.

    Maintaining RAM takes energy to periodically reread and rewrite it, so erasing a bit (allowing it to decay randomly to zero) takes less energy than maintaining it. https://www.techopedia.com/definition/2805/memory-refresh

    Reading a location in RAM also requires energy as does rewriting it. Thus, the question changes to what is the minimum amount of energy required to change anything in our universe. This takes us down to quantum levels.

    Or, it might also involve detection (or memory) and choice. For example, let’s consider a binary representation of Pi. One can locate any binary value or string of values within a certain distance along Pi. In that case, no changes or “writing” is needed, merely choice. The first digits of Pi in Binary-Coded Decimal following the 3 are 1415, or 1 100 1 101. If I’m looking for a 1, I’ll hit it on the first bit; if a 0, I’ll hit it on the third bit.
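That example can be checked mechanically. The sketch below concatenates the binary forms of the digits 1, 4, 1, 5 with leading zeros dropped, as in the comment (so it is not strict fixed-width BCD), and locates the first 1 and the first 0:

```python
# Encode the digits 1, 4, 1, 5 of Pi in binary, leading zeros dropped,
# then find the first positions of a 1 and a 0 in the concatenated string.
digits = [1, 4, 1, 5]
bits = "".join(format(d, "b") for d in digits)  # "1" + "100" + "1" + "101"
print(bits)                                     # 11001101
print("first 1 at bit", bits.index("1") + 1)    # bit 1
print("first 0 at bit", bits.index("0") + 1)    # bit 3
```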

    So, no. I don’t know whether there’s a minimum energy cost for 1 bit of information, nor do I know whether consciousness, whatever that is, also needs to be involved.

    Personally, I don’t think the attempt at materializing data (much less information) will yield any benefit.

    -Q
