Very important question. I believe information is a perceivable or measurable difference between the states of a thing at two different time instants, or between the states of two things. A state of a thing exists only because it can be distinguished from another state; that is, the state exists only because it can encode information relative to another state.
How fast can information travel? Just as fast as one can distinguish one state from another. This assumes that there exists an agent capable of distinguishing between the states. How fast can that be done? It doesn't start at one state and end at another; it is instantaneous recognition of the difference. The information is created the moment the difference in states is recognized. There is no start or end. It is an event, and events do not have speed.
I'm not sure this answers the question. The Wikipedia page is about the entropy of a probability distribution. But the information speed limit is supposed to apply even if everything is totally deterministic.
If I write a single, 100%-certain message and put it in a spaceship, it still cannot go faster than light, even though there is no information transfer (the entropy of my message, a constant random variable, is 0).
(I'm not saying you are wrong, I am asking to be corrected)
I agree that there is "information" in the colloquial sense there, or even in the Kolmogorov sense. I don't understand how there is information in the entropy sense, because I do not see a random variable anywhere in this story.
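To make the entropy-sense point concrete: Shannon entropy is a property of a probability distribution, and a "constant random variable" (a single certain outcome) has zero of it, while a fair coin has one bit. A minimal sketch (the function name is mine):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A 100%-certain message: one outcome with probability 1.
print(shannon_entropy([1.0]))        # 0.0 -- no uncertainty, nothing to learn

# A fair coin flip: maximum uncertainty over two outcomes.
print(shannon_entropy([0.5, 0.5]))   # 1.0 bit
```

So in the entropy sense the spaceship story indeed has no random variable and no information, which is exactly the puzzle being raised here.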
I believe we are talking about Shannon's direct analogy between information and entropy here. The low probability of the atoms of the spaceship being arranged as they are - a specific design of a spaceship - out of all their possible arrangements is a state of low entropy and high information content.
Entropy isn't necessarily random. I think a better description of entropy might be unimportant states. For the rocket, we might care about the total mass and we might even care about the temperature of those bolts. Those parameters represent information.
But there's even more information in the rocket, specifically the momentum and position of each of the atoms within the bolts. Those quantities for each atom are measurable/knowable and represent information.
I don't know, something about this feels more like an abstraction than a real thing. If you say "the speed of light is the speed limit for information" that doesn't feel like you're saying something about the way the universe works directly.
Pretty much the entirety of modern physics is this type of abstraction (https://en.wikipedia.org/wiki/The_Unreasonable_Effectiveness...). You are just facing the discomfort felt when your intuition clashes with the tools usually required to make reliable predictions and explanations about the universe.
Consider: how is your current discomfort with this statement any different from the discomfort felt through the centuries when ideas like fields instead of forces, entropy, phase spaces, quantum amplitudes / complex numbers, matrix mechanics, spacetime, curvature, etc. were being introduced? They are all abstractions that end up being more reliable than our intuition.
Whether this "abstraction" is a "real thing" is a question for the philosophers. For me there is no difference between the two.
Sure, it's a bit different (especially if you're concerned with how it makes you feel) than, say, describing tectonic plates or the acceleration of a falling body due to Earth's gravity, but that doesn't make it not a real thing.
Quantum spin is one that gets me, or the uncertainty principle. It makes me very uneasy, but whether or not I'm comfortable is irrelevant. Those are to the best of our knowledge actual features of the universe.
Abstractions aren't really in the language of the pure sciences. Analogies, metaphors, etc. can all serve to help explain, but the speed of the propagation of information in this universe is very much defined as the speed of light in a vacuum, in a completely literal sense. There's no abstraction here. Maybe some confusion about what we mean by 'information', but I'm sure there are better resources if you want an afternoon rabbit hole.
“entropy is an anthropomorphic concept, not only in the well-known statistical sense that it measures the extent of human ignorance as to the microstate. Even at the purely phenomenological level, entropy is an anthropomorphic concept. For it is a property, not of the physical system, but of the particular experiments you or I choose to perform on it.”
That appears to be an increasingly common perspective, summed up in the phrase "it from bit". It seems to owe its popularity to this essay by John Wheeler from 1989 [1]. I'm not really familiar with the topic so I can't say more than that.
I struggle with understanding just what a physical thing is? If a particle is an excitation of a quantum field, then what is a quantum field? Is it anything more than just information?
Isn't it simple? Information is the content of a message from one intelligent being to another such that the recipient can understand the intended meaning of the message.
Random bits can randomly fall into a configuration that looks like information. But that is not information, because there is no intended message behind it. Whereas if an intelligent being organizes a set of bits into a configuration - say, a zip file, which looks like random bits - it is information, because it can be decoded into meaning.
That's just not true. Whether the message is from an "intelligent" source or not, it can still contain information. When you make an observation of a random process, you decrease your uncertainty about its state, or in other words, you lower its entropy. In the process, information has been communicated to you.
Are you saying that the act of observation decreases entropy of the observed system?
Or are you saying that entropy is fully in the "eye of the beholder"? If I observe something and you observe the same thing but longer, you reduce the entropy further (for yourself) than I do?