In thermodynamics there are two other formulations of entropy: the Clausius one, in terms of temperature and heat, and the Boltzmann one. The latter defines entropy as the log of the number of microstates compatible with a given macrostate.
The Shannon definition is equivalent to the Boltzmann definition only in the case where the system consists of infinitely many identical subsystems. If there are only finitely many, the log of the microstate count does not reduce to the same "-p log p" form.
The Clausius definition can be derived from the Boltzmann one, but they are nevertheless distinct formulations.
According to Wikipedia, if you start with the Gibbs entropy (which is the same as Shannon entropy), and then assume all microstate probabilities are equal (which Boltzmann does), you get the Boltzmann entropy formula. It also says Boltzmann himself used a p ln(p) formulation.
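To spell that reduction out (a sketch in standard notation, assuming W equally likely microstates and writing k_B for Boltzmann's constant, neither of which is named above):

```latex
% Start from the Gibbs entropy and plug in the uniform distribution p_i = 1/W:
S_\text{Gibbs} = -k_B \sum_{i=1}^{W} p_i \ln p_i
               = -k_B \sum_{i=1}^{W} \frac{1}{W} \ln\frac{1}{W}
               = k_B \ln W = S_\text{Boltzmann}.
```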
So aren't they the same, perhaps up to a constant factor?
If you count the number of microstates for a given macrostate you get a multinomial coefficient, N!/(n_1! n_2! ...).
The log of this is the Boltzmann entropy. If you take N to be very large (or infinite), the Stirling approximation shows that this reduces to the Gibbs/Shannon entropy. So in general, no: they only coincide in that limit.
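Here's a quick numerical sketch of that limit (my own toy example, not from the thread: a hypothetical two-outcome macrostate with proportions 0.3/0.7, entropies in nats, using math.lgamma for the log-factorials):

```python
import math

def multinomial_log(counts):
    """Natural log of the multinomial coefficient N! / (n_1! n_2! ... n_k!)."""
    N = sum(counts)
    return math.lgamma(N + 1) - sum(math.lgamma(n + 1) for n in counts)

def shannon_entropy(probs):
    """Shannon entropy in nats: -sum p ln p."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Hypothetical macrostate: a fraction 0.3 of subsystems in one state, 0.7 in the other.
probs = [0.3, 0.7]

for N in [10, 100, 1000, 100000]:
    counts = [round(p * N) for p in probs]
    per_subsystem = multinomial_log(counts) / N  # (1/N) ln W, Boltzmann entropy per subsystem
    print(f"N={N:>6}: (1/N) ln W = {per_subsystem:.4f}, Shannon = {shannon_entropy(probs):.4f}")
```

The per-subsystem Boltzmann entropy (1/N) ln W approaches -sum p ln p from below as N grows, which is the sense in which the two definitions only agree for very large N.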
In thermodynamics / statistical mechanics there is another formulation of entropy: Gibbs entropy, which is different from Boltzmann entropy (and equivalent to Shannon entropy in information theory).
Sure, but that doesn't mean that the concept of entropy in physics is a different concept than its incarnation in information theory. Just like the concept of energy existed before the development of thermodynamics, but thermodynamic energy is still energy.
If you want to be pedantic about it, sure. The fact remains that a discussion or blog post can be about very different things depending on the context. You're not going to learn anything about the relationship between energy and entropy, for example, if you're talking about information theory. Hence my original comment.