You said I was using tautologies as straw men, which is incoherent and suggests you’re not arguing in good faith.
Anyhow, of course entropy correlates with music quality; maximum entropy music is white noise! I’ve even had luck finding interesting jazz musicians from the distribution of key signatures they use-- anything more entropic than the Real Book is a great indicator. Similarly, network entropy makes it easier to identify musicians with a flexible arsenal of riffs. You could adapt it to chord progressions to find unusual reharmonizations in live jazz to study and practice. It could be a helpful regularizer for neural network music generation. Entropic methods are among the most powerful in statistics.
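To make the key-signature idea concrete, here is a minimal sketch of the comparison I mean (the tallies and the `shannon_entropy` helper are made up for illustration; any corpus of lead sheets or setlists would do):

```python
# Minimal sketch: Shannon entropy of a key-signature distribution,
# compared against a Real Book-style baseline. All counts are hypothetical.
from collections import Counter
from math import log2

def shannon_entropy(counts):
    """Shannon entropy in bits of the empirical distribution behind `counts`."""
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values() if c)

# Hypothetical key-signature tallies, e.g. tallied from lead sheets or setlists.
real_book = Counter({"C": 40, "F": 35, "Bb": 60, "Eb": 45, "G": 20, "D": 10})
candidate = Counter({"C": 5, "F#": 7, "B": 6, "E": 8, "Ab": 9, "Db": 7, "G": 4})

baseline_bits = shannon_entropy(real_book)
candidate_bits = shannon_entropy(candidate)
print(f"baseline:  {baseline_bits:.2f} bits")
print(f"candidate: {candidate_bits:.2f} bits")
if candidate_bits > baseline_bits:
    print("more entropic than the baseline -- worth a closer listen")
```

The same skeleton works for riff or chord-progression distributions: swap the keys of the Counter for whatever vocabulary you're measuring.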
It looks like we've hit a reply depth limit, which is maybe for the best, because I don't think we're making any progress here.
> You did use tautologies...
You seem to think calling something a tautology is a way to dismiss it. Almost everything in mathematics is a tautology-- most of what I say is a tautology. Any rigorous argument is tautological; it's the aspiration of literally all formal reasoning.
> Lots of uninteresting and bad music is also entropic.
And here, you seem to think someone is claiming that entropy is equivalent to music quality, rather than a useful correlate, e.g. an indicator of something more likely to show up in good music than in bad. I don't know of anyone making that claim; all the examples I gave require only a mild correlation.
You did use tautologies; they are right there above, still irrelevant, and thus straw-man arguments in the context of the question of what useful information this specific paper contributes to the corpus of knowledge. The irony of flinging bad-faith accusations and ad hominem while trying to distract from the failure to make a relevant argument isn’t lost on me, though.
As you point out, white noise is more entropic than the Real Book. Lots of uninteresting and bad music is also entropic. Why exactly is that a good indicator? I’m glad you finally have some examples, but this doesn’t demonstrate that entropy is a decent discriminator of anything.