I didn’t argue with your information-free tautologies about tools; I’m pointing out that they are straw men when it comes to using entropy to identify the quality level of music.
> network entropy doesn’t need to identify the quality level of music on its own to be useful.
Okay. Another context-free tautology. So what are we even talking about then, what is your point? You offered above “you could eg. test the vast swaths of potentially overlooked composers to see if any of them merit closer listening.” Are you taking back that suggestion?
Feel free to offer something - anything - more specific on how the entropy can provide useful information about music. What uses are you envisioning? What other metrics in combination with entropy are you thinking of?
What I don’t see in your argument is a single specific reason the paper we’re commenting on has value, or what that value is. You’re suggesting that someone else, doing something else, might someday uncover useful applications, and that maybe those will build on this paper. That could happen. Yet measuring entropy is already a well-known idea, its applications to compression have been thoroughly explored, and we can demonstrate that the entropy of music has no correlation with quality. So the probability of what you suggest actually happening still seems rather low, and this discussion doesn’t seem to be improving the odds.
You said I was using tautologies as straw men, which is incoherent and suggests you’re not arguing in good faith.
Anyhow, of course entropy correlates with music quality; maximum-entropy music is white noise! I’ve even had luck finding interesting jazz musicians from the distribution of key signatures they use: anything more entropic than the Real Book is a great indicator. Similarly, network entropy makes it easier to identify musicians with a flexible arsenal of riffs. You could adapt it to chord progressions to find unusual reharmonizations in live jazz to study and practice. It could be a helpful regularizer for neural-network music generation. Entropic methods are among the most powerful in statistics.
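To make the key-signature idea concrete, here’s a minimal sketch of the comparison I mean. All the counts below are made-up illustrative numbers, not real tallies; `shannon_entropy` is just the standard Shannon formula over an empirical histogram, and a real experiment would count keys across a musician’s actual recorded tunes versus the Real Book.

```python
import math
from collections import Counter

def shannon_entropy(counts):
    """Shannon entropy (in bits) of an empirical distribution given raw counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

# Hypothetical key-signature tallies: a baseline concentrated in the usual
# flat keys, and a musician who spreads material across many more keys.
real_book = Counter({"C": 40, "F": 35, "Bb": 60, "Eb": 55, "G": 20, "Ab": 25})
musician = Counter({"C": 10, "F": 9, "Bb": 11, "Eb": 10, "G": 8,
                    "Ab": 9, "Db": 7, "B": 6, "E": 5, "A": 5})

h_ref = shannon_entropy(real_book.values())
h_mus = shannon_entropy(musician.values())
print(f"baseline: {h_ref:.2f} bits, musician: {h_mus:.2f} bits")
# The musician's flatter distribution over more keys yields higher entropy
# than the baseline, which is the "more entropic than the Real Book" signal.
```

The point of the sketch is only the direction of the comparison: a distribution spread more evenly over more keys has higher entropy, so ranking musicians by this number surfaces the ones working in unusual keys.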
It looks like we've hit a reply depth limit, which is maybe for the best, because I don't think we're making any progress here.
> You did use tautologies...
You seem to think calling something a tautology is a way to dismiss it. Almost everything in mathematics is a tautology; most of what I say is a tautology. Any rigorous argument is tautological; it's the aspiration of literally all formal reasoning.
> Lots of uninteresting and bad music is also entropic.
And here, you seem to think someone is claiming that entropy is equivalent to music quality, rather than a useful correlate or, e.g., an indicator of something more likely to show up in good music than in bad music. I don't know of anyone making that claim; all the examples I gave require only mild correlation.
You did use tautologies; they are right there above and still irrelevant, and thus straw-man arguments in the context of the question of what useful information this specific paper contributes to the corpus of knowledge. The irony of flinging bad-faith accusations and ad hominem while trying to distract from the failure to make a relevant argument isn’t lost on me, though.
As you point out, white noise is more entropic than the Real Book. Lots of uninteresting and bad music is also entropic. Why exactly is that a good indicator? I’m glad you finally have some examples, but this doesn’t demonstrate that entropy is a decent discriminator of anything.
I’m also curious why you’re arguing with my tautologies!