Very nice. I just ran a half dozen varied songs that we perform through it, and it mostly matches my chord sheets.
For some bands this will be a game changer. Singers who can't write down chords sometimes don't want to impose on the instrumentalists by bringing in new material. If they can bring in a rough whack at it, that barrier is lifted.
Some notes:
• Not all music is in 4/4. OK, mixed meter and obscure meters are hard, but 3/4 really ought to be recognized, or at least let me specify it. Even when the song is in 4/4, it sometimes loses track of where "1" is.
• It doesn't like to (maybe can't) write more than a three-note chord, and I didn't see any "sus" chords, but maybe I didn't feed it a good sample. Not understanding four-note chords will probably add ambiguity to the results, say, an Em7 being mistaken for a G if there isn't enough E going on.
• It appears to cache well. When I look up popular songs through their cleanest YouTube URL, the results pop up instantly. Well done!
Looking forward to progress, but it goes in the toolbox now.
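The Em7-vs-G ambiguity mentioned above comes down to pitch-class overlap; a quick sketch makes it concrete (the sets below are just the standard chord spellings, nothing from the tool itself):

```python
# Why a chroma-based detector can confuse Em7 with G major.
# Pitch classes numbered C=0, C#=1, ..., B=11.
G_MAJOR  = {7, 11, 2}      # G, B, D
E_MINOR7 = {4, 7, 11, 2}   # E, G, B, D

# Em7 contains every pitch class of G major; only the E distinguishes them.
shared = G_MAJOR & E_MINOR7
extra  = E_MINOR7 - G_MAJOR
print(sorted(shared))  # [2, 7, 11]
print(sorted(extra))   # [4] -- if the E is weak in the mix, the detector hears G
```

So unless the E is prominent in the spectrum, a three-note-vocabulary detector has no evidence against plain G.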
I've used it before, it's pretty interesting, especially for its visualisations of the chords being played. It works better for songs with a clear chord structure, though. Too bad the project seems kind of dead. There are other projects listed elsewhere in this discussion (e.g. Chordino+NNLS Chroma) which seem viable.
Tangentially, Melodyne's "Direct Note Access" promotional video was very exciting when I saw it years ago, but I have to wonder how well it ever worked (when it finally came out).
I used Direct Note Access several years ago (soon after it was released), and it's really good. Like creepy good. I put some full-band bluegrass tunes through it and it pretty much caught every note. It doesn't attempt to separate out instruments, so the output was just one MIDI track, but I could easily go in and separate out bass notes, mandolin solos, etc.
It had trouble with most pop and rock music, presumably because of the crushed dynamics and drums. Of course, it's not even designed for analyzing multi-instrument recordings.
(There are usually a few "doesn't work" comments, but imagine how much extra processing is required to get the fundamental when you run guitars through delay/reverb, overdrive/distortion, chorus/tremolo/vibrato, etc. effects.)
> Curious about the real-world results, I quickly grabbed a link to an artist I know, Andy Zipf, and put his song through Chordify. I played along with the results for most of the song, but wanted to see if the chords were right, directly from the source. I sent Zipf a message on Twitter asking and he responded, “Looks correct.”
Perhaps I'm foolish to question the artist about the chords of his own song, but it certainly doesn't look quite right to me. Each phrase in the verse starts on an A, and the algorithm misses that a few times. When the whole band starts, a lot more chord changes get missed, which seems odd to me based on the description of the algorithm.
Also, it would be nice if the tool attempted to identify a tonal center ("key") of the song, and use that to spell the chords more appropriately. For a lot of pop music, this would be fairly easy and reliable. In this song, the tonal center is clearly E, and it would be nice if the C#m wasn't spelled as Dbm.
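Once a tonal center is estimated, spelling the chords correctly is mostly a lookup: keys with sharps in their signature should get sharp spellings. A minimal sketch (hypothetical code, not anything Chordify actually does):

```python
# Pick sharp vs. flat spellings for chord roots based on the estimated key.
SHARP_NAMES = ['C','C#','D','D#','E','F','F#','G','G#','A','A#','B']
FLAT_NAMES  = ['C','Db','D','Eb','E','F','Gb','G','Ab','A','Bb','B']
SHARP_KEYS  = {'G', 'D', 'A', 'E', 'B', 'F#', 'C#'}  # major keys signed with sharps

def spell_root(pitch_class, key):
    """Spell a root (0-11) using the convention of the given major key."""
    names = SHARP_NAMES if key in SHARP_KEYS else FLAT_NAMES
    return names[pitch_class % 12]

print(spell_root(1, 'E'))   # C# -- E major has four sharps, so not Db
print(spell_root(1, 'Ab'))  # Db
```

Real spelling rules are subtler (minor keys, borrowed chords), but even this crude table would fix the C#m/Dbm complaint for straightforward pop songs.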
Yeah, those chords certainly don't look quite right. There are some missed changes and some that show up a beat too early or too late. It's also missing a lot of subtleties, like 7th chords and inversions.
For a much more blatant example of Chordify just failing to spot really obvious things, check out their results for "Get Lucky":
You might also like the Yanno chord extractor, with a super-academic-looking but quite easy-to-use UI here: http://yanno.eecs.qmul.ac.uk
If I remember right, both Chordify and Yanno use NNLS Chroma for the feature extractor, but they use different methods to segment and label the chords. (Yanno uses Chordino, which you can find at http://isophonics.net/nnls-chroma along with the NNLS Chroma plugin.)
Harmonic extraction algorithms are cool. Make a recording of yourself standing in a stairwell hitting a wok with a spoon, push that through the algo, and you get avant-garde jazz out the other end. Sometimes. It's certainly easier than spending 30 years learning to play the keyboard.
I guess with it hooked to youtube we can rapidly try out all sorts of unusual inputs.
Many musicians consider themselves 'composers' because they know harmony. Harmony is 'composition for dummies'. To compose real stuff you have to study counterpoint for a lifetime.
And there are even worse algorithmic composing systems like Lyle Murphy's Equal Interval System, Schillinger, etc... I love algorithms but that's not art.
Depends what you mean by "real stuff" I guess. You could just as easily argue that musicians consider themselves composers because they read Fux. Counterpoint is important for certain styles of composition and irrelevant for others.
At its most basic, composition is the creation of a musical score. What form the score takes, what genre the piece is in, the method of composition, the evaluation of the resulting work, are all flexible depending on the style of music and the quirks of the particular composer and audience. Whether something qualifies as "art" is almost entirely subjective -- if it speaks to an audience (including the composer!) it's worth something.
Makes me wonder what the next step would be. Seems like they're essentially decomposing a song into its ingredients. I wonder if they could use their algorithm to convert a song from one genre to another like auto-creating the hiphop or dance version of a given song :)
Actually, they are taking a "big picture" approach, since it's easier to find out the chord produced by an unknown number of instruments than to find out how many instruments are playing and what they are playing, separately. Doing these things you describe would require the second option, I believe.
From the article itself:
“The problem with ‘full polyphonic transcription’ is that the computer doesn’t know how many voices and instruments sound together and what the characteristics are of these instruments,” says De Haas.
“When you transcribe chords, we examine the mixture as a whole and examine what the prominent frequencies are in the spectrum.”
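The "mixture as a whole" approach De Haas describes is often implemented as template matching over a 12-bin chroma vector (one bin of energy per pitch class). This is a generic sketch of that idea, not Chordify's actual model, and the chord vocabulary here is just four illustrative templates:

```python
import numpy as np

NOTE = {'C':0,'C#':1,'D':2,'D#':3,'E':4,'F':5,
        'F#':6,'G':7,'G#':8,'A':9,'A#':10,'B':11}

def template(pitch_classes):
    """Unit-norm binary chroma template for a set of pitch classes."""
    t = np.zeros(12)
    t[list(pitch_classes)] = 1.0
    return t / np.linalg.norm(t)

CHORDS = {
    'C':  template({NOTE['C'], NOTE['E'], NOTE['G']}),
    'G':  template({NOTE['G'], NOTE['B'], NOTE['D']}),
    'Am': template({NOTE['A'], NOTE['C'], NOTE['E']}),
    'Em': template({NOTE['E'], NOTE['G'], NOTE['B']}),
}

def label(chroma):
    """Pick the chord whose template best matches the chroma frame."""
    chroma = chroma / (np.linalg.norm(chroma) + 1e-9)
    return max(CHORDS, key=lambda name: float(chroma @ CHORDS[name]))

# A frame with strong energy on C, E, G should come out as C:
frame = np.zeros(12)
frame[[0, 4, 7]] = [1.0, 0.8, 0.9]
print(label(frame))  # C
```

Real systems add temporal smoothing (e.g. an HMM over chord labels) so a single noisy frame can't flip the label mid-bar, which is exactly where per-frame matching falls down.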
Perhaps related: There's an app called "Shapes" on iTunes where the devs deconstructed popular songs into "shapes", basically a set of keys that will sound good with the song no matter what you play:
http://www.playshapes.com
this youtube video is not available in your location
this deezer song is not available in your country
this youtube video contains content from [x]. it is restricted from playback on certain sites.
Tried to use this and the results were completely off...
BUT I was trying to use it on Moonlight In Vermont by the Johnny Smith Quintet. I expect that the type of harmonic structure used in the jazz chord-melody style is probably beyond the current capabilities of this technology.
Even so, though, the results seemed to have no relation at all to the music being played; chords were shown on silent parts of the song and vice versa.
FWIW, this uses software [1] that has dozens of dials to adjust, yet it doesn't give access to those dials. That's probably why. One of the key challenges (I've tried to write chord detectors personally) is that different instruments' sounds have different harmonic falloff rates (in different frequency ranges, no less).
I think you nailed it. Yes, the algorithms can't figure out all the proper tuning yet.
For instance, the lowest frequency (among the strongest amplitude candidates) is often the bass line, & thus the chord root. However, those types of general rules are common, yet often broken to make the music more interesting in the first place. Esp. when notes are "left out" to open things up & let your mind fill it in.
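That "lowest strong frequency" heuristic is easy to sketch on a synthetic signal; this is just the rule of thumb from the comment above, which (as noted) real arrangements routinely break:

```python
import numpy as np

def lowest_strong_peak(signal, sr, threshold=0.7):
    """Return the lowest frequency whose magnitude clears a fraction of the max."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / sr)
    strong = np.flatnonzero(spectrum >= threshold * spectrum.max())
    return freqs[strong[0]]  # lowest bin that clears the threshold

# Synthetic test: a 110 Hz (A2) bass note plus weaker upper partials.
sr, n = 8000, 8000
t = np.arange(n) / sr
sig = (1.0 * np.sin(2 * np.pi * 110 * t)
       + 0.6 * np.sin(2 * np.pi * 220 * t)
       + 0.4 * np.sin(2 * np.pi * 330 * t))
print(lowest_strong_peak(sig, sr))  # 110.0 -> pitch class A, candidate root
```

The threshold is doing a lot of work here: set it too low and spectral leakage or a loud upper partial wins; too high and a quiet bass disappears, which is a toy version of the "dozens of dials" problem mentioned above.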