They seem to concentrate on generating sound rather than MIDI, even if they can read MIDI or use it internally. I'd like to generate my melodies algorithmically and then use the MIDI output of these programs as input in Ableton Live, Logic, etc.
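For what it's worth, you don't strictly need any of these programs for the "algorithmic melody → DAW" step: a format-0 Standard MIDI File is simple enough to write with nothing but the Python standard library, and the resulting .mid file can be dragged into Ableton or Logic. A minimal sketch (the scale, velocities, and random-walk rule here are arbitrary choices of mine, not anything prescribed):

```python
import random
import struct

SCALE = [60, 62, 64, 65, 67, 69, 71, 72]  # C major, one octave (MIDI note numbers)

def vlq(n):
    """Encode n as a MIDI variable-length quantity (7 bits per byte, MSB = continuation)."""
    out = [n & 0x7F]
    n >>= 7
    while n:
        out.append((n & 0x7F) | 0x80)
        n >>= 7
    return bytes(reversed(out))

def melody(length=16, seed=1):
    """Random walk over SCALE -- stand-in for whatever algorithm you actually use."""
    rng = random.Random(seed)
    i = rng.randrange(len(SCALE))
    notes = []
    for _ in range(length):
        i = max(0, min(len(SCALE) - 1, i + rng.choice([-2, -1, 1, 2])))
        notes.append(SCALE[i])
    return notes

def to_midi(notes, ticks=480):
    """Build a format-0 SMF: one track, quarter notes back to back."""
    track = b''
    for n in notes:
        track += vlq(0) + bytes([0x90, n, 100])    # note on, channel 1, velocity 100
        track += vlq(ticks) + bytes([0x80, n, 0])  # note off after one quarter note
    track += vlq(0) + b'\xff\x2f\x00'              # end-of-track meta event
    header = b'MThd' + struct.pack('>IHHH', 6, 0, 1, ticks)  # format 0, 1 track
    return header + b'MTrk' + struct.pack('>I', len(track)) + track

data = to_midi(melody())

if __name__ == "__main__":
    with open('melody.mid', 'wb') as f:
        f.write(data)
```

That covers the offline route; for live MIDI into a running DAW you'd still want a virtual MIDI cable plus one of the tools below.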
SuperCollider has a family of classes called Patterns for sequencing and algorithmic composition. The output can be sent to the built-in server for live synthesis or to a MIDI output (or directly into Ableton via a virtual MIDI cable): http://doc.sccode.org/Tutorials/A-Practical-Guide/PG_Cookboo...
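If you haven't used SC's Patterns: they're lazy, composable streams of values that Pbind turns into note (or MIDI) events. The core idea translates to Python generators; `pseq` and `pstutter` below are hypothetical names echoing SC's Pseq and Pstutter, just to show how the nesting works:

```python
import itertools

def pseq(values, repeats=1):
    """Yield values in order, `repeats` times (None = loop forever), like SC's Pseq."""
    it = itertools.count() if repeats is None else range(repeats)
    for _ in it:
        yield from values

def pstutter(n, pattern):
    """Repeat each value coming out of `pattern` n times, like SC's Pstutter."""
    for v in pattern:
        for _ in range(n):
            yield v

# Patterns nest freely; SC's Pbind would zip streams like this into timed MIDI events.
notes = pseq([60, 64, 67], repeats=2)
result = list(pstutter(2, notes))
# → [60, 60, 64, 64, 67, 67, 60, 60, 64, 64, 67, 67]
```

In SuperCollider itself you'd hand the resulting event stream a MIDIOut and point it at the virtual cable.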
GRACE (Graphical Realtime Algorithmic Composition Environment), built on top of Common Music, can save its output to WAV or MIDI, or send it out over OSC, and it also interfaces nicely with LilyPond for printed scores. A lot of the livecoding programs let you direct the output to a file rather than just the audio out. I own Ableton, and you can use Max inside it if you have the Suite version, and Csound can now be used too, though not for livecoding if I understand correctly.
Also check out alda: https://github.com/alda-lang/alda