Hacker News

I am really interested in this. The current paradigm from e.g. Suno is an all-or-nothing finished product. Producing intermediate assets allows you to do simple things like proper mastering, swapping instruments, or editing melodies.


Yeah. I want a DAW with autopilot features (assistance), not for the LLM to wholesale take away my creative input (vibe composing).


This is our goal with https://parture.org!


Is your goal to break audio down into per-instrument sheet music? That'd be nothing short of amazing.


I agree that what AI music needs to become an industry tool is the ability to create, access, and remix parts, but I think tools like Suno have more of the right idea than tools like this. To write intermediate parts properly, you need to understand the whole and what things should sound like when put together, or when the notes are actually played by a musician. From there it's easier to work backward: split your tracks into stems, transcribe your stems into MIDI, etc.

Suno et al. are moving in this direction, but I honestly think development will be somewhat stunted until we get a good open-source model (or models) and something like ControlNets.



