Hi everybody, I'm the creator of Astrofox. Surprised to see this here, but I'll be glad to answer any questions.
Astrofox has been my side project for several years now. It's basically my playground for trying out things like Electron, React and WebGL. It's open-source, MIT licensed, and totally free to use.
This one took less than 20 minutes from scratch (I'd never seen this app before). Rendering to 4K is another story: it takes hours and is very slow, which is probably understandable given that web rendering is involved.
A quick skim reveals it uses React and WebGL shaders to create these effects. The React part is easy. Is there anything recommended I could read about shaders so I'd be able to contribute objects/effects to Astrofox?
The current standard for making live visuals for events (i.e. VJing) is Resolume. The one thing Resolume lacks is effective support for importing 3D objects.
It's nice to see blend modes in this app. That offers the possibility of building up cool effects layer by layer.
Looks cool! Any plans to support headless operation? What I had in mind was to design a scene in the UI (for example, a daily podcast), then on the command line take the project file and the audio file and generate the video output.
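Something like this, say. To be clear, Astrofox ships no CLI today; the command name, flags, and file names below are all invented purely to illustrate the workflow:

```shell
# Hypothetical only -- no such CLI exists in Astrofox at the moment.
# The idea: design the scene once in the UI, then for each episode
# swap in the audio and render without opening a window.
astrofox render \
  --project daily-podcast.afx \
  --audio episode-042.mp3 \
  --output episode-042.mp4
```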
Yes, the UI is all from scratch. My inspiration was from seeing other audio-reactive videos on Youtube, but those were rendered with either Adobe After Effects or a 3D program like Blender. I wanted to build something that was easy for the average person to use.
I ended up using ffmpeg a few times for this (I think inspired by the link you gave there), because it's often easier to share video than just audio on mainstream social media.
A few details of what worked for me and the output shown here:
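For anyone curious, the usual ffmpeg recipe for this kind of thing pairs the audio with a looped still image (a common pattern, not necessarily the commenter's exact command; filenames are placeholders):

```shell
# Wrap an audio track in a video container by looping a static cover
# image, so it can be posted to video-only platforms.
# Requires ffmpeg plus the two input files.
ffmpeg -loop 1 -i cover.png -i track.mp3 \
  -c:v libx264 -tune stillimage -pix_fmt yuv420p \
  -c:a aac -b:a 192k \
  -shortest out.mp4
# -loop 1 repeats the single frame; -shortest ends the video when the
# audio ends; -tune stillimage optimizes x264 for a static frame.
```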
Slightly reminiscent of old-school demoscene stuff. Or perhaps like a static version of Winamp? I want more scrollers to greet peeps in trippy space tunnels! xD
Looks incredible, I could really use this. I've been using stock footage for my music videos with various effects/transitions. Can I overlay the effects over videos, or only static images? I already have a ton of ideas.
I guess I could just export the audio-visual effects, driven by the audio track, on a green background and then key them out. So it's not a major issue, just extra steps.
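The keying step can itself be scripted; ffmpeg's chromakey filter handles it (filenames here are placeholders):

```shell
# Composite a green-screen effects render over base footage.
# chromakey=green:0.1:0.2 keys out green (similarity 0.1, blend 0.2),
# then the keyed layer is overlaid on the base video.
ffmpeg -i effects-greenscreen.mp4 -i base-footage.mp4 \
  -filter_complex "[0:v]chromakey=green:0.1:0.2[fx];[1:v][fx]overlay[out]" \
  -map "[out]" -map 1:a? -c:a copy composited.mp4
```

Tweaking the similarity/blend values is usually needed to avoid green fringing around the keyed edges.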
If you simply drop in a song, it will render in realtime. You don't have to render it to a video. I am working on a "live" mode that will act much like Winamp, where you would select a playlist of songs and have it cycle effects.
Uneducated suggestion, feel free to ignore if inapplicable:
I don't know if this is common cross platform, but my Windows has an audio in device called "Stereo Mix" which is just all audio output played by all applications mixed together and looped back to an input device. If you could set your live mode to listen on any input device, then people can select Stereo Mix and use whatever existing software they have for playing/playlisting and have astrofox produce the visuals. You'd be a sort of Winamp-less Milkdrop!
The one thing I would like to see added to this is IO streams for audio and video. I'd love to find a piece of software that could be selected as the sole output for either a system's audio or a specific application, keep a buffer of the audio stream for processing, create an AV output, and then allow that to be either captured or streamed to something like OBS or VLC. Realistically, I can see latency being a bit of an issue, of course. Audio fidelity may also take a hit if you exceed the buffer's capacity, but trying to implement a direct passthrough for the audio may desync it from the visualizer.
Maybe it's just a pipe dream, but has anyone created/worked with a project like that before? I'd be interested in seeing what's out there.
Unfortunately this appears to be an Electron app, so behavior that would probably be quite simple in a native application seems unlikely here.
Too bad, too, since this does indeed look like a cool app.