Hi everybody, I'm the creator of Astrofox. Surprised to see this here, but I'll be glad to answer any questions.
Astrofox has been my side project for several years now. It's basically my playground for trying out things like Electron, React and WebGL. It's open-source, MIT licensed, and totally free to use.
This one took less than 20 minutes from scratch (I'd never seen the app before). Rendering to 4K is another story: it takes hours, which is probably understandable given the web-based rendering involved.
A quick skim reveals it uses React and WebGL shaders to create these effects. The React part is easy. Is there anything you'd recommend reading about shaders so I could contribute objects/effects to Astrofox?
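For anyone else starting out: a WebGL fragment shader is just a small GLSL program that runs once per pixel, usually shipped as a string from JavaScript. A minimal audio-reactive sketch might look like this (the uniform names `u_time`, `u_level`, and `u_resolution` are illustrative, not Astrofox's actual API):

```javascript
// A minimal GLSL fragment shader embedded as a JS string, the way WebGL code
// typically ships it. Uniform names here are made up for illustration and are
// not Astrofox's actual uniforms.
const fragmentShader = `
  precision mediump float;
  uniform float u_time;       // seconds since start
  uniform float u_level;      // audio amplitude, normalized to [0, 1]
  uniform vec2  u_resolution; // canvas size in pixels

  void main() {
    // Normalize pixel coordinates to [0, 1].
    vec2 uv = gl_FragCoord.xy / u_resolution;
    // Drive the red channel with the audio level, cycle blue over time.
    vec3 color = vec3(uv.x * u_level, uv.y, 0.5 + 0.5 * sin(u_time));
    gl_FragColor = vec4(color, 1.0);
  }
`;
```

From there, The Book of Shaders and the MDN WebGL tutorials are the usual starting points.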
The current standard for making live visuals for events (i.e. VJing) is Resolume. The one thing Resolume lacks is effective support for importing 3D objects.
It's nice to see blend modes in this app. They offer the possibility of building up cool effects layer by layer.
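For those curious how layered blending works under the hood, each blend mode is just a per-channel formula applied between the base and the layer above it. A sketch of the classic "screen" mode on normalized [0, 1] channels (this is the standard formula, not necessarily Astrofox's exact implementation):

```javascript
// "Screen" blend: brightens the base by the layer.
// Standard per-channel formula: result = 1 - (1 - a) * (1 - b).
// Channels are assumed normalized to [0, 1].
function screenBlend(base, layer) {
  return base.map((a, i) => 1 - (1 - a) * (1 - layer[i]));
}

// Blending a mid-gray base with a dim layer brightens every channel.
const out = screenBlend([0.5, 0.5, 0.5], [0.2, 0.4, 0.6]);
// e.g. first channel: 1 - 0.5 * 0.8 = 0.6
```

Stacking several such layers, each with its own mode, is what makes the layer-by-layer approach so flexible.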
Looks cool! Any plans to support headless operation? What I had in mind: design a scene in the UI (for example, for a daily podcast), then on the command line take the project file and an audio file and generate the video output.
Yes, the UI is all from scratch. My inspiration was from seeing other audio-reactive videos on Youtube, but those were rendered with either Adobe After Effects or a 3D program like Blender. I wanted to build something that was easy for the average person to use.