Astrofox – Turn Audio into Videos (astrofox.io)
213 points by GenericCanadian on July 7, 2022 | 38 comments



Hi everybody, I'm the creator of Astrofox. Surprised to see this here, but I'll be glad to answer any questions.

Astrofox has been my side project for several years now. It's basically my playground for trying out things like Electron, React and WebGL. It's open-source, MIT licensed, and totally free to use.


Thank you so much for that! Finally something to make visuals when sharing on video sites!

https://www.youtube.com/watch?v=TzQt11AOslQ

This one took less than 20 minutes from scratch (I had never seen this app before). Rendering to 4K is another story: it takes hours and is very slow, which is probably understandable with web rendering involved.

A quick skim reveals it uses React and WebGL shaders to create these effects. The React part is easy. Is there anything you'd recommend I read about shaders in order to be able to contribute objects/effects to Astrofox?
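For context, the kind of thing I have in mind is a generic audio-reactive fragment shader like the sketch below, written against three.js. This is just my own guess at the pattern; the uniform names and setup are hypothetical and not Astrofox's actual effect API.

    // Minimal audio-reactive shader sketch (three.js); uniform names are made up.
    import * as THREE from "three";

    const renderer = new THREE.WebGLRenderer();
    renderer.setSize(window.innerWidth, window.innerHeight);
    document.body.appendChild(renderer.domElement);

    // Full-screen quad: orthographic camera plus a 2x2 plane.
    const scene = new THREE.Scene();
    const camera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);

    const material = new THREE.ShaderMaterial({
      uniforms: { uLevel: { value: 0.0 }, uTime: { value: 0.0 } },
      fragmentShader: `
        uniform float uLevel;
        uniform float uTime;
        void main() {
          // Brightness follows the audio level; hue drifts slowly over time.
          vec3 color = vec3(0.5 + 0.5 * sin(uTime), 0.2, 1.0) * uLevel;
          gl_FragColor = vec4(color, 1.0);
        }
      `,
    });
    scene.add(new THREE.Mesh(new THREE.PlaneGeometry(2, 2), material));

    renderer.setAnimationLoop((time) => {
      // In a real effect, uLevel would come from an FFT bin each frame;
      // a sine wave stands in for it here.
      material.uniforms.uLevel.value = 0.5 + 0.5 * Math.sin(time / 500);
      material.uniforms.uTime.value = time / 1000;
      renderer.render(scene, camera);
    });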


Incredibly cool. I've always wondered what tool was used to make the visuals associated with pretty much every DJ set uploaded to YouTube.


The current standard for making live visuals for events (i.e. VJing) is Resolume. The one thing Resolume lacks is effective support for importing 3D objects.

It's nice to see blend modes in this app. That offers the possibility of building up cool effects layer by layer.


That isn't what everyone on YouTube is using though. See an example here: https://youtu.be/d_G37G-8gMA?t=3329


Ahhh, this is new to me. Thanks.


Looks cool! Any plans to support headless operation? What I had in mind was to design a scene in the UI (for example, for a daily podcast), then take the project file and the audio file on the command line and generate the video output.


I haven't looked into it, so I'm not sure if it's possible. The requirement would be for Electron to be able to run headless with WebGL enabled.


The UI is beautiful. Is it all from scratch, or did you use libraries? And where did you get the inspiration for it?

Sorry for the multiple questions; it's just not often I see something that catches my eye like this.


Yes, the UI is all from scratch. My inspiration came from seeing other audio-reactive videos on YouTube, but those were rendered with either Adobe After Effects or a 3D program like Blender. I wanted to build something that was easy for the average person to use.


Did you simply reinvent Winamp?


Found this today when looking for something to turn short podcast clips into something more visual. Anyone have any other open source tools they like? I've heard of using ffmpeg (https://lukaprincic.si/development-log/ffmpeg-audio-visualiz...).

YouTube video of Astrofox: https://www.youtube.com/watch?v=IbvuniqNPPw


I've ended up using ffmpeg a few times for this (I think inspired by the link you gave there), because it's often easier to share video than just audio on mainstream social media.

A few details of what worked for me, along with the output, are shown here:

https://discourse.mozilla.org/t/tts-audio-to-video-trick-usi...
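
For reference, a minimal invocation along those lines looks something like the command below (filenames are placeholders, and the showwaves filter has many more options):

    ffmpeg -i episode.mp3 \
      -filter_complex "[0:a]showwaves=s=1280x720:mode=line:colors=white[v]" \
      -map "[v]" -map 0:a -c:v libx264 -c:a aac -pix_fmt yuv420p out.mp4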

But Astrofox looks much cooler and massively more sophisticated.


The key to all audio/music visualizations is the Fast Fourier Transform. Hook an FFT implementation up to a graphics library and you're set.
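
In the browser that amounts to a few lines with the Web Audio API's AnalyserNode, which does the FFT for you. A rough sketch (assuming an <audio id="player"> and a <canvas id="viz"> element on the page; this is not Astrofox's code):

    // Feed an <audio> element through an AnalyserNode and draw frequency bars.
    // Note: browsers may require a user gesture before the AudioContext can start.
    const audioCtx = new AudioContext();
    const analyser = audioCtx.createAnalyser();
    analyser.fftSize = 2048; // 1024 frequency bins

    const player = document.getElementById("player") as HTMLMediaElement;
    audioCtx.createMediaElementSource(player).connect(analyser);
    analyser.connect(audioCtx.destination); // keep the audio audible

    const canvas = document.getElementById("viz") as HTMLCanvasElement;
    const ctx = canvas.getContext("2d")!;
    const bins = new Uint8Array(analyser.frequencyBinCount);

    function draw() {
      requestAnimationFrame(draw);
      analyser.getByteFrequencyData(bins); // FFT magnitudes, 0-255 per bin
      ctx.clearRect(0, 0, canvas.width, canvas.height);
      const barWidth = canvas.width / bins.length;
      bins.forEach((v, i) => {
        const barHeight = (v / 255) * canvas.height;
        ctx.fillRect(i * barWidth, canvas.height - barHeight, barWidth, barHeight);
      });
    }
    draw();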


You can use revoldiv.com. If you go to export and choose audiogram, it will convert the audio/video you uploaded to text and create an audiogram.


vvvv is not open source, but it has a free version:

https://vvvv.org/


Slightly reminiscent of old-school demoscene stuff. Or perhaps like a static version of Winamp? I want more scrollers to greet peeps in trippy space tunnels! xD

Digression about the Atari demoscene: amazingly, they still keep cranking them out! https://www.youtube.com/watch?v=3QginSr9V7A


This is fantastic; I'll be using it in the next Dystopian Disco mixtape series:

https://www.youtube.com/watch?v=64Mxban54mM&list=PLFyd_83HK5...

Is it possible to set an effect's run time to a set period, and if so, would it be possible to have the effects modulated?


Looks incredible; I could really use this. I've been using stock footage for my music videos with various effects/transitions. Can I overlay the effects on videos, or only on static images? I already have a ton of ideas.

https://youtu.be/k2NAtIo8Wcs

https://youtu.be/BRMywh9yHWY

https://youtu.be/-89K59xrYmY

https://youtu.be/TCw7Bcwpsfw


Video is not currently supported but will be in the future.


I guess I could just export the audio-reactive effects on a green background and then key them out. So it's not a major issue, just extra steps.
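
If it comes to that, ffmpeg can handle the keying step too; something along these lines should work (filenames and the colorkey similarity/blend values are placeholders to tune):

    ffmpeg -i footage.mp4 -i effects.mp4 \
      -filter_complex "[1:v]colorkey=0x00FF00:0.3:0.2[fx];[0:v][fx]overlay[out]" \
      -map "[out]" -map 0:a -c:v libx264 -c:a copy keyed.mp4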


I love this so much! I've always wanted to build one, but I struggle with the math to turn music frequencies into something visually beautiful.


It's open source, so you can see how everything is done. The audio processing is all done with the Web Audio API.


Is there a package like this that renders in real time rather than to a video for sharing? (Like Winamp visualizations used to do.)


Check out projectM, "an open-source project that reimplements the esteemed Winamp Milkdrop": https://github.com/projectM-visualizer/projectm


If you simply drop in a song, it will render in real time; you don't have to render it to a video. I am working on a "live" mode that will act much like Winamp, where you would select a playlist of songs and have it cycle through effects.


Uneducated suggestion, feel free to ignore if inapplicable:

I don't know if this is common across platforms, but my Windows machine has an audio input device called "Stereo Mix", which is just the audio output of all applications mixed together and looped back to an input device. If you could set your live mode to listen on any input device, people could select Stereo Mix, use whatever existing software they have for playing/playlisting, and have Astrofox produce the visuals. You'd be a sort of Winamp-less Milkdrop!
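
To sketch what I mean (this is hypothetical, not Astrofox code; the device labels and analyser wiring are my own assumptions): the live mode could enumerate audio inputs, let the user pick one such as Stereo Mix, and pipe it into the existing analyser.

    // Hypothetical "listen to an input device" helper for a live mode.
    // Note: device labels are only exposed after the user grants mic permission.
    async function listenTo(
      deviceLabel: string,
      analyser: AnalyserNode,
      audioCtx: AudioContext
    ): Promise<void> {
      const devices = await navigator.mediaDevices.enumerateDevices();
      const input = devices.find(
        (d) => d.kind === "audioinput" && d.label.includes(deviceLabel)
      );
      if (!input) throw new Error(`No input device matching "${deviceLabel}"`);

      const stream = await navigator.mediaDevices.getUserMedia({
        audio: { deviceId: { exact: input.deviceId } },
      });
      audioCtx.createMediaStreamSource(stream).connect(analyser);
    }

    // e.g. listenTo("Stereo Mix", analyser, audioCtx);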


I also recommend Le Biniou, an open-source, user-friendly, powerful music visualization & VJing tool: https://biniou.net/


I made a similar product on iOS [1]. It’s not there yet in terms of polish, but soon it will be.

[1] https://podbuddy.app/


Can you run this headlessly from the command line? It would be powerful to generate videos at scale.


@mcao - this is fantastic! Finally, a free and really robust replacement for Headliner and the like. It's so great for podcasts. Thanks!

EDIT: I now realize it's been out for two years and I'd never heard of it before!


The one thing I would like to see added to this is IO streams for audio and video. I'd love to find a piece of software that could be selected as the sole output for either a system's audio or a specific application, keep a buffer of the audio stream for processing, create an AV output, and then allow that to either be captured or streamed to something like OBS or VLC. Realistically, I can see latency being a bit of an issue, of course. Audio fidelity may also take a hit if you exceed the buffer's capacity, but trying to implement a direct passthrough for the audio may desync it from the visualizer.

Maybe it's just a pipe dream, but has anyone created/worked with a project like that before? I'd be interested in seeing what's out there.


Streams, a pipe dream? I see what you did there.

Unfortunately this appears to be an Electron app, so behavior that would probably be quite simple for a native application does indeed seem unlikely here.

Too bad, too, since this does indeed look like a cool app.


Seeing this reminded me of Windows Media Player visualizations. (Is that thing still kicking? I wouldn't know; I ditched Windows quite some time ago.)


Looks similar to FL Studio's built-in ZGameEditor Visualizer. Of course, the advantage here is that you can use it independently of a DAW.


The latest version seems broken. It's not available on Homebrew either.


Could be a useful tool to generate snippets for a larger project.


Any plans to take this to mobile?



