So far, I've been mostly focused on real-time rendering using JavaScript. For more complex scenes, or to quickly prototype visuals, I was planning to find a way to play videos.
Instead of some streaming setup, I decided to store the videos directly on the Omega, especially since I have lots of room in the form of a 64 GB SD card.
I created a TouchDesigner setup that extracts 16x16 monochrome frames from videos and saves each frame as a CSV file. I then wrote a Node.js script that takes the CSV frames and encodes them into a proprietary video format I call "FRFR" (short for "Frekvens Frames"). Code and documentation are on GitHub.
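The real layout is documented in the repo; just to give a feel for how small these files are, here's a rough sketch of packing CSV frames into a 1-bit-per-pixel binary blob. The directory names, the 0.5 threshold, and the 2-bytes-per-row packing are assumptions for the example, not the actual FRFR spec.

```javascript
// Sketch: pack 16x16 monochrome CSV frames into one binary file.
// Assumed (not the real FRFR spec): each CSV has 16 rows of 16
// comma-separated values, and each frame packs into 32 bytes
// (1 bit per pixel, 2 bytes per row).
const fs = require('fs');
const path = require('path');

function csvToFrame(csvText) {
  const rows = csvText.trim().split('\n').map(line => line.split(',').map(Number));
  const frame = Buffer.alloc(32); // 16 rows * 2 bytes
  rows.forEach((row, y) => {
    row.forEach((value, x) => {
      if (value > 0.5) { // anything above 0.5 counts as a lit pixel
        frame[y * 2 + (x >> 3)] |= 0x80 >> (x & 7);
      }
    });
  });
  return frame;
}

function encode(csvDir, outFile) {
  const files = fs.readdirSync(csvDir).filter(f => f.endsWith('.csv')).sort();
  const frames = files.map(f => csvToFrame(fs.readFileSync(path.join(csvDir, f), 'utf8')));
  fs.writeFileSync(outFile, Buffer.concat(frames));
  console.log(`Wrote ${frames.length} frames to ${outFile}`);
}

encode('./frames', './demo.frfr');
```

Packed like this, a frame is only 32 bytes, so even long clips barely make a dent in the 64 GB card.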
I've been dying to see what one of my favourite demoscene demos, Intrinsic Gravity by Still, would look like on this limited medium. My initial plan was to recreate its scenes in JavaScript. But by stealing frames directly from a video capture of the demo, I could quickly see how it would look without writing a single line of demo code.
There are some timing issues I still have to look into.
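I haven't dug into the cause yet, but one usual suspect with this kind of playback is a naive setInterval loop, whose small delays accumulate into drift. A sketch of pinning each frame to an absolute timeline instead (the drawFrame callback and 24 fps rate are placeholders, not my actual player):

```javascript
// Sketch: schedule frames against an absolute timeline instead of setInterval,
// so per-frame delays don't accumulate into drift. drawFrame() stands in for
// whatever actually pushes a 32-byte frame to the display.
const FPS = 24;               // placeholder frame rate
const FRAME_MS = 1000 / FPS;

function play(frames, drawFrame) {
  const start = Date.now();
  let index = 0;

  function tick() {
    if (index >= frames.length) return;
    drawFrame(frames[index]);
    index += 1;
    // Aim for the absolute time of the next frame, not "now + FRAME_MS".
    const nextAt = start + index * FRAME_MS;
    setTimeout(tick, Math.max(0, nextAt - Date.now()));
  }

  tick();
}
```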