-
Another demoscene tribute
05/23/2020 at 04:42
Another timing test, or just an excuse for blinkenlight coolness.
This is "Masagin - NVISION 08 Invitation" by Farbrausch.
I screwed up and recorded this only in 720p, losing a lot of detail.
-
Large videos and streaming!
05/22/2020 at 22:02
Made some good progress over the last few days.
First, I switched to on-demand loading/decoding of frames so that I can store and play back very long videos on the Omega. Here's the first successful test of playing back a 78-minute video:
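The gist of on-demand decoding, as a rough sketch: seek to the frame you need and read just that. (The sketch below assumes headerless 16x16 1-bit frames packed into 32 bytes each; the real ".ff" layout is in the repo.)

```js
const fs = require('fs');

// Sketch assumption: headerless file of 16x16 1-bit frames, 32 bytes each.
const FRAME_BYTES = 32;

// Read and unpack a single frame only when playback needs it.
function readFrame(fd, frameIndex) {
  const buf = Buffer.alloc(FRAME_BYTES);
  fs.readSync(fd, buf, 0, FRAME_BYTES, frameIndex * FRAME_BYTES);

  // Expand the 256 bits into an array of 0/1 pixel values.
  const pixels = new Array(256);
  for (let i = 0; i < 256; i++) {
    pixels[i] = (buf[i >> 3] >> (7 - (i & 7))) & 1;
  }
  return pixels;
}

// Playback only ever touches one frame's worth of data, so file length
// doesn't matter: a 78-minute video never has to fit in memory.
const fd = fs.openSync('video.ff', 'r');
const firstFrame = readFrame(fd, 0);
```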
Since first exporting video frames as CSV and then running them through my encoder to turn them into ".ff" files was proving to be very cumbersome, I wanted to experiment with streaming.
Streaming frame data from TouchDesigner over UDP and adding a UDP listener scene was a breeze (a minimal listener sketch is at the end of this entry). Here are the first successful runs. First, locally, on the text-mode emulator:
Next, streaming remotely to the Omega:
And finally, once I got streaming working, it was trivial to hook things up to my webcam in TouchDesigner:
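For reference, the listener side really is tiny. A minimal Node.js sketch of the idea (the port number and the one-frame-per-datagram payload here are placeholders, not the actual protocol):

```js
const dgram = require('dgram');

const PORT = 21324; // placeholder port

// Stand-in for the scene's draw call.
function showFrame(packedFrame) {
  // ...unpack and push to the display...
}

const socket = dgram.createSocket('udp4');

socket.on('message', (msg) => {
  // Assumed payload: one 16x16 frame packed into 32 bytes per datagram.
  if (msg.length !== 32) return;
  showFrame(msg);
});

socket.bind(PORT, () => {
  console.log(`Listening for frames on UDP port ${PORT}`);
});
```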
-
Accurate frame timing at last
05/20/2020 at 17:02
My overnight experiment of running this binary clock for 8 hours and checking for frame drops / clock drift confirmed that I now have proper frame timing!
-
Even better frame timing
05/20/2020 at 04:53
Significantly improved frame timing with a simple tweak that prevents timing errors for individual frames from accumulating over time.
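One way to do this (a simplified sketch, not the exact code): schedule every frame against the original start time instead of against the previous frame, so one early or late frame doesn't shift everything after it.

```js
const FRAME_MS = 1000 / 60; // target frame duration
const start = Date.now();
let frame = 0;

// Stand-in for the actual draw call.
function render(n) { /* ...draw frame n... */ }

function tick() {
  render(frame);
  frame++;
  // Aim at an absolute deadline derived from the start time, so one slow
  // frame doesn't push back every frame that follows it.
  const deadline = start + frame * FRAME_MS;
  setTimeout(tick, Math.max(0, deadline - Date.now()));
}

tick();
```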
Created a binary clock (to avoid the hassle of rendering actual digits). It's all 6-bit values, from top to bottom: hours, minutes, seconds. The rapidly moving dot at the bottom is a frame counter: 12 x 5 = 60 frames per second. The clock doesn't constantly read the current time from the OS. Instead, it calculates what the time should be based on the number of rendered frames. If there are any timing issues with rendering, or if frames are dropped, the clock should lag behind.
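The frame-to-time math itself is trivial. Assuming 60 fps and a start time captured once at launch:

```js
const FPS = 60;
const startEpoch = Date.now(); // wall-clock time read once, at startup

// Derive the displayed time purely from how many frames have been rendered.
// Dropped or late frames make this fall behind the real clock.
function clockFromFrames(framesRendered) {
  const t = new Date(startEpoch + (framesRendered / FPS) * 1000);
  return {
    hours: t.getHours(),         // each value drawn as a row of dots
    minutes: t.getMinutes(),
    seconds: t.getSeconds(),
    frame: framesRendered % FPS, // drives the moving dot at the bottom
  };
}
```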
I'm showing reference time on my laptop with a big clock I wrote as a dweet, and there's a 1-second difference despite both the Omega and my laptop being synced via NTP. I'll keep this running until the morning and see how much deviation there'll be.
-
Improved render timing
05/16/2020 at 07:09
Improved render timing by moving from a JavaScript setInterval() to a render event fired by the render thread. Looks much better. (I'm also getting a bit better at video production!)
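Roughly, the shape of the change looks like this (simplified sketch; the emitter and the 'frame' event name are stand-ins for the real render-thread hook):

```js
const { EventEmitter } = require('events');

// Stand-in for the render thread: it emits 'frame' once a frame has
// actually been pushed out to the LED matrix.
const renderThread = new EventEmitter();

// Stand-in for the scene's draw call.
function drawNextFrame() { /* ...advance and draw the scene... */ }

// Before: drawing paced by a timer, independent of the actual renders.
// setInterval(drawNextFrame, 1000 / 60);

// After: drawing paced by the render events themselves.
renderThread.on('frame', drawNextFrame);
```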
-
Crunching frames
05/15/2020 at 22:48
So far, I've been mostly focused on real-time rendering using JavaScript. For more complex scenes, or to quickly prototype visuals, I was planning to find a way to play videos.
Instead of some streaming setup, I decided to store the videos directly on the Omega, especially since I have lots of room in the form of a 64 GB SD card.
I created a TouchDesigner setup to extract 16x16 monochrome frames from videos; it saves each frame as a CSV file. I then wrote a Node.js script that takes the CSV frames and encodes them into a proprietary video format I call "FRFR" (short for "Frekvens Frames"). Code and documentation are on GitHub.
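Purely as an illustration (the packing and file layout below are assumptions, not the real FRFR spec), the encoding step boils down to something like:

```js
const fs = require('fs');

// Pack one 16x16 frame of comma-separated 0/1 values into 32 bytes.
function packCsvFrame(csvText) {
  const pixels = csvText.trim().split(/[\s,]+/).map(Number); // 256 values
  const bytes = Buffer.alloc(32);
  pixels.forEach((on, i) => {
    if (on) bytes[i >> 3] |= 1 << (7 - (i & 7));
  });
  return bytes;
}

// Concatenate the packed frames, in order, into a single output file.
function encode(csvFiles, outPath) {
  const frames = csvFiles.map((f) => packCsvFrame(fs.readFileSync(f, 'utf8')));
  fs.writeFileSync(outPath, Buffer.concat(frames));
}

// e.g. encode(['frame0000.csv', 'frame0001.csv'], 'out.ff'); // hypothetical names
```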
I've been dying to see what one of my favourite demoscene demos, Intrinsic Gravity by Still, would look like on this limited medium. My initial plan was to try to recreate the scenes in JavaScript. But by directly stealing frames from a video of the demo, I was able to quickly test what it would look like without writing a single line of demo code.
There are some timing issues I have to look into.