“Ben the Intern” (his real name is just “Ben”) was double-dared by the camera team to recreate the camera rig responsible for the ground-breaking “bullet time” special effects in The Matrix using Raspberry Pi hardware. Then I triple-dared him to write a story about how it went. This is that story.
I’ve had a great time as an intern at Raspberry Pi. Over the last few weeks, I’ve worked with the camera team, getting hands-on experience with everything that makes our cameras work. From adding new tools to libcamera-apps, to playing in the lab calibrating cameras, I’ve seen way more of the camera world than I ever dreamed. As a little treat, the engineers gave me a fun project to work on: recreating the bullet time effect from the movie “The Matrix”. I downloaded some documentation and got started.
Synchronised captures
When the High Quality Camera or the Global Shutter Camera starts capturing a frame, it outputs a tiny pulse on the board’s XVS pad. I carefully soldered some wires to both boards and tweaked the driver software. After playing with pull-up resistors while watching the pulses on an oscilloscope, I finally got both cameras to capture frames at exactly the same time. The following GIF shows what both cameras captured, and the setup I used.
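If you don’t have an oscilloscope to hand, a spare Pico makes a serviceable stand-in for checking that the XVS pulses line up. Here’s an illustrative MicroPython sketch along those lines, not my actual setup: the pin numbers are placeholder choices, the trigger edge may need flipping depending on the pulse polarity, and the camera boards’ sync pads run at 1.8 V logic, so check the voltage levels before wiring anything up.

```python
# Illustrative MicroPython sketch for a Pico: watch the XVS pulses from two
# cameras and print their timestamps so you can check that they line up.
# Assumptions: each camera's XVS pad is wired (via level shifting; the pads
# are 1.8 V logic) to GPIO 14 and 15. Pin choices are arbitrary placeholders.
from machine import Pin
import time

pulses = []

def make_handler(name):
    # Soft IRQ handler: record which camera pulsed, and when (microseconds).
    def handler(pin):
        pulses.append((name, time.ticks_us()))
    return handler

cam_a = Pin(14, Pin.IN, Pin.PULL_UP)
cam_b = Pin(15, Pin.IN, Pin.PULL_UP)
# IRQ_FALLING assumes an active-low pulse; swap to IRQ_RISING if yours isn't.
cam_a.irq(trigger=Pin.IRQ_FALLING, handler=make_handler("A"))
cam_b.irq(trigger=Pin.IRQ_FALLING, handler=make_handler("B"))

time.sleep(1)  # collect a second's worth of frame pulses
for name, t in sorted(pulses, key=lambda p: p[1]):
    print(name, t)
```

If the two cameras are properly synchronised, the printed A and B timestamps should come in near-identical pairs, one pair per frame.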
External triggering
Having achieved this first goal, the team moved the goalposts and gave me another challenge: they wanted me to test external triggering, to see if I could make the camera capture a frame in response to an external source such as a Raspberry Pi Pico. Pulsing the XTR pin on the Global Shutter Camera low for a given time causes it to capture a frame with that exposure time. This allows for all kinds of synchronisation fun.
After rebuilding the Linux kernel again — a rather tedious process — and playing with some drivers, I managed to get this to work. We could use the Raspberry Pi Pico to send a pulse, and the camera would respond. The final piece of the puzzle was writing a bit of code to trigger a signal from the Pico with the right frame rate and exposure time. Finally, we were ready to start capturing!
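Here’s a sketch of the sort of thing the Pico ends up running. It’s a minimal MicroPython example rather than my exact code: the GPIO number, frame rate, and exposure are placeholder assumptions, and it relies on XTR being active low, so a PWM signal with an inverted duty cycle produces one exposure-length low pulse per frame.

```python
# Minimal MicroPython sketch for a Pico driving the Global Shutter Camera's
# XTR pad. Assumptions: GPIO 28 is wired to XTR (mind the voltage levels),
# and the sensor driver has already been switched into external trigger mode.
from machine import Pin, PWM

FRAMERATE = 30        # frames per second we want to trigger
EXPOSURE_US = 5_000   # exposure time in microseconds = width of the low pulse

frame_us = 1_000_000 // FRAMERATE  # one frame period in microseconds

# XTR is active low: the sensor exposes while the pin is held low. A PWM
# signal at the frame rate, high for everything except the exposure window,
# therefore gives one correctly sized trigger pulse per frame.
xtr = PWM(Pin(28))
xtr.freq(FRAMERATE)
xtr.duty_u16(int((1 - EXPOSURE_US / frame_us) * 65535))
```

Change EXPOSURE_US and the camera exposes for exactly that long on every frame, which is what makes this approach so handy for synchronisation.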
3D-printed rig
Our Maker in Residence, Toby, created a 3D model of the camera rig. You can see close-ups of this rig in the previous photos. It took quite a bit of work to get each camera aligned and focused on exactly the same point, but once that was done it was time to power up the rig.
I had each Raspberry Pi record a ten-second video, the frames from which could then be stitched together using FFmpeg. This gave us our final result.
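The stitching step is straightforward. Here’s a rough sketch of how you might do it with Python driving FFmpeg; the filenames, camera count, and frame index are all placeholder assumptions, and it relies on the recordings being hardware-synchronised so that frame N is the same instant on every camera.

```python
#!/usr/bin/env python3
# Sketch: pull the same frame out of every camera's recording, then stitch
# the stills into a short bullet time clip. Filenames, camera count, and the
# chosen frame index are placeholder assumptions.
import subprocess

NUM_CAMERAS = 4     # however many cameras are on your rig
FRAME_INDEX = 150   # frame to freeze on (150 = 5 s into a 30 fps clip)

# Extract frame FRAME_INDEX from each camera's video as a PNG still.
for cam in range(NUM_CAMERAS):
    subprocess.run([
        "ffmpeg", "-y",
        "-i", f"cam{cam}.h264",
        "-vf", f"select=eq(n\\,{FRAME_INDEX})",  # keep only that one frame
        "-frames:v", "1",
        f"frame{cam:03d}.png",
    ], check=True)

# Play the stills back in camera order to sweep around the frozen subject.
subprocess.run([
    "ffmpeg", "-y",
    "-framerate", "10",       # speed of the sweep
    "-i", "frame%03d.png",
    "-pix_fmt", "yuv420p",    # broad player compatibility
    "bullet_time.mp4",
], check=True)
```

Changing FRAME_INDEX lets you freeze a different moment, and extracting several indices per camera lets you mix the frozen sweep with ordinary motion.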
If you’re interested in trying this out yourself, check out the Raspberry Pi Camera documentation. The sections on synchronous captures and triggering provide more detailed guidance.
And finally, here’s the result! I managed to capture a Pi that seems to (almost) float in mid-air!