Bullet Time Effect – Frozen Raspberry Pi

Some people said we were trying to build a Raspberry Pi time machine. In some ways they were right. Others compared it to the LHC at CERN and said we would warp time and space when they saw our pictures on Facebook https://www.facebook.com/raspberryPiFace. They were partly right too.

Raspberry Pi LHC

The Raspberry Pi Large Hadron Collider? It felt like we were building it.

And on the Friday night before the Jam it really did feel like we were building the LHC! With nearly half a kilometre of network cable, 48 Raspberry Pis fitted with cameras, and PiFace Control and Displays, we wondered if we’d finally been too ambitious with a project!

It started over 13 years ago, when I saw the film The Matrix and a BBC documentary called Supernatural: The Unseen Powers of Animals. What linked the two programmes was a camera effect in which time would stop and the camera would move round the scene. I spent days trying to work out how the effect worked – I accepted that The Matrix used computer-generated graphics, but surely the animals in a BBC Natural History Unit documentary had to be real? My faith in wildlife documentaries was restored when, a few years later, I discovered how the effect was created.

The effect is called bullet time, or time-slice, and consists of taking pictures from a long line of cameras at the same instant, but playing them back one after another. Because all the frames are captured at exactly the same moment, the action is frozen; but when the frames from the cameras, each with a different view, are shown in order, it gives the effect of moving around a scene while time stands still. On paper it seemed quite simple: to create a bullet time effect you just need a lot of cameras that trigger in perfect sync. The camera rigs used for TV and film are as impressive as their budgets, with tens of digital SLR cameras triggered by a slave cable. I could barely afford one DSLR camera, never mind twenty-odd – nor the tripods to mount them on! I thought my dream of creating my own bullet time effect was simply unaffordable. That is, until I had an idea back in June this year.

Raspberry Pi Snap Camera

PiFace Snap-Camera turns the Pi into a simple, easy-to-build digital camera

I’d been working on a PiFace interface so I could use my Raspberry Pi without a keyboard and monitor. For a bit of fun I wondered if I could turn it into a simple digital camera that would take a picture when a button was pressed, and to my pleasant surprise discovered that I could. An idea was beginning to form in my head. If I wrote a bit more code, then instead of pressing a button to take a picture, I could trigger it remotely over a network. Furthermore, it cost a lot less than any other digital camera. Could the Raspberry Pi really recreate a bullet time style effect?
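As a rough sketch of that idea – not the actual snap-camera code – a Pi can sit waiting for a one-word command on the network and shell out to raspistill when it arrives. The port number, command word and filename below are all made up for illustration:

```python
import socket
import subprocess


def build_capture_command(filename, width=1024, height=768):
    """Build a raspistill command line for a single still.

    Flags: -t is the preview delay in ms, -w/-h the image size,
    -o the output file.
    """
    return ["raspistill", "-t", "1",
            "-w", str(width), "-h", str(height),
            "-o", filename]


def listen_for_trigger(port=9000):
    """Block until a 'SNAP' datagram arrives, then capture one image."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        data, addr = sock.recvfrom(64)
        if data.strip() == b"SNAP":
            subprocess.call(build_capture_command("image.jpg"))
```

With something like this running on every Pi, one broadcast packet triggers the whole ring at (almost) the same instant – which is the whole trick.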

I have various crazy ideas, and normally they subside after a few days, but this one wouldn’t. I decided to buy four Raspberry Pi cameras to see if I could trigger them all at once. I laser-cut a frame to mount them and it appeared to work, but with four cameras I couldn’t really tell how good the effect was. I bought another four, with the justification that if it didn’t work I’d either sell the cameras or use them in workshops I ran with young people. Eight cameras showed promise but nothing conclusive; there was only one way to find out.


Sometimes you wonder if a project was a bit too ambitious

It’s a funny feeling, carrying nearly 50 computers in a box barely big enough to hold a laptop. In contrast, after multiplying the original mounting frame by 12, I’d ended up with a circle 3 m wide. This was when I first realised I was building something significant. After much plugging, wiring, mounting and SD-card package installing, the rig was finally ready. It certainly looked impressive, particularly given that it was wider than the office. As such, the first time we tried the full rig was at the Manchester Raspberry Jam.

I’d mentioned to Ben Nuttall – general star, organiser of the Manchester Raspberry Jam, and soon to be working for the Foundation – that I’d be bringing the Frozen Pi rig, but for some reason he didn’t automatically associate this with a 3 m ring of 48 Raspberry Pis, half a kilometre of network cable, a few industrial network switches and enough power adapters to terrify most caretakers. We were generously squeezed into a corner of Madlab at the at-capacity Jam.

A packed Manchester Raspberry Jam


Soon, pictures of the rig were being posted on Facebook and Twitter, yet we still hadn’t actually tested the full rig. The first eager volunteer bravely strode into the circle. We pushed the button and the red camera lights came on in unison. A few moments later the first images had been collected over the network and were being stitched into a video. To everyone’s amazement, including mine, it actually worked! Raspberry Pi had frozen time, recreating a Hollywood effect for a fraction of the cost. You can see the results in the video above: http://youtu.be/IqoA4HeBCQ4?t=2m19s
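The stitching step is conceptually simple: once the images arrive at the central machine, they just need to be put into playback order by camera index so the frames sweep around the ring. A minimal sketch of that ordering (assuming filenames like cam07.jpg, which is an invented naming scheme, not necessarily the one the real collector script uses):

```python
import re


def playback_order(filenames):
    """Sort camera images by the index embedded in their names,
    e.g. 'cam7.jpg' -> 7, so the frames sweep around the rig in order.

    Files with no number sort first, so stray files are easy to spot.
    """
    def index_of(name):
        match = re.search(r"(\d+)", name)
        return int(match.group(1)) if match else -1

    return sorted(filenames, key=index_of)
```

The ordered list can then be fed to a video encoder such as ffmpeg as a numbered frame sequence.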

The setup wasn’t perfect – the floor in Madlab is characterful and the rig was a bit low, which sadly meant we chopped Liz Upton’s head off when she had a go. Despite this, we showed it was possible. We’d learned a few things along the way – booting and installing software on 48 Raspberry Pis needs a bit of work, and the PiFace Control and Display proved essential for debugging: it showed what was going on, and let us push switches to trigger actions and see status without plugging in a monitor.


Our patented support system was perhaps a bit low

The navigation switch on PiFace Control and Display lets us set the index of each camera in the rig, which makes it easy to build the rig in different shapes.
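The index-setting logic itself is tiny – in essence the navigation switch just nudges a number up or down, wrapping around the ring. A sketch of that idea (the constant name, file path and the wiring to the actual PiFace CAD event handlers are all assumptions, not the real rig code):

```python
NUM_CAMERAS = 48  # size of the ring (assumed constant)


def adjust_index(current, direction):
    """Move the camera index one place up (+1) or down (-1),
    wrapping around the ring so 47 + 1 -> 0 and 0 - 1 -> 47."""
    return (current + direction) % NUM_CAMERAS


def save_index(index, path="/home/pi/camera_index"):
    """Persist the chosen index so it survives a reboot."""
    with open(path, "w") as f:
        f.write(str(index))
```

In the real thing the two directions of the navigation switch would call adjust_index via the PiFace CAD event listeners, with the current value shown on the LCD.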

The rig generated a lot of interest and hopefully, in line with the aims of the Raspberry Pi Foundation, has inspired youngsters to code and shown that computers can be used creatively. I could imagine an after-school club building its own rig with a class set of Raspberry Pis and cameras.

For those with just one Raspberry Pi, it’s still possible to play with time. Instead of freezing it with the bullet time effect, it’s easy to speed it up with a time-lapse, as shown in the previous blog post.

We’ve still got a few tweaks to make, but we’ve got loads of things planned to shoot with Frozen Pi. We’re looking for suggestions too! If you’ve got an idea, or want to see more videos, get in touch via our Facebook page.

Tech Specs

For those of you interested in the gory techie details, here are the stats:

  • 48 Raspberry Pi Model Bs
  • 48 Raspberry Pi Cameras
  • 48 PiFace Control and Displays
  • 48 NOOBS SD cards
  • 48 5V PSUs
  • About half a kilometre of network cable
  • 2 x 24 port switches
  • 1 wireless router
  • Custom laser cut frame
  • Enough extension cables plugged into a single socket to scare most caretakers
  • Python script that listens over the network for a command to take a picture (included in the snap-camera package) https://github.com/piface/snap-camera
  • Python script to collect the images over the network and assemble the frames in order

Timeslice is a trademark of Time-slice films and Bullet time is a trademark of Warner Bros.


31 thoughts on “Bullet Time Effect – Frozen Raspberry Pi”


  >> “The navigation switch on PiFace Control and Display allows us to set the index of the camera in the rig,…”

    While it is a nice way of showing how to make use of the PiFace features, I have a better idea to set the indices. (I don’t know if it is feasible, but if it is, it will reduce the chore.) My idea is to semi-automate the index-setting process.
    In this process,
    1) you switch on the RasPis,
    2) all RasPis get on the network and start listening to a central RasPi.
    3) When the central RasPi indicates it is ready (using the PiFace, of course), you just go around the rig, pressing a button on each of the RasPis ***in the right order***.
    4) Every press makes the corresponding RasPi get the right index.

    How this works: every time the trigger is pressed on the PiFace, its RasPi gets the ID from the central server. The central server simply increments the counter internally, before responding to the next requesting RasPi.

    With this approach, you only spend a fraction of a second on each RasPi, instead of several seconds.


  6. Better than bullet time, you have a 360-degree scanning rig for photogrammetry. By reducing the density and spreading the cameras out into a more spherical setup you could probably get some amazing scans. We do this where I work with 56 Sony A99s. I toyed with the idea of trying it with the few Raspberry Pis I have but lost interest. This is so inspiring I may have to grab a few more for testing. Feed some of these images into Agisoft and see what it does!


  9. Absolutely love the idea, the execution, and you sharing all of it! Brilliant.

    I wonder whether you can eliminate all the network cabling by daisy-chaining the RasPis, using the GPIO ports to connect the output of one to the input of the next in the chain. A challenge might be to control the real-time delay between them – ideally the whole setup should be able to trigger (almost) simultaneously (at least the delay between two exposures should be constant). The chaining will also implicitly determine the order – which may no longer be required as each Pi would simply trigger once it gets the trigger signal from its predecessor, plus a configurable delay.

    Phase I would be initialization, where the delay is communicated to all Pis in the chain (and maybe a “health check” is communicated back), and phase II would be the actual exposure. You could chain any number of Pis without using switches or lots of network cabling. Maybe the I2C bus could be used?

    • Glad you like the idea. It’s a good point about chaining a trigger on the GPIO – we did consider it. In the end we used network cabling so we could transfer the images to a central machine too. Because the Control and Display boards have an infrared receiver, we wondered about using that too.

  10. It’s definitely an interesting setup. I am also using multiple Raspberry Pi cameras, but I found that it’s very difficult to get photos with a consistent colour cast. Some photos are a bit green, and some are a bit yellow. I have specified the same awb and exposure parameters on all cameras. Have you come across this problem? Any hint will be welcome, thank you!

  11. I love the project, congratulations! Can you post the parameters with which you run the raspistill command? I would like to see the values of shutter speed, exposure, awb, etc. that you use. Thanks!


  16. We have done tests using studio lights, and these really are needed when freezing the action of moving people – we didn’t measure the number of lumens. When we did our workshops with young people we didn’t use the studio lights, and found there was some motion blur.

  17. Yes, the motion blur and the noise in each photo are the handicaps we have to solve. I guess there is a direct relation between illumination and the freeze quality. In the tests with studio lights, did you find the minimum shutter speed (ss parameter) needed to get noiseless photos? How many ns?


  20. Pingback: ‘Hacker’ is not a Dirty Word. | Hackspace Blog with MrC

    • Although you’d have to move the camera very fast to match the speed of triggering all the cameras at once, and use an extremely fast shutter to avoid motion blur. We struggled with our shutter speeds as the room was quite dark.
