After two days of non-stop work, Hootsuite engineers Anubhav Mishra and Luke Kysow won the DevFest Hackathon in Vancouver, BC. Luke shares his experience below (originally posted on Medium):

From Saturday morning, November 8th through to Sunday night, Anubhav Mishra and I worked to build LeapSnap at the DevFest Hackathon in Vancouver. The idea for LeapSnap came from a Sci-Fi short I watched where everyone had chips implanted that would store their memories via video. Whenever they wanted to share their memories with friends, they would “cast” them onto any screen within view. While sitting at the dinner table, they could then wirelessly control the TV, cycle through to the memory they wanted to share, and then play it.

Here’s what we ended up building. The videos that play were shot from my phone and uploaded in real time. Each screen is controlled independently by the gestures, and the displays can be placed anywhere, as long as they are connected to the internet.

Our project involved three parts: recording the video, showing the list of videos on multiple displays, and then controlling all of the displays at once with simple hand gestures.

Video Streaming

“Her”-style video capture (thanks Toby!)

To record the video we used an Android app called Spydroid that wirelessly streamed video from my phone using the Real Time Streaming Protocol (RTSP). We then ran a program called ffmpeg on my laptop that read the video stream from the phone over Wi-Fi and saved it as a video file.


Displaying the Videos

As the videos were coming in, we needed a way to display them across multiple screens. We wrote a simple website that showed the list of videos in a cover flow style view.

Cover Flow style interface

Playing with Gestures

Leap Motion is a device that lets you control an application with hand gestures. We connected a Leap Motion controller to one laptop, which registered the gestures and pushed the commands down to every connected display, controlling them all at once!

Leap Motion Controller

Swiping to the left or right cycled through the videos, poking played a video, and a simple down-tap stopped it.

Hackathon Result

There were a ton of great projects that came out of the hackathon, from a distributed multiplayer game built with Arduino to a virtual reality game using Google Cardboard. The projects were judged on how they solved a difficult technical problem in an innovative way, and in the end we were lucky enough to be picked as the winners! We won two Chromecasts and an LG G Watch.

Luke and Mishra with their 3D-printed trophy

You can view our presentation along with the rest of the projects here. Thanks to Yaniv Talmor for organizing, all the judges for donating their time, and Google for sponsoring this awesome event!

Technical Details

As mentioned above, the video was streamed from my phone over RTSP using the Spydroid Android app. I initially tried to connect to the stream and save it as MP4 files using VLC, but most of the time it wouldn’t write a valid MP4 file. Instead, thanks to a friend’s advice, I switched to ffmpeg and it worked perfectly. I wrote a simple bash script to:

  • connect to the video stream
  • record for 10 seconds (for demo purposes)
  • generate a thumbnail
  • save the video and thumbnail in a dated directory

https://gist.github.com/lkysow/4002781c32ed2ac15d59
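
For reference, here’s a minimal sketch of what a script like that could look like. The stream address, output layout, and exact ffmpeg flags are assumptions for illustration, not the contents of the gist:

```bash
#!/bin/bash
# Sketch of the recording script; the phone's address and directory layout
# are assumptions, not the actual gist.
STREAM_URL="rtsp://192.168.1.10:8086"   # hypothetical Spydroid stream address
OUTDIR="videos/$(date +%Y-%m-%d)"       # dated directory for this session
mkdir -p "$OUTDIR"

NAME=$(date +%H%M%S)
VIDEO="$OUTDIR/$NAME.mp4"
THUMB="$OUTDIR/$NAME.jpg"

# Connect to the video stream and record 10 seconds (for demo purposes)
ffmpeg -i "$STREAM_URL" -t 10 -c copy "$VIDEO"

# Grab a frame one second in as the thumbnail
ffmpeg -i "$VIDEO" -ss 00:00:01 -vframes 1 "$THUMB"
```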

List of Videos API

When users open the webpage, the front end needs to get the list of videos. I hacked out a simple PHP script to provide an API endpoint that returns the list of videos and their URLs.

https://gist.github.com/lkysow/0fe45f25c9817381e131
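
A minimal sketch of an endpoint like that might look like the following; the directory layout and JSON field names are assumptions rather than the actual gist:

```php
<?php
// Sketch of a video-listing endpoint. The directory layout and field
// names are assumptions for illustration.
header('Content-Type: application/json');

$videos = [];
foreach (glob('videos/*/*.mp4') as $path) {
    $videos[] = [
        'url'   => '/' . $path,
        'thumb' => '/' . preg_replace('/\.mp4$/', '.jpg', $path),
    ];
}

echo json_encode($videos);
```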

Leap Motion

Mishra is the Leap Motion pro, and he integrated Leap Motion with our project. He wrote a Node app that listened for the Leap Motion gestures and pushed the commands down to all the connected clients using WebSockets. Leap Motion provides some built-in gestures, but in order to get the direction of a swipe, he had to do some math. Because Leap Motion runs at 300 fps, we had to debounce the events before sending them down to the clients.
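
Here’s a rough sketch of how a gesture broadcaster like that could be wired up. The library choices (leapjs and ws), the message format, and the debounce threshold are assumptions, not Mishra’s actual code:

```javascript
// Sketch of a gesture broadcaster: leapjs reads the controller, ws pushes
// commands to every connected display. Names and thresholds are assumptions.
const Leap = require('leapjs');
const WebSocket = require('ws');

const wss = new WebSocket.Server({ port: 8080 });

function broadcast(command) {
  wss.clients.forEach((client) => {
    if (client.readyState === WebSocket.OPEN) {
      client.send(JSON.stringify(command));
    }
  });
}

// Frames arrive hundreds of times per second, so debounce before
// pushing a command down to every display.
let lastSent = 0;
const DEBOUNCE_MS = 500;

Leap.loop({ enableGestures: true }, (frame) => {
  const now = Date.now();
  if (now - lastSent < DEBOUNCE_MS) return;

  frame.gestures.forEach((gesture) => {
    if (gesture.type === 'swipe' && gesture.state === 'stop') {
      // direction[0] is the x component: positive = right, negative = left
      broadcast({ action: gesture.direction[0] > 0 ? 'next' : 'previous' });
      lastSent = now;
    } else if (gesture.type === 'screenTap') {   // "poke" to play
      broadcast({ action: 'play' });
      lastSent = now;
    } else if (gesture.type === 'keyTap') {      // down-tap to stop
      broadcast({ action: 'stop' });
      lastSent = now;
    }
  });
});
```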

Clients

We hosted a simple HTML page that clients would load. The Cover Flow display was done using a jQuery plugin. JavaScript on the page connected to the Node app via a WebSocket, and clients queried the AJAX API to get the list of videos so they could render their display. The clients then just waited for gesture commands to come down over the WebSocket and updated their displays accordingly.

https://gist.github.com/lkysow/7b3c7850050761c27673
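
As a rough sketch, the client logic could look something like this. The endpoint path, message format, and the cover flow plugin’s method names are assumptions for illustration:

```javascript
// Sketch of the display page logic. The API path, message format, and the
// coverflow plugin calls are assumptions, not the actual gist.
$(function () {
  // Fetch the list of videos and render them into the cover flow
  $.getJSON('/videos.php', function (videos) {
    videos.forEach(function (video) {
      $('#coverflow').append(
        $('<img>', { src: video.thumb, 'data-video': video.url })
      );
    });
    $('#coverflow').coverflow(); // hypothetical jQuery cover flow plugin call
  });

  // Wait for gesture commands pushed down from the Node app
  var socket = new WebSocket('ws://' + window.location.hostname + ':8080');
  socket.onmessage = function (event) {
    var command = JSON.parse(event.data);
    switch (command.action) {
      case 'next':     $('#coverflow').coverflow('next');     break; // hypothetical plugin methods
      case 'previous': $('#coverflow').coverflow('previous'); break;
      case 'play':     $('#player')[0].play();  break; // #player: a <video> element on the page
      case 'stop':     $('#player')[0].pause(); break;
    }
  };
});
```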

Future

I think that sharing experiences and memories is something we all desire to do. Being able to do so effortlessly is the future. This project was an attempt to get somewhere near that idea. Stay tuned!

Luke

About the Author: Luke Kysow is a guy who does some stuff. He’s a software engineer on the Hootsuite Platform team. Follow him on Twitter at @lkysow.