Google Glass Circular Video Buffer #throughglass

Having recently had a chance to get hacky with Google Glass, I put together an app that I have been wanting for quite a while. For now it is called Circular Buffer. Essentially it keeps a circular buffer of the last 15 to 20 seconds of Glass camera preview frames. Tapping the “Save” button writes the frames out to an MP4 with H.264 video using JavaCV, giving you a video file of the last 15 or 20 seconds #throughglass. (Right now the video files do not have sound and are stored on the “sdcard” of the device. That means you have to connect via USB and use the “adb” tool to download the files.)
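For illustration, the buffering idea can be sketched in plain Java. This is my sketch, not the app's actual code; the class name, the frames-as-byte-arrays representation, and the seconds-times-fps sizing are all assumptions.

```java
import java.util.ArrayDeque;

// Sketch of a circular frame buffer: keep only the last N seconds of
// camera preview frames, evicting the oldest as new frames arrive.
class FrameRingBuffer {
    private final ArrayDeque<byte[]> frames = new ArrayDeque<>();
    private final int capacity; // max frames ~= seconds * fps

    FrameRingBuffer(int seconds, int fps) {
        this.capacity = seconds * fps;
    }

    // Called from the camera preview callback for every new frame.
    synchronized void add(byte[] frame) {
        if (frames.size() == capacity) {
            frames.removeFirst(); // drop the oldest frame
        }
        frames.addLast(frame);
    }

    // On "Save": copy the buffered frames out for encoding to MP4.
    synchronized byte[][] snapshot() {
        return frames.toArray(new byte[0][]);
    }
}
```

Whatever the real implementation looks like, the key property is the same: memory use is bounded by the buffer length, and "Save" only has to encode what is already sitting in the buffer.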

This is a “native” Android app rather than using the approved method of writing Glass apps, the Mirror API. In order to use it, you have to either launch it through the Android developer tools (adb) or install “launchy”.

Test Video

Interested in trying it out or building on it? Go for it:

they wear their computers on their bodies

Gargoyles represent the embarrassing side of the Central Intelligence Corporation. Instead of using laptops, they wear their computers on their bodies, broken up into separate modules that hang on the waist, on the back, on the headset. They serve as human surveillance devices, recording everything that happens around them. Nothing looks stupider; these getups are the modern-day equivalent of the slide-rule scabbard or the calculator pouch on the belt, marking the user as belonging to a class that is at once above and far below human society.

Snow Crash by Neal Stephenson

Android Live Streaming courtesy of JavaCV and FFMPEG

For the last little (or should I say, long) while, I have been working on wrangling a solution for live streaming from Android that is both decent quality and extensible. For those of you interested, the litter in my GitHub account documents various previous attempts.

As far as I can tell, most folks who are streaming live video from Android are relying upon the video captured by the MediaRecorder and underlying classes and doing a bit of trickery with the file, either while it is sent to the server or on the server for distribution. This is fine, but it doesn’t give you hooks into the actual video frames before they are encoded and sent over the network.
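The "trickery" usually amounts to pointing MediaRecorder's output at something other than a plain file. A hedged sketch of one common variant, under my own assumptions (the class and method names here are illustrative, and a real implementation also has to deal with the MP4/3GP headers being written out of order):

```java
// Sketch: MediaRecorder writing into a pipe so the encoded bytes can
// be shipped to a server as they are produced, instead of waiting for
// a finished file. Not the author's code.
import android.media.MediaRecorder;
import android.os.ParcelFileDescriptor;

public class PipedRecorderSketch {
    public static MediaRecorder startPipedRecorder(ParcelFileDescriptor writeEnd)
            throws Exception {
        MediaRecorder recorder = new MediaRecorder();
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        // The encoded stream goes into the pipe instead of a file; a
        // background thread reads the other end and pushes bytes to
        // the server.
        recorder.setOutputFile(writeEnd.getFileDescriptor());
        recorder.prepare();
        recorder.start();
        return recorder;
    }
}
```

Note that in this approach the frames only ever exist in encoded form, which is exactly the limitation described above: there is no hook to inspect or modify a frame before it hits the encoder.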

Recently, I came across Samuel Audet’s amazing open source JavaCV project. In that project Samuel is wrapping FFMPEG’s underlying libraries (avcodec, avformat, and so on) using his equally amazing JavaCPP project to expose their functionality to any Java application.

Finally, after a few weeks of experimentation and a little (actually a LOT) of help from Samuel himself, I have something working!

Running the live streaming app on a Galaxy Camera

App and resulting stream on desktop via Wowza Media Server and Flash

There is a quick example of writing a file up on the JavaCV site which provides the foundation:
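The pattern in that example looks roughly like this. This is a hedged sketch, not a copy of the JavaCV example: it uses the current org.bytedeco package names (the releases from this era lived under com.googlecode.javacv), and the URL, dimensions, and frame rate are placeholders.

```java
import org.bytedeco.javacv.FFmpegFrameRecorder;

public class StreamingSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder RTMP URL; a file path like "/sdcard/out.mp4"
        // works here too for local recording.
        FFmpegFrameRecorder recorder =
                new FFmpegFrameRecorder("rtmp://example.com/live/stream", 320, 240);
        recorder.setFormat("flv"); // FLV container is what RTMP expects
        recorder.setFrameRate(24);
        recorder.start();

        // On Android, each Camera preview callback would be converted
        // into a Frame (or IplImage in older JavaCV versions) and fed
        // to the recorder:
        // recorder.record(frame);

        recorder.stop();
        recorder.release();
    }
}
```

The point of going through FFmpegFrameRecorder rather than MediaRecorder is that record() is called with each raw frame, which is exactly the hook into pre-encoding frames that the MediaRecorder approach lacks.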

I have the beginnings of a full blown project (which needs some updating based on the above example) up on GitHub:

Local Report 2012 | Creative Time Reports

The piece that I helped Robert Whitman create is up on Creative Time.  

We had 90 or so callers send in video and make phone calls via a custom iPhone app, an Android app, and a regular phone number over the course of an hour.

Check it out:

Obscura Cam now in the Android Market!

I am happy to report that the app I have been working on in collaboration with Witness and the Guardian Project is now available in the Android Market.

The app, Obscura Cam, is the outcome of the first phase of our Secure Smart Cam project to create smart phone camera software which allows for greater privacy and security in the capturing and sharing of media.

Of course, it is all open source