Video Comments – Revisited

A few years ago, John Schimmel and I worked on an in-time commenting system for video. Specifically, we made a WordPress plugin that interfaced with the built-in WordPress commenting system, including user authentication, spam prevention, and so on.

Unfortunately, it no longer works out of the box because we used the QuickTime plugin for video playback, and browser support for that plugin is waning.

Yesterday, I did a quick and dirty update to allow the plugin to use HTML5 video rather than QuickTime. To my delight, it mostly works: Video Commenting Test (try it in Safari or Chrome, as the video is MP4/H.264).

What still needs to be done is to update the Admin interface to allow multiple video sources and MIME type selections for HTML5 video, and to remove the QuickTime-specific portions.
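
For the curious, here is a rough sketch of the kind of player element those settings would ultimately produce, put together with a bit of TypeScript on the front end. This is not the plugin’s actual code, and the file names, MIME types, and container id are placeholders.

    // Rough sketch only: an HTML5 <video> with multiple <source> children so each
    // browser can pick a format it supports.
    const video = document.createElement("video");
    video.controls = true;
    video.preload = "metadata";

    const sources: Array<{ src: string; type: string }> = [
      { src: "comment-video.mp4", type: "video/mp4" },   // Safari, Chrome
      { src: "comment-video.webm", type: "video/webm" }, // Firefox, Chrome, Opera
    ];

    for (const { src, type } of sources) {
      const source = document.createElement("source");
      source.src = src;
      source.type = type;
      video.appendChild(source);
    }

    document.getElementById("video-comment-player")?.appendChild(video);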

Also, I would love to put an HTML5 canvas on top of this and let people make spatial, in-time comments!
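
To make that idea concrete, here is a sketch (not anything in the plugin yet, and the element ids are made up): a transparent canvas sitting on top of the player records where and when a viewer clicks, so each comment gets a position as well as a timestamp.

    // Sketch of spatial, in-time commenting: overlay a canvas on the video and
    // record where and when the viewer clicks. Element ids are hypothetical.
    const player = document.querySelector<HTMLVideoElement>("#comment-video")!;
    const overlay = document.querySelector<HTMLCanvasElement>("#comment-overlay")!;

    interface SpatialComment {
      x: number;    // click position in canvas pixels
      y: number;
      time: number; // video currentTime in seconds
    }

    const comments: SpatialComment[] = [];

    overlay.addEventListener("click", (event) => {
      const rect = overlay.getBoundingClientRect();
      const comment: SpatialComment = {
        x: event.clientX - rect.left,
        y: event.clientY - rect.top,
        time: player.currentTime,
      };
      comments.push(comment);

      // Draw a small marker where the comment was left.
      const ctx = overlay.getContext("2d");
      if (ctx) {
        ctx.beginPath();
        ctx.arc(comment.x, comment.y, 6, 0, Math.PI * 2);
        ctx.strokeStyle = "red";
        ctx.stroke();
      }
    });

Each of those x/y/time records could then be posted to the same WordPress comment endpoint the plugin already uses, alongside the comment text.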

If you are interested, I put it up on GitHub (make sure you use the html5 branch). Pull requests are welcome!

Android Live Streaming courtesy of JavaCV and FFMPEG

For the last little (or should I say, long) while, I have been working on wrangling a solution for live streaming from Android that is both decent quality and extensible. For those of you who are interested, the litter in my GitHub account documents various previous attempts.

As far as I can tell, most folks who are streaming live video from Android are relying upon the video captured by MediaRecorder and its underlying classes, and doing a bit of trickery with the file either while it is being sent to the server or on the server itself for distribution. This is fine, but it doesn’t give hooks into the actual video frames before they are encoded and sent over the network.

Recently, I came across Samuel Audet’s amazing open source JavaCV project. In that project, Samuel wraps FFmpeg’s underlying libraries (avcodec, avformat, and so on) using his equally amazing JavaCPP project to expose their functionality to any Java application.

Finally, after a few weeks of experimentation and a little (actually a LOT) of help from Samuel himself, I have something working!

Running the live streaming app on a Galaxy Camera

App and resulting stream on desktop via Wowza Media Server and Flash

There is a quick example of writing a file up on the JavaCV site which provides the foundation: https://code.google.com/p/javacv/source/browse/samples/RecordActivity.java

I have the beginnings of a full blown project (which needs some updating based on the above example) up on GitHub: https://github.com/vanevery/JavaCV_0.3_stream_test

Re: Networked Video in 10 Years : Networked Video == Parseable Video | Not sLop

Interesting, I just got some comment spam on this post from January 2007: Networked Video in 10 Years : Networked Video == Parseable Video | Not sLop.

In the post, I describe the proceedings from a breakout group at that year’s Beyond Broadcast conference.  My conclusion was that online video needs to be more than just video online, that it needs to be parseable (indexed and hyper-linkable and so on).

Unfortunately, for the most part, online video now is pretty much the same as it was then. Typically it exists on a web server as a file, is embedded in a web page with a bit of textual information around it, and that’s it. Not a lot of interactivity or time-based metadata as part of it. Certainly not parseable in the way described in the post. No easy way to link to specific content or to associate content on the page with any particular point in time in the video.

Fortunately, while that is still mostly the case, it isn’t always the case. The good folks at Mozilla have been working on an open source JavaScript library called Popcorn.js that allows any time-based media (audio/video) to execute code, manipulate a page, display other content, and so on. They have even created a GUI so you don’t have to be a JavaScript programmer in order to take advantage of it. Nice!
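
To give a flavor of what that looks like, here is a minimal sketch assuming Popcorn.js is already loaded on the page; the element ids and times are made up.

    // Minimal Popcorn.js sketch: run some code at a given point in the video's timeline.
    declare const Popcorn: any; // Popcorn.js doesn't ship TypeScript typings

    const pop = Popcorn("#demo-video");

    // Fire a callback when playback reaches the 10-second mark.
    pop.cue(10, () => {
      const notes = document.getElementById("notes");
      if (notes) {
        notes.textContent = "This text appeared because the video hit 0:10.";
      }
    });

    pop.play();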

I spent last week, during ITP’s Teach Yourself JavaScript Together, getting familiar with Popcorn and then gave a workshop showing it off (as part of an overall HTML5/JavaScript media workshop). If you are interested, here is the talk (jump in at a little more than 2 minutes):

YouTube Link and the notes are here.

Bringing in a generic H.264 stream to Wirecast (via Wowza and Wirecast’s Generic IP Camera support)

Wirecast is truly a studio in a box. It has great support for multiple cameras, mixing live and recorded sources, graphic overlays, and so on. Recent versions even allow you to bring in live feeds from IP cameras, including support for specific Axis cameras.

Since I am a big fan of IP cameras, and Axis in particular, this is great news. Unfortunately, Wirecast doesn’t have direct support for most models, and I had to dig quite a bit to get things working using their “Generic” IP camera support.

The first test was to get a straight H.264 stream encoded into Wowza and then out to Wirecast. To do this, I used the Flash Media Live Encoder (FMLE) and set it to publish to “rtmp://localhost:1935/img” (I have Wowza running on my local machine and an application called “img” which is a copy of Wowza’s “live” application). I set the stream name in FMLE to “media.sav”, which is what Wirecast looks for by default.

In Wirecast’s Source Settings, I added a new IP Camera, set its IP address to “127.0.0.1:1935”, and chose “Generic” as the type.

Voila! It works: the video is captured and encoded by FMLE, sent to Wowza, and pulled into Wirecast as a Generic IP camera. In this manner, I can have live cameras from anywhere in the world, fed through FMLE, brought into my final stream.

(Big thanks to Steve McFarlin, the developer of the LiveU iPhone broadcasting app, for his post on Wirecast’s forum detailing how he got his software working.)

Local Report 2012 | Creative Time Reports

The piece that I helped Robert Whitman create is up on Creative Time.  

We had 90 or so callers send in video and make phone calls via a custom iPhone app, an Android app, and a regular phone number over the course of an hour.

Check it out: http://creativetimereports.org/2012/10/18/local-report/

Obscura Cam now in the Android Market!

I am happy to report that the app I have been working on in collaboration with Witness and the Guardian Project is now available in the Android Market.

The app, Obscura Cam, is the outcome of the first phase of our Secure Smart Cam project to create smartphone camera software that allows for greater privacy and security in the capturing and sharing of media.

Of course, it is all open source.

Robert Whitman’s ‘Passport,’ in Two States at Once – NYTimes.com

Nice article in the Times about Whitman’s latest piece.  I built and ran the video network that connected the two locations for the performance.  Scenes from each location were transmitted and shown live in the other location.

Robert Whitman’s ‘Passport,’ in Two States at Once – NYTimes.com.

The Secure Smart Camera App for Human Rights Video : Video For Change :: A WITNESS blog

Bryan at WITNESS put up a blog post concerning the app that I am working on along with other Guardian folks.

The Secure Smart Camera App for Human Rights Video : Video For Change :: A WITNESS blog.

It’s worth a look if you are interested in the intersection of human rights, mobile technology and citizen media. It’s an open source Android project too!