Android Live Streaming courtesy of JavaCV and FFMPEG

For the last little, or should I say long, while I have been working on wrangling a solution for live streaming from Android that is both decent quality and extensible. For those of you interested, the litter in my GitHub account documents various previous attempts.

As far as I can tell, most folks streaming live video from Android rely upon the video captured by MediaRecorder and its underlying classes, then do a bit of trickery with the file, either while it is being sent to the server or on the server itself for distribution. This works, but it doesn’t give you hooks into the actual video frames before they are encoded and sent over the network.
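To make the "no hooks" point concrete, the usual MediaRecorder trick looks something like the sketch below: hand MediaRecorder a socket’s file descriptor instead of a file, and let the server untangle the resulting 3GP byte stream. Host, port, and encoder choices here are placeholders, and the last comment is the punchline: the frames are encoded inside MediaRecorder, so there is nowhere to get at them.

import java.net.Socket;

import android.media.MediaRecorder;
import android.os.ParcelFileDescriptor;

public class SocketRecorderSketch {
    // Stream MediaRecorder output to a TCP socket instead of a file.
    // The server receives a 3GP byte stream whose headers it must fix up,
    // since MediaRecorder normally seeks back to finalize the file.
    static MediaRecorder startToSocket(String host, int port) throws Exception {
        Socket socket = new Socket(host, port); // placeholder server
        ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);

        MediaRecorder recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H263);
        recorder.setOutputFile(pfd.getFileDescriptor());
        recorder.prepare();
        recorder.start(); // encoding happens internally; no per-frame hook
        return recorder;
    }
}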

Recently, I came across Samuel Audet’s amazing open source JavaCV project. In it, Samuel wraps FFmpeg’s underlying libraries (avcodec, avformat, and so on) using his equally amazing JavaCPP project to expose their functionality to any Java application.

Finally, after a few weeks of experimentation and a little (actually a LOT) of help from Samuel himself, I have something working!

Running the live streaming app on a Galaxy Camera

App and resulting stream on desktop via Wowza Media Server and Flash

There is a quick example of writing to a file up on the JavaCV site, which provides the foundation: https://code.google.com/p/javacv/source/browse/samples/RecordActivity.java

I have the beginnings of a full blown project (which needs some updating based on the above example) up on GitHub: https://github.com/vanevery/JavaCV_0.3_stream_test
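If you just want the shape of the API, here is a stripped-down sketch of what the recording side boils down to (JavaCV 0.3-era package names; the RTMP URL and dimensions are placeholders, and in the real app the YUV buffer is filled from the camera’s onPreviewFrame() callback rather than left blank):

import com.googlecode.javacv.FFmpegFrameRecorder;
import com.googlecode.javacv.cpp.opencv_core.IplImage;

import static com.googlecode.javacv.cpp.opencv_core.IPL_DEPTH_8U;

public class StreamSketch {
    public static void main(String[] args) throws Exception {
        String url = "rtmp://your.server.ip:1935/live/test.flv"; // placeholder
        int width = 320, height = 240;

        // FLV over RTMP, one audio channel
        FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(url, width, height, 1);
        recorder.setFormat("flv");
        recorder.setFrameRate(30);
        recorder.setSampleRate(44100);
        recorder.start();

        // NV21 preview frames fit a 2-channel, 8-bit IplImage of the same size
        IplImage yuvImage = IplImage.create(width, height, IPL_DEPTH_8U, 2);
        long startTime = System.currentTimeMillis();
        for (int i = 0; i < 300; i++) { // ~10 seconds of (blank) frames
            recorder.setTimestamp(1000 * (System.currentTimeMillis() - startTime));
            recorder.record(yuvImage);
            Thread.sleep(1000 / 30);
        }
        recorder.stop();
        recorder.release();
    }
}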

68 thoughts on “Android Live Streaming courtesy of JavaCV and FFMPEG”

  1. A huge spasibo (thank you) for the article. It really helped me to develop my live streaming on Android devices.

  2. Thanks, very nice tutorial… it has saved me a lot of time! Thanks again.

    Can you please provide some help with playing RTMP video on the Android side? That would be a great help to other developers.

    Thanks.

  3. Your app works for me, but I can’t figure out how the video is uploaded to the server and watched there. Can you please explain? I want live streaming: while the Android phone is recording, the server displays the video at the same time.
    Mainly, how do I set up a server? I own a domain; is that sufficient, or do I need a streaming server? If so, which server, and what are the steps?

  4. Thanks for the nice post!

    This code works great… I’m able to successfully see live video go from Android to the browser. But the main problem is the sync of audio and video: I see that audio runs ahead of video. Do you see the same problem, vanevery?

    Thanks in advance!

  5. Dear sir, good day.
    I am trying to use your project “JavaCV 0.3 Stream Test” with “udp:@/6504”, but it only works some of the time.
    It doesn’t give me 30 fps.
    I intend to go deeper into the code, but I can’t find the versions of the javacv, javacpp, and ffmpeg source that are correct for this project.
    All of the source code I download is incompatible with your project.
    Could you send me a link to the source code that you used (javacv, javacpp, and ffmpeg)?
    Regards, thank you.

  6. The performance will vary quite a bit from phone to phone, as phones are memory constrained and typically have weak processors. I suggest that you go to the source for the source, no pun intended: https://code.google.com/p/javacv/ I didn’t build from source there, just used the latest available versions.

  7. Hi,
    I want to record video in my app, which runs on an Android phone, and receive the stream on a tablet.
    I have been going through a lot of material on the internet, and I have successfully written the app where I receive the video on my tablet; that was a very straightforward process.
    I have also figured out which library to use; your post helped a lot.
    What I need to know is what changes have to be made on the server side,
    and the procedure for shooting a video and streaming it in real time to another device.
    I need a few tips and some guidance in the right direction.

  8. Hi,

    Which RTMP server do you use? I am testing with the NGINX RTMP module, and it doesn’t publish anything until you call recorder.stop(). Really weird.

    BR David

  9. Hi,
    I am using your JavaCV source code. I replaced your ffmpeg_link with my live streaming URL, but it’s not publishing video, and when I press the stop button nothing happens.
    But if I pass a local directory path with the file name test.flv, it works fine…

  10. Thanks for the tutorial. I have a question:
    is this code for streaming from Android to a browser? I’m looking for code to stream MJPEG from an IP camera to Android.
    Can anyone help?

  11. Shawn – I owe you a few beers if you ever pass through Atlanta. I can’t even begin to tell you how long I’ve been banging my head against a solution for streaming RTMP without using Adobe AIR!

    @Chris, check out SpyDroid. Most IP cameras run an internal server, and your clients just connect to that RTMP or RTSP stream. Good luck!

  12. Hi Shawn – Thanks for the good work. I downloaded and set up your project (https://github.com/vanevery/JavaCV_0.3_stream_test). Running it on a Samsung Galaxy S3 gives the following error.

    01-08 15:54:58.278: E/AndroidRuntime(29934): FATAL EXCEPTION: main
    01-08 15:54:58.278: E/AndroidRuntime(29934): java.lang.VerifyError: com/example/javacv/stream/test2/MainActivity
    01-08 15:54:58.278: E/AndroidRuntime(29934): at java.lang.Class.newInstanceImpl(Native Method)
    01-08 15:54:58.278: E/AndroidRuntime(29934): at java.lang.Class.newInstance(Class.java:1319)
    01-08 15:54:58.278: E/AndroidRuntime(29934): at android.app.Instrumentation.newActivity(Instrumentation.java:1057)
    01-08 15:54:58.278: E/AndroidRuntime(29934): at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2015)
    01-08 15:54:58.278: E/AndroidRuntime(29934): at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2125)
    01-08 15:54:58.278: E/AndroidRuntime(29934): at android.app.ActivityThread.access$600(ActivityThread.java:140)
    01-08 15:54:58.278: E/AndroidRuntime(29934): at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1227)
    01-08 15:54:58.278: E/AndroidRuntime(29934): at android.os.Handler.dispatchMessage(Handler.java:99)
    01-08 15:54:58.278: E/AndroidRuntime(29934): at android.os.Looper.loop(Looper.java:137)
    01-08 15:54:58.278: E/AndroidRuntime(29934): at android.app.ActivityThread.main(ActivityThread.java:4898)
    01-08 15:54:58.278: E/AndroidRuntime(29934): at java.lang.reflect.Method.invokeNative(Native Method)
    01-08 15:54:58.278: E/AndroidRuntime(29934): at java.lang.reflect.Method.invoke(Method.java:511)
    01-08 15:54:58.278: E/AndroidRuntime(29934): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1006)
    01-08 15:54:58.278: E/AndroidRuntime(29934): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:773)
    01-08 15:54:58.278: E/AndroidRuntime(29934): at dalvik.system.NativeStart.main(Native Method)

    Thanks for the help.

  13. Hi,

    I’ve tried your JavaCV_0.3_stream_test. I can stream the audio to the server, but I can’t stream the video. There is no error in LogCat, so I don’t know what happened. Did you face this issue? Thanks a lot.

  14. Hi,
    I am trying to get ffserver and ffmpeg to stream a webcam on a Linux box. Another option is to use javacv to stream live video from an Android phone. Can you share an ffserver.conf file and ffmpeg parameters that work with javacv?

  15. Hello! Based on this example I was able to stream the Android camera to an RTMP server, but I have a sync problem between audio and video. Did anyone face this and fix it? All the answers I’ve read about it say it’s CPU overload. I tested on different devices with different quality settings, bitrates, etc. It gets better sometimes, but never stays in sync. Thanks!
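    One approach that helps in practice is to drive the video timestamps from the wall clock and never let them run backwards, so the video lines up with the audio samples written from the audio thread. A sketch of that idea; it assumes the recorder exposes getTimestamp(), which later JavaCV releases do:

    import com.googlecode.javacv.FFmpegFrameRecorder;
    import com.googlecode.javacv.cpp.opencv_core.IplImage;

    public class SyncHelper {
    // Record one video frame stamped with elapsed wall-clock time.
    // FFmpegFrameRecorder timestamps are in microseconds.
    static void recordFrame(FFmpegFrameRecorder recorder, IplImage yuv,
    long startTimeMillis) throws FFmpegFrameRecorder.Exception {
    long t = 1000 * (System.currentTimeMillis() - startTimeMillis);
    // Assumption: getTimestamp() is available; never move the clock backwards.
    if (t > recorder.getTimestamp()) {
    recorder.setTimestamp(t);
    }
    recorder.record(yuv);
    }
    }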

  16. Hi everyone,
    I want to know whether editing the “ffmpeg_link” with my own media server link will publish the video to the server, or whether I need to do something else.
    Thanks in advance.

  17. Hello there,
    Nice tutorial. Please let me know whether this can be used with Red5, as you mentioned in one of your comments that you are using Wowza, and that is the main question. I am looking for an open source server: Red5 seems good but doesn’t support RTSP, and Wowza is paid. So please tell me whether, with the help of Red5, I can record video from my mobile device camera and show it live on another device connected to the server, or on a web site. I have spent a lot of time on this and have not found anything useful so far.

  18. Khurram: “i want to know whether editing the ‘ffmpeg_link’ with my own media server link will publish the video to the server or i need to do something else????”
    Answer: yes, you just edit in your own IP. Which server are you using?

  19. Hi… thanks for your tutorial. I successfully got the live stream after a bit of work and some updates to your code. It helped me a lot; I really appreciate your work. Thank you!

  20. Hi Shawn Van Every & David,

    The app is running properly, and I also installed Wowza, but I don’t know how to verify it with Wowza. Please suggest or provide the proper steps.

  21. Hi Shawn,

    First of all, I really appreciate that you have contributed your code, which helped me a lot and got me to the right place so quickly.

    I wanted to note that it also works with the Red5 server, since people might be interested in whether it does and how.
    I made a minor change to the URL string as follows:

    private String ffmpeg_link =
    "rtmp://123.123.123.123:1935/oflaDemo/red5StreamDemo";

    “oflaDemo” is a demo application bundled with Red5, and “red5StreamDemo” is the stream name for the Simple Broadcaster/Subscriber Flash applications.
    So you can check whether your code works by using the Simple Subscriber application before you turn on the debug switch of the server’s log writer.

    I’ve tested this great sample with Red5 server 1.0.1.

    Thanks again!

  22. Hi everyone, I will give you some tips on configuring Wowza:
    * Add a new application on the server with your app name.
    * Use the Adobe RTMP link shown in the server’s test player (typically of the form rtmp://your-server-ip:1935/yourApp/yourStream) to modify the ffmpeg_link in the code.

  23. Does not work on a Nexus 7.
    It is neither able to stream to an RTMP server nor record to internal storage.

  24. I use your JavaCV app.
    It works well, but I have a question: the streamed image is not so good. When the camera moves, the image has block noise.
    I would like to know how to improve the image quality.
    The most important thing is the block noise problem (when the camera moves to another position).
    English is not my first language, so my English is very poor; please understand.

  25. Thank you for your reply.
    I used your app with the desktop via Wowza media server,
    and I watched the stream video on both a smartphone and the desktop.
    On the Android phone I used an RTMP player,
    and on the desktop I watched via the Flash player in a web browser.

    The PC image and the smartphone image are the same quality.

    When I stream video that is not moving, the image quality is good.
    The problem is moving images: the image shows many blocks (block noise?).

    So I modified some code, for example:
    private int imageWidth = 620 -> 1280;
    private int imageHeight = 470 -> 720;
    private int frameRate = 24 -> 30;

    yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
    -> yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_32S, 2);
    but the image did not change.

    Thank you for listening.
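    A note on the changes above: block noise under motion is usually an encoder bitrate limit, not an image-depth issue. The camera preview data is 8-bit YUV regardless, so switching IPL_DEPTH_8U to IPL_DEPTH_32S only mismatches the buffer layout. The knob that matters is the recorder’s video bitrate; here is a sketch, assuming JavaCV’s FrameRecorder exposes the setVideoBitrate() setter (the value shown is illustrative):

    import com.googlecode.javacv.FFmpegFrameRecorder;

    public class QualityConfig {
    // Build a recorder with an explicit video bitrate; the library default
    // is modest, and motion at higher resolutions needs considerably more.
    static FFmpegFrameRecorder makeRecorder(String url, int w, int h, int fps) {
    FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(url, w, h, 1);
    recorder.setFormat("flv");
    recorder.setFrameRate(fps);
    recorder.setSampleRate(44100);
    recorder.setVideoBitrate(1000 * 1000); // ~1 Mbps; raise until blocking stops
    return recorder;
    }
    }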

  26. If I use ffplay to display the streaming video on the local network, I find that it does not work.
    I tried different formats for the ffmpeg_link, such as:
    rtmp://192.168.1.27:1935/livestream/12070660
    rtmp://live:live@192.168.1.27:8080/live/test.flv
    rtmp://192.168.1.27:1935

    I use ffplay to display it:
    ffplay rtmp://192.168.1.27:1935/livestream/12070660

    The error output is:
    RTMP_Connect0, failed to connect socket. 113 (No route to host)/0
    rtmp://192.168.1.27:1935/livestream/12070660: Unknown error occurred

  27. The app crashes frequently on stopping.
    To fix this, FIRST set runAudioThread = false;
    then stop and release the AudioRecord.
    DON’T stop and release the FFmpegFrameRecorder; just set recording = false; (to stop the work done in the onPreviewFrame callback) and set the recorder to null.

  28. Hi,
    I was able to receive my cam’s stream at my Red5 installation;
    it works quite well! :)

    Now I’d like the stream to be sent not via rtmp:// but via http://.
    That’s nothing special for ffmpeg, but it seems to be ‘special’ within this Java ffmpeg setup, am I right?
    (I wasn’t able to get the stream working properly…)
    Any clues as to what to do, or where to look?

    Anyway, thanks for the app :)

  29. @Dheeraj Sachan
    I’m having the same problem here, but I’m not sure how to solve it;
    what exactly did you change within MainActivity?

    I’ve found “runAudioThread” and set it to false,
    but after that, where shall I stop and release the AudioRecord, and how can I do that?
    Would you mind posting your changes?

    Thanks!

  30. Hi,

    I wonder: how should the video be handled on the server/client side? Can someone share web-side code for handling the streamed video, and say which technology is used?

    Thanks,

  31. @Mark
    I am posting the modified code. Just a few lines needed to be commented out; everything else works fine.

    package com.example.javacv.stream.test2;

    import android.app.Activity;
    import android.content.Context;
    import android.content.pm.ActivityInfo;
    import android.graphics.Bitmap;
    import android.graphics.Canvas;
    import android.graphics.Color;
    import android.graphics.ImageFormat;
    import android.graphics.Paint;
    import android.graphics.RectF;
    import android.hardware.Camera;
    import android.hardware.Camera.PreviewCallback;
    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;
    import android.os.Bundle;
    import android.os.PowerManager;
    import android.util.Log;
    import android.view.KeyEvent;
    import android.view.SurfaceHolder;
    import android.view.SurfaceView;
    import android.view.View;
    import android.view.View.OnClickListener;
    import android.widget.Button;
    import android.widget.LinearLayout;
    import com.googlecode.javacv.FFmpegFrameRecorder;
    import com.googlecode.javacv.cpp.opencv_core.IplImage;

    import java.io.IOException;
    import java.nio.ShortBuffer;

    import static com.googlecode.javacv.cpp.opencv_core.IPL_DEPTH_8U;

    public class MainActivity_old extends Activity implements OnClickListener {

    private final static String LOG_TAG = "MainActivity";

    private PowerManager.WakeLock mWakeLock;

    private String ffmpeg_link = "rtmp://yourserver/live/test.flv";

    //private String ffmpeg_link = "/mnt/sdcard/new_stream.flv";

    private volatile FFmpegFrameRecorder recorder;
    boolean recording = false;
    long startTime = 0;

    private int sampleAudioRateInHz = 44100;
    private int imageWidth = 320;
    private int imageHeight = 240;
    private int frameRate = 10;

    private Thread audioThread;
    volatile boolean runAudioThread = true;
    private AudioRecord audioRecord;
    private AudioRecordRunnable audioRecordRunnable;

    private CameraView cameraView;
    private IplImage yuvIplimage = null;

    private Button recordButton;
    private LinearLayout mainLayout;

    @Override
    public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
    setContentView(R.layout.activity_main);

    initLayout();
    initRecorder();
    }

    @Override
    protected void onResume() {
    super.onResume();

    if (mWakeLock == null) {
    PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
    mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, LOG_TAG);
    mWakeLock.acquire();
    }
    }

    @Override
    protected void onPause() {
    super.onPause();

    if (mWakeLock != null) {
    mWakeLock.release();
    mWakeLock = null;
    }
    }

    @Override
    protected void onDestroy() {
    super.onDestroy();

    recording = false;
    }

    private void initLayout() {

    mainLayout = (LinearLayout) this.findViewById(R.id.record_layout);

    recordButton = (Button) findViewById(R.id.recorder_control);
    recordButton.setText("Start");
    recordButton.setOnClickListener(this);

    cameraView = new CameraView(this);

    LinearLayout.LayoutParams layoutParam = new LinearLayout.LayoutParams(imageWidth, imageHeight);
    mainLayout.addView(cameraView, layoutParam);
    Log.v(LOG_TAG, "added cameraView to mainLayout");
    }

    private void initRecorder() {
    Log.w(LOG_TAG, "initRecorder");

    if (yuvIplimage == null) {
    // Recreated after frame size is set in surface change method
    yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
    //yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_32S, 2);

    Log.v(LOG_TAG, "IplImage.create");
    }

    recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
    Log.v(LOG_TAG, "FFmpegFrameRecorder: " + ffmpeg_link + " imageWidth: " + imageWidth + " imageHeight " + imageHeight);

    recorder.setFormat("flv");
    Log.v(LOG_TAG, "recorder.setFormat(\"flv\")");

    recorder.setSampleRate(sampleAudioRateInHz);
    Log.v(LOG_TAG, "recorder.setSampleRate(sampleAudioRateInHz)");

    // re-set in the surface changed method as well
    recorder.setFrameRate(frameRate);
    Log.v(LOG_TAG, "recorder.setFrameRate(frameRate)");

    // Create audio recording thread
    audioRecordRunnable = new AudioRecordRunnable();
    audioThread = new Thread(audioRecordRunnable);
    }

    // Start the capture
    public void startRecording() {
    Log.i(LOG_TAG, "startRecording()");

    try {
    recorder.start();
    startTime = System.currentTimeMillis();
    recording = true;
    audioThread.start();
    } catch (Exception e) {
    Log.e(LOG_TAG, "Exception in recorder.start()");
    e.printStackTrace();
    onDestroy();
    }
    }

    public void stopRecording() {
    Log.i(LOG_TAG, "stopRecording()");

    // This should stop the audio thread from running
    runAudioThread = false;

    if (recorder != null && recording) {
    recording = false;
    Log.v(LOG_TAG, "Finishing recording; dropping recorder without stop()/release()");
    // Intentionally not calling recorder.stop()/release() here; doing so crashed on some devices (see the fix described above).
    /*try {
    recorder.stop();
    recorder.release();
    } catch (Exception e) {
    e.printStackTrace();
    onDestroy();
    }*/
    recorder = null;
    }
    }

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
    // Quit when back button is pushed
    if (keyCode == KeyEvent.KEYCODE_BACK) {
    if (recording) {
    stopRecording();
    }
    finish();
    return true;
    }
    return super.onKeyDown(keyCode, event);
    }

    @Override
    public void onClick(View v) {
    if (!recording) {
    startRecording();
    Log.w(LOG_TAG, "Start Button Pushed");
    recordButton.setText("Stop");
    } else {
    stopRecording();
    Log.w(LOG_TAG, "Stop Button Pushed");
    recordButton.setText("Start");
    }
    }

    //———————————————
    // audio thread, gets and encodes audio data
    //———————————————
    class AudioRecordRunnable implements Runnable {

    @Override
    public void run() {
    // Set the thread priority
    android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

    // Audio
    int bufferSize;
    short[] audioData;
    int bufferReadResult;

    bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
    AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
    audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
    AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

    audioData = new short[bufferSize];

    Log.d(LOG_TAG, "audioRecord.startRecording()");
    audioRecord.startRecording();

    // Audio Capture/Encoding Loop
    while (runAudioThread) {
    // Read from audioRecord
    bufferReadResult = audioRecord.read(audioData, 0, audioData.length);
    if (bufferReadResult > 0) {
    //Log.v(LOG_TAG, "audioRecord bufferReadResult: " + bufferReadResult);

    // Changes in this variable may not be picked up despite it being "volatile"
    if (recording) {
    try {
    // Write to FFmpegFrameRecorder
    recorder.record(ShortBuffer.wrap(audioData, 0, bufferReadResult));
    } catch (Exception e2) {
    Log.v(LOG_TAG, e2.getMessage());
    e2.printStackTrace();
    onDestroy();
    }
    }
    }
    }
    Log.v(LOG_TAG, "AudioThread Finished");

    /* Capture/Encoding finished, release recorder */

    if (audioRecord != null) {
    audioRecord.stop();
    audioRecord.release();
    audioRecord = null;
    Log.v(LOG_TAG, "audioRecord released");
    }
    }
    }

    class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {

    private boolean previewRunning = false;

    private SurfaceHolder holder;
    private Camera camera;

    private byte[] previewBuffer;

    long videoTimestamp = 0;

    Bitmap bitmap;
    Canvas canvas;

    public CameraView(Context _context) {
    super(_context);

    holder = this.getHolder();
    holder.addCallback(this);
    // Deprecated, but required on pre-3.0 devices
    holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
    camera = Camera.open();

    try {
    camera.setPreviewDisplay(holder);
    camera.setPreviewCallback(this);

    Camera.Parameters currentParams = camera.getParameters();
    Log.v(LOG_TAG, "Preview Framerate: " + currentParams.getPreviewFrameRate());
    Log.v(LOG_TAG, "Preview imageWidth: " + currentParams.getPreviewSize().width + " imageHeight: " + currentParams.getPreviewSize().height);

    // Use these values
    imageWidth = currentParams.getPreviewSize().width;
    imageHeight = currentParams.getPreviewSize().height;
    frameRate = currentParams.getPreviewFrameRate();

    bitmap = Bitmap.createBitmap(imageWidth, imageHeight, Bitmap.Config.ALPHA_8);

    Log.v(LOG_TAG, "Creating previewBuffer size: " + imageWidth * imageHeight * ImageFormat.getBitsPerPixel(currentParams.getPreviewFormat()) / 8);
    previewBuffer = new byte[imageWidth * imageHeight * ImageFormat.getBitsPerPixel(currentParams.getPreviewFormat()) / 8];
    camera.addCallbackBuffer(previewBuffer);
    camera.setPreviewCallbackWithBuffer(this);

    camera.startPreview();
    previewRunning = true;
    } catch (IOException e) {
    Log.v(LOG_TAG, e.getMessage());
    e.printStackTrace();
    onDestroy();
    }
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    Log.v(LOG_TAG, "Surface Changed: width " + width + " height: " + height);

    // We would do this if we want to reset the camera parameters

    if (!recording) {
    if (previewRunning){
    camera.stopPreview();
    }

    try {
    //Camera.Parameters cameraParameters = camera.getParameters();
    //cameraParameters.setPreviewSize(imageWidth, imageHeight);
    //cameraParameters.setPreviewFrameRate(frameRate);
    //camera.setParameters(cameraParameters);

    camera.setPreviewDisplay(holder);
    camera.startPreview();
    previewRunning = true;
    }
    catch (IOException e) {
    Log.e(LOG_TAG,e.getMessage());
    e.printStackTrace();
    }
    }

    // Get the current parameters
    Camera.Parameters currentParams = camera.getParameters();
    Log.v(LOG_TAG, "Preview Framerate: " + currentParams.getPreviewFrameRate());
    Log.v(LOG_TAG, "Preview imageWidth: " + currentParams.getPreviewSize().width + " imageHeight: " + currentParams.getPreviewSize().height);

    // Use these values
    imageWidth = currentParams.getPreviewSize().width;
    imageHeight = currentParams.getPreviewSize().height;
    frameRate = currentParams.getPreviewFrameRate();

    // Create the yuvIplimage if needed
    yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
    //yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_32S, 2);
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
    try {
    camera.setPreviewCallback(null);

    previewRunning = false;
    camera.release();

    } catch (RuntimeException e) {
    Log.v(LOG_TAG, e.getMessage());
    e.printStackTrace();
    }
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {

    if (yuvIplimage != null && recording) {
    videoTimestamp = 1000 * (System.currentTimeMillis() - startTime);

    // Put the camera preview frame right into the yuvIplimage object
    yuvIplimage.getByteBuffer().put(data);

    // FAQ about IplImage:
    // - For custom raw processing of data, getByteBuffer() returns an NIO direct
    // buffer wrapped around the memory pointed to by imageData, and under Android we can
    // also use that Buffer with Bitmap.copyPixelsFromBuffer() and copyPixelsToBuffer().
    // - To get a BufferedImage from an IplImage, we may call getBufferedImage().
    // - The createFrom() factory method can construct an IplImage from a BufferedImage.
    // - There are also a few copy*() methods for BufferedImage <-> IplImage data transfers.

    // Let's try it..
    // This works but only on transparency
    // Need to find the right Bitmap and IplImage matching types

    bitmap.copyPixelsFromBuffer(yuvIplimage.getByteBuffer());
    //bitmap.setPixel(10,10,Color.MAGENTA);

    canvas = new Canvas(bitmap);
    Paint paint = new Paint();
    paint.setColor(Color.GREEN);
    float leftx = 20;
    float topy = 20;
    float rightx = 50;
    float bottomy = 100;
    RectF rectangle = new RectF(leftx,topy,rightx,bottomy);
    canvas.drawRect(rectangle, paint);

    bitmap.copyPixelsToBuffer(yuvIplimage.getByteBuffer());

    //Log.v(LOG_TAG, "Writing Frame");

    try {

    // Get the correct time
    recorder.setTimestamp(videoTimestamp);

    // Record the image into FFmpegFrameRecorder
    recorder.record(yuvIplimage);

    } catch (FFmpegFrameRecorder.Exception e) {
    Log.v(LOG_TAG, e.getMessage());
    e.printStackTrace();
    }
    }
    }
    }
    }

  32. Hi,
    I am also unable to stream video from an Android device to VLC/ffplay.
    What should I use in new FFmpegFrameRecorder("path", w, h, 1)?
    What is the path? Is it my Android phone’s IP, or my laptop’s IP?
    I am using the same network.

  33. Hi, I want to add camera switching to this. Can anyone give an example? Thank you.

  34. Hi,
    I am trying to see the live stream on a Red5 server using this code. I tried changing the link to
    private String ffmpeg_link = "rtmp://127.0.0.1:1935/oflaDemo/red5StreamDemo";
    but I cannot see any stream in the Red5 oflaDemo or subscriber demos. Can somebody help me achieve this? Do I need to write a client app for Red5, or does it automatically push the stream to my Red5 server, which is on my localhost 127.0.0.1?
    I read in the comment from “Yoshio Numai” that he achieved this, but how?

  35. Hi Roopa, you can achieve this on any RTMP server. Find and give the correct IP and port address and disable your firewall; you don’t need to write any clients.

  36. Hi, your document is very valuable to me. Thank you.
    By the way, in your code I tried to change H.263 to H.264 as below.

    Before:
    recorder.setFormat("flv");

    Change:
    recorder.setVideoCodec(28); // AV_CODEC_ID_H264 = 28
    recorder.setFormat("flv");
    recorder.setPixelFormat(0); // PIX_FMT_YUV420P = 0

    But Wowza couldn’t recognize the H.264 codec. Does your .so library contain H.264?

    Thank you in advance.
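    For reference, those magic numbers correspond to named constants in the JavaCV wrappers. A sketch using the names, assuming the 0.3-era avcodec/avutil bindings expose them; whether H.264 then actually works depends on the bundled FFmpeg having been built with an H.264 encoder such as libx264, which is the likely sticking point here:

    import com.googlecode.javacv.FFmpegFrameRecorder;

    import static com.googlecode.javacv.cpp.avcodec.AV_CODEC_ID_H264;
    import static com.googlecode.javacv.cpp.avutil.PIX_FMT_YUV420P;

    public class H264Config {
    // The same change as above, with named constants instead of 28 and 0.
    static void useH264(FFmpegFrameRecorder recorder) {
    recorder.setVideoCodec(AV_CODEC_ID_H264); // == 28
    recorder.setFormat("flv"); // FLV can carry H.264
    recorder.setPixelFormat(PIX_FMT_YUV420P); // == 0
    }
    }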

  37. Hello everybody:
    I am very interested in knowing how to stream live video and audio from my Android to my website. I must confess that I have searched around, and reading your comments I found that this is a very good app that almost everybody has been able to work with.
    Trying to get this app, I downloaded from the above link (“https://code.google.com/p/javacv/source/browse/samples/RecordActivity.java”) a 1 MB .zip with some files inside, and I really don’t know what to do with any of those files. I think I can change the ffmpeg_link to my server address, but nothing else. Please, I need some help, because I don’t know where to set up the server, what to do with those files, or what to do on my hosting… I just want to stream live video and audio to my website.
    Thanks in advance.

  38. Hi Peter, are you new to Android? This is an Android project; you need Eclipse to work with it. Install Eclipse, then select File > Import > General > Archive File > Next and browse to the zip file you downloaded from the link. The project will be imported into Eclipse. Then go to the src folder and open the MainActivity.java class; you will find the ffmpeg_link there.

  39. Hi Giri:
    Thanks for the help. Yes, I am very new to Android, to Java, to Eclipse…
    I did what you said but it didn’t work. I’m sorry to ask you this, but can you be a little more specific in your explanation? For that 1 MB zip I downloaded and must import, Eclipse asks me where to import the file… Thanks again.

  40. Help me, everyone.
    I don’t know how to use this app.
    Where can I see the recorded video?
    Let me know the URL, please…

  41. Have you ever thought about publishing an ebook or guest authoring on other websites?
    I have a blog based on the same topics you discuss and would really like to have you share some stories/information. I know my subscribers would value your work.
    If you’re even remotely interested, feel free to shoot me an email.

    Take a look at my weblog … Google

  42. Hello, and thank you for the example code. I’m just wondering if anyone has used this method to stream via RTSP/RTP instead of RTMP.
