
Image dislocation problem encountered using JavaCV when recording on Android, please help. #224

Closed
ProgrammerYuan opened this issue Sep 18, 2015 · 4 comments


@ProgrammerYuan

Hello, Samuel:
First, thank you for developing such a brilliant project. After wrestling with raw FFmpeg for months, JavaCV was like a godsend to me.
Here's the problem: I'm developing a simple app that lets anyone record and share their shining life moments in as few steps as possible. I integrated the JavaCV library (including FFmpeg 2.6) into my project, but when I play back a recorded video, a strip of the frame that should appear on the right of the screen appears on the left side instead. If that isn't clear, please see the image below.
[image: ffmpeg_example]

The image above is part of a frame extracted from the video; forgive me for cropping it, since my friend doesn't like to be famous, LOL. But I believe you can see my point from it.
I also rotated the video orientation and recorded again, but the dislocation still appears on the left. So I'm thinking this is because I'm not using FFmpeg correctly.
Please help me...

The options I set are as follows:
video_format: mp4
video_size: 864x480
video_codec: H264
video_bitrate: 900k
frame_rate: 18

Here's my code:
public class JavaCVRecordeActivity extends Activity implements OnClickListener, SurfaceHolder.Callback, PreviewCallback {

private final static String CLASS_LABEL = "RecordActivity";
private final static String LOG_TAG = CLASS_LABEL;

private PowerManager.WakeLock mWakeLock;

private String ffmpeg_link = "/mnt/sdcard/stream.mp4";

long startTime = 0;
boolean recording = false;

private FFmpegFrameRecorder recorder;

private int sampleAudioRateInHz = 44100;
private int imageWidth = 320;
private int imageHeight = 240;
private int frameRate = 20;

/* audio data getting thread */
private AudioRecord audioRecord;
private AudioRecordRunnable audioRecordRunnable;
private Thread audioThread;
volatile boolean runAudioThread = true;

/* video data getting thread */
private Camera cameraDevice;
private Frame yuvImage = null;
private Button btnRecorderControl, btnEffectChange, btnSnap;

List<String> effects = new ArrayList<>();
int index = 0;
/* The number of seconds in the continuous record loop (or 0 to disable loop). */
final int RECORD_LENGTH = 10;
private final int ARRAY_LENGTH = RECORD_LENGTH * frameRate;
private final int roller_length = 3 * frameRate;
YuvImage image;
ArrayList<Frame> images;
int operate_index = 0;
ArrayList<Long> timestamps;
ShortBuffer[] samples;
int imagesIndex, samplesIndex;
SurfaceView surfaceView;
private boolean toSnap = false;

@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.javacv_main);
    initLayout();
}


@Override
protected void onResume() {
    super.onResume();
}

@Override
protected void onPause() {
    super.onPause();
}

@Override
protected void onDestroy() {
    super.onDestroy();

    recording = false;

    if (cameraDevice != null) {
        cameraDevice.stopPreview();
        cameraDevice.release();
        cameraDevice = null;
    }

    if (mWakeLock != null) {
        mWakeLock.release();
        mWakeLock = null;
    }
}


private void initLayout() {

    /* get size of screen */

    /* add control button: start and stop */
    btnRecorderControl = (Button) findViewById(R.id.recorder_control);
    btnRecorderControl.setText("Start");
    btnRecorderControl.setOnClickListener(this);
    btnEffectChange = (Button) findViewById(R.id.recorder_control2);
    btnEffectChange.setOnClickListener(new OnClickListener() {
        @Override
        public void onClick(View view) {
            String effect = effects.get(index++);
            index %= effects.size();
            Camera.Parameters parameters = mCamera.getParameters();
            parameters.setColorEffect(effect);
            mCamera.setParameters(parameters);
        }
    });
    btnSnap = (Button) findViewById(R.id.recorder_control3);
    btnSnap.setOnClickListener(new OnClickListener() {
        @Override
        public void onClick(View view) {
            toSnap = true;
        }
    });
    surfaceView = (SurfaceView) findViewById(R.id.surface);
    SurfaceHolder holder = surfaceView.getHolder();
    holder.addCallback(this);
    Log.i(LOG_TAG, "camera preview start: OK");
}

//---------------------------------------
// initialize ffmpeg_recorder
//---------------------------------------
private void initRecorder() {

    Log.w(LOG_TAG, "init recorder");

    if (RECORD_LENGTH > 0) {
        imagesIndex = 0;
        images = new ArrayList<>();
        timestamps = new ArrayList<>();
    } else if (yuvImage == null) {
        yuvImage = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
        Log.i(LOG_TAG, "create yuvImage");
    }

    Log.i(LOG_TAG, "ffmpeg_url: " + ffmpeg_link);
    recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 2);
    recorder.setFormat("mp4");
    recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
    recorder.setGopSize(frameRate);
    recorder.setAspectRatio(9.0 / 16); // note: integer 9 / 16 evaluates to 0
    recorder.setVideoBitrate(90 * 10000);
    recorder.setSampleRate(sampleAudioRateInHz);
    // Set in the surface changed method
    recorder.setFrameRate(frameRate);
    recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);

    Log.i(LOG_TAG, "recorder initialize success");

    audioRecordRunnable = new AudioRecordRunnable();
    audioThread = new Thread(audioRecordRunnable);
    runAudioThread = true;
}

public void startRecording() {

    initRecorder();

    try {
        recorder.start();
        startTime = System.currentTimeMillis();
        recording = true;
        audioThread.start();
        recorderThread.start(); // note: a Thread can only be started once; recreate it before each recording

    } catch (FFmpegFrameRecorder.Exception e) {
        e.printStackTrace();
    }
}

public void stopRecording() {


    runAudioThread = false;
    try {
        audioThread.join();
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
    audioRecordRunnable = null;
    audioThread = null;

    if (recorder != null && recording) {
        recording = false;
        Log.v(LOG_TAG, "Finishing recording, calling stop and release on recorder");
    }
}

@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {

    if (keyCode == KeyEvent.KEYCODE_BACK) {
        if (recording) {
            stopRecording();
        }

        finish();

        return true;
    }

    return super.onKeyDown(keyCode, event);
}


//---------------------------------------------
// audio thread, gets and encodes audio data
//---------------------------------------------
class AudioRecordRunnable implements Runnable {

    @Override
    public void run() {
        android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

        // Audio
        int bufferSize;
        ShortBuffer audioData = null; // initialized in the loop when RECORD_LENGTH > 0
        int bufferReadResult;

        bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

        if (RECORD_LENGTH > 0) {
            samplesIndex = 0;
            samples = new ShortBuffer[RECORD_LENGTH * sampleAudioRateInHz * 2 / bufferSize + 1];
            for (int i = 0; i < samples.length; i++) {
                samples[i] = ShortBuffer.allocate(bufferSize);
            }
        } else {
            audioData = ShortBuffer.allocate(bufferSize);
        }

        Log.d(LOG_TAG, "audioRecord.startRecording()");
        audioRecord.startRecording();

        /* ffmpeg_audio encoding loop */
        while (runAudioThread) {
            if (RECORD_LENGTH > 0) {
                audioData = samples[samplesIndex++ % samples.length];
                audioData.position(0).limit(0);
            }
            //Log.v(LOG_TAG,"recording? " + recording);
            bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
            audioData.limit(bufferReadResult);
            if (bufferReadResult > 0) {
                Log.v(LOG_TAG, "bufferReadResult: " + bufferReadResult);
                // If "recording" isn't true when this thread starts, it never gets set, according to this if statement...!!!
                // Why? Good question...
                if (recording) {
                    if (RECORD_LENGTH <= 0) try {
                        recorder.recordSamples(audioData);
                        //Log.v(LOG_TAG,"recording " + 1024*i + " to " + 1024*i+1024);
                    } catch (FFmpegFrameRecorder.Exception e) {
                        Log.v(LOG_TAG, e.getMessage());
                        e.printStackTrace();
                    }
                }
            }
        }
        Log.v(LOG_TAG, "AudioThread Finished, release audioRecord");

        /* encoding finish, release recorder */
        if (audioRecord != null) {
            audioRecord.stop();
            audioRecord.release();
            audioRecord = null;
            Log.v(LOG_TAG, "audioRecord released");
        }
    }
}

//---------------------------------------------
// camera thread, gets and encodes video data
//---------------------------------------------

private SurfaceHolder mHolder;
private Camera mCamera;
private Camera.Size size;
private boolean isCameraBack = true;

@Override
public void surfaceCreated(SurfaceHolder mHolder) {
    this.mHolder = mHolder;

    try {
        if (isCameraBack) {
            mCamera = Camera.open(Camera.CameraInfo.CAMERA_FACING_BACK);
        } else {
            mCamera = Camera.open(Camera.CameraInfo.CAMERA_FACING_FRONT);
        }
        mCamera.setDisplayOrientation(90);
        mCamera.setPreviewCallback(this);
        Camera.Size pictureSize = null;
        Camera.Size previewSize = null;
        Camera.Parameters parameters = mCamera.getParameters();

        List<Camera.Size> supportedPreviewSizes
                = SupportedSizesReflect.getSupportedPreviewSizes(parameters);
        List<Camera.Size> supportedVideoSizes = parameters.getSupportedVideoSizes();
        if (supportedPreviewSizes != null && supportedPreviewSizes.size() > 0) {
            previewSize = getOptimalPreviewSize(supportedPreviewSizes, 1180, 720);

        }
        if (supportedVideoSizes != null && supportedVideoSizes.size() > 0) {
            size = getOptimalPreviewSize(supportedVideoSizes, 1180, 720);
        }
        if (size == null) size = previewSize;
        imageWidth = previewSize.height;
        imageHeight = previewSize.width;
        Logger.out(previewSize.width + "*" + previewSize.height);
        parameters.setPreviewSize(previewSize.width, previewSize.height);
        parameters.setRecordingHint(true);
        parameters.setPictureFormat(ImageFormat.NV21);
        parameters.setPreviewFormat(ImageFormat.NV21);
        Logger.out("PREVIEW_FORMATS:" + parameters.getSupportedPreviewFormats());
        effects = parameters.getSupportedColorEffects();
        mCamera.setParameters(parameters);
        mCamera.setPreviewDisplay(mHolder);
        mCamera.startPreview();
    } catch (Exception e) {
        e.printStackTrace();
        return;
    }
}


private Camera.Size getOptimalPreviewSize(List<Camera.Size> sizes, int w, int h) {
    final double ASPECT_TOLERANCE = 0.1;
    double targetRatio = (double) w / h;
    if (sizes == null) return null;

    Camera.Size optimalSize = null;
    double minDiff = Double.MAX_VALUE;

    int targetHeight = h;

    // Try to find a size that matches both the aspect ratio and the target size
    for (Camera.Size size : sizes) {
        double ratio = (double) size.width / size.height;
        if (Math.abs(ratio - targetRatio) > ASPECT_TOLERANCE) continue;
        if (Math.abs(size.height - targetHeight) < minDiff) {
            optimalSize = size;
            minDiff = Math.abs(size.height - targetHeight);
        }
    }

    // If no size matches the aspect ratio, ignore that requirement
    if (optimalSize == null) {
        minDiff = Double.MAX_VALUE;
        for (Camera.Size size : sizes) {
            if (Math.abs(size.height - targetHeight) < minDiff) {
                optimalSize = size;
                minDiff = Math.abs(size.height - targetHeight);
            }
        }
    }
    return optimalSize;
}

@Override
public void surfaceChanged(SurfaceHolder mHolder, int i, int i1, int i2) {
    this.mHolder = mHolder;
}

@Override
public void surfaceDestroyed(SurfaceHolder mHolder) {
    if (mCamera != null) {
        mCamera.setPreviewCallback(null);
        mCamera.stopPreview();
        mCamera.release();
        mCamera = null;
    }

    mHolder = null;
}

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    if (toSnap) {
        toSnap = false;
        image = new YuvImage(data, ImageFormat.NV21, imageHeight, imageWidth, null);
        try {
            FileOutputStream fileOutputStream = new FileOutputStream(new File("/sdcard/test.jpg"));
            image.compressToJpeg(new Rect(0, 0, imageHeight, imageWidth), 100, fileOutputStream);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }
    }
    if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
        startTime = System.currentTimeMillis();
        return;
    }
    if (RECORD_LENGTH > 0) {
        int i = imagesIndex++ % roller_length;
        yuvImage = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
        images.add(yuvImage);
        timestamps.add(1000 * (System.currentTimeMillis() - startTime)); // timestamps is a List; array indexing would not compile
    }
        /* get video data */
    if (recording) {
        yuvImage = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
        ((ByteBuffer) yuvImage.image[0].position(0)).put(rotateYUV420Degree90(data, imageHeight, imageWidth));
        timestamps.add(1000 * (System.currentTimeMillis() - startTime));
        images.add(yuvImage);

        if (RECORD_LENGTH <= 0) try {
            Log.v(LOG_TAG, "Writing Frame");
            long t = 1000 * (System.currentTimeMillis() - startTime);
            if (t > recorder.getTimestamp()) {
                recorder.setTimestamp(t);
            }
            recorder.record(yuvImage);
        } catch (FFmpegFrameRecorder.Exception e) {
            Log.v(LOG_TAG, e.getMessage());
            e.printStackTrace();
        }
    }
}

@Override
public void onClick(View v) {
    if (!recording) {
        startRecording();
        Log.w(LOG_TAG, "Start Button Pushed");
        btnRecorderControl.setText("Stop");
    } else {
        // This triggers the audio recording loop to stop and then sets recording = false
        stopRecording();
        Log.w(LOG_TAG, "Stop Button Pushed");
        btnRecorderControl.setText("Start");
    }
}

Thread recorderThread = new Thread() {
    int index = 0;

    @Override
    public void run() {
        Logger.out("RECORD_THREAD_START");
        while (images.size() != 0 || recording) {
            try {
                if (images.size() > 0) {
                    Logger.out("record_start_" + index);
                    long t = timestamps.get(0);
                    if (t >= 0) {
                        if (t > recorder.getTimestamp()) {
                            recorder.setTimestamp(t);
                        }
                        recorder.record(images.get(0));
                    }
                    timestamps.remove(0);
                    images.remove(0);
                    Logger.out("record_end_" + index++);
                }
            } catch (FrameRecorder.Exception e) {
                e.printStackTrace();
            }
        }
        if (!recording) {
            try {
                recorder.stop();
                recorder.release();
            } catch (FFmpegFrameRecorder.Exception e) {
                e.printStackTrace();
            }
            recorder = null;
            Logger.out("FFMPEGRECORDER_RELEASED");
            startUploadingVideo(ffmpeg_link);
        }

    }
};
}
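(The rotateYUV420Degree90() helper called in onPreviewFrame isn't shown above; what follows is the commonly used implementation pattern for rotating an NV21 buffer 90 degrees clockwise, a sketch assuming even width and height, which may differ from my actual helper.)

```java
import java.util.Arrays;

public class Nv21Rotate {
    // Rotate an NV21 (Y plane + interleaved VU plane) buffer 90 degrees
    // clockwise. Assumes even width and height; the result is an NV21
    // buffer whose dimensions are (height x width).
    public static byte[] rotateNV21Degree90(byte[] data, int width, int height) {
        byte[] out = new byte[width * height * 3 / 2];
        int i = 0;
        // Y plane: source column x, read bottom-up, becomes output row x.
        for (int x = 0; x < width; x++) {
            for (int y = height - 1; y >= 0; y--) {
                out[i++] = data[y * width + x];
            }
        }
        // Interleaved VU plane: each V/U pair moves together, filled backward.
        i = width * height * 3 / 2 - 1;
        for (int x = width - 1; x > 0; x -= 2) {
            for (int y = 0; y < height / 2; y++) {
                out[i--] = data[width * height + y * width + x];       // U byte
                out[i--] = data[width * height + y * width + (x - 1)]; // V byte
            }
        }
        return out;
    }

    public static void main(String[] args) {
        byte[] in = {0, 1, 2, 3, 4, 5, 6, 7, 10, 20, 30, 40}; // 4x2 NV21 frame
        byte[] out = rotateNV21Degree90(in, 4, 2);
        System.out.println(Arrays.toString(out));
        // -> [4, 0, 5, 1, 6, 2, 7, 3, 10, 20, 30, 40]
    }
}
```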
@ProgrammerYuan
Author

Also, when I try to convert an NV21-format frame to YUV420P, it goes wrong: the whole picture becomes grey, green and pink...
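(Grey/green/pink output is the classic symptom of the chroma planes being split or ordered wrongly. For reference, a minimal plain-Java NV21-to-I420/YUV420P split looks like the sketch below; it assumes unpadded buffers and is not JavaCV's own converter.)

```java
import java.util.Arrays;

public class Nv21ToI420 {
    // NV21 stores a full Y plane followed by one interleaved plane of
    // V/U pairs; I420 (YUV420P) stores Y, then the whole U plane, then
    // the whole V plane. Assumes even width/height and no row padding.
    public static byte[] convert(byte[] nv21, int width, int height) {
        int ySize = width * height;
        byte[] i420 = new byte[ySize * 3 / 2];
        System.arraycopy(nv21, 0, i420, 0, ySize); // Y plane is identical
        int uOff = ySize;                // I420: U plane comes first...
        int vOff = ySize + ySize / 4;    // ...then the V plane
        for (int i = 0; i < ySize / 2; i += 2) {
            i420[vOff + i / 2] = nv21[ySize + i];     // NV21 stores V first
            i420[uOff + i / 2] = nv21[ySize + i + 1]; // then U
        }
        return i420;
    }

    public static void main(String[] args) {
        byte[] nv21 = {1, 2, 3, 4, 9, 8}; // 2x2 frame: Y = 1..4, then V=9, U=8
        System.out.println(Arrays.toString(convert(nv21, 2, 2)));
        // -> [1, 2, 3, 4, 8, 9]  (Y plane, then U plane, then V plane)
    }
}
```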

@saudet
Member

saudet commented Sep 20, 2015

This sounds like a duplicate of #190. Try to use the latest version of JavaCPP 1.0.1-SNAPSHOT and let me know if that fixes the issue or not, thanks!
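(A toy illustration of the kind of row-stride mismatch behind that issue, with hypothetical numbers, not JavaCPP code: if the camera buffer pads each row to `stride` bytes but the encoder reads it as tightly packed `width`-byte rows, every row is read shifted, which shows up as a displaced strip.)

```java
public class StrideDemo {
    // Build a padded buffer: each row is `stride` bytes but only `width`
    // of them are pixels; each real pixel's value encodes 10*row + col.
    public static byte[] fill(int width, int stride, int height) {
        byte[] buf = new byte[stride * height];
        for (int r = 0; r < height; r++) {
            for (int c = 0; c < width; c++) {
                buf[r * stride + c] = (byte) (10 * r + c);
            }
        }
        return buf;
    }

    public static void main(String[] args) {
        byte[] buf = fill(4, 6, 3);
        // Reading row 1 with the true stride lands on pixel (1, 0):
        System.out.println(buf[1 * 6]); // 10
        // Reading it as if rows were tightly packed lands in row 0's
        // padding, and each later row drifts further left:
        System.out.println(buf[1 * 4]); // 0
    }
}
```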

@ProgrammerYuan
Author

@saudet Sorry for my imprudence, and thank you a thousand times. I upgraded the javacpp.jar and my app works fine now. But I noticed that you modified the constructors of the Pointer classes to fix this problem; is it absolutely safe to use the array_offset function on every kind of Android framework?

@saudet
Member

saudet commented Sep 21, 2015 via email
