
FFmpegFrameRecorder.record() is taking long (sometimes 3-4 s per frame) on Android device #1693

Closed
streamcam7 opened this issue Sep 14, 2021 · 20 comments

Comments

@streamcam7

I have modified the RecordActivity from the samples and I am able to do a full live stream (to an RTMP server). I see that the record(frame) method continually gets worse: timestamping the call shows it takes < 30 ms initially, but it gradually climbs to over 400 ms and stays there for a while.

As I let the RTMP stream continue longer, I end up with times in excess of 3000-4000 ms.

The problem I have is the frame drops I have to manage, since the camera is producing an average of 30 fps and FFmpegFrameRecorder.record() doesn't keep up with that production rate.

I am not sure if I am doing anything wrong. Is it normal for record() to take that much time? I am not applying any filters to the frame, and I am managing the production and consumption rates with a fixed buffer, which works fine.

But how do I get around the excessive time record() is taking? What am I missing? I will try setting the preset option to ultrafast to see if it helps my case.

I am using an Android R (11) image. Any ideas or pointers regarding this would be helpful.
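For what it's worth, this is roughly how I plan to apply the ultrafast preset; a minimal sketch against the FFmpegFrameRecorder API, where the URL, dimensions, and frame rate are placeholders, not my actual values:

```java
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.ffmpeg.global.avcodec;

public class RecorderSetupSketch {
    public static FFmpegFrameRecorder createRecorder() {
        // Placeholder RTMP URL and resolution.
        FFmpegFrameRecorder recorder =
                new FFmpegFrameRecorder("rtmp://example.com/live/stream", 1280, 720);
        recorder.setFormat("flv");
        recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
        recorder.setFrameRate(30);
        // Standard x264 options passed through to the encoder:
        recorder.setVideoOption("preset", "ultrafast"); // lowest CPU cost per frame
        recorder.setVideoOption("tune", "zerolatency"); // drop lookahead/B-frame buffering
        return recorder;
    }
}
```

The "tune zerolatency" option may also help with the VLC lag, since it stops x264 from buffering frames for lookahead.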

@saudet
Member

saudet commented Sep 14, 2021 via email

@streamcam7
Author

Thanks for the reply.

I am on JavaCV 1.5.6

Below is what I am using (I commented out the platform artifact, as it increases my APK size to close to 800+ MB):

//implementation group: 'org.bytedeco', name: 'javacv-platform', version: '1.5.6'
implementation group: 'org.bytedeco', name: 'javacv', version: '1.5.6'
implementation group: 'org.bytedeco', name: 'javacpp', version: '1.5.6', classifier: 'android-arm64'
implementation group: 'org.bytedeco', name: 'ffmpeg', version: '4.4-1.5.6', classifier: 'android-arm64'

@saudet
Member

saudet commented Sep 15, 2021

Ok, thanks, could you run a profiler on that to see which method call is taking all that time?

@streamcam7
Author

I am not sure what you mean by running a profiler. Do I have to enable certain logging? FFmpegLogCallback.set() and some system props?

Any pointers? I'd be happy to look into this.

@saudet
Member

saudet commented Sep 15, 2021

That would be the "Method and function traces" here:
https://developer.android.com/studio/profile/cpu-profiler
Thanks!

@streamcam7
Author

Thank you. I will look into this and get back with my observations. Much appreciated

@streamcam7
Author

This is a summary of the record() thread:
[screenshot: profiler summary of the record() thread]

Also, here is the exported trace file, from a run of about 15 seconds (approx.):
cpu-art-20210915T134518.zip

A few points that may be significant for context, in case you need more details. Overall, the streaming as a whole does work, but I am chasing the (potential) performance issues in record() and the VLC lag (which I think may improve once I use the ultrafast preset):

  1. I am running 2 threads with an executor (audio and video).
  2. My code is significantly different from the sample RecordActivity, as I am working with the Camera2 API.
  3. I am receiving YUV_420_888 frames from the camera and converting them to NV21 via a native call before calling record(frame, NV21 format).
  4. I have a producer-consumer topology for receiving and consuming frames.
  5. I am not setting any timestamps on audio frames/samples; I simply call recordSamples().
  6. I am setting each video frame's timestamp to the time it is received and stored in a blocking queue.
  7. If a frame's timestamp is greater than the recorder's timestamp, I set the FrameRecorder's timestamp to the frame's timestamp.

Here is the code, for clarity:

AudioThread:

    while (mThreadRunning) {
        if (mIsStreaming && mFrameRecorder != null) {
            bufferReadResult = mAudioRecord.read(mAudioData.array(), 0, mAudioData.capacity());
            mAudioData.limit(bufferReadResult);
            if (bufferReadResult > 0) {
                Log.i(APP_TAG, CLASS_NAME + ": bufferReadResult: " + bufferReadResult);
                try {
                    mFrameRecorder.recordSamples(mAudioData);
                } catch (FFmpegFrameRecorder.Exception e) {
                    Log.e(APP_TAG, e.getMessage());
                    e.printStackTrace();
                }
            }
        }
    }

VideoThread:

    if (mFrameRecorder != null) {
        long timestamp = frame.getTimeStamp();
        // Counter tracking how many video frames are processed; only one
        // video thread is running, so contention does not matter here.
        synchronized (VideoRunnable.class) {
            FramesProcessed++;
        }
        if (timestamp > mFrameRecorder.getTimestamp()) {
            mFrameRecorder.setTimestamp(timestamp);
        }
        try {
            long startTime = System.currentTimeMillis();
            mFrameRecorder.record(frame.getFrame(), Integer.parseInt(mPixFMT));
            long endTime = System.currentTimeMillis();
            long processTime = endTime - startTime;
            mTotalFrameProcessTime = mTotalFrameProcessTime + processTime;
            Log.i(APP_TAG, CLASS_NAME + ": Time taken for FrameRecorder record() "
                    + processTime + "ms"
                    + " <-- Thread ID: " + this.hashCode());
        } catch (FFmpegFrameRecorder.Exception e) {
            Log.e(APP_TAG, CLASS_NAME + ": Error recording video: " + e.getMessage()
                    + " <-- Thread ID: " + this.hashCode());
            e.printStackTrace();
        }
    }

Any help or pointers much appreciated.

@streamcam7
Author

I just noticed that the Frame class has its timestamp in microseconds, but I am assigning it the long value retrieved from System.currentTimeMillis(). I guess that would be problematic, wouldn't it?

/** Timestamp of the frame creation in microseconds. */
public long timestamp;
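If that is the cause, the fix would just be scaling the millisecond value by 1000 before assigning it; a minimal sketch, where the class and helper names are illustrative, not from my actual code:

```java
public class TimestampFix {
    /**
     * Convert a System.currentTimeMillis() value to the microsecond
     * scale that the Frame.timestamp field documents.
     */
    static long millisToMicros(long millis) {
        return millis * 1000L;
    }

    public static void main(String[] args) {
        long millis = 1_631_700_000_123L; // example wall-clock time in ms
        System.out.println(millisToMicros(millis));
        // In the capture callback this would look like:
        // frame.timestamp = millisToMicros(System.currentTimeMillis());
    }
}
```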

@saudet
Member

saudet commented Sep 15, 2021 via email

@streamcam7
Author

streamcam7 commented Sep 15, 2021 via email

@streamcam7
Author

streamcam7 commented Sep 20, 2021 via email

@saudet
Member

saudet commented Sep 21, 2021

To encode in real time on Android devices, you'll probably need to use the hardware accelerator, but unfortunately FFmpeg doesn't support that on Android yet. You'll need to use the MediaCodec API from Android directly for that; see #945 (comment).

/cc @tmm1
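For reference, a hardware H.264 encoder is typically set up through the standard android.media API roughly like this. This is only a configuration sketch: the resolution, bitrate, and frame rate are placeholders, and the wiring of input buffers and the RTMP muxing are left out entirely:

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;

// Sketch: configure a hardware-accelerated H.264 encoder (API 21+).
MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1280, 720);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);
format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
encoder.start();
// Feed YUV frames via getInputBuffer(...) / queueInputBuffer(...),
// drain encoded H.264 packets with dequeueOutputBuffer(...),
// then hand the packets to the RTMP muxer.
```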

@streamcam7
Author

streamcam7 commented Sep 21, 2021 via email

@streamcam7
Author

streamcam7 commented Oct 4, 2021 via email

@saudet
Member

saudet commented Dec 3, 2021

As I said above in #1693 (comment), it's still possible to use MediaCodec from the application level in Java using the Android API.

@tmm1

tmm1 commented Nov 20, 2022

There is some work being done to add MediaCodec based hardware-encoding to FFmpeg: https://patchwork.ffmpeg.org/project/ffmpeg/patch/tencent_81CDB8CFBE553E273C388C966A4D5D203007@qq.com/

@tmm1

tmm1 commented Feb 28, 2023

FFmpeg 6.0 has been released and includes support for hardware encoding on Android

saudet added a commit to bytedeco/javacpp-presets that referenced this issue Mar 2, 2023
@saudet
Member

saudet commented Mar 3, 2023

FFmpeg 6.0 has been released and includes support for hardware encoding on Android

Awesome! I've updated the presets and JavaCV.

@streamcam7 Please give it a try with the snapshots: http://bytedeco.org/builds/
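With the updated presets, selecting the MediaCodec-backed encoder should amount to requesting it by name; a minimal sketch, where the URL and dimensions are placeholders and the encoder name comes from FFmpeg's mediacodec support (untested here):

```java
import org.bytedeco.javacv.FFmpegFrameRecorder;

public class HwRecorderSketch {
    public static FFmpegFrameRecorder createRecorder() {
        // Placeholder RTMP URL and resolution.
        FFmpegFrameRecorder recorder =
                new FFmpegFrameRecorder("rtmp://example.com/live/stream", 1280, 720);
        recorder.setFormat("flv");
        // Request FFmpeg 6.0's hardware H.264 encoder instead of x264.
        recorder.setVideoCodecName("h264_mediacodec");
        recorder.setFrameRate(30);
        return recorder;
    }
}
```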

@sprigogin

What is the ETA for the stable release of 6.0-1.5.9?

@saudet
Member

saudet commented Jun 6, 2023

Version 1.5.9 has been released! Enjoy.

@saudet saudet closed this as completed Jun 6, 2023
@saudet saudet removed the help wanted label Jun 6, 2023