DJI Mobile SDK real-time RTSP video streaming #817
Comments
Can you share the crash logcat?
@pedroSG94
I feel like the feed being sent is corrupted and full frames are not being parsed. This is actually the best result I've seen. The fps is practically non-existent as well; it's like looking at a still image.
Maybe your server needs to receive audio frames to work. Some servers require both video and audio frames, or they will close the connection after a few seconds. To confirm that, you can do this:
@pedroSG94 I suspect the buffer from the drone is bad or the decoding is done badly. One off-topic question: can an SDP file be generated with this library?
I don't know if the buffer is fine or not, but a good idea might be to decode the buffer and re-encode it to generate a valid one.
Hello all, I am also trying to stream the raw drone data to an RTSP server. However, I am having difficulty with the keyframes of the video data. As per the above, the keyframe condition is never met. Does anyone know how to set the keyframe?
Hello, I think that in your case you have a video buffer in Annex-B, so it starts with (0x00, 0x00, 0x00, 0x01, naltype) or (0x00, 0x00, 0x01, naltype).
Thanks for the quick reply! So I have managed to get the naltype by doing that. It seems like I am missing the keyframe? Any idea what might be causing this?
Not exactly. It should be something like this:

```java
int naluIndex = UtilsKt.getVideoStartCodeSize(ByteBuffer.wrap(videoBuffer));
// For example: videoBuffer = 0x00, 0x00, 0x00, 0x01, nalByte, ... (in this case getVideoStartCodeSize returns index 4)
int naluType = videoBuffer[naluIndex] & 0x1f;
```

I suggest you print the first 5 bytes of each buffer to check that you have this pattern and that you are using Annex-B, as I suspect.
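For reference, the start-code handling can be sketched in plain Java without the library helper. This is an illustrative re-implementation mirroring what the snippet above does (`getVideoStartCodeSize` here is a stand-in for the library's `UtilsKt.getVideoStartCodeSize`), not the library's actual code:

```java
public class NaluParser {

    // Returns the Annex-B start-code length: 4 for 00 00 00 01, 3 for 00 00 01, 0 if absent.
    public static int getVideoStartCodeSize(byte[] b) {
        if (b.length >= 4 && b[0] == 0 && b[1] == 0 && b[2] == 0 && b[3] == 1) return 4;
        if (b.length >= 3 && b[0] == 0 && b[1] == 0 && b[2] == 1) return 3;
        return 0;
    }

    // The NALU type is the low 5 bits of the first byte after the start code.
    public static int getNaluType(byte[] b) {
        return b[getVideoStartCodeSize(b)] & 0x1f;
    }

    public static void main(String[] args) {
        byte[] spsBuffer = {0x00, 0x00, 0x00, 0x01, 0x67, 0x42, 0x00, 0x1e};
        System.out.println(getVideoStartCodeSize(spsBuffer)); // 4
        System.out.println(getNaluType(spsBuffer));           // 7 (SPS)
    }
}
```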
naluType 7 = SPS. You can check it here:
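To answer "am I missing a keyframe?", you can scan a whole Annex-B buffer and list every NALU type it contains (7 = SPS, 8 = PPS, 5 = IDR keyframe slice). A minimal, self-contained sketch, not library code:

```java
import java.util.ArrayList;
import java.util.List;

public class NaluScanner {

    // Collects the type of every NAL unit found after a 3- or 4-byte Annex-B start code.
    public static List<Integer> naluTypes(byte[] buf) {
        List<Integer> types = new ArrayList<>();
        for (int i = 0; i + 3 < buf.length; i++) {
            if (buf[i] == 0 && buf[i + 1] == 0
                    && (buf[i + 2] == 1
                        || (buf[i + 2] == 0 && i + 4 < buf.length && buf[i + 3] == 1))) {
                int offset = (buf[i + 2] == 1) ? 3 : 4;
                types.add(buf[i + offset] & 0x1f);
                i += offset; // skip past this start code before continuing the scan
            }
        }
        return types;
    }

    public static void main(String[] args) {
        // SPS (7) + PPS (8) + IDR slice (5), as a keyframe access unit often arrives.
        byte[] au = {0, 0, 0, 1, 0x67, 0, 0, 0, 1, 0x68, 0, 0, 1, 0x65, 0x11};
        List<Integer> t = naluTypes(au);
        System.out.println(t);             // [7, 8, 5]
        System.out.println(t.contains(5)); // true: keyframe present
    }
}
```

If a feed only ever shows types 1 (non-IDR slice) and 7, the encoder is not delivering complete keyframes with PPS, which would explain a player failing to decode.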
Hi, I got the same 1 and 7. Can you please give some reference examples (code)?
Hi @pedroSG94, thanks for this great library! My pipeline looks like this:

```mermaid
graph TD
I("LaunchComponent.Init()") --> |1.Connect|R(RtspClient)-->S[RTSP Server]
I --> |2.Start Camera|A[H264 Video Stream] --> B("SDK.onReceive([]byte,size)")
B -->|convert2ByteBuffer|B -->|Publish h264 bytebuffer|R
```
Hello, let me explain the reason for the NALU types. H264 carries its video configuration in SPS and PPS units (NALU types 7 and 8), which are frequently also bundled with keyframes (NALU type 5). I looked a bit into the DJI SDK code, and since you are not getting the video configuration, I wrote an alternative approach that re-encodes the h264 buffer. This is the example code. It may be buggy (I can't test it). Try it and let me know if you have any problem (read the header comments for more info about it):

```java
import android.content.Context;
import android.os.Build;

import androidx.annotation.RequiresApi;

import com.pedro.encoder.video.FormatVideoEncoder;
import com.pedro.encoder.video.GetVideoData;
import com.pedro.encoder.video.VideoEncoder;
import com.pedro.rtplibrary.view.GlInterface;
import com.pedro.rtplibrary.view.OffScreenGlThread;

import dji.sdk.camera.VideoFeeder;
import dji.sdk.codec.DJICodecManager;

/**
 * NOTE: This is not tested.
 *
 * Example of using DJI to generate valid h264 to send to the RtspClient or RtmpClient classes.
 * We are doing the following:
 * - Get frames from the DJI device in h264.
 * - Decode these frames into the GlInterface class using DJICodecManager.
 * - Copy frames from GlInterface to the VideoEncoder surface.
 * - VideoEncoder detects that copy automatically and re-encodes it to h264 that should be valid to stream.
 *
 * After that, using the GetVideoData provided in the start method, you should get sps and pps in the
 * onSpsPpsVps callback, where you can start your stream, and send data to the stream using the
 * getVideoData callback.
 *
 * Extra info:
 * GlInterface is an off-screen thread that provides an OpenGL surfaceTexture you can render to. It also
 * allows you to copy data of that surfaceTexture to another one, add filters, take photos, etc.
 * Keep in mind that onSpsPpsVps could be called multiple times, so using a conditional is recommended.
 */
@RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN_MR2)
public class DJIExample implements VideoFeeder.VideoDataListener {

  private DJICodecManager codecManager;
  private VideoEncoder videoEncoder;
  private GlInterface glInterface;
  //encoder and decoder configuration
  private final int width = 640;
  private final int height = 480;
  private final int fps = 30;
  private final int bitrate = 1200 * 1000;
  private final int iFrameInterval = 2;
  private boolean init = false;

  public void start(Context context, GetVideoData getVideoData) {
    if (!init) { //register the video data listener only once
      try {
        VideoFeeder.getInstance().getPrimaryVideoFeed().addVideoDataListener(this);
        init = true;
      } catch (Exception ignored) { }
    }
    videoEncoder = new VideoEncoder(getVideoData);
    videoEncoder.prepareVideoEncoder(width, height, fps, bitrate, 0, iFrameInterval, FormatVideoEncoder.SURFACE);
    glInterface = new OffScreenGlThread(context);
    glInterface.init();
    glInterface.setEncoderSize(width, height);
    glInterface.start();
    videoEncoder.start();
    glInterface.addMediaCodecSurface(videoEncoder.getInputSurface());
    codecManager = new DJICodecManager(context, glInterface.getSurfaceTexture(), width, height);
  }

  public void stop() {
    glInterface.removeMediaCodecSurface();
    videoEncoder.stop();
    glInterface.stop();
    codecManager.cleanSurface();
    videoEncoder = null;
    glInterface = null;
    codecManager = null;
  }

  @Override
  public void onReceive(byte[] bytes, int size) {
    if (codecManager != null) {
      codecManager.sendDataToDecoder(bytes, size);
    }
  }
}
```
It works! However, an issue I am facing is that after starting the RTSP stream above, the live video feed on my device freezes (the SurfaceTexture seems to disappear after starting the stream), but the RTSP live stream is playable. Any idea what is causing this? I am using two different CodecManagers for the live video feed and the RTSP stream. This happens every time I instantiate another CodecManager. Is there a way to display the preview while streaming?
You can replace OffScreenGlThread with OpenGlView to get a preview, but you need to keep in mind the OpenGlView lifecycle:
Hi @pedroSG94, how can I push this data over RTMP using your library?
Hi @pedroSG94 @ilterpehlivan @keeeenion @supr3me @aviaot, thanks for this great library. I have implemented the code below for streaming the video feed coming from a Skydroid FPV camera mounted on an ArduPilot drone. But after playing the stream on the server, corrupted data is received and the video feed looks like the video below. In my case, I'm using the RTMP Ant Media Server. Thanks.
20230705_131126.mp4 |
@pedroSG94 how do I use this class? There is no connect method, nor a way to send video info or video data.
I will close this issue and respond in your issue.
I have been tackling this issue for a week now. I have read numerous issues, including a couple from this very library.
Most of the answers and code snippets are no longer available.
I have taken this repo as the base for this project: https://github.com/DJI-Mobile-SDK-Tutorials/Android-VideoStreamDecodingSample and tried to integrate this library into it.
I managed to stream a very short video feed to my local machine, and then it breaks for reasons beyond me.
Here is what I have at this very point:
I am hosting https://github.com/pedroSG94/vlc-example-streamplayer as the server on my local machine and trying to access that using the VLC player.
The DJI VideoFeeder is spitting out an encoded h264 buffer.
My question is what could be the reason for this bad video feed?
Reference:
https://developer.dji.com/api-reference/android-api/BaseClasses/DJIVideoFeeder.html?search=videofeed&i=2&
DJI-Mobile-SDK-Tutorials/Android-VideoStreamDecodingSample#33
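One way to diagnose a bad feed like this is to split the Annex-B buffers that VideoFeeder hands you into individual NAL units and check whether SPS/PPS ever appear; without them the server and player cannot configure a decoder, which produces exactly this kind of corruption. A minimal splitter sketch in plain Java (illustrative only, not DJI or library code):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class AnnexBSplitter {

    // Finds the index of the next 3- or 4-byte Annex-B start code at or after 'from', or -1.
    private static int findStartCode(byte[] b, int from) {
        for (int i = from; i + 2 < b.length; i++) {
            if (b[i] == 0 && b[i + 1] == 0
                    && (b[i + 2] == 1
                        || (i + 3 < b.length && b[i + 2] == 0 && b[i + 3] == 1))) {
                return i;
            }
        }
        return -1;
    }

    private static int startCodeLen(byte[] b, int pos) {
        return b[pos + 2] == 1 ? 3 : 4;
    }

    // Splits an Annex-B buffer into raw NAL units with start codes stripped.
    public static List<byte[]> split(byte[] b) {
        List<byte[]> nalus = new ArrayList<>();
        int pos = findStartCode(b, 0);
        while (pos != -1) {
            int payload = pos + startCodeLen(b, pos);
            int next = findStartCode(b, payload);
            int end = (next == -1) ? b.length : next;
            nalus.add(Arrays.copyOfRange(b, payload, end));
            pos = next;
        }
        return nalus;
    }

    public static void main(String[] args) {
        // SPS + PPS + IDR slice in one buffer.
        byte[] au = {0, 0, 0, 1, 0x67, 0x42, 0, 0, 0, 1, 0x68, (byte) 0xce, 0, 0, 1, 0x65, 0x11};
        for (byte[] nalu : split(au)) {
            System.out.println("type=" + (nalu[0] & 0x1f) + " len=" + nalu.length);
        }
        // type=7 len=2, type=8 len=2, type=5 len=2
    }
}
```

The extracted type-7 and type-8 payloads are also what an RTSP client needs as its SPS/PPS configuration, so a splitter like this doubles as a way to feed video info to the stream once the buffers are confirmed valid.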