DJI Mobile SDK real-time RTSP video streaming #817

Closed
keeeenion opened this issue Mar 29, 2021 · 20 comments

@keeeenion

I have been tackling this issue for a week now and have read numerous related issues, a couple from this very library.
Most of the answers and code snippets are no longer available.

I have taken this repo as the base for the project: https://github.com/DJI-Mobile-SDK-Tutorials/Android-VideoStreamDecodingSample and tried to integrate this library into it.

I managed to stream a very short video feed to my local machine, but then it breaks for reasons beyond me.

Here is what I have at this point:

public class StreamVideo extends MyClassNotNecessaryToPointOut {

    private long presentTimeUs = 0L;
    private RtspClient rtspClient = new RtspClient(new ConnectCheckerRtsp() {
        @Override
        public void onConnectionSuccessRtsp() {

        }

        @Override
        public void onConnectionFailedRtsp(String reason) {

        }

        @Override
        public void onNewBitrateRtsp(long bitrate) {

        }

        @Override
        public void onDisconnectRtsp() {

        }

        @Override
        public void onAuthErrorRtsp() {

        }

        @Override
        public void onAuthSuccessRtsp() {

        }
    });

    private MediaCodec.BufferInfo videoInfo = new MediaCodec.BufferInfo();
    private boolean started;
    protected VideoFeeder.VideoDataListener mReceivedVideoDataListener = (videoBuffer, size) -> {
        videoInfo.size = size;
        videoInfo.offset = 0;
        videoInfo.flags = MediaCodec.BUFFER_FLAG_PARTIAL_FRAME;
        videoInfo.presentationTimeUs = System.nanoTime() / 1000 - presentTimeUs;
        int naluType = videoBuffer[0] & 0x1f;
        //First keyframe received, so start the stream.
        // Change the conditional as you want, but the stream must start with a keyframe
        if (naluType == 5 && !rtspClient.isStreaming() && started) {
            videoInfo.flags = MediaCodec.BUFFER_FLAG_KEY_FRAME;
            Pair<ByteBuffer, ByteBuffer> videoData = decodeSpsPpsFromBuffer(videoBuffer, size);
            if (videoData != null) {
                rtspClient.setIsStereo(true);
                rtspClient.setSampleRate(44100);
                presentTimeUs = System.nanoTime() / 1000;
                ByteBuffer newSps = videoData.first;
                ByteBuffer newPps = videoData.second;
                rtspClient.setSPSandPPS(newSps, newPps, null);
                rtspClient.setProtocol(Protocol.TCP);
                rtspClient.connect(this.endpoint);
            }
        }
        ByteBuffer h264Buffer = ByteBuffer.wrap(videoBuffer);
        rtspClient.sendVideo(h264Buffer, videoInfo);

    };

    private VideoFeeder.VideoFeed standardVideoFeeder;
    private String endpoint = "rtsp://192.168.1.100:8554/dji/demo";

    public StreamVideo(String endpoint) {
        this.endpoint = endpoint;
        standardVideoFeeder = VideoFeeder.getInstance().provideTranscodedVideoFeed();
        standardVideoFeeder.addVideoDataListener(mReceivedVideoDataListener);
    }

    @Override
    public void start() {
        started = true;
    }

    @Override
    public void end() {
        started = false;
        rtspClient.disconnect();
    }

    private Pair<ByteBuffer, ByteBuffer> decodeSpsPpsFromBuffer(byte[] csd, int length) {
        byte[] mSPS = null, mPPS = null;
        int i = 0;
        int spsIndex = -1;
        int ppsIndex = -1;
        while (i < length - 4) {
            if (csd[i] == 0 && csd[i + 1] == 0 && csd[i + 2] == 0 && csd[i + 3] == 1) {
                if (spsIndex == -1) {
                    spsIndex = i;
                } else {
                    ppsIndex = i;
                    break;
                }
            }
            i++;
        }
        if (spsIndex != -1 && ppsIndex != -1) {
            mSPS = new byte[ppsIndex];
            System.arraycopy(csd, spsIndex, mSPS, 0, ppsIndex);
            mPPS = new byte[length - ppsIndex];
            System.arraycopy(csd, ppsIndex, mPPS, 0, length - ppsIndex);
        }
        if (mSPS != null && mPPS != null) {
            return new Pair<>(ByteBuffer.wrap(mSPS), ByteBuffer.wrap(mPPS));
        }
        return null;
    }
}

I am hosting https://github.com/pedroSG94/vlc-example-streamplayer as the server on my local machine and trying to access it using the VLC player.

DJI's VideoFeeder is outputting an encoded h264 buffer.
My question is: what could be the reason for this bad video feed?

Reference:
https://developer.dji.com/api-reference/android-api/BaseClasses/DJIVideoFeeder.html?search=videofeed&i=2&
DJI-Mobile-SDK-Tutorials/Android-VideoStreamDecodingSample#33

@pedroSG94
Owner

Can you share the crash logcat?

@keeeenion
Author

keeeenion commented Mar 29, 2021

@pedroSG94
The crash is happening on the receiving end.
I am using your Simple RTSP Server and VLC to test the solution.
After some time VLC stops working. I never get a clear frame, rather half of one frame and half of another merged together.

2021/03/30 00:57:51 [I] [1/0/0] [client 192.168.1.186:49922] connected (RTSP/TCP)
2021/03/30 00:57:51 [I] [1/1/0] [client 192.168.1.186:49922] is publishing to path 'dji/demo', 2 tracks with udp
2021/03/30 01:01:24 [I] [2/1/0] [client 127.0.0.1:56903] connected (RTSP/TCP)
2021/03/30 01:01:24 [I] [2/1/1] [client 127.0.0.1:56903] is reading from path 'dji/demo', 2 tracks with udp
2021/03/30 01:01:39 [I] [2/1/1] [client 127.0.0.1:56903] ERR: read tcp 127.0.0.1:8554->127.0.0.1:56903: wsarecv: An existing connection was forcibly closed by the remote host.
2021/03/30 01:01:39 [I] [1/1/0] [client 127.0.0.1:56903] disconnected
2021/03/30 01:03:28 [I] [2/1/0] [client 127.0.0.1:56944] connected (RTSP/TCP)
2021/03/30 01:03:28 [I] [2/1/1] [client 127.0.0.1:56944] is reading from path 'dji/demo', 2 tracks with udp
2021/03/30 01:03:35 [I] [2/1/1] [client 127.0.0.1:56944] ERR: read tcp 127.0.0.1:8554->127.0.0.1:56944: wsarecv: An existing connection was forcibly closed by the remote host.
2021/03/30 01:03:35 [I] [1/1/0] [client 127.0.0.1:56944] disconnected

I feel like the feed being sent is corrupted and full frames are not being parsed.

[screenshot: bad RTSP feed]

This is actually the best result I have seen. The fps is nonexistent as well; it is like looking at a still image.

@pedroSG94
Owner

Maybe your server needs to receive audio frames to work. Some servers require both video and audio frames, or they will close the connection after a few seconds.

To confirm that, you can do this:

@keeeenion
Author

@pedroSG94
I can confirm that it is working even without audio.
It did halt for a second or two, but continued without issues.

I suspect the buffer from the drone is bad or the decoding is done incorrectly.

One off-topic question: can an SDP file be generated with this library?

@pedroSG94
Owner

I don't know whether the buffer is fine or not, but it may be a good idea to decode the buffer and re-encode it to generate a valid one.
About SDP: this library generates the SDP body for the ANNOUNCE command here:
https://github.com/pedroSG94/rtmp-rtsp-stream-client-java/blob/master/rtsp/src/main/java/com/pedro/rtsp/rtsp/CommandsManager.kt#L202
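
For reference, that ANNOUNCE body is plain SDP. A minimal H264 video description looks roughly like this (the base64 placeholders stand for the encoded SPS and PPS, which is why the client has to extract them before connecting):

v=0
o=- 0 0 IN IP4 127.0.0.1
s=Unnamed
c=IN IP4 192.168.1.100
t=0 0
m=video 0 RTP/AVP 96
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1; sprop-parameter-sets=<base64 SPS>,<base64 PPS>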

@hogantan

Hello all, I am also trying to stream the raw drone data to an RTSP server.

However, I am having difficulty regarding the keyframes of the video data. As per the above, the condition if (naluType == 5 && !rtspClient.isStreaming()) is never true for me; videoBuffer[0] is always 0.

Does anyone know how to get the naluType, or any way of getting the keyframe? I appreciate any advice! My code is the same as above and I am using a Mavic 2 Pro.

@pedroSG94
Owner

Hello,

I think that in your case you have a video buffer in Annex B format, so it starts with (0x00, 0x00, 0x00, 0x01, nalType) or (0x00, 0x00, 0x01, nalType).
There is a method that gets the offset of the nalType byte (3, 4, or 0 if not found, which means you have AVC format or invalid h264) here:
https://github.com/pedroSG94/rtmp-rtsp-stream-client-java/blob/master/rtsp/src/main/java/com/pedro/rtsp/utils/Utils.kt#L29

@hogantan

Thanks for the quick reply!

So I have managed to get the nalType by doing naluType = UtilsKt.getVideoStartCodeSize(ByteBuffer.wrap(videoBuffer)); and I am getting 4 consistently.

It seems like I am missing the keyframe? Any idea what might be causing this?

@pedroSG94
Owner

pedroSG94 commented Apr 19, 2022

Not exactly. It should be something like this:

int naluIndex = UtilsKt.getVideoStartCodeSize(ByteBuffer.wrap(videoBuffer));
//For example: videoBuffer = 0x00, 0x00, 0x00, 0x01, nalByte, ... (here getVideoStartCodeSize returns 4, the offset of the NALU header byte)
int naluType = videoBuffer[naluIndex] & 0x1f;

I suggest you print the first 5 bytes of each buffer to check that you have this pattern and that you are using Annex B, as I suspect.
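
For instance, a tiny helper along these lines (a debugging sketch, not part of the library) prints that pattern:

import android.util.Log;

public final class NaluDebug {

  //Logs the first bytes of the buffer so the start code pattern can be checked by eye.
  //e.g. "0x00 0x00 0x00 0x01 0x65" is a 4 byte start code followed by an IDR NALU (0x65 & 0x1f == 5).
  public static void logFirstBytes(byte[] videoBuffer, int size) {
    StringBuilder sb = new StringBuilder();
    for (int i = 0; i < Math.min(5, size); i++) {
      sb.append(String.format("0x%02X ", videoBuffer[i]));
    }
    Log.d("NaluDebug", "first bytes: " + sb.toString().trim());
  }
}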

@hogantan

These are the video buffer bytes:
[screenshot: video buffer bytes]

Now I get two naluType values, 1 and 7, but setting either of them still does not work.
When 1 is set, I get: Error to extract video data
When 7 is set, I get: waiting for keyframe

[screenshot]

@pedroSG94
Owner

naluType 7 = SPS. You can check it here:
https://yumichan.net/video-processing/video-compression/introduction-to-h264-nal-unit/
Now you need to get the PPS (8). Normally you should have the SPS and PPS together with keyframes (5), which is the reason for the code above.
Also, remember that keyframes are produced at an interval, so you may need to wait a few seconds until you receive one.
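
Since the SPS (7), PPS (8) and IDR slice (5) often arrive concatenated in a single Annex B buffer, checking only the first NALU can miss them. A small scanner along these lines (a sketch, not part of the library) lists every NALU type a buffer contains:

import java.util.ArrayList;
import java.util.List;

public final class AnnexBScanner {

  //Walks an Annex B buffer and collects the type of every NALU it contains.
  //A buffer holding SPS + PPS + IDR returns [7, 8, 5], so a keyframe can be
  //detected even though the first NALU of the buffer is not type 5.
  public static List<Integer> naluTypes(byte[] buffer, int length) {
    List<Integer> types = new ArrayList<>();
    int i = 0;
    while (i + 3 < length) {
      if (buffer[i] == 0 && buffer[i + 1] == 0 && buffer[i + 2] == 0
          && buffer[i + 3] == 1 && i + 4 < length) {
        types.add(buffer[i + 4] & 0x1f); //4 byte start code
        i += 5;
      } else if (buffer[i] == 0 && buffer[i + 1] == 0 && buffer[i + 2] == 1) {
        types.add(buffer[i + 3] & 0x1f); //3 byte start code
        i += 4;
      } else {
        i++;
      }
    }
    return types;
  }
}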

@aviaot

aviaot commented May 9, 2022

> naluType 7 = SPS. … you may need to wait a few seconds until you receive one.

Hi, I got the same 1 and 7. Can you please give some reference examples (code)?

@ilterpehlivan

Hi @pedroSG94, thanks for this great library!

I also have a use case for streaming a DJI drone camera feed (already H264-encoded) to an RTSP server.

I tried your recommendations and hit the same naluType problem as @hogantan. Honestly, I have no background in H264 encoding/decoding, so I am stuck there and considering switching to the less efficient RTMP. Before switching, I wanted to understand the need for going into these low-level H264 details: we already have H264-encoded data, and if we connect to the RTSP server successfully, shouldn't we just be playing that data? Why are all these naluType checks needed? To illustrate what I mean, I drew the following flow chart; I would appreciate your comments, thanks.

graph TD
    I("LaunchComponent.Init()") --> |1.Connect|R(RtspClient)-->S[RTSP Server]
    I --> |2.Start Camera|A[H264 Video Stream] --> B("SDK.onReceive([]byte,size)")
    B -->|convert2ByteBuffer|B -->|Publish h264 bytebuffer|R

@pedroSG94
Owner

pedroSG94 commented May 12, 2022

Hello,

Let me explain the reason for the naluType checks.

H264 carries the video configuration in the SPS and PPS (NALU types 7 and 8), which are frequently also included with keyframes (NALU type 5).
This info (SPS and PPS) is necessary to start a stream in both RTMP and RTSP, so you need to provide it. This is the reason for the naluType problem.
Normally you should get keyframes (NALU type 5) at a consistent interval of X seconds (X is configured on the encoder side; in my case it is the iFrameInterval value of the prepareVideo method).

I was looking a bit at the DJI SDK code and wrote an alternative that re-encodes the h264 buffer, since you are not getting the video info. This is the example code. It may be bugged (I can't test it). Try it and let me know if you have any problem (read the header comments for more info):

import android.content.Context;
import android.os.Build;

import androidx.annotation.RequiresApi;

import com.pedro.encoder.video.FormatVideoEncoder;
import com.pedro.encoder.video.GetVideoData;
import com.pedro.encoder.video.VideoEncoder;
import com.pedro.rtplibrary.view.GlInterface;
import com.pedro.rtplibrary.view.OffScreenGlThread;

import dji.sdk.camera.VideoFeeder;
import dji.sdk.codec.DJICodecManager;

/**
 * NOTE: This is not tested.
 *
 * Example code for DJI to generate valid h264 to send to the RtspClient or RtmpClient classes.
 * We are doing the following:
 * - Get frames from the DJI device in h264.
 * - Decode these frames into the GlInterface class using DJICodecManager.
 * - Copy frames from the GlInterface to the VideoEncoder surface.
 * - VideoEncoder detects that copy automatically and re-encodes it to h264 that should be valid to stream.
 *
 * After that, using the GetVideoData provided in the start method, you should get the sps and pps in the onSpsPpsVps callback, where you
 * can start your stream, and send data to the stream using the getVideoData callback.
 *
 * Extra info:
 * GlInterface is an off-screen thread that provides an OpenGl surfaceTexture that you can render. It also
 * allows you to copy data from that surfaceTexture to another one, add filters, take a photo, etc.
 * Keep in mind that onSpsPpsVps could be called multiple times, so using a conditional is recommended.
 */
@RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN_MR2)
public class DJIExample implements VideoFeeder.VideoDataListener {

  private DJICodecManager codecManager;
  private VideoEncoder videoEncoder;
  private GlInterface glInterface;
  //encoder and decoder configuration
  private final int width = 640;
  private final int height = 480;
  private final int fps = 30;
  private final int bitrate = 1200 * 1000;
  private final int iFrameInterval = 2;
  private boolean init = false;


  public void start(Context context, GetVideoData getVideoData) {
    if (!init) { //init this only one time
      try {
        VideoFeeder.getInstance().getPrimaryVideoFeed().addVideoDataListener(this);
        init = true;
      } catch (Exception ignored) { }
    }
    videoEncoder = new VideoEncoder(getVideoData);
    videoEncoder.prepareVideoEncoder(width, height, fps, bitrate, 0, iFrameInterval, FormatVideoEncoder.SURFACE);
    glInterface = new OffScreenGlThread(context);
    glInterface.init();
    glInterface.setEncoderSize(width, height);
    glInterface.start();
    videoEncoder.start();
    glInterface.addMediaCodecSurface(videoEncoder.getInputSurface());
    codecManager = new DJICodecManager(context, glInterface.getSurfaceTexture(), width, height);
  }

  public void stop() {
    glInterface.removeMediaCodecSurface();
    videoEncoder.stop();
    glInterface.stop();
    codecManager.cleanSurface();
    videoEncoder = null;
    glInterface = null;
    codecManager = null;
  }

  @Override
  public void onReceive(byte[] bytes, int i) {
    if (codecManager != null) {
      codecManager.sendDataToDecoder(bytes, i);
    }
  }
}
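
To sketch how this class could be wired to the client (untested; the GetVideoData callback signatures below are an assumption based on the header comment above and may differ between library versions):

RtspClient rtspClient = ... //built with a ConnectCheckerRtsp, as in the first comment
DJIExample example = new DJIExample();
example.start(context, new GetVideoData() {
  @Override
  public void onSpsPpsVps(ByteBuffer sps, ByteBuffer pps, ByteBuffer vps) {
    //onSpsPpsVps can fire multiple times; only connect once
    if (!rtspClient.isStreaming()) {
      rtspClient.setSPSandPPS(sps, pps, null); //no vps needed for h264
      rtspClient.connect("rtsp://192.168.1.100:8554/dji/demo");
    }
  }

  @Override
  public void getVideoData(ByteBuffer h264Buffer, MediaCodec.BufferInfo info) {
    if (rtspClient.isStreaming()) rtspClient.sendVideo(h264Buffer, info);
  }

  //any remaining GetVideoData callbacks (e.g. onVideoFormat) can be left empty
});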

@hogantan

hogantan commented May 20, 2022

> Let me explain the reason for the naluType checks. … Try it and let me know if you have any problem.

It works! However, an issue I am facing is that after starting the RTSP stream above, the live video feed on my device freezes (the SurfaceTexture seems to disappear after starting the stream), but the RTSP live stream is playable. Any idea what is causing this? I am using two different codec managers, one for the live video feed and one for the RTSP stream.

This happens every time I instantiate another codec manager: new DJICodecManager(context, glInterface.getSurfaceTexture(), width, height);

Is there a way to display the glInterface on my Android device? That way I won't have to use two DJICodecManagers and can use the glInterface video output for both the RTSP stream and the live video feed on my device.

@pedroSG94
Owner

pedroSG94 commented May 20, 2022

@supr3me

supr3me commented Nov 2, 2022

> Let me explain the reason for the naluType checks. … Try it and let me know if you have any problem.

Hi @pedroSG94,
I have a problem: in the SDK I use getVideoStartCodeSize() to process the h264 data and get the naluType, but I only ever get naluType 1, 7, or 9, never 8 or 5.
[screenshot]

How can I push this data over RTMP using your library?
Thank you very much!

@RishiDate

RishiDate commented Jul 12, 2023

Hi @pedroSG94 @ilterpehlivan @keeeenion @supr3me @aviaot, thanks for this great library. I have implemented the code below to stream the video feed coming from a Skydroid FPV camera mounted on an ArduPilot drone. But after playing the stream on the server, corrupted data is received and the video feed looks like the video attached below. In my case, I'm using an RTMP Ant Media server. Thanks.

  private void onReceive(byte[] videoBuffer, int size) {
    videoInfo.size = size;
    videoInfo.offset = 0;
    videoInfo.flags = MediaCodec.BUFFER_FLAG_PARTIAL_FRAME;
    videoInfo.presentationTimeUs = System.nanoTime() / 1000 - presentTimeUs;
    int naluIndex = getVideoStartCodeSize(videoBuffer);
    naluType = videoBuffer[naluIndex] & 0x1f;
    if (naluType == 5 && !rtmpClient.isStreaming()) {
        videoInfo.flags = MediaCodec.BUFFER_FLAG_KEY_FRAME;
        Pair<ByteBuffer, ByteBuffer> videoData = decodeSpsPpsFromBuffer(videoBuffer, size);
        if (videoData != null) {
            presentTimeUs = System.nanoTime()/1000;
            ByteBuffer newSps = videoData.first;
            ByteBuffer newPps = videoData.second;
            rtmpClient.setVideoInfo(newSps, newPps, null);
            rtmpClient.connect(serverUrl, true);
        }
    }
    if (rtmpClient.isStreaming()) {
        ByteBuffer h264Buffer = ByteBuffer.wrap(videoBuffer);
        rtmpClient.sendVideo(h264Buffer, videoInfo);
    }
}

 private int getVideoStartCodeSize(byte[] wrap) {

    int startCodeSize = 0;
    //guard against buffers shorter than the start code plus the NALU header byte
    if (wrap.length >= 5 && wrap[0] == 0x00 && wrap[1] == 0x00
            && wrap[2] == 0x00 && wrap[3] == 0x01) {
        //match 00 00 00 01
        startCodeSize = 4;
    } else if (wrap.length >= 4 && wrap[0] == 0x00 && wrap[1] == 0x00
            && wrap[2] == 0x01) {
        //match 00 00 01
        startCodeSize = 3;
    }
    return startCodeSize;
}

private static Pair<ByteBuffer, ByteBuffer> decodeSpsPpsFromBuffer(byte[] csd, int length) {
    byte[] mSPS = null, mPPS = null;
    int i = 0;
    int spsIndex = -1;
    int ppsIndex = -1;
    while (i < length - 4) {
        if (csd[i] == 0 && csd[i + 1] == 0 && csd[i + 2] == 0 && csd[i + 3] == 1) {
            if (spsIndex == -1) {
                spsIndex = i;
            } else {
                ppsIndex = i;
                break;
            }
        }
        i++;
    }
    if (spsIndex != -1 && ppsIndex != -1) {
        mSPS = new byte[ppsIndex];
        System.arraycopy(csd, spsIndex, mSPS, 0, ppsIndex);
        mPPS = new byte[length - ppsIndex];
        System.arraycopy(csd, ppsIndex, mPPS, 0, length - ppsIndex);
    }
    if (mSPS != null && mPPS != null) {
        return new Pair<>(ByteBuffer.wrap(mSPS), ByteBuffer.wrap(mPPS));
    }
    return null;
}
20230705_131126.mp4

@ghost

ghost commented Nov 28, 2023

> public class DJIExample implements VideoFeeder.VideoDataListener { … }

@pedroSG94 how do I use this class? There is no connect method, nor any way to send the video info or the video data.

@pedroSG94
Owner

I will close this issue and respond in your issue.
