Qengineering/RTSP-with-OpenCV


RTSP - UDP - TCP streams in OpenCV (with negligible latency)

There is a well-known issue with RTSP streams and time-consuming algorithms such as deep-learning frameworks: if processing an individual frame takes longer than the stream's frame interval, you get out of sync. The delay between reality and the captured images keeps growing.

There are two possible solutions to this problem.

  • You could continuously grab images in a separate thread.
  • Or you could determine which images were missed and skip them before grabbing a new frame. The latter solution is implemented here.

The code speaks for itself.
You can either use GStreamer or FFmpeg to open the stream.
RTSP streams are usually compressed with H.264. Decompression is therefore sensitive to timing issues, such as threads being halted for some time.


Dependencies.

To run the application, you need:

  • OpenCV installed (on a 32-bit or 64-bit OS).
  • Code::Blocks installed. ($ sudo apt-get install codeblocks)

Tips.

Use only a wired Ethernet connection for your RTSP stream; Wi-Fi can be unstable.
Because the RTSP protocol is sensitive to even a single missing frame, the stream can easily crash.

If you are using the stream in a deep learning app, adjust your resolution and frame rate to the requirements of the deep learning model.
It is not a good idea to send a 1280x960 stream at 30 FPS if your model has a 416x416 input and takes 200 ms to process a single frame.
It only costs extra memory and processing power.

If you want to stream UDP or TCP, make sure the streams work from the command line beforehand. If not, they certainly won't work in OpenCV. Errors are often caused by the encoding in the pipeline, the addresses, or missing modules. If you need to install additional GStreamer modules, you'll also need to rebuild OpenCV! For more information, see our website.


Running the app.

To run the application, load the project file RTSPcam.cbp in Code::Blocks.

If you are using a Jetson Nano, you have to change the location where OpenCV stores its header files to /usr/include/opencv4

At line 16 in main.cpp the stream is opened.

RTSP

cam.Open("rtsp://192.168.178.129:8554/test/");

UDP

cam.Open("udpsrc port=5200 ! application/x-rtp, media=video, clock-rate=90000, payload=96 ! rtpjpegdepay ! jpegdec ! videoconvert ! appsink", cv::CAP_GSTREAMER);

Sender: RaspiCam with Raspberry Pi Buster OS
gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=-1 ! video/x-raw, width=640, height=480, framerate=30/1 ! videoconvert ! jpegenc ! rtpjpegpay ! udpsink host=192.168.178.84 port=5200

Sender: RaspiCam with Raspberry Pi Bullseye OS
gst-launch-1.0 -v libcamerasrc ! video/x-raw, width=640, height=480, framerate=30/1 ! videoconvert ! jpegenc ! rtpjpegpay ! udpsink host=192.168.178.84 port=5200

Note that host=192.168.178.84 is the IP address of the receiver.

TCP

cam.Open("tcpclientsrc host=192.168.178.129 port=5000 ! jpegdec ! videoconvert ! appsink", cv::CAP_GSTREAMER);

Sender: RaspiCam with Raspberry Pi Buster OS
gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=-1 ! video/x-raw, width=640, height=480, framerate=30/1 ! videoconvert ! jpegenc ! tcpserversink host=192.168.178.32 port=5000

Sender: RaspiCam with Raspberry Pi Bullseye OS
gst-launch-1.0 -v libcamerasrc ! video/x-raw, width=640, height=480, framerate=30/1 ! videoconvert ! jpegenc ! tcpserversink host=192.168.178.32 port=5000

Note that host=192.168.178.32 is the IP address of the sender.

RaspiCam (Bullseye)

cam.Open("libcamerasrc ! video/x-raw, width=640, height=480, framerate=30/1 ! videoconvert ! videoscale ! video/x-raw, width=640, height=480 ! appsink", cv::CAP_GSTREAMER);

RaspiCam (Buster)

cam.Open("v4l2src device=/dev/video0 ! video/x-raw, width=640, height=480, framerate=30/1 ! videoconvert ! videoscale ! video/x-raw, width=640, height=480 ! appsink", cv::CAP_GSTREAMER);

RaspiCam (only Buster or Bullseye legacy stack)

cam.Open(0);

Webcam (only Buster OS or Bullseye legacy stack)

cam.Open(0); //if RaspiCam is not connected
cam.Open(1); //if RaspiCam is connected

MP4

cam.Open("James.mp4");

Folder

cam.Open("/home/pi/Pictures/Plants");

Single file

cam.Open("/home/pi/Pictures/Garden.jpg");

Retrieve the stream

int main()
{
    cv::Mat frame;
    RTSPcam cam;

    cv::namedWindow("Camera", cv::WINDOW_AUTOSIZE);

    cam.Open("rtsp://192.168.178.129:8554/test/"); //you can dump anything OpenCV eats. (cv::CAP_ANY) BTW, OpenCV tries FFmpeg first

    while(true)
    {
        if(!cam.GetLatestFrame(frame)){
            std::cout << "Capture read error" << std::endl;
            break;
        }
        //place your time-consuming algorithms here
//        std::cout << cam.CurrentFileName << std::endl;
        //show the frame
        cv::imshow("Camera", frame);
        char esc = cv::waitKey(2);
        if(esc == 27) break;
    }
    cv::destroyAllWindows();
    return 0;
}
