This issue was moved to a discussion.
You can continue the conversation there. Go to discussion →
Non-monotonous DTS in output stream #144
hello, the "Non-monotonous DTS in output stream" message is generated by ffmpeg, and it is related to the ffmpeg → HLS conversion, not to the server. There is plenty of information online about how to fix it (well, it depends on your stream).
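For intuition, the warning means that a packet's DTS was not greater than the previous packet's DTS in the same output stream. Below is a minimal sketch of the kind of clamping a muxer can apply when this happens; this is an illustration of the concept, not ffmpeg's actual code, and all timestamp values are invented:

```python
def fix_non_monotonic_dts(dts_list):
    """Return a strictly increasing copy of dts_list, clamping any
    timestamp that regresses to just past the previous one."""
    fixed = []
    prev = None
    for dts in dts_list:
        if prev is not None and dts <= prev:
            dts = prev + 1  # clamp: keep timestamps strictly increasing
        fixed.append(dts)
        prev = dts
    return fixed

# The third timestamp goes backwards (2000 after 3000), which is exactly
# the situation the ffmpeg warning describes.
print(fix_non_monotonic_dts([0, 3000, 2000, 6000]))  # [0, 3000, 3001, 6000]
```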
Hello :). Thank you for your response. A more specific question then: could rtsp-simple-server be the reason why the RTSP stream, when checked with ffprobe, does not return the "start_pts": 15000 (for example) that is set on the camera's original RTSP stream? This happens even though, in the ffmpeg processing, we set it using one of the three ways specified in the official documentation. Could it be the server?
set_pts_value_stable = '(RTCTIME - RTCSTART) / (TB * 1000000)'
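For context, that setpts expression stamps each frame with the wallclock time elapsed since processing started: RTCTIME and RTCSTART are in microseconds, and dividing by TB * 1000000 converts the elapsed seconds into time-base ticks. A worked sketch of the arithmetic, with invented timestamps for illustration:

```python
from fractions import Fraction

# Illustration of ffmpeg's setpts expression
#   (RTCTIME - RTCSTART) / (TB * 1000000)
# RTCTIME and RTCSTART are wallclock microseconds; TB is the stream time base.

def setpts_stable(rtctime_us, rtcstart_us, tb=Fraction(1, 90000)):
    # Elapsed wallclock seconds divided by the time base gives the PTS
    # expressed in time-base ticks.
    return (rtctime_us - rtcstart_us) / (tb * 1_000_000)

# One second of elapsed wallclock time -> 90000 ticks at a 1/90000 time base.
print(int(setpts_stable(2_000_000, 1_000_000)))  # 90000
```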
This was quick 👍🏻, thank you for the answer.

PS C:\Users\Administrator\Desktop\> .\ffprobe.exe -v quiet -print_format json -show_streams rtsp://10.0.10.7/Src/MediaInput/h264/stream_1
{
"streams": [
{
"index": 0,
"codec_name": "h264",
"codec_long_name": "H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10",
"profile": "High",
"codec_type": "video",
"codec_time_base": "1/30",
"codec_tag_string": "[0][0][0][0]",
"codec_tag": "0x0000",
"width": 1920,
"height": 1080,
"coded_width": 1920,
"coded_height": 1088,
"closed_captions": 0,
"has_b_frames": 0,
"sample_aspect_ratio": "1:1",
"display_aspect_ratio": "16:9",
"pix_fmt": "yuv420p",
"level": 40,
"color_range": "tv",
"chroma_location": "left",
"field_order": "progressive",
"refs": 1,
"is_avc": "false",
"nal_length_size": "0",
"r_frame_rate": "30/1",
"avg_frame_rate": "15/1",
"time_base": "1/90000",
"start_pts": 15000,
"start_time": "0.166667",
"bits_per_raw_sample": "8",
"disposition": {
"default": 0,
"dub": 0,
"original": 0,
"comment": 0,
"lyrics": 0,
"karaoke": 0,
"forced": 0,
"hearing_impaired": 0,
"visual_impaired": 0,
"clean_effects": 0,
"attached_pic": 0,
"timed_thumbnails": 0
}
}
]
}

And through rtsp-simple-server I get:

Input #0, rtsp, from 'rtsp://10.0.10.1:7069/mycamera':
Metadata:
title : Stream
Duration: N/A, bitrate: N/A
Stream #0:0: Video: h264, none, 90k tbr, 90k tbn, 180k tbc
{
"streams": [
{
"index": 0,
"codec_name": "h264",
"codec_long_name": "H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10",
"codec_type": "video",
"codec_time_base": "0/2",
"codec_tag_string": "[0][0][0][0]",
"codec_tag": "0x0000",
"width": 0,
"height": 0,
"coded_width": 0,
"coded_height": 0,
"closed_captions": 0,
"has_b_frames": 0,
"level": -99,
"refs": 1,
"is_avc": "false",
"nal_length_size": "0",
"r_frame_rate": "90000/1",
"avg_frame_rate": "0/0",
"time_base": "1/90000",
"disposition": {
"default": 0,
"dub": 0,
"original": 0,
"comment": 0,
"lyrics": 0,
"karaoke": 0,
"forced": 0,
"hearing_impaired": 0,
"visual_impaired": 0,
"clean_effects": 0,
"attached_pic": 0,
"timed_thumbnails": 0
}
}
]
}

with ffmpeg command:

ffmpeg.exe
-hide_banner
-loglevel warning
-i rtsp://10.0.10.7/Src/MediaInput/h264/stream_1
-vf [in]scale=320:180,setpts=N/FRAME_RATE/TB
-c:v libx264
-b:v 150k
-g 10
-keyint_min 10
-preset ultrafast
-f rtsp
-rtsp_transport tcp
rtsp://10.0.10.1:7069/mycamera
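As a sanity check on the camera's numbers, start_time is simply start_pts expressed in seconds via the stream's time_base (1/90000 in the first ffprobe output above):

```python
from fractions import Fraction

# Values taken from the camera's ffprobe output above.
start_pts = 15000               # in time_base ticks
time_base = Fraction(1, 90000)  # "time_base": "1/90000"

start_time = float(start_pts * time_base)
print(round(start_time, 6))  # 0.166667, matching "start_time": "0.166667"
```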
Ok, I didn't understand the question at first. So the question is: does the server return the actual timestamp of the stream when queried about the stream with the DESCRIBE method? If the aim is simply generating HLS, my advice is to regenerate the PTS from the frames, since RTSP has its own synchronization mechanism (to get the current timestamp, just do ...). Otherwise, if you absolutely need ffprobe to return the timestamp of the input, we can think about implementing an additional option to activate it when needed.
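The synchronization mechanism referred to here is RTCP: per RFC 3550, sender reports periodically pair an RTP timestamp with an NTP wallclock time, which lets a receiver map any RTP timestamp to absolute time. A minimal sketch of that mapping (all concrete numbers are invented for illustration, and 32-bit RTP timestamp wraparound is ignored):

```python
CLOCK_RATE = 90000  # standard RTP clock rate for H.264 video

def rtp_to_wallclock(rtp_ts, sr_rtp_ts, sr_ntp_seconds, clock_rate=CLOCK_RATE):
    """Map an RTP timestamp to wallclock seconds using the most recent
    RTCP sender report, which paired sr_rtp_ts with sr_ntp_seconds."""
    return sr_ntp_seconds + (rtp_ts - sr_rtp_ts) / clock_rate

# Example: the sender report paired RTP timestamp 90000 with wallclock
# t=100.0 s, so RTP timestamp 180000 corresponds to t=101.0 s.
print(rtp_to_wallclock(180000, 90000, 100.0))  # 101.0
```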
Yes, that's what I was thinking too. The second solution would be preferred, but I have no idea how complicated it would be to implement, since I know next to nothing about Golang; we'll have to see how the first option works out. Great work, and thank you again for clarifying this!
Well, the real problem is finding out where the timestamp must be inserted into the SDP, how and when it has to be updated, what to do when a camera doesn't provide it (it is filtered out because most cameras don't provide it, while they are obliged to provide the RTP timestamp), and what its effect is on all the clients.
@aler9 one more thing that we tried is running a temporary
Any ideas where the Linux version and the Windows version might differ? We used the same configuration and the same version on both OSes.
To add to the weirdness, I started yet another
Which version are you using?
v0.12.2
Which operating system are you using?
Windows
Question
Hi, does the RTSP server support DTS/PTS?
I see that message in the log files when I input the stream into ffmpeg to output HLS or MPEG-DASH.