
MPEG-TS vs. fMP4? #21

Open
gabek opened this issue Jun 22, 2020 · 8 comments
Labels
  • backlog — Ideas that might be cool and can be looked into later.
  • research — Something to look up or a big question that needs an answer.
  • video — Issues relating to video encoding, codecs, hls output or rtmp input.

Comments

@gabek
Member

gabek commented Jun 22, 2020

I see in Apple's WWDC sessions for this week they're starting to push fMP4 within HLS. It's been around for a while, but I think it's finally starting to take hold. I don't know if there's any benefit for owncast to switch, and there might even be drawbacks (older devices and players not yet supporting it). But I'm leaving this as a placeholder issue to research and discuss at some point in the future. I'd like to find specifically if there's any encoding performance wins or losses.

https://hlsbook.net/hls-fragmented-mp4/
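For reference, the MPEG-TS vs. fMP4 choice is mostly a muxer-level switch in ffmpeg rather than an encoding change. A minimal sketch of the two output modes — the input file, segment duration, and encoder settings here are illustrative, not Owncast's actual pipeline:

```shell
# MPEG-TS segments (ffmpeg's default for the hls muxer):
ffmpeg -i input.mp4 -c:v libx264 -c:a aac -f hls \
  -hls_time 4 stream.m3u8

# fMP4 segments: same encode, different container.
# -hls_segment_type fmp4 emits .m4s media segments, and
# -hls_fmp4_init_filename writes the shared initialization segment.
ffmpeg -i input.mp4 -c:v libx264 -c:a aac -f hls \
  -hls_time 4 -hls_segment_type fmp4 \
  -hls_fmp4_init_filename init.mp4 stream.m3u8
```

Since the video and audio streams are encoded identically in both cases, any performance difference should come from the muxers themselves, which is typically negligible next to the encode.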

@gabek gabek added the research Something to look up or a big question that needs an answer label Jun 22, 2020
@gabek
Member Author

gabek commented Jun 23, 2020

Some old documentation in the ffmpeg wiki, with broken links, seems to suggest there may be a benefit:

https://trac.ffmpeg.org/wiki/StreamingGuide#Codecs

The most popular streaming codec is probably libx264, though if you're streaming to a device which requires a "crippled" baseline h264 implementation, you can use the x264 "baseline" profile. Some have argued that the mpeg4 video codec is better than x264 baseline, because it encodes about as well with less cpu. You can typically use other codecs, like mpeg2video, or really any other video codec you want, as long as your receiver can decode it and it suits your needs.

Also note that encoding to the x264 "baseline" profile is basically a "compatibility mode" for older iOS devices or the like; see here.

The mpeg4 video codec sometimes also comes "within a few percentage points" of the compression of x264 "normal settings", but uses much less cpu to do the encoding. See http://ffmpeg.zeranoe.com/forum/viewtopic.php?f=7&t=631&hilit=mpeg4+libx264+cores&start=10#p2163 for some graphs (which may be slightly outdated). Basically, in that particular test it was 54 fps to 58 fps (libx264 faster), the libx264 file was 5.1MB and the mpeg4 file was 6MB, but mpeg4 used only half as much cpu for its computation, so take it with a grain of salt.

@gabek
Member Author

gabek commented Jun 25, 2020

fMP4 is required for the future Low Latency HLS spec. #28

@gabek
Member Author

gabek commented Jul 12, 2020

Closing since I'm not tackling low latency HLS at the moment. Maybe some day.

@gabek gabek closed this as completed Jul 12, 2020
gabek added a commit that referenced this issue Apr 26, 2022
…rt.js-2.9.30

Bump @types/chart.js from 2.9.28 to 2.9.30
xarantolus pushed a commit to xarantolus/owncast that referenced this issue Feb 14, 2023
@mahmed2000
Contributor

Has this been considered since?

Encoding should be unrelated to the container format. You can use h264+aac with fMP4, and mpeg4 video with MPEG-TS.

@gabek
Member Author

gabek commented Oct 26, 2024

For sure. I figured changing the container format itself wouldn't be a huge lift, so I just did a little proof of concept. #3986

There are still things that need to be worked out:

  • The thumbnail generator is not able to read the fMP4 segments independently, due to the init.mp4 initialization required.
  • The end of stream clip is not working yet.
  • Tests fail.
  • Probably a bunch more.

@gabek gabek reopened this Oct 26, 2024
@gabek gabek added backlog Ideas that might be cool and can be looked into later. video Issues relating to video encoding, codecs, hls output or rtmp input. labels Oct 26, 2024
@mahmed2000
Contributor

  • The init.mp4 file would need extra work on top of that. It needs to persist for as long as the stream does, for anyone who joins late. It will also cause problems with replays, since those will need that segment too. Simply concatenating the init.mp4 file with an m4s segment should work.
  • The end-of-stream clip is trickier. Just converting it to an mp4 file won't work, since that creates two moov atoms, which is invalid. That's a problem because all the encoding and NAL packet metadata is stored there in regular mp4s. The end-of-stream file would need to also be an m4s segment, but I haven't tested how well that works. It might end up having to go through the transcoding pipeline and produce a different clip per stream variant.
  • The tests fail because they still assume the ffmpeg command generates MPEG-TS segments, not fMP4.
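The concatenation suggestion in the first bullet can be sketched as below. An fMP4 media segment carries only moof/mdat boxes and is undecodable on its own; prepending the init segment (ftyp/moov) by plain byte concatenation yields a standalone-readable file. The file names and contents here are placeholders standing in for a real init segment and media segment:

```shell
# Placeholder files standing in for a real fMP4 init segment
# (ftyp + moov boxes) and a media segment (moof + mdat boxes).
printf 'INIT' > init.mp4
printf 'SEG0' > segment0.m4s

# Byte concatenation: init segment first, then the media segment.
cat init.mp4 segment0.m4s > readable.mp4

# A thumbnail generator could then read readable.mp4 directly, e.g.:
#   ffmpeg -i readable.mp4 -frames:v 1 thumbnail.jpg
cat readable.mp4
```

The same trick would work for replays: keep init.mp4 alongside the recorded .m4s segments and concatenate on demand.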

@gabek
Member Author

gabek commented Oct 28, 2024

Yeah, none of those things are impossible, I just didn't go further. I also don't know if it's worth the effort to go further unless there's some tangible benefit. There's still no way to do LL-HLS with the current pipeline, so the fMP4 requirement isn't going to help there.

@mahmed2000
Copy link
Contributor

The main benefit imo is just the potential for more codecs. AV1 and VP9 are better in terms of bandwidth and storage, but can't be muxed into mpeg-ts as of now. If the codecs are kept the same for compatibility and to not add more configuration to the admin, then yeah. There's little point currently.
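To illustrate the codec point: MPEG-TS has no standardized stream-type mapping for VP9 or AV1, while the fMP4 path accepts them. A hedged sketch — input file and encoder settings are illustrative, and player support for VP9 in HLS varies:

```shell
# VP9 video in fMP4 HLS segments. The equivalent command with the
# default MPEG-TS segment type fails, since ffmpeg's mpegts muxer
# has no mapping for VP9 (or AV1).
ffmpeg -i input.mp4 -c:v libvpx-vp9 -c:a aac -f hls \
  -hls_time 4 -hls_segment_type fmp4 \
  -hls_fmp4_init_filename init.mp4 stream.m3u8
```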
