Using ffmpeg.js as media codec and/or media container polyfill #12
Comments
But I don't recommend using ffmpeg.js in that way. It's not a good idea to force users to wait for the entire file to be downloaded, transcoded, etc. You will also need to store it in memory. Take a look at aurora.js.
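For context, this is roughly what that whole-file usage looks like with the ffmpeg.js worker build, based on the ready/run/stdout/stderr/done message API from its README. It's only a sketch; the file names, codecs, and the `<video>` element wiring are placeholders, not anything from this thread:

```js
// Sketch of whole-file transcoding with the ffmpeg.js worker build.
// The entire source has to be downloaded and held in memory first, and
// nothing is playable until the "done" message delivers the finished output.
function transcodeWholeFile(inputBytes /* Uint8Array of the full source file */) {
  var worker = new Worker("ffmpeg-worker-webm.js");
  worker.onmessage = function (e) {
    var msg = e.data;
    switch (msg.type) {
      case "ready":
        worker.postMessage({
          type: "run",
          MEMFS: [{ name: "input.avi", data: inputBytes }],
          arguments: ["-i", "input.avi", "-c:v", "libvpx", "-c:a", "libvorbis", "out.webm"],
        });
        break;
      case "stderr":
        console.log(msg.data); // ffmpeg log output, useful for progress
        break;
      case "done":
        // The whole result arrives at once as an in-memory file.
        var out = msg.data.MEMFS[0];
        var url = URL.createObjectURL(new Blob([out.data], { type: "video/webm" }));
        document.querySelector("video").src = url;
        break;
    }
  };
}
```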
Is it possible to not wait for the entire file to be downloaded and instead transcode it on the fly? That's what I meant in #11.
FFMPEG is very interesting because it supports almost any format.
You may be able to transcode an HTTP stream on the fly, but you won't see the intermediate result in the main process, because the worker only sends the entire output back once it has finished.
They have plenty of codecs and it should be rather easy to add more. If you need to decode just about any audio format available in libavcodec, then you might consider writing a small wrapper in C which uses the Emscripten API for HTTP requests (like I mentioned in #11) and calls the libavcodec API to decode downloadable chunks of the audio stream. ffmpeg.js is just not intended for that purpose; it's more like the ffmpeg CLI utility in pure JavaScript. You won't play audio files with the ffmpeg CLI, right? You may also take a look at StreamFile.js and OGVPlayer.js from the ogv.js project for an example of a streamable decoder in JS. It downloads stream data via common JS code, fills a queue, and passes it to a C demuxer/decoder compiled with Emscripten.
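A rough JS-side sketch of that streaming pattern (download via plain JS, queue chunks, feed them to the decoder) might look like the following. Note that `decoder-worker.js`, its chunk/pcm message protocol, and `playPcm` are purely hypothetical stand-ins for such a custom libavcodec wrapper; they are not part of ffmpeg.js or ogv.js:

```js
// Sketch of the "download via JS, queue chunks, feed the decoder" pattern.
var decoder = new Worker("decoder-worker.js"); // hypothetical libavcodec wrapper

decoder.onmessage = function (e) {
  if (e.data.type === "pcm") {
    // Hand decoded samples to e.g. the Web Audio API (playPcm is a placeholder).
    playPcm(e.data.samples, e.data.sampleRate);
  }
};

fetch("http://example.com/stream.opus").then(function (response) {
  var reader = response.body.getReader();
  return (function pump() {
    return reader.read().then(function (result) {
      if (result.done) {
        decoder.postMessage({ type: "flush" });
        return;
      }
      // Transfer each downloaded chunk to the decoder worker as it arrives.
      decoder.postMessage({ type: "chunk", data: result.value.buffer },
                          [result.value.buffer]);
      return pump();
    });
  })();
});
```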
You may want to take a look at https://github.com/duanyao/codecbox.js, which provides a decoder API over ffmpeg. Currently it only supports …
One crazy idea just came to my mind: you could try to output the transcode result to stdout and read that in the main process (…).
Note that as far as I know there's no way to append data to a live blob URL, so streaming data straight into a video or audio element during transcoding may require using Media Source Extensions. Beware also that encoding video tends to be very slow, much slower than real time at non-trivial resolutions, though if you're only transcoding the audio and just remuxing the video it might work well enough. My ogv.js is a full analogue of the media elements, suitable for replacing an audio or video element at runtime for completely script-mediated playback, but the codecs are not yet fully pluggable and it would not handle H.264 or AAC at this time without some poking.
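To make the Media Source Extensions route concrete, here is a minimal sketch. The codec string and `getNextSegment()` are placeholders for whatever the transcoder actually emits, and MSE additionally requires properly segmented output (e.g. WebM clusters or fragmented MP4):

```js
// Minimal Media Source Extensions sketch: append transcoded segments as they
// become available instead of waiting for one finished file.
var video = document.querySelector("video");
var mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", function () {
  // The codec string must match whatever the transcoder actually produces.
  var sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8,vorbis"');

  // getNextSegment() is a placeholder for "the next transcoded chunk",
  // e.g. delivered from a worker; here it resolves to null when data runs out.
  function appendNext() {
    getNextSegment().then(function (segment) {
      if (!segment) {
        mediaSource.endOfStream();
        return;
      }
      sourceBuffer.appendBuffer(segment); // ArrayBuffer or typed array
      sourceBuffer.addEventListener("updateend", appendNext, { once: true });
    });
  }
  appendNext();
});
```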
What else is the FFplay program intended for?
Is it possible to use ffmpeg.js as a media codec and/or media container polyfill for the HTML5 `<audio>`/`<video>` tags in a web browser, to transcode formats that are not supported in software or hardware?

Example 1: There is a `test.htm` page with an `<audio src="test.opus" />` tag. The `test.opus` file is encoded with the `Opus` audio codec, but the web browser doesn't support `Opus`. Is it possible to decode `test.opus` to an uncompressed `test.wav` (and change the tag to `<audio src="test.wav" />`)?

Example 2: There is a `test.htm` page with a `<video src="test.mp4" />` tag. The `test.mp4` file is encoded with the `H.264` video codec and the `AC3` audio codec, but the web browser doesn't support `AC3`. Is it possible to decode only the `AC3` audio to uncompressed audio data (PCM, which the browser supports), leave the `H.264` video untouched (the browser supports it natively), and view `test.mp4` with working video and audio in the browser?
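For what it's worth, Example 1 maps fairly directly onto the synchronous ffmpeg.js API described in its README. The sketch below assumes a build whose decoders include Opus (the stock builds may not) and a bundler for `require`; it also blocks the calling thread, so in practice the worker build would be preferable:

```js
// Example 1 as a sketch: decode test.opus to WAV in memory, then point the
// <audio> element at a blob URL.
var ffmpeg = require("ffmpeg.js"); // assumes a bundler; the worker build works similarly

function opusToWavUrl(opusBytes /* Uint8Array */) {
  var result = ffmpeg({
    MEMFS: [{ name: "test.opus", data: opusBytes }],
    arguments: ["-i", "test.opus", "test.wav"],
  });
  var wav = result.MEMFS[0]; // { name: "test.wav", data: Uint8Array }
  return URL.createObjectURL(new Blob([wav.data], { type: "audio/wav" }));
}

fetch("test.opus")
  .then(function (r) { return r.arrayBuffer(); })
  .then(function (buf) {
    document.querySelector("audio").src = opusToWavUrl(new Uint8Array(buf));
  });
```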