Rendering raw H.264 frames #59
There is an example implementation of pretty much what you described here: https://github.com/soliton4/BroadwayStream My advice: familiarize yourself with it, then return with questions.
Thanks. Let me give it a try. :)
The link you gave looks like a server-side implementation. I've already managed the server side on Android.
@omerjerk did you get it up and running?
Hi, I get H.264 frames from Android's hardware H.264 encoder, and I intend to send those frames to the browser and render them. Please have a look here. Right now, this app sends these frames to another Android device over WebSockets and renders them. Thanks.
Dear @omerjerk, I understand that your problem is quite specific. In order to do that, it's best to familiarize yourself with the little demo I set up.
I'm trying to do the same thing. I looked at your demo code and I have a couple of questions:
If I send the entire fox.h264 file to the browser, how do I play it back using Broadway? Can I do the following? var p = new Player({}); Is this expected to work? Also, if the above code works, I would like to make it streamable.
If you feed the whole file at once, it is not streaming anymore, is it? Also, I am not sure if you need to split it into NAL units. You are on the safe side if you seek for 00 00 00 01 in the stream and split it there. Looks to me like you figured out how it works.
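The start-code scan suggested above can be sketched in Node.js. splitAnnexB is a hypothetical helper name, and this sketch assumes 4-byte 00 00 00 01 start codes only (some encoders also emit 3-byte 00 00 01 codes, which it does not handle):

```javascript
// Split an Annex B H.264 byte stream into NAL units by scanning for
// the 00 00 00 01 start code. Each returned unit keeps its start code,
// which is what a decoder expecting Annex B input wants to see.
function splitAnnexB(buf) {
  const units = [];
  let start = -1;
  for (let i = 0; i + 3 < buf.length; i++) {
    if (buf[i] === 0 && buf[i + 1] === 0 && buf[i + 2] === 0 && buf[i + 3] === 1) {
      if (start !== -1) units.push(buf.slice(start, i));
      start = i;   // remember where this unit (including start code) begins
      i += 3;      // skip past the start code we just matched
    }
  }
  if (start !== -1) units.push(buf.slice(start)); // flush the last unit
  return units;
}
```

Note that for true streaming you would also need to buffer bytes across chunk boundaries, since a start code can be split between two data events.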
Thanks for the quick reply. Yes, I saw that you are indeed producing raw H.264. I cloned your example and got it working nicely. Very good! The code is a little tricky for me to follow, as I am not familiar with the promisland language. I noticed you are using a WebSocket and streaming raw H.264, and I got it working with my very own raw H.264 sample file, so it is doing exactly what I want. My question is: where do you separate the NAL units, or does ffmpeg emit data events of NAL units? Below is the code in question:
In the example, the NAL units are actually not separated. You should be able to separate them quite easily by scanning for 00 00 00 01. I have done it in the nodeMirror project, but the code is too crappy to refer to.
@soliton4 |
It sounds good, but also very theoretical.
So I created an algorithm to parse my H.264 file into NAL units. Basically, the algorithm reads a Node stream and then calls a function whenever it has captured a full NAL unit. In my test case, I'm using a 1-megabyte raw H.264 video file captured from a Raspberry Pi camera.

For testing this, I had the callback function write each NAL unit to a file. I then opened a hex editor and made sure each file began with 00 00 00 01. I verified that the sum of the byte lengths of all the files written equaled the byte length of the original 1 MB file. I also verified that the data contents were the same by opening the original in the hex editor and comparing data from each chunk. I am 99% sure I am breaking it into NAL units correctly.

If I pass my raw H.264 file through your ffmpeg command, then it works, but your ffmpeg command is doing transcoding. This uses too much CPU and does not make sense. I have a raw H.264 file that plays fine in VLC, and I would like to make it work with Broadway, but I am stuck in the mud now. Is there a way I can send you my 1 MB file so you can take a look? I can also send you the individual NAL data chunks that I have parsed, so you can see exactly what I am asking Broadway to decode. Thanks.
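The two checks described above (every chunk begins with the start code, and the chunk lengths sum back to the original file size) can be sketched as a small Node.js helper. verifyNalSplit is an illustrative name, not part of Broadway or of the poster's actual code:

```javascript
// Sanity-check a NAL split: every chunk must begin with the
// 00 00 00 01 start code, and the total bytes across all chunks
// must equal the original buffer's length (nothing lost or added).
function verifyNalSplit(original, chunks) {
  const startsOk = chunks.every(
    (c) => c.length >= 4 && c[0] === 0 && c[1] === 0 && c[2] === 0 && c[3] === 1
  );
  const total = chunks.reduce((sum, c) => sum + c.length, 0);
  return startsOk && total === original.length;
}
```

This mirrors the manual hex-editor verification, but note (as the rest of the thread shows) that a correct split can still fail to decode if the bitstream uses an H.264 profile the decoder does not support.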
@g00dnatur3 I can identify two causes of potential errors:
@mbebenita is there documentation on which specific features of H.264 are not supported?
Yup, my NAL parser is working fine. I executed your ffmpeg command on my sample H.264 file and created another file. I then made my server stream this file to Broadway using my NAL unit parser, and it worked fine. So there is something special that you are asking ffmpeg to do when producing the raw H.264 that is different from the raw H.264 file produced by my Raspberry Pi camera. I will play with the Raspberry Pi camera options to see if I can make it produce a raw H.264 format that Broadway likes.
Oh, that's a bummer. Are there any plans to support the full feature set?
WOOHOO, I got it to work! In the Raspberry Pi camera API documentation there is a profile option:

profile - The H.264 profile to use for encoding. Defaults to 'high', but can be one of 'baseline', 'main', 'high', or 'constrained'.

http://picamera.readthedocs.org/en/latest/api_camera.html#picamera.camera.PiCamera.start_recording

The default of 'high' doesn't work, but the 'baseline' profile works just fine! Nice! Hard work pays off once again. Cheers!
Hi @soliton4, this is where I'm maintaining my code: https://github.com/omerjerk/RemoteDroid_web/blob/master/raw.h264.js
Yeah, maybe it's a problem with H.264 profiles (whatever that is). For more info, please refer to the video codec expert you trust.
If I were to use ffmpeg to send raw H.264, do you have a recommended command?
Use the ffmpeg flag -f h264.
My plan is to create H.264-encoded frames on an Android device and send them via WebSockets to a web browser, then decode them using Broadway. Note: it's not an MP4 file, but raw H.264 frames.
Can anyone guide me on decoding and rendering them after I have received them?
A code snippet or something would do.
Thanks.
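One possible wiring for the receiving side (a sketch, not an official Broadway recipe) is to hand each binary WebSocket message to a decode callback. In a real page the callback would be Broadway's player.decode; it is injected here so the glue can be exercised without the library, and attachFrameDecoder is a hypothetical name:

```javascript
// Receive encoded frames over a WebSocket and pass each one to a
// decode callback as a Uint8Array (one H.264 unit per message).
function attachFrameDecoder(socket, decode) {
  socket.binaryType = 'arraybuffer';      // get binary frames, not Blobs
  socket.onmessage = (event) => {
    decode(new Uint8Array(event.data));   // wrap the raw bytes for the decoder
  };
}

// Browser usage (assumes Broadway's Player.js is loaded on the page):
//   const player = new Player({ useWorker: false, webgl: 'auto' });
//   document.body.appendChild(player.canvas);
//   attachFrameDecoder(new WebSocket('ws://host:port'), (u) => player.decode(u));
```

Injecting the decode function also makes it easy to swap in a logging stub while debugging whether the frames arriving are well-formed NAL units.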