
Rendering raw H.264 frames #59

Closed
omerjerk opened this issue Apr 2, 2015 · 23 comments

Comments

@omerjerk

omerjerk commented Apr 2, 2015

My plan is to create H.264 encoded frames on an Android device and send them via WebSockets to a web browser, then decode them using Broadway. Note: it's not an mp4 file but raw H.264 frames.
Can anyone guide me on decoding and rendering them after I have received them?
A code snippet or something would do.

Thanks.

@soliton4
Collaborator

soliton4 commented Apr 2, 2015

there is an example implementation of pretty much what you described here:

https://github.com/soliton4/BroadwayStream

my advice:
install it
play with it
get comfortable

then

return with questions

@omerjerk
Author

omerjerk commented Apr 2, 2015

Thanks. Let me give it a try. :)

@omerjerk
Author

omerjerk commented Apr 2, 2015

The link you gave looks like a server-side implementation. I've already managed the server side on Android.
I'm asking how I would write the code on the client side.

@soliton4
Collaborator

soliton4 commented Apr 4, 2015

@omerjerk did you get it up and running?

@omerjerk
Author

omerjerk commented Apr 6, 2015

Hi,
Sorry for getting back to you late. I've not been able to get it running, but let me first explain my scenario.

I get H.264 frames from Android's hardware H.264 encoder and I intend to send those frames to the browser and render them. Please have a look here. Right now, this app sends these frames to another Android device over WebSockets and renders them.
Instead of sending to an Android device, I want to send them to Chrome and do the rendering.

Thanks.

@soliton4
Collaborator

soliton4 commented Apr 6, 2015

dear @omerjerk, i understand that your problem is quite specific.
due to the specific nature of your problem i am not able to solve it for you.
what i can offer you is to help you render raw h264 with broadway.

in order to do that it's best to familiarize yourself with the little demo i set up.
you don't need to focus on the server part in order to solve your problem.
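
for reference, the browser side of broadway boils down to very little code. a minimal sketch (assuming the Player.js API mentioned elsewhere in this thread; the onPictureDecoded argument order is an assumption):

    // minimal sketch: create a Broadway player, attach its canvas, feed it NAL units
    var player = new Player({});
    document.body.appendChild(player.canvas); // put the canvas wherever you want it

    player.onPictureDecoded = function (data, width, height) {
      // assumed callback shape; handy to confirm that decoding actually happens
      console.log("decoded a " + width + "x" + height + " frame");
    };

    // nalUnit is a Uint8Array holding one raw h264 NAL unit (starting with 00 00 00 01)
    function renderNalUnit(nalUnit) {
      player.decode(nalUnit);
    }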

@g00dnatur3

I'm trying to do the same thing. I looked at your demo code and I have a couple questions:

  1. It seems you are using ffmpeg with the libx264 codec to package the stream into a special mp4 container.. I get this, but isn't ffmpeg going to be very processor intensive? If I already have raw h264 video I don't even want to package it into mp4, I just want to stream it.. but if I do package it into mp4 I would like to do a vcodec copy so as not to incur huge cpu utilization on my server.

  2. I just want to send raw h264 data to the browser (via websocket or http get). So as a real example, let's say I do the following:
    convert fox.mp4 to raw h264 --> ffmpeg -i fox.mp4 -vcodec copy -f h264 fox.h264

Now if I send the entire fox.h264 to the browser, how do I play it back using broadway?

Can I do the following?

var p = new Player({});
document.body.appendChild(p.canvas); // the canvas - put it where you want it
p.decode(rawH264); // rawH264 = the entire raw h264 data from fox.h264, as a Uint8Array

Is this expected to work?

Also, if the above code works, I would like to make it streamable.
In this case, what do I send to p.decode()? Is it just NAL units? So I would break the raw h264 stream up into NAL units on the server side and send them one or more at a time over websockets?

@soliton4
Collaborator

soliton4 commented Apr 6, 2015

@g00dnatur3

  1. no, i am not creating an mp4 container with ffmpeg. in fact a raw h264 stream is being created by -f h264
    https://github.com/soliton4/BroadwayStream/blob/master/app/server.pland#L27
    you can ofc use different presets. for example -preset ultrafast will be less cpu intensive
    try feeding the h264 stream as it is, and if it decodes you don't need to do anything
  2. https://github.com/soliton4/BroadwayStream/blob/master/app/connected.pland#L28

if you feed the whole file at once it is not streaming anymore, is it? also i am not sure if you need to split it into NAL units. you are on the safe side if you seek for 00 00 00 01 in the stream and split it there.
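
a rough sketch of that splitting step in node (untested; assumes 4-byte 00 00 00 01 start codes and a Buffer or Uint8Array holding the data):

    // split raw h264 data into NAL units at every 00 00 00 01 start code
    function splitNalUnits(buf) {
      var units = [];
      var start = -1;
      for (var i = 0; i + 3 < buf.length; i++) {
        if (buf[i] === 0 && buf[i + 1] === 0 && buf[i + 2] === 0 && buf[i + 3] === 1) {
          if (start !== -1) {
            units.push(buf.slice(start, i)); // previous unit ends where the next start code begins
          }
          start = i;
        }
      }
      if (start !== -1) {
        units.push(buf.slice(start)); // last unit runs to the end of the data
      }
      return units; // each unit keeps its leading 00 00 00 01
    }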

looks to me like you figured out how it works.
what keeps you from actually trying?
especially since there is an example you can just modify. looks like not too much work to me.

@g00dnatur3

Thanks for the quick reply, yes I saw that you are indeed producing raw h264.

I cloned your example and got it working nicely. Very good!

The code is a little tricky for me to follow as I am not familiar with the promisland language.

I noticed you are using a websocket and streaming raw h264. I got it working with my very own raw h264 sample file.. so it is doing exactly what I want.

My question is, where do you separate the NAL units, or does ffmpeg emit data events of NAL units?

below is the code in question:

    var spawn = require("child_process").spawn; // child_process is a node builtin

    this.process = spawn("ffmpeg", params);   // params is the ffmpeg argument list
    this.stream = this.process.stdout;        // raw h264 comes out of ffmpeg's stdout

    var self = this;
    this.stream.on("data", function (data) {
      try {
        if (!(data && data.length)) {
          return;
        }
        // forward each chunk to the browser, base64-encoded
        self.streamData(data.toString("base64"));
      } catch (e) {}
    });
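
(Since this server code base64-encodes each chunk, the browser side has to turn that string back into a Uint8Array before calling player.decode(). A rough sketch of that step, assuming a Broadway Player instance named player:)

    // browser side: convert a base64 chunk back into bytes and feed Broadway
    function onStreamData(base64Chunk) {
      var binary = atob(base64Chunk);           // base64 -> binary string
      var bytes = new Uint8Array(binary.length);
      for (var i = 0; i < binary.length; i++) {
        bytes[i] = binary.charCodeAt(i);
      }
      player.decode(bytes);
    }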

@soliton4
Collaborator

soliton4 commented Apr 6, 2015

in the example the nal units are actually not separated.
seems like ffmpeg outputs them in a chunk but i am not sure

you should be able to separate them quite easily by scanning for 00 00 00 01

i have done it in the nodeMirror project but the code is too crappy to refer to.

@omerjerk
Author

omerjerk commented Apr 7, 2015

@soliton4
I'm sure it should not be too difficult to get it running. I went through this code: https://github.com/soliton4/BroadwayStream/blob/master/app/connected.pland#L28
My h.264 frames are basically of type Uint8Array in the case of JavaScript and of type byte array in the case of Java.
As soon as I receive a frame, if I pass that Uint8Array to player.decode(), it should work. How does it sound to you?
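
In other words, something like this (untested sketch, where ws is the WebSocket connection and player is the Broadway Player instance):

    ws.binaryType = "arraybuffer";   // receive binary frames as ArrayBuffers, not Blobs
    ws.onmessage = function (event) {
      // each message should be one NAL unit starting with 00 00 00 01
      player.decode(new Uint8Array(event.data));
    };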

@soliton4
Collaborator

soliton4 commented Apr 7, 2015

it sounds good but also very theoretical.
i say just do it ;)

@g00dnatur3

So I created an algorithm to parse my h264 file into NAL units..

Basically the algorithm reads a Node stream and then calls a function whenever it has captured a full NAL unit.

In my test case I'm using a 1 megabyte raw h264 video file captured from a Raspberry Pi camera.

For testing this I had the callback function write each NAL unit to a file. I then opened the hex editor and made sure each file began with 00 00 00 01.

I verified that the sum of the byte lengths of all the files written equaled the byte length of the original 1mb file.

I also verified the data contents were the same by opening the original in the hex editor and comparing data from each chunk.

I am 99% sure I am breaking it into NAL units correctly..
But Broadway will not call onPictureDecoded.

If I pass my raw h264 file through your ffmpeg command then it works, but your ffmpeg command is doing transcoding.. this is using too much cpu and does not make sense.

I have a raw h264 file that plays fine in vlc. I would like to make this work with Broadway and I am stuck in the mud now.

Is there a way I can send you my 1mb file and you can take a look?

I can also send you the individual NAL data chunks that I have parsed so you can see exactly what I am asking Broadway to decode.

thanks

@soliton4
Collaborator

soliton4 commented Apr 7, 2015

@g00dnatur3 i can identify 2 causes of potential errors:

  1. splitting into nal units.
    you can use this function as a template. it works
    https://github.com/soliton4/nodeMirror/blob/master/src/avc/Wgt.js#L291
  2. have a look at
    https://github.com/mbebenita/Broadway/blob/master/README.markdown
    the video encoding section gives you some insight into what broadway is capable of decoding
    broadway cannot decode the full feature set of h264.
    for more details we should ask @mbebenita

@mbebenita is there documentation on which specific features of h264 are not supported?

@g00dnatur3

Yup, my NAL parser is working fine. I executed your ffmpeg command on my sample h264 file and created another file. I then made my server stream this file to Broadway using my NAL unit parser, and it worked fine.

So there is something special that you are asking ffmpeg to do when producing the raw h264 that is different from the raw h264 file produced by my Raspberry Pi camera.

I will play with the Raspberry Pi camera options to see if I can make it produce a raw h264 format that Broadway likes.

@g00dnatur3

Oh that's a bummer, are there any plans to support the full feature set?

@g00dnatur3

WOOOHOO, I got it to work!

In the Raspberry Pi camera API documentation there is a profile option:

profile - The H.264 profile to use for encoding. Defaults to ‘high’, but can be one of ‘baseline’, ‘main’, ‘high’, or ‘constrained’.

http://picamera.readthedocs.org/en/latest/api_camera.html#picamera.camera.PiCamera.start_recording

The default of 'high' doesn't work, but the 'baseline' profile works just fine!

nice!

hard work pays off once again

cheers!

@omerjerk
Author

Hi @soliton4,
I'm able to transfer the NAL units from Android device to the browser and this is a sample Uint8Array which I receive - [0, 0, 0, 1, 65, 250, 1, 41, 57, 121, 104, 129, 9, 226, 48, 4, 216, 214, 180, 117, 48, 0, 32, 0, 8, 1, 0, 160, 36, 48, 171, 30, 188, 118, 229, 82, 53, 110, 60, 86, 95, 16, 113, 159, 117, 176, 96, 240, 0, 39, 4, 227, 137, 124, 36, 139, 214, 242, 243, 215, 147, 60, 35, 201, 207, 72, 206, 78, 90, 67, 60, 178, 211, 147, 150, 65, 41, 105, 93, 83, 32, 32, 115, 215, 230, 221, 245, 120, 50, 229, 158, 156, 252, 217, 233, 147, 173, 95, 4, 156…]
I pass this array to player.decode() but nothing gets rendered on the canvas.
I put a console.log statement where I receive this array, to make sure all the arrays start with 0 0 0 1.
All the arrays do start with these numbers.
Can you tell me what else might be going wrong?

@omerjerk
Author

This is where I'm maintaining my code - https://github.com/omerjerk/RemoteDroid_web/blob/master/raw.h264.js

@soliton4
Collaborator

yeah, maybe it's a problem with h264 profiles (whatever that is)
broadway does not implement the complete h264 feature set

for more info please refer to the video codec expert you trust

@chaserit123

If I were to use ffmpeg to send raw h264, do you have a recommended command?

@g00dnatur3

ffmpeg flag -f h264
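
For example, adapting the conversion command from earlier in this thread and re-encoding to the baseline profile that Broadway handles (file names are placeholders):

    ffmpeg -i input.mp4 -an -c:v libx264 -profile:v baseline -preset ultrafast -f h264 output.h264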
