timestamp units #7
Finally I solved part of my problem: there was an issue with my timestamp calculations, so I can confirm they must be provided in milliseconds. However, I couldn't manage to stream audio and video together. I can send video and audio independently and it works, but when I try to send both, the communication crashes. I am sending real-time audio and video; does anyone know if I should take care to send video and audio frames in some specific order (i.e. audio waits for video, or video waits for audio)? Up to now, as soon as I grab a frame (audio or video) I send it.
Hi, do you think you have a problem like that?
Hi, this librtmp always sets the hasVideo and hasAudio flags, so that is not the problem. However, I managed to solve it. The issue was the timestamp order: I was setting the audio and video timestamps correctly, but I was not sending them in the appropriate order. It seems that to mux audio and video in RTMP, all packets (regardless of whether they are video or audio) must carry a nondecreasing timestamp. What was happening to me is that I was sending a video frame and, right after it, an audio frame with a slightly lower timestamp. Now it works: I have to take care to send every frame (or RTMP packet) with a timestamp higher than or equal to the previous one (even while interleaving video and audio packets), and then it works great! Thanks for your help anyway.
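The nondecreasing-timestamp rule described above can be sketched as a tiny guard placed in front of the muxer. This is a hypothetical helper (the class and method names are illustrative, not part of the library); clamping late frames is just the simplest policy to demonstrate the rule, whereas the commenter's actual fix was reordering frames before sending:

```java
// Hypothetical sketch: ensure the timestamps handed to the RTMP muxer never
// decrease across the interleaved audio/video stream.
public class TimestampGate {
    private int lastTs = 0;

    // Returns the timestamp to actually send for this frame (audio or video).
    public int next(int frameTs) {
        if (frameTs < lastTs) {
            frameTs = lastTs; // a late frame must not go backwards in time
        }
        lastTs = frameTs;
        return frameTs;
    }
}
```

With the log values from the comments below, a video frame at 134 ms followed by an audio frame at 80 ms would be clamped to 134 ms instead of violating the ordering.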
@davidcassany @mekya Hi, I have the same question: I cannot publish video and audio together. The write result returns 0 and the timestamp is increasing; type 9 is video, 8 is audio. How did you solve it? 11-22 01:31:32.306 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 9,size 29,dts 0,result 0
@gzsll Your timestamps are not correct: each frame added to the stream should have a dts higher than the previous one (regardless of whether it is audio or video). See at 11-22 01:31:32.415 that the frame has 134 as its timestamp, while the next one has 80; that's wrong. To solve it I queued audio and video frames in a custom priority queue, where they were sorted by timestamp, and then fed frames from that queue to the RTMP library.
@davidcassany So the first frame's timestamp is 0, and every other frame is higher than the previous one?
@gzsll I am not sure whether the first frame needs to be zero, but have a close look at your timestamps. See instants 11-22 01:31:32.615 and 11-22 01:31:32.633: the second timestamp belongs to the past; video and audio timestamps are not independent. If you use Wireshark to inspect your RTMP headers, you will notice the error there.
@davidcassany Thanks, I will try it.
Can you share your source code for sending audio and video?
Hi @davidarchi, I will try to summarise the relevant parts, maybe with some code snippets, between today and tomorrow. I hope it will help; I can't share all the code, as the parts I am using are quite coupled to a specific application and it isn't that short.
@davidcassany Hi, can you check my implementation? Its source code is located at https://github.com/insthync/AndroidSimpleScreenRTMP. For now I am just trying to send video data; the write result is 0 but it does not work, and I don't know what is wrong.
@insthync It has been more than six months since I last programmed anything related to Java or Android (actually I have never been an Android developer, I am a complete noob 😛). I haven't seen anything obviously wrong in your implementation (apart from the fact that you may be sending the codec configuration frame multiple times), but it seems odd to me that you use the same thread to encode the frames and to send the RTMP data. I used the grafika examples to write the VideoEncoder class and thread; they might help you write a more elaborate encoder class. Some hints that come to mind:
Those are just my two cents; be aware I am not an Android developer, so use my thoughts at your own risk.
@davidarchi Sorry for my delayed answer, I did not get the chance to write the implementation summary as I told you :P. Anyway, better late than never. Here it goes. I structured the code with the following classes:
CodedFrame, which holds one encoded frame. Fields:

public final static Boolean VIDEO = true;
public final static Boolean AUDIO = false;
byte[] encData;
byte[] extraInfo; //SPS/PPS NAL units or AAC headers
Long timestamp; //In usec, direct value from the Android encoder (presentationTimeUs)
Boolean video;

And these methods:

int getTimestampMilliseconds(){
    return ((Long)(timestamp/1000)).intValue();
}

@Override
public int compareTo(Object o) {
    CodedFrame frame = (CodedFrame) o;
    return timestamp.compareTo(frame.timestamp);
}
AVFramesQueue, with these fields:

private PriorityQueue<CodedFrame> pQueue;
private int videoCount;
private int audioCount;

And these methods:

public void push(CodedFrame frame);
// Be aware of race conditions; I used the pQueue object to synchronize the sensitive parts of the push method

public CodedFrame pop() throws InterruptedException;
// It returns a frame if, and only if, there is at least one video AND one audio frame in the queue.
// This is the reason to have separate audio and video counters.
// If the condition is not met it returns NULL

public void clear();
// It resets the queue to an empty state.
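The queue contract described above can be condensed into a minimal, self-contained sketch. Note the assumptions: the original `pop()` blocks and may throw `InterruptedException`, while this version simply returns `null` when the condition is not met, and the inner `CodedFrame` here carries only the fields needed for ordering:

```java
import java.util.PriorityQueue;

// Sketch of AVFramesQueue: frames are sorted by timestamp, and pop() yields a
// frame only while both an audio and a video frame are queued, so the head is
// guaranteed to be the globally earliest frame of the muxed stream.
public class AVFramesQueue {
    public static class CodedFrame implements Comparable<CodedFrame> {
        public final boolean video;   // true = video, false = audio
        public final long timestamp;  // usec, as given by the encoder
        public CodedFrame(boolean video, long timestamp) {
            this.video = video;
            this.timestamp = timestamp;
        }
        @Override
        public int compareTo(CodedFrame o) {
            return Long.compare(timestamp, o.timestamp);
        }
    }

    private final PriorityQueue<CodedFrame> pQueue = new PriorityQueue<>();
    private int videoCount;
    private int audioCount;

    public synchronized void push(CodedFrame f) {
        pQueue.add(f);
        if (f.video) videoCount++; else audioCount++;
    }

    // Returns the earliest queued frame, or null unless at least one frame of
    // EACH kind is present (this is why the two counters exist).
    public synchronized CodedFrame pop() {
        if (videoCount == 0 || audioCount == 0) return null;
        CodedFrame f = pQueue.poll();
        if (f.video) videoCount--; else audioCount--;
        return f;
    }

    public synchronized void clear() {
        pQueue.clear();
        videoCount = 0;
        audioCount = 0;
    }
}
```

Holding a frame back until a frame of the other kind arrives is what guarantees the nondecreasing timestamps: a video frame at 200 µs is only released after it is known that no earlier audio frame is still pending.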
The audio encoder class, with these fields:

private EncoderThread mEncoderThread;
// An inner class that contains the audio encoder process; it places the coded data in a CodedFrame and pushes it to the AVFramesQueue

private CaptureThread mCaptureThread;
// An inner class that grabs raw audio frames and queues them to the audio encoder

private MediaCodec mEncoder;
private MediaFormat audioFormat;

The encoder thread has a drainEncoder method (this is the method that pushes encoded frames to the AVFramesQueue):

public void drainEncoder() {
    final int TIMEOUT_USEC = 0; // no timeout -- check for buffers, bail if none
    ByteBuffer[] encoderOutputBuffers = mEncoder.getOutputBuffers();
    while (true) {
        int encoderStatus = mEncoder.dequeueOutputBuffer(mBufferInfo, TIMEOUT_USEC);
        if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
            // no output available yet
            break;
        } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
            // not expected for an encoder
            encoderOutputBuffers = mEncoder.getOutputBuffers();
        } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            // Should happen before receiving buffers, and should only happen once.
            mEncodedFormat = mEncoder.getOutputFormat();
            Log.d(TAG, "encoder output format changed: " + mEncodedFormat);
        } else if (encoderStatus < 0) {
            Log.w(TAG, "unexpected result from encoder.dequeueOutputBuffer: " +
                    encoderStatus);
            // let's ignore it
        } else {
            ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];
            if (encodedData == null) {
                throw new RuntimeException("encoderOutputBuffer " + encoderStatus +
                        " was null");
            }
            if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                encodedData.position(mBufferInfo.offset);
                encodedData.limit(mBufferInfo.offset + mBufferInfo.size);
                codecConfig = new byte[mBufferInfo.size];
                encodedData.get(codecConfig);
                mBufferInfo.size = 0;
            }
            if (mBufferInfo.size != 0) {
                // adjust the ByteBuffer values to match BufferInfo (not needed?)
                encodedData.position(mBufferInfo.offset);
                encodedData.limit(mBufferInfo.offset + mBufferInfo.size);
                byte[] outData = new byte[mBufferInfo.size];
                encodedData.get(outData);
                CodedFrame frame = new CodedFrame(CodedFrame.AUDIO);
                frame.timestamp = mBufferInfo.presentationTimeUs;
                frame.extraInfo = codecConfig;
                frame.encData = outData;
                queue.push(frame);
                if (VERBOSE) {
                    Log.d(TAG, "sent " + mBufferInfo.size + " bytes to muxer, ts=" +
                            mBufferInfo.presentationTimeUs);
                }
            }
            mEncoder.releaseOutputBuffer(encoderStatus, false);
            if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                Log.w(TAG, "reached end of stream unexpectedly");
                break; // out of while
            }
        }
    }
}
The RTMP sender thread, with these fields:

private RTMPMuxer muxer;
private boolean isStreaming;
private AVFramesQueue queue;
private URI url;

and the following run method:

@Override
public void run()
{
    if (muxer.open(url.toString()) <= 0 || muxer.isConnected() <= 0){
        isStreaming = false;
        cb.onMuxerClose();
        return;
    }
    boolean videoConf = false;
    boolean audioConf = false;
    CodedFrame frame;
    while (isStreaming)
    {
        try {
            frame = queue.pop();
        } catch (InterruptedException e) {
            isStreaming = false;
            continue;
        }
        if (frame == null){
            continue;
        }
        if (muxer.isConnected() > 0) {
            if (frame.video) {
                if (!videoConf && frame.extraInfo != null){
                    videoConf = true;
                    muxer.writeVideo(frame.extraInfo, 0, frame.extraInfo.length, frame.getTimestampMilliseconds());
                } else if (videoConf) {
                    muxer.writeVideo(frame.encData, 0, frame.encData.length, frame.getTimestampMilliseconds());
                }
            } else {
                if (!audioConf && frame.extraInfo != null){
                    audioConf = true;
                    // the length must match the buffer being written (extraInfo, not encData)
                    muxer.writeAudio(frame.extraInfo, 0, frame.extraInfo.length, frame.getTimestampMilliseconds());
                } else if (audioConf) {
                    muxer.writeAudio(frame.encData, 0, frame.encData.length, frame.getTimestampMilliseconds());
                }
            }
        } else {
            break;
        }
    }
    if (muxer.isConnected() > 0) {
        muxer.close();
    }
    if (isStreaming){
        isStreaming = false;
        cb.onMuxerClose();
    }
}

That's all I got, I hope it helps :) Be warned that I am not an Android developer (nor a Java developer); this has actually been my first Android app, so I am pretty sure it can be really improved. My guide was the grafika project.
@davidcassany Thank you, I will try it :)
Now I send the video format data before the video frame data, but I still cannot watch it on VLC. I've noticed with the video data on the server that its framerate is always zero. I am not sure what I should do. Edit: The video recorded at the server is fine, I can watch it, but I can't watch it live.
@insthync In your implementation I would change the timestamp calculations; I would directly use the timestamps provided by the encoder. You have:

if (startTime == 0)
    startTime = mBufferInfo.presentationTimeUs / 1000;
int timestamp = currentFrame * (1000 / FRAME_RATE);

I would try:

int timestamp = (int) (mBufferInfo.presentationTimeUs / 1000 - startTime);

where startTime is taken (in milliseconds) right after the encoder starts. Also add some control to send the configuration only once: make sure the mRTMPMuxer.writeVideo call for it runs only once, and with the actual timestamp value, not zero; even if it is the first frame it doesn't necessarily have to be zero.

if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
    // Pulling codec config data
    encodedData.position(mBufferInfo.offset);
    encodedData.limit(mBufferInfo.offset + mBufferInfo.size);
    Log.i(TAG, "sent " + mBufferInfo.size + " bytes to muxer...");
    byte[] bytes = new byte[encodedData.remaining()];
    encodedData.get(bytes);
    int writeResult = mRTMPMuxer.writeVideo(bytes, 0, bytes.length, 0);
    Log.d(TAG, "RTMP_URL write format result: " + writeResult);
    mBufferInfo.size = 0;
}

Which server are you using? I used nginx plus the RTMP plugin, and it played well with either VLC or FFPLAY. You can use Wireshark to check that all your RTMP headers are correct; this is how I managed to debug the timestamps, by reviewing the RTMP packet headers. Wireshark has a decoder for RTMP packets. You should see only deltas of the frame time (RTMP does not use absolute timestamps).
@davidcassany Thank you, I will try it. Regarding the server, I use nginx with its RTMP plugin.
Now I can see the live stream on VLC, but I have to keep the screen active to make the frames change. It seems like a problem with MediaCodec: when the frame does not change, the dequeueOutputBuffer() function returns MediaCodec.INFO_TRY_AGAIN_LATER, and with that no data is sent to the server. I will change the condition to keep the last frame data and send it when dequeueOutputBuffer() does not return >= 0. The framerate value on the server is still 0; I think it does not matter.
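The workaround described above (repeating the last encoded frame when the encoder produces no new output for a static screen) reduces to a small piece of state. This is a hypothetical sketch with illustrative names, not the poster's actual code; the timestamp for the repeated frame would still need to advance:

```java
// Sketch: cache the most recent encoded frame and fall back to it when the
// encoder returns no new output (e.g. INFO_TRY_AGAIN_LATER on a static screen).
public class FrameRepeater {
    private byte[] lastFrame;

    // Pass null when dequeueOutputBuffer() yielded no buffer this tick.
    // Returns the frame to send, or null if nothing has been cached yet.
    public byte[] frameToSend(byte[] newData) {
        if (newData != null) {
            lastFrame = newData; // remember the freshest encoded frame
        }
        return lastFrame;
    }
}
```

A caller would invoke frameToSend on a fixed interval, writing whatever it returns with a freshly computed timestamp so the stream keeps a steady cadence even when the picture is unchanged.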
I have a question regarding the timestamp units that the API expects. I assume they are milliseconds, as stated by the RTMP specs. However, I cannot figure out how to transmit audio: the data is sent and the handshake is accepted, as I can see using Wireshark, but I noticed that the timestamp is always reset to zero in each chunk. Should I use different timestamp units? Have you faced this issue before?