
[Windows] Crash while using Intel-Media SDK to decode video stream. #32

Closed
Meonardo opened this issue Jul 11, 2022 · 3 comments

@Meonardo

Thanks for your great work!

I am using libwebrtc.dll in a WPF C# project. It works great when rendering my internal camera from my PC,
but when I try to render the remote video I get an "Access Violation" exception in file rtc_video_frame_impl.cc.

Note: everything goes back to normal (no crash) after I comment out USE_INTEL_MEDIA_SDK in file rtc_peerconnection_factory_impl.cc.

Here is what I found: while using the Intel Media SDK to decode the video stream, the decoder produces an owt::base::NativeHandleBuffer (which is webrtc::VideoFrameBuffer::Type::kNative) rather than webrtc::VideoFrameBuffer::Type::kI420, so calling buffer_->GetI420() returns nullptr, which leads to the crash.
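One defensive pattern for this (a sketch against WebRTC's VideoFrameBuffer interface, not the actual rtc_video_frame_impl.cc fix; it assumes the native buffer implements ToI420()) is to check the buffer type and ask native buffers to convert themselves before touching GetI420():

```cpp
// Sketch only: guard before calling GetI420() on a buffer that may be
// kNative (e.g. an owt::base::NativeHandleBuffer from the MSDK path).
rtc::scoped_refptr<webrtc::VideoFrameBuffer> buffer =
    frame.video_frame_buffer();
if (buffer->type() == webrtc::VideoFrameBuffer::Type::kNative) {
  // ToI420() asks the native buffer to map/copy itself into CPU I420
  // memory; calling GetI420() directly on a kNative buffer yields nullptr.
  buffer = buffer->ToI420();
}
const webrtc::I420BufferInterface* i420 =
    buffer ? buffer->GetI420() : nullptr;
if (i420 == nullptr) {
  return;  // still not I420 -- bail out instead of crashing
}
```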

@nikohpng
Contributor

I have the same issue.

@Meonardo
Author

Hi @nikohpng, can you test the code block below for your case?
I can get the YUV data buffer successfully, but I still have a rendering issue...

I found that the MSDK produces the NV12 pixel format (see the Intel Media SDK manual), and mfxFrameData stores the YUV data (see the mfxFrameData definition). Change the code to the version below and see if you can retrieve the YUV data.

mfxFrameData frame_data = pOutputSurface->Data;
mfxMemId dxMemId = frame_data.MemId;
mfxFrameInfo frame_info = pOutputSurface->Info;

m_pmfx_allocator_->LockFrame(dxMemId, &frame_data);

// always nv12
int w = frame_info.Width;
int h = frame_info.Height;
int stride_uv = (w + 1) / 2;
uint8_t* data_y = frame_data.Y;
uint8_t* data_u = frame_data.U;
uint8_t* data_v = frame_data.V;

rtc::scoped_refptr<NV12Buffer> nv12_buffer = webrtc::NV12Buffer::Create(w, h);
if (nv12_buffer.get()) {
  libyuv::I420ToNV12(data_y, w, data_u, stride_uv, data_v, stride_uv,
                     nv12_buffer->MutableDataY(),
                     nv12_buffer->StrideY(),
                     nv12_buffer->MutableDataUV(),
                     nv12_buffer->StrideUV(), w, h);

  rtc::scoped_refptr<VideoFrameBuffer> buffer = std::move(nv12_buffer);

  if (callback_) {
    webrtc::VideoFrame decoded_frame(buffer, inputImage.Timestamp(), 0,
                                     webrtc::kVideoRotation_0);
    decoded_frame.set_ntp_time_ms(inputImage.ntp_time_ms_);
    decoded_frame.set_timestamp(inputImage.Timestamp());
    callback_->Decoded(decoded_frame);
  }
}

m_pmfx_allocator_->UnlockFrame(dxMemId, &frame_data);

@Meonardo
Author

After calling m_pmfx_allocator_->LockFrame(dxMemId, &frame_data);, the YUV data can be retrieved, but the code above is WRONG: it calls I420ToNV12 when the conversion needs to go the other way, NV12 -> I420.
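For reference, here is what the NV12 -> I420 conversion actually does, as a self-contained sketch in plain C++ (the function name Nv12ToI420 is illustrative, not from the OWT sources; in the real decoder you would call libyuv::NV12ToI420 instead, passing frame_data.Y and the interleaved UV plane with the surface pitch as both strides):

```cpp
#include <cstdint>

// Illustrative NV12 -> I420 conversion. NV12 has a full-resolution Y
// plane followed by one half-resolution interleaved UVUV... plane;
// I420 wants the chroma split into two separate half-resolution planes.
void Nv12ToI420(const uint8_t* src_y, int src_stride_y,
                const uint8_t* src_uv, int src_stride_uv,
                uint8_t* dst_y, int dst_stride_y,
                uint8_t* dst_u, int dst_stride_u,
                uint8_t* dst_v, int dst_stride_v,
                int width, int height) {
  // Copy the Y plane row by row (strides may exceed width).
  for (int y = 0; y < height; ++y) {
    for (int x = 0; x < width; ++x)
      dst_y[y * dst_stride_y + x] = src_y[y * src_stride_y + x];
  }
  // Deinterleave UV: even bytes are U samples, odd bytes are V samples.
  int chroma_w = (width + 1) / 2;
  int chroma_h = (height + 1) / 2;
  for (int y = 0; y < chroma_h; ++y) {
    for (int x = 0; x < chroma_w; ++x) {
      dst_u[y * dst_stride_u + x] = src_uv[y * src_stride_uv + 2 * x];
      dst_v[y * dst_stride_v + x] = src_uv[y * src_stride_uv + 2 * x + 1];
    }
  }
}
```

With libyuv the same step is a single call, libyuv::NV12ToI420(...), which takes (src_y, src_stride_y, src_uv, src_stride_uv, dst_y, dst_stride_y, dst_u, dst_stride_u, dst_v, dst_stride_v, width, height). Note that in mfxFrameData for an NV12 surface, the U pointer refers to the start of the interleaved UV plane.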
