⭐ Please star this project if you like it!
- If you would like to support this project, you can show your appreciation via PayPal.Me
- Current Status: (2019-06-30) Release 4.1.310 is now available (see the Releases)
- NuGet Package available here: https://www.nuget.org/packages/FFME.Windows/
- FFmpeg Version: 4.2.0 32-bit or 64-bit
Please note the current NuGet release might require a different version of the FFmpeg binaries than the one required by the current state of the source code.
Here is a quick guide on how to get started.
- Open Visual Studio (v2019 preview recommended) and create a new WPF Application. The Target Framework must be 4.6.1 or above, or .NET Core 3.0.
- Install the NuGet Package from your Package Manager Console: `PM> Install-Package FFME.Windows`
- You need the FFmpeg shared binaries (64- or 32-bit, depending on your app's target architecture). Build your own or download a compatible build from the Zeranoe FFmpeg Builds site.
- Your FFmpeg build should have a `bin` folder with 3 `exe` files and some `dll` files. Copy all those files to a folder such as `c:\ffmpeg`.
- Within your application's startup code (the `Main` method), set `Unosquare.FFME.Library.FFmpegDirectory = @"c:\ffmpeg";` (a minimal sketch is provided after this list).
- Use the FFME `MediaElement` control as any other WPF control. For example, in your `MainForm.xaml`, add the namespace `xmlns:ffme="clr-namespace:Unosquare.FFME;assembly=ffme.win"` and then add the FFME control to your window's XAML: `<ffme:MediaElement x:Name="Media" Background="Gray" LoadedBehavior="Play" UnloadedBehavior="Manual" />`
- To play files or streams, simply set the `Source` property: `Media.Source = new Uri(@"c:\your-file-here");`. Since `Source` is a dependency property, it needs to be set from the GUI thread.
Note: To build your own FFmpeg binaries, I recommend the Media Autobuild Suite, but please don't ask for help on it here.
- Remember: the `Unosquare.FFME.Windows.Sample` project provides usage examples for plenty of features. Use it as your main reference.
- The generated API documentation is available here.
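Putting the startup steps above together, a minimal example might look like the sketch below. This is not the sample app's code; it assumes the default WPF `App` class that Visual Studio generates:

```csharp
// App.xaml.cs — point FFME to the FFmpeg binaries before any
// MediaElement is instantiated. The c:\ffmpeg path assumes you copied
// the FFmpeg dll files there, as described above.
using System.Windows;

namespace MyFfmeApp // hypothetical project namespace
{
    public partial class App : Application
    {
        protected override void OnStartup(StartupEventArgs e)
        {
            Unosquare.FFME.Library.FFmpegDirectory = @"c:\ffmpeg";
            base.OnStartup(e);
        }
    }
}
```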
FFME is an advanced and close drop-in replacement for Microsoft's WPF MediaElement Control. While the standard MediaElement uses DirectX (DirectShow) for media playback, FFME uses FFmpeg to read and decode audio and video. This means that for those of you who want to support stuff like HLS playback, or just don't want to go through the hassle of installing codecs on client machines, using FFME might just be the answer.
FFME provides multiple improvements over the standard MediaElement such as:
- Fast media seeking and frame-by-frame seeking
- Properties such as Position, Balance, SpeedRatio, IsMuted, and Volume are all Dependency Properties.
- Additional and extended media events. Extracting (and modifying) video, audio and subtitle frames is very easy.
- Easily apply FFmpeg video and audio filtergraphs.
- Extract media metadata and tech specs of a media stream (title, album, bit rate, codecs, FPS, etc).
- Apply volume, balance and speed ratio to media playback.
- MediaState actually works on this control. The standard WPF MediaElement severely lacks in this area.
- Ability to pick media streams contained in a file or a URL.
- Specify input and codec parameters.
- Opt-in hardware decoding acceleration via devices or via codecs.
- Capture stream packets, audio, video, and subtitle frames.
- Perform custom stream reading and stream recording.
... all in a single MediaElement control
FFME also supports opening capture devices. See the example `Source` URLs below and issue #48.

```
device://dshow/?audio=Microphone (Vengeance 2100):video=MS Webcam 4000
device://gdigrab?title=Command Prompt
device://gdigrab?desktop
```
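Since capture devices are opened through the same `Source` property, playing one from code is a one-liner (using the `Media` control named in the XAML example above):

```csharp
// Opens the GDI desktop grabber just like a regular media file.
Media.Source = new Uri("device://gdigrab?desktop");
```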
If you'd like audio to not change pitch while changing the `SpeedRatio` property, you'll need the `SoundTouch.dll` library v2.1.1 available in the same directory as the FFmpeg binaries. You can get the SoundTouch library here.
First off, let's review a few concepts. A `packet` is a group of bytes read from the input. All `packets` are of a specific `MediaType` (Audio, Video, Subtitle, Data) and contain some timing information and, most importantly, compressed data. Packets are sent to a `Codec` and, in turn, the codec produces `Frames`. Please note that producing 1 `frame` does not always take exactly 1 `packet`: a `packet` may contain many `frames`, but a `frame` may also require several `packets` for the decoder to build it. `Frames` contain timing information and the raw, uncompressed data. Now, you may think you can take these `frames`, show pixels on the screen, or send samples to the sound card. We are close, but we still need some additional processing. It turns out different `Codecs` produce different uncompressed data formats. For example, some video codecs output pixel data in ARGB, some others in RGB, and some others in YUV420. Therefore, we need to `Convert` these `frames` into something all hardware can use natively. I call these converted frames `MediaBlocks`. These `MediaBlocks` contain uncompressed data in standard audio and video formats that all hardware is able to receive.
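In code form, the relationship between these concepts looks roughly like the loop below. Every identifier here is invented for illustration — this is not FFME's API, just the conceptual flow described above:

```csharp
// Conceptual sketch only — all identifiers are hypothetical.
// Note the nested loop: packets and frames are not one-to-one,
// so the decoder is drained after each packet is sent to it.
while (input.TryReadPacket(out var packet))        // compressed data + timing
{
    decoder.SendPacket(packet);                    // the codec may buffer internally
    while (decoder.TryReceiveFrame(out var frame)) // 0, 1, or many frames per packet
    {
        var block = converter.Convert(frame);      // e.g. YUV420 -> RGB
        Render(block);                             // now usable by any hardware
    }
}
```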
The process described above is implemented in 3 different layers:
- The `MediaContainer` wraps an input stream. This layer keeps track of a `MediaComponentSet`, which is nothing more than a collection of `MediaComponent` objects. Each `MediaComponent` holds `packet` caching, `frame` decoding, and `block` conversion logic. It provides the following important functionality:
  - We call `Open` to open the input stream and detect the different stream components. This also determines the codecs to use.
  - We call `Read` to read the next available packet and store it in its corresponding component (audio, video, subtitle, data, etc.).
  - We call `Decode` to read the following packet from the queue that each of the components holds, and return a set of frames.
  - Finally, we call `Convert` to turn a given `frame` into a `MediaBlock`.
- The `MediaEngine` wraps a `MediaContainer` and is responsible for executing commands to control the input stream (Play, Pause, Stop, Seek, etc.) while keeping 3 background workers.
  - The `PacketReadingWorker` is designed to continuously read packets from the `MediaContainer`. It will read packets when it needs them and pause when it does not. This is determined by how much data is in the cache; it will try to keep approximately 1 second of media packets at all times.
  - The `FrameDecodingWorker` gets the packets that the `PacketReadingWorker` writes and decodes them into frames. It then converts those frames into `blocks` and writes them to a `MediaBlockBuffer`. This block buffer can then be read by something else (the following worker described here) so its contents can be rendered.
  - Finally, the `BlockRenderingWorker` reads blocks from the `MediaBlockBuffer`s and sends those blocks to a platform-specific `IMediaRenderer`.
- At the highest level, we have a `MediaElement`. It wraps a `MediaEngine` and contains platform-specific implementations of methods to perform things like audio rendering, video rendering, subtitle rendering, and property synchronization between the `MediaEngine` and itself.
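To make the division of labor concrete, here is a compressed sketch of one cycle of each worker. The worker and method names come from the description above, but the signatures and helpers (`PacketBufferDuration`, `GetBlockAt`, `Render`) are simplified assumptions rather than FFME's actual code:

```csharp
// Illustrative only — loop bodies inferred from the description above.

// PacketReadingWorker: keep roughly 1 second of packets cached.
if (container.PacketBufferDuration < TimeSpan.FromSeconds(1))
    container.Read();                          // next packet -> its component queue

// FrameDecodingWorker: queued packets -> frames -> blocks.
foreach (var frame in container.Decode())
{
    MediaBlock block = null;
    container.Convert(frame, ref block);       // frame -> renderable block
    blockBuffer.Add(block);                    // into the MediaBlockBuffer
}

// BlockRenderingWorker: hand the current block to the renderer.
var current = blockBuffer.GetBlockAt(playbackPosition);
renderer.Render(current);                      // platform-specific IMediaRenderer
```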
A high-level diagram is provided as additional reference below.
Your help is welcome!
- I am planning the next version of this control, `Floyd`. See the Issues section.
Please note that I am unable to distribute FFmpeg's binaries because I don't know if I am allowed to do so. Follow the instructions below to compile, run and test FFME.
- Clone this repository and make sure you have .Net Core 3.0 preview 7 SDK or above installed.
- Download the FFmpeg shared binaries for your target architecture: 32-bit or 64-bit.
- Extract the contents of the `zip` file you just downloaded and go to the `bin` folder that got extracted. You should see 3 `exe` files and multiple `dll` files. Select and copy all of them.
- Now paste all files from the prior step onto a well-known folder. Take note of the full path. (I used `c:\ffmpeg\`.)
- Open the solution and set the `Unosquare.FFME.Windows.Sample` project as the startup project. You can do this by right-clicking on the project and selecting `Set as startup project`. Please note that you will need Visual Studio 2019 with the .NET Core 3.0 SDK for your target architecture installed.
- Under the `Unosquare.FFME.Windows.Sample` project, find the file `App.xaml.cs` and, under the constructor, locate the line `Library.FFmpegDirectory = @"c:\ffmpeg";` and replace the path so that it points to the folder where you extracted your FFmpeg binaries (dll files).
- Click on `Start` to run the project.
- You should see a sample media player. Click on the `Open` icon located at the bottom right and enter a URL or path to a media file.
- The file or URL should play immediately, and all the properties should display to the right of the media display by clicking on the `Info` icon.
- You can use the resulting compiled assemblies in your project without further dependencies. Look for `ffme.win.dll`.
In no particular order:
- To the FFmpeg team for making the Swiss Army Knife of media. I encourage you to donate to them.
- To Kyle Schwarz for creating and making Zeranoe FFmpeg builds available to everyone.
- To the NAudio team for making the best audio library out there for .NET -- one day I will contribute some improvements I have noticed they need.
- To Ruslan Balanukhin for his FFmpeg interop bindings generator tool: FFmpeg.AutoGen.
- To Martin Bohme for his tutorial on creating a video player with FFmpeg.
- To Barry Mieny for his beautiful FFmpeg logo.
- Please refer to the LICENSE file for more information.