⭐ Please star this project if you like it. If you would like to support this project, you can show your appreciation via PayPal.Me.
- Current Status (2018-09-23): Release 4.0.270 is now available (see the Releases).
- NuGet Package available here: https://www.nuget.org/packages/FFME.Windows/
- FFmpeg Version: 4.0.2 (32-bit)
Please note that the current NuGet release might require a different version of the FFmpeg binaries than the ones required by the current state of the source code.
Here is a quick guide on how to get started.
- Open Visual Studio (v2017 recommended) and create a new WPF Application. The Target Framework must be 4.6.1 or above.
- Install the NuGet package from your Package Manager Console:

  ```
  PM> Install-Package FFME.Windows
  ```

- You now need the FFmpeg binaries. Build your own or download a compatible build from the Zeranoe FFmpeg Builds site.
- Your FFmpeg build should have a `bin` folder with 3 exe files and some dll files. Copy all of those files to a folder such as `c:\ffmpeg`.
- Within your application's startup code (the `Main` method), set `Unosquare.FFME.MediaElement.FFmpegDirectory = @"c:\ffmpeg";`.
- Use the FFME `MediaElement` control as you would any other WPF control. For example, in your `MainForm.xaml`, add the namespace `xmlns:ffme="clr-namespace:Unosquare.FFME;assembly=ffme.win"` and then add the FFME control to your window's XAML: `<ffme:MediaElement x:Name="Media" Background="Gray" LoadedBehavior="Play" UnloadedBehavior="Manual" />`
- To play files or streams, simply set the `Source` property: `Media.Source = new Uri(@"c:\your-file-here");`
- Remember: the `Unosquare.FFME.Windows.Sample` project provides plenty of usage examples.
- The generated API documentation is available here.
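Putting the steps above together, the startup wiring might look like this (a sketch; the window class name and file paths are illustrative, and `Media` is the `x:Name` given to the control in XAML):

```csharp
using System;
using System.Windows;

public partial class MainWindow : Window
{
    public MainWindow()
    {
        // Tell FFME where the FFmpeg binaries live before any MediaElement is used.
        Unosquare.FFME.MediaElement.FFmpegDirectory = @"c:\ffmpeg";
        InitializeComponent();

        // Start playback once the window has loaded.
        Loaded += (s, e) => Media.Source = new Uri(@"c:\your-file-here");
    }
}
```

In the sample application, the `FFmpegDirectory` assignment lives in the `App.xaml.cs` constructor, which guarantees it runs before any control is created.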
FFME is a close (and, I'd like to think, better) drop-in replacement for Microsoft's WPF MediaElement control. While the standard MediaElement uses DirectX (DirectShow) for media playback, FFME uses FFmpeg to read and decode audio and video. This means that for those of you who want to support things like HLS playback, or who just don't want to go through the hassle of installing codecs on client machines, FFME might just be the answer.
FFME provides multiple improvements over the standard MediaElement such as:
- Fast media seeking and frame-by-frame seeking
- Properties such as Position, Balance, SpeedRatio, IsMuted, and Volume are all Dependency Properties!
- Additional and extended media events. Extracting (and modifying) video, audio and subtitle frames is very easy.
- Ability to easily apply FFmpeg video and audio filtergraphs.
- Ability to extract media metadata and tech specs of a media stream (title, album, bit rate, codecs, FPS, etc).
- Ability to apply volume, balance and speed ratio to media playback.
- MediaState actually works on this control. The standard WPF MediaElement severely lacks in this area.
- Ability to pick media streams contained in a file or a URL.
- Ability to pass input and codec parameters.
- Ability to introduce hardware decoding acceleration via devices or via codecs.
... all in a single MediaElement control
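As a quick illustration of the dependency properties mentioned above (assuming the control is named `Media` as in the quick-start XAML):

```csharp
// All of these are WPF dependency properties, so they can also be data-bound.
Media.Volume = 0.8;                         // 80% of full volume
Media.Balance = -0.5;                       // pan halfway to the left channel
Media.SpeedRatio = 1.5;                     // play at 1.5x speed
Media.IsMuted = false;
Media.Position = TimeSpan.FromSeconds(30);  // jump to the 30-second mark
```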
FFME also supports opening capture devices. See the example source URLs below and issue #48:

```
device://dshow/?audio=Microphone (Vengeance 2100)
device://gdigrab?title=Command Prompt
device://gdigrab?desktop
```
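For instance, capturing the desktop could be as simple as pointing `Source` at one of these URLs (assuming a control named `Media`):

```csharp
// Screen capture via gdigrab, opened like any other media source.
Media.Source = new Uri("device://gdigrab?desktop");
```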
If you would like the audio to keep its pitch while you change the `SpeedRatio` property, you will need the SoundTouch.dll library available in the same directory as your application. You can get the SoundTouch library here.
First off, let's review a few concepts. A `packet` is a group of bytes read from the input. All `packets` are of a specific `MediaType` (Audio, Video, Subtitle, Data), and they contain some timing information and, most importantly, compressed data. Packets are sent to a `codec` and, in turn, the codec produces `frames`. Please note that producing 1 `frame` does not always take exactly 1 `packet`: a `packet` may contain many `frames`, but a `frame` may also require several `packets` for the decoder to build it. `Frames` contain timing information and the raw, uncompressed data. Now, you may think you can take `frames` and show pixels on the screen or send samples to the sound card. We are close, but we still need to do some additional processing. It turns out that different `codecs` produce different uncompressed data formats. For example, some video codecs output pixel data in ARGB, some others in RGB, and some others in YUV420. Therefore, we need to `Convert` these frames into something all hardware can use natively. I call these converted frames `MediaBlocks`. These `MediaBlocks` contain uncompressed data in standard audio and video formats that all hardware is able to receive.
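The packet → frame → block pipeline described above can be sketched as follows (the type and method names here are illustrative, not the actual FFME API):

```csharp
// Illustrative decoding pipeline; not real FFME types.
Packet packet = input.ReadPacket();                   // compressed bytes + timing info
components[packet.MediaType].Packets.Enqueue(packet);

// A frame may need several packets, and a packet may yield several frames.
foreach (Frame frame in codec.Decode(components[packet.MediaType].Packets))
{
    // Convert codec-specific formats (e.g. YUV420 pixels) into a standard
    // format (e.g. RGB pixels or PCM samples) the hardware understands.
    MediaBlock block = converter.Convert(frame);
    renderer.Render(block);
}
```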
The process described above is implemented in 3 different layers:

- The `MediaContainer` wraps an input stream. This layer keeps track of a `MediaComponentSet`, which is nothing more than a collection of `MediaComponent` objects. Each `MediaComponent` holds `packet` caching, `frame` decoding, and `block` conversion logic. It provides the following important functionality:
  - We call `Open` to open the input stream and detect the different stream components. This also determines the codecs to use.
  - We call `Read` to read the next available packet and store it in its corresponding component (audio, video, subtitle, data, etc.).
  - We call `Decode` to read the following packet from the queue that each of the components holds, and return a set of frames.
  - Finally, we call `Convert` to turn a given `frame` into a `MediaBlock`.
- The `MediaEngine` wraps a `MediaContainer` and is responsible for executing commands to control the input stream (Play, Pause, Stop, Seek, etc.) while keeping 3 background workers.
  - The `PacketReadingWorker` is designed to continuously read packets from the `MediaContainer`. It reads packets when it needs them and pauses when it does not. This is determined by how much data is in the cache; it tries to keep approximately 1 second of media packets buffered at all times.
  - The `FrameDecodingWorker` takes the packets that the `PacketReadingWorker` writes and decodes them into frames. It then converts those frames into `blocks` and writes them to a `MediaBlockBuffer`. This block buffer can then be read by something else (the worker described next) so its contents can be rendered.
  - Finally, the `BlockRenderingWorker` reads blocks from the `MediaBlockBuffer`s and sends those blocks to a platform-specific `IMediaRenderer`.
- At the highest level, we have a `MediaElement`. It wraps a `MediaEngine` and contains platform-specific implementations of methods to perform things like audio rendering, video rendering, subtitle rendering, and property synchronization between the `MediaEngine` and itself.
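The cooperation between the three workers can be sketched like this (illustrative C#, not the actual implementation):

```csharp
// Illustrative worker loops; the real FFME workers are more elaborate.
void PacketReadingWorkerLoop()
{
    while (!stopped)
    {
        if (container.BufferedDuration < TimeSpan.FromSeconds(1))
            container.Read();   // pull the next packet into its component
        else
            WaitABit();         // enough packets cached; pause reading
    }
}

void FrameDecodingWorkerLoop()
{
    while (!stopped)
        foreach (var frame in container.Decode())      // packets -> frames
            blockBuffer.Add(container.Convert(frame)); // frames -> blocks
}

void BlockRenderingWorkerLoop()
{
    while (!stopped)
        renderer.Render(blockBuffer.BlockAt(clock.Position)); // blocks -> output
}
```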
A high-level diagram is provided as additional reference below.
Your help is welcome!
- I am planning the next version of this control, `Floyd`. See the Issues section.
Please note that I am unable to distribute FFmpeg's binaries because I don't know if I am allowed to do so. Follow the instructions below to compile, run and test FFME.
- Clone this repository.
- Download the FFmpeg win32-shared binaries from Zeranoe FFmpeg Builds.
- Extract the contents of the `zip` file you just downloaded and go to the `bin` folder that got extracted. You should see 3 `exe` files and multiple `dll` files. Select and copy all of them.
- Now paste all 11 files from the prior step into a well-known folder and take note of the full path. (I used `c:\ffmpeg\`.)
- Open the solution and set the `Unosquare.FFME.Windows.Sample` project as the startup project. You can do this by right-clicking on the project and selecting `Set as startup project`.
- Under the `Unosquare.FFME.Windows.Sample` project, find the file `App.xaml.cs`, and in the constructor, locate the line `MediaElement.FFmpegDirectory = @"c:\ffmpeg";`. Replace the path so that it points to the folder where you extracted your FFmpeg binaries (dll files).
- Click on `Start` to run the project.
- You should see a sample media player. Click on the `Open` icon located at the bottom right and enter a URL or path to a media file.
- The file or URL should play immediately, and all the properties should display to the right of the media display when you click on the `Info` icon.
- You can use the resulting compiled assemblies in your project without further dependencies. Look for both `ffme.common.dll` and `ffme.win.dll`.
If you get the following error when compiling: *The current .NET SDK does not support targeting .NET Standard 2.0. Either target .NET Standard 1.6 or lower, or use a version of the .NET SDK that supports .NET Standard 2.0.* Simply download and install the .NET Core SDK v2.0 or later.
Compile FFmpeg for Mac (instructions can be found in FFmpeg.AutoGen) and copy the following libraries from `/opt/local/lib` to `/Users/{USER}/ffmpeg` (equivalent to `~/ffmpeg`):

```
libavcodec.57.dylib
libavdevice.57.dylib
libavfilter.6.dylib
libavformat.57.dylib
libavutil.55.dylib
libswresample.2.dylib
libswscale.4.dylib
```

Note: when building FFmpeg locally, the compiled libraries are named differently than in the list above; e.g. `libavcodec.57.dylib` is actually named `libavcodec.57.89.100.dylib`. To properly load the libraries, copy and rename each library to match the format in the list above.
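The copy-and-rename step can be scripted. A minimal sketch, assuming the source path `/opt/local/lib` and the version pattern shown above:

```shell
#!/bin/sh
# Strip the minor/patch version segments from a dylib file name,
# e.g. libavcodec.57.89.100.dylib -> libavcodec.57.dylib
short_name() {
  printf '%s\n' "$1" | sed -E 's/^(lib[a-z]+\.[0-9]+)(\.[0-9]+)*\.dylib$/\1.dylib/'
}

SRC=/opt/local/lib
DST="$HOME/ffmpeg"
mkdir -p "$DST"

for f in "$SRC"/lib*.*.*.*.dylib; do
  [ -e "$f" ] || continue  # skip when the glob matched nothing
  cp "$f" "$DST/$(short_name "$(basename "$f")")"
done
```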
In the sample macOS player, the FFmpeg folder is configured to point to `~/ffmpeg` in the following line of code:

```csharp
Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.UserProfile), "ffmpeg");
```

Note that this can be customized to point to any other folder. When distributing the player and the associated libraries with your application, the library files should be added to the project as `BundleResource` items, and each library should be copied to the output directory on build. Afterwards, change the above configuration to use `Environment.CurrentDirectory` to search for the FFmpeg libraries.
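For example (a sketch; the exact place where the sample sets this property may differ):

```csharp
// Search for the bundled FFmpeg libraries next to the executable
// instead of in ~/ffmpeg.
Unosquare.FFME.MediaElement.FFmpegDirectory = Environment.CurrentDirectory;
```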
Make sure you have Xamarin for Visual Studio 2017 installed if you want to open the MacOS projects.
In no particular order
- To the FFmpeg team for making the Swiss Army Knife of media. I encourage you to donate to them.
- To Kyle Schwarz for creating and making Zeranoe FFmpeg builds available to everyone.
- To the NAudio team for making the best audio library out there for .NET -- one day I will contribute some improvements I have noticed they need.
- To Ruslan Balanukhin for his FFmpeg interop bindings generator tool: FFmpeg.AutoGen.
- To Martin Bohme for his tutorial on creating a video player with FFmpeg.
- To Barry Mieny for his beautiful FFmpeg logo.
- Please refer to the LICENSE file for more information.