Streaming H.264 via RTP

Table of Contents

Seeding videotestsrc through x264enc to network
Seeding v4l2src through x264enc to network
Seeding camera encoded H.264 from v4l2src to network
Seeding camera encoded H.264 from uvch264_src to network
Playback on the Raspberry Pi
H.264 encoded local file playback
Seeding camera encoded H.264 from uvch264_src to the network
vlc seeding an H.264 movie (via RTP/RTSP)

Seeding videotestsrc through x264enc to network

On the video source:

$ gst-launch-0.10 -v videotestsrc ! 'video/x-raw-rgb,width=320' ! ffmpegcolorspace ! x264enc ! rtph264pay ! udpsink host=192.168.2.112 port=9078
Setting pipeline to PAUSED ...
/GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0.GstPad:src: caps = video/x-raw-rgb, framerate=(fraction)30/1, width=(int)320, height=(int)240, bpp=(int)24, depth=(int)24, endianness=(int)4321, red_mask=(int)16711680, green_mask=(int)65280, blue_mask=(int)255
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw-rgb, framerate=(fraction)30/1, width=(int)320, height=(int)240, bpp=(int)24, depth=(int)24, endianness=(int)4321, red_mask=(int)16711680, green_mask=(int)65280, blue_mask=(int)255
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw-rgb, framerate=(fraction)30/1, width=(int)320, height=(int)240, bpp=(int)24, depth=(int)24, endianness=(int)4321, red_mask=(int)16711680, green_mask=(int)65280, blue_mask=(int)255
/GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)I420, framerate=(fraction)30/1, width=(int)320, height=(int)240
/GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp0.GstPad:sink: caps = video/x-raw-rgb, framerate=(fraction)30/1, width=(int)320, height=(int)240, bpp=(int)24, depth=(int)24, endianness=(int)4321, red_mask=(int)16711680, green_mask=(int)65280, blue_mask=(int)255
/GstPipeline:pipeline0/GstX264Enc:x264enc0.GstPad:src: caps = video/x-h264, width=(int)320, height=(int)240, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, codec_data=(buffer)014d4015ffe10017674d4015eca0a0fd8088000003000bb9aca00078b16cb001000468ebecb2, stream-format=(string)avc, alignment=(string)au, level=(string)2.1, profile=(string)main
/GstPipeline:pipeline0/GstX264Enc:x264enc0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)I420, framerate=(fraction)30/1, width=(int)320, height=(int)240
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z01AFeygoP2AiAAAAwALuaygAHixbLA\\=\\,aOvssg\\=\\=\", payload=(int)96, ssrc=(uint)2261304474, clock-base=(uint)3463796513, seqnum-base=(uint)24047
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = video/x-h264, width=(int)320, height=(int)240, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, codec_data=(buffer)014d4015ffe10017674d4015eca0a0fd8088000003000bb9aca00078b16cb001000468ebecb2, stream-format=(string)avc, alignment=(string)au, level=(string)2.1, profile=(string)main
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: timestamp = 3463796513
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: seqnum = 24047
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z01AFeygoP2AiAAAAwALuaygAHixbLA\\=\\,aOvssg\\=\\=\", payload=(int)96, ssrc=(uint)2261304474, clock-base=(uint)3463796513, seqnum-base=(uint)24047
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
udpsink is the UDP client side: in this example it sends the RTP packets to 192.168.2.112, where a receiver must be listening on UDP port 9078.

udpsrc is the server side: in the example below it listens on port 9078 on all network interfaces, interprets the incoming packets according to the given caps (RTP, H.264, ...) and plays the stream in an X window:

$ gst-launch -v udpsrc port=9078 ! 'application/x-rtp,payload=96,encoding-name=H264' ! rtph264depay ! h264parse ! ffdec_h264 ! xvimagesink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, encoding-name=(string)H264, payload=(int)96, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)nal
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = application/x-rtp, encoding-name=(string)H264, payload=(int)96, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)nal, parsed=(boolean)true
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)nal
/GstPipeline:pipeline0/ffdec_h264:ffdec_h2640.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)nal, parsed=(boolean)true
/GstPipeline:pipeline0/ffdec_h264:ffdec_h2640.GstPad:src: caps = video/x-raw-yuv, width=(int)320, height=(int)240, framerate=(fraction)0/1, format=(fourcc)I420, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstXvImageSink:xvimagesink0.GstPad:sink: caps = video/x-raw-yuv, width=(int)320, height=(int)240, framerate=(fraction)0/1, format=(fourcc)I420, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1
RTP "payload=96" denotes a dynamically allocated RTP payload type. "encoding-name=H264" specifies that we expect an RTP stream whose dynamic payload type carries H.264 encoded video.
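Since no SDP is exchanged over plain UDP, a receiver that starts late may never see the H.264 SPS/PPS parameter sets. A sketch of a slightly more robust setup, assuming a gst-plugins-good recent enough for rtph264pay's config-interval property (untested here): make the sender repeat the parameter sets in-band every second, and put the full RTP caps directly on udpsrc instead of using a separate capsfilter.

# Sender: repeat SPS/PPS every second so a receiver can join mid-stream
$ gst-launch-0.10 -v videotestsrc ! 'video/x-raw-rgb,width=320' ! ffmpegcolorspace ! x264enc ! rtph264pay config-interval=1 ! udpsink host=192.168.2.112 port=9078
# Receiver: pass the RTP caps via udpsrc's caps property
$ gst-launch-0.10 -v udpsrc port=9078 caps='application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96' ! rtph264depay ! h264parse ! ffdec_h264 ! xvimagesink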

Seeding v4l2src through x264enc to network

On the PC with the camera:

$ gst-launch-0.10 -v v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=640,height=480' ! ffmpegcolorspace ! x264enc ! rtph264pay ! udpsink host=192.168.2.112 port=9078
Setting pipeline to PAUSED ...
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstX264Enc:x264enc0.GstPad:src: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, codec_data=(buffer)014d401effe10017674d401eeca0501ed8088000000300bb9aca00078b16cb01000468ebecb2, stream-format=(string)avc, alignment=(string)au, level=(string)3, profile=(string)main
/GstPipeline:pipeline0/GstX264Enc:x264enc0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z01AHuygUB7YCIAAAAMAu5rKAAeLFss\\=\\,aOvssg\\=\\=\", payload=(int)96, ssrc=(uint)3622465077, clock-base=(uint)1117403979, seqnum-base=(uint)36012
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, codec_data=(buffer)014d401effe10017674d401eeca0501ed8088000000300bb9aca00078b16cb01000468ebecb2, stream-format=(string)avc, alignment=(string)au, level=(string)3, profile=(string)main
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: timestamp = 1117407373
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: seqnum = 36012
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z01AHuygUB7YCIAAAAMAu5rKAAeLFss\\=\\,aOvssg\\=\\=\", payload=(int)96, ssrc=(uint)3622465077, clock-base=(uint)1117403979, seqnum-base=(uint)36012
Basically this is the same pipeline as above, except that we are now capturing frames from the camera instead of generating them with videotestsrc.

The playback works with the same command as above.

For some reason I only get about one frame every two seconds, which is painfully slow. Software encoding and decoding probably consume most of the available processing power.
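If encoding is indeed the bottleneck, the encoder settings are the obvious place to save cycles. A sketch, assuming an x264enc build that exposes the speed-preset, tune and bitrate properties (untested here), that trades quality for speed and latency:

# Faster, low-latency encoder settings (bitrate is in kbit/s)
$ gst-launch-0.10 -v v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=640,height=480' ! ffmpegcolorspace ! x264enc speed-preset=ultrafast tune=zerolatency bitrate=512 ! rtph264pay ! udpsink host=192.168.2.112 port=9078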

Seeding camera encoded H.264 from v4l2src to network

Even though video/x-h264 shows up in gst-inspect-0.10 uvch264_src:

$ gst-inspect-0.10 uvch264_src
...
      video/x-h264
                  width: [ 1, 2147483647 ]
                 height: [ 1, 2147483647 ]
              framerate: [ 0/1, 2147483647/1 ]
          stream-format: { byte-stream, avc }
              alignment: { au }
                profile: { high, main, baseline, constrained-baseline }
...
the following fails even with the Debian testing version of gstreamer-0.10:
$ gst-launch-0.10 -v v4l2src device=/dev/video0 ! 'video/x-h264' ! rtph264pay ! udpsink host=192.168.2.112 port=9078
WARNING: erroneous pipeline: could not link v4l2src0 to rtph264pay0
However, it works with vontaene.de's gstreamer-1.0 packages and with a self-compiled recent version of gstreamer-0.10:
$ gst-launch-0.10 -v v4l2src device=/dev/video0 ! 'video/x-h264,width=800,framerate=15/2' ! rtph264pay ! udpsink host=192.168.2.112 port=9078
Setting pipeline to PAUSED ...
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-h264, width=(int)800, height=(int)600, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/2, stream-format=(string)byte-stream, alignment=(string)nal
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-h264, width=(int)800, height=(int)600, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/2, stream-format=(string)byte-stream, alignment=(string)nal
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-h264, width=(int)800, height=(int)600, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/2, stream-format=(string)byte-stream, alignment=(string)nal
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = video/x-h264, width=(int)800, height=(int)600, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/2, stream-format=(string)byte-stream, alignment=(string)nal
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z0JAKLtAZAmvLgKJAAADAAEAAAMAD2BAAC3GwALce974XhEI1AA\\=\\,aM44gAA\\=\", payload=(int)96, ssrc=(uint)1035602484, clock-base=(uint)2870417857, seqnum-base=(uint)6633
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: timestamp = 2870417857
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: seqnum = 6633
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z0JAKLtAZAmvLgKJAAADAAEAAAMAD2BAAC3GwALce974XhEI1AA\\=\\,aM44gAA\\=\", payload=(int)96, ssrc=(uint)1035602484, clock-base=(uint)2870417857, seqnum-base=(uint)6633
Playback on an i686 PC on the same LAN performed perfectly:
$ gst-launch -v udpsrc port=9078 ! 'application/x-rtp,payload=96,encoding-name=H264' ! rtph264depay ! h264parse ! ffdec_h264 ! xvimagesink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, payload=(int)96, encoding-name=(string)H264, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = application/x-rtp, payload=(int)96, encoding-name=(string)H264, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)01424028ffe1002567424028bb406409af2e02890000030001000003000f6040002dc6c002dc7bdef85e1108d401000468ce3880
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)01424028ffe1002567424028bb406409af2e02890000030001000003000f6040002dc6c002dc7bdef85e1108d401000468ce3880, width=(int)800, height=(int)600, parsed=(boolean)true, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)01424028ffe1002567424028bb406409af2e02890000030001000003000f6040002dc6c002dc7bdef85e1108d401000468ce3880
/GstPipeline:pipeline0/ffdec_h264:ffdec_h2640.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)01424028ffe1002567424028bb406409af2e02890000030001000003000f6040002dc6c002dc7bdef85e1108d401000468ce3880, width=(int)800, height=(int)600, parsed=(boolean)true, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/ffdec_h264:ffdec_h2640.GstPad:src: caps = video/x-raw-yuv, width=(int)800, height=(int)600, framerate=(fraction)7/1, format=(fourcc)I420, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstXvImageSink:xvimagesink0.GstPad:sink: caps = video/x-raw-yuv, width=(int)800, height=(int)600, framerate=(fraction)7/1, format=(fourcc)I420, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1
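For reference, since the same thing reportedly works with vontaene.de's gstreamer-1.0, the equivalent 1.0 pipelines would presumably look like the following. This is only a sketch using the 1.0 element names (avdec_h264 instead of ffdec_h264) and has not been verified on this setup.

# Sender (gstreamer-1.0): the camera delivers H.264 directly
$ gst-launch-1.0 -v v4l2src device=/dev/video0 ! 'video/x-h264,width=800,framerate=15/2' ! h264parse ! rtph264pay ! udpsink host=192.168.2.112 port=9078
# Receiver (gstreamer-1.0)
$ gst-launch-1.0 -v udpsrc port=9078 caps='application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96' ! rtph264depay ! h264parse ! avdec_h264 ! xvimagesink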

Seeding camera encoded H.264 from uvch264_src to network

Since we have a fancy Logitech C920, we of course want to stream the camera's own H.264 encoding directly. This works with gstreamer-1.0 and v4l2src, but without bitrate control. Because we need bitrate control to adapt to variable internet bandwidth, we have to use uvch264_src instead of v4l2src. There is no pre-compiled version available, so it needs to be compiled from source.

Be careful: after compilation it turned out to be pretty experimental stuff. It hung the system the first time I tried it, and the filesystem was destroyed beyond recovery; see also issue 3. As I learned later, this may also have been related to overclocking, though ...

After I finally got it compiled, I tried Kakaroto's example on my PC:

$ gst-launch-0.10 -v uvch264_src device=/dev/video1 name=src auto-start=true src.vfsrc ! queue ! "video/x-raw-yuv,width=320,height=240,framerate=30/1" ! xvimagesink . src.vidsrc ! queue ! video/x-h264,width=1920,height=1080,framerate=30/1,profile=constrained-baseline ! h264parse ! ffdec_h264 ! xvimagesink
This should open two windows, one for the preview and one for the high-resolution video. Make sure that at least the source pads vfsrc and vidsrc are connected to a sink when using uvch264_src. If the above command gives a static (not moving) picture, try recompiling gstreamer-0.10 from recent source code. You might need to recompile your plugins as well (especially the plugins-bad package). Doing so worked for me. You may also want to read more about how to compile gstreamer. After recompiling, streaming from uvch264_src finally worked:
$ gst-launch-0.10 uvch264_src device=/dev/video0 name=src auto-start=true src.vidsrc ! queue ! 'video/x-h264,width=800,framerate=15/2' ! rtph264pay ! udpsink host=192.168.2.112 port=9078 . src.vfsrc ! queue ! fakesink
Setting pipeline to PAUSED ...
0:00:01.805021109  3599   0xdc4660 ERROR               GST_CAPS gstpad.c:2275:gst_pad_get_caps_unlocked:<uvch264mjpgdemux0:sink> pad returned caps ANY which are not a real subset of its template caps image/jpeg, width=(int)[ 0, 2147483647 ], height=(int)[ 0, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]

(gst-launch-0.10:3599): GStreamer-WARNING **: pad uvch264mjpgdemux0:sink returned caps which are not a real subset of its template caps
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
0:00:02.143332944  3599   0xdcf580 ERROR               GST_CAPS gstpad.c:2275:gst_pad_get_caps_unlocked:<uvch264mjpgdemux0:sink> pad returned caps ANY which are not a real subset of its template caps image/jpeg, width=(int)[ 0, 2147483647 ], height=(int)[ 0, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]

(gst-launch-0.10:3599): GStreamer-WARNING **: pad uvch264mjpgdemux0:sink returned caps which are not a real subset of its template caps

Despite the errors and warnings it does work perfectly. Receiving the video stream on an i686 PC:

$ gst-launch -v udpsrc port=9078 ! 'application/x-rtp,payload=96,encoding-name=H264' ! rtph264depay ! h264parse ! ffdec_h264 ! xvimagesink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, payload=(int)96, encoding-name=(string)H264, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = application/x-rtp, payload=(int)96, encoding-name=(string)H264, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)01640028ffe1002667640028ac7680c8137e5c051200000300020000030078c080005b8d8005b8f7bdf0bc2211a801000468ee38b0
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)01640028ffe1002667640028ac7680c8137e5c051200000300020000030078c080005b8d8005b8f7bdf0bc2211a801000468ee38b0, width=(int)800, height=(int)600, parsed=(boolean)true, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)01640028ffe1002667640028ac7680c8137e5c051200000300020000030078c080005b8d8005b8f7bdf0bc2211a801000468ee38b0
/GstPipeline:pipeline0/ffdec_h264:ffdec_h2640.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)01640028ffe1002667640028ac7680c8137e5c051200000300020000030078c080005b8d8005b8f7bdf0bc2211a801000468ee38b0, width=(int)800, height=(int)600, parsed=(boolean)true, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/ffdec_h264:ffdec_h2640.GstPad:src: caps = video/x-raw-yuv, width=(int)800, height=(int)600, framerate=(fraction)30/1, format=(fourcc)I420, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstXvImageSink:xvimagesink0.GstPad:sink: caps = video/x-raw-yuv, width=(int)800, height=(int)600, framerate=(fraction)30/1, format=(fourcc)I420, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1
Great, now I can manipulate the camera's encoding bitrate and reduce the required bandwidth.
$ gst-launch-0.10 uvch264_src device=/dev/video0 name=src initial-bitrate=450000 auto-start=true src.vfsrc ! fakesink . src.vidsrc ! queue ! 'video/x-h264,width=1280,framerate=5/1' ! rtph264pay ! udpsink host=192.168.2.112 port=9078
$ ifstat
  KB/s in  KB/s out
   62.05      0.00
   60.88      0.00
   59.09      0.19
   61.65      0.00
   59.69      0.00
   60.90      0.00
   60.21      3.16
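About 60 KB/s inbound is roughly 60 × 8 ≈ 480 kbit/s, which is the right ballpark for the requested initial-bitrate of 450000 bit/s plus RTP/UDP/IP overhead.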
Now the Raspberry has to do the playback ...

Playback on the Raspberry Pi

H.264 encoded local file playback

First get a sample H.264 encoded video file, e.g.:

$ wget http://ftp.akl.lt/Video/Big_Buck_Bunny/big_buck_bunny_480p_h264.mov
Play it using omxplayer:
$ omxplayer big_buck_bunny_480p_h264.mov
Video codec omx-h264 width 854 height 480 profile 77 fps 24.000000
Audio codec aac channels 6 samplerate 48000 bitspersample 16
Subtitle count: 0, state: off, index: 1, delay: 0
Play it using playbin (which does not work due to missing audio support so far):
$ gst-launch-1.0 playbin uri=file:///root/big_buck_bunny_480p_h264.mov 
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Cannot connect to server socket err = No such file or directory
Cannot connect to server request channel
jack server is not running or cannot be started
ERROR: from element /GstPlayBin:playbin0/GstPlaySink:playsink: The autoaudiosink element is not working.
Additional debug info:
gstplaysink.c(2696): gen_audio_chain (): /GstPlayBin:playbin0/GstPlaySink:playsink
ERROR: pipeline doesn't want to preroll.
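A possible workaround, not tested here, would be to keep playbin but send the audio to a fakesink via playbin's audio-sink property, so that the failing autoaudiosink is never instantiated:

# playbin with the audio discarded instead of relying on autoaudiosink
$ gst-launch-1.0 playbin uri=file:///root/big_buck_bunny_480p_h264.mov audio-sink=fakesink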
Play it manually (which does work, because only a video sink is specified and the audio gets discarded):
$ gst-launch-1.0 -v filesrc location=big_buck_bunny_480p_h264.mov ! decodebin ! autovideosink
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = video/quicktime
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = video/quicktime
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstQTDemux:qtdemux0.GstPad:sink: caps = video/quicktime
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstMultiQueue:multiqueue0.GstPad:src_0: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)3, profile=(string)main, codec_data=(buffer)014d401effe10015274d401ea9181b07bcde00d4040406db0ad7bdf01001000428de09c8, width=(int)853, height=(int)480, framerate=(fraction)24/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)3, profile=(string)main, codec_data=(buffer)014d401effe10015274d401ea9181b07bcde00d4040406db0ad7bdf01001000428de09c8, width=(int)853, height=(int)480, framerate=(fraction)24/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)3, profile=(string)main, codec_data=(buffer)014d401effe10015274d401ea9181b07bcde00d4040406db0ad7bdf01001000428de09c8, width=(int)853, height=(int)480, framerate=(fraction)24/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstMultiQueue:multiqueue0.GstPad:src_1: caps = audio/mpeg, mpegversion=(int)4, framed=(boolean)true, stream-format=(string)raw, level=(string)4, base-profile=(string)lc, profile=(string)lc, codec_data=(buffer)11b0, rate=(int)48000, channels=(int)6, channel-mask=(bitmask)0x0000000000000000
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstMultiQueue:multiqueue0.GstPad:sink_1: caps = audio/mpeg, mpegversion=(int)4, framed=(boolean)true, stream-format=(string)raw, level=(string)4, base-profile=(string)lc, profile=(string)lc, codec_data=(buffer)11b0, rate=(int)48000, channels=(int)6, channel-mask=(bitmask)0x0000000000000000
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstAacParse:aacparse0.GstPad:src: caps = audio/mpeg, mpegversion=(int)4, framed=(boolean)true, stream-format=(string)raw, level=(string)4, base-profile=(string)lc, profile=(string)lc, codec_data=(buffer)11b0, rate=(int)48000, channels=(int)6, channel-mask=(bitmask)0x0000000000000000
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstFaad:faad0.GstPad:sink: caps = audio/mpeg, mpegversion=(int)4, framed=(boolean)true, stream-format=(string)raw, level=(string)4, base-profile=(string)lc, profile=(string)lc, codec_data=(buffer)11b0, rate=(int)48000, channels=(int)6, channel-mask=(bitmask)0x0000000000000000
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstAacParse:aacparse0.GstPad:sink: caps = audio/mpeg, mpegversion=(int)4, framed=(boolean)true, stream-format=(string)raw, level=(string)4, base-profile=(string)lc, profile=(string)lc, codec_data=(buffer)11b0, rate=(int)48000, channels=(int)6, channel-mask=(bitmask)0x0000000000000000
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstMultiQueue:multiqueue0: max-size-buffers = 5
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstMultiQueue:multiqueue0: max-size-time = 0
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstMultiQueue:multiqueue0: max-size-bytes = 2097152
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)3, profile=(string)main, width=(int)854, height=(int)480, framerate=(fraction)24/1, pixel-aspect-ratio=(fraction)1/1, parsed=(boolean)true
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstFaad:faad0.GstPad:src: caps = audio/x-raw, format=(string)S16LE, layout=(string)interleaved, rate=(int)48000, channels=(int)6, channel-mask=(bitmask)0x000000000000003f
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)3, profile=(string)main, width=(int)854, height=(int)480, framerate=(fraction)24/1, pixel-aspect-ratio=(fraction)1/1, parsed=(boolean)true
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)3, profile=(string)main, width=(int)854, height=(int)480, framerate=(fraction)24/1, pixel-aspect-ratio=(fraction)1/1, parsed=(boolean)true
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)3, profile=(string)main, width=(int)854, height=(int)480, framerate=(fraction)24/1, pixel-aspect-ratio=(fraction)1/1, parsed=(boolean)true
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)3, profile=(string)main, width=(int)854, height=(int)480, framerate=(fraction)24/1, pixel-aspect-ratio=(fraction)1/1, parsed=(boolean)true
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)3, profile=(string)main, width=(int)854, height=(int)480, framerate=(fraction)24/1, pixel-aspect-ratio=(fraction)1/1, parsed=(boolean)true
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)854, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)24/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstMultiQueue:multiqueue0: max-size-buffers = 5
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstMultiQueue:multiqueue0: max-size-time = 0
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstMultiQueue:multiqueue0: max-size-bytes = 2097152
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstFaad:faad0.GstPad:src: caps = audio/x-raw, format=(string)S16LE, layout=(string)interleaved, rate=(int)48000, channels=(int)6, channel-mask=(bitmask)0x000000000000003f
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstFaad:faad0.GstPad:src: caps = audio/x-raw, format=(string)S16LE, layout=(string)interleaved, rate=(int)48000, channels=(int)6, channel-mask=(bitmask)0x000000000000003f
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstDecodePad:src_1: caps = audio/x-raw, format=(string)S16LE, layout=(string)interleaved, rate=(int)48000, channels=(int)6, channel-mask=(bitmask)0x000000000000003f
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstDecodePad:src_1.GstProxyPad:proxypad8: caps = audio/x-raw, format=(string)S16LE, layout=(string)interleaved, rate=(int)48000, channels=(int)6, channel-mask=(bitmask)0x000000000000003f
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstEglGlesSink:autovideosink0-actual-sink-eglgles.GstPad:sink: caps = video/x-raw, format=(string)I420, width=(int)854, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)24/1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink: caps = video/x-raw, format=(string)I420, width=(int)854, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)24/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstDecodePad:src_0.GstProxyPad:proxypad10: caps = video/x-raw, format=(string)I420, width=(int)854, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)24/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)854, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)24/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstDecodePad:src_0: caps = video/x-raw, format=(string)I420, width=(int)854, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)24/1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink.GstProxyPad:proxypad1: caps = video/x-raw, format=(string)I420, width=(int)854, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)24/1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstEglGlesSink:autovideosink0-actual-sink-eglgles.GstPad:sink: caps = video/x-raw, format=(string)I420, width=(int)854, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)24/1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink: caps = video/x-raw, format=(string)I420, width=(int)854, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)24/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstDecodePad:src_0.GstProxyPad:proxypad10: caps = video/x-raw, format=(string)I420, width=(int)854, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)24/1
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)3, profile=(string)main, width=(int)854, height=(int)480, framerate=(fraction)24/1, pixel-aspect-ratio=(fraction)1/1, parsed=(boolean)true
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)3, profile=(string)main, width=(int)854, height=(int)480, framerate=(fraction)24/1, pixel-aspect-ratio=(fraction)1/1, parsed=(boolean)true
As you can see, gstreamer-1.0's decodebin automatically plugs omxh264dec between source and sink to decode the incoming H.264 to raw video, which is really nice.
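The auto-plugging can also be spelled out by hand. A sketch of an equivalent explicit pipeline without decodebin, assuming the file demuxes with qtdemux as in the log above (untested here):

# Explicit pipeline: demux, parse and feed the H.264 to the hardware decoder
$ gst-launch-1.0 -v filesrc location=big_buck_bunny_480p_h264.mov ! qtdemux ! h264parse ! omxh264dec ! autovideosink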

Note: uvch264_src does not seem to go to PLAYING on the Raspberry Pi if there is no queue element in the pipeline.

Seeding camera encoded H.264 from uvch264_src to the network

A Logitech C920 was attached to a Raspberry Pi and used to stream video to the network:

$ gst-launch-0.10 -v uvch264_src device=/dev/video0 name=src mode=2 auto-start=true src.vfsrc ! queue ! 'video/x-raw-yuv,width=320,height=240' ! fakesink . src.vidsrc ! queue ! 'video/x-h264,width=640,height=480' ! rtph264pay ! udpsink host=192.168.2.109 port=9078
Setting pipeline to PAUSED ...
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: num-buffers = -1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: device = "/dev/video0"
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: num-buffers = -1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: device = "/dev/video0"
/GstPipeline:pipeline0/GstUvcH264Src:src/GstCapsFilter:capsfilter2: caps = video/x-raw-yuv, width=(int)320, height=(int)240
/GstPipeline:pipeline0/GstUvcH264Src:src: ready-for-capture = FALSE
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: num-buffers = -1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: device = "/dev/video0"
/GstPipeline:pipeline0/GstUvcH264Src:src/GstCapsFilter:capsfilter3: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string){ nal, au }, width=(int)640, height=(int)480; video/x-h264, stream-format=(string)avc, alignment=(string)au, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstUvcH264Src:src/GstCapsFilter:capsfilter4: caps = video/x-raw-yuv, width=(int)320, height=(int)240
/GstPipeline:pipeline0/GstUvcH264Src:src/GstUvcH264MjpgDemux:uvch264mjpgdemux0: num-clock-samples = 0
/GstPipeline:pipeline0/GstUvcH264Src:src/GstUvcH264MjpgDemux:uvch264mjpgdemux0: device-fd = 6
/GstPipeline:pipeline0/GstUvcH264Src:src/GstCapsFilter:capsfilter5: caps = image/jpeg, width=(int)320, height=(int)240, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstUvcH264Src:src: level-idc = 30
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0.GstPad:src: caps = image/jpeg, width=(int)320, height=(int)240, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstUvcH264Src:src/GstCapsFilter:capsfilter5.GstPad:src: caps = image/jpeg, width=(int)320, height=(int)240, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstCapsFilter:capsfilter5.GstPad:sink: caps = image/jpeg, width=(int)320, height=(int)240, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstUvcH264MjpgDemux:uvch264mjpgdemux0.GstPad:jpeg: caps = image/jpeg, width=(int)320, height=(int)240, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstUvcH264MjpgDemux:uvch264mjpgdemux0.GstPad:sink: caps = image/jpeg, width=(int)320, height=(int)240, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1

(gst-launch-0.10:2171): GStreamer-CRITICAL **: gst_structure_fixate_field_nearest_fraction: assertion `gst_structure_has_field (structure, field_name)' failed
/GstPipeline:pipeline0/GstUvcH264Src:src/GstUvcH264MjpgDemux:uvch264mjpgdemux0.GstPad:h264: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstUvcH264Src:src.GstGhostPad:vidsrc: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstUvcH264Src:src.GstGhostPad:vidsrc.GstProxyPad:proxypad2: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstQueue:queue1.GstPad:sink: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstQueue:queue1.GstPad:src: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z2QAHqx2gKA9/wFEgAAAAwCAAAAeMCAAFuNgAW4973wvCIRqAA\\=\\=\\,aO44sAA\\=\", payload=(int)96, ssrc=(uint)2989274380, clock-base=(uint)2670966928, seqnum-base=(uint)58058
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: timestamp = 2670975093
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: seqnum = 58058
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z2QAHqx2gKA9/wFEgAAAAwCAAAAeMCAAFuNgAW4973wvCIRqAA\\=\\=\\,aO44sAA\\=\", payload=(int)96, ssrc=(uint)2989274380, clock-base=(uint)2670966928, seqnum-base=(uint)58058
/GstPipeline:pipeline0/GstUvcH264Src:src/GstUvcH264MjpgDemux:uvch264mjpgdemux0.GstPad:yuy2: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)320, height=(int)240, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstFFMpegCsp:ffmpegcsp4.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)320, height=(int)240, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstFFMpegCsp:ffmpegcsp4.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)320, height=(int)240, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstUvcH264Src:src.GstGhostPad:vfsrc: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)320, height=(int)240, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstUvcH264Src:src.GstGhostPad:vfsrc.GstProxyPad:proxypad0: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)320, height=(int)240, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)320, height=(int)240, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)320, height=(int)240, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)320, height=(int)240, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)320, height=(int)240, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)320, height=(int)240, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = "preroll   ******* "
/GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = "event   ******* (fakesink0:sink) E (type: 102, GstEventNewsegment, update=(boolean)false, rate=(double)1, applied-rate=(double)1, format=(GstFormat)GST_FORMAT_TIME, start=(gint64)0, stop=(gint64)-1, position=(gint64)0;) 0x1b62360"
/GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = "chain   ******* (fakesink0:sink) (153600 bytes, timestamp: 0:00:00.090732102, duration: 0:00:00.033333333, offset: 0, offset_end: 1, flags: 0 ) 0x1b2e2e0"
...

Playing the RTP stream on another Raspberry Pi:

$ gst-launch-1.0 -v udpsrc port=9078 ! 'application/x-rtp,payload=96,encoding-name=H264' ! queue ! rtph264depay ! h264parse ! decodebin ! autovideosink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, payload=(int)96, encoding-name=(string)H264, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = application/x-rtp, payload=(int)96, encoding-name=(string)H264, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = application/x-rtp, payload=(int)96, encoding-name=(string)H264, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = application/x-rtp, payload=(int)96, encoding-name=(string)H264, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)0164001effe100246764001eac7680a03dff014480000003008000001e30200016e360016e3def7c2f08846a01000468ee38b0
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)0164001effe100246764001eac7680a03dff014480000003008000001e30200016e360016e3def7c2f08846a01000468ee38b0, width=(int)640, height=(int)480, parsed=(boolean)true, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)0164001effe100246764001eac7680a03dff014480000003008000001e30200016e360016e3def7c2f08846a01000468ee38b0, width=(int)640, height=(int)480, parsed=(boolean)true, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)0164001effe100246764001eac7680a03dff014480000003008000001e30200016e360016e3def7c2f08846a01000468ee38b0, width=(int)640, height=(int)480, parsed=(boolean)true, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)0164001effe100246764001eac7680a03dff014480000003008000001e30200016e360016e3def7c2f08846a01000468ee38b0, width=(int)640, height=(int)480, parsed=(boolean)true, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstH264Parse:h264parse1.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)0164001effe100246764001eac7680a03dff014480000003008000001e30200016e360016e3def7c2f08846a01000468ee38b0, width=(int)640, height=(int)480, parsed=(boolean)true, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)0164001effe100246764001eac7680a03dff014480000003008000001e30200016e360016e3def7c2f08846a01000468ee38b0, width=(int)640, height=(int)480, parsed=(boolean)true, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstGhostPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)0164001effe100246764001eac7680a03dff014480000003008000001e30200016e360016e3def7c2f08846a01000468ee38b0, width=(int)640, height=(int)480, parsed=(boolean)true, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)0164001effe100246764001eac7680a03dff014480000003008000001e30200016e360016e3def7c2f08846a01000468ee38b0
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstH264Parse:h264parse1.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, width=(int)640, height=(int)480, parsed=(boolean)true, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, width=(int)640, height=(int)480, parsed=(boolean)true, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, width=(int)640, height=(int)480, parsed=(boolean)true, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, width=(int)640, height=(int)480, parsed=(boolean)true, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, width=(int)640, height=(int)480, parsed=(boolean)true, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, width=(int)640, height=(int)480, parsed=(boolean)true, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstDecodePad:src_0: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink.GstProxyPad:proxypad1: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstEglGlesSink:autovideosink0-actual-sink-eglgles.GstPad:sink: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstDecodePad:src_0.GstProxyPad:proxypad5: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstDecodePad:src_0: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink.GstProxyPad:proxypad1: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstEglGlesSink:autovideosink0-actual-sink-eglgles.GstPad:sink: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstDecodePad:src_0.GstProxyPad:proxypad5: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)0/1
WARNING: from element /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstEglGlesSink:autovideosink0-actual-sink-eglgles: A lot of buffers are being dropped.
Additional debug info:
gstbasesink.c(2683): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstEglGlesSink:autovideosink0-actual-sink-eglgles:
There may be a timestamping problem, or this computer is too slow.
WARNING: from element /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstEglGlesSink:autovideosink0-actual-sink-eglgles: A lot of buffers are being dropped.
The same network stream from the same source was also played back on an i686 PC and ran very smoothly there. On the Raspberry Pi, however, most frames are dropped and I only get about 0.5-1 frames per second. Apparently omxh264dec was plugged in successfully, but the video sink is EGL/GLES (eglglessink) instead of a direct framebuffer sink. Why is that? dfbvideosink is not even compiled into vontaene's package ...
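Two things that might be worth trying against the frame dropping, untested on this setup: put an rtpjitterbuffer in front of the depayloader, and/or turn off clock synchronization on the sink so that late frames are rendered instead of dropped (sync is a standard base-sink property, which autovideosink should forward to the actual sink):

# Jitterbuffer plus a non-synchronizing sink, as a diagnostic
$ gst-launch-1.0 -v udpsrc port=9078 caps='application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96' ! rtpjitterbuffer latency=200 ! rtph264depay ! h264parse ! omxh264dec ! autovideosink sync=false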

vlc seeding an H.264 movie (via RTP/RTSP)

On the i686 PC:

$ wget http://ftp.akl.lt/Video/Big_Buck_Bunny/big_buck_bunny_480p_h264.mov
$ cvlc big_buck_bunny_480p_h264.mov --sout '#rtp{sdp=rtsp://:8080/stream.sdp}' -vv
On the Raspberry:
$ gst-launch-1.0 rtspsrc location=rtsp://192.168.2.112:8080/stream.sdp ! rtph264depay ! h264parse ! omxh264dec ! autovideosink
This also lags and drops some frames, though it is much faster than Raspberry-to-Raspberry streaming. The lag issue is also discussed here. Apparently it is possible to fix it by compiling a recent version of gstreamer; however, there is a version-related problem with this solution.
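One knob worth trying before recompiling, untested here: rtspsrc has a latency property (in milliseconds) that sets the size of its internal jitterbuffer, and lowering it from the default reduces the buffering delay at the cost of less tolerance to network jitter:

# Smaller RTSP jitterbuffer for lower latency
$ gst-launch-1.0 rtspsrc location=rtsp://192.168.2.112:8080/stream.sdp latency=200 ! rtph264depay ! h264parse ! omxh264dec ! autovideosink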

Another possibility would be testing gstreamer-0.10's dfbvideosink, piping the decoded video from gstreamer-1.0's omxh264dec through fdsink/fdsrc ... This doesn't work.

I guess I have to compile gstreamer-1.0/1.1 myself, too.