
Multiple Terradeck Emulation #4

Closed
kassah opened this issue Jan 4, 2021 · 7 comments
Labels
enhancement New feature or request

Comments

@kassah
Contributor

kassah commented Jan 4, 2021

Goal: Accommodate sacrament meetings at the same location at 9:00am, 10:15am, and 12:00pm with separate webcasts.

In order to schedule sacrament meetings close together, my STS has taken to setting up two encoders in the church Webcast platform. I'd like a single Pi to act as the streamer for both configurations, starting and stopping each independently based on separate Terradeck XML configurations.

@kassah kassah added the enhancement New feature or request label Jan 4, 2021
@ChickenDevs
Owner

We have the same issue with multiple wards in the same building but our STS went for a single event that lasts for 5 hours and we just stop the broadcast (either through the cockpit interface or the remote control outlet) between wards. I actually would be very interested in figuring out how to make this work as you suggest but I'm not sure how to handle the logic for multiple autoconfig URLs in a predictable manner.

If we were watching multiple autoconf URLs and the events never overlap it would be pretty simple to just use whichever was active, but what happens if they overlap? (Especially since the webcast system gives us so little granularity in scheduling.)

  • Should one URL be "default" and others used only when it isn't active?
  • Should we switch to a new event whenever it becomes active?
  • Should we switch only when the current event becomes inactive?
  • What other scenarios am I missing?

From an automation perspective, it might actually be easier with three encoders than two.
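One way to make the first option above concrete: a minimal sketch of a "default URL wins" policy (the function name and URLs are hypothetical, nothing here is from the actual plugin):

```python
def pick_url(statuses, default_url):
    """Pick which autoconfig URL to follow.

    statuses: dict mapping autoconfig URL -> True if its event is active.
    Policy (one of the options above): the default URL wins whenever it
    is active; otherwise the first other active URL is used; if nothing
    is active, return None.
    """
    if statuses.get(default_url):
        return default_url
    for url, active in statuses.items():
        if url != default_url and active:
            return url
    return None

# Overlap case: the default wins even though both events are active.
print(pick_url({"http://a/config.xml": True, "http://b/config.xml": True},
               default_url="http://a/config.xml"))
# -> http://a/config.xml
```

Under this policy overlaps are always resolved in favor of the default, which makes behavior predictable but means a non-default event can be preempted mid-broadcast; the "switch only when the current event becomes inactive" option would avoid that at the cost of more state.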

@kassah
Contributor Author

kassah commented Jan 5, 2021

My experiments so far used one stream going from the camera to a v4l2 loopback device, then two streams from the loopback device out to the servers.

Installing the loopback device:

# install V4L2 Loopback device
sudo apt-get install v4l2loopback-utils

# Pull loopback device into kernel
sudo modprobe v4l2loopback

Start streaming from camera to loopback device:

sudo ffmpeg -hide_banner -f v4l2 -thread_queue_size 16 -framerate 25 -video_size 1280x720 -input_format mjpeg -i /dev/video0 -f v4l2 -vcodec copy /dev/video2

Streaming from loopback device to outgoing #1:

sudo ffmpeg -hide_banner -f v4l2 -i /dev/video2 -f alsa -thread_queue_size 1024 -itsoffset 1 -i sysdefault:CARD=Device -map 0:v -map 1:a -vcodec h264 -acodec aac -ar 48000 -ac 2 -preset veryfast -b:v 750000 -framerate 25 -g 50 -filter:a volume=0dB -f flv <rtmp url #1>

Streaming from loopback device to outgoing #2:

sudo ffmpeg -hide_banner -f v4l2 -i /dev/video2 -f alsa -thread_queue_size 1024 -itsoffset 1 -i sysdefault:CARD=Device -map 0:v -map 1:a -vcodec h264 -acodec aac -ar 48000 -ac 2 -preset veryfast -b:v 750000 -framerate 25 -g 50 -filter:a volume=0dB -f flv <rtmp url #2>

But it's not outputting enough frames to keep up with the desired feed (speed is sub 1x) and the CPU is pegged.

@ChickenDevs
Owner

Ah. I hadn't even considered multiple streams as a solution to your problem. We just did something similar for a special stake meeting that needed to be broadcast in both Spanish and English. We split the video stream and used two separate audio streams going to two different encoders (rtmp URLs). Our experience confirms yours. The Pi isn't beefy enough to handle multiple streams so we had to use a laptop for that meeting.

@kassah
Contributor Author

kassah commented Jan 6, 2021

I had an idea last night: shift the mjpeg-to-h264 conversion and bitrate setting to the camera side, then just use -vcodec copy on the outgoing streams.

The downside is that it locks both streams to the same bitrate. However, it might solve the CPU issue, since it no longer has to do the same mjpeg-to-h264 conversion twice on the same video feed.

@kassah
Contributor Author

kassah commented Jan 6, 2021

Okay, I explored this a little, but got stuck on the fact that the stream side doesn't seem to like picking up the h264 stream midway. Maybe @ChickenDevs will have insight?

Camera side:

sudo ffmpeg -hide_banner -f v4l2 -thread_queue_size 16 -framerate 25 -video_size 1280x720 -input_format mjpeg -i /dev/video0 -f v4l2 -vcodec h264 -preset veryfast -b:v 750000 -framerate 25 /dev/video2

Stream side:

sudo ffmpeg -hide_banner -f v4l2 -i /dev/video2 -f alsa -thread_queue_size 1024 -itsoffset 1 -i sysdefault:CARD=Device -map 0:v -map 1:a -vcodec copy -acodec aac -ar 48000 -ac 2 -g 50 -filter:a volume=0dB -analyzeduration 5 -f flv <rtmpUrl>

Camera side output:

webcast@webcast-pi:~$ sudo ffmpeg -hide_banner -f v4l2 -thread_queue_size 16 -framerate 25 -video_size 1280x720 -input_format mjpeg -i /dev/video0 -f v4l2 -vcodec h264 -preset veryfast -b:v 750000 -framerate 25 /dev/video2
Input #0, video4linux2,v4l2, from '/dev/video0':
  Duration: N/A, start: 9963.410399, bitrate: N/A
    Stream #0:0: Video: mjpeg (Baseline), yuvj422p(pc, bt470bg/unknown/unknown), 1280x720, 25 fps, 25 tbr, 1000k tbn, 1000k tbc
Stream mapping:
  Stream #0:0 -> #0:0 (mjpeg (native) -> h264 (libx264))
Press [q] to stop, [?] for help
[libx264 @ 0x1158270] using cpu capabilities: ARMv6 NEON
[libx264 @ 0x1158270] profile High 4:2:2, level 3.1, 4:2:2, 8-bit
Output #0, video4linux2,v4l2, to '/dev/video2':
  Metadata:
    encoder         : Lavf58.45.100
    Stream #0:0: Video: h264 (libx264), yuvj422p(pc), 1280x720, q=-1--1, 750 kb/s, 25 fps, 25 tbn, 25 tbc
    Metadata:
      encoder         : Lavc58.91.100 libx264
    Side data:
      cpb: bitrate max/min/avg: 0/0/750000 buffer size: 0 vbv_delay: N/A
More than 1000 frames duplicatedN/A time=00:01:48.28 bitrate=N/A dup=1000 drop=0 speed=   1x    
frame=13092 fps= 25 q=31.0 size=N/A time=00:08:42.80 bitrate=N/A dup=4740 drop=0 speed=   1x    

Stream side output (with stream url removed):

    Last message repeated 1 times
[h264 @ 0x1aa7e60] decode_slice_header error
[h264 @ 0x1aa7e60] no frame!
[h264 @ 0x1aa7e60] non-existing PPS 0 referenced
    Last message repeated 1 times
[h264 @ 0x1aa7e60] decode_slice_header error
[h264 @ 0x1aa7e60] no frame!
[h264 @ 0x1aa7e60] non-existing PPS 0 referenced
[video4linux2,v4l2 @ 0x1aa6e50] decoding for stream 0 failed
[video4linux2,v4l2 @ 0x1aa6e50] Could not find codec parameters for stream 0 (Video: h264, none, 1280x720): unspecified pixel format
Consider increasing the value for the 'analyzeduration' and 'probesize' options
Input #0, video4linux2,v4l2, from '/dev/video2':
  Duration: N/A, bitrate: N/A
    Stream #0:0: Video: h264, none, 1280x720, 30 fps, 30 tbr, 1000k tbn, 2000k tbc
Guessed Channel Layout for Input Stream #1.0 : stereo
Input #1, alsa, from 'sysdefault:CARD=Device':
  Duration: N/A, start: 1609968163.317558, bitrate: 1536 kb/s
    Stream #1:0: Audio: pcm_s16le, 48000 Hz, stereo, s16, 1536 kb/s
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
  Stream #1:0 -> #0:1 (pcm_s16le (native) -> aac (native))
Press [q] to stop, [?] for help
[alsa @ 0x1abf680] ALSA buffer xrun.
Output #0, flv, to '<streamUrl>':
  Metadata:
    encoder         : Lavf58.45.100
    Stream #0:0: Video: h264 ([7][0][0][0] / 0x0007), none, 1280x720, q=2-31, 30 fps, 30 tbr, 1k tbn, 1000k tbc
    Stream #0:1: Audio: aac (LC) ([10][0][0][0] / 0x000A), 48000 Hz, stereo, fltp, 128 kb/s
    Metadata:
      encoder         : Lavc58.91.100 aac
[video4linux2,v4l2 @ 0x1aa6e50] Thread message queue blocking; consider raising the thread_queue_size option (current value: 8)
[flv @ 0x1b75f90] Timestamps are unset in a packet for stream 0. This is deprecated and will stop working in the future. Fix your code to set the timestamps properly
[flv @ 0x1b75f90] Packet is missing PTS
av_interleaved_write_frame(): Invalid argument
[flv @ 0x1b75f90] Failed to update header with correct duration.
[flv @ 0x1b75f90] Failed to update header with correct filesize.
frame=    1 fps=0.0 q=-1.0 Lsize=       1kB time=00:00:01.00 bitrate=   5.2kbits/s speed=  11x    
video:1800kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
[aac @ 0x1b77ec0] Qavg: 213.780
Conversion failed!
webcast@webcast-pi:~$
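The "non-existing PPS 0 referenced" errors above are what a decoder reports when it joins a raw H.264 stream between keyframes, before any SPS/PPS parameter sets have gone by. A minimal sketch (the helper name and the synthetic bytes are mine, not from this thread) of scanning an Annex-B stream for those parameter-set NAL units:

```python
def nal_types(stream: bytes):
    """Yield the H.264 NAL unit type found after each Annex-B start code.

    The type is the low 5 bits of the first byte after a 00 00 01
    start code: 7 = SPS, 8 = PPS, 5 = IDR slice (keyframe).
    """
    i = 0
    while True:
        i = stream.find(b"\x00\x00\x01", i)
        if i == -1:
            return
        i += 3
        if i < len(stream):
            yield stream[i] & 0x1F

# Synthetic stream: SPS (0x67), PPS (0x68), then an IDR slice (0x65).
sample = (b"\x00\x00\x00\x01\x67"
          b"\x00\x00\x00\x01\x68"
          b"\x00\x00\x01\x65")
print(list(nal_types(sample)))  # -> [7, 8, 5]
```

If the camera-side encoder only emits SPS/PPS once at startup, any reader that attaches to /dev/video2 later never sees them, which matches the failure above.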

@ChickenDevs
Owner

ChickenDevs commented Jan 12, 2021

Have you tried just using ffmpeg's tee muxer directly? I successfully tested two streams at ~55% CPU using the instructions from the ffmpeg wiki.

sudo ffmpeg -hide_banner -f v4l2 -thread_queue_size 16 -framerate 25 -video_size 1280x720 -input_format mjpeg -i /dev/video0 -f alsa -thread_queue_size 1024 -itsoffset 1 -i sysdefault:CARD=Device -map 0:v -map 1:a -vcodec h264 -acodec aac -ar 48000 -ac 2 -preset veryfast -b:v 1250000 -framerate 25 -g 50 -filter:a volume=0dB -f tee "[f=flv]<rtmp_url_1>|[f=flv]<rtmp_url_2>"

If that works in your testing, then it would just be a matter of editing the cockpit plugin and the python script to handle two autoconfig URLs. I would prefer to do that in a separate branch, but I am happy to make the updates.
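For what "handle two autoconfig URLs" might look like in the python script, here is a minimal reconcile-style sketch (all names here are invented for illustration, not taken from the real cockpit plugin or script): each stream starts and stops independently based on which URLs currently report an active event.

```python
from dataclasses import dataclass

@dataclass
class Stream:
    """One broadcast, keyed by its autoconfig URL (hypothetical model)."""
    url: str
    active: bool = False

def reconcile(streams, active_urls):
    """Start/stop each stream independently based on which autoconfig
    URLs report an active event. Returns (started, stopped) URL lists."""
    started, stopped = [], []
    for s in streams:
        should_run = s.url in active_urls
        if should_run and not s.active:
            s.active = True      # here: spawn this stream's ffmpeg process
            started.append(s.url)
        elif not should_run and s.active:
            s.active = False     # here: terminate that process
            stopped.append(s.url)
    return started, stopped

streams = [Stream("http://a/config.xml"), Stream("http://b/config.xml")]
print(reconcile(streams, {"http://a/config.xml"}))
# -> (['http://a/config.xml'], [])
```

Note this only captures the independent start/stop bookkeeping; with a single tee-based ffmpeg process both outputs share one process lifetime, which is exactly the limitation raised in the next comment.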

@kassah
Contributor Author

kassah commented Feb 1, 2021

Sadly, that way it won't connect and disconnect the two broadcasts independently. The good news is that it seems they resolved the issue in the church portal! Yay!

@kassah kassah closed this as completed Feb 1, 2021