Merge pull request #71 from aspnair/master
ion-sfu-rtp example
adwpc committed Jul 11, 2022
2 parents 12e32a5 + 319347a commit 62ecbcc
Showing 4 changed files with 1,099 additions and 0 deletions.
116 changes: 116 additions & 0 deletions example/ion-sfu-rtp/README.md
@@ -0,0 +1,116 @@
# ION-SFU RTP

## Features
This application can take RTP from a remote source and forward RTP received from the SFU to a remote destination.
- The RTP streams can be generated anywhere.
- For simplicity, separate gstreamer scripts are used as the RTP source and destination.
- All communication with gstreamer is via udpsrc and udpsink over loopback, unicast, or multicast.
- Instead of gstreamer, any other RTP source can be used, for example an IP camera or a stream generated by VLC.
- This application can also manage two streams: a first normal stream and a second stream that simulates a presentation screen share.
- Peer RTP streams received from the SFU can be forwarded to a decoding gstreamer pipeline to decode audio and video for both the first stream and the remote presentation screen share stream.
- Every incoming RTP stream and track is analyzed for packet losses. This applies to both the locally generated gstreamer RTP streams and the remote RTP streams forwarded by the SFU.
- All stream and track related statistics are printed periodically (stat_interval in main.go) and on application exit via SIGINT or SIGTERM.
- The application also joins the room and periodically sends messages to all other clients.
- A run script is included so that multiple ion-sfu-rtp client processes can be started to simulate load on the SFU.

## Requirements
- Currently this example has been tested only on Debian GNU/Linux and Ubuntu.

### gstreamer
- Install gstreamer

`sudo apt-get install libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly gstreamer1.0-tools`
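
- As an optional sanity check, `gst-inspect-1.0` (part of gstreamer1.0-tools) can be used to confirm that the encoder elements used by `gst.sh` are installed:

```
gst-inspect-1.0 --version
gst-inspect-1.0 vp8enc
gst-inspect-1.0 x264enc
gst-inspect-1.0 opusenc
```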

### ion
- Install the ion servers as given at
[https://pionion.github.io](https://pionion.github.io/)


### ion-app-web
- Install [ion-app-web](https://github.com/pion/ion-app-web)
- Start the development server using `npm start`
- It is better to have a camera and mic available for the browser client connecting to this server.

## Use cases

### Command line help
- To get the command-line help of the different applications, open a shell and try the following
```
./gst.sh
go run main.go -help
./run.sh
```
- The **default codecs are VP8 and Opus**. To use the H.264 codec, export the **VCODEC** environment variable. This has to be set in each shell before starting the script and the application.
```
export VCODEC=H264
```
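- The variable can also be set for a single invocation with the usual shell prefix form, for example:
```
VCODEC=H264 ./gst.sh -e 1
```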

### Basic gstreamer encode and decode
- As a basic step, make sure that the gstreamer scripts are working by entering the following commands in **two** different shells.

```
./gst.sh -e 2
./gst.sh -d 2 225.0.0.225 21000
```
- Two separate gstreamer video displays should open: the first with the [SMPTE pattern](https://en.wikipedia.org/wiki/SMPTE_color_bars) and the second with a moving ball.
- This also produces an audio [beat](https://en.wikipedia.org/wiki/Beat_(acoustics)) with an audible 1 Hz beat frequency. The beat makes it easy to tell that both audio streams are arriving properly.
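- The same check can also be run over loopback or a unicast address by passing the IP and port explicitly (the values below are just an example; see the usage printed by `./gst.sh`):
```
./gst.sh -e 2 127.0.0.1 21000
./gst.sh -d 2 127.0.0.1 21000
```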

### Basic call with a browser client
- Open a browser client and connect to the ion-app-web npm server. Join the room.
- Open **two** different shells and run the following commands in any order.
```
./gst.sh -e 1
go run main.go -addr "localhost:5551" -session "ion"
```
- Now the browser client should display the SMPTE pattern with a single 1000 Hz tone.
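- To run the same basic call with H.264 instead of the default VP8, export VCODEC in each shell before starting the script and the application, as noted above:
```
# shell 1
export VCODEC=H264
./gst.sh -e 1

# shell 2
export VCODEC=H264
go run main.go -addr "localhost:5551" -session "ion"
```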

### Call with a browser client with ion-sfu-rtp sending screen share also
- Join the room from the browser client
- Open **two** different shells and run the following commands in any order.
```
./gst.sh -e 2
go run main.go -addr "localhost:5551" -session "ion" -nr_stream 2
```
- Now the browser client should display both the SMPTE pattern and the moving ball with the audible beat.
- The screen share stream starts after a delay (delay_stream in main.go)

### Call with a browser client with both ion-sfu-rtp and the browser sending screen share
- Open **three** different shells and run the following commands in any order.
```
./gst.sh -e 2
./gst.sh -d 2
go run main.go -addr "localhost:5551" -session "ion" -nr_stream 2 -rtp_fwd 1
```
- Now join the room from the browser client; it is better to start the browser client after the above commands so that key video frames are not missed.
- Start a screen share presentation from the browser client
- Now the browser client should display both the SMPTE pattern and the moving ball with the audible beat
- As both RTP forward and the gstreamer decode pipeline are started, two separate video displays should open: one for the main camera video from the browser and the second for the screen share.
- Use rtp_fwd with only one client to avoid confusing the gstreamer decode pipeline.

### Multiple clients to load the SFU
- Open **two** different shells and run the following commands.
- For example, to start 5 clients with 2 streams per client and session name ion
```
./gst.sh -e 2
./run.sh ion 5 2
./run.sh stop
```
- It is better to start the gstreamer encode script first.
- The run script spawns multiple ion-sfu-rtp client processes, all of which take the same encoded streams from the multicast ports.
- A browser client can also be opened to monitor all of them.
- The clients can be stopped with the same run script using the stop parameter.
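- The same script arguments (session name, number of clients, streams per client, assuming the order shown above) can be adjusted for different load levels; for example, 10 clients with a single stream each:
```
./gst.sh -e 1
./run.sh ion 10 1
```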

## Frequently Asked Questions (FAQ)
- Why is the gstreamer script provided separately?
  - The ion-sfu-rtp Go application stays simple; it just listens for RTP packets on UDP ports.
  - There is no need to use Cgo at all.
  - Instead of gstreamer, any other RTP generating application or system can be used.
  - The gstreamer script can easily be modified for different scenarios.
  - With the help of the optional gstreamer decode script, bidirectional calls can easily be simulated.
- What is the use of the multicast IP in the gstreamer script and the ion-sfu-rtp main application?
  - By default, both the gstreamer script and the ion-sfu-rtp main application use the 225.0.0.225 multicast group.
  - This can easily be overridden by arguments to both the main application and the gstreamer script; see the example after this list.
  - With the multicast group, the gstreamer script can also run on any other computer.
  - Multiple ion-sfu-rtp clients can also be started on different computers.
  - With multicast, only one audio and video source and encoder is required, for any number of clients.
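
For example, to run the encode script on a second computer and send to a specific unicast address instead of the multicast group (192.168.1.50 and port 21000 below are placeholders for the receiving machine and its first port):

```
./gst.sh -e 2 192.168.1.50 21000
```

With the default multicast group, simply running `./gst.sh -e 2` on another computer on the same LAN should also work, provided multicast traffic is allowed on the network.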

277 changes: 277 additions & 0 deletions example/ion-sfu-rtp/gst.sh
@@ -0,0 +1,277 @@
#!/bin/bash

#set -x
#set -e

tx_port_begin=${TX_PORT_BEGIN:-21000}
rx_port_begin=${RX_PORT_BEGIN:-22000}

# this same default multicast address is used in main.go too
default_ip=${DEFAULT_IP:-225.0.0.225}

# video parameters, width, height, fps etc.
w=${WIDTH:-640}
h=${HEIGHT:-360}
fps=${FPS:-15}
# video bitrate and audio bitrate
vb=${VB:-256000}
ab=${AB:-20000}
vb_kbps=$((vb / 1000))

# use the v4l2-ctl command to find out the formats supported
# v4l2-ctl -d /dev/video0 --list-formats-ext
v4l2_device=${V4L2_DEVICE:-/dev/video0}

gst='gst-launch-1.0'
#gst="${gst} -v"

sigint()
{
echo "Received SIGINT"
sleep 1
ps -f | grep ${gst}
ps -f | grep ${gst} | awk '{print $2}' | xargs kill
}

sigterm()
{
echo "Received SIGTERM"
sleep 1
ps -f | grep ${gst}
ps -f | grep ${gst} | awk '{print $2}' | xargs kill
}


# pattern 0 smpte, 18 moving ball
video_tx_vp8()
{
echo "video tx vp8"
port=${1}
pattern=${2}
if [ "${SRC}" == "v4l2" ];
then
videosrc="v4l2src device=${v4l2_device}"
capsrc="video/x-raw,format=YUY2,width=${w},height=${h},framerate=${fps}/1"
else
videosrc="videotestsrc pattern=${pattern}"
capsrc="video/x-raw,format=I420,width=${w},height=${h},framerate=${fps}/1"
fi
${gst} \
${videosrc} ! ${capsrc} ! \
videoscale ! videorate ! videoconvert ! \
timeoverlay halignment=center valignment=center ! \
vp8enc error-resilient=partitions keyframe-max-dist=10 auto-alt-ref=true cpu-used=5 deadline=1 target-bitrate=${vb} ! \
rtpvp8pay pt=120 ! udpsink host=${remote_ip} port=${port} &
}

video_tx_h264() {
echo "video tx h264"
port=${1}
pattern=${2}
if [ "${SRC}" == "v4l2" ];
then
videosrc="v4l2src device=${v4l2_device}"
capsrc="video/x-raw,format=YUY2,width=${w},height=${h},framerate=${fps}/1"
else
videosrc="videotestsrc pattern=${pattern}"
capsrc="video/x-raw,format=I420,width=${w},height=${h},framerate=${fps}/1"
fi
${gst} \
${videosrc} ! ${capsrc} ! \
videoscale ! videorate ! videoconvert ! \
timeoverlay halignment=center valignment=center ! \
x264enc bframes=0 cabac=0 dct8x8=0 speed-preset=ultrafast tune=zerolatency key-int-max=20 bitrate=${vb_kbps} ! video/x-h264,stream-format=byte-stream ! \
rtph264pay pt=126 ! udpsink host=${remote_ip} port=${port} &
}

# wave 0 sine, 8 ticks
# when multiple streams are used, select frequency very near so that
# audio beats are there
# this is a way of distinguishing different streams easily.
audio_tx_opus()
{
echo "audio tx opus"
port=${1}
wave=${2}
freq=${3}
${gst} \
audiotestsrc wave=${wave} freq=${freq} ! audioresample ! audio/x-raw,channels=1,rate=48000 ! \
opusenc bitrate=${ab} ! rtpopuspay pt=109 ! udpsink host=${remote_ip} port=${port} &
}

video_rx_vp8()
{
echo "video rx vp8"
port=${1}
${gst} \
udpsrc address=${listen_ip} port=${port} \
caps='application/x-rtp, media=(string)video, clock-rate=(int)90000' ! \
rtpvp8depay ! vp8dec ! \
videoconvert ! autovideosink &
}

video_rx_h264()
{
echo "video rx h264"
port=${1}
#decoder=decodebin
decoder=openh264dec
${gst} \
udpsrc address=${listen_ip} port=${port} \
caps='application/x-rtp, media=(string)video, clock-rate=(int)90000' ! \
rtph264depay ! h264parse ! ${decoder} ! \
videoconvert ! autovideosink &
}

audio_rx_opus()
{
echo "audio rx opus"
port=${1}
${gst} \
udpsrc address=${listen_ip} port=${port} \
caps="application/x-rtp, media=(string)audio" ! \
rtpopusdepay ! opusdec ! autoaudiosink &
}

# UDP port layout relative to the tx/rx base ports:
# stream 1: audio = base+0, video = base+2; stream 2: audio = base+4, video = base+6
set_udp_port()
{
let audio_tx_port1=${tx_port_begin}
let video_tx_port1=${tx_port_begin}+2
let audio_rx_port1=${rx_port_begin}
let video_rx_port1=${rx_port_begin}+2

let audio_tx_port2=${tx_port_begin}+4
let video_tx_port2=${tx_port_begin}+6
let audio_rx_port2=${rx_port_begin}+4
let video_rx_port2=${rx_port_begin}+6
}

print_info() {
echo ""
echo "mod = ${mod}"
echo "nr_stream = ${nr_stream}"
echo "remote_ip = ${remote_ip}"
echo "listen_ip = ${listen_ip}"
echo "tx_port_begin = ${tx_port_begin}"
echo "rx_port_begin = ${rx_port_begin}"
echo "audio_tx_port1 = ${audio_tx_port1}"
echo "video_tx_port1 = ${video_tx_port1}"
echo "audio_rx_port1 = ${audio_rx_port1}"
echo "video_rx_port1 = ${video_rx_port1}"
echo "audio_tx_port2 = ${audio_tx_port2}"
echo "video_tx_port2 = ${video_tx_port2}"
echo "audio_rx_port2 = ${audio_rx_port2}"
echo "video_rx_port2 = ${video_rx_port2}"
echo ""
}

usage()
{
echo
echo "$0 <-d|-e> <nr_stream 1|2> [remote_ip|listen_ip] [port]"
echo
echo "-d = decode mode"
echo "-e = encode mode"
echo "remote_ip, listen_ip optional, default multicast ${default_ip}"
echo "port is either remote send port or local listen port"
echo "port must be preceded with ip argument"
echo
echo "To use H.264 codec for both decode and encode, use the environment"
echo "export VCODEC=H264"
echo "To use v4l2 camera instead of videotestsrc use the environment"
echo "v4l2 works only with 1 stream"
echo "export SRC=v4l2"
echo "The default video codec is VP8"
echo
}

if [ "$#" -lt "2" ];
then
usage
exit 2
fi

mod=${1}
nr_stream=${2}
shift 2
if [ "$#" -ne "0" ] && [ "$#" -ne "2" ];
then
echo "Both IP and Port must be specified"
usage
exit 2
fi
ip=${1}
port=${2}
if [ "${ip}" == "" ];
then
remote_ip=${default_ip}
listen_ip=${default_ip}
else
remote_ip=${ip}
listen_ip=${ip}
fi
if [ "${port}" != "" ];
then
tx_port_begin=${port}
rx_port_begin=${port}
fi
set_udp_port
print_info
sleep 1

if [ "${VCODEC}" == "H264" ];
then
video_tx=video_tx_h264
video_rx=video_rx_h264
else
video_tx=video_tx_vp8
video_rx=video_rx_vp8
fi
audio_tx=audio_tx_opus
audio_rx=audio_rx_opus

trap 'sigint' INT
trap 'sigterm' TERM

case "${mod}" in
-e)
${audio_tx} ${audio_tx_port1} 0 1000
${video_tx} ${video_tx_port1} 0
if [ "${nr_stream}" -eq 2 ];
then
${audio_tx} ${audio_tx_port2} 0 1001
${video_tx} ${video_tx_port2} 18
fi
;;
-d)
${audio_rx} ${audio_rx_port1}
${video_rx} ${video_rx_port1}
if [ "${nr_stream}" -eq 2 ];
then
${audio_rx} ${audio_rx_port2}
${video_rx} ${video_rx_port2}
fi
;;
*)
usage
exit 2
;;
esac

sleep 1
echo ""
ps -eao pid,cmd | grep $0 | grep $$
childpid=$(jobs -p)
for pid in ${childpid};
do
ps -eao pid,cmd | grep ${gst} | grep ${pid}
done
echo ""

wait

echo "exiting"
echo "completed"

exit 0
