
Using OpenCV with EZ-WifiBroadcast #71

Open
tictag opened this issue Dec 28, 2017 · 127 comments

@tictag

tictag commented Dec 28, 2017

Hello Rodizio,

Just got my first video stream to work [RPi Zero W with ZeroCam to RPi3 using 2 x RT5370-based nano USB adapters] and it looks great! :) Really fantastic work (from you and Befinitiv)!!

My query does not need a comprehensive answer; I don't mind doing all the digging to get something working, but I don't want to waste my time investigating if my use case simply isn't an option. I plan to take two separate video sources connected to an RPi, do some rudimentary image processing, then merge them together using OpenCV ... and here's the query ... could I then use EZ-WifiBroadcast to transmit that composite video stream to a receiver?

I've read every page of your wiki and everything revolves around broadcasting a source video from Tx to Rx. Basically, can my source video actually be an output stream from OpenCV?

I don't mind putting the hours in to investigate/troubleshoot etc., but not if this use case is simply not doable.

I would appreciate your thoughts.

Oh, and if you don't get back to me before new year, Happy New year!! :)

@rodizio1
Owner

The wifibroadcast tx (as well as my further developed tx_rawsock) basically just send out what they receive on stdin, so if you can pipe your OpenCV output into it somehow it should work.
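For example, anything that writes an h264 stream to stdout can feed it through a pipe; a minimal sketch, reusing the parameter placeholders the image's .profile uses later in this thread (your_h264_source is a hypothetical stand-in):

your_h264_source -o - | nice -n -9 /root/wifibroadcast/tx_rawsock -p 0 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS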

@tictag
Author

tictag commented Jan 2, 2018

Ahhh, so it isn't just 'hard wired' to transmit the camera stream, anything 'piped' (I'm not 100% sure what that means just yet) into tx_rawsock should be transmitted. That sounds brilliant. I've just read tbeauchant's post re an IP cam and s/he is using gstreamer ...

I then edited the .profile file to pipe the output h264 stream from gstreamer into tx_rawsock

Looks like I have my starting point :) Do you have any further docs around tx_rawsock?

Many thanks for all your (and Befinitiv's) hard work on this. Seriously, there is no way I'd ever be able to complete this project without you.

@ivanskj

ivanskj commented Jan 2, 2018

I am piping a gstreamer pipeline from a Sony QX1 MJPEG liveview URL to tx: souphttpsrc -> h264enc. I can confirm that it works, but I still have to figure out bitrates and all that stuff to get more range. There is also some more latency on the stream. I will try tx_rawsock.

@geofrancis

geofrancis commented Jan 2, 2018

I would be very interested in using a Sony camera with wifibroadcast; currently I am having to use a B101 HDMI converter. If you could switch between it and the Pi camera it would be awesome.

@tictag
Author

tictag commented Jan 4, 2018

rodizio1,

Whilst I understand the purpose of EZ-WiFiBroadcast is to simplify setup, i.e. just download a couple of SD card images and you're good to go, I'm going to need to install a whole lot of software to complete my project, e.g. OpenCV, NumPy, Python etc. Is there a 'manual install' doc available? That is, a doc that outlines the steps required to essentially create a working image from, for example, a vanilla Raspbian distribution like Raspbian Stretch Lite?

@careyer
Contributor

careyer commented Jan 4, 2018

Maybe the easier way to go is to find a way to boot the EZ-WifiBroadcast image so that it does not autostart all the RX, TX, OSD and Telemetry processes but welcomes you with a login prompt. Connecting the ethernet port to your router would allow for installing all the necessary packages then.

@rodizio1: is there a best practice for booting the image so that it does not enter transmission/reception mode automatically? I tried variant No. 1 from https://github.com/bortek/EZ-WifiBroadcast/wiki/Logging-into-the-linux-console but it was kind of problematic, since there were overlays from the OSD all over the place and my nano editor was in the background ;-)
However, No. 2 might work just fine... I haven't tried it yet. @tictag: Give it a try! ;-)

@tictag
Author

tictag commented Jan 8, 2018

The second method simply provides you with a remote command prompt much in the same way as CTRL-C does locally.

It would certainly be better for me to have manual install instructions. I've been trying to install and configure stuff on top of the image but keep running into problems: bash commands that are not present, make commands (e.g. numpy) that appear to run/compile forever (48 hours before I quit, when it should be 3-4 hours). And of course I'm scared to do any kind of apt-get update/upgrade for fear of modified drivers/firmware being overwritten.

Whilst I do certainly believe that some would prefer the EZ (image-based) installation, for others this might cause more problems than it solves.

It would be great to have a manual install method.

@rodizio1
Owner

rodizio1 commented Jan 8, 2018

There are no instructions, as I just made changes to the image and did not document them apart from the changelog. Instructions on how to make the changes would need to be constantly updated as Raspbian is constantly changing; I have neither the time nor the motivation for that. Finding out what has changed is quite easy: simply take the default Raspbian Lite image and compare it with my image. "meld", for example, is a good tool for that.
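A sketch of one way to do that comparison; the kpartx mappings and mount points are illustrative and will vary:

sudo kpartx -av raspbian-lite.img       # maps the image's partitions, e.g. /dev/mapper/loop0p1, loop0p2
sudo kpartx -av ez-wifibroadcast.img    # e.g. /dev/mapper/loop1p1, loop1p2
sudo mount /dev/mapper/loop0p2 /mnt/a   # root filesystem of stock Raspbian
sudo mount /dev/mapper/loop1p2 /mnt/b   # root filesystem of the EZ-WifiBroadcast image
meld /mnt/a /mnt/b                      # side-by-side directory comparison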

In the long run, I want to get away from Raspbian and use something smaller and more manageable, like buildroot.

In general:

  • Do not run apt-get upgrade; it'll probably just overwrite a lot of stuff and break everything
  • Do not upgrade the kernel; chances are you'll run into issues
  • If you want to make changes to the kernel, build the one that is included in the image (instructions and patches/config are in the /kernel/ directory of the GitHub repository)
  • Do not change the Raspberry firmware or Atheros firmware
  • Installing software packages using apt-get is possible. Just make sure no existing packages are upgraded (it tells you beforehand). If you cannot get around upgrading packages, check first which ones are affected so that you can act accordingly (like making backups of config files that may get overwritten). Also make sure you don't accidentally install anything that messes with network interfaces, like network-manager, or anything that installs daemons that may cause issues by doing stuff in the background
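One way to preview what an install would touch before committing to it (the package name here is just an example):

apt-get -s install gstreamer1.0   # -s / --simulate lists what would be installed or upgraded without doing it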

Regarding logging in: Hmm, the OSD still being in front of the console when logging in locally is indeed not so nice. I've changed the profile script so that it automatically does a "killall osd" to quit the OSD after logging in.

@tictag
Author

tictag commented Jan 9, 2018

Wow, so this has been a voyage of discovery!! I have now managed to get python3.4.2, numpy1.8.2, gstreamer1.0 and opencv3.4.0 all installed and working simultaneously with EZ-WiFiBroadcast v1.6RC3 on a Raspberry Pi 3. This has required me to: resize file systems, find random files, compile code(!), edit configuration files and solve many other problems along the way but ... at least it works!

I'm working on a fully documented bash script to automate this whole thing and I'll upload once I've tested it for others to use should they wish to.

rodizio1, thank you for your candour; I totally get it. As I am eventually going to be writing an Instructable for my project, I will want to start off with a vanilla Raspbian image; that is, I am going to try to use that "meld" tool. What was the exact version of the original Raspbian source image?

@tictag
Author

tictag commented Jan 9, 2018

...p.s. I only put the (!) after 'compile code' because this is the first time I have ever compiled code. Yep, that's how much of a layman I am!

Now that I have everything I need, I'm gonna have to stop wifibroadcast from snagging the hardware (i.e. camera) and instead have OpenCV do this. OpenCV then needs to process the video streams before piping them out through gstreamer to the wifibroadcast tx_rawsock device.

Why does it sound so easy when it's just words? ;)

@lgangitano

@tictag I'm interested in your solution, since I'm experimenting with the same setup (video feed multicasted to OpenCV and EZ-Wifibroadcast) for realtime object tracking and streaming. Would you share your results performance-wise on Pi3?

@tictag
Author

tictag commented Jan 12, 2018

Of course, happy to. I'm just at the point where I'm adding my thermal stream, so tomorrow I'll probably be looking at piping the streams into wifibroadcast, as opposed to it capturing the stream itself. On the receive side, I'll be extracting the two streams from wifibroadcast and piping them into OpenCV for further processing.

...and I have no idea how to do this yet!! Don't let me sound like I know what I'm doing! ;)

@tictag
Author

tictag commented Jan 15, 2018

rodizio1, thank you for your candour; I totally get it. As I am eventually going to be writing an Instructable for my project, I will want to start off with a vanilla Raspbian image; that is, I am going to try to use that "meld" tool. What was the exact version of the original Raspbian source image?

Bump...

@rodizio1
Owner

rodizio1 commented Jan 15, 2018

Sorry, I never wrote that down (and in retrospect I found out that there is no version number or similar inside the Raspbian images ...)

What I remember is that version 1.0 was released around the 15th of May 2016 and used kernel 4.4.9 or 4.4.11, so it must be a Raspbian release from around that time with that kernel.

You can find the old Raspbian releases and changelogs here:
http://downloads.raspberrypi.org/raspbian/images/
http://downloads.raspberrypi.org/raspbian/release_notes.txt

@careyer
Contributor

careyer commented Jan 16, 2018

@tictag : I am very interested in the bash script that you created to automate the installation of additional components on EZ-WifiBroadcast 1.6RC3.

I'm working on a fully documented bash script to automate this whole thing and I'll upload once I've tested it for others to use should they wish to.

In my use case I need to install the following components:

Being a total Linux noob, this might help me get started with a bit less trouble.
Thank you very much in advance!

BTW: Patrick Duffy from DIY Drones (http://diydrones.com/profiles/blogs/thermal-imaging-for-your-drone-on-a-budget) sent me a ready-2-run image demoing the integration of the FlirOne with the Raspberry. It works flawlessly. However, it does not build on WiFiBroadcast but on normal WiFi streaming via Gstreamer and fixed IPs over the local WLAN. It also supports transmission of multiple video streams (thermal image, HD video from the Flir & RaspiCam video) - i.e. I am also following your progress in #76

@tictag
Author

tictag commented Jan 16, 2018

Happy to help a fellow noob! If you only need gstreamer then my script probably won't help so much. Tbh, the most complicated thing (for me) has been compiling OpenCV. Mind you, it should help with compiling the kernel. Definitely we can work together on this :) (... blind leading the blind ;)

@careyer
Contributor

careyer commented Jan 16, 2018

@tictag : That is good news! I will try to install gstreamer first (I suppose the way to go is to do it the regular way with apt-get?) and then get back to you. I think the FlirOne device driver needs to be compiled as well :-|. Am I correct that you are using this driver for your project as well? Last (and probably most complicated) will be recompiling the kernel in order to get v4l2loopback support added. For this I definitely need some help.

BTW: Here is a screenshot of what I was able to achieve yesterday evening with the Patrick Duffy image. The frame rate was surprisingly good - I believe it was definitely more than the regular 8-9fps. It felt more like 15-20fps, which I was positively surprised about:

@tictag
Author

tictag commented Jan 16, 2018

Looking hot! Ahem, sorry...

Yes, gstreamer just via apt-get, though I did have a few issues installing:

  • Missing packages during installs
  • Running out of disk space installing most things
  • 'Fuse' directory not found during gstreamer install

...to resolve

# Missing packages during installs
apt-get update  # do not 'upgrade'

# Running out of disk space installing most things
nano /etc/fstab  # for the /tmp device (ramdisk used as temp scratchdisk), change size=50M, CTRL-X, Y to save, then reboot
resize2fs /dev/mmcblk0p2  # this is the 2nd SD card partition, mounted as /dev/root (the root filesystem)
df -h  # make sure /dev/root is the same size as your partition and there's plenty of space 'avail'

# 'Fuse' directory not found during gstreamer install
mkdir /dev/fuse
chmod 777 /dev/fuse
apt-get install fuse
apt-get install gstreamer1.0

That should get you gstreamer installed.
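A quick sanity check after the install (assuming the suite pulled in the usual gst tools):

gst-inspect-1.0 --version     # prints the installed GStreamer version
gst-inspect-1.0 omxh264enc    # confirms the Pi's hardware H.264 encoder element is available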

@tictag
Author

tictag commented Jan 16, 2018

...oh, forgot to answer your question: no, I'm not using the FLIROne. I'm just using the FLIR Lepton OEM module on a breakout board, connected to an RPi Zero via SPI (Serial Peripheral Interface). I'm using pylepton to control it, OpenCV for image processing and gstreamer to stream it via wifibroadcast's tx_rawsock.

Does the FLIROne connect via USB?

@careyer
Contributor

careyer commented Jan 16, 2018

Yes, the FlirOne connects via USB and features both a thermal camera and an HD cam for visible light. Both video streams can be accessed with the Linux driver.

Update:

  • file system resized
  • fuse & gstreamer1.0 successfully installed :-)

@rodizio1
Owner

Say, what are you guys installing there exactly? Gstreamer is already installed (used for encapsulating the raw h264 video for Mission Planner) and there is no package called "gstreamer1.0".

@careyer
Contributor

careyer commented Jan 16, 2018

Hi rodizio... We are trying to get some FLIR thermal cameras working with EZ-Wifibroadcast. In order to make the Flir cameras work we need a very complicated Gstreamer pipeline with modules from Gstreamer good, bad and ugly... that is why we need the full-blown Gstreamer suite.

apt-get update #(not upgrade)
apt-get install gstreamer1.0

installs the whole suite and works flawlessly as far as I can tell. It installs all the necessary additional modules.

I also succeeded in installing the FlirOne USB driver (flir8p1-gpl); however, now I am stuck installing the v4l2loopback kernel module. :-(

I did:

sudo apt-get install linux-headers-rpi
sudo wget https://raw.githubusercontent.com/notro/rpi-source/master/rpi-source -O /usr/bin/rpi-source && sudo chmod +x /usr/bin/rpi-source && /usr/bin/rpi-source -q --tag-update
rpi-source
sudo apt-get install bc
rpi-source

git clone https://github.com/umlaeute/v4l2loopback
cd v4l2loopback
make
make install

Everything runs just fine without any errors, but the kernel module v4l2loopback won't show up:

root@wifibroadcast(rw):~/v4l2loopback# sudo modprobe v4l2loopback
modprobe: FATAL: Module v4l2loopback not found.

any ideas?

@careyer
Contributor

careyer commented Jan 17, 2018

@rodizio1 : I think I now know what the problem is with compiling the v4l2loopback kernel module.

The commands above load the kernel sources (or headers) for 4.9.35-v7+, whereas EZ-WifiBroadcast v1.6RC3 runs on 4.9.35-v7 (non-+). Maybe that is why the kernel module compiles okay but does not get installed properly?

Can you please advise me how to install the correct kernel sources/headers for 4.9.35-v7 (and how exactly to apply the patches you made)? I took a look at https://github.com/bortek/EZ-WifiBroadcast/tree/master/kernel/readme.md but I do not completely understand how to do it. You know... me just being a Linux noob!

I suppose I do not have to re-compile the whole kernel, only the kernel module. But I do not get how to do that correctly. :-(

Thank you so much! Your help is very much appreciated.

@careyer
Contributor

careyer commented Jan 17, 2018

Okay, success!
I finally solved the problem...
rpi-source always adds a '+' to the release number of kernels downloaded from GitHub, to indicate that it is a local copy. The make process of the v4l2loopback kernel module copes just fine with that, but unfortunately modprobe gets into serious trouble if the kernel release string differs even slightly: it simply won't start, and make install will also copy the compiled module to a wrong destination... GRRRR!! I had to manually alter the release number string all over the filesystem (hidden files, config files... god knows where I found this damn attached '+'). Now it works!
FlirOne is now functional in the EZ-WifiBroadcast image 1.6RC3... Now I just have to figure out how to pipe its output to the tx pipeline.
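A sketch of how such leftover '+' strings can be hunted down; the paths are illustrative, and the stragglers may sit elsewhere:

uname -r                                                  # the running kernel release, e.g. 4.9.35-v7
grep -r "4.9.35-v7+" /lib/modules /usr/src 2>/dev/null    # list files still carrying the '+' suffix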

Cheers!

@careyer
Contributor

careyer commented Jan 21, 2018

@rodizio1
CC: @ivanskj / @tictag / @tbeauchant
I was finally able to get all the prerequisites for operating the FlirOne thermal camera right. The driver is installed and linked to the /dev/video3 device on the system. I also have a working gstreamer pipeline. However, as of now this gets streamed via a standard WiFi connection (UDP via a separate NIC) and not via WifiBroadcast.

I do not fully understand yet how I can pipe my Gstreamer output to WifiBroadcast and have it transferred. I found the tx_rawsock call in .profile but I do not understand all the parameters.
Can you please help me make that happen and guide me a little on what needs to be changed and where (on Air and Ground side respectively)? That would be AWESOME! 👍

My Gstreamer pipeline looks like this now:
gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw,width=160,height=128, framerate=1/10 ! rtpvrawpay ! udpsink host=192.168.0.100 port=9002

I feel the LowBudget DIY Thermal Drone is only 1 step away. ;-)
Cheers & Thank you!

@tictag
Author

tictag commented Jan 21, 2018

I've yet to figure this out myself but I do have a novice understanding of 'piping' in Linux. Piping is where you take the output of one process and feed it to the input of another; ever see that cult film 'The Human Centipede'? So,

ls -p > directory.txt

... would take the directory listing and send it into a file instead of printing it to the console (strictly speaking that's redirection; an actual pipe, '|', feeds the output into another program instead).

I've seen examples on the t'interweb where the output of raspivid (the default application used for pi cameras) is then 'piped' into the tx_rawsock application, thus:

raspivid <parameters> | tx_rawsock <parameters>

I'm not quite sure yet how to do this with OpenCV, but as I'll be doing the image processing in Python, I'll likely be using the cv2.VideoWriter() method, piped into gstreamer, which in turn pipes into tx_rawsock.

Btw, if you have seen 'The Human Centipede', I apologise for associating that imagery with Linux piping!! ;)
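For what it's worth, a sketch of what that chain could look like on the AirPi; the script name, frame size and the fdsrc/rawvideoparse elements are assumptions, not a tested pipeline:

# opencv_process.py is assumed to write raw BGR frames to stdout
python3 opencv_process.py | gst-launch-1.0 fdsrc fd=0 \
    ! rawvideoparse width=640 height=480 format=bgr framerate=25/1 \
    ! videoconvert ! omxh264enc control-rate=1 target-bitrate=600000 \
    ! h264parse config-interval=3 ! fdsink fd=1 \
    | /root/wifibroadcast/tx_rawsock -p 0 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS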

@ivanskj

ivanskj commented Jan 21, 2018 via email

@careyer
Contributor

careyer commented Jan 22, 2018

Thank you! I am familiar (as much as a Linux noob like me can be) with the concept of piping in Linux. However, I am struggling a little bit with the actual combination of things - to be precise:

  • I have no idea what kind of sink I have to specify for gstreamer in order to pipe it into tx_rawsock (none? / stdout? / ... something else?)
  • I have no idea about all the tx_rawsock parameters! @rodizio1 maybe you can shed some light on them?
  • I don't understand why the video is piped to tx_rawsock twice (line 688 + line 717)
  • I do not understand the mechanics of how the video stream is displayed on the GroundPi (line 836ff). Where do I put my gstreamer pipeline here?

I think that I need to change the following line(s) in /root/.profile
Transmitting the video at the AirPi:

Line 688: nice -n -9 raspivid -w $WIDTH -h $HEIGHT -fps $FPS -b 3000000 -g $KEYFRAMERATE -t 0 $EXTRAPARAMS -ae 40,0x00,0x8080FF -a "\n\nunder-voltage or over-temperature on TX!" -o - | nice -n -9 /root/wifibroadcast/tx_rawsock -p 0 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS

Line 717: nice -n -9 raspivid -w $WIDTH -h $HEIGHT -fps $FPS -b $BITRATE -g $KEYFRAMERATE -t 0 $EXTRAPARAMS -a "$ANNOTATION" -ae 22 -o - | nice -n -9 /root/wifibroadcast/tx_rawsock -p 0 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS

Receiving and displaying the video at the GroundPi:

Line 836ff: tmessage "Starting RX ... (FEC: $VIDEO_BLOCKS/$VIDEO_FECS/$VIDEO_BLOCKLENGTH)"
ionice -c 1 -n 3 /root/wifibroadcast/rx -p 0 -d 1 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH $NICS | ionice -c 1 -n 4 nice -n -10 tee >(ionice -c 1 -n 4 nice -n -10 /root/wifibroadcast_misc/ftee /root/videofifo2 > /dev/null 2>&1) >(ionice -c 1 nice -n -10 /root/wifibroadcast_misc/ftee /root/videofifo4 > /dev/null 2>&1) >(ionice -c 3 nice /root/wifibroadcast_misc/ftee /root/videofifo3 > /dev/null 2>&1) | ionice -c 1 -n 4 nice -n -10 /root/wifibroadcast_misc/ftee /root/videofifo1 > /dev/null 2>&1
RX_EXITSTATUS=${PIPESTATUS[0]}
check_exitstatus $RX_EXITSTATUS
ps -ef | nice grep "$DISPLAY_PROGRAM" | nice grep -v grep | awk '{print $2}' | xargs kill -9
ps -ef | nice grep "rx -p 0" | nice grep -v grep | awk '{print $2}' | xargs kill -9
ps -ef | nice grep "ftee /root/videofifo" | nice grep -v grep | awk '{print $2}' | xargs kill -9
ps -ef | nice grep "cat /root/videofifo" | nice grep -v grep | awk '{print $2}' | xargs kill -9
done

Any help or recommendations would be greatly appreciated.

PS: Right now my TX pipeline is:

gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw,width=160,height=128, framerate=1/10 ! rtpvrawpay ! udpsink host=192.168.0.100 port=9002

Any my RX pipeline is:

gst-launch-1.0 udpsrc port=9002 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)RAW, sampling=(string) YCbCr-4:2:0, depth=(string)8, width=(string)160, height=(string)128, colorimetry=(string)BT601-5, payload=(int) 96" ! rtpvrawdepay ! autovideosink

@tbeauchant

@tictag @careyer

Piping is the process of grabbing the stdout of one program and feeding it into the stdin of another. In the .profile script this is what is achieved on lines 688 & 717:
raspivid ... | tx_rawsock

Raspivid outputs the h264 data to its stdout; this data is grabbed and sent to the stdin of tx_rawsock, which in turn does its magic to send it through the wifi.

If you'd like to achieve the same thing with gstreamer, you'll have to replace the raspivid part of the command with your gstreamer pipeline and make sure gstreamer outputs to stdout. This can be achieved by using the following sink:
gstreamer .... ! filesink location=/dev/stdout
Be aware that the video stream is not the only data that gstreamer will output on stdout. Every line displayed in the console when you run the software is also on stdout, so depending on the situation, this may or may not cause issues in the datastream.

In order to get rid of this issue, I have created a fifo using mkfifo and modified tx_rawsock to use it (look in the source code: there is a line where it opens /dev/stdin; replace it with the name of the fifo file you created). Then in gstreamer simply output the data to the fifo: filesink location=/path/to/my/fifo

A fifo is simply a special type of file on the filesystem that acts as a buffer.
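A sketch of that fifo approach on the shell side, reusing pipeline elements that appear elsewhere in this thread; the fifo path is an example, and tx_rawsock must be rebuilt as described above:

mkfifo /root/videofifo_tx    # create the named pipe once
# writer: gstreamer pushes h264 into the fifo instead of stdout
gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw,width=160,height=128,framerate=10/1 \
    ! omxh264enc control-rate=1 target-bitrate=600000 ! h264parse config-interval=3 \
    ! filesink location=/root/videofifo_tx &
# reader: tx_rawsock, rebuilt to open /root/videofifo_tx instead of /dev/stdin
/root/wifibroadcast/tx_rawsock -p 0 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS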

Also, the reason tx_rawsock is used twice in the .profile is to output different text messages on the display depending on whether the script has detected some issues or not. Usually for testing I disable both these lines and run the command line myself from the terminal.

Good luck,
Theo

@pullenrc

pullenrc commented Sep 29, 2018

Hello again,
@careyer I am trying to send an opencv stream to the ground pi; I am trying to do all of my processing on the air side. I have never compiled a kernel before, so it seems a little daunting. If I am understanding the flow properly (please correct me if I am wrong): the kernel needs to be recompiled with the v4l2 loopback driver, which makes it possible to use the hardware encoder on the air pi to encode to h264, which then gets piped back to the ground pi and displayed as if it were a raspicam? Would you be able to help me out with an outline, so I can set out trying to accomplish this?
I have opencv installed on the air pi; however, I have only tried to send video from /dev/video0, to no avail. I don't get signal strength (on the OSD) indicating a connection until I go back to the original code.

I am using 1.6RC6. I tried commenting out line 770 (maybe off, but close) where the raspicam is being piped to tx_rawsock, and I commented out the v4l lines below, and couldn't get a picture or RSSI on the OSD.
I feel like I am getting close; any help would be greatly appreciated.

@careyer
Contributor

careyer commented Oct 1, 2018

@pullenrc : No, you do not need to recompile the kernel. You just need to compile the v4l2loopback kernel module. For that you need to download the correct kernel sources and headers. (Beware of the "+" kernel string problem, see above.)

Not sure though: what do you need the v4l2loopback device for? You say you already have a video stream in opencv... why do you need the v4l2loopback devices then? Afaik v4l2loopback is for creating "virtual video devices". Guess you don't need that since you have your stream already in opencv?
However, when it comes to sending a video stream via WBC you need to "frame" it somehow, so that the GroundPi can determine the start and end of a frame correctly, i.e. you have to encode the datastream in a way that breaks it up into detectable video frames. The simplest thing to do is to encode the video into a streaming format such as H264, which will take care of all that. This is where the hardware H.264 encoder comes in handy. =D

@JRicardoPC

JRicardoPC commented Oct 3, 2018

Finally I can launch modprobe; I found out how in an issue opened by you (@careyer) on rpi-source's GitHub. For now I made this attempt on a Pi3, but now I need to do the same on a Pi Zero, and I have an issue because the kernel is different (again -.-). I use the same SD card, but the Pi3 uses kernel 4.9.35-v7 and the Pi Zero uses kernel 4.9.35.

Update: I downloaded the kernel from this GitHub (4.9.35), compiled and installed it, but v4l2loopback has the same problem and I can't do a make.

@kwatkins

kwatkins commented Oct 6, 2018

@careyer I got everything working, well, all except the streaming with tx_rawsock using gstreamer with the Flir One. It looks like on the TX side all is well, flirone is reading/sending data, etc., and I'm using the last bash .profile you linked above. I'm not seeing any HDMI out on the RX side though. Do you have a latest version you can share?

@JRicardoPC I just took out the code that does the '+' append from rpi-source, although I haven't done this on the RX side yet; I didn't think that side needed to be modified for those changes. I was seeing the same exec format error you had, due to version mismatching on the .ko. Basically, wipe all that out, take out the code adding the '+' from rpi-source, and do the steps again. There is also an SSL certificate issue you might encounter; they cover it on the repo, but I just modified the python script to ignore SSL certs.

@careyer
Contributor

careyer commented Oct 6, 2018

@kwatkins : I am happy my project inspired you. Unfortunately my latest working version is based on 1.6RC3. From 1.6RC4 onwards a change to tx_rawsock and the shared memory handling was introduced which I cannot retrace; this prevents calling tx_rawsock with a port parameter other than 0 (e.g. "-p 10"); more precisely, it gives an error message on the tx side. I can only advise going back to RC3 as of now. Sorry & the best of luck

@kwatkins

kwatkins commented Oct 6, 2018

@careyer good to know. I'll give this a go with 1.6RC3.

@kwatkins

Sir @careyer, if you get a chance can you post your working gstreamer pipeline (tx, and rx too if it was changed) that you got working with the Flir One and EZ-Wifi? Better yet, the /root/.profile you ended up going with for EZ-WifiBroadcast 1.6RC3 would be greatly appreciated 👍

For everyone (including me) struggling to get the images set up with v4l2loopback etc., these are the steps that should get you there. This was used for the TX but should work for RX as well.

Run all below as root from the v1.6RC3 image (EZ-Wifibroadcast-1.6RC3.img)

  1. Resize, install some tools, update the Pi:
resize the partition using https://github.com/bortek/EZ-WifiBroadcast/wiki/Tips-and-tricks
dpkg-reconfigure ca-certificates
apt-get update
apt-get -y install vim screen
  2. Get v4l2loopback working, using this issue (Using OpenCV with EZ-WifiBroadcast #71):
# fuse and gstreamer for video streams
mkdir /dev/fuse
chmod 777 /dev/fuse
apt-get -y install fuse
apt-get -y install gstreamer1.0
# kernel sources for the v4l2loopback module
apt-get -y install linux-headers-rpi
wget https://raw.githubusercontent.com/notro/rpi-source/master/rpi-source -O /usr/bin/rpi-source && sudo chmod +x /usr/bin/rpi-source
# edit /usr/bin/rpi-source following https://github.com/notro/rpi-source/issues/37 and removing https://github.com/notro/rpi-source/blob/master/rpi-source#L350
# AND add 'import ssl' to the top AND change download(...) to,
#      ctx = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
#      res = urllib2.urlopen(url,context=ctx).read()
# AND add "--no-check-certificate" to wget calls (what a mess…)
apt-get -y install bc
/usr/bin/rpi-source -q --tag-update
rpi-source
# finally, install v4l2loopback for /dev/video*
export GIT_SSL_NO_VERIFY=true
mkdir ~/flir && cd ~/flir && git clone https://github.com/umlaeute/v4l2loopback
cd v4l2loopback && make install
depmod -a
  3. Install flir tools and set up the ez-wifi .profile gstreamer:
apt-get -y install libusb-1.0-0-dev
cd ~/flir && git clone https://github.com/fnoop/flirone-v4l2.git
cd flirone-v4l2 && make
# edit /etc/rc.local and before the exit 0 add:
# modprobe v4l2loopback devices=5
# sleep 5
# cd /root/flir/flirone-v4l2 && ./flirone ./palettes/Iron2.raw &
# grab a .profile to start with (thanks @careyer!)
mv /root/.profile /root/.profile-original
wget http://www.ip-networx.de/content/GitHub/editedProfile.txt -O /root/.profile
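A quick check that the module actually loads after all of the above (v4l2-ctl comes from v4l-utils, an extra install):

modprobe v4l2loopback devices=1
lsmod | grep v4l2loopback     # the module should be listed
apt-get -y install v4l-utils
v4l2-ctl --list-devices       # the new dummy /dev/video* device should appear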

@careyer
Contributor

careyer commented Oct 11, 2018

@kwatkins, you are welcome... I made several other changes to .profile, but I am happy to share the tx and rx pipeline commands with you:

TX:
nice -n -9 gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw,width=160,height=128,framerate=10/1 ! omxh264enc control-rate=1 target-bitrate=600000 ! h264parse config-interval=3 ! fdsink fd=1 | nice -n -9 /root/wifibroadcast/tx_rawsock -p 10 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS

RX:
ionice -c 1 -n 3 /root/wifibroadcast/rx -p 10 -d 1 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH $NICS | ionice -c 1 -n 4 nice -n -10 tee >(ionice -c 1 -n 4 nice -n -10 /root/wifibroadcast_misc/ftee /root/videofifo2 > /dev/null 2>&1) >(ionice -c 1 nice -n -10 /root/wifibroadcast_misc/ftee /root/videofifo4 > /dev/null 2>&1) >(ionice -c 3 nice /root/wifibroadcast_misc/ftee /root/videofifo3 > /dev/null 2>&1) | ionice -c 1 -n 4 nice -n -10 /root/wifibroadcast_misc/ftee /root/videofifo1 > /dev/null 2>&1

You can decode and display it via the standard hello_video playback.

@kwatkins

kwatkins commented Oct 16, 2018

@careyer and for others giving this a go: I was able to get it all streaming, thermal-vision style. Following the setup steps above on the TX/AirPi (#71 (comment)) to get the Flir One streaming, the only changes were to the TX/AirPi "/root/.profile", changing where raspivid was used to:

nice -n -9 gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw,width=160,height=128,framerate=10/1 ! omxh264enc control-rate=1 target-bitrate=600000 ! h264parse config-interval=3 ! fdsink fd=1 | nice -n -9 /root/wifibroadcast/tx_rawsock -p 0 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS

Also, since I wasn't attaching a camera, I set $CAM=1 in the .profile for the TX/AirPi (ez-wifi checks whether a camera is attached to determine if it's RX or TX).

Channel '-p 0' worked fine and no changes were needed to view the video stream on the RX/GroundPi. Again, this is using the v1.6RC3 image; from what @careyer found, this might not work on later images.

My next steps are getting this all into something I can just attach to the bird so it works. I'm using a Li-ion Battery HAT on top of the TX Pi0 for power (https://www.waveshare.com/li-ion-battery-hat.htm), which has a 5V regulated power out. I'm using a mini-USB hub for testing, which includes an ethernet port; totally recommend this for when you ssh root@ezwifibrdcast-tx to mess with different streams. Power-wise, I plan to solder the USB from the wifi adapter directly to the Pi, splitting out the power input from the battery HAT. I'm also trying out some other, smaller USB hubs in the hope that they work.

Past that, it's mavlink time, finding a way to fuse all this together. Going to try using the Mavic Pro w/ mavlink wrapper (https://github.com/diux-dev/rosettadrone). I also have an Emlid Navio2 (autopilot HAT for Raspberry Pi powered by ArduPilot); if I can get the AirPi working and streaming off that directly, that's some sort of win I haven't wrapped my head around :)

@kwatkins


Heyyooo - the Zero4U (https://www.adafruit.com/product/3298) seems to be solid, a little hub that pogo jumps onto the usb test pads. Got the TX all contained, power from the HAT on top. Now to find something that will case it, attach, fly.

@careyer
Contributor

careyer commented Oct 25, 2018

Alright! Congratulations! I am happy that you got it right and working 👍
I am also using the Zero4U - it is a decent hub!

@JRicardoPC

JRicardoPC commented Oct 25, 2018

Hello,
Finally I could fix the problem, and now I can stream both the Picamera and the Lepton thermal camera. Now I need to go one step further and send the two streams at the same time; for this I am trying to use videomixer. My first try on my computer worked well, but when I try it on the Pi Zero I have trouble connecting fdsink fd=1 with videomixer.

To start, I tried to send only videotestsrc.

My TX pipeline:
nice -n -9 gst-launch-1.0 videotestsrc ! video/x-raw,width=80,height=60,framerate=10/1 ! videobox border-alpha=0 top=0 bottom=0 ! videomixer name=mix sink_0::xpos=240 sink_0::ypos=180 sink_0::zorder=10 ! videoconvert ! xvimagesink videotestsrc ! video/x-raw,format=AYUV,framerate=5/1,width=320,height=240 ! fdsink fd=1 ! mix. | nice -n -9 /root/wifibroadcast/tx_rawsock -p 10 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS

@RespawnDespair

In the image from dino.de there are two streams sent side-by-side. On the ground you can switch between which stream you want to use for display; this works because the second stream from the FLIR camera is very low bandwidth (320x240 at several FPS). I see from your command line that you use the same. I am working on a one-size-fits-all solution that will allow multiple streams to be sent side-by-side, plus v4l2 support in the image. This should enable you to use the /dev/video0 input. Would that be OK for your application?

@pullenrc
Copy link

pullenrc commented Oct 31, 2018 via email

@careyer
Contributor

careyer commented Oct 31, 2018

@RespawnDespair : The FLIRone code in the dino.de image is essentially my code. I developed it, and dino.de made the logic to switch between the different video streams afterwards. As you have noticed correctly, both video streams are transferred side-by-side (in parallel). This is possible because the 2nd stream consumes only very little bandwidth: 160x120 @ 9fps. As you have also noticed, it makes use of the v4l2loopback kernel module (which has to be compiled for every platform separately in order to start correctly, i.e. Pi0, Pi2, Pi3...). It would be very convenient to have v4l2loopback support for all SoCs already built into EZ-WBC. So yes, v4l2loopback is also a thing that I introduced.

Essentially the HD video (Pi Cam) is written to videofifo1 at the GroundPi and the Flir stream is written to videofifo5. We decided to do it this way because we wanted to be able to quickly switch between the HD and thermal views. At the flick of a switch, essentially hello_video is restarted with a different videofifo (1 for HD and 5 for thermal).
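A sketch of what that flick of a switch amounts to on the GroundPi; the hello_video path is the stock Raspbian location, and the exact invocation here is an assumption:

killall hello_video.bin    # stop the viewer attached to videofifo1
cat /root/videofifo5 | /opt/vc/src/hello_pi/hello_video/hello_video.bin &    # restart it on the thermal fifo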

I am not sure what you mean by the /dev/video0 input though?

Now that we have the dino.de "AirSwitches" (which can either drive GPIOs or trigger software actions at the AirPi), it also might be possible to send everything via the same tx_rawsock port "-p 0" and decide on the air side which stream to put into the radio link, i.e. no longer transfer both videos in parallel but switch on the air side which video gets fed into the standard video pipeline.
However, that might result in higher latency when switching between streams, and it might happen that hello_video on the GroundPi crashes while switching?

@zipray

zipray commented Nov 12, 2018

@careyer Thank you very much for sharing the image from dino.de. I used a USB hub to install an Insta360 Air on the Air Pi Zero, and it couldn't start without the Raspberry Pi camera. With the Raspberry Pi camera installed it boots, but HDMI does not output the Insta360 Air video. Do I need any special settings?

@careyer
Contributor

careyer commented Nov 12, 2018

@zipray : Sorry, the dino.de image contains support for the FLIRone only. For the Insta360 you need to modify the .profile and alter the tx_rawsock command as in #71 (comment). You have to either connect a PiCamera or search in the .profile for a variable called "FLIRforcecamera" (or something like that... I don't remember its name; it should be at the very beginning of .profile) and set this to "=Y". This will make the Pi believe that there is a Pi camera connected and start as AirPi.

@zipray

zipray commented Nov 14, 2018

@careyer Thank you very much. I find that the video stream obtained from the Insta360 Air is a dual-camera image that has not been stitched, even when using the FPV_VR app (@Consti10), which leads to a very funny phenomenon when using ground VR head tracking in the FPV_VR app:
https://youtu.be/MraI-Ff2G3A
Is there any way to get panoramic video directly from the Insta360 Air?

@careyer
Contributor

careyer commented Nov 14, 2018

@zipray : Congrats! You did it! 👍 ... Unfortunately I have no idea how to correct/convert that output to a different type of projection. Maybe @johnboiles or someone else can help?

This reference of different types of projections might be helpful: https://en.wikipedia.org/wiki/List_of_map_projections - Seems like we are dealing with a "Nicolosi globular" (also called "double fisheye") projection here.

Update: Maybe these links can be helpful:

@JRicardoPC

Finally I made a videomixer with the Picamera and the Lepton module. I tried several ways, and in the end I used v4l2sink to write the mixed video to a virtual device and then send it out in a second pipeline.
The result: (screenshot)

And my code in TX:

nice -n -9 gst-launch-1.0 v4l2src do-timestamp=true device=/dev/video1 ! video/x-raw,width=80,height=60,framerate=10/1 ! videobox border-alpha=0 top=0 bottom=0 ! videomixer name=mix sink_0::xpos=240 sink_0::ypos=180 sink_0::zorder=10 ! videoconvert ! v4l2sink device=/dev/video3 sync=false v4l2src device=/dev/video0 ! video/x-raw,framerate=30/1,width=320,height=240 ! videoconvert ! mix. &

sleep 5

nice -n -9 gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw,width=320,height=240,framerate=25/1 ! omxh264enc control-rate=1 target-bitrate=600000 ! h264parse config-interval=3 ! fdsink fd=1 | nice -n -9 /root/wifibroadcast/tx_rawsock -p 0 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS

@careyer
Contributor

careyer commented Nov 23, 2018 via email

@careyer
Contributor

careyer commented Jan 4, 2019

@zipray : This might help you https://github.com/96fps/ThetaS-video-remap - we just need to find a better remap projection. Or: https://github.com/raboof/dualfisheye2equirectangular

@jnoxro

jnoxro commented May 22, 2019

@kwatkins : I am happy my project inspired you. Unfortunately my latest working version is based on 1.6RC3. From 1.6RC4 onwards a change to tx_rawsock and the shared memory handling was introduced which I cannot retrace; this prevents calling tx_rawsock with a port parameter other than 0 (e.g. "-p 10"); more precisely, it gives an error message on the tx side. I can only advise going back to RC3 as of now. Sorry & the best of luck

Hey, do you happen to have a download link for the RC3 image? Can't find it for the life of me.

@alisadra

alisadra commented Jun 17, 2019

@careyer
you say:

"... GRRRR!! I had to manually alter the release number string all over the filesystem (hidden files, config files... god knows where I found this damn attached '+'). Now it works!"

and

"... I did a grep command and searched the whole filesystem for any occurrence of the altered kernel version string."

What was your bash command to find all the "+" occurrences in the kernel version string? I need the grep command.
Can someone else help me?
Thanks

@msergiu80

msergiu80 commented Jul 16, 2019

@careyer Hi, I am looking at the convo above and what a great way to achieve what you had in mind, congrats. I broke 2 Picams in a week by pressing on the sensor, and to be honest I am not really a fan of these cams; they are not really suited for airborne use and the outside environment. I would like to use a Logitech C920 at 720p with wifibroadcast, love the image quality and the autofocus feature, but I've got no idea where to start. Well, I know I need to start in the .profile, but the changes I could piece together from the convo above lose me at some points.

I am using Lelik's 1.6 RC6 image since it was the only one that allowed parameter loading in an external Mission Planner LAN-tethered from the GroundPi. Even so, it only works until plugging in the RC :) Anyway ...

Main questions are:

  1. How do I force the Pi into TX mode without a Picam? I tried changing CAM=1 in two instances in the .profile but it doesn't work?
  2. On how many lines do I have to replace raspivid with gstreamer in the .profile?
  3. What needs to be done on the GroundPi in order to switch the video layer to an RX gstreamer pipeline?

Sorry if I am asking any questions that were answered before, but it is a long discussion above and maybe I missed a few.
Thanks in advance!

@careyer
Contributor

careyer commented Jul 16, 2019

@msergiu80 : I am sorry that I cannot be of much help with this. I abandoned the EZ-Wifibroadcast project long ago and switched to the much more capable and more full-featured Open.HD project. It features USB cam and secondary cam support as well as Picture-in-Picture and a custom QGC app, all right out of the box. It has very active ongoing development as well.

@msergiu80

msergiu80 commented Jul 16, 2019 via email

@careyer
Contributor

careyer commented Jul 16, 2019

No need for that =). Just join the open Telegram support channel (link in the Wiki). It works but basically needs a bit of configuration and understanding. We still need to update the wiki on that rather new feature ;-)
