Using OpenCV with EZ-WifiBroadcast #71
The wifibroadcast tx (as well as my further developed tx_rawsock) basically just send out what they receive on stdin, so if you can pipe your OpenCV output into it somehow it should work. |
Ahhh, so it isn't just 'hard wired' to transmit the camera stream, anything 'piped' (I'm not 100% sure what that means just yet) into tx_rawsock should be transmitted. That sounds brilliant. I've just read tbeauchant's post re an IP cam and s/he is using gstreamer ...
Looks like I have my starting point :) Do you have any further docs around tx_rawsock? Many thanks for all your (and Befinitiv's) hard work on this. Seriously, there is no way I'd ever be able to complete this project without you. |
I am piping a gstreamer pipeline from a Sony QX1 MJPEG liveview URL to tx: souphttpsrc -> h264enc. I can confirm that it works, but I still have to figure out bitrates and all that stuff to get more range. There is also some more latency on the stream. I will try the tx_rawsock |
I would be very interested in using a Sony camera with wifibroadcast; currently I am having to use a B101 HDMI converter. If you could switch between it and the Pi camera it would be awesome. |
rodizio1, Whilst I understand the purpose of EZ-WiFiBroadcast is to simplify setup, i.e. just download a couple of SD card images and you're good to go, I'm going to need to install a whole lot of software to complete my project, e.g. OpenCV, NumPy, Python etc. Is there a 'manual install' doc available? That is, a doc that outlines the steps required to essentially create a working image from, for example, a vanilla Raspbian distribution like Raspbian Stretch Lite? |
Maybe the easier way to go is to find a way to boot the EZ-WifiBroadcast image so that it does not autostart all the RX, TX, OSD and Telemetry processes but welcomes you with a login prompt. Connecting the ethernet port to your router would allow for installing all the necessary packages then. @rodizio1: is there a best practice how to boot the image so that it does not enter transmission / reception mode automatically? I tried variant No. 1 from https://github.com/bortek/EZ-WifiBroadcast/wiki/Logging-into-the-linux-console but it was kind of problematic since there were overlays from the OSD all over the place and my nano editor was in the background ;-) |
The second method simply provides you with a remote command prompt much in the same way as CTRL-C does locally. It would certainly be better for me to have manual install instructions. I've been trying to install and configure stuff on top of the image but just keep running into problems: bash commands that are not present, make commands (e.g. numpy) that appear to run/compile forever (48 hours before I quit; should be 3-4 hours). And of course I'm scared to do any kind of apt-get update/upgrade for fear of modified drivers/firmware being overwritten. Whilst I do certainly believe that some would prefer the EZ (image-based) installation, for others this might cause more problems than it solves. It would be great to have a manual install method. |
There are no instructions as I just made changes to the image and did not document them apart from the changelog. Instructions on how to make the changes would need to be constantly updated as Raspbian is constantly changing; I don't have the time nor motivation for that. Finding out what has changed is quite easy: simply take the default Raspbian Lite image and compare it with my image. "meld" is a good tool for that, for example (a sketch of such a comparison follows after this comment). In the long run, I want to get away from Raspbian and use something smaller and better manageable like buildroot. In general:
Regarding logging in: Hmm, the OSD still being in front of the console when logging in locally is indeed not so nice. I've changed the profile script so that it automatically does a "killall osd" to quit the OSD after logging in. |
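A minimal sketch of that kind of image comparison (assuming both images are mounted read-only via loop devices on a Linux PC; the mount points are arbitrary and START_SECTOR is a placeholder - read the real partition offsets from fdisk -l on each image):
sudo mkdir -p /mnt/raspbian /mnt/ezwbc
sudo mount -o loop,ro,offset=$((START_SECTOR*512)) raspbian-lite.img /mnt/raspbian
sudo mount -o loop,ro,offset=$((START_SECTOR*512)) EZ-Wifibroadcast.img /mnt/ezwbc
meld /mnt/raspbian /mnt/ezwbc    # directory comparison of the two root filesystems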
Wow, so this has been a voyage of discovery!! I have now managed to get python3.4.2, numpy1.8.2, gstreamer1.0 and opencv3.4.0 all installed and working simultaneously with EZ-WiFiBroadcast v1.6RC3 on a Raspberry Pi 3. This has required me to: resize file systems, find random files, compile code(!), edit configuration files and solve many other problems along the way but ... at least it works! I'm working on a fully documented bash script to automate this whole thing and I'll upload once I've tested it for others to use should they wish to. rodizio1 thank you for your candour, I totally get it. As I am eventually going to be writing an instructable for my project, I will want to start off with a vanilla Raspbian image, that is, I am going to try to use that "meld" tool. What was the exact version of original Raspbian source image? |
...p.s. I only put the (!) after 'compile code' because this is the first time I have ever compiled code. Yep, that's how much of a layman I am! Now that I have everything I need, I'm gonna have to stop wifibroadcast from snagging the hardware (i.e. camera) and instead have OpenCV do this. OpenCV then needs to process the video streams before piping them out through gstreamer to the wifibroadcast tx_rawsock device. Why does it sound so easy when it's just words? ;) |
@tictag I'm interested in your solution, since I'm experimenting with the same setup (video feed multicasted to OpenCV and EZ-Wifibroadcast) for realtime object tracking and streaming. Would you share your results performance-wise on Pi3? |
Of course, happy to. I'm just at the point where I'm adding my thermal stream, so probably tomorrow I will be looking at piping the streams into wifibroadcast, as opposed to it capturing the stream itself. On the receive side, I'll be extracting the two streams from wifibroadcast and piping them into OpenCV for further processing. ...and I have no idea how to do this yet!! Don't let me sound like I know what I'm doing! ;) |
Bump... |
Sorry, I never wrote that down (and in retrospect I found out that there is no version number or similar inside the Raspbian images ...). What I remember is that version 1.0 was released around the 15th of May 2016 and used kernel 4.4.9 or 4.4.11, so it must be a Raspbian release around that time with that kernel. You can find the old Raspbian releases and changelogs here: |
@tictag : I am very interested in the bash script that you created to automate the installation of additional components on top of EZ-WifiBroadcast 1.6RC3
In my use case I need to install the following components:
Being a total Linux noob this might help me get started a bit less troublesomely. BTW: Patrick Duffy from DIY Drones (http://diydrones.com/profiles/blogs/thermal-imaging-for-your-drone-on-a-budget) sent me a ready-to-run image demoing the integration of the FlirOne with the Raspberry. It works flawlessly. However it does not build on WiFiBroadcast but on normal WiFi streaming via Gstreamer and fixed IPs over the local WLAN. It also supports transmission of multiple video streams (thermal image, HD video from Flir & RaspiCam video) - i.e. I am also following your progress in #76 |
Happy to help a fellow noob! If you only need gstreamer then my script probably won't help so much. Tbh, the most complicated thing (for me) has been compiling OpenCV. Mind you, it should help with compiling the kernel. Definitely we can work together on this :) (... blind leading the blind ;) |
@tictag : That is good news! I will try to install gstreamer first (I suppose the way to go is to do it the regular way with apt-get?) and then get back to you? I think the FlirOne device driver needs to be compiled as well :-|. Am I correct that you are using this driver for your project as well? Last (and probably most complicated) will be recompiling the kernel in order to get v4l2loopback support added. For this I definitely need some help. BTW: Here is a screenshot of what I was able to achieve yesterday evening with the Patrick Duffy image. The frame rate was surprisingly good - I believe it was definitely more than the regular 8-9fps. Felt more like 15-20fps, which I was positively surprised about: |
Looking hot! Ahem, sorry... Yes, gstreamer just via apt-get, though I did have a few issues installing:
...to resolve
That should get you gstreamer installed. |
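For reference, a sketch of the apt-get route that pulls in the full GStreamer 1.0 suite on Raspbian (package names can vary slightly between Raspbian releases, and on the EZ-WifiBroadcast image you may need to resize the filesystem first, as mentioned above):
apt-get update
apt-get install -y gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly gstreamer1.0-libav gstreamer1.0-omx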
...oh, forgot to answer your question, no I'm not using the FLIROne, I'm just using the FLIR Lepton OEM module on a breakout board, connected to a RPi Zero via SPI (Serial Peripheral Interface), I'm using pylepton to control it, OpenCV for image processing and gstreamer to stream it via wifibroadcast's tx_rawsock. Does the FLIROne connect via USB? |
Yes, the FlirOne connects via USB and features both a thermal camera and an HD cam for visible light. Both video streams can be accessed with the Linux driver. Update:
|
say, what are you guys installing there exactly? Gstreamer is already installed (used for encapsulating the raw h264 video for missionplanner) and there is no package called "gstreamer1.0" |
Hi rodizio... We are trying to get some FLIR thermal cameras working with EZ-Wifibroadcast. In order to make the Flir cameras work we need a very complicated Gstreamer pipeline with modules from Gstreamer good, bad and ugly... that is why we need the full-blown Gstreamer suite.
installs the whole suite and works flawlessly as far as I can tell. It installs all the necessary additional modules. I also succeeded in installing the FlirOne USB driver (flir8p1-gpl), however now I am stuck installing the v4l2loopback kernel module. :-( I did:
Everything runs just fine without any errors but the kernel module v4l2loopback won't show up:
any ideas? |
@rodizio1 : I think I know now what the problem is with compiling the v4l2loopback kernel module. The commands above load the kernel sources (or headers) for 4.9.35-v7+ whereas EZ-WifiBroadcast v1.6RC3 runs on 4.9.35-v7 (non-plus). Maybe that is why the kernel module compiles okay but does not get installed properly? Can you please advise me how to install the correct kernel sources/headers for 4.9.35-v7 (and how exactly to apply the patches you made)? I took a look at https://github.com/bortek/EZ-WifiBroadcast/tree/master/kernel/readme.md but I do not completely understand how to do it. You know... me just being a Linux noob! I suppose I do not have to re-compile the whole kernel but only the kernel module. But I do not get how I can do it correctly. :-( Thank you so much! Your help is so much appreciated |
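One quick way to confirm that mismatch (just standard tooling, nothing specific to this image) is to compare the running kernel release against the vermagic string the module was built for - if they differ, e.g. only by the trailing '+', modprobe/insmod will refuse to load the module:
uname -r                                   # kernel the Pi is actually running, e.g. 4.9.35-v7
modinfo ./v4l2loopback.ko | grep vermagic  # kernel the module was compiled against, e.g. 4.9.35-v7+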
Okay success! Cheers! |
@rodizio1 I do not fully understand yet how I can pipe my Gstreamer output to WiFi-Broadcast and have it transferred. I found the tx_rawsock call in .profile but I do not understand all the parameters. My Gstreamer pipeline looks like this now: I feel the low-budget DIY thermal drone is only one step away. ;-) |
I've yet to figure this out myself but I do have a novice understanding of 'piping' in Linux. Piping is where you take the output of one process and feed it to the input of another; ever see that cult film 'The Human Centipede'? So,
ls -p > directory.txt
... would write the current directory listing into a file instead of printing it to the console (strictly speaking '>' redirects into a file, while '|' pipes into another program). I've seen examples on the web where the output of raspivid (the default application used for Pi cameras) is then 'piped' into the tx_rawsock application, thus:
raspivid <parameters> | tx_rawsock <parameters>
I'm not quite sure yet how to do this with OpenCV but as I'll be doing the image processing using Python, I'll likely be using the cv.ImageWriter() method, piped into gstreamer, which in turn pipes into tx_rawsock. Btw, if you have seen 'The Human Centipede', I apologise for associating that imagery with Linux piping!! ;) |
Use fdsink as the last element in gstreamer
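A hedged sketch of what that ends up looking like (the source and encoder settings here are placeholders; the tx_rawsock parameters are the ones used by the stock .profile):
gst-launch-1.0 videotestsrc ! video/x-raw,width=640,height=480,framerate=25/1 ! omxh264enc control-rate=1 target-bitrate=600000 ! h264parse config-interval=3 ! fdsink fd=1 | /root/wifibroadcast/tx_rawsock -p 0 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS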
|
Thank you! I am familiar (as a linux noob like me can be) with the concept of piping in Linux. However I am struggling a little bit with the actual combination of things - to be precise:
I think that I need to change the following line(s) in /root/.profile
Receiving and displaying the video at the GroundPi:
Any help or recommendations would be greatly appreciated. PS: Right now my TX pipeline is:
And my RX pipeline is:
|
Piping is the process of grabbing the stdout of one program and feeding it into the stdin of another. In the .profile script this is what is achieved on lines 688 & 71: raspivid outputs the h264 data to its stdout, this data is grabbed and sent to the stdin of tx_rawsock, which in turn does its magic to send it through the wifi.
If you'd like to achieve the same thing with gstreamer, you'll have to replace the raspivid part of the command with your gstreamer pipeline, and make sure gstreamer outputs to stdout. This can be achieved by using the following sink:
In order to get rid of this issue, I have created a fifo using mkfifo, and modified tx_rawsock to use it (look in the source code, there is a line where it opens /dev/stdin; replace it with the name of the fifo file you created). Then in gstreamer simply output the data to the fifo: filesink location=/path/to/my/fifo
A fifo is simply a special type of file on the filesystem that acts as a buffer. Also, the reason tx_rawsock is used twice in the .profile is to output different text messages on the display depending on whether the script has detected some issues or not. Usually for testing I disable both these lines and run the command line myself from the terminal. Good luck, |
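A sketch of the fifo variant described above (the fifo path is arbitrary; note that instead of patching tx_rawsock to open the fifo directly, you can also leave it unmodified and simply cat the fifo into its stdin, which is what this sketch does):
mkfifo /root/videofifo_tx
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480,framerate=25/1 ! omxh264enc control-rate=1 target-bitrate=600000 ! h264parse config-interval=3 ! filesink location=/root/videofifo_tx &
cat /root/videofifo_tx | /root/wifibroadcast/tx_rawsock -p 0 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS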
Hello again, I am using 1.6RC6. I tried commenting out the line 770 (maybe off, but close) where the raspicam is being piped to tx_rawsock, and I commented out the v4l lines below, and couldn't get a picture or RSSI on the OSD. |
@pullenrc : No, you do not need to recompile the kernel. You just need to compile the v4l2loopback kernel module. For that you need to download the correct kernel sources and headers. (Beware of the "+" kernel string problem, see above.) Not sure though: what do you need the v4l2loopback device for? You say you already have a video stream in opencv... why do you need the v4l2loopback devices then? Afaik v4l2loopback is for creating "virtual video devices". Guess you don't need that since you have your stream already in opencv? |
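For reference, a sketch of a module-only build (this assumes matching kernel headers are already installed under /lib/modules/$(uname -r)/build; without them - or with the '+' version mismatch discussed above - this is exactly where it fails):
git clone https://github.com/umlaeute/v4l2loopback.git
cd v4l2loopback
make            # builds v4l2loopback.ko against the headers of the running kernel
make install    # copies the module into /lib/modules/$(uname -r)/
depmod -a       # refresh the module dependency index so modprobe can find it
modprobe v4l2loopback devices=2   # creates the virtual /dev/videoN loopback devices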
Finally I can launch modprobe; I found out how in an issue opened by you (@careyer) on rpi-source's GitHub. For now I got this working on a Pi 3, but now I need to do the same on a Pi Zero, and I have an issue because the kernel is different (again -.-). I use the same SD card, but the Pi 3 uses kernel 4.9.35-v7 and the Pi Zero uses kernel 4.9.35. Update: I downloaded the kernel from this GitHub (4.9.35), compiled and installed it, but v4l2loopback has the same problem and I can't do a make |
@careyer I got everything working, well, all except the streaming with tx_rawsock using gstreamer with the Flir One. It looks like on the TX side all is well, the flirone driver is reading/sending data, etc., and I'm using the last bash .profile you linked above. I'm not seeing any HDMI out on the RX side though. Do you have a latest version you can share? @JRicardoPC I just took out the code that does the + append from rpi-source, although I haven't done this on the RX side yet. I didn't think that side needed to be modified for those changes. I was seeing the same exec format error you had, due to the version mismatch on the .ko; basically, wipe all that out and just take out the code adding the + from rpi-source and do the steps again. There is also an SSL certificate issue you might encounter; they cover it on the repo but I just modified the python script to ignore SSL certs. |
@kwatkins : I am happy my project inspired you. Unfortunately my latest working version is based on 1.6RC3. From 1.6RC4 onwards some change to tx_rawsock and the shared memory handling has been introduced which I cannot retrace - this prevents calling tx_rawsock with a port parameter other than 0 (e.g. "-p 10"); more precisely, it gives an error message on the TX side. I can only advise going back to RC3 as of now. Sorry & the best of luck |
@careyer good to know. i'll give this a go with 1.6rc3. |
Sir @careyer, if you get a chance can you post your working gstreamer pipeline (TX, and RX too if it was changed) that you got working with the Flir One and EZ-Wifi? Better yet, the /root/.profile you ended up going with for EZ-WifiBroadcast 1.6RC3 would be greatly appreciated 👍 For everyone (including me) struggling to get the images set up with v4l2 etc., these are the steps that should get you there. This was used for the TX but should work for RX as well. Run all below as root from the v1.6RC3 image (EZ-Wifibroadcast-1.6RC3.img)
|
@kwatkins, you are welcome... I made several other changes to .profile but I am happy to share the tx and rx pipeline command with you: TX: RX: You can decode and display via the hellovideo standard video playback |
@careyer and for others giving this a go, I was able to get it all streaming, thermal vision style. Following the setup steps above on the TX/AirPi (#71 (comment)) to get the Flir One streaming, the only changes were to the TX/AirPi "/root/.profile", changing the line where raspivid was used to:
nice -n -9 gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw,width=160,height=128,framerate=10/1 ! omxh264enc control-rate=1 target-bitrate=600000 ! h264parse config-interval=3 ! fdsink fd=1 | nice -n -9 /root/wifibroadcast/tx_rawsock -p 0 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS
Also, since I wasn't attaching a camera I set $CAM=1 in the .profile for the TX/AirPi (EZ-Wifi checks whether a camera is attached to determine if it's RX or TX). Channel '-p 0' worked fine and no changes were needed to view the video stream on the RX/GroundPi. Again, this is using the v1.6RC3 image; from what @careyer found this might not work on later images.
My next steps are getting this all into something I can just attach to the bird so that it works. I'm using a Li-ion Battery HAT on top of the TX Pi0 for power (https://www.waveshare.com/li-ion-battery-hat.htm) which has a 5V regulated power out. I'm using a mini-USB hub for testing, which includes an ethernet port; totally recommend this when you ssh root@ezwifibrdcast-tx to mess with different streams. Power wise I plan to solder the USB from the wifi adapter directly to the Pi, splitting out power input from the battery HAT. I'm also trying out some other USB hubs, smaller ones, in the hope that they work.
Past that, it's mavlink time, finding a way to fuse all this together. Going to try using the Mavic Pro w/ mavlink wrapper (https://github.com/diux-dev/rosettadrone). I also have an Emlid Navio2 (autopilot HAT for Raspberry Pi powered by ArduPilot); if I can get the AirPi working and streaming off that directly, that's some sort of win I haven't wrapped my head around :) |
Heyyooo - the Zero4U (https://www.adafruit.com/product/3298) seems to be solid, a little hub that pogo jumps onto the usb test pads. Got the TX all contained, power from the HAT on top. Now to find something that will case it, attach, fly. |
Alright! Congratulations! I am happy that you got it right and working 👍 |
Hello, at the start I am trying to only send a videotestsrc. My TX pipeline: |
In the image from dino.de there are two streams sent side-by-side. On the ground you can switch between which stream you want to use for display, this works because the second stream is very low bandwidth (320x240 several FPS) from the FLIR camera. I see from your command line you use the same. I am working on a one-size-fits-all solution that will allow for multiple streams to be sent side-by-side, also v4l2 support in the image. This should enable you to use the /dev/video0 input. Would that be ok for your application? |
@Tigchelaar That would be awesome! Looking forward to it.
|
@RespawnDespair : The FLIRone code in the dino.de image is essentially my code. I developed it and dino.de made the logic to switch between the different video streams afterwards. As you have noticed correctly, both video streams are transferred side-by-side (in parallel). This is possible because the 2nd stream consumes only very little bandwidth: 160x120 @ 9fps. As you have noticed, it also makes use of the v4l2loopback kernel module (which has to be compiled for every platform separately in order to start correctly, i.e. Pi0, Pi2, Pi3....). It would be very convenient to have v4l2loopback support for all SoCs already built into EZ-WBC. So yes, v4l2loopback is also a thing that I introduced. Essentially the HD video (Pi Cam) is written to videofifo1 at the GroundPi and the Flir stream is written to videofifo5. We decided to do so because we wanted to be able to quickly switch between the HD and thermal view. At the flick of a switch, hellovideo is essentially restarted with a different videofifo (1 for HD and 5 for thermal). I am not sure what you mean by /dev/video0 input though? Now that we have the dino.de "AirSwitches" (which can either drive GPIOs or trigger software actions at the AirPi) it also might be possible to send everything via the same tx_rawsock port "-p 0" and decide already at the air side which stream to put into the radio link? I.e. do not transfer both videos in parallel anymore but switch at the air side which video gets fed into the standard video pipeline? |
@careyer Thank you very much for sharing the image from dino.de. I used a USB hub to connect an Insta360 Air to the air-side Pi Zero, and it couldn't start without the Raspberry Pi camera. With the Raspberry Pi camera installed it boots, but HDMI does not output the Insta360 Air video. Do I need any special settings? |
@zipray : Sorry, the dino.de image contains support for the FLIRone only. For the Insta360 you need to modify the .profile and alter the tx_rawsock command like in #71 (comment). You have to either connect a PiCamera or search in the .profile for a variable called "FLIRforcecamera" (or something like that... don't remember its name, should be at the very beginning of .profile) and set this to "=Y". This will make the Pi believe that there is a Pi camera connected and start as AirPi. |
@careyer Thank you very much. I find that the video stream obtained from the Insta360 Air is a dual-camera image that has not been stitched together, |
@zipray : Congrats! You did it! 👍 ... Unfortunately I have no idea how to correct/convert that output to a different type of projection. Maybe @johnboiles or someone else can help? This reference of different types of projections might be helpful: https://en.wikipedia.org/wiki/List_of_map_projections - Seems like we are dealing with a "Nicolosi globular" (or also called "double fisheye") projection here. Update: Maybe these links can be helpful: |
Awesome! Thanks for sharing
On 23.11.2018 at 10:39, Ricardo Pérez wrote:
Finally I made a videomixer with the picamera and the Lepton module. I tried some approaches and in the end I used v4l2sink to save the mixed video to a virtual device, and then send that in another pipeline.
The result:
And my code in TX:
nice -n -9 gst-launch-1.0 v4l2src do-timestamp=true device=/dev/video1 ! video/x-raw,width=80,height=60,framerate=10/1 ! videobox border-alpha=0 top=0 bottom=0 ! videomixer name=mix sink_0::xpos=240 sink_0::ypos=180 sink_0::zorder=10 ! videoconvert ! v4l2sink device=/dev/video3 sync=false v4l2src device=/dev/video0 ! video/x-raw,framerate=30/1,width=320,height=240 ! videoconvert ! mix. &
sleep 5
nice -n -9 gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw,width=320,height=240,framerate=25/1 ! omxh264enc control-rate=1 target-bitrate=600000 ! h264parse config-interval=3 ! fdsink fd=1 | nice -n -9 /root/wifibroadcast/tx_rawsock -p 0 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS
|
@zipray : This might help you https://github.com/96fps/ThetaS-video-remap - we just need to find a better remap projection. Or: https://github.com/raboof/dualfisheye2equirectangular |
Hey, do you happen to have a download link for the RC3 image? can't find it for the life of me |
@careyer and " ...I did a grep command and searched the whole filesystem for any occurrence of the altered kernel.version string." what your command in bash for find all "+" in kernel version?? |
@careyer Hi, I am looking at the convo above and what a great way to achieve what you had in mind, congrats. I broke 2 PiCams in a week by pressing on the sensor and to be honest I am not really a fan of these cams; they are not really suited for airborne use and the outside environment. I would like to use a Logitech c920 at 720p with wifibroadcast - I love the image quality and the autofocus feature - but I have no idea where to start. Well, I know I need to start in the .profile, but the changes that I could recompile from the convo above get me lost on some points. I am using Lelik's 1.6 rc6 image since it was the only one that allowed parameter loading in an external Mission Planner LAN-tethered from the GroundPi. Even so it only works until plugging in the RC :) Anyway ... Main questions are:
Sorry if I am asking any questions that were answered before, but it is a long discussion above and maybe I missed a few. |
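Not an authoritative answer, but since (older revisions of) the c920 can deliver H.264 directly over UVC, a hedged starting point would be to replace the raspivid part of the pipe in /root/.profile with something along these lines (device node and caps are assumptions; newer c920 firmware no longer exposes the hardware H.264 stream, in which case you would have to grab raw frames and encode with omxh264enc instead):
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-h264,width=1280,height=720,framerate=30/1 ! h264parse config-interval=3 ! fdsink fd=1 | /root/wifibroadcast/tx_rawsock -p 0 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS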
@msergiu80 : I am sorry that I cannot be of much help with this. I abandoned the EZ-Wifibroadcast project long ago and switched to the much more capable and more full-featured Open.HD project. It features USB cam and secondary cam support as well as picture-in-picture and a custom QGC app - all right out of the box. It has very active ongoing development as well. |
Yes, tried that too, the issue is the same: no loading of parameters in the external app, no documentation about USB cameras :) Should I open a ticket there?
|
No need for that =). Just join the open Telegram support channel (link in the Wiki). It works but basically needs a bit of configuration and understanding. We still need to update the wiki on that rather new feature ;-) |
Hello Rodizio,
Just got my first video stream to work [RPi Zero W with ZeroCam to RPi3 using 2 x RT5370-based nano USB adapters] and it looks great! :) Really fantastic work (from you and Befinitiv)!!
My query does not need a comprehensive answer, I don't mind doing all the digging to get something working, but I don't want to waste my time investigating if my use case simply isn't an option. I plan to take two separate video sources connected to a RPi, do some rudimentary image processing, then merge them together using OpenCV ... and here's the query ... could I then use EZ-WifiBroadcast to transmit that composite video stream to a receiver?
I've read every page of your wiki and everything revolves around broadcasting a source video from Tx to Rx. Basically, can my source video actually be an output stream from OpenCV?
I don't mind putting the hours in to investigate/troubleshoot etc. but not if this use case is simply not doable.
I would appreciate your thoughts.
Oh, and if you don't get back to me before new year, Happy New year!! :)