
Discoverable, decentralized, cached live video streaming #31

Open
m-anish opened this issue Feb 26, 2019 · 69 comments

@m-anish (Member) commented Feb 26, 2019

As a subset of #18, one of the requests from Zanskar is to focus on the live video streaming aspect of the mesh network.

The current implementation involves running VLC on a Windows laptop and using its web camera and microphone to create a live video stream, and then asking users in different places to open that live stream in their VLC instances. This works well, but suffers from:

  • It does not happen over a browser, which makes discovery hard!
  • There is no caching of any sort on the mesh nodes, so every new user adds the same amount of additional streaming load to the network. For example, if the livestream has a 300 kbps bitrate (video+audio encoded data) and there are 10 users watching, 3 Mbps of overall network bandwidth is consumed.

What if some kind of caching could be implemented on the mesh nodes?

I came across this project, whose readme also lists other potentially similar projects.

We can break this down into two parts:

  • Discoverability
  • Operational efficiency
@m-anish m-anish added the architecture Something to do with designing an architectural definition label Feb 26, 2019
@m-anish m-anish modified the milestones: 0.6, future Feb 26, 2019
@m-anish m-anish changed the title Decentralized or cached live video streaming Discoverable, decentralized, cached live video streaming Mar 2, 2019
@m-anish m-anish added the zanskar 2019 Priority issues related to the Zanskar 2019 summer trip label Mar 2, 2019
@m-anish (Member Author) commented Mar 2, 2019

This might be interesting to try out.

@m-anish m-anish modified the milestones: future, 0.7 Mar 4, 2019
@m-anish (Member Author) commented Mar 7, 2019

See this related issue. Some promising options there.

@m-anish (Member Author) commented Mar 11, 2019

Okay! So I did some more research on this:

  1. I tried YouPHPTube, which claims to be able to broadcast to a website from a mobile phone. I set up the service on a VM smoothly enough, but am still working through issues getting the streaming stack up. This seems like a traditional working solution - the safe bet. I will update if I get things operational. It also comes with an encoder, so whatever formats people upload videos in, it will convert them to web formats accordingly. Amazing!

  2. This is a stretch goal - hlsjs-p2p-engine. Decentralized video playback, with support for live streaming. Since this is a stack, we will have to build a basic video website (or use one already available - will need to research). But this would be an AMAZING solution in a decentralized setup.

  3. There is also this solution - Streamroot - along similar lines, but I'm not sure how much of it is open vs. proprietary.

  4. RTCMultiConnection also looks promising, but the demos don't seem to load, and the project has been inactive for the last 2-3 years.

  5. The link above is the work of the same person, who has created this aggregated page of all his webrtc work. Lots of potential (@pdey, something for you to tinker with!). Do we have in-house basic app development expertise?

  6. Dat project seems interesting too!

  7. Salsify (for the future) - a research project at Stanford promising dramatic performance improvements!

  8. Tribler Experiments over webtorrent.

  9. p2p over webtorrent with some known limitations.

  10. peerstreamer-ng looks promising to build upon, even if the documentation is sparse.

@m-anish (Member Author) commented Mar 11, 2019

Update. I was able to get basic live streaming to work with YouPHPTube. So we have some kind of a working solution now.

Onwards towards something smoother, and closer to being more decentralized.

Separately, I will work on a playbook to get YouPHPTube into IIAB.

@holta commented Mar 11, 2019

Separately, I will work on a playbook to get YouPHPTube into IIAB.

How realistic/practical might it be for educators to "publish" short videos from their Android phones[*] (or short screencasts/tutorials from their laptops) to YouPHPTube on IIAB?

[*] whether 100% homebrewed or from WhatsApp, or YouTube itself...which contains a ton of very thoughtful educational vids and visualizations (amid the swamp ;)

ASIDE: many YouTube science demonstrations are more vivid than PhET science simulations...these 2 artforms should really be combined side-by-side!?

@m-anish (Member Author) commented Mar 11, 2019

Another update. @sboruah and I were able to verify that she could livestream from her phone's camera and that I could watch it in a browser.

There is one limitation: the live stream does not get recorded directly onto the site, and it seems that feature is enabled through a paid plugin. There is a workaround though: the videos can easily be recorded somewhere on the server and then submitted (manually, through a simple online form) to the encoder that is part of this suite.

@m-anish (Member Author) commented Mar 11, 2019

@holta for what it's worth, streaming and transcoding require significant compute resources, so if IIAB is running on decent hardware (an i3 NUC or so), then it is entirely possible.

@m-anish (Member Author) commented Mar 13, 2019

Bumping the milestone. Good progress was made on this over the week.

@m-anish m-anish modified the milestones: 0.7, 0.8 Mar 13, 2019
@m-anish (Member Author) commented Mar 13, 2019

One challenge with the p2p HLS/WebRTC-based solution seems to be that we might have to build apps, or hunt for existing ones, that can produce a live stream.

@mikkokotila (Collaborator) commented:

Let's see...maybe not. Maybe we just have to allow it through a browser by simple means. Let's see.

@m-anish (Member Author) commented Mar 13, 2019

I'm talking about mobile phones. People are accustomed to using apps, so that might be the preferred way. Desktop will have to be browser-based.

@mikkokotila (Collaborator) commented:

Well, I think people are more accustomed to going to a browser than to an app, even if you count all apps combined vs. the browser. And surely the browser will win every time over a custom-made app built for a single purpose.

@m-anish (Member Author) commented Mar 13, 2019

Also, I have little knowledge of app development, but maybe the security model on phones also encourages apps rather than granting webcam+mic permissions in a browser. Anyway, I don't know this area very well, so someone more adept needs to answer this.

@mikkokotila (Collaborator) commented Mar 13, 2019

It makes sense to have a protocol where we always assume that something can be done in a browser, because it is easy, it does not introduce anything else, and it allows us to focus on core capabilities, prototype fast, etc. Only if that absolutely fails do we consider introducing something new (like an app). Apps are kind of like cognitive garbage... just like we should not use new materials if we can reuse garbage, we should not use apps if we can use the browser.

I'm talking about mobile phones. People are accustomed to using apps, so that might be the preferred way. Desktop will have to be browser-based.

Precisely, which further makes the case for the browser-based approach 100%, as it is the same regardless of device and platform. You can be on Apple or Samsung, or on your laptop or tablet or whatever; it does not matter, you know exactly what to do to access the livestream (or whatever else it may be).

@mikkokotila (Collaborator) commented:

Also, I have little knowledge of app development, but maybe the security model on phones also encourages apps rather than granting webcam+mic permissions in a browser. Anyway, I don't know this area very well, so someone more adept needs to answer this.

No. In an app you give the permissions to the app; in a browser you give permissions to the browser. Camera access is a 17-character-long piece of syntax and works almost universally (it does not work on Edge, I think).

Once I have the setup at home, we will know much more about all of these things with certainty.

@mikkokotila (Collaborator) commented:

Possibly meaningful here >> https://recordrtc.org/

@mikkokotila (Collaborator) commented:

I can't see any reason why this would not work on just the mesh node.

@mitra42 commented Mar 27, 2019

@m-anish asked me to comment ... I built the dweb.archive.org site as a decentralized version of the Internet Archive, and am about halfway through a year-long project, dweb-mirror, for an offline version of the Archive (suitable for mesh networks etc.). It can currently crawl IA resources, serve them fully offline, and also act as a proxy (saving to a local server as you browse). It's tested on macOS, RPi (NOOBS or IIAB), and Rachel3+/WorldPossible, though it still has bugs I'm working through.

For video ... we tried WebRTC - it was a massive resource hog and caused browser crashes etc., though we think that was mostly because it was opening lots of connections to DHTs. Like most people in the P2P community we dropped it as far as possible. The one video system we found that worked well was WebTorrent, especially because it can easily be set up with fall-back HTTP URLs, which works well when you have a mix of popular videos (lots of peers to connect to) and unpopular ones (no other peers online), so the latter get seeded over HTTP from an origin server.

I'm partway through a process where videos viewed on the dweb-mirror project will be shared by any peers using WebTorrent, in part to get around the well-known bandwidth issues on Raspberry Pis.

@m-anish (Member Author) commented Mar 28, 2019

@mitra42 thx for commenting!

Did you also get a chance to try HLS? We experimented with hlsjs-p2p-engine, got decent results for live streaming, and are thinking of building around that. HLS is not 'true' live - there is a lag of about 10-20 sec - but it is perfectly acceptable for our scenario.

Also, re: webtorrent, is there a project that we can test on the mesh we have locally to do some live video streaming? How well/badly does it work on mobile phones? Would love to test. I came across this in my research but didn't test it, as it seemed to have some known limitations and limited browser support, but perhaps I should rethink that decision.

One reason we are considering hlsjs-p2p-engine is that it seems to be well supported across browsers through various containers (video.js et al.). We are also thinking of integrating hlsjs-p2p-engine into the video.js container within kiwix zim files, which are heavily used within IIAB.

@mitra42 commented Mar 28, 2019

I haven't tried WebTorrent for livestreaming; it's not our application. Our application (at the Internet Archive, and in dweb-mirror) is static video, and in particular making sure the experience is always at least as good as a direct connection to the HTTP server.

@mitra42 commented Mar 28, 2019

With just video and a relatively small number of viewers, then for live streaming WebRTC (which is what hlsjs is built on) may be what you want; I'm just sceptical of it for viewing static videos, given all the problems and how many people have abandoned it. Have you also tried it fully disconnected? WebRTC supposedly has some single-point-of-failure issues with single rendezvous servers, but maybe hlsjs knows a way around that.

@m-anish (Member Author) commented Mar 28, 2019

Hmm, I haven't yet tried hlsjs-p2p-engine fully offline. It needs a signaling server and a few js files. We ran the signaling server locally but the js files were still being served online. I think we should make it a priority to test it fully offline (@pdey ), but I am optimistic that it should work offline.

For VOD, we do not envision a lot of p2p, at least initially, as the number of concurrent users watching a particular video may be small. Still, it might help in cases where a teacher asks pupils in a class to load a video on classroom laptops/tablets and there is suddenly a lot of load on the server.

Thanks for your comments, this is helpful info. To set some basic context, below are two blogposts about Zanskar, where we hope to test and deploy these technologies (more posts to come in the near future).

https://medium.com/eka-foundation/en-meshed-in-la-la-land-part-1-34e0ce29ea1b
https://medium.com/eka-foundation/enmeshed-in-la-la-land-part-2-9799b6f8e9d2

@georgejhunt commented:

Seems like IPv4 multicast could be brought into the picture in the case where a whole class needs access to the same video at the same time.
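
A rough sketch of what that could look like with ffmpeg, purely as an illustration (the multicast address, port and input file below are placeholders, not something tested on the mesh):

# sender: push a camera feed or pre-recorded file as MPEG-TS to a multicast group
ffmpeg -re -i lesson.mp4 -c copy -f mpegts "udp://239.255.0.1:1234?ttl=2&pkt_size=1316"

# each client in the classroom joins the same group
ffplay udp://239.255.0.1:1234
# (in VLC the equivalent URL is udp://@239.255.0.1:1234)

Whether multicast actually traverses the mesh depends on how the nodes handle IGMP/multicast routing, so this would need testing.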

@mitra42 commented Mar 28, 2019

If what you are doing is static video, I'd strongly recommend using WebTorrent instead, because I'm guessing that the case of a class watching recommended videos is probably more common than all watching the same video at the same place at the same time.

@m-anish (Member Author) commented Mar 28, 2019

Actually, the primary case we want to "optimize" for is live streaming, and the server load during it. Whether a whole class concurrently watching a video is really a pain point is something we'll have to see from usage data; it was just a guess at something which might happen in the future.

But the primary focus remains to make live streaming more decentralized and efficient.

Apologies if my earlier comment was confusing.

@mitra42 commented Mar 28, 2019

Understood - I guess I've not seen livestreaming (i.e. a video camera capturing images and showing them elsewhere) as a use case in disconnected networks, so it's good to hear that it's really an issue for your use cases.

@mikkokotila (Collaborator) commented:

Hmm, I haven't yet tried hlsjs-p2p-engine fully offline. It needs a signaling server and a few js files. We ran the signaling server locally but the js files were still being served online. I think we should make it a priority to test it fully offline (@pdey ), but I am optimistic that it should work offline.

There is absolutely no issue serving js offline via static files. If we like, we can also do it all on node.
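
For example, a minimal sketch of that, assuming we vendor the player/engine scripts next to the page (filenames and paths below are placeholders):

# copy the player and p2p-engine bundles locally instead of loading them from a CDN
mkdir -p /var/www/livestream/js
cp hls.js p2p-engine.js /var/www/livestream/js/

# serve the whole directory from the mesh node; any static web server will do
cd /var/www/livestream && python3 -m http.server 8080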

@pdey (Collaborator) commented Apr 27, 2019

Updates on hlsjs-p2p-engine:
I tried to get hlsjs-p2p-engine to work fully offline.

What works

  • Video streaming is working, though the video.js library used in this project is deprecated.

What does not work

  • Found that the engine actually relies on a live CDN that takes care of p2p sharing and caching. The p2p sharing service is now disabled on this CDN, so we don't get any p2p sharing in offline mode.

Based on these findings, I have been thinking about a slightly different idea. Following is an outline (I will share the details soon as I work them out):

  1. Use https://github.com/videojs/http-streaming as the video player. This replaces the deprecated library (https://github.com/videojs/videojs-contrib-hls) used in hlsjs-p2p-engine.
    I have tried this on the mesh network, using ffmpeg for HLS video production, and it works fine (a rough sketch of the ffmpeg side follows this list).

  2. Work on simple, configurable reverse-proxy based caching using Raspberry Pis with good quality SD cards and external SSDs/HDDs to cache and serve HLS chunks. Need to think about the topology (one POP in every village, connected with the mesh?).
    This approach will partially take the load off the origin streaming server, though it's not a p2p solution. It may take some time to design and build a proper p2p sharing system, preferably using WebRTC and WebTorrent, which we can do later.

  3. A very simple, lightweight static file server running on an RPi + camera setup for producing HLS video. @m-anish has already shared some libraries that I will be testing soon.
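
As referenced in point 1, a minimal sketch of the ffmpeg side of HLS production (input, output paths and segment settings are illustrative placeholders, not the exact command used):

# pull an RTMP input (e.g. from OBS) and write HLS segments into a directory
ffmpeg -i rtmp://localhost/live/stream \
  -c:v libx264 -preset veryfast -c:a aac \
  -f hls -hls_time 4 -hls_list_size 6 -hls_flags delete_segments \
  /var/www/hls/stream.m3u8

# the simple static file server from point 3 can then serve /var/www/hls to the players
cd /var/www && python3 -m http.server 8080

On the player side, videojs/http-streaming only needs a video element whose source points at the resulting .m3u8 URL.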

@m-anish (Member Author) commented Apr 27, 2019

Prasenjit, I also had some conversations and thoughts around this. Let's talk soon!

@pdey (Collaborator) commented Jun 18, 2019

@m-anish: writing down the details of the OBS Studio + NGINX + HLS setup below. (Attachment: bundle.zip)

Install and set up NGINX

Download

sudo apt-get install build-essential libpcre3 libpcre3-dev libssl-dev
wget http://nginx.org/download/nginx-1.7.5.tar.gz
wget https://github.com/arut/nginx-rtmp-module/archive/master.zip

Extract

tar -zxvf nginx-1.7.5.tar.gz
unzip master.zip

Install nginx

cd nginx-1.7.5
./configure --add-module=../nginx-rtmp-module-master
(Note: we are not configuring nginx with SSL; not necessary for us, I suppose)
make
sudo make install

Download and copy NGINX init scripts

sudo wget https://raw.githubusercontent.com/JasonGiedymin/nginx-init-ubuntu/master/nginx -O /etc/init.d/nginx

sudo chmod +x /etc/init.d/nginx
sudo update-rc.d nginx defaults

Test the init script

sudo service nginx status # to poll for current status
sudo service nginx stop # to stop any servers if any
sudo service nginx start # to start the server

NGINX service configuration

sudo nano /usr/local/nginx/conf/nginx.conf

Remove all existing lines and paste in the contents of the attached file (nginx-conf.txt). Change the root and port in the http module and hls_path in the rtmp module to appropriate values.
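
For readers without the attachment, a minimal illustrative sketch of the kind of rtmp + hls configuration this module expects (this is not the attached nginx-conf.txt; paths and ports are placeholders and should be adjusted as described above):

sudo tee /usr/local/nginx/conf/nginx.conf > /dev/null <<'EOF'
worker_processes 1;
events { worker_connections 1024; }

rtmp {
    server {
        listen 1935;
        application hls {
            live on;
            hls on;
            hls_path /tmp/hls;   # hls_path: where segments are written
            hls_fragment 3s;
        }
    }
}

http {
    server {
        listen 8081;             # port referenced in the stream url below
        location /hls {
            types { application/vnd.apple.mpegurl m3u8; video/mp2t ts; }
            root /tmp;           # root: parent of the hls_path directory
            add_header Cache-Control no-cache;
            add_header Access-Control-Allow-Origin *;
        }
    }
}
EOF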

Test config file

sudo /usr/local/nginx/sbin/nginx -c /usr/local/nginx/conf/nginx.conf -t

Cross-domain config

sudo nano /usr/local/nginx/html/crossdomain.xml

<?xml version="1.0"?>
<!DOCTYPE cross-domain-policy SYSTEM "http://www.adobe.com/xml/dtds/cross-domain-policy.dtd">
<cross-domain-policy>
<allow-access-from domain="*"/>
</cross-domain-policy>

Restart

sudo service nginx restart
sudo service nginx status

Link the stream URL to OBS Studio

Stream server: rtmp://localhost/hls
Stream URL: http://localhost:8081/hls/<STREAM_KEY>.m3u8
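
As a quick sanity check of the above without OBS, something like this can be used (the input file and the stream key "test" are placeholders):

# push a test stream to the rtmp application defined in nginx.conf
ffmpeg -re -i test.mp4 -c:v libx264 -c:a aac -f flv rtmp://localhost/hls/test

# after a few seconds the playlist should appear; check it and play it back
curl -I http://localhost:8081/hls/test.m3u8
ffplay http://localhost:8081/hls/test.m3u8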

Attachment: nginx-conf.txt

@m-anish (Member Author) commented Jun 19, 2019

Work on this lives:

  1. In an IIAB playbook
  2. In a simple html webpage

@m-anish (Member Author) commented Jun 19, 2019

FWIW, something for the future: https://openvidu.io/

@m-anish (Member Author) commented Jul 19, 2019

This might be interesting too - https://github.com/arslancb/clipbucket

Looks like for Zanskar 2019 we are going with cham.

@mitra42 commented Nov 24, 2019

@m-anish There aren't any docs on that cham page:

  • is it online (live) only, or also for authoring podcasts?
  • how does it get installed - and is that easy on IIAB, or are there any concerns?

@m-anish (Member Author) commented Dec 17, 2019

Hi @mitra42, apologies for such a late reply.

For Zanskar, we went ahead with Cham, but a lot of changes were made to it while in Zanskar. @pdey and @so-ale probably have those in their personal repos/storage.

Also, as it turns out, there seems to be some issue with the Intel NUC (we don't know whether it is hardware or software at this point), and the network hasn't been operational for the past 3 weeks or so.

I shall shortly be getting the schoolserver, which is in transit from Zanskar and will also have the latest cham code commits. If @pdey or @so-ale can produce them, that'd be fine too, but either way I should be able to respond to you soon (hopefully within a week to ten days).

Cham was quite simple and straightforward, and actually there may be lots of room for adding complexity :)

It can be used for live streaming, and the live streams are also archived in various quality settings once the session is done.

I don't understand the 'authoring podcasts' question. The way it works is very simple:

Some user-end software like OBS is used to compose a stream. It can be live video, audio, or pre-recorded material, as long as OBS can handle it. OBS streams it to an RTMP endpoint, where cham, running on nginx, produces 3 different quality levels of 'live stream' while also recording them to disk. The live stream you get as a consumer depends on your connection bandwidth to the server.
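
Cham's actual config isn't reproduced here, but as a rough sketch of the idea, the transcoding step could look something like the following ffmpeg invocation (rendition names, bitrates and rtmp application names are illustrative only):

# take the incoming rtmp stream and republish it at three quality levels,
# which nginx-rtmp then segments into HLS playlists and records to disk
ffmpeg -i rtmp://localhost/live/stream \
  -c:a aac -c:v libx264 -vf scale=-2:720 -b:v 2000k -f flv rtmp://localhost/hls/stream_high \
  -c:a aac -c:v libx264 -vf scale=-2:480 -b:v 1000k -f flv rtmp://localhost/hls/stream_mid \
  -c:a aac -c:v libx264 -vf scale=-2:360 -b:v 400k -f flv rtmp://localhost/hls/stream_low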

All this works well, but everything is centralized, and in the future we'd like to look into making this more decentralized, or at least moving it beyond a single point of failure.

@mitra42 commented Dec 17, 2019

Thanks @m-anish, that sounds replicable, which would be great. I'm hoping our first deploy is going to be in a limited-internet school in Indonesia, and it looks likely they will want to integrate local content.

I look forward to a bit more detail on that setup; there are a bunch of new things there (OBS, rtmp, cham), which I can look up, but piecing them all together would be much quicker with some more details. I'm then interested in figuring out how to hook up the output for selective automated upload to the Internet Archive when the box sees an internet connection, and/or via sneakernet.

@pdey (Collaborator) commented Dec 18, 2019

@m-anish I think you will get the codebase soon. If you need it any earlier, I have a copy on one of my hard drives, but I am not sure if it has the latest snapshot.

@m-anish (Member Author) commented Jan 3, 2020

Hi @mitra42

So, I updated the cham repo to the latest codebase. But there is a catch. The installation of cham is really in two parts:

  1. The html frontend that displays the live video stream.
  2. The backend with rtmp, nginx etc.

I had written a cham playbook for iiab that does everything needed for a working setup, but there have been many changes to iiab since then, so I no longer know if my playbook will still work there.

My question to you would be: do you want this as part of IIAB or as a standalone setup? If it's the former, I'll work on updating the playbook, which I guess I need to do anyway. If it's the latter, I will pass you specific instructions to get everything working; in that case I'll need to know what hardware you're running on and your OS version.

Looking forward to your reply.

@mitra42 commented Jan 3, 2020

Thanks - I could see using it in either context (standalone or IIAB), so probably doing the IIAB one is best, especially since I think that is what you are using? With a working setup on IIAB I can always look at how to adapt it to a different setup if required.

@m-anish (Member Author) commented Jan 4, 2020

hmm... okay. Let me try an IIAB install on a VM and see how it goes. Give me a couple of days to iron out the proper instructions for you and make any necessary changes in the PR.

@m-anish (Member Author) commented Jan 5, 2020

@mitra42 I updated the PR that adds cham to IIAB.
iiab/iiab#1743

You can try a fresh iiab install. I tried on a VM with Ubuntu 19.10 and it seemed to work.

@mitra42 commented Jan 5, 2020

Will do; @holta wants me to do a test with the current version anyway, so I can do both at the same time.

@holta commented Jan 30, 2020

@m-anish if you can in the coming days, please help @mitra42 with his Cham installation question at iiab/iiab#2209?
