
How to save PeerTube from constant dying instances. #5783

Open
S7venLights opened this issue May 2, 2023 · 34 comments

@S7venLights

S7venLights commented May 2, 2023

Describe the problem to be solved

Many instances keep disappearing because they can't afford the server costs (bandwidth).

Describe the solution you would like

Torrent clients have existed for years and have reliably kept well-loved content up and available. That's because every user is 'donating' storage and bandwidth.
PeerTube only does this between people playing the same video in a browser at the same time, or between instances.
I propose a PeerTube app that works essentially like any torrent client/app.
It could cache a user-chosen number of gigabytes of video and seed it for as long as the person runs the app.

For it to work well, it would help to have a more centralised instance so that more seeders exist, but I guess the seeding could just work across all instances.
Additionally, if magnet links are used, a content creator could always seed their own content, keeping it available even if it has few seeds or an instance goes down, since the PeerTube app could save the magnet links when someone subscribes to a channel.

As a result:
Popular videos would always be bandwidth-backed.
And perhaps, to solve storage costs, a content creator could opt to host the video data solely on their own machines/servers/seedbox while still having their channel hosted on the instance server. In that case, the instance acts as the torrent directory website for the creator, but the data and bandwidth are provided by creators and users.
It would also reduce overall bandwidth use: if users re-watch a video that is still in their seeding cache days later, it could play locally and just update the watch count on the instance.
As for creators wanting to delete or update videos, they could do so in the app; that would signal the hosting instance to update the details, and the instance could signal all other seeders to sync the changes.
Torrent networks and apps are well established and tested.
Honestly, this would be a win...

I can't take full credit for this idea; it was partly inspired by https://autotube.app/
Additionally, we could allow people to download, watch and seed videos with regular torrent clients, increasing the number of seeders and their seeding uptime.

Alternative idea:
Use a similar setup to https://storj.io/, where people volunteer to host data on a distributed P2P network with redundancy built in. (I suppose IPFS works similarly?)
This way, even if a content creator went offline forever, their videos would still be seeded/hosted by other nodes.

@S7venLights
Author

There seems to be more convo about this here and #494

@alejandrocobo

alejandrocobo commented May 11, 2023

As an admin of two PeerTube instances, for me the storage is the most costly resource. In the end, it is easy to distribute the content among other instances to share the bandwidth, so I don't see a problem with the cost of bandwidth. But since every video on a particular instance must be stored locally, storage causes a high hardware/service cost.

I don't think there is a good solution for it. Distributed storage is extremely slow, and it is not 100% guaranteed that the content is going to be there forever.

Leaving aside the physical cost of maintenance, I think there tends to be a lack of planning when somebody sets up a new instance. Maintaining a PeerTube instance has a high cost, and that cost is going to be permanent. If you allow registrations, that cost may grow at least linearly, and in some cases it could grow exponentially.

I understand that in some cases the growth rate of an instance is unsustainable. When success is hard to predict, death by success is unavoidable and is nobody's fault. In other cases, admins simply don't set a financial plan for their instances. An instance with open registration is going to cost x TB of storage per year. The admin has to have the monetary resources available for a bad-case scenario; otherwise, the instance is going to close registrations or shut down.

Then, there are some strategies to save storage. For example, you can delete resolutions above 1080p in videos larger than 1 GB that are older than 6 months and have fewer than 5 views. In some cases, that could save several TB with minimal service degradation. Maybe PeerTube could show a list of videos matching certain criteria (size, age, total views, views in the last month, max resolution, etc.) to ease the job.
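
As a rough illustration of that selection rule (not PeerTube's actual API — the video objects and field names here are hypothetical), the filter could look like this:

```javascript
// Sketch: pick videos whose extra resolutions could be pruned, per the rule
// above (size > 1 GB, older than 6 months, fewer than 5 views, and something
// above 1080p to drop). Field names are made up for illustration.
const SIX_MONTHS_MS = 182 * 24 * 60 * 60 * 1000;
const ONE_GB = 1024 ** 3;

function prunableVideos(videos, now = Date.now()) {
  return videos.filter(v =>
    v.sizeBytes > ONE_GB &&
    now - v.publishedAt > SIX_MONTHS_MS &&
    v.views < 5 &&
    v.resolutions.some(r => r > 1080)  // has a resolution above 1080p to delete
  );
}

// Example: one old, large, rarely watched 4K video qualifies; a fresh one doesn't.
const videos = [
  { name: 'a', sizeBytes: 2 * ONE_GB, publishedAt: 0,          views: 3, resolutions: [1080, 2160] },
  { name: 'b', sizeBytes: 2 * ONE_GB, publishedAt: Date.now(), views: 3, resolutions: [1080, 2160] },
];
console.log(prunableVideos(videos).map(v => v.name)); // → [ 'a' ]
```

The same criteria could of course feed the admin-facing list suggested above rather than delete anything automatically.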

My opinion is that the current bandwidth/storage architecture of PeerTube is fine. Its scalability can still improve, but overall I don't see a clearly better alternative.

Maintaining a PeerTube instance is expensive, and that is something whoever wants to set up a new instance has to know.

@ROBERT-MCDOWELL
Contributor

ROBERT-MCDOWELL commented May 11, 2023

@S7venLights
It forces us to completely rethink internet technology and research how and where to store data. Nothing stops us from inventing a better tech to get rid of datacenters, overseas cables and the data monopoly of those who have more money (printing) or control politics by corruption. The best would be to save and fetch data instantly over a decentralized network that goes through neither datacenters nor backbone cables... "Imagine all the people..." ;)

@S7venLights
Author

@alejandrocobo
Thanks for that perspective on storage costs.

I think the idea of a creator seeding their own content like a torrent app, with their viewers also seeding, could solve the storage issue though; an instance could opt to act as a magnet host in that case.
It could be an optional thing for the instance and/or channel creators.

You said distributed storage is slow, but storj.io has managed to make it very fast, and even decent-quality torrents can be streamed (see the WebTorrent app, BiglyBT on Windows, Stremio, or streaming torrents with the Brave browser).

From a content creator's perspective, dying instances have been a big pain point for me. Something could be done.

@S7venLights
Author

@ROBERT-MCDOWELL Sorry I'm afraid I don't understand what you're trying to say…

@ROBERT-MCDOWELL
Contributor

@S7venLights
because you're still not ready I think...

@ghost

ghost commented May 25, 2023

As an admin of two Peertube instances, for me, the storage is the most costly resource

Big +1 here. As a server admin, the problem is having a reliable and low latency place to store lots of data. As a creator or fan, the problem is reliable hosting of the video you care about.

Both are problems of quality, not just quantity. I don't really care if a bunch of enthusiasts occasionally mirror popular videos. That doesn't help with the problem of quality, trustworthy, low latency long term storage. Random people who flock to popular videos aren't a reliable storage system for the specific things that matter to you and your community.

I'd rather see us find ways to develop trusted peerages between servers, something where friendly administrators can provide that trusted, low-latency storage to each other. Trustless and fully distributed networks are a nice add-on when you can get them, but I don't think we can ever use that for primary storage.

@S7venLights
Author

I always assumed people self-hosted PeerTube on something like a NAS or PC at home, in which case a couple of terabytes of hard disk is not that costly, but I suppose people are using VPSes, where storage costs more?

In any case, my original idea stands: torrents have worked reliably for decades, so no reason why they can't help PeerTube too, right?

@ghost

ghost commented May 27, 2023

no reason

PeerTube already supports torrents, but as you can see in PeerTube they aren't the best fit for videos that are large or dynamic. The other big problem is that torrent infrastructure is largely incompatible with the web and WebTorrent does not solve that, it just creates a parallel (and much smaller) ecosystem.

Fundamentally this is not a technical problem, it's a social problem. All technical things break down without human labor, and a durable way to store video online is a social ritual more than it's a technical object.

@S7venLights
Author

S7venLights commented May 28, 2023

Yesterday I streamed an 18 GB video torrent using www.stremio.com (an Electron app/website for streaming torrents).

As to the social problem: I get that, but people have been seeding torrents for years; it's a social expectation that you should seed.
If there were a PeerTube app that explained the option to seed and ran in the background, I reckon people would do it.

@ghost

ghost commented May 28, 2023

Yesterday I streamed an 18GB Video Torrent

Torrent support is not hypothetical in PeerTube, please just try it out instead of bragging about whatever this is. It takes a long time to start playing long streams, and it's not suitable for live broadcasts.

@S7venLights
Author

Oh, I wasn't bragging. I just didn't understand what you meant when you said "in PeerTube they aren't the best fit for videos that are large or dynamic". I thought: if you can stream a high-quality movie via torrents, and in a web browser, then why isn't it a good fit?
Since you say "It takes a long time to start playing long streams, and it's not suitable for live broadcasts", I'm starting to follow that torrent streaming doesn't perform well enough.
I guess the wait for a video to start would bother people.
I'd still rather have a waiting time for a video to start, though, than have instances/my channels disappear without warning and lose hours of my life re-uploading.
Seeding your own channel via torrents seems to me a useful solution for the content creator that would bolster bandwidth alongside the users' contributions; perhaps it could at least act as a fallback.
If/when an instance dies, your seeds and trackers could still exist on your PC and on the machines of those who have watched your videos, and the proposed PeerTube app could keep a local backup of your channel identity/authentication, comments, video descriptions, likes, etc. The content creator could then sign up to a new instance and make their channel available online again instantly (albeit on a new domain).
I'm just thinking out loud and not sure how it would all work technically, but my goal is simply a solution to the problem that makes PeerTube unsustainable, because it is the best video hosting concept out there if not for dying instances.

@S7venLights
Author

To quote my original post: And perhaps to solve storage costs, a content creator could opt to solely self host the video data on their machines/servers/seedbox but still have their channel hosted on the instance server.

In this case it would be the user's choice to have videos start slower.

@ghost

ghost commented May 29, 2023

I'm just thinking out loud and not sure how it would all work technically, but my goal is simply a solution to a problem that makes PeerTube unsustainable

It's a good motivation; I'm just trying to say this is not a new idea, nor useful on its own. There are a lot of angles, which makes it easy to get distracted, but here are three basic things to consider:

  1. Peertube literally already supports torrents. Please try them out at least, and understand what's good and what isn't. A huge limitation is that classic torrents and web browsers are not using compatible network protocols. At some point a bridge needs to run or a compromise in functionality is made, and when you look closely at that specific point in the design it'll be hard to identify advantages over just running a regular https site.

  2. "I can download a movie" is not the quality bar we are going for. HLS support in PeerTube has been super useful for hosting streams. The use case is not the same as torrenting, and folks expect to follow a link to a timecode in a four hour videogames stream and have it load in seconds. With classic torrents you'd be downloading hundreds of megs of metadata you don't need and connecting to loads of unreliable servers over a protocol the browser doesn't itself support.

  3. Your motivations are pointed in the right direction but this is a fundamentally and deeply hard problem. Data storage requires ongoing human effort, and data availability requires consistency and trust. This is all difficult to coordinate in a truly distributed and low trust environment, and the world's full of hucksters who want you to centralize around their particular shortcut.

sincerely,
-beth

@mskiptr

mskiptr commented May 31, 2023

@scanlime Hi, could I ask you a few questions on how WebTorrents work with PeerTube and how they could be optimized?

  1. I thought that PeerTube always used WebTorrent, but here you're suggesting that the servers will happily use HTTP instead. What's up with that?
  2. You've mentioned HTTP Live Streaming. Is that only for livestreams? That is a very distinct use than just watching a video (which is also very distinct from just downloading a video), so no wonder it would require a different approach. Also, wouldn't using multicast decrease the server load here by a lot?
  3. I've just watched a new 16 minute video on TILVids. It was 161MB total, with 102MB downloaded from peers and 59MB from servers. It started reasonably quickly, and both the servers and peers were contributing during the first ~5 minutes. Then it paused for a short while, and afterwards traffic only came from servers. Only around halfway through the video did the peers start contributing again; then they took over until the end. The total peer count steadily increased until reaching 39 at the end. It seems like there were some inefficiencies in the scheduling and that only a small fraction of the peers were really active. Most of the time there was no traffic – neither up nor down.
  • Regular Torrents tend to be downloaded as quickly as possible. I understand that this would increase the server load, but shouldn't peers still behave this way to minimize any buffering pauses? Also, this would allow for seeding larger portions of the video from the very beginning.
  • When I finished watching, I had uploaded 35MB. That's a seed ratio of only about .21. But it's definitely not insignificant, and neither is the peer:server:total download ratio. I've left the tab open and now I'm at 166MB of uploaded content. If I had closed that tab, over 130MB more would have had to be served from elsewhere. This suggests to me that seeding all watched videos in the background would be a significant bandwidth saver for the servers. I doubt it's possible to achieve with just JavaScript, so maybe a simple browser extension for that would be a good idea?
  4. You've said that storage is much more expensive than bandwidth for typical servers. I know there are existing strategies for dealing with that (like removing resolutions), but wouldn't the solution that @S7venLights mentioned (i.e. removing the content altogether, but keeping the metadata + letting the author host|seed it themselves) be better than going broke (and thus worth implementing)?
  5. I understand that relying solely on peers and no servers is bad for latency. Couldn't servers then handle only the very beginning of a video and let the peers provide everything else? This would also allow for doing a lighter version of (4) – the server would host metadata and the first few dozen seconds while the author (and any other peers) could provide the rest on their own. As for seeking, there's (3.1) or just buffering for a while.
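
The numbers in point 3 can be sanity-checked quickly:

```javascript
// Sanity check of the figures in point 3: 161MB watched in total,
// 102MB of it from peers, 59MB from servers, 35MB uploaded back.
const totalMB = 161, fromPeersMB = 102, fromServersMB = 59, uploadedMB = 35;

console.log(fromPeersMB + fromServersMB === totalMB);          // → true
console.log((uploadedMB / totalMB).toFixed(3));                // → 0.217 (the "about .21")
console.log(Math.round(fromPeersMB / totalMB * 100) + '%');    // → 63% of playback served by peers
```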

@ghost

ghost commented May 31, 2023

@scanlime Hi, could I ask you a few questions on how WebTorrents work with PeerTube and how they could be optimized?

1. I thought that PeerTube always used WebTorrent, but here you're suggesting that the servers will happily use HTTP instead. What's up with that?

There are two storage and distribution backends in PeerTube. The older one uses WebTorrent to distribute a single MP4 file per resolution. It's slow to start because the MP4 index (the moov atom) must be fully downloaded before the client can seek at all. The newer one uses HLS (HTTP Live Streaming) plus a client-side strategy for peer-to-peer data sharing over WebRTC.

The two backends have different features, and admins can choose which of the two to enable or you can use both at the cost of doubling your storage.

The main practical benefit of each: the WebTorrent backend uses a single regular MP4 file per resolution (with HLS we use a fragmented MP4 format), so WebTorrent mode offers the best compatibility with downloads and with unusual client software, whereas the HLS format is supported by fewer (non-browser) programs but can start streaming very quickly in browsers.
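
For a concrete picture of the granularity difference, an HLS media playlist for one resolution looks roughly like this (filenames and durations here are made up for illustration); each few-second `.m4s` segment can be fetched, and shared with peers, independently, whereas the WebTorrent backend exposes one monolithic MP4 per resolution:

```
#EXTM3U
#EXT-X-VERSION:6
#EXT-X-TARGETDURATION:4
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-MAP:URI="1080p-init.mp4"
#EXTINF:4.000,
1080p-000.m4s
#EXTINF:4.000,
1080p-001.m4s
#EXTINF:4.000,
1080p-002.m4s
#EXT-X-ENDLIST
```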

2. You've mentioned HTTP Live Streaming. Is that only for livestreams? That is a very distinct use than just watching a video (which is also very distinct from just downloading a video), so no wonder it would require a different approach. Also, wouldn't using multicast decrease the server load here by a lot?

It's not just for livestreams, HLS is the default way we distribute all video when it's enabled.

3. I've just watched a new 16 minute video on TILVids. It was 161MB total, with 102MB downloaded from peers and 59MB from servers. It started reasonably quickly, and both the servers and peers were contributing during the first ~5 minutes. Then it paused for a short while, and afterwards traffic only came from servers. Only around halfway through the video did the peers start contributing again; then they took over until the end. The total peer count steadily increased until reaching 39 at the end. It seems like there were some inefficiencies in the scheduling and that only a small fraction of the peers were really active. Most of the time there was no traffic – neither up nor down.

I can't debug your network without more information. For some context, tilvids uses HLS mode and you can see this as it's downloading m3u8 files in your browser's network debug panel.

The "peers" you're seeing could be servers offering redundancy via a totally different (http based) method, or they could be clients using the webrtc-based P2P loader. The UI doesn't give you a way to see the difference easily, but again you can check your browser's network panel and see what it's doing.

* Regular Torrents tend to be downloaded as quickly as possible. I understand that this would increase the server load, but shouldn't peers still behave this way to minimize any buffering pauses? 

There is nothing magical about using WebTorrent. It's effectively the same transport at the end of the day, https for seeds and WebRTC for browser peers.

4. You've said that storage is much more expensive than bandwidth for typical servers. I know there are existing strategies for dealing with that (like removing resolutions), but wouldn't the solution that @S7venLights mentioned (i.e. removing the content altogether, but keeping the metadata + letting the author host|seed it themselves) be better than going broke (and thus worth implementing)?

How would this be different than the author running a web server?

5. I understand that relying solely on peers and no servers is bad for latency. Couldn't then servers handle only the very beginning of a video and let the peers provide everything else? This would also allow for doing a lighter version of (4) – the server would host metadata and the first few dozen seconds while the author (and any other peers) could provide the rest on their own. As for seeking, there's (3.1) or just buffering for a while.

One of the root causes of the performance inconsistency you've already witnessed is that these random peers have no performance guarantees. So how long do you give them to respond? There's an incremental path forward here, in which the whole system gets better at estimating how fast each peer is and at connecting people with reliable sources of data for a quick start. This likely involves bug fixes in the existing client code as well as new protocol features to let clients get bootstrapped with information about peer performance.
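
A minimal sketch of that estimation idea, assuming a made-up `PeerStats` helper (none of this is PeerTube code): keep a moving average of each peer's observed throughput and prefer proven-fast sources for the first segments.

```javascript
// Sketch: exponentially weighted moving average of per-peer throughput,
// used to pick the most reliable source for a fast start.
// Class and method names are invented for illustration.
class PeerStats {
  constructor(alpha = 0.3) {
    this.alpha = alpha;
    this.rates = new Map(); // peerId -> bytes/sec estimate
  }
  // Record one completed transfer from a peer.
  record(peerId, bytes, seconds) {
    const rate = bytes / seconds;
    const prev = this.rates.get(peerId);
    this.rates.set(peerId, prev === undefined ? rate : prev + this.alpha * (rate - prev));
  }
  // Pick the peer with the best current estimate (null if none seen yet).
  fastest() {
    let best = null;
    for (const [id, rate] of this.rates) {
      if (best === null || rate > best.rate) best = { id, rate };
    }
    return best && best.id;
  }
}

const stats = new PeerStats();
stats.record('origin-server', 4_000_000, 1); // 4 MB/s
stats.record('browser-peer', 300_000, 1);    // 0.3 MB/s
console.log(stats.fastest()); // → origin-server
```

A real protocol feature would also need to share such estimates between clients so a fresh viewer doesn't start from zero.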

You didn't ask about this mechanism, but the most successful "peering" mechanism in PeerTube is the redundancy offered between servers that follow each other. This works well, since clients can always get a fast start and these other PeerTube servers tend to have pretty reliable connectivity. It would be cool if we could focus more on polishing what works. Torrents have a flashy brand, but they're really last-generation compared to the peering mechanisms PeerTube supports.

@mskiptr

mskiptr commented May 31, 2023

Thank you for a detailed response. It's quite enlightening.

  • Regular Torrents tend to be downloaded as quickly as possible. I understand that this would increase the server load, but shouldn't peers still behave this way to minimize any buffering pauses?

There is nothing magical about using WebTorrent. It's effectively the same transport at the end of the day, https for seeds and WebRTC for browser peers.

What I meant here is, the current buffering strategy seems to load content x seconds forward and then wait until the user has watched some chunk of that. Instead it would be beneficial to buffer the whole video when (otherwise idle) peers are around.

  4. You've said that storage is much more expensive than bandwidth for typical servers. I know there are existing strategies for dealing with that (like removing resolutions), but wouldn't the solution that @S7venLights mentioned (i.e. removing the content altogether, but keeping the metadata + letting the author host|seed it themselves) be better than going broke (and thus worth implementing)?

How would this be different than the author running a web server?

Here, author = content creator, so not necessarily the person hosting the instance.
My point there is that very niche videos could still be served (e.g. from the author's own PC|NAS) while still being listed on the original instance. No need to set up a web server and migrate over, or to get removed due to overly high professional hosting costs.

the most successful "peering" mechanism in PeerTube is the redundancy offered between servers that are following each other.

Well, judging by that new and hot video I've mentioned, it only took me not closing the tab to lighten the load on the servers by over 166MB. That's more than the video itself weighed.

Oh, now I've noticed this. It seems like we're all mixing three distinct problems and two distinct – but related – solutions. The problems: lessening the bandwidth costs, lessening the storage costs, and preventing videos from being deleted because of those costs. The proposed solutions: making viewers seed long-term by default, and letting creators seed videos instead of the original instance.

Torrents have a flashy brand but they're really last generation as far as the peering mechanisms peertube supports.

Yeah, I'm not married to torrents. It just looks to me that a huge amount of server bandwidth could be saved by making viewers be peers on watched videos in the longer term.

@ghost

ghost commented May 31, 2023

What I meant here is, the current buffering strategy seems to load content x seconds forward and then wait until the user has watched some chunk of that. Instead it would be beneficial to buffer the whole video when (otherwise idle) peers are around.

This is a configuration choice in the JavaScript client code. The drawback of buffering huge amounts of video is client-side storage. I don't know if the client currently changes its buffering strategy based on what peers exist, but I suspect not.

If you want to know more about this, the place to look is the Novage HLS P2P loader project and the specifics of PeerTube's use of it.
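
To make that trade-off concrete, here is a toy policy sketch; the function, its inputs, and every threshold are invented for illustration, not taken from the Novage loader:

```javascript
// Toy buffering policy: keep the normal short look-ahead by default, but
// extend the target to "the whole video" when idle peers are available and
// client-side storage permits. All names and thresholds are made up.
function bufferTargetSeconds({ idlePeers, freeStorageMB }) {
  const DEFAULT_AHEAD = 30;     // normal look-ahead, in seconds
  const WHOLE_VIDEO = Infinity; // i.e. buffer everything we can
  if (idlePeers > 0 && freeStorageMB > 500) return WHOLE_VIDEO;
  return DEFAULT_AHEAD;
}

console.log(bufferTargetSeconds({ idlePeers: 3, freeStorageMB: 2000 })); // → Infinity
console.log(bufferTargetSeconds({ idlePeers: 0, freeStorageMB: 2000 })); // → 30
```

The real question, as noted above, is whether the client can even know which peers are idle; today it likely cannot.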

Here, author = content creator. So not necessarily the person hosting the instance. My point there is that very niche videos could still be served (e.g. from the author's own PC|NAS), while still being listed on the original instance. No need for setting up a web server and migrating over or getting removed due to too high professional hosting costs.

My point was that the whole point of federation is that it doesn't much matter which server a video is on. So if a creator wants to host their own video, they can already just run a server.

Maybe the feature you'd actually end up needing in this case is a way for an existing user to suggest a server follow to the admin and to migrate their channel there. Account migration in general would be a really handy feature, especially for cases where a creator wants to graduate to their own hosting.

Yeah, I'm not married to torrents. It just looks to me that a huge amount of server bandwidth could be saved by making viewers be peers on watched videos in the longer term.

Really depends on the server's traffic pattern. The long tail is where a lot of the resources end up going on a big and eclectic server. If you're only worried about the very latest and hottest video then.. maybe? It's interesting but it's not obvious to me that this solves a real problem. And it certainly doesn't solve reliable long term primary hosting.

@mskiptr

mskiptr commented May 31, 2023

Really depends on the server's traffic pattern. The long tail is where a lot of the resources end up going on a big and eclectic server. If you're only worried about the very latest and hottest video then.. maybe? It's interesting but it's not obvious to me that this solves a real problem. And it certainly doesn't solve reliable long term primary hosting.

I guess a large part of what I'm thinking about could just be achieved by caching videos in localstorage (and seeding them when any PeerTube tab is opened). How viable is that?

@ghost

ghost commented May 31, 2023

Really depends on the server's traffic pattern. The long tail is where a lot of the resources end up going on a big and eclectic server. If you're only worried about the very latest and hottest video then.. maybe? It's interesting but it's not obvious to me that this solves a real problem. And it certainly doesn't solve reliable long term primary hosting.

I guess a large part of what I'm thinking about could just be achieved by caching videos in localstorage (and seeding them when any PeerTube tab is opened). How viable is that?

You'd be limited to caching and serving videos from the same domain, but yeah maybe? It's not clear to me that this provides enough bandwidth and reliability to outweigh the additional heaviness that every client would experience.

The reason I'm a lot more interested in the server-to-server peering for basic reliability is that the servers already start out reliable and relatively flush with resources. A browser client is unlikely to stick around long and we already keep it quite busy with the work of playing (and remuxing, and hashing, and sharing) one video.

@S7venLights
Author

My point was that the whole point of federation is that it doesn't extremely matter what server a video is on. So if a creator wants to host their own video they can already just run a server.

My proposed solution was simpler than this: a PeerTube app that seeds your own videos and a certain number of watched videos would be much easier for an end user to set up.
But it seems you're saying that won't solve the dying-instance problem, as the bandwidth and storage savings are unreliable.
Realistically, I think PeerTube will remain difficult to grow and sustain if public instances aren't surviving. But I suppose that if you want the advantages of PeerTube, you have to be willing to put in the work of self-hosting.

@S7venLights
Author

S7venLights commented Jun 3, 2023

Is PeerTube really any more efficient than other video hosting platforms if, most times you watch a video, you aren't watching it with any peers and the servers do all the work?
Wouldn't my idea scale up the amount of content seeded and make peers more reliable?
Maybe the PeerTube app could even be a progressive web app that caches some videos and runs in the background, 'seeding' via WebRTC.

@ghost

ghost commented Jun 3, 2023

Would not my idea scale up the amount of content seeded and make peers more reliable?

Content "seeding" here is literally just serving a file over https.

If I understand your idea, it's just that desktop PCs run an app to serve videos? This already exists; it's a web server. PeerTube is one option, but a creative person who wants to host video can use literally any web server. You also need a routable IP, and you need DNS. Don't have a static IP? There's dynamic DNS. Don't have ports 80 and 443? Now you need a gateway of some sort, because you can't interoperate directly with clients. And as soon as you need a gateway, the situation you're in is no better than if the gateway were serving the video itself.

@S7venLights
Author

S7venLights commented Jun 17, 2023

🤨 I don't know what you mean by all that. My idea was a PeerTube app that works like any torrent app: install, download, seed. Three simple steps that require none of what you just mentioned.

@ghost

ghost commented Jun 17, 2023

Well it would be trivial to make a downloadable app that claims to "seed" without doing anything else, but I'm saying it's more complicated than this to put content on the internet in a way that's actually useful to web browser clients.

In your easy-button scenario where the user downloads a thing and does zero network configuration, even UDP based P2P will be hit-and-miss. If your torrent app fails to open ports or do NAT hole punching, all your computer can do is hold data, it can't directly accept connections from anyone else. How does a web browser connect, and connect fast enough that users won't give up?

@S7venLights
Author

I don't have the coding or network knowledge to answer that question. But I imagine it could be done between everyone using the peertube app in the same way torrent clients do it.
Using the app would become the preferred advertised way of using PeerTube

@ghost

ghost commented Jun 17, 2023

I don't have the coding or network knowledge to answer that question. But I imagine it could be done between everyone using the peertube app in the same way torrent clients do it. Using the app would become the preferred advertised way of using PeerTube

I'm not asking questions for my own benefit, these are research leads if you care to take them.

One thing to look into is how torrent clients accept incoming connections. Look at how this works on different types of internet service.

If you're proposing an app-only p2p system those already exist but it seems unlikely peertube would go in that direction. If you want to interact with web browsers, I suggest you look into what types of network connections web browsers can make.

@LoveIsGrief
Contributor

LoveIsGrief commented Jun 18, 2023

I think this issue keeps popping up. There seems to be a large amount of goodwill among people to share bandwidth or maybe even storage without having to host a full instance, but the idea has never really been accepted by the maintainers.

Regarding bandwidth, I think the only ways forward are:

  • maintainers implement HLS on top of WebTorrent (not sure if that was ever attempted)
  • somebody writes a library for PeerTube's custom WebRTC P2P HLS implementation, which is seemingly entirely undocumented and could possibly be an unstable moving target

Regarding storage, IPFS has been brought up (#494) but the official line seems to be "use S3". It's not clear to me if the storage layer can be easily extended by a plugin. My last perusal of plugin capabilities seemed to indicate "no".

Unless I'm missing something, maintainers would have to prioritize this issue and actively guide contributors or provide interfaces for them in order to solve it. Without their willingness, one would have to wait until it becomes critical, for it to get on their radar, and be perceived as an issue by them OR for somebody to work around the maintainers and write a custom solution (possibly in a fork, a distributed storage with an S3 frontend + ipfs player plugin, etc.)

@ghost

ghost commented Jun 18, 2023

never really been accepted by the maintainers

Can you provide more information? It's not my understanding that anything real is being rejected here, only that a lot of enthusiasm is being spilled by the community in ways that ignore internet architecture.

"seeding webtorrent" is literally just running an https server
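(To unpack that claim a little: BitTorrent's web seeding extension, BEP 19, lets a plain HTTP(S) server that holds the file act as a permanent seed, and a magnet link can advertise it alongside a WebSocket tracker. The hash, tracker, and URL below are made-up placeholders; parameters are URL-encoded and split across lines for readability:)

```
magnet:?xt=urn:btih:0123456789abcdef0123456789abcdef01234567
  &dn=my-video.mp4
  &tr=wss%3A%2F%2Ftracker.example.org
  &ws=https%3A%2F%2Ffiles.example.org%2Fmy-video.mp4
```

A browser client joins swarm peers via the `tr` tracker and can always fall back to plain HTTPS range requests against the `ws` web seed.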

both the P2P strategies (webtorrent, novage hls loader) are javascript libraries that use webrtc. The transport characteristics aren't much different. The difference between HLS and WebTorrent on PeerTube is not so much about the transport itself and more about the granularity of data that's sharable, with webtorrent working on files and HLS on segments of a few seconds long.

@ghost

ghost commented Jun 18, 2023

I'll add that threads like this are frankly extremely frustrating to participate in. If you aren't getting the feedback you want from developers, it's not because of some conspiracy against good ideas; it's because it is tiring to constantly lead someone to water who will not drink.

@LoveIsGrief
Contributor

@scanlime

Can you provide more information? It's not my understanding that anything real is being rejected here

You can read the links.

both the P2P strategies (webtorrent, novage hls loader) are javascript libraries that use webrtc

yes, JS libs that require a chromium instance

The difference between HLS and WebTorrent on PeerTube is not so much about the transport itself and more about the granularity of data that's sharable, with webtorrent working on files and HLS on segments of a few seconds long.

The transport is important because of the clients that support it. Webtorrent has now established itself for use by clients beyond JS thanks to libtorrent (arvidn/libtorrent#4123). It can now be used in python, java, golang and node. https://github.com/Novage/p2p-media-loader did good work, but it's entirely new, requires chromium due to its dependency on simple-peer (see the bug I raised, feross/simple-peer#866), and works differently from webtorrent. So, no, it's not just "literally running an https server".

webtorrent working on files and HLS on segments of a few seconds long

HLS chops up an MP4 into multiple segments and can do so for different bitrates of the file. It's based on M3U8 playlists that list the segments, and the player can load segments of a different bitrate depending on the user's internet speed (adaptive bitrate streaming).
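To make that concrete, here is a minimal hypothetical pair of playlists (paths, bitrates and durations are made up). The master playlist points at one media playlist per resolution; each media playlist lists the actual segments:

```
# master playlist (list.m3u8)
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
360p/list.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1400000,RESOLUTION=842x480
480p/list.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2800000,RESOLUTION=1280x720
720p/list.m3u8

# media playlist (720p/list.m3u8)
#EXTM3U
#EXT-X-TARGETDURATION:4
#EXTINF:4.0,
segment_000.ts
#EXTINF:4.0,
segment_001.ts
#EXT-X-ENDLIST
```

The player downloads the master playlist first, picks a variant based on measured throughput, then fetches segments from that variant's playlist, switching variants as bandwidth changes.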

Webtorrent is just (as the name implies) a torrent that can be retrieved in the web browser. If you've used torrents before, you'll know that a torrent can contain multiple files, which can be downloaded in any order, and byte ranges of the files can be requested at any point. That means HLS can be implemented over webtorrent:

  • create a torrent that contains all the segments in folders prefixed by the resolution, e.g.
list.m3u8
360p/
 list.m3u8
 *.mp4
480p/
 list.m3u8
 *.mp4
720p/
 list.m3u8
 *.mp4
  • retrieve the webtorrent with a videojs plugin (if I'm not mistaken this project does use videojs) and request the root playlist
  • create adaptors in the plugin for retrieving requested chunks in the webtorrent

Users can then use a script like mine to retrieve and reseed the webtorrents, all without going through the hassle of running a peertube instance.

if you aren't getting the feedback you want from developers it's not because of some conspiracy against good ideas

Nowhere do I say there's a conspiracy. Here's what I wrote

Unless I'm missing something, maintainers would have to prioritize this issue and actively guide contributors or provide interfaces for them in order to solve it. Without their willingness, one would have to wait until it becomes critical, for it to get on their radar, and be perceived as an issue by them OR for somebody to work around the maintainers and write a custom solution (possibly in a fork, a distributed storage with an S3 frontend + ipfs player plugin, etc.)

In short, it's not their priority right now (they probably have more important stuff to do) and they'll cross that bridge once they get to it.

As you can see in this comment I proposed taking a stab at IPFS storage, but the response was cautious. In the end I didn't go through with a POC, as it would've been a large investment for a "maybe", which is also why I dropped the issue of HLS webtorrents.

@ghost

ghost commented Jun 18, 2023

You can read the links.

I'm not low on information here; I'm trying to communicate across some kind of gap. There's something you're missing, but I'm not sure how to communicate it.

Fundamentally when you open a web page with a video on it, the web browser needs fast access to that video.

Technologies that a web browser can't use directly or that won't resolve content quickly may be useful as an additional storage strategy but they can't be the primary storage for a video.

If you're still looking at IPFS, that's also rather unhelpful through the lens of these two constraints. PeerTube needs its primary storage to be accessible by web browsers, period. IPFS gateways are... web servers. WebTorrent seeds are... web servers.

@S7venLights
Author

S7venLights commented Jul 10, 2023

Dammit! 3rd time I've lost a channel to PeerTube instances dying without warning, and this one had hours of work! 😾

https://www.orion-hub.fr/
"Due to a suspension of our dedicated server by our provider for Spam Emails, reported by UCEPROTECT and caused by our Mastodon instance, we are no longer able to offer you our PeerTube instance Orion-Hub.fr...

We are sorry for the inconvenience this causes, thank you for your understanding."

They claimed they had good funds and would keep it active always too!

PeerTube needs account migration (#549), or some solution.

@snan

snan commented May 9, 2024

I guess #6388 overlaps with this a bit, but the idea there would be to be able to seed from a headless, command-line-only server.
