Quickstart fails, valid config/config.yaml is required
(and other mattdsteele questions)
#52
Comments
Another note: while building the project it failed as my machine didn't have

Probably worth mentioning that dependency in the quickstart.
After building from source and running, I'm getting the same error as when running Dockerized:

```
valid config/config.yaml is required
```

Which makes me think the issue is in my config file; I'm just not sure what's causing it. The full file I'm testing is in `config/config.yaml`:

```shell
root@ubuntu-s-1vcpu-1gb-sfo2-01:~/owncast# cat config/config.yaml
```
You're the first to use the Dockerized setup for anything 🥇 I'll have to update the documentation and code. We recently changed the config to just be in the root of the project instead of
General question as I'm provisioning hardware: about how much disk is required to serve locally? I'd prefer not to have to use cloud storage just to keep things simple, and I've got Cloudflare in front so bandwidth isn't as much of an issue. Say I set two stream qualities, with bitrates at

Also, do the files in
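The specific bitrates in the question above were lost in extraction, but the disk estimate itself is simple arithmetic: bytes on disk per variant is roughly bitrate times the total duration of the segments being retained. A sketch with placeholder numbers (these are illustrative values, not Owncast defaults):

```javascript
// Back-of-the-envelope disk usage for one HLS variant:
// bytes ≈ (bitrate in kbps * 1000 / 8) * segment length in seconds * segments kept.
// All numbers used with this are illustrative, not Owncast defaults.
function hlsDiskBytes(bitrateKbps, segmentSeconds, segmentsRetained) {
  return (bitrateKbps * 1000 / 8) * segmentSeconds * segmentsRetained;
}

// e.g. an 800 kbps variant with 4-second segments, keeping 5 segments:
// 100000 B/s * 4 s * 5 segments = 2,000,000 bytes (~2 MB)
```

Summing this across each configured quality gives the steady-state footprint, which (as the reply below this question explains) stays small because old segments are cleaned up continuously.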
From Cloudflare's perspective, I believe the only files I need to exclude from the CDN and disable caching on are
So the idea is to keep files around only as long as they're useful. They get cleaned up at two different points:
Because of this, the number of files actually on disk at any given time is quite low. The downside is if you're looking to have a complete back buffer of video and not just focusing on live. You can adjust the number of segments that are "in play" by changing

As for the cache, that should be correct. The correct "do not cache" cache-control headers are also set on m3u8 files, so hopefully the CDN is respecting those as well.

I'd love to hear your experience embedding the video into another page, since it's not something we've done. I'd like to take your implementation tips and throw them in a document to use as a guide for other people. The catch is having a player that supports HLS. Our web UI uses VideoJS, but hls.js is an option too.
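Owncast's actual cleanup code isn't shown in this thread, but the "keep only the segments that are in play" idea described above can be sketched as a small function (the filename pattern and keep-count here are hypothetical, not Owncast's real values):

```javascript
// Keep only the `keep` newest segments, ordered by the sequence number in
// the filename, and return the older ones as deletion candidates.
// A sketch of the idea, not Owncast's actual implementation.
function segmentsToDelete(files, keep) {
  const seq = f => parseInt(f.match(/(\d+)\.ts$/)[1], 10);
  const sorted = [...files].sort((a, b) => seq(a) - seq(b));
  // Everything except the last `keep` entries is safe to remove.
  return sorted.slice(0, Math.max(0, sorted.length - keep));
}
```

A periodic sweep that deletes whatever this returns keeps the on-disk footprint bounded regardless of how long the stream runs.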
Got it! I presume since I'm running in Docker it'll also get wiped when I rebuild the container; I'll probably pull that off into a volume just so it persists. I'll be focused on live for this use case as well; after my event we may host an edited replay on the site, but that'll probably require some additional tweaks anyhow.

So far embedding has worked great! I'm using hls.js with their quickstart instructions, without any issues. The main gotcha was using native HLS on mobile Safari, which I should have caught. I'll write something up as I learn more.
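The hls.js quickstart flow mentioned above (including the mobile Safari gotcha) boils down to one branch: prefer native HLS where the browser supports it, otherwise attach hls.js. A sketch of that decision as a pure function, with the actual browser calls noted in the comments (one common ordering; check the hls.js docs for the canonical snippet):

```javascript
// Mirrors the hls.js quickstart branch. In a browser you'd compute:
//   canPlayNativeHls = video.canPlayType('application/vnd.apple.mpegurl') !== ''
//   hlsJsSupported   = Hls.isSupported()
// then either set video.src directly (native, Safari/iOS) or call
// hls.loadSource(src); hls.attachMedia(video); (hls.js / MSE).
function hlsAttachMode(canPlayNativeHls, hlsJsSupported) {
  if (canPlayNativeHls) return 'native';  // Safari / iOS path
  if (hlsJsSupported) return 'hls.js';    // everything else MSE-capable
  return 'unsupported';
}
```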
Just making a note: while testing streaming today I noticed Cloudflare was recording essentially 0% cache hits; nearly all requests for data files were being sent to the owncast instance. Cloudflare's got a good post describing the problem with caching live video, as well as a solution they provide, but only for their paid Stream Delivery product. Not sure what to do with this other than maybe note the limitations with CDNing live video in the README? I'll probably end up using S3 or the like, after making sure the egress charges are advantageous...
Thanks for the update! That article is a good reference. Like any other caching scenario, there has to be enough demand to make it effective, so I'm not surprised. Not to mention the variability of whether a file previously cached at a region A edge server is available yet at a region B edge server. You could try something like CloudFront to have a little more control over things, but I'm not sure it would be much better without sufficient demand. Depending on what you think the demand for your stream will be, it might be completely fine to let your instance be the origin for the video segments, but object storage does allow for some peace of mind.
I would be curious to see if KeyCDN works well. I've thought about trying this myself, but have yet to do it. https://www.keycdn.com/blog/hls-streaming
@graywolf336 I set up KeyCDN and it seems to work OK, though I haven't found a good way to load test to see cache hits. My naive approach has been to connect a few browsers and watch the bandwidth charts on my VPS. Any better approaches? I've seen one post using JMeter but want to see if there's any better tooling: https://www.ubik-ingenierie.com/blog/easy-and-realistic-load-testing-of-http-live-streaming-hls-with-apache-jmeter/
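For load testing HLS, most of the work is just doing what a player does: fetch the playlist, then fetch each segment it lists, repeated across many workers. Extracting the segment URIs from an m3u8 media playlist is simple enough to sketch (real playlists from a live server carry more tags than this example):

```javascript
// Pull the segment URIs out of an HLS media playlist: per the m3u8 format,
// every non-empty line that isn't a #-prefixed tag is a URI. A fetch loop
// over these, run concurrently in many workers, approximates viewers.
function segmentUris(playlist) {
  return playlist
    .split('\n')
    .map(line => line.trim())
    .filter(line => line.length > 0 && !line.startsWith('#'));
}
```

Pointing enough such workers at the CDN URL and watching the origin's bandwidth chart would show directly how many requests leak through the cache.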
One other note: I fetched the latest changes and attempted to rebuild the Docker container, and at least on my $5 DigitalOcean VPS it's now unable to build the container; I think it's running out of memory:

I'm using it outside a container and it builds fine, so no worries. But probably worth noting if anyone else tries to build the container on a 1GB instance.

Update: This actually fails anytime you try to build the binary, not just in Docker!
I can still run it with
Very odd, I'm not seeing any issues building it locally, and I added a GitHub Action now that builds the image and binary to try to catch any issues like this, and that seems to be working OK. So a resource limitation makes sense, but it doesn't make sense that you can't build it yet you can run it. Thanks for letting me know! I'll look into it and see what can be done. I'm assuming it has to do with the requirement of cgo now, but that shouldn't be the end of the world.
Did some testing on my instance tonight; we watched a movie over it with a few folks (stats show it maxed at 5 viewers), all in the same city. Unfortunately it doesn't look like KeyCDN did a great job limiting the requests to the origin server: I pretty quickly saw bandwidth max out, and the feed got choppy for everyone. I reconfigured down to a single low-quality stream for the rest of the movie and it got through, but I'm wondering if I just happened to hit the limit for what local serving can support on my little VPS.

So I'm thinking I'll be trying the S3 approach next. DigitalOcean has an S3-compatible storage offering, so I'll give that a whirl and let you know how it goes. Kind of a bummer that KeyCDN didn't do what it says on the tin, but I think switching storage providers will be easier than trying to debug the CDN 🤷
I think the S3 approach is the way to go; it's less of a wildcard. You know for sure the files will be served from there. I haven't tried the DigitalOcean object storage offering, so I'd love to hear how it works for you. I wrote some documentation on other providers if it's at all helpful: https://github.com/gabek/owncast/blob/master/doc/S3.md

The one recommendation I can make is to find out how to expire files so you don't have a bunch of old segments sitting around taking up space.
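For the expiration recommendation above: S3-compatible services generally accept a lifecycle rule for this. A sketch in the JSON shape used by the AWS `s3api put-bucket-lifecycle-configuration` call (the rule ID and `hls/` prefix are hypothetical, and one day is the minimum expiration granularity the API allows):

```json
{
  "Rules": [
    {
      "ID": "expire-old-hls-segments",
      "Filter": { "Prefix": "hls/" },
      "Status": "Enabled",
      "Expiration": { "Days": 1 }
    }
  ]
}
```

Whether a given S3-compatible provider (e.g. DigitalOcean Spaces) honors lifecycle rules is worth verifying in its own docs.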
@gabek This isn't directly an Owncast question, but I noticed in this post you've used video passthrough for one of the streams. What broadcaster and settings did you use for that? I've never been able to get a browser to connect with that stream. I'm using vanilla OBS and have tried both hardware (QSV) and software (x264) encoding. Owncast settings:

```yaml
- full:
    videoPassthrough: true
    audioPassthrough: true
```
I don't suggest you use

However, when I was using it, I was using both vanilla OBS and ffmpeg as broadcasting clients on a regular basis, with some iOS apps sprinkled in as well, and they all worked. But it's possible that the changes that went in to support Restream (the new RTMP pipeline) somehow no longer allow passthrough to work.

It's a bummer that there are so many variables that passthrough can't easily work, since it makes a huge difference with CPU load, but as it stands it seems like you can't really trust a broadcast without forcing it through the transcoder first.
Thanks for the info! Mostly I was hoping for a "compute-free" way to get an additional stream; I have no issues using the two transcoded ones I've configured. |
Did a final round of testing for my event, and things are looking good! We streamed for about an hour at three different bitrates, and it was mostly solid. There was one instance of buffering I saw as a stream viewer, but it resumed 6 seconds later. In the owncast stdout I saw this message:
I'm serving the video files via S3-compatible storage, so my guess is the upload to the bucket failed? I didn't catch any other logs. Anyhow, we're on schedule for a production use next weekend. Tune in to https://steele-codr.wedding/ at 3:30pm Central time on 8/8 for the livestream! |
Thanks for the update! Yeah, that's what that message means: either that segment never finished writing to disk fast enough for it to be uploaded, or there was just some blip in the actual uploading of it. It does try to re-upload if it fails, so you would see that same error multiple times if it kept failing. However, it's possible that by the time it did upload, your client had already tried to pull it, so it ultimately resulted in that buffering.

If you end up seeing it more often I suggest moving to a faster encoder preset in your config (to the detriment of visual quality, however). But hopefully you won't have to do that. I'm looking forward to taking part in your event virtually! This is all so super cool!
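Owncast's actual uploader isn't shown in this thread, but the "it tries to re-upload if it fails" behavior described above is a standard retry pattern. A minimal sketch (the attempt count is an assumption, not Owncast's real value, and a real implementation would sleep with backoff between attempts rather than retry immediately):

```javascript
// Try an upload, retrying on failure up to maxAttempts times.
// `uploadFn` stands in for whatever actually pushes the segment to the
// bucket; backoff sleeps are elided to keep the sketch synchronous.
function uploadWithRetry(uploadFn, maxAttempts = 3) {
  let lastErr;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return uploadFn(); // success: hand back whatever the uploader returns
    } catch (err) {
      lastErr = err; // remember the failure and loop for another attempt
    }
  }
  throw lastErr; // every attempt failed: surface the last error
}
```

If even retries can't get the segment up before a player requests it, the viewer sees exactly the brief buffering described above.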
A few follow-ups; the event went great! Analytics showed 47 folks used it at the max; the majority were on phones/tablets, so I'm glad we tested there :)
I'm not sure what caused this; I believe the CORS headers were all set up with
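For reference on the CORS setup being discussed: a wide-open CORS policy on an S3-compatible bucket usually looks like the following, in the JSON shape used by the AWS `s3api put-bucket-cors` call (whether a given S3-compatible provider accepts exactly this shape is an assumption worth checking against its docs):

```json
{
  "CORSRules": [
    {
      "AllowedOrigins": ["*"],
      "AllowedMethods": ["GET", "HEAD"],
      "AllowedHeaders": ["*"],
      "MaxAgeSeconds": 3600
    }
  ]
}
```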
Thanks for the follow-up! I actually dropped by your Owncast instance the day of your wedding to check it out, saw the CORS errors, and thought "oh, he must have specifically locked down his CORS policy to only access the video from his personal website". I guess that's not the case. I'm confused how "*" was working for your personal site but not others. There must be some explanation; same (type of) player, same source of video.

Your blog posts are awesome; it's so cool that it's been helpful enough to you for you to want to tell others and share some real use cases. I really appreciate it! Your wedding was the first real event anybody has used Owncast for, and it's a big one. It's so cool to share it.

I'd be super into working together on a talk. Let me know what direction you were thinking of going in as far as content. Feel free to email me: gabek@real-ity.com.
Ran through the Dockerized setup in the quickstart, and I was able to successfully build the container, but starting it I'm encountering this error:

```
valid config/config.yaml is required
```

I copied the sample config into `config/config.yaml` before building the container. The only line I changed was updating `streamingKey` to a different value. I also tried copying `config-example.yaml` directly, and that also failed.

I'll try the non-Docker approach next, but would definitely like to have this running in a container as an option!

For context, this is a stock DigitalOcean VPS running Ubuntu 20.04.