Cache json response user side #19
Cache the stuff on the user side? The user requests dotaprj, not joindota. dotaprj should slow down on the requests, not the other way around. Either that, or we take down joindota.. I mean, we're giving them free fucking traffic
This would be to decrease the load on dotaprj, since the response will be the same within a 15-minute window. Say I open d2mt to get a stream, open it again 5 minutes later to get another stream, and again at 10 minutes. I've now made 15 requests to dotaprj, 10 of which are duplicates, for a total of ~225 KB transferred. By caching the responses the first time, there would be only the 5 unique requests (~75 KB), which, compounded over a lot of users, could mean a drastic drop in total load on our end (especially during big tournaments with multiple streams and things).
Ok gotcha, makes sense, but the JSON aren't live requests to APIs; they're pre-saved on our server, so it isn't putting that much load on us ;). When you visit a page, you do about 50 requests for scripts, CSS, images, fonts, etc. Here you're just doing 4 tiny JSON requests. This isn't worth changing the architecture unless we take background processing to a whole new level, with things like notifications and such.
When you visit a page, most modern browsers serve cached results if you've recently been browsing it, to save time and bandwidth. But this isn't for the sake of the users; it's to save us on requests. There are three APIs currently called (DD2, GG, and Streams), which come to ~0.046 MB per call, and the responses don't change within the window. Assuming users open the extension multiple times in a 15-minute period, they're using additional data on our servers that we wouldn't otherwise have to serve. The architecture change shouldn't be too major either, as it'd basically just be something like this:
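The idea above can be sketched roughly as follows. This is a minimal, hypothetical illustration (the function and key names are placeholders, not the extension's actual code); in the extension the cache would likely live in `localStorage`, but a plain object stands in for it here so the sketch is self-contained:

```javascript
// Minimal sketch of user-side caching with a 15-minute expiry.
// A plain object stands in for localStorage in this illustration.
const CACHE_TTL_MS = 15 * 60 * 1000; // 15 minutes

const cache = {}; // key -> { expires, data }

function getCached(key, fetchFn, now = Date.now()) {
  const entry = cache[key];
  if (entry && entry.expires > now) {
    return entry.data; // still fresh: no request hits dotaprj
  }
  const data = fetchFn(); // only hit the server on a miss or expiry
  cache[key] = { expires: now + CACHE_TTL_MS, data };
  return data;
}
```

With this in place, opening the extension three times in 10 minutes would make only one real request per API instead of three.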
The biggest thing required is adding some way of knowing when a cache entry expires (we could probably get the current time, add 15 minutes to it, and save that in the JSON that gets passed to the user).
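The expiry-stamping step described above could look something like this on the server side; `withExpiry` and the `cacheExpires` field are hypothetical names, not part of the existing code:

```javascript
// Hypothetical sketch: stamp an explicit expiry timestamp
// (current time + 15 minutes) into the JSON payload, so the
// client knows exactly when to refetch.
const TTL_MS = 15 * 60 * 1000;

function withExpiry(payload, now = Date.now()) {
  return { ...payload, cacheExpires: now + TTL_MS };
}
```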
The JSON files will only change so often (and I will set it up so that they're essentially regenerated on the hour). Therefore, we can cache the JSON responses locally until the next time there can actually be an update. This would drastically lower the load users put on the server.
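If the files really are regenerated on the hour, the client could cache until the next hour boundary instead of a fixed 15 minutes. A small sketch of computing that boundary (assuming the regeneration schedule described above):

```javascript
// Sketch: given a timestamp in ms, return the next top-of-hour
// timestamp, i.e. the earliest moment the server's JSON files
// could have been regenerated.
function nextHour(nowMs) {
  const HOUR = 60 * 60 * 1000;
  return Math.ceil((nowMs + 1) / HOUR) * HOUR;
}
```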