Some reasonable rate limiting #26
What I recommend here is to put Expires headers server-side. The browser already has a cache; it’d be a waste to replicate it client-side. Also note that when you hover randomly, unless you have a really fast connection, you usually mouseout and thus cancel the request before the page has been preloaded. In case sending headers server-side is an issue, I’d recommend using a delay of 50 or 100 ms (the correct delay depends on how close together the links are in your page’s layout).
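A minimal sketch of what sending those headers could look like, assuming a bare Node.js HTTP server (a stand-in for whatever actually serves your pages) and a hypothetical `renderPage` helper; the 10-second lifetime is just an example value:

```ts
import { createServer } from "http";

// Example lifetime; pick whatever matches how often your content can change.
const CACHE_SECONDS = 10;

// Hypothetical helper standing in for whatever produces the page HTML.
function renderPage(url: string): string {
  return `<html><body><h1>${url}</h1></body></html>`;
}

createServer((req, res) => {
  res.setHeader("Content-Type", "text/html; charset=utf-8");
  // With these headers, repeated preloads of the same URL within
  // CACHE_SECONDS are answered from the browser cache, not the network.
  res.setHeader("Cache-Control", `max-age=${CACHE_SECONDS}`);
  res.setHeader("Expires", new Date(Date.now() + CACHE_SECONDS * 1000).toUTCString());
  res.end(renderPage(req.url ?? "/"));
}).listen(8080);
```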
That's a good point. I missed the "(from cache)" notice in the inspector.
Well, the site in question is already really fast. Between 30 and 70 ms to load a page (proactively static-cached through Nginx). So a lot of those hovers are going to complete before mouseout. I might just have to increase the InstantClick delay (currently 50 ms).
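For reference, that would just be a change to the init call. A hedged sketch, assuming InstantClick’s `init(delay)` signature where the argument is the hover-to-preload delay in milliseconds:

```ts
// Sketch: bump the preload delay from 50 ms to 100 ms.
// Assumes InstantClick is loaded globally via a <script> tag; the declaration
// below only exists to keep TypeScript happy about the global.
declare const InstantClick: { init(delay?: number): void };

InstantClick.init(100);
```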
Okay, thanks. I didn’t realize sub-50 ms HTTP requests were that common these days and/or in the U.S.
Oh, they’re definitely the exception!
I would certainly want an option to prevent all duplicate URL loads (hovering over the same link 5 times will request that link 5 times). I understand that sending cache headers will partially* solve this problem, but it's not always an option for dynamic content. (* By partially, I mean that, as far as I know, we can't rely on browsers always caching the content correctly.) Is this something that anyone's working on?
“we can't rely on browsers always caching the content correctly”

If you want to serve dynamic content and are concerned about the same link loading multiple times [1], the solution is to set a cache that expires very quickly, in the 2–10 second range (depending on how often your content risks changing). And yes, adding a cache would need a refactoring and would increase code complexity.

[1] I think the users who take pleasure in hovering the same link over and over again are few, and I’m not sure it’s even worth worrying about, but caching is nice to have anyway.
If you just absentmindedly move your cursor around a page for a few seconds, you can end up triggering 4 or 5 requests to the same URL. I get that you don't want "fetch once, cache forever" so that this works well with dynamic content. But firing off multiple requests to the same URL within the span of a second or two is just unnecessary.
I propose that there be a cache that holds on to pages for 5 seconds. If you need your site to be more dynamic than that, well, don't use this script.
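For what it's worth, here is a rough sketch of what such a five-second client-side cache could look like. The names (`preload`, `CACHE_MS`) and the wiring to `mouseover` are purely illustrative; this is not InstantClick's actual code or API:

```ts
// Rough sketch of a short-lived preload cache: repeated hovers on the same
// URL within CACHE_MS reuse the earlier fetch instead of issuing a new request.
const CACHE_MS = 5000; // hold pages for 5 seconds

interface CacheEntry {
  fetchedAt: number;
  html: Promise<string>;
}

const cache = new Map<string, CacheEntry>();

function preload(url: string): Promise<string> {
  const now = Date.now();
  const hit = cache.get(url);

  // Reuse any entry younger than CACHE_MS; otherwise fetch fresh.
  if (hit && now - hit.fetchedAt < CACHE_MS) {
    return hit.html;
  }

  const html = fetch(url).then((res) => res.text());
  cache.set(url, { fetchedAt: now, html });
  return html;
}

// Usage sketch: preload on hover, deduplicated by the cache above.
document.addEventListener("mouseover", (event) => {
  const target = event.target;
  if (target instanceof Element) {
    const link = target.closest("a[href]");
    if (link instanceof HTMLAnchorElement) {
      preload(link.href);
    }
  }
});
```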