Some reasonable rate limiting #26

Closed
markjaquith opened this Issue Feb 13, 2014 · 6 comments

@markjaquith

If you just absentmindedly move your cursor around a page for a few seconds, you can end up triggering 4 or 5 requests to the same URL. I get that you don't want "fetch once, cache forever" so that this works well with dynamic content. But firing off multiple requests to the same URL within the span of a second or two is just unnecessary.

I propose that there be a cache that holds on to pages for 5 seconds. If you need your site to be more dynamic than that, well, don't use this script.
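A minimal sketch of the proposed short-lived cache (all names here are hypothetical; this is not InstantClick's actual implementation):

```javascript
// Hypothetical sketch: repeated requests for the same URL within TTL_MS
// reuse the first result instead of re-fetching.
const TTL_MS = 5000;
const cache = new Map(); // url -> { body, fetchedAt }

function cachedFetch(url, fetchFn) {
  const entry = cache.get(url);
  if (entry && Date.now() - entry.fetchedAt < TTL_MS) {
    return entry.body; // still fresh, skip the network entirely
  }
  const body = fetchFn(url); // e.g. an XHR wrapper returning the page HTML
  cache.set(url, { body, fetchedAt: Date.now() });
  return body;
}
```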

@dieulot


dieulot Feb 13, 2014

Owner

What I recommend here is to put Expires headers server-side. The browser already has a cache; replicating it client-side would be a waste.

Also note that when you hover randomly, unless you have a really fast connection, you usually mouseout and thus cancel the request before the page has been preloaded.

In case sending headers server-side is an issue, I’d recommend using a delay of 50 or 100 ms (the correct delay depends on how close your links are to each other in your page’s layout).
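To illustrate the server-side route: in Node, for example, short-lived caching headers could be built like this (a sketch; the 5-second lifetime echoes the proposal above, and the server usage is purely illustrative):

```javascript
// Sketch: headers that let the browser answer repeated hover preloads
// from its own cache for `maxAgeSeconds` instead of hitting the server.
function shortCacheHeaders(maxAgeSeconds, now = Date.now()) {
  return {
    'Cache-Control': 'max-age=' + maxAgeSeconds,
    'Expires': new Date(now + maxAgeSeconds * 1000).toUTCString(),
  };
}

// Hypothetical usage with Node's core http module:
// const http = require('http');
// http.createServer((req, res) => {
//   const headers = shortCacheHeaders(5);
//   for (const name of Object.keys(headers)) res.setHeader(name, headers[name]);
//   res.end(renderPage(req.url)); // renderPage is assumed, not defined here
// }).listen(8080);
```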


@dieulot dieulot closed this Feb 13, 2014

@markjaquith


markjaquith Feb 13, 2014

That's a good point. I missed the "(from cache)" notice in the inspector.

Also note that when you hover randomly, unless you have a really fast connection, you usually mouseout and thus cancel the request before the page has been preloaded.

Well, the site in question is already really fast: between 30 and 70 ms to load a page (proactively static-cached through Nginx). So a lot of those hovers are going to complete before mouseout. I might have to just increase the InstantClick delay (currently 50 ms).
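The effect of such a hover delay can be sketched independently of InstantClick's internals: start a timer on mouseover and cancel it on mouseout, so only hovers that outlast the delay trigger a preload (all names here are illustrative):

```javascript
// Sketch of a hover delay: the preload fires only if the pointer stays
// on the link longer than delayMs; quick pass-overs are cancelled.
// schedule/cancel default to the real timer functions but can be swapped
// out for testing.
function makeHoverPreloader(preload, delayMs = 100,
                            schedule = setTimeout, cancel = clearTimeout) {
  const timers = {}; // url -> timer id
  return {
    over(url) { timers[url] = schedule(() => preload(url), delayMs); },
    out(url)  { cancel(timers[url]); }, // pointer left before delay elapsed
  };
}

// Hypothetical wiring:
// const p = makeHoverPreloader(url => preloadPage(url), 100);
// link.addEventListener('mouseover', () => p.over(link.href));
// link.addEventListener('mouseout',  () => p.out(link.href));
```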


@dieulot


dieulot Feb 13, 2014

Owner

Okay, thanks. I didn’t know sub-50 ms HTTP requests were that common these days and/or in the U.S.


@markjaquith


markjaquith Feb 14, 2014

Oh, they’re definitely the exception!


@joellimberg


joellimberg Mar 10, 2014

@dieulot

I would certainly want an option to prevent all duplicate url loads (hovering over the same link 5 times will request that link 5 times). I understand that sending cache headers will partially* solve this problem, but it's not always an option for dynamic content.

(* by partially, I mean that — as far as I know — we can't rely on browsers always caching the content correctly.)

Is this something that anyone's working on?
In the current instantclick.js implementation, the XHR requests seem to be tightly coupled to the rest of the code, so this looks non-trivial to implement without a) a larger refactoring or b) hack-ish solutions.


@dieulot


dieulot Mar 13, 2014

Owner

“we can't rely on browsers always caching the content correctly”
What do you mean by that? If a request is completed it will be cached correctly; if it isn’t completed it will not be cached at all.

If you want to serve dynamic content and are concerned about the same link loading multiple times [1], the solution is to set a cache that expires very quickly, in the 2–10 second range (depending on how often your content is likely to change).

And yes, adding a cache would need a refactoring and would increase code complexity.

[1] I think users who take pleasure in hovering over the same link again and again are few, and I’m not sure it’s even worth worrying about, but caching is nice to have anyway.

