Caching "Over API limit" errors #148

Closed
amacneil opened this Issue Dec 11, 2011 · 18 comments


@amacneil

When I geocoded a bunch of addresses too quickly, I ran into the API error

Google Geocoding API error: over query limit

Not a massive issue. The problem is that it looks like this is getting cached, so when I retry the same query again later, I get the API limit error, before it even tries contacting Google. This means that the address is now "blocked" so to speak, unless I manually remove it from the cache.

Would it be possible to automatically remove URLs from the cache when an exception is thrown?
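The failure mode above can be sketched in plain Ruby (a hypothetical, self-contained model — the real gem keys its cache by request URL, but the `cache` hash and `url` here are just illustration):

```ruby
require "json"

# Hypothetical sketch: a cache keyed by request URL stores whatever body
# the API returned, including error responses.
cache = {}
url = "http://maps.googleapis.com/maps/api/geocode/json?address=Winnetka&sensor=false"

# First attempt hits the API during a burst, and the error body gets cached.
error_body = '{"results": [], "status": "OVER_QUERY_LIMIT"}'
cache[url] ||= error_body

# A later retry finds the cached entry and never contacts Google again.
retry_body = cache[url]
status = JSON.parse(retry_body)["status"]  # => "OVER_QUERY_LIMIT"

# Manual workaround until a fix: evict the poisoned key so the next
# lookup goes back to the network.
cache.delete(url) if status == "OVER_QUERY_LIMIT"
```

Deleting the cached key is exactly the "manually remove it from the cache" step described above; the fix being requested is to never store such bodies in the first place.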

@alexreisner
Owner

I've pushed some code that should fix this. Could you try it out? Just change your Gemfile to:

gem 'geocoder', :git => 'git://github.com/alexreisner/geocoder.git', :branch => 'dont_cache_errors'

and run:

bundle install

Let me know how it goes. Thanks.

@amacneil

Thanks Alex. I ran it again but can't replicate the 'over query limit' issue now (I think it was only happening when I did a bulk import on production since it has a much faster internet connection). It looks like your fix will solve the problem though.

@alexreisner
Owner

OK, I've merged into master and removed that branch.

@amacneil

Awesome, thanks for the quick fix :)

@arronmabrey

I don't think this is totally fixed. I'm using the Google lookup, and this is getting stored in the cache:

"{\n \"results\" : [],\n \"status\" : \"OVER_QUERY_LIMIT\"\n}\n"

@alexreisner
Owner

Sorry, this should really be fixed now. Please try running from the current HEAD (will be included in next gem release).

@arronmabrey

Great thanks!


@chefkoch86

The problem still exists for me. I don't know if it has the same cause; perhaps you can help me with that. I've tested it in an RSpec test, repeating the request 100 times. Sometimes it works, sometimes I get the error message.

rspec
Run options: include {:focus=>true}
............Google Geocoding API error: over query limit.
F.............Google Geocoding API error: over query limit.
F.Google Geocoding API error: over query limit.
F.Google Geocoding API error: over query limit.
F.............Google Geocoding API error: over query limit.
FGoogle Geocoding API error: over query limit.
F......................................................

@alexreisner
Owner

@chefkoch86: Sounds like you're actually hitting the query limit because your tests are making a lot of real requests. You should probably stub out network calls with something like VCR.
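Real options for this are VCR cassettes or the gem's own test lookup (`Geocoder::Lookup::Test.add_stub`). The idea can be sketched without either dependency (the `StubGeocoder` class and the coordinates below are purely illustrative):

```ruby
# Self-contained sketch of the stubbing idea: register canned results per
# query so specs never touch the network (and never hit the query limit).
class StubGeocoder
  def initialize
    @stubs = {}
  end

  # Register a canned result array for a query string.
  def add_stub(query, results)
    @stubs[query] = results
  end

  # Return the canned results; fail loudly if a spec geocodes something
  # it never stubbed, so accidental network calls can't hide.
  def search(query)
    @stubs.fetch(query) { raise "no stub registered for #{query.inspect}" }
  end
end

geo = StubGeocoder.new
geo.add_stub("New York, NY", [{ "latitude" => 40.7143528, "longitude" => -74.0059731 }])

result = geo.search("New York, NY").first
```

With this pattern the 100-iteration spec above would be deterministic and make zero API requests.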

@bcackerman

In production (on Heroku) I'm still getting that error: Google Geocoding API error: over query limit.

In my model I have:

geocoded_by :full_address
after_validation :geocode, :if => :check_address_changed?

def check_address_changed? # TEST possibilities
  secondary_city_id_changed? || address_changed?
end

# create the full address from fields
def full_address
  address + ", " + secondary_city.name
end
@alexreisner
Owner

@bcackerman please see #222.

@amacneil

This is still happening in 1.1.8. API limit errors are being cached, meaning subsequent calls return the cached API limit error instead of retrying. The reason is that Google's API limit error returns HTTP status 200, and you are caching all requests that return 200..399.

Here's an example API limit error from google, which shouldn't be cached:

$ curl -v "http://maps.googleapis.com/maps/api/geocode/json?address=Winnetka&sensor=false"
* About to connect() to maps.googleapis.com port 80 (#0)
*   Trying 74.125.31.95...
* connected
* Connected to maps.googleapis.com (74.125.31.95) port 80 (#0)
> GET /maps/api/geocode/json?address=Winnetka&sensor=false HTTP/1.1
> User-Agent: curl/7.24.0 (x86_64-apple-darwin12.0) libcurl/7.24.0 OpenSSL/0.9.8r zlib/1.2.5
> Host: maps.googleapis.com
> Accept: */*
>
< HTTP/1.1 200 OK
< Content-Type: application/json; charset=UTF-8
< Vary: Accept-Language
< Access-Control-Allow-Origin: *
< Date: Wed, 15 May 2013 00:24:49 GMT
< Server: mafe
< Cache-Control: private
< X-XSS-Protection: 1; mode=block
< X-Frame-Options: SAMEORIGIN
< Transfer-Encoding: chunked
<
{
   "results" : [],
   "status" : "OVER_QUERY_LIMIT"
}
* Connection #0 to host maps.googleapis.com left intact
* Closing connection #0
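Since the HTTP status is 200 even for an over-limit response, a cache-write guard has to inspect the body, not just the status-code range. A minimal sketch of that check (the `cacheable?` helper is hypothetical, not the gem's actual method):

```ruby
require "json"

# Hypothetical guard: only cache a response if the HTTP status is in the
# success range AND the JSON body does not report OVER_QUERY_LIMIT.
def cacheable?(http_code, body)
  return false unless (200..399).cover?(http_code)

  # Google signals errors in the body's "status" field, not the HTTP code.
  status = JSON.parse(body)["status"] rescue nil
  status != "OVER_QUERY_LIMIT"
end

ok_body    = '{"results": [{}], "status": "OK"}'
limit_body = '{"results": [], "status": "OVER_QUERY_LIMIT"}'
```

Under this check, the curl response above (HTTP 200 + `OVER_QUERY_LIMIT` body) would never be written to the cache.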
@alexreisner alexreisner reopened this May 15, 2013
@pencilcheck

So is there a temporary solution to get around this before it is fixed?

@bcackerman

@pencilcheck Are you on heroku?

@pencilcheck

I'm not.

@bcackerman

@pencilcheck Ok, well my personal solution was to get the lat/long from the client side instead of the server side. This was because multiple apps running on the same server shared Google's per-IP limit, which gave us the error.

@pencilcheck

That sounds like a good way to distribute the load. OK, I'll try that.

@alexreisner
Owner

Fixed by #510.
