
429 error #8

Closed
davidpan opened this issue Jan 26, 2024 · 13 comments · Fixed by #29

Comments

@davidpan

site has about 6 million web pages, and a 429 error occurs during batch processing.

@AlizerUncaged

Same problem for me; my site has reached around 50 million pages as well, all in the sitemap. Maybe we can implement a rate limit? Let's say allow this script to only send 200 requests per minute, since that's the maximum allowed request speed. I will run this script indefinitely, so I'd appreciate it if we can limit the requests.

@tguillemaud

tguillemaud commented Jan 26, 2024

The API is limited to 200 requests per day (not per second), per Google's documentation.

Since this quota applies at the service-account level, an option would be for the script to support multiple service-account credentials.
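The idea above could be sketched roughly as follows. This is a hypothetical illustration, not the script's actual implementation: `AccountPool`, `ServiceAccount`, and the key-file names are all made up, and the real Indexing API call is omitted. It just shows how a pool could hand out the next service account that still has daily quota left.

```typescript
// Hypothetical sketch: rotate through several service-account key files so
// each account's 200-requests/day quota is consumed in turn.
type ServiceAccount = { keyFile: string; used: number };

const DAILY_QUOTA = 200; // per service account, per Google's documentation

class AccountPool {
  constructor(private accounts: ServiceAccount[]) {}

  // Returns the next account with quota left, or null when all are exhausted.
  next(): ServiceAccount | null {
    const acct = this.accounts.find((a) => a.used < DAILY_QUOTA);
    if (acct) acct.used++;
    return acct ?? null;
  }
}

// With two accounts, the pool allows up to 400 publish requests per day.
const pool = new AccountPool([
  { keyFile: "sa-1.json", used: 0 },
  { keyFile: "sa-2.json", used: 0 },
]);
```

With n accounts this multiplies the effective daily quota by n, at the cost of managing n key files; each account would still need to be added as an owner in Search Console.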

@Pab450

Pab450 commented Jan 26, 2024

> same problem for me, my site has reached around 50 million pages as well, all in the sitemap, maybe we can implement a rate limit? let's say allow this script to only send 200 requests per minute since that's the maximum allowed request speed. i will run this script indefinitely so id appreciate if we can limit the requests

50 million ??? how?

@edoardolunardi

> site has about 6 million web pages, and a 429 error occurs during batch processing.

6 million? That's the number of articles in English on Wikipedia 😆

> same problem for me, my site has reached around 50 million pages as well, all in the sitemap, maybe we can implement a rate limit? let's say allow this script to only send 200 requests per minute since that's the maximum allowed request speed. i will run this script indefinitely so id appreciate if we can limit the requests

50 million??? That's the number of all articles in all languages on Wikipedia lol

@rmens

rmens commented Jan 26, 2024

This was also a problem for a site with 15k pages; it couldn't get through all the batches. The script needs to be more robust, with long-term local caching of URLs and the 200-requests-per-day limit of the Indexing API respected.

@AlizerUncaged

AlizerUncaged commented Jan 30, 2024

@Pab450 @edoardolunardi

I own a math calculator website with solutions to almost all possible textbook equations. For the benefit of the doubt: I generate around 1 million pages a month and need an automated indexer, and I own several websites with 10M+ pages as well.


So a good client-side rate limiter would be a huge help for sites like mine.
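A client-side limiter along these lines could be sketched as below. This is a hypothetical illustration, not code from this repository: `planBatch` and its parameters are made up. It simply splits a URL list into what fits inside today's 200-request quota and what must be deferred to a later run.

```typescript
// Hypothetical sketch of a client-side limiter: cap publish requests at the
// API's 200-requests/day quota instead of firing the whole batch at once.
const DAILY_LIMIT = 200;

// Splits a URL list into what can be sent today and what must be deferred.
function planBatch(
  urls: string[],
  alreadySentToday: number
): { send: string[]; defer: string[] } {
  const remaining = Math.max(0, DAILY_LIMIT - alreadySentToday);
  return { send: urls.slice(0, remaining), defer: urls.slice(remaining) };
}
```

For a 50M-page site this only schedules work; clearing the backlog would still take decades at 200/day, which is why the quota increase and multi-account suggestions elsewhere in this thread matter more at that scale.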

@ingeniumdesign

ingeniumdesign commented Feb 7, 2024

Yes, we have the same problem with a 25k-page site: "429 error".

Is there a way to pay to increase the quota above the maximum of 200?

@ingeniumdesign

Is it normal that the Web Search Indexing API shows no "Traffic", only the Google Search Console API? Thanks

[screenshot: direct-indexing-01]

@goenning
Owner

I noticed that too on my project. I guess it's either normal or a bug on Google's side.

@Noext

Noext commented Feb 16, 2024

You can ask Google for more quota, but I don't think 50M will be accepted lol

@AntoineKM
Contributor

AntoineKM commented Feb 23, 2024

```
❌ Failed to request indexing.
Response was: 429
{
  "error": {
    "code": 429,
    "message": "Quota exceeded for quota metric 'Publish requests' and limit 'Publish requests per day' of service 'indexing.googleapis.com' for consumer 'project_number:xxx'.",
    "status": "RESOURCE_EXHAUSTED",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.ErrorInfo",
        "reason": "RATE_LIMIT_EXCEEDED",
        "domain": "googleapis.com",
        "metadata": {
          "quota_location": "global",
          "consumer": "projects/xxx",
          "quota_limit_value": "200",
          "service": "indexing.googleapis.com",
          "quota_limit": "DefaultPublishRequestsPerDayPerProject",
          "quota_metric": "indexing.googleapis.com/v3_publish_requests"
        }
      },
      {
        "@type": "type.googleapis.com/google.rpc.Help",
        "links": [
          {
            "description": "Request a higher quota limit.",
            "url": "https://cloud.google.com/docs/quota#requesting_higher_quota"
          }
        ]
      }
    ]
  }
}
```

A great feature to implement would be to:

  1. Stop the process when this error occurs (Add rate limit error handling #29)
  2. Ask the user whether to continue or stop the process here
  3. Tell the user they can request a higher quota limit or retry in xx hours
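Step 1 could look something like this. A hypothetical sketch only (the `ApiError` type and `isQuotaExhausted` name are made up, not from #29); it checks the shape of the 429 payload quoted above to decide whether the run should halt.

```typescript
// Hypothetical sketch: detect the daily-quota 429 so the batch can stop
// instead of burning through the rest of the URL list.
type ApiError = {
  error?: { code: number; status: string };
};

// Returns true when the run should halt (daily publish quota exhausted).
function isQuotaExhausted(body: ApiError): boolean {
  return body.error?.code === 429 && body.error?.status === "RESOURCE_EXHAUSTED";
}
```

A caller would break out of the batch loop as soon as this returns true, then print the quota-increase link from the error's `google.rpc.Help` detail.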

@goenning
Owner

Agreed @AntoineKM, I think option 1 is the best one, but also save the progress so the next run won't have to go through the same URLs.

Would you like to send a PR? I don't have such large sites to test it :)
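Saving progress between runs could be sketched like this. A hypothetical illustration, not the repository's code: `ProgressCache` is made up, and a real implementation would persist the set to a local JSON file between runs.

```typescript
// Hypothetical sketch of progress persistence: remember which URLs were
// already submitted so an interrupted run (e.g. after a 429) can resume
// where it left off instead of resubmitting the same URLs.
class ProgressCache {
  private done = new Set<string>();

  markDone(url: string): void {
    this.done.add(url);
  }

  // Filters a sitemap's URL list down to the ones still pending.
  pending(urls: string[]): string[] {
    return urls.filter((u) => !this.done.has(u));
  }
}
```

Combined with the stop-on-429 handling above, each daily run would submit up to 200 pending URLs and leave the rest for the next run.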

AntoineKM added a commit to ArkeeAgency/google-indexing-script that referenced this issue Feb 28, 2024
@Taimerlan

I also agree with @AntoineKM; the first option is the best.
I also made the script run on a schedule. It would be nice to run it and forget about it.
