Today I realized we have 51,000 items in our sitemap and Google is indexing them correctly, but we have 500,000 pages in the index. This is because Google is crawling all the communities and collections and following all manner of search, sort, and browse options. It's likely we are getting nailed with a rankings hit because of duplicate content (not to mention the performance overhead of Google's 70,000 queries per day on those dynamic pages).
It seems there is a fix proposed in DS-2962. I have to evaluate this first...
In the meantime, I've discovered the URL Parameters section of Webmaster Tools, where you can inform Google how to handle query strings in URLs. It looks like there are tens of millions of URLs for dynamic pages here. This is definitely a huge problem.
I've set a handful of the parameters to not show in the index, but the better solution would be to merge the robots.txt patch.
Also, since our sitemap is being consumed properly, we should actually disallow Google's crawling of the site (as suggested by the existing comments in robots.txt).
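For reference, a rough sketch of the kind of robots.txt rules this would involve, assuming the stock DSpace XMLUI URL patterns (/discover, /search-filter, /browse) and a placeholder sitemap URL; the actual rules in the DS-2962 patch should be reviewed before merging anything:

User-agent: *
# Keep crawlers out of the dynamic search, sort, and browse pages
Disallow: /discover
Disallow: /search-filter
Disallow: /browse
# Placeholder sitemap location; items remain reachable through the sitemap
Sitemap: https://repository.example.org/sitemap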