Status: Googlebot blocked by robots.txt #3067
The current robots.txt is:

Maybe it would make sense to add
Google told me that access is not possible, so I changed it to

and now it is working
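The actual rules were lost when this thread was exported, so the snippets above are gone. As a hedged illustration only, the pattern being discussed is a broad `Disallow` followed by narrower `Allow` rules for the asset paths that Googlebot needs (the paths come from the report below; the rule set itself is an assumption, not the committed fix):

```
User-agent: *
Disallow: /system/
Disallow: /user/
# Hypothetical re-allow rules for the blocked assets:
Allow: /system/assets/
Allow: /user/plugins/
```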
Fixed by bc22c8d |
It seems that the issue is not fixed for the built-in jQuery. The reason why this happens is unclear to me. It might be that
This also allows query parameters that are often used for cache busting:
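The rule itself was stripped from this export. The mechanism being described is that a pattern without a `$` end anchor also matches URLs carrying a query string, which cache-busting URLs do. A hypothetical illustration (not the actual committed rule):

```
# Anchored: matches only URLs that literally end in .css
Allow: /*.css$
# Unanchored: also matches e.g. /user/plugins/login/css/login.css?v=1.7
Allow: /*.css
```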
It turns out Google is having random problems loading some assets. I am not yet sure whether this is related to Grav or not - I assume not. However, the built-in jQuery must be added with an additional rule, as
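The additional rule did not survive the export. Given the blocked path `/system/assets/jquery/jquery-2.x.min.js` from the original report, an extra `Allow` along these lines is a plausible sketch (an assumption, not the rule actually committed):

```
# Hypothetical: re-allow the bundled jQuery under the otherwise disallowed /system/ tree
Allow: /system/assets/
```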
That's pretty odd, as the Allows come after the Disallows. But if it works in your testing, then I'm good with it. Thanks.
I've read that the length of the rule matters for Googlebot, which makes total sense. I also found out why the other resources were unable to load: Google stops loading resources if there are too many. It seems like a long-standing crawling bug; I could work around it by enabling the assets manager. See https://support.google.com/webmasters/thread/4425254?hl=en

PR: #3129
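This length-based precedence is why an `Allow` can override an earlier `Disallow` regardless of ordering: Google picks the matching rule with the longest path pattern, and breaks ties in favour of `Allow`. A minimal sketch of that behaviour (a simplified model for illustration, not Google's actual matcher):

```python
import re

def matches(pattern: str, path: str) -> bool:
    # Minimal matcher: supports the '*' wildcard and the '$' end anchor,
    # which is enough to illustrate the precedence behaviour.
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None

def is_allowed(path: str, rules: list[tuple[str, str]]) -> bool:
    # rules: list of ("allow" | "disallow", pattern).
    # The longest matching pattern wins; ties go to "allow".
    best = None  # (pattern_length, verdict)
    for verdict, pattern in rules:
        if matches(pattern, path):
            length = len(pattern)
            if (best is None or length > best[0]
                    or (length == best[0] and verdict == "allow")):
                best = (length, verdict)
    # No matching rule at all means the path is crawlable.
    return best is None or best[1] == "allow"

rules = [
    ("disallow", "/user/"),
    ("allow", "/user/plugins/*.css"),
]
# The Allow pattern is longer than the Disallow, so it wins
# even though /user/ is blocked as a whole.
print(is_allowed("/user/plugins/login/css/login.css", rules))  # True
print(is_allowed("/user/secret.txt", rules))                   # False
```

This also shows why rule order in the file is irrelevant to Googlebot: only pattern specificity decides the outcome.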
I had already committed this to the 1.7 branch: 14df5a6
The following resources cannot be loaded by Googlebot because they are blocked by the default robots.txt:
/system/assets/jquery/jquery-2.x.min.js
/user/plugins/form/assets/form-styles.css
/user/plugins/login/css/login.css
Thank you for fixing this error in the default configuration