A description of the work
I created a robots.txt file for the guides (#682), but to be thorough we need to answer a few additional questions:
are there any user agents we want to give specific instructions to?
are there any pages that we don't want crawled?
We could use DAP analytics to determine whether there are any (friendly, instruction-following) bots that we would like to behave differently than they currently do.
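If agent-specific rules or disallowed paths do turn out to be needed, the additions would look roughly like the sketch below. Every agent name and path here is an illustrative placeholder, not a confirmed value; the real entries would come out of the DAP review and the answers to the two questions above.

```
# Hypothetical sketch only -- agents and paths below are placeholders, not decisions.

# Default rules for all crawlers
User-agent: *
Disallow: /search/        # e.g. internal search result pages
Disallow: /drafts/        # e.g. unpublished draft guides

# Example of an agent-specific block (agent chosen purely for illustration)
User-agent: ia_archiver
Crawl-delay: 10
```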
Point of contact on this issue
jduss4
Reproduction steps (if necessary)
No response
Skills Needed
Any Human
Design
Content
Engineering
Acquisition
Product
Other
Does this need to happen in the next 2 weeks?
Yes
No
How much time do you anticipate this work taking?
4 hours
Acceptance Criteria
Additional directives have been added to the robots.txt file for any user agents that should be handled differently
Any URLs we do not wish to be crawled have been added to the robots.txt file for all bots