I did a Google search for

Data.Vector hackage

and the top hit was: https://hackage.haskell.org/package/vector-0.11.0.0/candidate/docs/Data-Vector.html

As I understand it, candidate packages are just for authors to make sure things look alright before committing the upload, so perhaps we don't want those links being collected by search engines?
Ick, this is a bit tricky. Due to the nature of the paths, we can't change robots.txt to disallow them (they share a common prefix with the good paths), and due to the nature of the pages (generated by Haddock), it's hard to tweak their metadata directly.
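For reference (the published-docs URL below is Hackage's standard layout, inferred from the candidate URL in the report), the two kinds of pages share the /package/ prefix:

    published: https://hackage.haskell.org/package/vector-0.11.0.0/docs/Data-Vector.html
    candidate: https://hackage.haskell.org/package/vector-0.11.0.0/candidate/docs/Data-Vector.html

so a plain prefix-based Disallow rule in robots.txt can't exclude the candidate pages without also excluding the published ones.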
I think we'll need something we've intended to have anyway -- a way to have hackage-server do a transform/rewrite pass on generated docs to add some additional HTML (in this case, an extra META directive to disallow crawling). Definitely open to better ideas.
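A minimal sketch of what such a rewrite pass could look like, assuming the server has the Haddock-generated page as Text before serving it. `addNoIndex` is a hypothetical helper, not part of hackage-server's actual API:

```haskell
{-# LANGUAGE OverloadedStrings #-}
module CandidateNoIndex (addNoIndex) where

import qualified Data.Text as T

-- Splice a robots meta tag in right after the opening <head> tag so that
-- crawlers skip candidate documentation pages. A textual replace is enough
-- for a sketch; a production version would use a proper HTML rewriting pass
-- (and handle a <head> tag that carries attributes).
addNoIndex :: T.Text -> T.Text
addNoIndex =
  T.replace "<head>" "<head><meta name=\"robots\" content=\"noindex\">"
```

Applied to a candidate page, e.g. `addNoIndex "<html><head><title>Data.Vector</title></head>..."`, it returns the same page with the noindex directive added to the head.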
Another thought -- why not just relocate the candidate pages so they don't hang off a common prefix? I don't think there's anything that depends on them being in their current locations...