(Imported from Trac #760, reported by warren on 2010-11-04)
When searching for Haskell documentation on Google, the top search results are quite frequently old versions of the library documentation from the archives. For instance, when searching for Data.Vector, the top hit is version 0.5, whereas the latest is 0.7. Moreover, even if the user opens the 0.7 version, it isn't obvious from the documentation that this version is the latest, since it too appears under the archive (the URL is http://hackage.haskell.org/packages/archive/vector/0.7/doc/html/Data-Vector.html).
So I'm wondering whether it might make sense to exclude all but the latest Haddock documentation from search results by augmenting the robots.txt file.
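As a rough sketch of what that robots.txt change could look like (the path patterns are assumptions based on the archive URL above, and the version list is illustrative):

```text
# Hypothetical robots.txt sketch: block crawlers from superseded
# versions while leaving the latest (0.7) crawlable.
# Assumes Hackage's /packages/archive/<pkg>/<version>/ layout.
User-agent: *
Disallow: /packages/archive/vector/0.5/
Disallow: /packages/archive/vector/0.6/
```

One drawback of this per-version approach is that the file would need to be regenerated on every release of every package, which is why a blanket rule or a canonical-link approach may be preferable.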
We had a brief discussion today in #haskell, and somebody had the good idea of using a `<link rel="canonical">` tag.
This would avoid outright excluding old versions while still making it clear to the search engine which version is preferred (i.e. the latest docs). That seems like a good balance.
If this solution is chosen, this would probably be a haddock issue (not a hackage issue anymore).
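For illustration, here is a sketch of the tag Haddock could emit into the `<head>` of every archived copy of a page (the `href` here reuses the archive URL from above as a stand-in; the real target would be whatever "latest docs" URL Hackage settles on):

```html
<!-- Hypothetical canonical link pointing old copies of a page
     at the preferred (latest) version of the same page. -->
<link rel="canonical"
      href="http://hackage.haskell.org/packages/archive/vector/0.7/doc/html/Data-Vector.html" />
```

Search engines treat this as a strong hint to consolidate ranking signals onto the canonical URL, so old versions remain reachable but stop competing with the latest docs in results.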
It might also make sense to provide alternate URLs for the latest versions of documents. E.g. starting from the http://hackage.haskell.org/package/vector page (which shows the latest version) and clicking on Data.Vector, one is taken to http://hackage.haskell.org/packages/archive/vector/0.7/doc/html/Data-Vector.html... but it might be better if this were http://hackage.haskell.org/packages/current/vector/doc/html/Data-Vector.html. This would allow everything under /archive to be excluded.
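If such a stable /packages/current/... scheme existed, the robots.txt rule collapses to a single line (sketched below; the paths assume the URL layout proposed above):

```text
# Hypothetical robots.txt once version-independent "current" URLs exist:
# crawlers index only the current docs, and the entire archive is excluded.
User-agent: *
Disallow: /packages/archive/
```

This avoids the need to update the file per release, since the excluded prefix never changes.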