This repository has been archived by the owner on Sep 14, 2021. It is now read-only.
Don't deactivate sitemaps for private sites #213
Labels
Type: Enhancement
Enhancement to an existing feature
Description
When the WordPress option to discourage search engines from indexing the site is ticked, wp-sitemaps deactivates sitemaps completely.
Sitemaps should still be active for testing and inspection. Sites not yet in production are typically set to "private" to avoid early indexing of a test/staging site. That does not remove the need to see what's being put into sitemaps, especially when there are many CPTs. Some themes/plugins register a lot of CPTs as "public", such as templates and other internal things. Even if this is wrong, these need to be filtered out of sitemaps before production.
Sitemaps are really just another way of presenting links to content, like HTML archives, RSS and Atom feeds, and the APIs. Google even treats feeds as partial sitemaps. If you submit a feed or a sitemap (index) to Google, you will receive feedback that the links lead to noindexed pages; it does not force indexing.
When a site is "private", there should be no link to the sitemap in robots.txt; otherwise sitemaps should work as normal. Sitemaps MAY also be consumed by parties other than search engines, for whatever purpose.
Those who don't need/want sitemaps, or who want to modify their content, should use a filter. Checking the sitemap of a plain WordPress install, temporarily set to private, should be as simple as pointing the browser at it.
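As a sketch of the filter-based approach suggested above (assuming WordPress 5.5+ core sitemaps; the CPT slug below is a hypothetical example), disabling or trimming sitemaps could look roughly like this in a theme or plugin:

```php
// Sketch, not this plugin's code: WordPress core (5.5+) exposes the
// 'wp_sitemaps_enabled' filter; returning false disables sitemaps entirely.
add_filter( 'wp_sitemaps_enabled', '__return_false' );

// To modify the content instead, e.g. drop an internal "public" CPT from
// sitemaps, core offers 'wp_sitemaps_post_types':
add_filter( 'wp_sitemaps_post_types', function ( $post_types ) {
	// 'my_internal_cpt' is a hypothetical post type slug for illustration.
	unset( $post_types['my_internal_cpt'] );
	return $post_types;
} );
```

Either filter works regardless of the "private" setting, which is why falling back to filters is simpler than deactivating sitemaps outright.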
This also removes the need to inform the user (in the dashboard and the Reading options) that sitemaps are deactivated when the site is "private".
See https://core.trac.wordpress.org/ticket/50400#comment:16