
Sitemap Cache Warmer

This PHP script crawls URLs based on a sitemap. It keeps your cache warm by visiting all the pages in your sitemap at regular intervals. It supports sub-sitemaps (sitemap indexes).
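For reference, a minimal sitemap index of the kind the script can traverse looks like this (the host and file names are placeholders; the namespace is from the sitemaps.org protocol):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-posts.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-pages.xml</loc>
  </sitemap>
</sitemapindex>
```

Each `<loc>` points to a sub-sitemap, which in turn lists the individual page URLs to visit.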


Installation

Rename config.php.example to config.php and change the key parameter to a secret value. Upload the files to your web host, preferably into their own folder (for example, /warm-cache).

Usage

Once you have uploaded the files to your web host, you can visit the following URL to traverse a sitemap and visit all its URLs:
Available parameters

  • key - Secret key, as entered in config.php (Required)
  • url - URL to the root sitemap, usually /sitemap.xml (Required)
  • sleep - Amount of time to sleep between each request, in seconds. Used for throttling on slow hosts. (Optional; default is to not throttle.)
  • from - Number of the URL to start with. (Optional; default is 0.)
  • to - Number of the URL to stop at. Useful for testing a few URLs on a heavy sitemap. (Optional; default is to continue to the end of the sitemap.)
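Putting these parameters together, an invocation URL might look like the following (example.com, the /warm-cache folder, and SECRET_KEY are placeholders for your own host, upload path, and key):

```
https://example.com/warm-cache/warm.php?key=SECRET_KEY&url=https://example.com/sitemap.xml&sleep=1
```

Depending on your server, you may need to URL-encode the value of the url parameter.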

Scheduling the crawl

You will need to use cron to schedule the crawls as often as you wish. Here's an example using cURL and crontab to crawl once every hour:

0 * * * * curl "" >/dev/null 2>&1

If your host provides a CRON URL visiting function, all you need to do is enter the URL, as described in the "Usage" section.


The script outputs JSON with stats about the crawl, for example:

    {
        "status": "OK",
        "message": "Processed sitemap:",
        "count": 4,
        "duration": 9.5575199127197,
        "log": [
            "Processed sub-sitemap:",
            "Processed sub-sitemap:"
        ],
        "visited_urls": [
            ...
        ]
    }

Reporting inaccessible pages

You can set up a mail alert for when some URLs cannot be accessed. Just modify your config.php like this:

return array(
    'key' => '9f316c95a356aab49cf5e4fcf3418295', // Secret key to allow traversing sitemaps
    'reportProblematicUrls' => true,
    'reportProblematicUrlsTo' => ""
);

A URL is reported when it cannot be opened with file_get_contents(). Proper handling of HTTP status codes will be added soon.
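The check described above can be sketched as follows. This is a minimal illustration of the documented behavior, not warm.php's actual code, and fetch_problematic_urls() is a hypothetical helper name:

```php
<?php
// Sketch: a URL counts as "problematic" when file_get_contents() cannot open it.
function fetch_problematic_urls(array $urls): array
{
    $problematic = array();
    foreach ($urls as $url) {
        // @ suppresses the warning; file_get_contents() returns false on failure
        if (@file_get_contents($url) === false) {
            $problematic[] = $url;
        }
    }
    return $problematic;
}

// Usage: an unreachable URL ends up in the report list.
$report = fetch_problematic_urls(array('notaproto://nowhere.invalid/'));
```

Note that this only distinguishes "opened" from "failed to open"; a page returning an error status with a body may still count as opened, which is why status-code handling is listed as future work.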

Using the CLI

You can also launch the script from the CLI to bypass the timeout errors (504) commonly returned by an Nginx server.

php /whatever/you/have/uploaded/it/warm.php url= sleep=0 key=SECRET_KEY
php /whatever/you/have/uploaded/it/warm.php url= sleep=0 key=SECRET_KEY from=10 to=100
php /whatever/you/have/uploaded/it/warm.php url= sleep=0 key=SECRET_KEY to=25

Crawl strategies

If you employ a time-based static page cache, you can schedule your crawls to coincide with half the cache expiration time.

For example, if your expiration time is one hour (3600 seconds), you can schedule the crawls to take place every thirty minutes (1800 seconds).
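Following that example, the crontab entry might look like this (the URL is a placeholder built from the parameters in the "Available parameters" section):

```
*/30 * * * * curl "https://example.com/warm-cache/warm.php?key=SECRET_KEY&url=https://example.com/sitemap.xml" >/dev/null 2>&1
```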

If you have a lot of pages and few visitors, this may cause increased load on the server. For low-traffic deployments, use a long cache expiration time (24 hours or more) and invalidate cache when page content changes.


Requirements

  • SimpleXML
  • allow_url_fopen in php.ini (enabled on most hosts)
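The sitemap parsing the script performs can be sketched with SimpleXML, one of the requirements above. This is an illustrative sketch, not warm.php's actual code; extract_locs() is a hypothetical helper, and namespace handling is omitted for brevity:

```php
<?php
// Sketch: pull <loc> entries out of a sitemap with SimpleXML.
function extract_locs(string $xml): array
{
    $doc = @simplexml_load_string($xml); // false on malformed XML
    if ($doc === false) {
        return array();
    }
    $locs = array();
    // A sitemap index lists <sitemap><loc>; a regular sitemap lists <url><loc>.
    foreach ($doc->sitemap as $entry) {
        $locs[] = trim((string) $entry->loc);
    }
    foreach ($doc->url as $entry) {
        $locs[] = trim((string) $entry->loc);
    }
    return $locs;
}

// Usage: a two-entry urlset yields two URLs to warm.
$xml = '<urlset>'
     . '<url><loc>https://example.com/</loc></url>'
     . '<url><loc>https://example.com/about</loc></url>'
     . '</urlset>';
$urls = extract_locs($xml);
```

When a `<sitemap>` entry is found, the script would fetch that sub-sitemap and recurse; when a `<url>` entry is found, the URL is visited to warm the cache.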


The script has been tested with the WordPress plugins Yoast WordPress SEO and Google XML Sitemaps. It should work with any sitemap that conforms to the sitemap standard.
