Prerender server for web apps, with a crawler that builds the cache from a cron job

When a crawler visits your web app, the .htaccess rules return the prerendered response instead of the web app response. The prerender server returns the complete HTML, generated with headless Google Chrome.

An example .htaccess file is provided for your web app server. Other examples for nginx are available on the GitHub repo:

Blog article [French]

install chrome on the machine (required)

For your local machine, this is straightforward: just install the Chrome browser.

init project

npm i && npm run build

use the precache crawler system

Set up a cron task to create the cache files. The crawler renders every page of the website; later, the prerender server serves these cached files, which speeds up page delivery.


Usage: crawler [options]

  -V, --version    output the version number
  -u, --url [url]  url to crawl
  -h, --help       output usage information

Set the URL of the website to crawl; the crawled pages are saved in the file cache:

./crawler.js -u

Example cron entry to run the crawl every hour:

0 * * * *   /your/path/server/crawler.js -u
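The crawl step above can be sketched roughly as follows. This is a hypothetical illustration (the function and variable names are not from the repo): the real crawler.js renders each page with Puppeteer and writes the HTML to the file cache, while a queue keeps only unvisited same-origin URLs.

```javascript
// Hypothetical sketch of the crawl queue: keep only same-origin links,
// normalize them, and skip URLs already visited. The real crawler.js
// renders each queued page with Puppeteer and writes it to the cache.
function nextUrls(baseUrl, links, visited) {
  const origin = new URL(baseUrl).origin;
  const out = new Set();
  for (const href of links) {
    let u;
    try {
      u = new URL(href, baseUrl); // resolve relative links
    } catch (e) {
      continue; // skip malformed hrefs
    }
    if (u.origin !== origin) continue;        // external link
    const normalized = u.origin + u.pathname; // drop hash and query
    if (!visited.has(normalized)) out.add(normalized);
  }
  return [...out];
}

module.exports = { nextUrls };
```

Each rendered page's links feed back into the queue until no unvisited same-origin URL remains.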

For the server, two examples are provided:

1 - Example with an Express server and Puppeteer, using index.js and ssr.js

Puppeteer: Puppeteer is a Node library which provides a high-level API to control Chrome or Chromium over the DevTools Protocol. Puppeteer runs headless by default, but can be configured to run full (non-headless) Chrome or Chromium.

The Express server catches the request and runs a function that gets the page from the headless Chrome browser. A caching system has been added for the responses.
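A minimal sketch of such a response cache, assuming the renderer is an async function backed by Puppeteer (the class and method names here are illustrative, not the repo's actual API):

```javascript
// Illustrative response cache: the first request for a URL runs the
// renderer (Puppeteer in the real server); later requests are served
// from memory without touching the browser.
class RenderCache {
  constructor(render) {
    this.render = render;   // async (url) => html
    this.store = new Map(); // url -> rendered html
  }

  async get(url) {
    if (this.store.has(url)) {
      return { html: this.store.get(url), cached: true };
    }
    const html = await this.render(url);
    this.store.set(url, html);
    return { html, cached: false };
  }
}

module.exports = { RenderCache };
```

In an Express handler, `cache.get(req.query.url)` would replace the direct Puppeteer call, so only the first hit pays the rendering cost.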

Base code used to start the project, thanks to Eric Bidelman:

2 - Example with the prerender package

Use prerender-index.js to launch the prerender system.

Server config


For internal requests, you can use the server directly from localhost:8000.

For external requests, you must configure your web server. For Apache, use the virtual host config below to redirect requests to the Node server.

Set DocumentRoot to a directory containing an index.html; if the Node server is down, Apache returns this index instead.

Enabling Necessary Apache Modules

sudo a2enmod proxy
sudo a2enmod proxy_http

Create virtual host

<VirtualHost *:443>
    ServerAdmin webmaster@localhost
    DocumentRoot /var/www/...

    ProxyPass / http://localhost:8000/
    ProxyPassReverse / http://localhost:8000/
    ProxyPreserveHost On
    ProxyRequests Off
</VirtualHost>

Activate the virtual host

sudo a2ensite file.conf


To run Node on the server, I recommend PM2. Install it with:

npm install -g pm2

Run the server from the prerender directory with this command:

pm2 start

Web app

An example .htaccess file is provided for your web app. It redirects crawlers to the prerender server; change the URL to match your services.
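The redirect typically matches crawler user agents and proxies them to the prerender service. A hedged sketch of what such rules can look like; the bot list and URLs below are placeholders, not the repo's actual .htaccess:

```apache
# Hypothetical .htaccess sketch: send known crawlers to the prerender
# server, everyone else falls through to the regular web app.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|yandex|twitterbot|facebookexternalhit) [NC]
RewriteRule ^(.*)$ http://localhost:8000/render?url=https://your-site.example/$1 [P,L]
```

The [P] flag proxies the request and needs mod_proxy enabled, as in the server config above.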


Use a curl request to test

Replace localhost:8000 with your service URL.

For Puppeteer

Params:

  • url: page to render
  • renderType: html (default), png, jpeg, pdf

curl http://localhost:8000/render?url= -w %{time_connect}:%{time_starttransfer}:%{time_total}

For prerender

Params:

  • url: page to render
  • renderType: html (default), png, jpeg, pdf, har

curl http://localhost:8000/render?url= -w %{time_connect}:%{time_starttransfer}:%{time_total}

On Linux, kill all Chrome instances with:

pkill chrome