
Open Data Census


Open Data Census is a web application that supports a submission and review workflow to collect information on the state of open data.

Some presentation of data is supported, along with partitioning results by year.

The code base supports multiple censuses in a multi-tenant configuration, where each tenant runs a census from a subdomain.
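Subdomain-based tenancy of this kind can be sketched with a small helper that derives the tenant id from the request host. This is an illustrative sketch, not the application's actual routing code; the function name and base domain are assumptions.

```javascript
// Sketch: derive the tenant (census) id from the request's Host header.
// The helper name and "example.org" base domain are illustrative assumptions.
function tenantFromHost(host, baseDomain) {
  const suffix = '.' + baseDomain;
  return host.endsWith(suffix) ? host.slice(0, -suffix.length) : null;
}

console.log(tenantFromHost('national.example.org', 'example.org')); // national
console.log(tenantFromHost('example.org', 'example.org'));          // null (no tenant)
```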

Tenant administrators can customize parts of the app, from look and feel to key texts on various views.

Demo Site

If you want to check out what an Open Data Census site looks like, we have a demo site running at:



Getting started

Open Data Census is a Node.js app running Express v4, with Postgres 9.4 as the database.

Get a local server set up with the following steps:

NOTE: If your local environment requires commands to be prefixed with sudo, do so.

  1. Install Postgres 9.4 on your machine.
  2. Set up the appropriate credentials on Google and Facebook so they can act as OAuth providers for your app.
    • For Google: follow these steps and then enable the Google+ API.
      • The callback URL for the Google+ API is:
    • For Facebook: follow these steps.
      • The callback URL for Facebook is:
  3. Ensure you are running the supported version of Node.js, which is declared in the 'engines' section of package.json.
  4. Create a database with createdb opendatacensus.
  5. Add this line to your hosts file:
  6. Create a local directory called opendatasurvey and move into it with cd opendatasurvey.
  7. Clone the code with git clone ..
  8. Install the dependencies with npm install.
  9. Create a copy of the settings.json.example file, name it settings.json, and change any values as required.

Now we should be ready to run the server:

  1. Run the app with npm start (the server will run on port 5000)
  2. Log in at with your admin account (the same one configured in the settings.json file)
  3. Load registry and config data at
  4. Load the data for a specific site, e.g.:
  5. Visit the site:

Other things you can do:

  • Run the test suite with npm test
  • Check your code style with npm run lint, using the ESLint config in .eslintrc.

Configuration Sheets

Most of the site configuration is taken from config sheets in Google Sheets. You can use this registry sheet and its linked sheets as examples and clone them as necessary.

NOTE: Ensure your registry and all other config sheets have been published as CSV in Google Sheets (click File, Publish to the Web).
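Because published sheets are consumed as plain CSV, each config document is effectively a header row plus data rows. The following is a minimal parsing sketch only (a naive split with no quoted-field handling); the column names are illustrative assumptions, not the survey's actual schema:

```javascript
// Naive CSV parsing sketch showing the shape of a published config sheet.
// Column names ("censusid", "title") are illustrative assumptions.
// Real CSV parsing should use a proper parser that handles quoted fields.
const csv = 'censusid,title\ndemo,Open Data Survey Demo';
const [header, ...rows] = csv.trim().split('\n').map(line => line.split(','));
const records = rows.map(cells =>
  Object.fromEntries(header.map((name, i) => [name, cells[i]])));

console.log(records[0].title); // Open Data Survey Demo
```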


Deployment

We run deployments on Heroku. The app should run anywhere you can run Node.js and Postgres. The important thing to remember is that the settings.json file used for local development is not available in deployed environments, so several settings must be configured via environment variables. The key settings to ensure are set:
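The environment-variable pattern can be sketched as a config object that prefers the environment and falls back to local defaults. This is a sketch only; PORT is the standard Heroku convention, but DATABASE_URL and SESSION_SECRET here are illustrative assumptions rather than the app's actual settings keys:

```javascript
// Sketch: prefer environment variables, falling back to local-development
// defaults that would otherwise come from settings.json. The names
// DATABASE_URL and SESSION_SECRET are illustrative assumptions.
const config = {
  port: parseInt(process.env.PORT || '5000', 10),
  databaseUrl: process.env.DATABASE_URL || 'postgres://localhost/opendatacensus',
  sessionSecret: process.env.SESSION_SECRET || 'dev-only-secret'
};

console.log(typeof config.port);    // number
console.log(config.databaseUrl.length > 0); // true
```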


i18n For Templates and core code

When templates or strings in core code change, the translations have to be updated. Extract strings into the messages.pot file by running this command:

./node_modules/.bin/gulp pot

You will need the GNU gettext tools. See here for more information.

To update the existing .po files, run:

./node_modules/.bin/gulp update-po

To add a new language, create the directory locale/[language-code]/LC_MESSAGES and create the translation files (*.po). Alternatively, you can copy the locale/en directory to locale/[language-code] and change existing files.

To update the translations cache, run:

./node_modules/.bin/gulp compile-po

i18n For Config

Any column can be internationalised by adding another column whose name has @locale appended. For example, the description column can be translated into German by adding a description@de column. Only languages which have template translations created for them are valid. The locales setting in the config document can be used to restrict the locales available. The first locale in the list is the default locale.
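The column convention above can be illustrated with a small lookup helper that prefers the localised column and falls back to the base column. This is a sketch under the convention described, not the application's actual lookup code:

```javascript
// Sketch of the @locale column convention: for locale "de", prefer
// "description@de" over "description". Illustrative only; the helper
// name and row data are assumptions.
function localisedField(row, field, locale, defaultLocale) {
  const key = `${field}@${locale}`;
  if (locale !== defaultLocale && row[key] !== undefined) {
    return row[key];
  }
  return row[field]; // fall back to the default-locale column
}

const row = {
  description: 'Open data survey',
  'description@de': 'Umfrage zu offenen Daten'
};

console.log(localisedField(row, 'description', 'de', 'en')); // Umfrage zu offenen Daten
console.log(localisedField(row, 'description', 'en', 'en')); // Open data survey
console.log(localisedField(row, 'description', 'fr', 'en')); // Open data survey (fallback)
```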

Running Tests

createdb opendatacensus_test
npm test

Heroku Deployment

TBD: This section needs to be updated. The basics of deployment are now just to use the normal Heroku commands, since one codebase now powers multiple census sites.