Secure the News

An automated scanner and web dashboard for tracking TLS deployment across news organizations.


Getting Started with the Development Environment

The installation instructions below assume you have Docker, Docker Compose, and Pipenv installed on your machine.

From the checkout directory, run the following to jump into a virtualenv:

# The very first time, install the dependencies
$ pipenv install
# Each subsequent time, run this to enter a virtualenv shell
$ pipenv shell

Then run the following, which only needs to be run once after cloning this repo:

make dev-init

To start up the development environment, you can use the normal docker-compose flow:

docker-compose up

If this command completes successfully, your development site will be available at: http://localhost:8000

To import the example data, run:

make dev-createdevdata

This will also create an admin user for the web interface at http://localhost:8000/admin/ (username: test, password: test).

If you want to start the TLS scan for all the news sites in your development environment, run:

make dev-scan

For a full list of the helper commands in the Makefile, run make help. And, of course, you can obtain a shell directly in any of the containers using the usual docker-compose syntax; just keep in mind that the default shell is ash, since the images are Alpine-based. Here is an example of entering the django container:

$ docker-compose exec django ash


If you want to use an interactive debugger, you can use ipdb (a drop-in enhancement of Python's built-in pdb). First, add this line to the area of the code you wish to debug:

import ipdb; ipdb.set_trace()

Second, attach to the running Django container. This must be done in a shell, and it is within this attached shell that you will interact with the debugger. The command is docker attach <ID_OF_DJANGO_CONTAINER>; on UNIX-like systems, you can look up the ID and attach to the container in a single command:

docker attach $(docker-compose ps -q django)

Once you have done this, load the page that runs the code containing your ipdb.set_trace() call, and the debugger will activate in the shell you attached. To detach from the shell without stopping the container, press Ctrl+P followed by Ctrl+Q.

Getting Started with the Production Environment

The production environment is fairly similar to development, with the exception that code changes will not auto-reload inside the container. This makes it a poor environment to develop under, but it reflects a production-like setup: the app runs under gunicorn behind an nginx reverse proxy.

The flow is this:

# Build the prod container (you need to re-run this every time you make a code change)
make build-prod-container

# Run the prod environment
docker-compose -f prod-docker-compose.yaml up

# Run production apptests
make app-tests-prod

# Run ops tests
make ops-tests

# Teardown prod
docker-compose -f prod-docker-compose.yaml down

Updating Python dependencies

New requirements should be added to the appropriate *.in file, for use with pip-compile. There are two Python requirements files:

  • production application dependencies
  • molecule/ local testing and CI requirements (e.g. molecule, safety)

Add the desired dependency to the appropriate .in file, then run:

.. code:: bash

    make update-pip-dependencies

All requirements files will be regenerated based on compatible versions. Multiple .in files can be merged into a single .txt file, for use with pip. The Makefile target handles the merging of multiple files.
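As an illustration of how pip-compile turns an ``.in`` entry into a pinned requirement (the package name, version range, and resulting pin below are hypothetical):

```
# requirements.in (input -- hypothetical entry)
django-environ>=0.4

# requirements.txt (pip-compile output -- exact pin, with a provenance comment)
django-environ==0.4.5    # via -r requirements.in
```
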

Development Fixtures

The createdevdata management command loads Site and Scan data from the fixtures in sites/fixtures/dev.json. If you change the schema of sites.Site or sites.Scan, you will need to update these fixtures, or future invocations of createdevdata will fail.
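The fixtures use Django's standard JSON serialization format, so an entry in dev.json looks roughly like the following (the field names and values here are illustrative — check the real file for the current schema):

```
[
  {
    "model": "sites.site",
    "pk": 1,
    "fields": {
      "name": "Example News",
      "domain": "example-news.com"
    }
  }
]
```
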

The general process for updating the development fixtures is:

  1. Migrate your database to the last migration where the fixtures were updated.

  2. Load the fixtures.

  3. Run the migrations that you've added.

  4. Export the migrated fixtures:

    $ python3 manage.py dumpdata sites.{Site,Scan} > sites/fixtures/dev.json

The test suite includes a smoke test for createdevdata, so you can easily verify that the command is working without disrupting your own development environment.


API

If everything is working correctly, you should be able to find an API endpoint at localhost:8000/api (it will redirect to the current API version).

The API is read-only and can be used to obtain site metadata and the latest scan for a given site (e.g., /api/v1/sites will return a directory of all sites, and an individual site's URL will return details about that site, such as the BBC). Various filter and sort options are supported; click the "filters" dialog in the browsable UI to explore them.

To get all scans for a given site, request the site's scans URL, which can be found in the all_scans field of each site result.
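As a sketch of how a consumer might use these fields — the payload below is hypothetical; only the all_scans field name comes from this README, and valid_https and defaults_to_https are assumed scan attributes:

```python
import json

# A hypothetical site result, shaped after the fields this README mentions;
# real responses contain more fields and may use different names.
payload = json.loads("""
{
  "name": "Example News",
  "domain": "example-news.com",
  "all_scans": "/api/v1/sites/example-news.com/scans/",
  "latest_scan": {"live": true, "valid_https": true, "defaults_to_https": false}
}
""")

# Summarize the site's HTTPS posture from its latest scan.
scan = payload["latest_scan"]
if scan["valid_https"] and scan["defaults_to_https"]:
    status = "defaults to HTTPS"
elif scan["valid_https"]:
    status = "supports HTTPS but does not default to it"
else:
    status = "no valid HTTPS support"

print(f'{payload["name"]}: {status}')
# -> Example News: supports HTTPS but does not default to it
```
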

If you run a public site, note that read access to the API is available to any origin via CORS.

The API is implemented using the Django REST Framework; see its documentation for further details.
