
Get a federally-compliant REST API for your CSV data on cloud.gov in about 60 seconds. ATO not included.







This demo shows you how to get a federally-compliant REST API for your CSV data on cloud.gov in about 60 seconds.

ATO not included.

Trying it out

Deploying the demo

If you haven't already, set up your account and log in to cloud.gov.

Open a terminal, then clone this repository.

~$ git clone

(Alternatively, download a ZIP file and unzip it to create the cloudgov-demo-postgrest folder.)

Change into that directory.

~$ cd cloudgov-demo-postgrest

Copy vars.yml.template to vars.yml, then run the deploy script:

~$ cp vars.yml.template vars.yml
~$ ./

Once that operation completes, run:

~$ cf apps

You'll see the URL your application was assigned. The output will look like:

Getting apps in org sandbox-agencyname / space as

name        requested state   instances   memory   disk   urls
postgrest   started           1/1         512M     1G

You can now run (with your own URL):

~$ curl -s "https://{your-app-url}/Inspection_Results_School_Food_Service?GradeRecent=eq.C" | jq .

You should see a JSON response, nicely formatted by the awesome jq utility. To learn more about querying data in PostgREST, see the docs.
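PostgREST supports richer queries than the single equality filter above; for instance, you can pick columns, sort, and limit the result set with the select, order, and limit query parameters. A sketch against the sample dataset (the column name GradeRecent comes from the example above; adjust for your own data):

~$ curl -s "https://{your-app-url}/Inspection_Results_School_Food_Service?select=GradeRecent&order=GradeRecent.asc&limit=5" | jq .

This returns just the GradeRecent column, sorted ascending, capped at five rows.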

Cleaning up the demo


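The original cleanup steps aren't reproduced here, but a typical teardown with the cf CLI looks like this. The app name postgrest comes from the cf apps output above; the service instance name below is a placeholder, so check cf services for the real one:

~$ cf delete postgrest -r
~$ cf delete-service {your-service-instance}

The -r flag also removes the routes mapped to the app, so the URL stops resolving.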

Next steps

Try using your own data

The sample data is in the data directory. You can drop your own .csv files there before running the deploy script. Each file you put in the directory will be turned into a REST API endpoint. (For example, the file myfile.csv will be available at the REST API endpoint https://{your-app-url}/myfile.)

Try customizing the data import process

You can edit the script in the data directory and go to town. (The default behavior just uses csvkit, which you might find useful.)
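If you want to see what csvkit can do on its own, its csvsql command is the usual workhorse for CSV-to-database loading. A sketch (the connection string and filename are placeholders for illustration):

~$ csvsql --db "postgresql://{user}:{password}@{host}/{dbname}" --insert data/myfile.csv

This infers a table schema from the CSV's contents, creates the table, and inserts the rows.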

DBAs demand answers


This was all made possible through the magic of PostgREST... We're just providing the glue here. With this tech in hand, you're not just a DBA, you're a backend web developer now. Have a look at the PostgREST docs to see what else you can do!

What about access control?

The shared-psql plan in use on cloud.gov is a community resource, and does not give you CREATEUSER or CREATEROLE permissions. You'll need to use the medium-psql plan or higher for that.

You can use the service-connect plugin to connect to the DB and create roles by hand, but a better way is to customize the setup script to do what you want in a repeatable way. (See the script for an example of how to make DB queries without the psql client available.)

How can I scale this up?

cf scale -i INSTANCES -m MEMORY postgrest

...where INSTANCES is the number of web services you want serving requests, and MEMORY is the memory allocated for each instance.
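For example, to run two instances with 1 GB of memory each (instance count and memory size here are arbitrary illustrations, not recommendations):

~$ cf scale -i 2 -m 1G postgrest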

How much does this cost to run?

It's free to run this demo in your sandbox space. See pricing if you want to instead host it in a more durable prototyping or production space. (Note that using both PostgREST and cloud.gov is going to save you a ridiculous amount of money on custom app code and compliance, so make sure you're comparing the cost of hosting to the real costs of going another route.)


See CONTRIBUTING for additional information.

Public domain

This project is in the worldwide public domain. As stated in CONTRIBUTING:

This project is in the public domain within the United States, and copyright and related rights in the work worldwide are waived through the CC0 1.0 Universal public domain dedication.

All contributions to this project will be released under the CC0 dedication. By submitting a pull request, you are agreeing to comply with this waiver of copyright interest.








