The scraper, parser, and database creation scripts for Financial Management Service daily U.S. Treasury statements.

Federal Treasury API


About this branch

The master branch of federal-treasury-api contains code specific to running our project on ScraperWiki. The just-the-api branch contains only the code needed to download the data locally and launch a queryable API. In other words, if you just want to get and host the database, use the just-the-api branch.

About the API

federal-treasury-api is the first-ever electronically-searchable database of the Federal government's daily cash spending and borrowing. It updates daily and the data can be exported in various formats and loaded into various systems.
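As a sketch of the "exported in various formats" point: once the parser has produced the SQLite database, any of its tables can be dumped to CSV with nothing but the Python standard library. The database path and the t1 table name below are assumptions based on this README, not a documented interface:

```python
import csv
import sqlite3

def export_table(conn, table):
    """Return one table as a list of rows, with the column names first."""
    cursor = conn.execute("SELECT * FROM %s" % table)
    header = [col[0] for col in cursor.description]
    return [header] + [list(row) for row in cursor]

def write_csv(conn, table, csv_path):
    """Dump one table from the database to a CSV file."""
    with open(csv_path, "w", newline="") as f:
        csv.writer(f).writerows(export_table(conn, table))

# Hypothetical usage, matching the layout described in this README:
# conn = sqlite3.connect("data/treasury_data.db")
# write_csv(conn, "t1", "t1.csv")
```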

About the data

There are eight tables.

  • I. Operating Cash Balance (t1)
  • II. Deposits and Withdrawals (t2)
  • IIIa. Public Debt Transactions (t3a)
  • IIIb. Adjustment of Public Debt Transactions to Cash Basis (t3b)
  • IIIc. Debt Subject to Limit (t3c)
  • IV. Federal Tax Deposits (t4)
  • V. Short-term Cash Investments (t5)
  • VI. Income Tax Refunds Issued (t6)

Check out the comprehensive data dictionary for more information.
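The eight tables can be inspected directly with Python's built-in sqlite3 module. A minimal sketch (the database path is an assumption based on the layout described in this README):

```python
import sqlite3

def list_tables(conn):
    """Return the names of every table in the SQLite database."""
    rows = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
    ).fetchall()
    return [name for (name,) in rows]

def peek(conn, table, limit=5):
    """Fetch a few rows from one table to get a feel for its shape."""
    return conn.execute("SELECT * FROM %s LIMIT %d" % (table, limit)).fetchall()

# Hypothetical usage:
# conn = sqlite3.connect("data/treasury_data.db")
# print(list_tables(conn))  # expect t1, t2, t3a, t3b, t3c, t4, t5, t6
```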

Obtaining the data

Optionally, set up a virtualenv and activate it. (You need this on ScraperWiki.) Run this from the root of the repository.

virtualenv env
source env/bin/activate

Install dependencies.

pip install -r requirements.pip

Enable the git post-merge hook.

cd .git/hooks
ln -s ../../utils/post-merge .


This one command downloads the (new) fixies and converts them to an SQLite3 database.



Run everything

cd parser

Testing the data

Various tests are contained in the tests directory.

Tests are run every day with ./ and the results are emailed to
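The real tests live in the tests directory; a minimal sanity check in the same spirit, assuming the table names listed above and a hypothetical database path, might look like:

```python
import sqlite3

EXPECTED_TABLES = {"t1", "t2", "t3a", "t3b", "t3c", "t4", "t5", "t6"}

def missing_tables(conn):
    """Return the set of expected tables absent from the database."""
    rows = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    return EXPECTED_TABLES - {name for (name,) in rows}

# Hypothetical invocation against the generated database:
# conn = sqlite3.connect("data/treasury_data.db")
# assert not missing_tables(conn), missing_tables(conn)
```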


Run everything each day around 4:30 PM, right after the data has been released.

30 16 * * * cd path/to/federal-treasury-api && ./

Optional: set up logging

30 16 * * * cd path/to/federal-treasury-api && ./ >> run.log 2>> err.log

Deploying to ScraperWiki

You can run this on any number of servers, but we happen to be using ScraperWiki; check out their documentation for details.


To use ScraperWiki, log in, create a project, click the "SSH in" link, add your SSH key, and SSH in. Then you can SSH to the box like so.


Or add this to your ~/.ssh/config

Host fms
User cc7znvq

and just run

ssh fms

What this ScraperWiki account is

Some notes about how ScraperWiki works:

  • We have a user account in a chroot jail.
  • We don't have root, so we install Python packages in a virtualenv.
  • Files in /home/http get served on the web.
  • The database /home/scraperwiki.sqlite gets served from the SQLite web API.
    • NOTE: /home/scraperwiki.sqlite is simply a symbolic link to data/treasury_data.db, generated by this command: ln -s data/treasury_data.db scraperwiki.sqlite

The directions above still apply for any other service, of course.
