db-to-sqlite


CLI tool for exporting tables or queries from any SQL database to a SQLite file.

Usage: db-to-sqlite [OPTIONS] PATH

  Load data from any database into SQLite.

  https://github.com/simonw/db-to-sqlite

Options:
  --version          Show the version and exit.
  --connection TEXT  SQLAlchemy connection string  [required]
  --all              Detect and copy all tables
  --table TEXT       Name of table to save the results (and copy)
  --sql TEXT         Optional SQL query to run
  --pk TEXT          Optional column to use as a primary key
  --help             Show this message and exit.

For example, to save the content of the blog_entry table from a PostgreSQL database to a local file called blog.db you could do this:

db-to-sqlite blog.db \
    --connection="postgresql://localhost/myblog" \
    --table=blog_entry
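Once the command finishes, the resulting file is a regular SQLite database. A quick way to confirm the table arrived is Python's standard sqlite3 module. The sketch below is self-contained for illustration: it builds an in-memory stand-in for the exported blog.db (with a hypothetical schema), since the real file is whatever db-to-sqlite wrote for you.

```python
import sqlite3

# Stand-in for the exported blog.db; in practice db-to-sqlite creates
# this file for you. ":memory:" keeps the sketch self-contained, and
# the blog_entry schema here is a hypothetical example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE blog_entry (id INTEGER PRIMARY KEY, title TEXT)")
conn.execute("INSERT INTO blog_entry VALUES (1, 'Hello world')")

# List the tables present in the SQLite file after the export
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
print(tables)  # ['blog_entry']
```

Against a real export you would connect to blog.db instead of ":memory:" and skip the CREATE/INSERT lines.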

You can also save the data from all of your tables, effectively creating a SQLite copy of your entire database. Any foreign key relationships will be detected and added to the SQLite database. For example:

db-to-sqlite blog.db \
    --connection="postgresql://localhost/myblog" \
    --all
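Because the copy preserves foreign key relationships, you can see them in the SQLite file with `PRAGMA foreign_key_list`. A minimal self-contained sketch, assuming a hypothetical schema where blog_entry has an author_id column referencing an author table (your tables and columns will differ):

```python
import sqlite3

# Stand-in schema illustrating the kind of foreign key relationship
# db-to-sqlite recreates during an --all copy; "author" and "author_id"
# are hypothetical names for this example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE author (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""CREATE TABLE blog_entry (
    id INTEGER PRIMARY KEY,
    author_id INTEGER REFERENCES author(id))""")

# Each row describes one foreign key:
# (id, seq, referenced table, from column, to column, ...)
fks = conn.execute("PRAGMA foreign_key_list(blog_entry)").fetchall()
print(fks)
```

Running the same PRAGMA against your exported file shows which relationships were detected and carried over.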

If you want to save the results of a custom SQL query, do this:

db-to-sqlite output.db \
    --connection="postgresql://localhost/myblog" \
    --table=query_results \
    --sql="select id, title, created from blog_entry" \
    --pk=id
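With `--pk=id`, the id column becomes the primary key of the new query_results table. You can confirm that with `PRAGMA table_info`, whose final column flags primary key membership. A self-contained sketch using an in-memory stand-in for the table the export would create:

```python
import sqlite3

# Stand-in for the query_results table the export would create,
# with id as the primary key thanks to --pk=id.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE query_results (
    id INTEGER PRIMARY KEY, title TEXT, created TEXT)""")

# table_info rows: (cid, name, type, notnull, dflt_value, pk)
pk_cols = [row[1]
           for row in conn.execute("PRAGMA table_info(query_results)")
           if row[5]]
print(pk_cols)  # ['id']
```

Against the real output.db, the same PRAGMA verifies the primary key without opening a SQLite shell.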