Initial submission.

Base versions of dbtoyaml and yamltodb handle schemas, sequences,
tables, columns, check constraints, primary keys, foreign keys, unique
constraints and indexes.
commit 76254e8840e67ab4c5602ae77d871c834c6550f0 (0 parents)
authored by @jmafc
Showing with 4,334 additions and 0 deletions.
  1. +6 −0 .gitignore
  2. +11 −0 AUTHORS
  3. +29 −0 LICENSE
  4. +4 −0 MANIFEST.in
  5. +7 −0 NEWS
  6. +15 −0 README
  7. +89 −0 docs/Makefile
  8. +45 −0 docs/column.rst
  9. +194 −0 docs/conf.py
  10. +87 −0 docs/constraint.rst
  11. +63 −0 docs/database.rst
  12. +27 −0 docs/dbconn.rst
  13. +61 −0 docs/dbobject.rst
  14. +132 −0 docs/dbtoyaml.rst
  15. +70 −0 docs/index.rst
  16. +43 −0 docs/indexes.rst
  17. +77 −0 docs/overview.rst
  18. +60 −0 docs/schema.rst
  19. +123 −0 docs/table.rst
  20. +85 −0 docs/yamltodb.rst
  21. 0  pyrseas/__init__.py
  22. +114 −0 pyrseas/database.py
  23. +82 −0 pyrseas/dbconn.py
  24. +124 −0 pyrseas/dbobject/__init__.py
  25. +158 −0 pyrseas/dbobject/column.py
  26. +391 −0 pyrseas/dbobject/constraint.py
  27. +163 −0 pyrseas/dbobject/index.py
  28. +172 −0 pyrseas/dbobject/schema.py
  29. +485 −0 pyrseas/dbobject/table.py
  30. +59 −0 pyrseas/dbtoyaml.py
  31. +49 −0 pyrseas/yamltodb.py
  32. 0  setup.cfg
  33. +46 −0 setup.py
  34. 0  tests/__init__.py
  35. +23 −0 tests/dbobject/__init__.py
  36. +501 −0 tests/dbobject/test_constraint.py
  37. +97 −0 tests/dbobject/test_index.py
  38. +95 −0 tests/dbobject/test_schema.py
  39. +106 −0 tests/dbobject/test_sequence.py
  40. +286 −0 tests/dbobject/test_table.py
  41. +155 −0 tests/dbobject/utils.py
6 .gitignore
@@ -0,0 +1,6 @@
+*.pyc
+*~
+MANIFEST
+dist
+docs/_build
+Pyrseas.egg-info
11 AUTHORS
@@ -0,0 +1,11 @@
+Pyrseas was started in 2010.
+
+The PRIMARY AUTHORS are (and/or have been):
+
+ * Joe Abbate
+
+A big THANK YOU goes to:
+
+ * Ken Downs for creating the Andromeda project.
+
+ * Robert Brewer for Post-Facto.
29 LICENSE
@@ -0,0 +1,29 @@
+Copyright (c) 2010 by Joe Abbate, see AUTHORS for more details.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are
+met:
+
+ * Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+
+ * Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions and the following
+ disclaimer in the documentation and/or other materials provided
+ with the distribution.
+
+ * Neither the name of the Pyrseas project nor the names of its
+ contributors may be used to endorse or promote products derived
+ from this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
4 MANIFEST.in
@@ -0,0 +1,4 @@
+include AUTHORS ChangeLog LICENSE NEWS
+recursive-include tests *.py
+recursive-include docs *
+prune docs/_build
7 NEWS
@@ -0,0 +1,7 @@
+0.1.0 (5-Apr-2011)
+
+ * Initial release
+
+ - dbtoyaml and yamltodb support PostgreSQL schemas, tables,
+ sequences, check constraints, primary keys, foreign keys, unique
+ constraints and indexes.
15 README
@@ -0,0 +1,15 @@
+=======
+Pyrseas
+=======
+
+Pyrseas provides a framework and utilities to upgrade and maintain a
+relational database. Its purpose is to enhance and follow through on
+the concepts of the `Andromeda Project
+<http://www.andromeda-project.org/>`_.
+
+Pyrseas currently includes the dbtoyaml utility to create a `YAML
+<http://yaml.org/>`_ description of a PostgreSQL database's tables,
+and the yamltodb utility to generate SQL statements to modify a
+database to match an input YAML specification.
+
+Pyrseas is distributed under the BSD license.
89 docs/Makefile
@@ -0,0 +1,89 @@
+# Makefile for Sphinx documentation
+#
+
+# You can set these variables from the command line.
+SPHINXOPTS =
+SPHINXBUILD = sphinx-build
+PAPER =
+BUILDDIR = _build
+
+# Internal variables.
+PAPEROPT_a4 = -D latex_paper_size=a4
+PAPEROPT_letter = -D latex_paper_size=letter
+ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
+
+.PHONY: help clean html dirhtml pickle json htmlhelp qthelp latex changes linkcheck doctest
+
+help:
+ @echo "Please use \`make <target>' where <target> is one of"
+ @echo " html to make standalone HTML files"
+ @echo " dirhtml to make HTML files named index.html in directories"
+ @echo " pickle to make pickle files"
+ @echo " json to make JSON files"
+ @echo " htmlhelp to make HTML files and a HTML help project"
+ @echo " qthelp to make HTML files and a qthelp project"
+ @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
+ @echo " changes to make an overview of all changed/added/deprecated items"
+ @echo " linkcheck to check all external links for integrity"
+ @echo " doctest to run all doctests embedded in the documentation (if enabled)"
+
+clean:
+ -rm -rf $(BUILDDIR)/*
+
+html:
+ $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
+ @echo
+ @echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
+
+dirhtml:
+ $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
+ @echo
+ @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
+
+pickle:
+ $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
+ @echo
+ @echo "Build finished; now you can process the pickle files."
+
+json:
+ $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
+ @echo
+ @echo "Build finished; now you can process the JSON files."
+
+htmlhelp:
+ $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
+ @echo
+ @echo "Build finished; now you can run HTML Help Workshop with the" \
+ ".hhp project file in $(BUILDDIR)/htmlhelp."
+
+qthelp:
+ $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
+ @echo
+ @echo "Build finished; now you can run "qcollectiongenerator" with the" \
+ ".qhcp project file in $(BUILDDIR)/qthelp, like this:"
+ @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/Pyrseas.qhcp"
+ @echo "To view the help file:"
+ @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/Pyrseas.qhc"
+
+latex:
+ $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
+ @echo
+ @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
+ @echo "Run \`make all-pdf' or \`make all-ps' in that directory to" \
+ "run these through (pdf)latex."
+
+changes:
+ $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
+ @echo
+ @echo "The overview file is in $(BUILDDIR)/changes."
+
+linkcheck:
+ $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
+ @echo
+ @echo "Link check complete; look for any errors in the above output " \
+ "or in $(BUILDDIR)/linkcheck/output.txt."
+
+doctest:
+ $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
+ @echo "Testing of doctests in the sources finished, look at the " \
+ "results in $(BUILDDIR)/doctest/output.txt."
45 docs/column.rst
@@ -0,0 +1,45 @@
+Columns
+=======
+
+.. module:: pyrseas.column
+
+The :mod:`column` module defines two classes, :class:`Column` derived
+from :class:`DbSchemaObject` and :class:`ColumnDict`, derived from
+:class:`DbObjectDict`.
+
+Column
+------
+
+:class:`Column` is derived from
+:class:`~pyrseas.dbobject.DbSchemaObject` and represents a column in a
+table. Its :attr:`keylist` attributes are the schema name and the
+table name.
+
+A :class:`Column` has the following attributes: :attr:`name`,
+:attr:`type`, :attr:`not_null` and :attr:`default`. The :attr:`number`
+attribute is also present but is not made visible externally.
+
+.. autoclass:: Column
+
+.. automethod:: Column.to_map
+
+.. automethod:: Column.add
+
+.. automethod:: Column.drop
+
+.. automethod:: Column.set_sequence_default
+
+.. automethod:: Column.diff_map
+
+Column Dictionary
+-----------------
+
+Class :class:`ColumnDict` is a dictionary derived from
+:class:`~pyrseas.dbobject.DbObjectDict` and represents the collection
+of columns in a database, across multiple tables. It is indexed by the
+schema name and table name, and each value is a list of
+:class:`Column` objects.
+
+.. autoclass:: ColumnDict
+
+.. automethod:: ColumnDict.from_map
194 docs/conf.py
@@ -0,0 +1,194 @@
+# -*- coding: utf-8 -*-
+#
+# Pyrseas documentation build configuration file, created by
+# sphinx-quickstart on Fri Dec 17 22:06:15 2010.
+#
+# This file is execfile()d with the current directory set to its containing dir.
+#
+# Note that not all possible configuration values are present in this
+# autogenerated file.
+#
+# All configuration values have a default; values that are commented out
+# serve to show the default.
+
+import sys, os
+
+# If extensions (or modules to document with autodoc) are in another directory,
+# add these directories to sys.path here. If the directory is relative to the
+# documentation root, use os.path.abspath to make it absolute, like shown here.
+sys.path.append(os.path.abspath('.'))
+
+# -- General configuration -----------------------------------------------------
+
+# Add any Sphinx extension module names here, as strings. They can be extensions
+# coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
+extensions = ['sphinx.ext.autodoc']
+
+# Add any paths that contain templates here, relative to this directory.
+templates_path = ['_templates']
+
+# The suffix of source filenames.
+source_suffix = '.rst'
+
+# The encoding of source files.
+#source_encoding = 'utf-8'
+
+# The master toctree document.
+master_doc = 'index'
+
+# General information about the project.
+project = u'Pyrseas'
+copyright = u'2011, Joe Abbate'
+
+# The version info for the project you're documenting, acts as replacement for
+# |version| and |release|, also used in various other places throughout the
+# built documents.
+#
+# The short X.Y version.
+version = '0.1.0'
+# The full version, including alpha/beta/rc tags.
+release = '0.1.0'
+
+# The language for content autogenerated by Sphinx. Refer to documentation
+# for a list of supported languages.
+#language = None
+
+# There are two options for replacing |today|: either, you set today to some
+# non-false value, then it is used:
+#today = ''
+# Else, today_fmt is used as the format for a strftime call.
+#today_fmt = '%B %d, %Y'
+
+# List of documents that shouldn't be included in the build.
+#unused_docs = []
+
+# List of directories, relative to source directory, that shouldn't be searched
+# for source files.
+exclude_trees = ['_build']
+
+# The reST default role (used for this markup: `text`) to use for all documents.
+#default_role = None
+
+# If true, '()' will be appended to :func: etc. cross-reference text.
+#add_function_parentheses = True
+
+# If true, the current module name will be prepended to all description
+# unit titles (such as .. function::).
+#add_module_names = True
+
+# If true, sectionauthor and moduleauthor directives will be shown in the
+# output. They are ignored by default.
+#show_authors = False
+
+# The name of the Pygments (syntax highlighting) style to use.
+pygments_style = 'sphinx'
+
+# A list of ignored prefixes for module index sorting.
+#modindex_common_prefix = []
+
+
+# -- Options for HTML output ---------------------------------------------------
+
+# The theme to use for HTML and HTML Help pages. Major themes that come with
+# Sphinx are currently 'default' and 'sphinxdoc'.
+html_theme = 'default'
+
+# Theme options are theme-specific and customize the look and feel of a theme
+# further. For a list of options available for each theme, see the
+# documentation.
+#html_theme_options = {}
+
+# Add any paths that contain custom themes here, relative to this directory.
+#html_theme_path = []
+
+# The name for this set of Sphinx documents. If None, it defaults to
+# "<project> v<release> documentation".
+#html_title = None
+
+# A shorter title for the navigation bar. Default is the same as html_title.
+#html_short_title = None
+
+# The name of an image file (relative to this directory) to place at the top
+# of the sidebar.
+#html_logo = None
+
+# The name of an image file (within the static path) to use as favicon of the
+# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
+# pixels large.
+#html_favicon = None
+
+# Add any paths that contain custom static files (such as style sheets) here,
+# relative to this directory. They are copied after the builtin static files,
+# so a file named "default.css" will overwrite the builtin "default.css".
+html_static_path = ['_static']
+
+# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
+# using the given strftime format.
+#html_last_updated_fmt = '%b %d, %Y'
+
+# If true, SmartyPants will be used to convert quotes and dashes to
+# typographically correct entities.
+#html_use_smartypants = True
+
+# Custom sidebar templates, maps document names to template names.
+#html_sidebars = {}
+
+# Additional templates that should be rendered to pages, maps page names to
+# template names.
+#html_additional_pages = {}
+
+# If false, no module index is generated.
+#html_use_modindex = True
+
+# If false, no index is generated.
+#html_use_index = True
+
+# If true, the index is split into individual pages for each letter.
+#html_split_index = False
+
+# If true, links to the reST sources are added to the pages.
+#html_show_sourcelink = True
+
+# If true, an OpenSearch description file will be output, and all pages will
+# contain a <link> tag referring to it. The value of this option must be the
+# base URL from which the finished HTML is served.
+#html_use_opensearch = ''
+
+# If nonempty, this is the file name suffix for HTML files (e.g. ".xhtml").
+#html_file_suffix = ''
+
+# Output file base name for HTML help builder.
+htmlhelp_basename = 'Pyrseasdoc'
+
+
+# -- Options for LaTeX output --------------------------------------------------
+
+# The paper size ('letter' or 'a4').
+#latex_paper_size = 'letter'
+
+# The font size ('10pt', '11pt' or '12pt').
+#latex_font_size = '10pt'
+
+# Grouping the document tree into LaTeX files. List of tuples
+# (source start file, target name, title, author, documentclass [howto/manual]).
+latex_documents = [
+ ('index', 'Pyrseas.tex', u'Pyrseas Documentation',
+ u'Joe Abbate', 'manual'),
+]
+
+# The name of an image file (relative to this directory) to place at the top of
+# the title page.
+#latex_logo = None
+
+# For "manual" documents, if this is true, then toplevel headings are parts,
+# not chapters.
+#latex_use_parts = False
+
+# Additional stuff for the LaTeX preamble.
+#latex_preamble = ''
+
+# Documents to append as an appendix to all manuals.
+#latex_appendices = []
+
+# If false, no module index is generated.
+#latex_use_modindex = True
87 docs/constraint.rst
@@ -0,0 +1,87 @@
+Constraints
+===========
+
+.. module:: pyrseas.constraint
+
+The :mod:`constraint` module defines six classes: :class:`Constraint`
+derived from :class:`DbSchemaObject`, classes
+:class:`CheckConstraint`, :class:`PrimaryKey`, :class:`ForeignKey` and
+:class:`UniqueConstraint` derived from :class:`Constraint`, and
+:class:`ConstraintDict` derived from :class:`DbObjectDict`.
+
+Constraint
+----------
+
+Class :class:`Constraint` is derived from
+:class:`~pyrseas.dbobject.DbSchemaObject` and represents a constraint
+on a database table. Its :attr:`keylist` attributes are the schema
+name, the table name and the constraint name.
+
+.. autoclass:: Constraint
+
+.. automethod:: Constraint.key_columns
+
+.. automethod:: Constraint.add
+
+.. automethod:: Constraint.drop
+
+Check Constraint
+----------------
+
+:class:`CheckConstraint` is derived from :class:`Constraint` and represents
+a CHECK constraint.
+
+.. autoclass:: CheckConstraint
+
+.. automethod:: CheckConstraint.to_map
+
+.. automethod:: CheckConstraint.add
+
+.. automethod:: CheckConstraint.diff_map
+
+Primary Key
+-----------
+
+:class:`PrimaryKey` is derived from :class:`Constraint` and represents
+a primary key constraint.
+
+.. autoclass:: PrimaryKey
+
+.. automethod:: PrimaryKey.to_map
+
+Foreign Key
+-----------
+
+:class:`ForeignKey` is derived from :class:`Constraint` and represents
+a foreign key constraint.
+
+.. autoclass:: ForeignKey
+
+.. automethod:: ForeignKey.ref_columns
+
+.. automethod:: ForeignKey.to_map
+
+.. automethod:: ForeignKey.add
+
+Unique Constraint
+-----------------
+
+:class:`UniqueConstraint` is derived from :class:`Constraint` and
+represents a UNIQUE, non-primary key constraint.
+
+.. autoclass:: UniqueConstraint
+
+.. automethod:: UniqueConstraint.to_map
+
+Constraint Dictionary
+---------------------
+
+Class :class:`ConstraintDict` is a dictionary derived from
+:class:`~pyrseas.dbobject.DbObjectDict` and represents the collection
+of constraints in a database.
+
+.. autoclass:: ConstraintDict
+
+.. automethod:: ConstraintDict.from_map
+
+.. automethod:: ConstraintDict.diff_map
63 docs/database.rst
@@ -0,0 +1,63 @@
+Databases
+=========
+
+.. module:: pyrseas.database
+
+The :mod:`database` module defines :class:`Database`.
+
+Database
+--------
+
+A :class:`Database` is initialized with a
+:class:`~pyrseas.dbconn.DbConnection` object. It consists of one or
+two :class:`Dicts`. A :class:`Dicts` object holds various dictionary
+objects derived from :class:`~pyrseas.dbobject.DbObjectDict`, e.g.,
+:class:`~pyrseas.schema.SchemaDict`,
+:class:`~pyrseas.table.ClassDict`, and
+:class:`~pyrseas.column.ColumnDict`. The key for each dictionary is a
+Python tuple (or a single value in the case of
+:class:`SchemaDict`). For example, the
+:class:`~pyrseas.table.ClassDict` dictionary is indexed by (`schema
+name`, `table name`). In addition, object instances in each dictionary
+are linked to related objects in other dictionaries, e.g., columns are
+linked to the tables where they belong.
+
+The :attr:`db` :class:`Dicts` object (always present) defines the
+database schemas, including their tables and other objects, by
+querying the system catalogs. The :attr:`ndb` :class:`Dicts` object
+defines the schemas based on the :obj:`input_map` supplied to the
+:meth:`diff_map` method.
+
+The :meth:`to_map` method returns, and the :meth:`diff_map` method
+takes as input, a dictionary as shown below. It uses 'schema
+`schema_name`' as the key for each schema. The value corresponding to
+each 'schema `schema_name`' is another dictionary using 'sequences',
+'tables', etc., as keys and more dictionaries as values. For example::
+
+ {'schema public':
+ {'sequence seq1': { ... },
+ 'sequence seq2': { ... },
+ 'table t1': { ... },
+ 'table t2': { ... },
+ 'table t3': { ... }
+ },
+ 'schema s1': { ... },
+ 'schema s2': { ... }
+ }
+
+Refer to :class:`~pyrseas.table.Sequence` and
+:class:`~pyrseas.table.Table` for details on the lower level
+dictionaries.
+
+.. autoclass:: Database
+
+Methods :meth:`from_catalog` and :meth:`from_map` are for internal
+use. Methods :meth:`to_map` and :meth:`diff_map` are the external API.
+
+.. automethod:: Database.from_catalog
+
+.. automethod:: Database.from_map
+
+.. automethod:: Database.to_map
+
+.. automethod:: Database.diff_map
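The tuple-keyed dictionaries described above behave like ordinary Python dicts. The sketch below uses hypothetical schema and table names, with plain dicts standing in for the `DbObjectDict` subclasses:

```python
# Plain dicts standing in for ClassDict and SchemaDict.
# ClassDict-style: keyed by a (schema name, table name) tuple.
class_dict = {
    ('public', 't1'): {'columns': ['c1', 'c2']},
    ('public', 't2'): {'columns': ['c21']},
    ('s1', 't2'): {'columns': ['c21', 'c22']},
}

# SchemaDict-style: keyed by a single value, the schema name.
schema_dict = {'public': {}, 's1': {}}

# Look up table t2 in schema s1 -- the same table name can appear
# under different schemas because the schema is part of the key.
print(class_dict[('s1', 't2')]['columns'])  # ['c21', 'c22']
```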
27 docs/dbconn.rst
@@ -0,0 +1,27 @@
+Database Connections
+====================
+
+.. module:: pyrseas.dbconn
+
+The :mod:`dbconn` module defines :class:`DbConnection`.
+
+Database Connection
+-------------------
+
+A :class:`DbConnection` is a helper class representing a connection to
+a `PostgreSQL <http://www.postgresql.org>`_ database via the `Psycopg
+<http://initd.org/psycopg/>`_ adapter. A :class:`DbConnection` is not
+necessarily connected. It will typically connect to the database when
+the :class:`~pyrseas.dbobject.DbObjectDict`
+:meth:`~pyrseas.dbobject.DbObjectDict.fetch` method is first
+invoked. It is normally disconnected just before the
+:class:`~pyrseas.database.Database`
+:meth:`~pyrseas.database.Database.from_catalog` method returns.
+
+.. autoclass:: DbConnection
+
+.. automethod:: DbConnection.connect
+
+.. automethod:: DbConnection.fetchone
+
+.. automethod:: DbConnection.fetchall
61 docs/dbobject.rst
@@ -0,0 +1,61 @@
+Database Objects
+================
+
+.. module:: pyrseas.dbobject
+
+The :mod:`dbobject` module defines two low-level classes and an
+intermediate class. Most Pyrseas classes are derived from either
+:class:`DbObject` or :class:`DbObjectDict`.
+
+Database Object
+---------------
+
+A :class:`DbObject` represents a database object such as a
+schema, table, or column, defined in a system catalog. It is
+initialized from a dictionary of attributes. Derived classes should
+define a :attr:`keylist` that is a list of attribute names that
+uniquely identify each object instance within the database.
+
+.. autoclass:: DbObject
+
+.. automethod:: DbObject.key
+
+Database Object Dictionary
+--------------------------
+
+A :class:`DbObjectDict` represents a collection of :class:`DbObject` objects
+and is derived from the Python built-in type :class:`dict`. If a
+:class:`~pyrseas.dbconn.DbConnection` object is used for
+initialization, an internal method is called to initialize the
+dictionary from the database catalogs. The :class:`DbObjectDict`
+:meth:`fetch` method fetches all objects using the :attr:`query`
+defined by derived classes. Derived classes should also define a
+:attr:`cls` attribute for the associated :class:`DbObject` class,
+e.g., :class:`~pyrseas.schema.SchemaDict` sets :attr:`cls` to
+:class:`~pyrseas.schema.Schema`.
+
+.. autoclass:: DbObjectDict
+
+.. automethod:: DbObjectDict.fetch
+
+Schema Object
+-------------
+
+A :class:`DbSchemaObject` is derived from :class:`DbObject`. It is
+used as a base class for objects owned by a schema and to define
+certain common methods. This is different from the
+:class:`~pyrseas.schema.Schema` that represents the schema itself.
+
+.. autoclass:: DbSchemaObject
+
+.. automethod:: DbSchemaObject.extern_key
+
+.. automethod:: DbSchemaObject.qualname
+
+.. automethod:: DbSchemaObject.unqualify
+
+.. automethod:: DbSchemaObject.drop
+
+.. automethod:: DbSchemaObject.rename
+
+.. automethod:: DbSchemaObject.set_search_path
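The keylist mechanism can be sketched in a few lines. This is a simplified illustration with hypothetical classes, not the actual Pyrseas code:

```python
# Simplified sketch of the keylist idea: an object is initialized from
# a dictionary of attributes, and key() returns the tuple of keylist
# attributes that uniquely identifies it within the database.
class DbObject:
    keylist = ['name']

    def __init__(self, **attrs):
        for attr, val in attrs.items():
            setattr(self, attr, val)

    def key(self):
        return tuple(getattr(self, attr) for attr in self.keylist)


class Table(DbObject):
    # A table is identified by its schema plus its own name.
    keylist = ['schema', 'name']


t = Table(schema='public', name='t1')
print(t.key())  # ('public', 't1')
```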
132 docs/dbtoyaml.rst
@@ -0,0 +1,132 @@
+dbtoyaml - Database to YAML
+===========================
+
+Name
+----
+
+dbtoyaml -- extract the schema of a PostgreSQL database in YAML format
+
+Synopsis
+--------
+
+::
+
+ dbtoyaml [option...] dbname
+
+Description
+-----------
+
+:program:`dbtoyaml` is a utility for extracting the schema of a
+PostgreSQL database to a `YAML <http://yaml.org>`_ formatted
+specification. Note that `JSON <http://json.org/>`_ is an official
+subset of YAML version 1.2, so the :program:`dbtoyaml` output should
+also be compatible with JSON tools.
+
+The output format is as follows::
+
+ schema public:
+ table t1:
+ check_constraints:
+ check_expr: (c2 > 123)
+ columns:
+ - c2
+ columns:
+ - c1:
+ not_null: true
+ type: integer
+ - c2:
+ type: smallint
+ - c3:
+ default: 'false'
+ type: boolean
+ - c4:
+ type: text
+ primary_key:
+ t1_pkey:
+ access_method: btree
+ columns:
+ - c1
+ foreign_keys:
+ t1_c2_fkey:
+ columns:
+ - c2
+ references:
+ columns:
+ - c21
+ schema: s1
+ table: t2
+ schema s1:
+ table t2:
+ columns:
+ - c21:
+ not_null: true
+ type: integer
+ - c22:
+ type: character varying(16)
+ primary_key:
+ t2_pkey:
+ access_method: btree
+ columns:
+ - c21
+
+The above should be mostly self-explanatory. The example database has
+two tables, named ``t1`` and ``t2``, the first in the ``public``
+schema and the second in a schema named ``s1``. The ``columns:``
+specifications directly under each table list each column in that
+table, in the same order as shown by PostgreSQL. The specifications
+``primary_key:``, ``foreign_keys:`` and ``check_constraints:`` define
+PRIMARY KEY, FOREIGN KEY and CHECK constraints for a given
+table. Additional specifications (not shown) define unique constraints
+and indexes.
+
+:program:`dbtoyaml` currently supports extracting information about
+schemas, sequences, tables, columns, primary keys, foreign keys,
+unique constraints, check constraints and indexes.
+
+Options
+-------
+
+:program:`dbtoyaml` accepts the following command-line arguments:
+
+dbname
+
+ Specifies the name of the database whose schema is to be extracted.
+
+-H `host`, --host= `host`
+
+ Specifies the host name of the machine on which the PostgreSQL
+ server is running. The default host name is 'localhost'.
+
+-n `schema`, --schema= `schema`
+
+ Extracts only a schema matching `schema`. By default, all schemas
+ are extracted.
+
+-p `port`, --port= `port`
+
+ Specifies the TCP port on which the PostgreSQL server is listening
+ for connections. The default port number is 5432.
+
+-t `table`, --table= `table`
+
+ Extracts only tables matching `table`; may be given more than once.
+
+-U `username`, --user= `username`
+
+ User name to connect as. The default user name is provided by the
+ environment variable :envvar:`USER`.
+
+Examples
+--------
+
+To extract a database called ``moviesdb`` into a file::
+
+ dbtoyaml moviesdb > moviesdb.yaml
+
+To extract only the schema named ``store``::
+
+ dbtoyaml --schema=store moviesdb > moviesdb.yaml
+
+To extract the tables named ``film`` and ``category``::
+
+ dbtoyaml -t film -t category moviesdb > moviesdb.yaml
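Because JSON is a subset of YAML 1.2, as noted above, an equivalent specification fragment written as JSON can be parsed with the standard library alone. The fragment below is a hand-written illustration mirroring part of the sample output, not actual dbtoyaml output:

```python
import json

# A fragment of the sample map above, rewritten as JSON so it can be
# read without a third-party YAML library.
spec = json.loads("""
{"schema s1":
   {"table t2":
      {"columns": [{"c21": {"not_null": true, "type": "integer"}},
                   {"c22": {"type": "character varying(16)"}}]}}}
""")

cols = spec['schema s1']['table t2']['columns']
print([list(col)[0] for col in cols])  # ['c21', 'c22']
```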
70 docs/index.rst
@@ -0,0 +1,70 @@
+Pyrseas
+=======
+
+Pyrseas provides a framework and utilities to upgrade and maintain a
+relational database. Its purpose is to enhance and follow through on
+the concepts of the `Andromeda Project
+<http://www.andromeda-project.org/>`_. The name comes from `Python
+<http://www.python.org/>`_, the programming language, and `Perseas
+<http://en.wikipedia.org/wiki/Perseus>`_ [#]_, the Greek mythological hero
+who rescued Andromeda from a sea monster [#]_.
+
+Pyrseas currently includes the dbtoyaml utility to create a `YAML
+<http://yaml.org/>`_ description of a PostgreSQL database's tables,
+and the yamltodb utility to generate SQL statements to modify a
+database to match an input YAML specification.
+
+
+Contents:
+
+.. toctree::
+ :maxdepth: 2
+
+ overview
+.. toctree::
+ :maxdepth: 1
+
+ dbtoyaml
+ yamltodb
+
+API Reference
+-------------
+
+Currently, the only external APIs are the classes
+:class:`~pyrseas.dbconn.DbConnection` and
+:class:`~pyrseas.database.Database` and the methods
+:meth:`~pyrseas.database.Database.to_map` and
+:meth:`~pyrseas.database.Database.diff_map` of the latter. Other
+classes and methods are documented mainly for developer use.
+
+.. toctree::
+ :maxdepth: 2
+
+ dbobject
+ dbconn
+ database
+ schema
+ table
+ column
+ constraint
+ indexes
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
+
+
+.. rubric:: Footnotes
+
+.. [#] The common English name for Perseas is Perseus and the Ancient
+ Greek name is Perseos. However, in modern Greek Περσέας_ is the
+ more common spelling for the mythical hero. The project would be
+ Πυρσέας or ΠΥΡΣΕΑΣ in Greek.
+
+.. _Περσέας: http://en.wiktionary.org/wiki/%CE%A0%CE%B5%CF%81%CF%83%CE%AD%CE%B1%CF%82
+
+.. [#] He is better known for having killed Medusa.
+
43 docs/indexes.rst
@@ -0,0 +1,43 @@
+Indexes
+=======
+
+.. module:: pyrseas.index
+
+The :mod:`index` module defines two classes, :class:`Index` and
+:class:`IndexDict`, derived from :class:`DbSchemaObject` and
+:class:`DbObjectDict`, respectively.
+
+Index
+-----
+
+Class :class:`Index` is derived from
+:class:`~pyrseas.dbobject.DbSchemaObject` and represents an index on a
+database table, other than a primary key or unique constraint
+index. Its :attr:`keylist` attributes are the schema name, the table
+name and the index name.
+
+An :class:`Index` has the following attributes: :attr:`access_method`,
+:attr:`unique`, and :attr:`keycols`.
+
+.. autoclass:: Index
+
+.. automethod:: Index.key_columns
+
+.. automethod:: Index.to_map
+
+.. automethod:: Index.create
+
+.. automethod:: Index.diff_map
+
+Index Dictionary
+----------------
+
+Class :class:`IndexDict` is derived from
+:class:`~pyrseas.dbobject.DbObjectDict` and represents the collection
+of indexes in a database.
+
+.. autoclass:: IndexDict
+
+.. automethod:: IndexDict.from_map
+
+.. automethod:: IndexDict.diff_map
77 docs/overview.rst
@@ -0,0 +1,77 @@
+.. -*- coding: utf-8 -*-
+
+Overview
+========
+
+Pyrseas provides a framework and utilities to create, upgrade and
+maintain a `PostgreSQL <http://www.postgresql.org/>`_ database. Its
+purpose is to enhance and follow through on the concepts of the
+`Andromeda Project <http://www.andromeda-project.org/>`_.
+
+Whereas Andromeda expects the database designer or developer to
+provide a single `YAML <http://yaml.org/>`_ specification file of the
+database to be created, Pyrseas allows the development database to be
+created using the familiar SQL CREATE statements. The developer can
+then run the `dbtoyaml` utility to generate the YAML specification from
+the database. The spec can then be stored in any desired VCS
+repository. Similarly, she can add columns or modify tables or other
+objects using SQL ALTER statements and regenerate the YAML spec with
+dbtoyaml.
+
+When ready to create or upgrade a test or production database, the
+`yamltodb` utility can be used with the YAML spec as input, to generate
+a script of SQL CREATE or ALTER statements to modify the database so
+that it matches the input spec.
+
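As a rough illustration of the comparison involved, here is a deliberately naive sketch; it is not yamltodb's actual algorithm and handles only missing schemas and tables:

```python
# Deliberately naive illustration of the yamltodb idea: compare a
# current database map against a target map and emit SQL for whatever
# is missing. The real tool also handles ALTERs, constraints,
# dependency ordering, etc.
def naive_diff(current, target):
    stmts = []
    for schema_key, tables in target.items():
        schema = schema_key.split()[1]
        if schema_key not in current:
            stmts.append('CREATE SCHEMA %s' % schema)
        for table_key in tables:
            if table_key not in current.get(schema_key, {}):
                stmts.append('CREATE TABLE %s.%s ()'
                             % (schema, table_key.split()[1]))
    return stmts


current = {'schema public': {'table t1': {}}}
target = {'schema public': {'table t1': {}, 'table t2': {}},
          'schema s1': {'table t3': {}}}
print(naive_diff(current, target))
# ['CREATE TABLE public.t2 ()', 'CREATE SCHEMA s1', 'CREATE TABLE s1.t3 ()']
```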
+Andromeda also uses the YAML specification to generate a PHP-based
+application to maintain the database tables. Pyrseas `dbappgen`
+utility will allow a secondary YAML spec to generate a Python-based
+administrative application for database maintenance, which can be
+activated using `dbapprun`.
+
+Use Cases
+---------
+
+The following two sections discuss the main scenarios where Pyrseas
+tools may be helpful. The first deals with the problem of controlling
+database structural changes while the second examines the topic of
+repetitive database maintenance operations.
+
+Version Control
+---------------
+
+The case for implementing a tool to facilitate version control over
+SQL databases has been made in a couple of blog posts: `Version
+Control, Part 1: Pre-SQL
+<http://pyrseas.wordpress.com/2011/02/01/version-control-part-i-pre-sql/>`_
+and `Version Control, Part 2: SQL Databases
+<http://pyrseas.wordpress.com/2011/02/07/version-control-part-2-sql-databases/>`_. In
+summary, SQL data definition commands are generally incompatible with
+traditional version control approaches which usually require
+comparisons (diffs) between revisions of source files.
+
+The Pyrseas version control tools are not designed to be the ultimate
+SQL database version control solution. Instead, they are aimed at
+assisting two or more developers or DBAs in sharing changes to the
+underlying database as they implement a database application. The
+sharing can occur through a centralized or distributed VCS. The
+Pyrseas tools may even be used by a single DBA in conjunction with a
+distributed VCS to quickly explore alternative designs. The tools can
+also help to share changes with a conventional QA team, but may
+require additional controls for final releases and production
+installations.
+
+Data Maintenance
+----------------
+
+Pyrseas data administration tools (to be developed) aim to supplement
+the agile database development process mentioned above. While there
+are tools such as `pgAdmin III <http://www.pgadmin.org/>`_ that can be
+used for routine data entry tasks, their scope of action is usually a
+single table. For example, if you're entering data for a customer
+invoice, you need to know (or find by querying) the customer ID. On
+the other hand, `Django's admin site application
+<http://docs.djangoproject.com/en/1.2/intro/tutorial02/>`_ can present
+more than one table on a web page, but it requires defining the
+database "model" in Python and has limitations on how the database can
+be structured.
60 docs/schema.rst
@@ -0,0 +1,60 @@
+Schemas
+=======
+
+.. module:: pyrseas.schema
+
+The :mod:`schema` module defines two classes, :class:`Schema` and
+:class:`SchemaDict`, derived from :class:`DbObject` and
+:class:`DbObjectDict`, respectively.
+
+Schema
+------
+
+:class:`Schema` is derived from :class:`~pyrseas.dbobject.DbObject`
+and represents a database schema, i.e., a collection of tables and
+other objects. The 'public' schema is special, as in most contexts an
+unqualified object is assumed to be part of it, e.g., table "t" is
+usually shorthand for table "public.t".
+
+For now, the schema :attr:`name` is the only attribute and is of
+course the identifying attribute in the :class:`Schema`
+:attr:`keylist`.
+
+.. autoclass:: Schema
+
+.. automethod:: Schema.extern_key
+
+.. automethod:: Schema.to_map
+
+.. automethod:: Schema.create
+
+.. automethod:: Schema.drop
+
+.. automethod:: Schema.rename
+
+Schema Dictionary
+-----------------
+
+:class:`SchemaDict` is derived from
+:class:`~pyrseas.dbobject.DbObjectDict`. It is a dictionary that
+represents the collection of schemas in a database. Certain internal
+schemas (information_schema, pg_catalog, etc.) owned by the 'postgres'
+user are excluded.
+
+.. autoclass:: SchemaDict
+
+Method :meth:`from_map` is called from :class:`Database`
+:meth:`~pyrseas.database.Database.from_map` to start a recursive
+interpretation of the input map. The :obj:`inmap` argument is the same
+as input to the :meth:`~pyrseas.database.Database.diff_map` method of
+:class:`Database`. The :obj:`newdb` argument is the holder of
+:class:`~pyrseas.dbobject.DbObjectDict`-derived dictionaries which is
+filled in as the recursive interpretation proceeds.
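To make the recursive interpretation concrete, here is a hedged sketch of the kind of input map involved. The key spellings (`'schema x'`, `'table y'`) mirror the :program:`dbtoyaml` output format shown elsewhere in these docs, but the map below is an illustrative assumption, not an authoritative specification:

```python
# Hypothetical input map for SchemaDict.from_map: top-level keys name
# schemas, nested keys name the objects each schema contains.
input_map = {
    'schema public': {
        'table t1': {
            'columns': [{'c1': {'type': 'integer', 'not_null': True}},
                        {'c2': {'type': 'text'}}]}},
    'schema s1': {
        'table t2': {
            'columns': [{'c21': {'type': 'integer'}}]}},
}

# from_map would walk the top-level keys to find each schema by name;
# splitting the "objtype name" external keys recovers the schema names
schemas = sorted(key.split(' ', 1)[1]
                 for key in input_map if key.startswith('schema '))
print(schemas)
```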
+
+.. automethod:: SchemaDict.from_map
+
+.. automethod:: SchemaDict.link_refs
+
+.. automethod:: SchemaDict.to_map
+
+.. automethod:: SchemaDict.diff_map
123 docs/table.rst
@@ -0,0 +1,123 @@
+Tables, Views and Sequences
+===========================
+
+.. module:: pyrseas.table
+
+The :mod:`table` module defines four classes, :class:`DbClass` derived
+from :class:`DbSchemaObject`, classes :class:`Sequence` and
+:class:`Table` derived from :class:`DbClass`, and :class:`ClassDict`,
+derived from :class:`DbObjectDict`.
+
+Database Class
+--------------
+
+Class :class:`DbClass` is derived from
+:class:`~pyrseas.dbobject.DbSchemaObject` and represents a table, view
+or sequence as defined in the PostgreSQL `pg_class` catalog. Note:
+Views are not implemented yet.
+
+Sequence
+--------
+
+Class :class:`Sequence` is derived from :class:`DbClass` and
+represents a sequence generator. Its :attr:`keylist` attributes are
+the schema name and the sequence name.
+
+A :class:`Sequence` has the following attributes: :attr:`start_value`,
+:attr:`increment_by`, :attr:`max_value`, :attr:`min_value` and
+:attr:`cache_value`.
+
+The map returned by :meth:`to_map` and expected as argument by
+:meth:`diff_map` has the following structure::
+
+ {'sequence seq1':
+ {'start_value': 1,
+ 'increment_by': 1,
+ 'max_value': None,
+ 'min_value': None,
+ 'cache_value': 1
+ }
+ }
+
+Only the inner dictionary is passed to :meth:`diff_map`. The values
+shown are the defaults, so in practice an empty dictionary is also
+acceptable.
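Since the values shown are the defaults, a consumer of the map can treat a missing or empty inner dictionary as equivalent to the full one. A minimal sketch of that idea (the `sequence_attrs` helper name is ours, not part of pyrseas):

```python
# Documented default attribute values for a Sequence map
SEQ_DEFAULTS = {'start_value': 1, 'increment_by': 1, 'max_value': None,
                'min_value': None, 'cache_value': 1}

def sequence_attrs(inner):
    """Merge an inner sequence map over the documented defaults."""
    attrs = dict(SEQ_DEFAULTS)
    attrs.update(inner)
    return attrs

# An empty inner dictionary yields exactly the default attributes,
# while explicit values override the corresponding default
print(sequence_attrs({}))
print(sequence_attrs({'cache_value': 10})['cache_value'])
```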
+
+.. autoclass:: Sequence
+
+.. automethod:: Sequence.get_attrs
+
+.. automethod:: Sequence.to_map
+
+.. automethod:: Sequence.create
+
+.. automethod:: Sequence.add_owner
+
+.. automethod:: Sequence.diff_map
+
+Table
+-----
+
+Class :class:`Table` is derived from :class:`DbClass` and represents a
+database table. Its :attr:`keylist` attributes are the schema name and
+the table name.
+
+The map returned by :meth:`to_map` and expected as argument by
+:meth:`diff_map` has a structure similar to the following::
+
+    {'table t1':
+     {'columns':
+          [
+           {'c1': {'type': 'integer', 'not_null': True}},
+           {'c2': {'type': 'text'}},
+           {'c3': {'type': 'smallint'}},
+           {'c4': {'type': 'date', 'default': 'now()'}}
+          ],
+      'primary_key':
+          {'t1_prim_key':
+           {'columns': ['c1', 'c2'], 'access_method': 'btree'}},
+      'foreign_keys':
+          {'t1_fgn_key1':
+           {'columns': ['c2', 'c3'],
+            'references':
+                {'table': 't2', 'columns': ['pc2', 'pc1']}},
+           't1_fgn_key2':
+           {'columns': ['c2'],
+            'references': {'table': 't3', 'columns': ['qc1']}}},
+      'unique_constraints': {...},
+      'indexes': {...}
+     }
+    }
+
+The values for :obj:`unique_constraints` and :obj:`indexes` follow a
+pattern similar to :obj:`primary_key`, but there can be more than one
+such specification.
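A consumer can walk such a map with ordinary dictionary and list operations; the sketch below (plain Python, not the pyrseas API) shows how column order and foreign-key targets can be recovered from the structure above:

```python
# The table map from above, used to show how a consumer can walk it
table_map = {
    'table t1': {
        'columns': [
            {'c1': {'type': 'integer', 'not_null': True}},
            {'c2': {'type': 'text'}},
            {'c3': {'type': 'smallint'}},
            {'c4': {'type': 'date', 'default': 'now()'}}],
        'primary_key': {
            't1_prim_key': {'columns': ['c1', 'c2'],
                            'access_method': 'btree'}},
        'foreign_keys': {
            't1_fgn_key1': {'columns': ['c2', 'c3'],
                            'references': {'table': 't2',
                                           'columns': ['pc2', 'pc1']}},
            't1_fgn_key2': {'columns': ['c2'],
                            'references': {'table': 't3',
                                           'columns': ['qc1']}}}}}

inner = table_map['table t1']
# 'columns' is a list of one-key dicts, so column order is preserved
colnames = [next(iter(col)) for col in inner['columns']]
# each foreign key names the table it references
fk_targets = sorted(fk['references']['table']
                    for fk in inner['foreign_keys'].values())
print(colnames, fk_targets)
```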
+
+.. autoclass:: Table
+
+.. automethod:: Table.column_names
+
+.. automethod:: Table.to_map
+
+.. automethod:: Table.create
+
+.. automethod:: Table.diff_map
+
+Class Dictionary
+----------------
+
+Class :class:`ClassDict` is derived from
+:class:`~pyrseas.dbobject.DbObjectDict` and represents the collection
+of tables, views and sequences in a database.
+
+.. autoclass:: ClassDict
+
+.. automethod:: ClassDict.from_map
+
+.. automethod:: ClassDict.link_refs
+
+.. automethod:: ClassDict.diff_map
85 docs/yamltodb.rst
@@ -0,0 +1,85 @@
+yamltodb - YAML to Database
+===========================
+
+Name
+----
+
+yamltodb -- generate SQL statements to update a PostgreSQL database to
+match the schema specified in a YAML file
+
+Synopsis
+--------
+
+::
+
+ yamltodb [option...] dbname yamlspec
+
+Description
+-----------
+
+:program:`yamltodb` is a utility for generating SQL statements to
+update a PostgreSQL database so that it will match the schema
+specified in an input `YAML <http://yaml.org>`_ formatted
+specification file.
+
+For example, given the input file shown under :doc:`dbtoyaml`,
+:program:`yamltodb` outputs the following SQL statements::
+
+ CREATE SCHEMA s1;
+ CREATE TABLE t1 (
+ c1 integer NOT NULL,
+ c2 smallint,
+ c3 boolean DEFAULT false,
+ c4 text);
+ CREATE TABLE s1.t2 (
+ c21 integer NOT NULL,
+ c22 character varying(16));
+ ALTER TABLE s1.t2 ADD CONSTRAINT t2_pkey PRIMARY KEY (c21);
+ ALTER TABLE t1 ADD CONSTRAINT t1_pkey PRIMARY KEY (c1);
+ ALTER TABLE t1 ADD CONSTRAINT t1_c2_fkey FOREIGN KEY (c2) REFERENCES s1.t2 (c21);
+
+Options
+-------
+
+:program:`yamltodb` accepts the following command-line arguments:
+
+dbname
+
+    Specifies the name of the database whose schema is to be analyzed.
+
+yamlspec
+
+ Specifies the location of the YAML specification.
+
+-H `host`, --host= `host`
+
+ Specifies the host name of the machine on which the PostgreSQL
+ server is running. The default host name is 'localhost'.
+
+-p `port`, --port= `port`
+
+ Specifies the TCP port on which the PostgreSQL server is listening
+ for connections. The default port number is 5432.
+
+-U `username`, --user= `username`
+
+ User name to connect as. The default user name is provided by the
+ environment variable :envvar:`USER`.
+
+-1, --single-transaction
+
+ Wrap the generated statements in BEGIN/COMMIT. This ensures that
+ either all the statements complete successfully, or no changes are
+ applied.
+
+Examples
+--------
+
+Given a YAML file named `moviesdb.yaml`, to generate SQL statements to
+update a database called `mymovies`::
+
+ yamltodb mymovies moviesdb.yaml
+
+To generate the statements as above and immediately update `mymovies`::
+
+ yamltodb mymovies moviesdb.yaml | psql mymovies
0  pyrseas/__init__.py
No changes.
114 pyrseas/database.py
@@ -0,0 +1,114 @@
+# -*- coding: utf-8 -*-
+"""
+ pyrseas.database
+ ~~~~~~~~~~~~~~~~
+
+ A `Database` is initialized with a DbConnection object. It
+ consists of one or two `Dicts` objects, each holding various
+ dictionary objects. The `db` Dicts object defines the database
+ schemas, including their tables and other objects, by querying the
+ system catalogs. The `ndb` Dicts object defines the schemas based
+ on the `input_map` supplied to the `from_map` method.
+"""
+from pyrseas.dbobject.schema import SchemaDict
+from pyrseas.dbobject.table import ClassDict
+from pyrseas.dbobject.column import ColumnDict
+from pyrseas.dbobject.constraint import ConstraintDict
+from pyrseas.dbobject.index import IndexDict
+
+
+def flatten(lst):
+ "Flatten a list possibly containing lists to a single list"
+ for elem in lst:
+ if isinstance(elem, list) and not isinstance(elem, basestring):
+ for subelem in flatten(elem):
+ yield subelem
+ else:
+ yield elem
+
+
+class Database(object):
+ """A database definition, from its catalogs and/or a YAML spec."""
+
+ class Dicts(object):
+ """A holder for dictionaries (maps) describing a database"""
+
+ def __init__(self, dbconn=None):
+ """Initialize the various DbObjectDict-derived dictionaries
+
+ :param dbconn: a DbConnection object
+ """
+ self.schemas = SchemaDict(dbconn)
+ self.tables = ClassDict(dbconn)
+ self.columns = ColumnDict(dbconn)
+ self.constraints = ConstraintDict(dbconn)
+ self.indexes = IndexDict(dbconn)
+
+ def __init__(self, dbconn):
+ """Initialize the database
+
+ :param dbconn: a DbConnection object
+ """
+ self.dbconn = dbconn
+ self.db = None
+
+ def from_catalog(self):
+ """Populate the database objects by querying the catalogs
+
+ The `db` holder is populated by various DbObjectDict-derived
+ classes by querying the catalogs. The objects in the
+ dictionary are then linked to related objects, e.g., columns
+    are linked to the tables they belong to.
+ """
+ self.db = self.Dicts(self.dbconn)
+ if self.dbconn.conn:
+ self.dbconn.conn.close()
+ self.db.tables.link_refs(self.db.columns, self.db.constraints,
+ self.db.indexes)
+ self.db.schemas.link_refs(self.db.tables)
+
+ def from_map(self, input_map):
+ """Populate the new database objects from the input map
+
+ :param input_map: a YAML map defining the new database
+
+ The `ndb` holder is populated by various DbObjectDict-derived
+ classes by traversing the YAML input map. The objects in the
+ dictionary are then linked to related objects, e.g., columns
+    are linked to the tables they belong to.
+ """
+ self.ndb = self.Dicts()
+ self.ndb.schemas.from_map(input_map, self.ndb)
+ self.ndb.tables.link_refs(self.ndb.columns, self.ndb.constraints,
+ self.ndb.indexes)
+ self.ndb.schemas.link_refs(self.ndb.tables)
+
+ def to_map(self):
+ """Convert the db maps to a single hierarchy suitable for YAML
+
+ :return: a YAML-suitable dictionary (without Python objects)
+ """
+ if not self.db:
+ self.from_catalog()
+ return self.db.schemas.to_map()
+
+ def diff_map(self, input_map):
+ """Generate SQL to transform an existing database
+
+ :param input_map: a YAML map defining the new database
+ :return: list of SQL statements
+
+ Compares the existing database definition, as fetched from the
+ catalogs, to the input YAML map and generates SQL statements
+ to transform the database into the one represented by the
+ input.
+ """
+ if not self.db:
+ self.from_catalog()
+ self.from_map(input_map)
+ stmts = self.db.schemas.diff_map(self.ndb.schemas)
+ stmts.append(self.db.tables.diff_map(self.ndb.tables))
+ stmts.append(self.db.constraints.diff_map(self.ndb.constraints))
+ stmts.append(self.db.indexes.diff_map(self.ndb.indexes))
+ stmts.append(self.db.columns.diff_map(self.ndb.columns))
+ return [s for s in flatten(stmts)]
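The `flatten` helper above can be exercised on its own. A Python 3 re-sketch of the same logic (the original targets Python 2, hence the `basestring` check, which a list can never satisfy anyway):

```python
def flatten(lst):
    "Flatten a list possibly containing nested lists into a flat sequence"
    for elem in lst:
        if isinstance(elem, list):
            yield from flatten(elem)
        else:
            yield elem

# diff_map appends both single statements and lists of statements;
# flatten yields the single flat sequence of SQL statements returned
stmts = ['CREATE SCHEMA s1',
         ['CREATE TABLE t1 (c1 integer)', []],
         'ALTER TABLE t1 ADD PRIMARY KEY (c1)']
print(list(flatten(stmts)))
```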
82 pyrseas/dbconn.py
@@ -0,0 +1,82 @@
+# -*- coding: utf-8 -*-
+"""
+ pyrseas.dbconn
+ ~~~~~~~~~~~~~~
+
+ A `DbConnection` is a helper class representing a connection to a
+ PostgreSQL database.
+"""
+
+import os
+
+from psycopg2 import connect
+from psycopg2.extras import DictConnection
+
+
+class DbConnection(object):
+ """A database connection, possibly disconnected"""
+
+ def __init__(self, dbname, user=None, host='localhost', port=5432):
+ """Initialize the connection information
+
+ :param dbname: database name
+ :param user: user name
+ :param host: host name
+ :param port: host port number
+ """
+ self.dbname = dbname
+ self.user = user
+ self.host = host
+ self.port = port
+ self.conn = None
+
+ def connect(self):
+ """Connect to the database
+
+ If user is None, the USER environment variable is used
+ instead. The password is either not required or supplied by
+ other means, e.g., a $HOME/.pgpass file.
+ """
+ self.conn = connect("host=%s port=%d dbname=%s user=%s" % (
+ self.host, self.port, self.dbname,
+ self.user or os.getenv("USER")),
+ connection_factory=DictConnection)
+ self._execute("set search_path to public, pg_catalog")
+
+ def _execute(self, query):
+ """Create a cursor, execute a query and return the cursor"""
+ curs = self.conn.cursor()
+ try:
+ curs.execute(query)
+ except Exception, exc:
+ exc.args += (query, )
+ raise
+ return curs
+
+ def fetchone(self, query):
+ """Execute a single row SELECT query and return data
+
+ :param query: a SELECT query to be executed
+ :return: a psycopg2 DictRow
+
+ The cursor is closed and a rollback is issued.
+ """
+ curs = self._execute(query)
+ data = curs.fetchone()
+ curs.close()
+ self.conn.rollback()
+ return data
+
+ def fetchall(self, query):
+ """Execute a SELECT query and return data
+
+ :param query: a SELECT query to be executed
+ :return: a list of psycopg2 DictRow's
+
+ The cursor is closed and a rollback is issued.
+ """
+ curs = self._execute(query)
+ data = curs.fetchall()
+ curs.close()
+ self.conn.rollback()
+ return data
124 pyrseas/dbobject/__init__.py
@@ -0,0 +1,124 @@
+# -*- coding: utf-8 -*-
+"""
+ pyrseas.dbobject
+ ~~~~~~~~~~~~~~~~
+
+ This defines two low level classes and an intermediate class.
+ Most Pyrseas classes are derived from either DbObject or
+ DbObjectDict.
+"""
+
+
+class DbObject(object):
+ "A single object in a database catalog, e.g., a schema, a table, a column"
+
+ def __init__(self, **attrs):
+ """Initialize the catalog object from a dictionary of attributes
+
+ :param attrs: the dictionary of attributes
+
+ Attributes without a value are discarded.
+ """
+ for key, val in attrs.items():
+ if val:
+ setattr(self, key, val)
+
+ def key(self):
+ """Return a tuple that identifies the database object
+
+ :return: a single value or a tuple
+ """
+ lst = [getattr(self, k) for k in self.keylist]
+ return len(lst) == 1 and lst[0] or tuple(lst)
+
+
+class DbSchemaObject(DbObject):
+ "A database object that is owned by a certain schema"
+
+ def extern_key(self):
+ """Return the key to be used in external maps for this object
+
+ :return: string
+ """
+ return '%s %s' % (self.objtype.lower(), self.name)
+
+ def qualname(self):
+ """Return the schema-qualified name of the object
+
+ :return: string
+
+ No qualification is used if the schema is 'public'.
+ """
+ return self.schema == 'public' and self.name \
+ or "%s.%s" % (self.schema, self.name)
+
+ def unqualify(self):
+ """Adjust the schema and table name if the latter is qualified"""
+ if hasattr(self, 'table') and '.' in self.table:
+ tbl = self.table
+ dot = tbl.index('.')
+ if self.schema == tbl[:dot]:
+ self.table = tbl[dot + 1:]
+
+ def drop(self):
+ """Return a SQL DROP statement for the object
+
+ :return: SQL statement
+ """
+ if not hasattr(self, 'dropped') or not self.dropped:
+ self.dropped = True
+ return "DROP %s %s" % (self.objtype, self.qualname())
+ return []
+
+ def rename(self, newname):
+ """Return a SQL ALTER statement to RENAME the object
+
+ :return: SQL statement
+ """
+ return "ALTER %s %s RENAME TO %s" % (self.objtype, self.qualname(),
+ newname)
+
+ def set_search_path(self):
+ """Return a SQL SET search_path if not in the 'public' schema"""
+ stmt = ''
+ if self.schema != 'public':
+ stmt = "SET search_path TO %s, pg_catalog" % self.schema
+ return stmt
+
+
+class DbObjectDict(dict):
+ """A dictionary of database objects, all of the same type"""
+
+ cls = DbObject
+ query = ''
+
+ def __init__(self, dbconn=None):
+ """Initialize the dictionary
+
+ :param dbconn: a DbConnection object
+
+ If dbconn is not None, the _from_catalog method is called to
+ initialize the dictionary from the catalogs.
+ """
+ dict.__init__(self)
+ self.dbconn = dbconn
+ if dbconn:
+ self._from_catalog()
+
+ def _from_catalog(self):
+ """Initialize the dictionary by querying the catalogs
+
+        This may be overridden by derived classes as needed.
+ """
+ for obj in self.fetch():
+ self[obj.key()] = obj
+
+ def fetch(self):
+ """Fetch all objects from the catalogs using the class query
+
+ :return: list of self.cls objects
+ """
+ if not self.dbconn.conn:
+ self.dbconn.connect()
+ data = self.dbconn.fetchall(self.query)
+ return [self.cls(**dict(row)) for row in data]
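The `key()`/`keylist` mechanics can be illustrated standalone. A minimal Python 3 sketch of the same idea (a simplification for illustration, not the pyrseas classes themselves):

```python
# Each class declares which attributes identify an object; key() turns
# them into a dictionary key (single value, or a tuple for compound keys)
class DbObject:
    keylist = ['name']

    def __init__(self, **attrs):
        for key, val in attrs.items():
            if val:
                setattr(self, key, val)

    def key(self):
        vals = [getattr(self, k) for k in self.keylist]
        return vals[0] if len(vals) == 1 else tuple(vals)

class TableLike(DbObject):
    # schema-owned objects are keyed by (schema, name)
    keylist = ['schema', 'name']

print(DbObject(name='public').key())
print(TableLike(schema='s1', name='t2').key())
```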
158 pyrseas/dbobject/column.py
@@ -0,0 +1,158 @@
+# -*- coding: utf-8 -*-
+"""
+ pyrseas.column
+ ~~~~~~~~~~~~~~
+
+ This module defines two classes: Column derived from
+ DbSchemaObject and ColumnDict derived from DbObjectDict.
+"""
+from pyrseas.dbobject import DbObjectDict, DbSchemaObject
+
+
+class Column(DbSchemaObject):
+ "A table column definition"
+
+ keylist = ['schema', 'table']
+
+ def to_map(self):
+ """Convert a column to a YAML-suitable format
+
+ :return: dictionary
+ """
+ dct = self.__dict__.copy()
+ for k in self.keylist:
+ del dct[k]
+ del dct['number'], dct['name']
+ return {self.name: dct}
+
+ def add(self):
+ """Return a string to specify the column in a CREATE or ALTER TABLE
+
+ :return: partial SQL statement
+ """
+ stmt = "%s %s" % (self.name, self.type)
+ if hasattr(self, 'not_null'):
+ stmt += ' NOT NULL'
+ if hasattr(self, 'default'):
+ if not self.default.startswith('nextval'):
+ stmt += ' DEFAULT ' + self.default
+ return stmt
+
+ def drop(self):
+ """Return string to drop the column via ALTER TABLE
+
+ :return: SQL statement
+ """
+ return "ALTER TABLE %s DROP COLUMN %s" % (self.table, self.name)
+
+ def set_sequence_default(self):
+ """Return SQL statements to set a nextval() DEFAULT
+
+ :return: list of SQL statements
+ """
+ stmts = []
+ pth = self.set_search_path()
+ if pth:
+ stmts.append(pth)
+ stmts.append("ALTER TABLE %s ALTER COLUMN %s SET DEFAULT %s" % (
+ self.table, self.name, self.default))
+ return stmts
+
+ def diff_map(self, incol):
+ """Generate SQL to transform an existing column
+
+        :param incol: a YAML map defining the new column
+        :return: comma-separated partial SQL statements
+
+ Compares the column to an input column and generates partial
+ SQL statements to transform it into the one represented by the
+ input.
+ """
+ stmts = []
+ base = "ALTER COLUMN %s " % self.name
+ # check NOT NULL
+ if not hasattr(self, 'not_null') and hasattr(incol, 'not_null'):
+ stmts.append(base + "SET NOT NULL")
+ if hasattr(self, 'not_null') and not hasattr(incol, 'not_null'):
+ stmts.append(base + "DROP NOT NULL")
+ # check data types
+ if not hasattr(self, 'type'):
+ raise ValueError("Column '%s' missing datatype" % self.name)
+ if not hasattr(incol, 'type'):
+ raise ValueError("Input column '%s' missing datatype" % incol.name)
+ if self.type != incol.type:
+ # validate type conversion?
+ stmts.append(base + "TYPE %s" % incol.type)
+ # check DEFAULTs
+ if not hasattr(self, 'default') and hasattr(incol, 'default'):
+ stmts.append(base + "SET DEFAULT %s" % incol.default)
+ if hasattr(self, 'default') and not hasattr(incol, 'default'):
+ stmts.append(base + "DROP DEFAULT")
+ return ", ".join(stmts)
+
+
+class ColumnDict(DbObjectDict):
+ "The collection of columns in tables in a database"
+
+ cls = Column
+ query = \
+ """SELECT nspname AS schema, relname AS table, attname AS name,
+ attnum AS number, format_type(atttypid, atttypmod) AS type,
+ attnotnull AS not_null, adsrc AS default
+ FROM pg_attribute JOIN pg_class ON (attrelid = pg_class.oid)
+ JOIN pg_namespace ON (relnamespace = pg_namespace.oid)
+ JOIN pg_roles ON (nspowner = pg_roles.oid)
+ LEFT JOIN pg_attrdef ON (attrelid = pg_attrdef.adrelid
+ AND attnum = pg_attrdef.adnum)
+ WHERE relkind = 'r'
+ AND (nspname = 'public' OR rolname <> 'postgres')
+ AND attnum > 0
+ AND NOT attisdropped
+ ORDER BY nspname, relname, attnum"""
+
+ def _from_catalog(self):
+ """Initialize the dictionary of columns by querying the catalogs"""
+ for col in self.fetch():
+ sch, tbl = col.key()
+ if (sch, tbl) not in self:
+ self[(sch, tbl)] = []
+ self[(sch, tbl)].append(col)
+
+ def from_map(self, table, incols):
+ """Initialize the dictionary of columns by converting the input list
+
+ :param table: table owning the columns
+ :param incols: YAML list defining the columns
+ """
+ if not incols:
+ raise ValueError("Table '%s' has no columns" % table.name)
+ cols = self[(table.schema, table.name)] = []
+
+ for col in incols:
+ for key in col.keys():
+ cols.append(Column(schema=table.schema, table=table.name,
+ name=key, **col[key]))
+
+ def diff_map(self, incols):
+ """Generate SQL to transform existing columns
+
+ :param incols: a YAML map defining the new columns
+ :return: list of SQL statements
+
+ Compares the existing column definitions, as fetched from the
+ catalogs, to the input map and generates SQL statements to
+ transform the columns accordingly.
+
+ This takes care of dropping columns that are not present in
+ the input map. It's separate so that it can be done last,
+ after other table, constraint and index changes.
+ """
+ stmts = []
+ if not incols or not self:
+ return stmts
+ incolnames = [n.name for n in incols.values()[0]]
+ for col in self.values()[0]:
+ if col.name not in incolnames:
+ stmts.append(col.drop())
+
+ return stmts
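The clause built by `Column.add` can be sketched as a standalone function; the helper below is hypothetical (name and signature are ours) but follows the same logic:

```python
def column_add_clause(name, type_, not_null=False, default=None):
    """Build a column clause for CREATE/ALTER TABLE, as Column.add does."""
    stmt = "%s %s" % (name, type_)
    if not_null:
        stmt += " NOT NULL"
    # sequence-backed defaults (nextval) are set separately, so skip them
    if default is not None and not default.startswith('nextval'):
        stmt += " DEFAULT " + default
    return stmt

print(column_add_clause('c1', 'integer', not_null=True))
print(column_add_clause('c4', 'date', default='now()'))
```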
391 pyrseas/dbobject/constraint.py
@@ -0,0 +1,391 @@
+# -*- coding: utf-8 -*-
+"""
+ pyrseas.constraint
+ ~~~~~~~~~~~~~~~~~~
+
+ This module defines six classes: Constraint derived from
+ DbSchemaObject, CheckConstraint, PrimaryKey, ForeignKey and
+ UniqueConstraint derived from Constraint, and ConstraintDict
+ derived from DbObjectDict.
+"""
+from pyrseas.dbobject import DbObjectDict, DbSchemaObject
+
+
+class Constraint(DbSchemaObject):
+ """A constraint definition, such as a primary key, foreign key or
+ unique constraint"""
+
+ keylist = ['schema', 'table', 'name']
+
+ def key_columns(self):
+ """Return comma-separated list of key column names
+
+ :return: string
+ """
+ return ", ".join(self.keycols)
+
+ def _qualtable(self):
+ """Return a schema-qualified name for a newly constructed object"""
+ return DbSchemaObject(schema=self.schema, name=self.table).qualname()
+
+ def add(self):
+ """Return string to add the constraint via ALTER TABLE
+
+ :return: SQL statement
+
+ Works as is for primary keys and unique constraints but has
+ to be overridden for check constraints and foreign keys.
+ """
+        return "ALTER TABLE %s ADD CONSTRAINT %s %s (%s)" % (
+            self._qualtable(), self.name, self.objtype, self.key_columns())
+
+ def drop(self):
+ """Return string to drop the constraint via ALTER TABLE
+
+ :return: SQL statement
+ """
+ if not hasattr(self, 'dropped') or not self.dropped:
+ self.dropped = True
+ return "ALTER TABLE %s DROP CONSTRAINT %s" % (
+                self._qualtable(), self.name)
+ return []
+
+
+class CheckConstraint(Constraint):
+ "A check constraint definition"
+
+ objtype = "CHECK"
+
+ def to_map(self, dbcols):
+ """Convert a check constraint definition to a YAML-suitable format
+
+ :param dbcols: dictionary of dbobject columns
+ :return: dictionary
+ """
+ dct = self.__dict__.copy()
+ for k in self.keylist:
+ del dct[k]
+ dct['columns'] = [dbcols[k - 1] for k in self.keycols]
+ del dct['keycols']
+ return {self.name: dct}
+
+ def add(self):
+ """Return string to add the CHECK constraint via ALTER TABLE
+
+ :return: SQL statement
+ """
+ return "ALTER TABLE %s ADD CONSTRAINT %s %s (%s)" % (
+ self._qualtable(), self.name, self.objtype, self.expression)
+
+ def diff_map(self, inchk):
+ """Generate SQL to transform an existing CHECK constraint
+
+ :param inchk: a YAML map defining the new CHECK constraint
+ :return: list of SQL statements
+
+ Compares the CHECK constraint to an input constraint and generates
+ SQL statements to transform it into the one represented by the
+ input.
+ """
+ stmts = []
+ # TODO: to be implemented
+ return stmts
+
+
+class PrimaryKey(Constraint):
+ "A primary key constraint definition"
+
+ objtype = "PRIMARY KEY"
+
+ def to_map(self, dbcols):
+ """Convert a primary key definition to a YAML-suitable format
+
+ :param dbcols: dictionary of dbobject columns
+ :return: dictionary
+ """
+ dct = self.__dict__.copy()
+ for k in self.keylist:
+ del dct[k]
+ dct['columns'] = [dbcols[k - 1] for k in self.keycols]
+ del dct['keycols']
+ return {self.name: dct}
+
+ def diff_map(self, inpk):
+ """Generate SQL to transform an existing primary key
+
+ :param inpk: a YAML map defining the new primary key
+ :return: list of SQL statements
+
+ Compares the primary key to an input primary key and generates
+ SQL statements to transform it into the one represented by the
+ input.
+ """
+ stmts = []
+ # TODO: to be implemented (via ALTER DROP and ALTER ADD)
+ return stmts
+
+
+class ForeignKey(Constraint):
+ "A foreign key constraint definition"
+
+ objtype = "FOREIGN KEY"
+
+ def ref_columns(self):
+ """Return comma-separated list of reference column names
+
+ :return: string
+ """
+ return ", ".join(self.ref_cols)
+
+ def to_map(self, dbcols, refcols):
+ """Convert a foreign key definition to a YAML-suitable format
+
+        :param dbcols: dictionary of dbobject columns
+        :param refcols: dictionary of the referenced table's columns
+ :return: dictionary
+ """
+ dct = self.__dict__.copy()
+ for k in self.keylist:
+ del dct[k]
+ dct['columns'] = [dbcols[k - 1] for k in self.keycols]
+ del dct['keycols']
+ ref_cols = [refcols[k - 1] for k in self.ref_cols]
+ dct['references'] = {'table': dct['ref_table'], 'columns': ref_cols}
+ if 'ref_schema' in dct:
+ dct['references'].update(schema=dct['ref_schema'])
+ del dct['ref_schema']
+ del dct['ref_table'], dct['ref_cols']
+ return {self.name: dct}
+
+ def add(self):
+ """Return string to add the foreign key via ALTER TABLE
+
+ :return: SQL statement
+ """
+ return "ALTER TABLE %s ADD CONSTRAINT %s FOREIGN KEY (%s) " \
+ "REFERENCES %s (%s)" % (
+ self._qualtable(), self.name, self.key_columns(),
+ self.references.qualname(), self.ref_columns())
+
+ def diff_map(self, infk):
+ """Generate SQL to transform an existing foreign key
+
+ :param infk: a YAML map defining the new foreign key
+ :return: list of SQL statements
+
+ Compares the foreign key to an input foreign key and generates
+ SQL statements to transform it into the one represented by the
+ input.
+ """
+ stmts = []
+ # TODO: to be implemented (via ALTER DROP and ALTER ADD)
+ return stmts
+
+
+class UniqueConstraint(Constraint):
+ "A unique constraint definition"
+
+ objtype = "UNIQUE"
+
+ def to_map(self, dbcols):
+ """Convert a unique constraint definition to a YAML-suitable format
+
+ :param dbcols: dictionary of dbobject columns
+ :return: dictionary
+ """
+ dct = self.__dict__.copy()
+ for k in self.keylist:
+ del dct[k]
+
+        dct['columns'] = [dbcols[k - 1] for k in self.keycols]
+ del dct['keycols']
+ return {self.name: dct}
+
+ def diff_map(self, inuc):
+ """Generate SQL to transform an existing unique constraint
+
+ :param inuc: a YAML map defining the new unique constraint
+ :return: list of SQL statements
+
+ Compares the unique constraint to an input unique constraint
+ and generates SQL statements to transform it into the one
+ represented by the input.
+ """
+ stmts = []
+ # TODO: to be implemented (via ALTER DROP and ALTER ADD)
+ return stmts
+
+
+class ConstraintDict(DbObjectDict):
+ "The collection of table or column constraints in a database"
+
+ cls = Constraint
+ query = \
+ """SELECT nspname AS schema, conrelid::regclass AS table,
+ conname AS name, contype AS type, conkey AS keycols,
+ confrelid::regclass AS ref_table, confkey AS ref_cols,
+ consrc AS expression, amname AS access_method
+ FROM pg_constraint
+ JOIN pg_namespace ON (connamespace = pg_namespace.oid)
+ JOIN pg_roles ON (nspowner = pg_roles.oid)
+ LEFT JOIN pg_class on (conname = relname)
+ LEFT JOIN pg_am on (relam = pg_am.oid)
+ WHERE (nspname = 'public' OR rolname <> 'postgres')
+ ORDER BY schema, 2, name"""
+
+ def _from_catalog(self):
+ """Initialize the dictionary of constraints by querying the catalogs"""
+ for constr in self.fetch():
+ constr.unqualify()
+ sch, tbl, cns = constr.key()
+ constr_type = constr.type
+ del constr.type
+ if constr_type == 'c':
+ del constr.ref_table
+ self[(sch, tbl, cns)] = CheckConstraint(**constr.__dict__)
+ elif constr_type == 'p':
+ del constr.ref_table
+ self[(sch, tbl, cns)] = PrimaryKey(**constr.__dict__)
+ elif constr_type == 'f':
+ # normalize reference schema/table
+ reftbl = constr.ref_table
+ if '.' in reftbl:
+ dot = reftbl.index('.')
+ constr.ref_table = reftbl[dot + 1:]
+ constr.ref_schema = reftbl[:dot]
+ else:
+ constr.ref_schema = constr.schema
+ self[(sch, tbl, cns)] = ForeignKey(**constr.__dict__)
+ elif constr_type == 'u':
+ del constr.ref_table
+ self[(sch, tbl, cns)] = UniqueConstraint(**constr.__dict__)
+
+ def from_map(self, table, inconstrs):
+ """Initialize the dictionary of constraints by converting the input map
+
+ :param table: table affected by the constraints
+ :param inconstrs: YAML map defining the constraints
+ """
+ if 'check_constraints' in inconstrs:
+ chks = inconstrs['check_constraints']
+ for cns in chks.keys():
+ check = CheckConstraint(table=table.name, schema=table.schema,
+ name=cns)
+ val = chks[cns]
+ try:
+ check.expression = val['expression']
+ except KeyError, exc:
+ exc.args = ("Constraint '%s' is missing expression"
+ % cns, )
+ raise
+ if check.expression[0] == '(' and check.expression[-1] == ')':
+ check.expression = check.expression[1:-1]
+ if 'columns' in val:
+ check.keycols = val['columns']
+ self[(table.schema, table.name, cns)] = check
+ if 'primary_key' in inconstrs:
+ cns = inconstrs['primary_key'].keys()[0]
+ pkey = PrimaryKey(table=table.name, schema=table.schema,
+ name=cns)
+ val = inconstrs['primary_key'][cns]
+ try:
+ pkey.keycols = val['columns']
+ except KeyError, exc:
+ exc.args = ("Constraint '%s' is missing columns" % cns, )
+ raise
+ if 'access_method' in val:
+ pkey.access_method = val['access_method']
+ self[(table.schema, table.name, cns)] = pkey
+ if 'foreign_keys' in inconstrs:
+ fkeys = inconstrs['foreign_keys']
+ for cns in fkeys.keys():
+ fkey = ForeignKey(table=table.name, schema=table.schema,
+ name=cns)
+ val = fkeys[cns]
+ try:
+ fkey.keycols = val['columns']
+ except KeyError, exc:
+ exc.args = ("Constraint '%s' is missing columns" % cns, )
+ raise
+ try:
+ refs = val['references']
+ except KeyError, exc:
+ exc.args = ("Constraint '%s' missing references" % cns, )
+ raise
+ try:
+ fkey.ref_table = refs['table']
+ except KeyError, exc:
+ exc.args = ("Constraint '%s' missing table reference"
+ % cns, )
+ raise
+ try:
+ fkey.ref_cols = refs['columns']
+ except KeyError, exc:
+ exc.args = ("Constraint '%s' missing reference columns"
+ % cns, )
+ raise
+ sch = table.schema
+ if 'schema' in refs:
+ sch = refs['schema']
+ fkey.ref_schema = sch
+ self[(table.schema, table.name, cns)] = fkey
+ if 'unique_constraints' in inconstrs:
+ uconstrs = inconstrs['unique_constraints']
+ for cns in uconstrs.keys():
+ unq = UniqueConstraint(table=table.name, schema=table.schema,
+ name=cns)
+ val = uconstrs[cns]
+ try:
+ unq.keycols = val['columns']
+ except KeyError, exc:
+ exc.args = ("Constraint '%s' is missing columns" % cns, )
+ raise
+ if 'access_method' in val:
+ unq.access_method = val['access_method']
+ self[(table.schema, table.name, cns)] = unq
+
+ def diff_map(self, inconstrs):
+ """Generate SQL to transform existing constraints
+
+ :param inconstrs: a YAML map defining the new constraints
+ :return: list of SQL statements
+
+ Compares the existing constraint definitions, as fetched from
+ the catalogs, to the input map and generates SQL statements to
+ transform the constraints accordingly.
+ """
+ stmts = []
+ # foreign keys are processed in a second pass
+ # constraints cannot be renamed
+ for turn in (1, 2):
+ # check database constraints
+ for (sch, tbl, cns) in self.keys():
+ constr = self[(sch, tbl, cns)]
+                if isinstance(constr, ForeignKey):
+                    if turn == 1:
+                        continue
+                elif turn == 2:
+                    continue
+ # if missing, drop it
+ if (sch, tbl, cns) not in inconstrs:
+ stmts.append(constr.drop())
+ # check input constraints
+ for (sch, tbl, cns) in inconstrs.keys():
+ inconstr = inconstrs[(sch, tbl, cns)]
+                if isinstance(inconstr, ForeignKey):
+                    if turn == 1:
+                        continue
+                elif turn == 2:
+                    continue
+ # does it exist in the database?
+ if (sch, tbl, cns) not in self:
+ # add the new constraint
+ stmts.append(inconstr.add())
+ else:
+ # check constraint objects
+ stmts.append(self[(sch, tbl, cns)].diff_map(inconstr))
+
+ return stmts
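The two-pass loop in `diff_map` above can be sketched standalone (this is not the Pyrseas API; the SQL strings are illustrative): non-foreign-key constraints are handled in pass 1 and foreign keys only in pass 2, so the primary and unique keys they reference exist before the foreign keys that point at them.

```python
# Minimal sketch of the two-pass ordering used by ConstraintDict.diff_map.
def order_constraint_stmts(constraints):
    """constraints: list of (is_foreign_key, sql) pairs."""
    stmts = []
    for turn in (1, 2):
        for is_fkey, sql in constraints:
            if is_fkey:
                if turn == 1:
                    continue  # postpone foreign keys to the second pass
            elif turn == 2:
                continue      # non-FK work was already emitted in pass 1
            stmts.append(sql)
    return stmts

stmts = order_constraint_stmts([
    (True, "ALTER TABLE t1 ADD CONSTRAINT t1_fkey FOREIGN KEY (c2) REFERENCES t2 (id)"),
    (False, "ALTER TABLE t2 ADD CONSTRAINT t2_pkey PRIMARY KEY (id)"),
])
# the PRIMARY KEY statement comes out first, the FOREIGN KEY second
```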
163 pyrseas/dbobject/index.py
@@ -0,0 +1,163 @@
+# -*- coding: utf-8 -*-
+"""
+ pyrseas.index
+ ~~~~~~~~~~~~~
+
+ This defines two classes, Index and IndexDict, derived
+ from DbSchemaObject and DbObjectDict, respectively.
+"""
+from pyrseas.dbobject import DbObjectDict, DbSchemaObject
+
+
+class Index(DbSchemaObject):
+ """A physical index definition, other than a primary key or unique
+ constraint index.
+ """
+
+ keylist = ['schema', 'table', 'name']
+ objtype = "INDEX"
+
+ def key_columns(self):
+ """Return comma-separated list of key column names
+
+ :return: string
+ """
+ return ", ".join(self.keycols)
+
+ def to_map(self, dbcols):
+ """Convert an index definition to a YAML-suitable format
+
+ :param dbcols: dictionary of dbobject columns
+ :return: dictionary
+ """
+ dct = self.__dict__.copy()
+ for k in self.keylist:
+ del dct[k]
+ dct['columns'] = [dbcols[int(k) - 1] for k in self.keycols.split()]
+ del dct['keycols']
+ return {self.name: dct}
+
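The list comprehension in `to_map` above translates `keycols` back into names: `pg_index.indkey` arrives as a space-separated string of 1-based column numbers. A minimal sketch, with a hypothetical column list:

```python
# Sketch of the keycols translation done in Index.to_map; the table
# columns are made up for illustration.
dbcols = ['c1', 'c2', 'c3']   # table columns, in attribute order
keycols = '3 1'               # as fetched from pg_index.indkey
columns = [dbcols[int(k) - 1] for k in keycols.split()]
# columns == ['c3', 'c1']
```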
+ def create(self):
+ """Return a SQL statement to CREATE the index
+
+ :return: SQL statement
+ """
+ unq = hasattr(self, 'unique') and self.unique
+ acc = hasattr(self, 'access_method') \
+ and 'USING %s ' % self.access_method or ''
+ return "CREATE %sINDEX %s ON %s %s(%s)" % (
+ unq and 'UNIQUE ' or '', self.name, self.table, acc,
+ self.key_columns())
+
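The `create()` method above assembles the statement from optional `UNIQUE` and `USING` fragments. A standalone rendering, with hypothetical index, table and column names:

```python
# Hedged sketch of the string built by Index.create(); all names here
# are invented for the example.
unique = True
access_method = 'btree'
name, table, keycols = 't1_idx', 't1', ['c1', 'c2']
sql = "CREATE %sINDEX %s ON %s %s(%s)" % (
    unique and 'UNIQUE ' or '',
    name, table,
    access_method and 'USING %s ' % access_method or '',
    ", ".join(keycols))
# sql == "CREATE UNIQUE INDEX t1_idx ON t1 USING btree (c1, c2)"
```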
+ def diff_map(self, inindex):
+ """Generate SQL to transform an existing index
+
+ :param inindex: a YAML map defining the new index
+ :return: list of SQL statements
+
+ Compares the index to an input index and generates SQL
+ statements to transform it into the one represented by the
+ input.
+ """
+ stmts = []
+ if not hasattr(self, 'unique'):
+ self.unique = False
+ if self.access_method != inindex.access_method \
+ or self.unique != inindex.unique:
+ stmts.append("DROP INDEX %s" % self.qualname())
+ self.access_method = inindex.access_method
+ self.unique = inindex.unique
+ stmts.append(self.create())
+ # TODO: need to deal with changes in keycols
+ return stmts
+
+
+class IndexDict(DbObjectDict):
+ "The collection of indexes on tables in a database"
+
+ cls = Index
+ query = \
+ """SELECT nspname AS schema, indrelid::regclass AS table,
+ pg_class.relname AS name, amname AS access_method,
+ indisunique AS unique, indkey AS keycols
+ FROM pg_index JOIN pg_class ON (indexrelid = pg_class.oid)
+ JOIN pg_namespace ON (relnamespace = pg_namespace.oid)
+ JOIN pg_roles ON (nspowner = pg_roles.oid)
+ JOIN pg_am ON (relam = pg_am.oid)
+ WHERE NOT indisprimary
+ AND (nspname = 'public' OR rolname <> 'postgres')
+ AND indrelid NOT IN (
+ SELECT conrelid FROM pg_constraint
+ WHERE contype = 'u')
+ ORDER BY schema, 2, name"""
+
+ def _from_catalog(self):
+ """Initialize the dictionary of indexes by querying the catalogs"""
+ for index in self.fetch():
+ index.unqualify()
+ sch, tbl, idx = index.key()
+ self[(sch, tbl, idx)] = index
+
+ def from_map(self, table, inindexes):
+ """Initialize the dictionary of indexes by converting the input map
+
+ :param table: table owning the indexes
+ :param inindexes: YAML map defining the indexes
+ """
+ for i in inindexes.keys():
+ idx = Index(schema=table.schema, table=table.name, name=i)
+ val = inindexes[i]
+ try:
+ idx.keycols = val['columns']
+ except KeyError, exc:
+ exc.args = ("Index '%s' is missing columns" % i, )
+ raise
+ for attr in ['access_method', 'unique']:
+ if attr in val:
+ setattr(idx, attr, val[attr])
+ if not hasattr(idx, 'unique'):
+ idx.unique = False
+ self[(table.schema, table.name, i)] = idx
+
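An input map in the shape `IndexDict.from_map` consumes, mirroring its defaulting of `unique` (the index and column names are hypothetical):

```python
# Illustrative input map for IndexDict.from_map; names are made up.
inindexes = {'t1_idx': {'columns': ['c1', 'c2'], 'access_method': 'btree'}}
for name, val in inindexes.items():
    keycols = val['columns']            # a missing 'columns' key is a hard error
    unique = val.get('unique', False)   # from_map defaults unique to False
```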
+ def diff_map(self, inindexes):
+ """Generate SQL to transform existing indexes
+
+ :param inindexes: a YAML map defining the new indexes
+ :return: list of SQL statements
+
+ Compares the existing index definitions, as fetched from the
+ catalogs, to the input map and generates SQL statements to
+ transform the indexes accordingly.
+ """
+ stmts = []
+ # check input indexes
+ for (sch, tbl, idx) in inindexes.keys():
+ inidx = inindexes[(sch, tbl, idx)]
+ # does it exist in the database?
+ if (sch, tbl, idx) not in self:
+ # check for possible RENAME
+ if hasattr(inidx, 'oldname'):
+ oldname = inidx.oldname
+ try:
+ stmts.append(self[(sch, tbl, oldname)].rename(
+ inidx.name))
+ del self[(sch, tbl, oldname)]
+ except KeyError, exc:
+ exc.args = ("Previous name '%s' for index '%s' "
+ "not found" % (oldname, inidx.name), )
+ raise
+ else:
+ # create new index
+ stmts.append(inidx.create())
+
+ # check database indexes
+ for (sch, tbl, idx) in self.keys():
+ index = self[(sch, tbl, idx)]
+ # if missing, drop it
+ if (sch, tbl, idx) not in inindexes:
+ stmts.append(index.drop())
+ else:
+ # compare index objects
+ stmts.append(index.diff_map(inindexes[(sch, tbl, idx)]))
+
+ return stmts
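The decision table in `IndexDict.diff_map` above can be sketched standalone: an index present only in the input map is created, one present only in the catalog is dropped, and a name match falls through to a per-index comparison. Keys are `(schema, table, index)` tuples as in the source; the SQL strings are made up.

```python
# Minimal sketch of IndexDict.diff_map's create/drop decisions.
db = {('public', 't1', 'old_idx'): "DROP INDEX old_idx"}
wanted = {('public', 't1', 'new_idx'): "CREATE INDEX new_idx ON t1 (c1)"}

stmts = []
for key in wanted:
    if key not in db:
        stmts.append(wanted[key])       # create the new index
for key in db:
    if key not in wanted:
        stmts.append(db[key])           # drop the obsolete index
```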
172 pyrseas/dbobject/schema.py
@@ -0,0 +1,172 @@
+# -*- coding: utf-8 -*-
+"""
+ pyrseas.schema
+ ~~~~~~~~~~~~~~
+
+ This defines two classes, Schema and SchemaDict, derived from
+ DbObject and DbObjectDict, respectively.
+"""
+from pyrseas.dbobject import DbObjectDict, DbObject
+from table import Table, Sequence
+
+KEY_PREFIX = 'schema '
+
+
+class Schema(DbObject):
+ """A database schema definition, i.e., a named collection of tables,
+ views, triggers and other schema objects."""
+
+ keylist = ['name']
+
+ def extern_key(self):
+ """Return the key to be used in external maps for this schema
+
+ :return: string
+ """
+ return KEY_PREFIX + self.name
+
+ def to_map(self, dbschemas):
+ """Convert tables, etc., dictionaries to a YAML-suitable format
+
+ :param dbschemas: dictionary of schemas
+ :return: dictionary
+ """
+ key = self.extern_key()
+ schema = {key: {}}
+ if hasattr(self, 'sequences'):
+ seqs = {}
+ for seq in self.sequences.keys():
+ seqs.update(self.sequences[seq].to_map())
+ schema[key].update(seqs)
+ if hasattr(self, 'tables'):
+ tbls = {}
+ for tbl in self.tables.keys():
+ tbls.update(self.tables[tbl].to_map(dbschemas))
+ schema[key].update(tbls)
+ return schema
+
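A sketch of the external map shape produced by `to_map` above: the schema is keyed as `KEY_PREFIX + name` (i.e. `'schema <name>'`) and the sequence and table maps are merged into one inner dictionary. The schema, sequence and table names below are hypothetical, and the inner entries are abbreviated.

```python
# Hedged sketch of the Schema.to_map output shape; names are invented.
KEY_PREFIX = 'schema '
key = KEY_PREFIX + 's1'
schema = {key: {}}
schema[key].update({'sequence seq1': {'start_value': 1}})
schema[key].update({'table t1': {'columns': [{'id': {'type': 'integer'}}]}})
```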
+ def create(self):
+ """Return SQL statement to CREATE the schema
+
+ :return: SQL statement
+ """
+ return "CREATE SCHEMA %s" % self.name
+
+ def drop(self):
+ """Return SQL statement to DROP the schema
+
+ :return: SQL statement
+ """
+ return "DROP SCHEMA %s CASCADE" % self.name