Some improvements to the docs by Justin Pryzby
Cito committed Jan 4, 2019
1 parent f21b867 commit f334fbe
Showing 12 changed files with 163 additions and 159 deletions.
5 changes: 2 additions & 3 deletions docs/contents/general.rst
@@ -19,9 +19,8 @@ provides some higher-level and PostgreSQL specific convenience methods.
from Python that has been developed by the Python DB-SIG in 1999.
The authoritative programming information for the DB-API is :pep:`0249`.

Both Python modules utilize the same lower level C extension module that
serves as a wrapper for the C API to PostgreSQL that is available in form
of the so-called "libpq" library.
Both Python modules utilize the same low-level C extension, which
serves as a wrapper for the "libpq" library, the C API to PostgreSQL.

This means you must have the libpq library installed as a shared library
on your client computer, in a version that is supported by PyGreSQL.
27 changes: 14 additions & 13 deletions docs/contents/install.rst
@@ -4,32 +4,33 @@ Installation
General
-------

You must first have installed Python and PostgreSQL on your system.
If you want to access remote database only, you don't need to install
the full PostgreSQL server, but only the C interface (libpq). If you
are on Windows, make sure that the directory with libpq.dll is in your
``PATH`` environment variable.
You must first install Python and PostgreSQL on your system.
If you want to access remote databases only, you don't need to install
the full PostgreSQL server, but only the libpq C-interface library.
If you are on Windows, make sure that the directory that contains
libpq.dll is part of your ``PATH`` environment variable.

The current version of PyGreSQL has been tested with Python versions
2.6, 2.7 and 3.3 to 3.7, and PostgreSQL versions 9.0 to 9.6, 10 and 11.

PyGreSQL will be installed as three modules, a dynamic module called
_pg.pyd, and two pure Python wrapper modules called pg.py and pgdb.py.
PyGreSQL will be installed as three modules, a shared library called
_pg.so (on Linux) or a DLL called _pg.pyd (on Windows), and two pure
Python wrapper modules called pg.py and pgdb.py.
All three files will be installed directly into the Python site-packages
directory. To uninstall PyGreSQL, simply remove these three files again.
directory. To uninstall PyGreSQL, simply remove these three files.


Installing with Pip
-------------------

This is the most easy way to install PyGreSQL if you have "pip" installed
on your computer. Just run the following command in your terminal::
This is the easiest way to install PyGreSQL if you have "pip" installed.
Just run the following command in your terminal::

pip install PyGreSQL

This will automatically try to find and download a distribution on the
`Python Package Index <https://pypi.python.org/>`_ that matches your operating
system and Python version and install it on your computer.
system and Python version and install it.


Installing from a Binary Distribution
@@ -40,7 +41,7 @@ distribution for your computer, you can also try to manually download
and install a distribution.

When you download the source distribution, you will need to compile the
C extensions, for which you need a C compiler installed on your computer.
C extension, for which you need a C compiler installed.
If you don't want to install a C compiler or avoid possible problems
with the compilation, you can search for a pre-compiled binary distribution
of PyGreSQL on the Python Package Index or the PyGreSQL homepage.
@@ -86,7 +87,7 @@ Now you should be ready to use PyGreSQL.
Compiling Manually
~~~~~~~~~~~~~~~~~~

The source file for compiling the dynamic module is called pgmodule.c.
The source file for compiling the C extension module is pgmodule.c.
You have two options. You can compile PyGreSQL as a stand-alone module
or you can build it into the Python interpreter.

68 changes: 34 additions & 34 deletions docs/contents/pg/adaptation.rst
@@ -61,28 +61,28 @@ Adaptation of parameters
When you use the higher level methods of the classic :mod:`pg` module like
:meth:`DB.insert()` or :meth:`DB.update()`, you don't need to care about
adaptation of parameters, since all of this is happening automatically behind
the scenes. You only need to consider this issue when creating SQL commands
manually and sending them to the database using the :meth:`DB.query` method.

Imagine you have created a user login form that stores the login name as
*login* and the password as *passwd* and you now want to get the user
data for that user. You may be tempted to execute a query like this::

the scenes. You only need to consider this issue when creating SQL commands
manually and sending them to the database using the :meth:`DB.query` method.

Imagine you have created a user login form that stores the login name as
*login* and the password as *passwd* and you now want to get the user
data for that user. You may be tempted to execute a query like this::

>>> db = pg.DB(...)
>>> sql = "SELECT * FROM user_table WHERE login = '%s' AND passwd = '%s'"
>>> db.query(sql % (login, passwd)).getresult()[0]

This seems to work at a first glance, but you will notice an error as soon as
you try to use a login name containing a single quote. Even worse, this error
can be exploited through a so called "SQL injection", where an attacker inserts
malicious SQL statements into the query that you never intended to be executed.
For instance, with a login name something like ``' OR ''='`` the user could
easily log in and see the user data of another user in the database.

One solution for this problem would be to clean your input from "dangerous"
characters like the single quote, but this is tedious and it is likely that
you overlook something or break the application e.g. for users with names
like "D'Arcy". A better solution is to use the escaping functions provided

This seems to work at first glance, but you will notice an error as soon as
you try to use a login name containing a single quote. Even worse, this error
can be exploited through so-called "SQL injection", where an attacker inserts
malicious SQL statements into the query that you never intended to be executed.
For instance, with a login name like ``' OR ''='`` the attacker could
easily log in and see the user data of another user in the database.

One solution for this problem would be to cleanse your input of "dangerous"
characters like the single quote, but this is tedious and it is likely that
you overlook something or break the application e.g. for users with names
like "D'Arcy". A better solution is to use the escaping functions provided
by PostgreSQL which are available as methods on the :class:`DB` object::

>>> login = "D'Arcy"
@@ -161,13 +161,13 @@ separately is to simply cast the parameter values::
>>> db.query_formatted(sql, (login,), inline=False).getresult()[0]

In real world examples you will rarely have to cast your parameters like that,
since in an INSERT statement or a WHERE clause comparing the parameter to a
table column the data type will be clear from the context.

When binding the parameters to a query, PyGreSQL does not only adapt the basic
types like ``int``, ``float``, ``bool`` and ``str``, but also tries to make
sense of Python lists and tuples.

since in an INSERT statement or a WHERE clause comparing the parameter to a
table column the data type will be clear from the context.

When binding the parameters to a query, PyGreSQL not only adapts the basic
types like ``int``, ``float``, ``bool`` and ``str``, but also tries to make
sense of Python lists and tuples.

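For instance, in a sketch like the following (the ``weather`` table and its
columns are made up for this example), no casts are needed because the column
types provide the necessary context::

>>> sql = "SELECT * FROM weather WHERE city = %s AND temp_lo > %s"
>>> db.query_formatted(sql, ('Berlin', 5)).getresult()
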
Lists are adapted as PostgreSQL arrays::

>>> params = dict(array=[[1, 2],[3, 4]])
@@ -219,14 +219,14 @@ The :meth:`DB.insert` method provides a simpler way to achieve the same::

>>> row = dict(item=inventory_item('fuzzy dice', 42, 1.99), count=1000)
>>> db.insert('on_hand', row)
{'count': 1000, 'item': inventory_item(name='fuzzy dice',
supplier_id=42, price=Decimal('1.99'))}

However, we may not want to use named tuples, but custom Python classes
to hold our values, like this one::

>>> class InventoryItem:
...
{'count': 1000, 'item': inventory_item(name='fuzzy dice',
supplier_id=42, price=Decimal('1.99'))}

Perhaps we want to use custom Python classes instead of named tuples to hold
our values::

>>> class InventoryItem:
...
... def __init__(self, name, supplier_id, price):
... self.name = name
... self.supplier_id = supplier_id
33 changes: 17 additions & 16 deletions docs/contents/pg/connection.rst
@@ -43,20 +43,21 @@ query -- execute a SQL command string
:raises pg.InternalError: error during query processing

This method simply sends a SQL query to the database. If the query is an
insert statement that inserted exactly one row into a table that has OIDs, the
return value is the OID of the newly inserted row. If the query is an update
or delete statement, or an insert statement that did not insert exactly one
row in a table with OIDs, then the number of rows affected is returned as a
string. If it is a statement that returns rows as a result (usually a select
statement, but maybe also an ``"insert/update ... returning"`` statement),
this method returns a :class:`Query` that can be accessed via the
insert statement that inserted exactly one row into a table that has OIDs,
the return value is the OID of the newly inserted row as an integer.
If the query is an update or delete statement, or an insert statement that
did not insert exactly one row or inserted into a table without OIDs, then
the number of rows affected is returned as a string. If it is a statement that returns
rows as a result (usually a select statement, but maybe also an
``"insert/update ... returning"`` statement), this method returns
a :class:`Query` that can be accessed via the
:meth:`Query.getresult`, :meth:`Query.dictresult` or
:meth:`Query.namedresult` methods or simply printed.
Otherwise, it returns ``None``.

The SQL command may optionally contain positional parameters of the form
``$1``, ``$2``, etc instead of literal data, in which case the values
have to be supplied separately as a tuple. The values are substituted by
must be supplied separately as a tuple. The values are substituted by
the database in such a way that they don't need to be escaped, making this
an effective way to pass arbitrary or unknown data without worrying about
SQL injection or syntax errors.
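
For illustration, here is a minimal sketch; the ``weather`` table, its columns
and the connection parameters are assumptions made up for this example::

>>> import pg
>>> cnx = pg.connect(dbname='testdb')
>>> # an UPDATE returns the number of affected rows as a string
>>> cnx.query("UPDATE weather SET temp_lo = $1 WHERE city = $2", (7, 'Berlin'))
'1'
>>> # a SELECT returns a Query object
>>> q = cnx.query("SELECT city, temp_lo FROM weather WHERE temp_lo < $1", (10,))
>>> q.getresult()
[('Berlin', 7)]
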
@@ -91,9 +92,9 @@ query_prepared -- execute a prepared statement

This method works exactly like :meth:`Connection.query` except that instead
of passing the command itself, you pass the name of a prepared statement.
An empty name corresponds to the unnamed statement. You must have created
the corresponding named or unnamed statement with :meth:`Connection.prepare`
before, or an :exc:`pg.OperationalError` will be raised.
An empty name corresponds to the unnamed statement. You must have previously
created the corresponding named or unnamed statement with
:meth:`Connection.prepare`, or an :exc:`pg.OperationalError` will be raised.

.. versionadded:: 5.1

@@ -111,8 +112,8 @@ prepare -- create a prepared statement
:raises TypeError: invalid connection
:raises pg.ProgrammingError: error in query or duplicate query

This method creates a prepared statement for the given command with the
given name for later execution with the :meth:`Connection.query_prepared`
This method creates a prepared statement with the specified name for the
given command for later execution with the :meth:`Connection.query_prepared`
method. The name can be empty to create an unnamed statement, in which case
any pre-existing unnamed statement is automatically replaced; otherwise a
:exc:`pg.ProgrammingError` is raised if the statement name is already defined
@@ -317,7 +318,7 @@ values may contain string, integer, long or double (real) values.
.. warning::

This method doesn't type check the fields according to the table definition;
it just look whether or not it knows how to handle such types.
it just checks whether or not it knows how to handle such types.

get/set_notice_receiver -- custom notice receiver
-------------------------------------------------
@@ -451,8 +452,8 @@ getlo -- build a large object from given oid [LO]
:raises TypeError: invalid connection, bad parameter type, or too many parameters
:raises ValueError: bad OID value (0 is invalid_oid)

This method allows to reuse a formerly created large object through the
:class:`LargeObject` interface, providing the user have its OID.
This method allows reusing a previously created large object through the
:class:`LargeObject` interface, provided the user has its OID.

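A rough usage sketch; the OID is assumed to belong to an existing large object,
and the explicit transaction is included because large objects generally have
to be accessed inside a transaction::

>>> cnx.query("BEGIN")
>>> lo = cnx.getlo(oid)        # reopen the large object by its OID
>>> lo.open(pg.INV_READ)       # open it read-only
>>> data = lo.read(lo.size())  # read the whole object
>>> lo.close()
>>> cnx.query("END")
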
loimport -- import a file to a large object [LO]
------------------------------------------------
6 changes: 3 additions & 3 deletions docs/contents/pg/db_types.rst
@@ -89,6 +89,6 @@ be necessary to look up certain database settings.
objects of the running connections.

Also note that the typecasting for all of the basic types happens already
in the C extension module. The typecast functions that can be set with
the above methods are only called for the types that are not already
supported by the C extension module.
in the low-level C extension module. The typecast functions that can be
set with the above methods are only called for the types that are not
already supported by the C extension.
45 changes: 23 additions & 22 deletions docs/contents/pg/db_wrapper.rst
@@ -134,7 +134,7 @@ the values are the names of the attributes' types) with the column names
in the proper order if you iterate over it.

By default, only a limited number of simple types will be returned.
You can get the regular types after enabling this by calling the
You can get the regular types instead, if you enable this by calling the
:meth:`DB.use_regtypes` method.

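For illustration only (the ``weather`` table and its columns are hypothetical,
and the exact mapping returned depends on your schema)::

>>> db.get_attnames('weather')   # simple type names such as 'text', 'int', 'float'
>>> db.use_regtypes(True)        # switch to regular PostgreSQL type names
>>> db.get_attnames('weather')   # now e.g. 'text', 'integer', 'double precision'
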
has_table_privilege -- check table privilege
@@ -182,8 +182,8 @@ By passing the special name ``'all'`` as the parameter, you can get a dict
of all existing configuration parameters.

Note that you can request most of the important parameters also using
:meth:`Connection.parameter()` which does not involve a database query
like it is the case for :meth:`DB.get_parameter` and :meth:`DB.set_parameter`.
:meth:`Connection.parameter()` which does not involve a database query,
unlike :meth:`DB.get_parameter` and :meth:`DB.set_parameter`.

.. versionadded:: 4.2

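A small sketch of typical usage; the parameter names are just examples and the
returned values depend on your server configuration::

>>> db.get_parameter('search_path')
'"$user", public'
>>> db.get_parameter(['datestyle', 'server_encoding'])
['ISO, MDY', 'UTF8']
>>> settings = db.get_parameter('all')   # dict of all configuration parameters
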
@@ -314,7 +314,7 @@ If *row* is a dictionary, then the value for the key is taken from it.
Otherwise, the row must be a single value or a tuple of values
corresponding to the passed *keyname* or primary key. The fetched row
from the table will be returned as a new dictionary or used to replace
the existing values when row was passed as aa dictionary.
the existing values if the row was passed as a dictionary.

The OID is also put into the dictionary if the table has one, but
in order to allow the caller to work with multiple tables, it is
@@ -346,7 +346,7 @@ The dictionary is then reloaded with the values actually inserted in order
to pick up values modified by rules, triggers, etc.

Note that since PyGreSQL 5.0 it is possible to insert a value for an
array type column by passing it as Python list.
array type column by passing it as a Python list.

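A hedged sketch (the ``images`` table with a serial ``id`` column and an
integer array column ``sizes`` is invented for this example)::

>>> db.insert('images', dict(name='logo', sizes=[64, 128, 256]))
{'id': 1, 'name': 'logo', 'sizes': [64, 128, 256]}
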
update -- update a row in a database table
------------------------------------------
@@ -372,9 +372,10 @@ The dictionary is then modified to reflect any changes caused by the
update due to triggers, rules, default values, etc.

Like insert, the dictionary is optional and updates will be performed
on the fields in the keywords. There must be an OID or primary key
either in the dictionary where the OID must be munged, or in the keywords
where it can be simply the string ``'oid'``.
on the fields in the keywords. There must be an OID or primary key either
specified using the ``'oid'`` keyword or in the dictionary, in which case the
OID must be munged.


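A brief sketch, assuming a hypothetical ``weather`` table whose primary key is
the ``city`` column; the returned dictionary reflects the row after the update::

>>> db.update('weather', dict(city='Berlin', temp_lo=7))
{'city': 'Berlin', 'temp_lo': 7, 'temp_hi': 13, 'prcp': 0.25}
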
upsert -- insert a row with conflict resolution
-----------------------------------------------
@@ -391,8 +392,8 @@ upsert -- insert a row with conflict resolution
:raises pg.ProgrammingError: table has no primary key or missing privilege

This method inserts a row into a table, but instead of raising a
ProgrammingError exception in case a row with the same primary key already
exists, an update will be executed instead. This will be performed as a
ProgrammingError exception in case of a constraint or unique index violation,
an update will be executed. This will be performed as a
single atomic operation on the database, so race conditions can be avoided.

Like the insert method, the first parameter is the name of the table and the
@@ -524,12 +525,12 @@ query_prepared -- execute a prepared statement
:raises pg.OperationalError: prepared statement does not exist

This method works like the :meth:`DB.query` method, except that instead of
passing the SQL command, you pass the name of a prepared statement. If you
pass an empty name, the unnamed statement will be executed.
passing the SQL command, you pass the name of a prepared statement that you
have created previously using the :meth:`DB.prepare` method.

You must have created the corresponding named or unnamed statement with
the :meth:`DB.prepare` method before, otherwise an :exc:`pg.OperationalError`
will be raised.
Passing an empty string or *None* as the name will execute the unnamed
statement (see warning about the limited lifetime of the unnamed statement
in :meth:`DB.prepare`).

.. versionadded:: 5.1

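A minimal sketch of the intended usage, assuming that the parameter values are
passed as additional positional arguments just like with :meth:`DB.query`; the
statement name and the ``weather`` table are made up for this example::

>>> db.prepare('sel_city', "SELECT * FROM weather WHERE city = $1")
>>> q = db.query_prepared('sel_city', 'Berlin')
>>> q.getresult()
[('Berlin', 7, 13, 0.25)]
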
@@ -547,8 +548,8 @@ prepare -- create a prepared statement
:raises TypeError: invalid connection
:raises pg.ProgrammingError: error in query or duplicate query

This method creates a prepared statement for the given command with the
given name for later execution with the :meth:`DB.query_prepared` method.
This method creates a prepared statement with the given name for the given
command for later execution with the :meth:`DB.query_prepared` method.
The name can be empty to create an unnamed statement, in which case any
pre-existing unnamed statement is automatically replaced; otherwise a
:exc:`pg.ProgrammingError` is raised if the statement name is already
@@ -806,9 +807,9 @@ from being interpreted specially by the SQL parser.

This method escapes a string for use as an SQL identifier, such as a table,
column, or function name. This is useful when a user-supplied identifier
might contain special characters that would otherwise not be interpreted
as part of the identifier by the SQL parser, or when the identifier might
contain upper case characters whose case should be preserved.
might contain special characters that would otherwise be misinterpreted
by the SQL parser, or when the identifier might contain upper case characters
whose case should be preserved.

.. versionadded:: 4.1

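For instance (a rough sketch; the actual quoting is delegated to libpq)::

>>> db.escape_identifier('Weather Data')
'"Weather Data"'
>>> name = 'Weather Data'   # e.g. a user-supplied table name
>>> db.query('SELECT * FROM %s' % db.escape_identifier(name))
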
@@ -890,7 +891,7 @@ JSON data is automatically decoded by PyGreSQL. If you don't want the data
to be decoded, then you can cast ``json`` or ``jsonb`` columns to ``text``
in PostgreSQL or you can set the decoding function to *None* or a different
function using :func:`pg.set_jsondecode`. By default this is the same as
the :func:`json.dumps` function from the standard library.
the :func:`json.loads` function from the standard library.

.. versionadded:: 5.0

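For example, to switch the decoding off and back to the default again
(a sketch only)::

>>> import json
>>> pg.set_jsondecode(None)         # return json/jsonb values as raw strings
>>> pg.set_jsondecode(json.loads)   # restore the default behavior
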
@@ -909,7 +910,7 @@ type names (the default) or more specific "regular" type names. Which kind
of type names is used can be changed by calling :meth:`DB.get_regtypes`.
If you pass a boolean, it sets whether regular type names shall be used.
The method can also be used to check through its return value whether
currently regular type names are used.
regular type names are currently used.

.. versionadded:: 4.1

Expand Down
