Many plugin_hooks are passed objects that provide access to internal Datasette functionality. The interface to these objects should not be considered stable, with the exception of the methods that are documented here.
The request object is passed to various plugin hooks. It represents an incoming HTTP request. It has the following properties:
.scope - dictionary: The ASGI scope that was used to construct this request, described in the ASGI HTTP connection scope specification.
.method - string: The HTTP method for this request, usually GET or POST.
.url - string: The full URL for this request, e.g. https://latest.datasette.io/fixtures.
.scheme - string: The request scheme - usually https or http.
.headers - dictionary (str -> str): A dictionary of incoming HTTP request headers.
.cookies - dictionary (str -> str): A dictionary of incoming cookies.
.host - string: The host header from the incoming request, e.g. latest.datasette.io or localhost.
.path - string: The path of the request, e.g. /fixtures.
.query_string - string: The querystring component of the request, without the ? - e.g. name__contains=sam&age__gt=10.
.args - MultiParams: An object representing the parsed querystring parameters, see below.
.url_vars - dictionary (str -> str): Variables extracted from the URL path, if that path was defined using a regular expression. See plugin_register_routes.
.actor - dictionary (str -> Any) or None: The currently authenticated actor (see actors <authentication_actor>), or None if the request is unauthenticated.
The object also has two awaitable methods:
await request.post_vars() - dictionary: Returns a dictionary of form variables that were submitted in the request body via POST. Don't forget to read about internals_csrf!
await request.post_body() - bytes: Returns the un-parsed body of a request submitted by POST - useful for things like incoming JSON data.
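The form-encoded parsing that post_vars() performs can be illustrated with the standard library (a standalone sketch, not Datasette's actual implementation):

```python
from urllib.parse import parse_qsl

# A form-encoded POST body as raw bytes - the same shape
# request.post_body() would return for a form submission.
body = b"name=Cleo&species=dog"

# post_vars() returns a plain dictionary of form variables;
# parse_qsl() yields the same key/value pairs for this simple case.
post_vars = dict(parse_qsl(body.decode("utf-8")))
# post_vars is now {"name": "Cleo", "species": "dog"}
```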
request.args is a MultiParams object - a dictionary-like object which provides access to querystring parameters that may have multiple values.
Consider the querystring ?foo=1&foo=2&bar=3 - with two values for foo and one value for bar.
request.args[key] - string: Returns the first value for that key, or raises a KeyError if the key is missing. For the above example request.args["foo"] would return "1".
request.args.get(key) - string or None: Returns the first value for that key, or None if the key is missing. Pass a second argument to specify a different default, e.g. q = request.args.get("q", "").
request.args.getlist(key) - list of strings: Returns the list of strings for that key. request.args.getlist("foo") would return ["1", "2"] in the above example. request.args.getlist("bar") would return ["3"]. If the key is missing an empty list will be returned.
request.args.keys() - list of strings: Returns the list of available keys - for the example this would be ["foo", "bar"].
key in request.args - True or False: You can use if key in request.args to check if a key is present.
for key in request.args - iterator: This lets you loop through every available key.
len(request.args) - integer: Returns the number of keys.
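The behaviour described above can be sketched with a stand-in class built on the standard library (an illustration only - not Datasette's actual MultiParams implementation):

```python
from urllib.parse import parse_qs


class MultiParamsSketch:
    """A dictionary-like wrapper over parsed querystring parameters,
    mirroring the behaviour documented above."""

    def __init__(self, query_string):
        # parse_qs maps each key to a list of values
        self._data = parse_qs(query_string)

    def __getitem__(self, key):
        return self._data[key][0]  # first value; KeyError if missing

    def get(self, key, default=None):
        values = self._data.get(key)
        return values[0] if values else default

    def getlist(self, key):
        return self._data.get(key, [])  # empty list if missing

    def keys(self):
        return list(self._data.keys())

    def __contains__(self, key):
        return key in self._data

    def __iter__(self):
        return iter(self._data)

    def __len__(self):
        return len(self._data)


args = MultiParamsSketch("foo=1&foo=2&bar=3")
# args["foo"] == "1", args.getlist("foo") == ["1", "2"], len(args) == 2
```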
The Response class can be returned from view functions that have been registered using the plugin_register_routes hook.
The Response() constructor takes the following arguments:
body - string: The body of the response.
status - integer (optional): The HTTP status - defaults to 200.
headers - dictionary (optional): A dictionary of extra HTTP headers, e.g. {"x-hello": "world"}.
content_type - string (optional): The content-type for the response. Defaults to text/plain.
For example:
from datasette.utils.asgi import Response
response = Response(
    "<xml>This is XML</xml>",
    content_type="application/xml; charset=utf-8"
)
The easiest way to create responses is using the Response.text(...), Response.html(...), Response.json(...) or Response.redirect(...) helper methods:
from datasette.utils.asgi import Response

html_response = Response.html("This is HTML")
json_response = Response.json({"this_is": "json"})
text_response = Response.text("This will become utf-8 encoded text")
# Redirects are served as 302, unless you pass status=301:
redirect_response = Response.redirect("https://latest.datasette.io/")
Each of these responses will use the correct corresponding content-type - text/html; charset=utf-8, application/json; charset=utf-8 or text/plain; charset=utf-8 respectively.
Each of the helper methods takes optional status= and headers= arguments, documented above.
To set cookies on the response, use the response.set_cookie(...) method. The method signature looks like this:
def set_cookie(
    self,
    key,
    value="",
    max_age=None,
    expires=None,
    path="/",
    domain=None,
    secure=False,
    httponly=False,
    samesite="lax",
):
You can use this with datasette.sign() <datasette_sign> to set signed cookies. Here's how you would set the ds_actor cookie <authentication_ds_actor> for use with Datasette authentication <authentication>:
response = Response.redirect("/")
response.set_cookie("ds_actor", datasette.sign({"a": {"id": "cleopaws"}}, "actor"))
return response
This object is an instance of the Datasette class, passed to many plugin hooks as an argument called datasette.
plugin_name - string: The name of the plugin to look up configuration for. Usually this is something similar to datasette-cluster-map.
database - None or string: The database the user is interacting with.
table - None or string: The table the user is interacting with.
This method lets you read plugin configuration values that were set in metadata.json. See writing_plugins_configuration for full details of how this method should be used.
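The lookup this method performs can be sketched as a cascade from the most specific configuration level to the least (a simplified illustration of the behaviour described in writing_plugins_configuration - the function name and the exact precedence details here are assumptions, not Datasette's actual implementation):

```python
def lookup_plugin_config(metadata, plugin_name, database=None, table=None):
    """Return the most specific plugin configuration available:
    table level, then database level, then instance level.
    Hypothetical helper for illustration only."""

    def plugins_for(obj):
        return (obj or {}).get("plugins", {}).get(plugin_name)

    db_meta = metadata.get("databases", {}).get(database, {}) if database else {}
    table_meta = db_meta.get("tables", {}).get(table, {}) if table else {}

    for candidate in (plugins_for(table_meta), plugins_for(db_meta), plugins_for(metadata)):
        if candidate is not None:
            return candidate
    return None


metadata = {
    "plugins": {"datasette-cluster-map": {"latitude_column": "lat"}},
    "databases": {
        "fixtures": {
            "plugins": {"datasette-cluster-map": {"latitude_column": "latitude"}}
        }
    },
}
# The database-level configuration wins over the instance-level default:
config = lookup_plugin_config(metadata, "datasette-cluster-map", database="fixtures")
# config is {"latitude_column": "latitude"}
```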
template - string: The template file to be rendered, e.g. my_plugin.html. Datasette will search for this file first in the --template-dir= location, if it was specified - then in the plugin's bundled templates and finally in Datasette's set of default templates.
context - None or a Python dictionary: The context variables to pass to the template.
request - request object or None: If you pass a Datasette request object here it will be made available to the template.
Renders a Jinja template using Datasette's preconfigured instance of Jinja and returns the resulting string. The template will have access to Datasette's default template functions and any functions that have been made available by other plugins.
actor - dictionary: The authenticated actor. This is usually request.actor.
action - string: The name of the action that is being permission checked.
resource - string or tuple, optional: The resource, e.g. the name of the database, or a tuple of two strings containing the name of the database and the name of the table. Only some permissions apply to a resource.
default - optional, True or False: Should this permission check default to allow or to deny?
Check if the given actor has permission <authentication_permissions> to perform the given action on the given resource.
Some permission checks are carried out against rules defined in metadata.json <authentication_permissions_metadata>, while other custom permissions may be decided by plugins that implement the plugin_hook_permission_allowed plugin hook.
If neither metadata.json nor any of the plugins provide an answer to the permission query the default argument will be returned.
See permissions for a full list of permission actions included in Datasette core.
name - string, optional: The name of the database.
Returns the specified database object. Raises a KeyError if the database does not exist. Call this method without an argument to return the first connected database.
name - string: The unique name to use for this database. Also used in the URL.
db - datasette.database.Database instance: The database to be attached.
The datasette.add_database(name, db) method lets you add a new database to the current Datasette instance. This database will then be served at a URL path that matches the name parameter, e.g. /mynewdb/.
The db parameter should be an instance of the datasette.database.Database class. For example:
from datasette.database import Database
datasette.add_database("my-new-database", Database(
    datasette,
    path="path/to/my-new-database.db",
    is_mutable=True
))
This will add a mutable database from the provided file path.
The Database() constructor takes four arguments: the first is the datasette instance you are attaching the database to, the second is path=, and is_mutable and is_memory are both optional arguments.
Use is_mutable if it is possible that updates will be made to that database - otherwise Datasette will open it in immutable mode and any changes could cause undesired behavior.
Use is_memory if the connection is to an in-memory SQLite database.
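The immutable mode mentioned above is SQLite's own: a file opened with the immutable=1 URI parameter is assumed never to change, and writes are rejected. A standalone sketch with the sqlite3 module (independent of Datasette):

```python
import os
import sqlite3
import tempfile

# Create a database file to experiment with
path = os.path.join(tempfile.mkdtemp(), "demo.db")
conn = sqlite3.connect(path)
conn.execute("create table t (id integer)")
conn.commit()
conn.close()

# immutable=1 tells SQLite the file will never change, enabling
# optimizations such as skipping locking - but writes now fail.
ro = sqlite3.connect("file:{}?immutable=1".format(path), uri=True)

write_failed = False
try:
    ro.execute("insert into t values (1)")
except sqlite3.OperationalError:
    write_failed = True  # attempts to write raise OperationalError
# Reads still work fine:
row_count = ro.execute("select count(*) from t").fetchone()[0]
```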
name - string: The name of the database to be removed.
This removes a database that has been previously added. name= is the unique name of that database, also used in the URL for it.
value - any serializable type: The value to be signed.
namespace - string, optional: An alternative namespace, see the itsdangerous salt documentation.
Utility method for signing values, such that you can safely pass data to and from an untrusted environment. This is a wrapper around the itsdangerous library.
This method returns a signed string, which can be decoded and verified using datasette_unsign.
signed - string: The signed string that was created using datasette_sign.
namespace - string, optional: The alternative namespace, if one was used.
Returns the original, decoded object that was passed to datasette_sign. If the signature is not valid this raises an itsdangerous.BadSignature exception.
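These methods wrap itsdangerous, but the underlying idea - appending a keyed signature on the way out and verifying it on the way back in - can be sketched with the standard library's hmac module (an illustration only; this is not compatible with itsdangerous output, and the namespace handling here is a simplification of its salt mechanism):

```python
import hashlib
import hmac

SECRET = b"secret-key"  # Datasette derives this from its secret


def sign(value: str, namespace: str = "default") -> str:
    """Append a keyed signature so the value can round-trip
    through an untrusted environment."""
    sig = hmac.new(SECRET, (namespace + ":" + value).encode(), hashlib.sha256).hexdigest()
    return value + "." + sig


def unsign(signed: str, namespace: str = "default") -> str:
    """Verify the signature and return the original value.
    itsdangerous raises BadSignature in the equivalent situation."""
    value, _, sig = signed.rpartition(".")
    expected = hmac.new(SECRET, (namespace + ":" + value).encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("Bad signature")
    return value


token = sign("cleopaws")
assert unsign(token) == "cleopaws"
```

Changing the namespace produces a different signature, which is why signing and unsigning must agree on it.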
request - Request: The current Request object.
message - string: The message string.
message_type - constant, optional: The message type - datasette.INFO, datasette.WARNING or datasette.ERROR.
Datasette's flash messaging mechanism allows you to add a message that will be displayed to the user on the next page that they visit. Messages are persisted in a ds_messages cookie. This method adds a message to that cookie.
You can try out these messages (including the different visual styling of the three message types) using the /-/messages debugging tool.
Instances of the Database class can be used to execute queries against attached SQLite databases, and to run introspection against their schemas.
Executes a SQL query against the database and returns the resulting rows (see database_results).
sql - string (required): The SQL query to execute. This can include ? or :named parameters.
params - list or dict: A list or dictionary of values to use for the parameters. List for ?, dictionary for :named.
truncate - boolean: Should the rows returned by the query be truncated at the maximum page size? Defaults to True, set this to False to disable truncation.
custom_time_limit - integer ms: A custom time limit for this query. This can be set to a lower value than the Datasette configured default. If a query takes longer than this it will be terminated early and raise a datasette.database.QueryInterrupted exception.
page_size - integer: Set a custom page size for truncation, overriding the configured Datasette default.
log_sql_errors - boolean: Should any SQL errors be logged to the console in addition to being raised as an error? Defaults to True.
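The two parameter styles accepted by sql and params map directly onto SQLite's own placeholder syntax, which can be demonstrated with the standard sqlite3 module (a standalone illustration, independent of Datasette):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table dogs (name text, age integer)")
conn.executemany("insert into dogs values (?, ?)", [("Cleo", 5), ("Pancakes", 4)])

# Positional "?" placeholders take a list (or tuple) of values:
rows = conn.execute(
    "select name from dogs where age > ?", [4]
).fetchall()

# ":named" placeholders take a dictionary:
rows_named = conn.execute(
    "select name from dogs where age > :min_age", {"min_age": 4}
).fetchall()

# Both queries return [("Cleo",)]
```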
The db.execute() method returns a single Results object. This can be used to access the rows returned by the query.
Iterating over a Results object will yield SQLite Row objects. Each of these can be treated as a tuple or can be accessed using row["column"] syntax:
info = []
results = await db.execute("select name from sqlite_master")
for row in results:
    info.append(row["name"])
The Results object also has the following properties and methods:
.truncated - boolean: Indicates if this query was truncated - if it returned more results than the specified page_size. If this is true then the results object will only provide access to the first page_size rows in the query result. You can disable truncation by passing truncate=False to the db.execute() method.
.columns - list of strings: A list of column names returned by the query.
.rows - list of sqlite3.Row: This property provides direct access to the list of rows returned by the database. You can access specific rows by index using results.rows[0].
.first() - row or None: Returns the first row in the results, or None if no rows were returned.
.single_value(): Returns the value of the first column of the first row of results - but only if the query returned a single row with a single column. Raises a datasette.database.MultipleValues exception otherwise.
.__len__(): Calling len(results) returns the (truncated) number of returned results.
Executes a given callback function against a read-only database connection running in a thread. The function will be passed a SQLite connection, and the return value from the function will be returned by the await.
Example usage:
def get_version(conn):
    return conn.execute(
        "select sqlite_version()"
    ).fetchall()[0][0]

version = await db.execute_fn(get_version)
SQLite only allows one database connection to write at a time. Datasette handles this for you by maintaining a queue of writes to be executed against a given database. Plugins can submit write operations to this queue and they will be executed in the order in which they are received.
This method can be used to queue up a non-SELECT SQL query to be executed against a single write connection to the database.
You can pass additional SQL parameters as a tuple or dictionary.
By default queries are considered to be "fire and forget" - they will be added to the queue and executed in a separate thread while your code can continue to do other things. The method will return a UUID representing the queued task.
If you pass block=True this behaviour changes: the method will block until the write operation has completed, and the return value will be the return from calling conn.execute(...) using the underlying sqlite3 Python library.
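The write queue described above can be sketched with a worker thread that owns the sole write connection (a simplified stand-in for Datasette's internal mechanism - class and method names here are illustrative, and error handling is omitted):

```python
import os
import queue
import sqlite3
import tempfile
import threading
import uuid


class WriteQueue:
    """One worker thread owns the single write connection; callers
    enqueue SQL statements, which run strictly in arrival order."""

    def __init__(self, path):
        self._tasks = queue.Queue()
        threading.Thread(target=self._worker, args=(path,), daemon=True).start()

    def _worker(self, path):
        conn = sqlite3.connect(path)  # the sole write connection
        while True:
            sql, params, done = self._tasks.get()
            conn.execute(sql, params)
            conn.commit()
            if done is not None:
                done.set()  # wake up a blocking caller

    def execute_write(self, sql, params=(), block=False):
        done = threading.Event() if block else None
        self._tasks.put((sql, params, done))
        if block:
            done.wait()  # wait for the worker to finish this task
        return uuid.uuid4()  # identifier for the queued task


path = os.path.join(tempfile.mkdtemp(), "queue-demo.db")
wq = WriteQueue(path)
wq.execute_write("create table log (msg text)", block=True)
wq.execute_write("insert into log values (?)", ["hello"], block=True)
```

Because block=True waits for each task, a separate read connection opened afterwards will see both writes applied in order.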
This method works like .execute_write(), but instead of a SQL statement you give it a callable Python function. This function will be queued up and then called when the write connection is available, passing that connection as the argument to the function.
The function can then perform multiple actions, safe in the knowledge that it has exclusive access to the single writable connection as long as it is executing.
For example:
def my_action(conn):
    conn.execute("delete from some_table")
    conn.execute("delete from other_table")

await database.execute_write_fn(my_action)
This method is fire-and-forget, queueing your function to be executed and then allowing your code after the call to .execute_write_fn() to continue running while the underlying thread waits for an opportunity to run your function. A UUID representing the queued task will be returned.
If you pass block=True your calling code will block until the function has been executed. The return value to the await will be the return value of your function.
If your function raises an exception and you specified block=True, that exception will be propagated up to the await line. With block=False any exceptions will be silently ignored.
Here's an example of block=True in action:
def my_action(conn):
    conn.execute("delete from some_table where id > 5")
    return conn.execute("select count(*) from some_table").fetchone()[0]

try:
    num_rows_left = await database.execute_write_fn(my_action, block=True)
except Exception as e:
    print("An error occurred:", e)
The Database class also provides properties and methods for introspecting the database.
db.name - string: The name of the database - usually the filename without the .db extension.
db.size - integer: The size of the database file in bytes. 0 for :memory: databases.
db.mtime_ns - integer or None: The last modification time of the database file in nanoseconds since the epoch. None for :memory: databases.
await db.table_exists(table) - boolean: Check if a table called table exists.
await db.table_names() - list of strings: List of names of tables in the database.
await db.view_names() - list of strings: List of names of views in the database.
await db.table_columns(table) - list of strings: Names of columns in a specific table.
await db.primary_keys(table) - list of strings: Names of the columns that are part of the primary key for this table.
await db.fts_table(table) - string or None: The name of the FTS table associated with this table, if one exists.
await db.label_column_for_table(table) - string or None: The label column that is associated with this table - either automatically detected or using the "label_column" key from metadata, see label_columns.
await db.foreign_keys_for_table(table) - list of dictionaries: Details of columns in this table which are foreign keys to other tables. A list of dictionaries where each dictionary is shaped like this: {"column": string, "other_table": string, "other_column": string}.
await db.hidden_table_names() - list of strings: List of tables which Datasette "hides" by default - usually these are tables associated with SQLite's full-text search feature, the SpatiaLite extension or tables hidden using the metadata_hiding_tables feature.
await db.get_table_definition(table) - string: Returns the SQL definition for the table - the CREATE TABLE statement and any associated CREATE INDEX statements.
await db.get_view_definition(view) - string: Returns the SQL definition of the named view.
await db.get_all_foreign_keys() - dictionary: Dictionary representing both incoming and outgoing foreign keys for this table. It has two keys, "incoming" and "outgoing", each of which is a list of dictionaries with keys "column", "other_table" and "other_column". For example:
{
    "incoming": [],
    "outgoing": [
        {
            "other_table": "attraction_characteristic",
            "column": "characteristic_id",
            "other_column": "pk",
        },
        {
            "other_table": "roadside_attractions",
            "column": "attraction_id",
            "other_column": "pk",
        }
    ]
}
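The foreign key details these introspection methods return come from SQLite's own metadata. A standalone sketch of the underlying query, using PRAGMA foreign_key_list to produce dictionaries in the shape described above (independent of Datasette's actual implementation):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
create table roadside_attractions (pk integer primary key, name text);
create table attraction_characteristic (pk integer primary key, name text);
create table roadside_attraction_characteristics (
    attraction_id integer references roadside_attractions(pk),
    characteristic_id integer references attraction_characteristic(pk)
);
""")

# PRAGMA foreign_key_list rows are:
# (id, seq, table, from, to, on_update, on_delete, match)
foreign_keys = [
    {"column": row[3], "other_table": row[2], "other_column": row[4]}
    for row in conn.execute(
        "PRAGMA foreign_key_list(roadside_attraction_characteristics)"
    )
]
# foreign_keys contains one dictionary per foreign key, e.g.
# {"column": "attraction_id", "other_table": "roadside_attractions", "other_column": "pk"}
```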
Datasette uses asgi-csrf to guard against CSRF attacks on form POST submissions. Users receive a ds_csrftoken cookie which is compared against the csrftoken form field (or x-csrftoken HTTP header) for every incoming request.
If your plugin implements a <form method="POST"> anywhere you will need to include that token. You can do so with the following template snippet:
<input type="hidden" name="csrftoken" value="{{ csrftoken() }}">