Reference
=========

This page contains the full reference to pytest's API.

.. autofunction:: _pytest.python_api.approx
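
A minimal usage sketch: ``approx`` compares floating point numbers within a tolerance.

.. code-block:: python

    import pytest

    def test_sum():
        # 0.1 + 0.2 is not exactly 0.3 in binary floating point
        assert 0.1 + 0.2 == pytest.approx(0.3)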

Tutorial: :ref:`skipping`

.. autofunction:: _pytest.outcomes.fail

.. autofunction:: _pytest.outcomes.skip(msg, [allow_module_level=False])

.. autofunction:: _pytest.outcomes.importorskip

.. autofunction:: _pytest.outcomes.xfail

.. autofunction:: _pytest.outcomes.exit

.. autofunction:: _pytest.config.main

.. autofunction:: pytest.param(*values, [id], [marks])
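
A minimal sketch of ``pytest.param`` inside a parametrize list, attaching an ``xfail`` mark to a single case:

.. code-block:: python

    import pytest

    @pytest.mark.parametrize("test_input,expected", [
        ("3+5", 8),
        pytest.param("6*9", 42, marks=pytest.mark.xfail),
    ])
    def test_eval(test_input, expected):
        assert eval(test_input) == expected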

Tutorial: :ref:`assertraises`.

.. autofunction:: pytest.raises(expected_exception: Exception, [match], [message])
    :with: excinfo
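
A minimal sketch of ``pytest.raises`` as a context manager, using ``match`` to check the exception message:

.. code-block:: python

    import pytest

    def test_zero_division():
        with pytest.raises(ZeroDivisionError, match="division by zero") as excinfo:
            1 / 0
        # excinfo wraps the raised exception
        assert excinfo.type is ZeroDivisionError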

Tutorial: :ref:`ensuring_function_triggers`.

.. autofunction:: pytest.deprecated_call()
    :with:
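
A minimal sketch, assuming a hypothetical ``api_call_v1`` function that issues a ``DeprecationWarning``:

.. code-block:: python

    import warnings

    import pytest

    def api_call_v1():  # hypothetical deprecated function
        warnings.warn("use api_call_v2 instead", DeprecationWarning)
        return 1

    def test_deprecation():
        with pytest.deprecated_call():
            api_call_v1()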

Tutorial: :ref:`assertion-rewriting`.

.. autofunction:: pytest.register_assert_rewrite

Tutorial: :ref:`assertwarnings`

.. autofunction:: pytest.warns(expected_warning: Exception, [match])
    :with:
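
A minimal sketch of ``pytest.warns`` checking both the warning category and its message:

.. code-block:: python

    import warnings

    import pytest

    def test_warning():
        with pytest.warns(UserWarning, match="deprecated"):
            warnings.warn("this API is deprecated", UserWarning)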


Marks can be used to apply metadata to test functions (but not fixtures), which can then be accessed by fixtures or plugins.

Tutorial: :doc:`parametrize`.

.. automethod:: _pytest.python.Metafunc.parametrize
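
Most suites parametrize through the ``pytest.mark.parametrize`` decorator instead of calling this method directly; a minimal sketch of the direct route via the ``pytest_generate_tests`` hook, where ``stringinput`` is a hypothetical fixture name:

.. code-block:: python

    # conftest.py
    def pytest_generate_tests(metafunc):
        if "stringinput" in metafunc.fixturenames:
            metafunc.parametrize("stringinput", ["hello", "world"])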


Tutorial: :ref:`skip`.

Unconditionally skip a test function.

.. py:function:: pytest.mark.skip(*, reason=None)

    :keyword str reason: Reason why the test function is being skipped.
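
A minimal sketch:

.. code-block:: python

    import pytest

    @pytest.mark.skip(reason="no way of currently testing this")
    def test_the_unknown():
        ...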


Tutorial: :ref:`skipif`.

Skip a test function if a condition is True.

.. py:function:: pytest.mark.skipif(condition, *, reason=None)

    :type condition: bool or str
    :param condition: ``True``/``False`` indicating whether the test function should be skipped, or a :ref:`condition string <string conditions>`.
    :keyword str reason: Reason why the test function is being skipped.
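
A minimal sketch, skipping a test on older interpreters:

.. code-block:: python

    import sys

    import pytest

    @pytest.mark.skipif(sys.version_info < (3, 6), reason="requires Python 3.6 or higher")
    def test_function():
        ...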


Tutorial: :ref:`xfail`.

Marks a test function as expected to fail.

.. py:function:: pytest.mark.xfail(condition=None, *, reason=None, raises=None, run=True, strict=False)

    :type condition: bool or str
    :param condition: ``True``/``False`` indicating whether the test function should be marked as xfail, or a :ref:`condition string <string conditions>`.
    :keyword str reason: Reason why the test function is marked as xfail.
    :keyword Exception raises: Exception subclass expected to be raised by the test function; other exceptions will fail the test.
    :keyword bool run:
        If the test function should actually be executed. If ``False``, the function will always xfail and will
        not be executed (useful if a function is segfaulting).
    :keyword bool strict:
        * If ``False`` (the default) the function will be shown in the terminal output as ``xfailed`` if it fails
          and as ``xpass`` if it passes. In both cases this will not cause the test suite to fail as a whole. This
          is particularly useful to mark *flaky* tests (tests that fail at random) to be tackled later.
        * If ``True``, the function will be shown in the terminal output as ``xfailed`` if it fails, but if it
          unexpectedly passes then it will **fail** the test suite. This is particularly useful to mark functions
          that are always failing and there should be a clear indication if they unexpectedly start to pass (for example
          a new release of a library fixes a known bug).
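
A minimal sketch of a strict xfail for a hypothetical known bug that raises ``IndexError``:

.. code-block:: python

    import pytest

    @pytest.mark.xfail(raises=IndexError, reason="known bug in the parser", strict=True)
    def test_parse_edge_case():
        items = []
        assert items[0] == "x"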


Marks are created dynamically using the factory object ``pytest.mark`` and applied as a decorator.

For example:

.. code-block:: python

    @pytest.mark.timeout(10, 'slow', method='thread')
    def test_function():
        ...

This will create and attach a :class:`MarkInfo <_pytest.mark.MarkInfo>` object to the collected :class:`Item <_pytest.nodes.Item>`, which can then be accessed by fixtures or hooks with :meth:`Node.get_marker <_pytest.nodes.Node.get_marker>`. The ``mark`` object will have the following attributes:

.. code-block:: python

    mark.args == (10, 'slow')
    mark.kwargs == {'method': 'thread'}

Tutorial: :ref:`fixture`.

Fixtures are requested by test functions or other fixtures by declaring them as argument names.

Example of a test requiring a fixture:

.. code-block:: python

    def test_output(capsys):
        print('hello')
        out, err = capsys.readouterr()
        assert out == 'hello\n'

Example of a fixture requiring another fixture:

.. code-block:: python

    @pytest.fixture
    def db_session(tmpdir):
        fn = tmpdir / 'db.file'
        return connect(str(fn))

For more details, consult the full :ref:`fixtures docs <fixture>`.

.. autofunction:: pytest.fixture
    :decorator:


Tutorial: :ref:`cache`.

The ``config.cache`` object allows other plugins and fixtures to store and retrieve values across test runs. To access it from a fixture, request ``pytestconfig`` in your fixture and get it with ``pytestconfig.cache``.

Under the hood, the cache plugin uses the simple dumps/loads API of the :py:mod:`json` stdlib module.

.. currentmodule:: _pytest.cacheprovider

.. automethod:: Cache.get
.. automethod:: Cache.set
.. automethod:: Cache.makedir
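
A minimal sketch of a fixture that caches an expensive value across test runs; ``compute_expensive_value`` is a hypothetical helper, and the ``"example/value"`` key follows the usual ``pluginname/key`` naming convention:

.. code-block:: python

    import pytest

    def compute_expensive_value():  # hypothetical expensive computation
        return 42

    @pytest.fixture
    def expensive_value(pytestconfig):
        val = pytestconfig.cache.get("example/value", None)
        if val is None:
            val = compute_expensive_value()
            pytestconfig.cache.set("example/value", val)
        return val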


Tutorial: :doc:`capture`.

.. currentmodule:: _pytest.capture

.. autofunction:: capsys()
    :no-auto-options:

    Returns an instance of :py:class:`CaptureFixture`.

    Example:

    .. code-block:: python

        def test_output(capsys):
            print("hello")
            captured = capsys.readouterr()
            assert captured.out == "hello\n"

.. autoclass:: CaptureFixture()
    :members:


Tutorial: :doc:`capture`.

.. autofunction:: capsysbinary()
    :no-auto-options:

    Returns an instance of :py:class:`CaptureFixture`.

    Example:

    .. code-block:: python

        def test_output(capsysbinary):
            print("hello")
            captured = capsysbinary.readouterr()
            assert captured.out == b"hello\n"


Tutorial: :doc:`capture`.

.. autofunction:: capfd()
    :no-auto-options:

    Returns an instance of :py:class:`CaptureFixture`.

    Example:

    .. code-block:: python

        def test_system_echo(capfd):
            os.system('echo "hello"')
            captured = capfd.readouterr()
            assert captured.out == "hello\n"


Tutorial: :doc:`capture`.

.. autofunction:: capfdbinary()
    :no-auto-options:

    Returns an instance of :py:class:`CaptureFixture`.

    Example:

    .. code-block:: python

        def test_system_echo(capfdbinary):
            os.system('echo "hello"')
            captured = capfdbinary.readouterr()
            assert captured.out == b"hello\n"


Tutorial: :doc:`doctest`.

.. autofunction:: _pytest.doctest.doctest_namespace()

    Usually this fixture is used in conjunction with another ``autouse`` fixture:

    .. code-block:: python

        @pytest.fixture(autouse=True)
        def add_np(doctest_namespace):
            doctest_namespace['np'] = numpy

    For more details: :ref:`doctest_namespace`.


Tutorial: :ref:`request example`.

The ``request`` fixture is a special fixture providing information about the requesting test function.

.. autoclass:: _pytest.fixtures.FixtureRequest()
    :members:


.. autofunction:: _pytest.fixtures.pytestconfig()


Tutorial: :ref:`record_xml_property example`.

.. autofunction:: _pytest.junitxml.record_xml_property()

Tutorial: :doc:`logging`.

.. autofunction:: _pytest.logging.caplog()
    :no-auto-options:

    This returns a :class:`_pytest.logging.LogCaptureFixture` instance.

.. autoclass:: _pytest.logging.LogCaptureFixture
    :members:
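
A minimal sketch using the fixture's ``at_level`` helper and the captured ``text``; ``"app"`` is a hypothetical logger name:

.. code-block:: python

    import logging

    def test_logs_greeting(caplog):
        with caplog.at_level(logging.INFO):
            logging.getLogger("app").info("hello")
        assert "hello" in caplog.text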


.. currentmodule:: _pytest.monkeypatch

Tutorial: :doc:`monkeypatch`.

.. autofunction:: _pytest.monkeypatch.monkeypatch()
    :no-auto-options:

    This returns a :class:`MonkeyPatch` instance.

.. autoclass:: _pytest.monkeypatch.MonkeyPatch
    :members:
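
A minimal sketch using ``monkeypatch.setenv``; ``APP_MODE`` is a hypothetical environment variable:

.. code-block:: python

    import os

    def test_reads_env(monkeypatch):
        monkeypatch.setenv("APP_MODE", "testing")
        assert os.environ["APP_MODE"] == "testing"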

.. currentmodule:: _pytest.pytester

The ``testdir`` fixture provides a :class:`Testdir` instance, useful for black-box testing of test files, making it ideal to test plugins.

To use it, include in your top-most ``conftest.py`` file:

.. code-block:: python

    pytest_plugins = 'pytester'

.. autoclass:: Testdir()
    :members: runpytest,runpytest_subprocess,runpytest_inprocess,makeconftest,makepyfile

.. autoclass:: RunResult()
    :members:

.. autoclass:: LineMatcher()
    :members:
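
A minimal sketch of black-box testing a generated test file with ``testdir``, exercising ``makepyfile``, ``runpytest`` and the :class:`LineMatcher` available as the result's ``stdout``:

.. code-block:: python

    def test_plugin_runs_tests(testdir):
        testdir.makepyfile(
            """
            def test_ok():
                assert True
            """
        )
        result = testdir.runpytest()
        result.stdout.fnmatch_lines(["*1 passed*"])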


Tutorial: :ref:`assertwarnings`

.. currentmodule:: _pytest.recwarn

.. autofunction:: recwarn()
    :no-auto-options:

.. autoclass:: _pytest.recwarn.WarningsRecorder()
    :members:

Each recorded warning is an instance of :class:`warnings.WarningMessage`.

.. note::

    :class:`RecordedWarning` was changed from a plain class to a namedtuple in pytest 3.1.

.. note::

    ``DeprecationWarning`` and ``PendingDeprecationWarning`` are treated differently; see :ref:`ensuring_function_triggers`.
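
A minimal sketch of the ``recwarn`` fixture recording and inspecting a warning:

.. code-block:: python

    import warnings

    def test_hello(recwarn):
        warnings.warn("deprecated", DeprecationWarning)
        assert len(recwarn) == 1
        w = recwarn.pop(DeprecationWarning)
        assert issubclass(w.category, DeprecationWarning)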

Tutorial: :doc:`tmpdir`

.. currentmodule:: _pytest.tmpdir

.. autofunction:: tmpdir()
    :no-auto-options:
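
A minimal sketch; ``tmpdir`` is a ``py.path.local`` object, so files can be created and read through its API:

.. code-block:: python

    def test_create_file(tmpdir):
        p = tmpdir.join("hello.txt")
        p.write("content")
        assert p.read() == "content"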


Tutorial: :ref:`tmpdir factory example`

``tmpdir_factory`` instances have the following methods:

.. currentmodule:: _pytest.tmpdir

.. automethod:: TempdirFactory.mktemp
.. automethod:: TempdirFactory.getbasetemp
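
A minimal sketch of a session-scoped fixture backed by ``tmpdir_factory``; ``"images"`` is a hypothetical directory name:

.. code-block:: python

    import pytest

    @pytest.fixture(scope="session")
    def image_dir(tmpdir_factory):
        return tmpdir_factory.mktemp("images")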


Tutorial: :doc:`writing_plugins`.

.. currentmodule:: _pytest.hookspec

Reference to all hooks which can be implemented by :ref:`conftest.py files <localplugin>` and :ref:`plugins <plugins>`.

Bootstrapping hooks called for plugins registered early enough (internal and setuptools plugins).

.. autofunction:: pytest_load_initial_conftests
.. autofunction:: pytest_cmdline_preparse
.. autofunction:: pytest_cmdline_parse
.. autofunction:: pytest_cmdline_main

Initialization hooks called for plugins and conftest.py files.

.. autofunction:: pytest_addoption
.. autofunction:: pytest_addhooks
.. autofunction:: pytest_configure
.. autofunction:: pytest_unconfigure
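
A minimal sketch of ``pytest_addoption`` in a ``conftest.py``, registering a hypothetical ``--runslow`` flag:

.. code-block:: python

    # conftest.py
    def pytest_addoption(parser):
        parser.addoption(
            "--runslow",
            action="store_true",
            default=False,
            help="run tests marked as slow",
        )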

All runtest related hooks receive a :py:class:`pytest.Item <_pytest.main.Item>` object.

.. autofunction:: pytest_runtestloop
.. autofunction:: pytest_runtest_protocol
.. autofunction:: pytest_runtest_logstart
.. autofunction:: pytest_runtest_logfinish
.. autofunction:: pytest_runtest_setup
.. autofunction:: pytest_runtest_call
.. autofunction:: pytest_runtest_teardown
.. autofunction:: pytest_runtest_makereport

For deeper understanding you may look at the default implementation of these hooks in :py:mod:`_pytest.runner` and maybe also in :py:mod:`_pytest.pdb` which interacts with :py:mod:`_pytest.capture` and its input/output capturing in order to immediately drop into interactive debugging when a test failure occurs.

The :py:mod:`_pytest.terminal` reporter specifically uses the reporting hook to print information about a test run.

pytest calls the following hooks for collecting files and directories:

.. autofunction:: pytest_collection
.. autofunction:: pytest_ignore_collect
.. autofunction:: pytest_collect_directory
.. autofunction:: pytest_collect_file

For influencing the collection of objects in Python modules you can use the following hooks:

.. autofunction:: pytest_pycollect_makeitem
.. autofunction:: pytest_generate_tests
.. autofunction:: pytest_make_parametrize_id

After collection is complete, you can modify the order of items, delete or otherwise amend the test items:

.. autofunction:: pytest_collection_modifyitems
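
A minimal sketch of this hook in a ``conftest.py``; as a toy illustration it simply reverses the collection order:

.. code-block:: python

    # conftest.py
    def pytest_collection_modifyitems(config, items):
        items.reverse()  # run tests in reverse collection order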

Session related reporting hooks:

.. autofunction:: pytest_collectstart
.. autofunction:: pytest_itemcollected
.. autofunction:: pytest_collectreport
.. autofunction:: pytest_deselected
.. autofunction:: pytest_report_header
.. autofunction:: pytest_report_collectionfinish
.. autofunction:: pytest_report_teststatus
.. autofunction:: pytest_terminal_summary
.. autofunction:: pytest_fixture_setup
.. autofunction:: pytest_fixture_post_finalizer

And here is the central hook for reporting about test execution:

.. autofunction:: pytest_runtest_logreport

You can also use this hook to customize assertion representation for some types:

.. autofunction:: pytest_assertrepr_compare


There are a few hooks which can be used for special reporting or interaction with exceptions:

.. autofunction:: pytest_internalerror
.. autofunction:: pytest_keyboard_interrupt
.. autofunction:: pytest_exception_interact
.. autofunction:: pytest_enter_pdb


Full reference to objects accessible from :ref:`fixtures <fixture>` or :ref:`hooks <hook-reference>`.

.. autoclass:: _pytest.runner.CallInfo()
    :members:


.. autoclass:: _pytest.python.Class()
    :members:
    :show-inheritance:

.. autoclass:: _pytest.nodes.Collector()
    :members:
    :show-inheritance:

.. autoclass:: _pytest.config.Config()
    :members:

.. autoclass:: _pytest._code.ExceptionInfo
    :members:

.. autoclass:: _pytest.fixtures.FixtureDef()
    :members:
    :show-inheritance:

.. autoclass:: _pytest.nodes.FSCollector()
    :members:
    :show-inheritance:

.. autoclass:: _pytest.python.Function()
    :members:
    :show-inheritance:

.. autoclass:: _pytest.nodes.Item()
    :members:
    :show-inheritance:

.. autoclass:: _pytest.mark.MarkDecorator
    :members:

.. autoclass:: _pytest.mark.MarkGenerator
    :members:

.. autoclass:: _pytest.mark.MarkInfo
    :members:

.. autoclass:: _pytest.python.Metafunc
    :members:

.. autoclass:: _pytest.python.Module()
    :members:
    :show-inheritance:

.. autoclass:: _pytest.nodes.Node()
    :members:

.. autoclass:: _pytest.config.Parser()
    :members:

.. autoclass:: pluggy.PluginManager()
    :members:


.. autoclass:: _pytest.config.PytestPluginManager()
    :members:
    :undoc-members:
    :show-inheritance:

.. autoclass:: _pytest.main.Session()
    :members:
    :show-inheritance:

.. autoclass:: _pytest.runner.TestReport()
    :members:
    :inherited-members:

.. autoclass:: pluggy._Result
    :members:

pytest treats some global variables in a special manner when defined in a test module.

Tutorial: :ref:`available installable plugins`

Can be declared at the global level in test modules and ``conftest.py`` files to register additional plugins. Can be either a ``str`` or ``Sequence[str]``.

.. code-block:: python

    pytest_plugins = "myapp.testsupport.myplugin"

.. code-block:: python

    pytest_plugins = ("myapp.testsupport.tools", "myapp.testsupport.regression")

Tutorial: :ref:`scoped-marking`

Can be declared at the global level in test modules to apply one or more :ref:`marks <marks ref>` to all test functions and methods. Can be either a single mark or a sequence of marks.

.. code-block:: python

    import pytest

    pytestmark = pytest.mark.webtest

.. code-block:: python

    import pytest

    pytestmark = (pytest.mark.integration, pytest.mark.slow)