1 change: 1 addition & 0 deletions AUTHORS
@@ -6,6 +6,7 @@ Contributors include::
Aaron Coleman
Abdeali JK
Abhijeet Kasurde
+Adam Johnson
Ahn Ki-Wook
Alan Velasco
Alexander Johnson
1 change: 1 addition & 0 deletions changelog/4557.doc.rst
@@ -0,0 +1 @@
+Markers example documentation page updated to support latest pytest version.
1 change: 1 addition & 0 deletions changelog/4558.doc.rst
@@ -0,0 +1 @@
+Update cache documentation example to correctly show cache hit and miss.
1 change: 1 addition & 0 deletions changelog/4580.doc.rst
@@ -0,0 +1 @@
+Improved detailed summary report documentation.
10 changes: 6 additions & 4 deletions doc/en/cache.rst
@@ -185,20 +185,22 @@ across pytest invocations::
import pytest
import time

+def expensive_computation():
+    print("running expensive computation...")
+
@pytest.fixture
def mydata(request):
    val = request.config.cache.get("example/value", None)
    if val is None:
-        time.sleep(9*0.6) # expensive computation :)
+        expensive_computation()
        val = 42
        request.config.cache.set("example/value", val)
    return val

def test_function(mydata):
    assert mydata == 23

-If you run this command once, it will take a while because
-of the sleep:
+If you run this command for the first time, you can see the print statement:

.. code-block:: pytest

@@ -217,7 +219,7 @@ of the sleep:
1 failed in 0.12 seconds

If you run it a second time the value will be retrieved from
-the cache and this will be quick:
+the cache and nothing will be printed:

.. code-block:: pytest

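Assembled, the new version of the example forms a complete module along these lines (the ``test_caching.py`` filename is illustrative): on the first run the cache misses, ``expensive_computation()`` prints, and 42 is stored; later runs return the cached value without printing. Passing ``--cache-clear`` forces the miss path again.

.. code-block:: python

    # test_caching.py -- a sketch assembled from the new version of the example above
    import pytest

    def expensive_computation():
        print("running expensive computation...")

    @pytest.fixture
    def mydata(request):
        # returns the stored value, or the given default (None) on a cache miss
        val = request.config.cache.get("example/value", None)
        if val is None:
            expensive_computation()
            val = 42
            # persist the value across pytest invocations (under .pytest_cache/)
            request.config.cache.set("example/value", val)
        return val

    def test_function(mydata):
        # fails on purpose (mydata is 42), so the run output shows the cached value
        assert mydata == 23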
11 changes: 1 addition & 10 deletions doc/en/example/markers.rst
@@ -308,7 +308,7 @@ apply a marker to an individual test instance::
@pytest.mark.foo
@pytest.mark.parametrize(("n", "expected"), [
    (1, 2),
-    pytest.mark.bar((1, 3)),
+    pytest.param((1, 3), marks=pytest.mark.bar),
    (2, 3),
])
def test_increment(n, expected):
@@ -318,15 +318,6 @@ In this example the mark "foo" will apply to each of the three
tests, whereas the "bar" mark is only applied to the second test.
Skip and xfail marks can also be applied in this way, see :ref:`skip/xfail with parametrize`.

-.. note::
-
-    If the data you are parametrizing happen to be single callables, you need to be careful
-    when marking these items. ``pytest.mark.xfail(my_func)`` won't work because it's also the
-    signature of a function being decorated. To resolve this ambiguity, you need to pass a
-    reason argument:
-    ``pytest.mark.xfail(func_bar, reason="Issue#7")``.
-
-
.. _`adding a custom marker from a plugin`:

Custom marker and command line option to control test runs
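For a runnable illustration of the ``pytest.param`` mechanism used above, here is a sketch in the style of the skip/xfail-with-parametrize pattern the text links to (the values and the ``reason`` string are illustrative):

.. code-block:: python

    import pytest

    @pytest.mark.parametrize(
        ("test_input", "expected"),
        [
            ("3+5", 8),
            ("2+4", 6),
            # the mark applies to this one parameter set only
            pytest.param("6*9", 42, marks=pytest.mark.xfail(reason="Issue#7")),
        ],
    )
    def test_eval(test_input, expected):
        assert eval(test_input) == expected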
2 changes: 1 addition & 1 deletion doc/en/fixture.rst
@@ -804,7 +804,7 @@ different ``App`` instances and respective smtp servers. There is no
need for the ``app`` fixture to be aware of the ``smtp_connection``
parametrization because pytest will fully analyse the fixture dependency graph.

-Note, that the ``app`` fixture has a scope of ``module`` and uses a
+Note that the ``app`` fixture has a scope of ``module`` and uses a
module-scoped ``smtp_connection`` fixture. The example would still work if
``smtp_connection`` was cached on a ``session`` scope: it is fine for fixtures to use
"broader" scoped fixtures but not the other way round:
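The scope rule restated in this hunk can be sketched as follows; the fixture bodies are stand-ins, only the scope relationship matters:

.. code-block:: python

    import pytest

    @pytest.fixture(scope="session")
    def smtp_connection():
        # broader scope: created at most once per test session
        return object()  # stand-in for a real SMTP connection

    @pytest.fixture(scope="module")
    def app(smtp_connection):
        # a narrower-scoped fixture may depend on a broader-scoped one;
        # the reverse direction fails with a ScopeMismatch error
        return {"smtp_connection": smtp_connection}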
3 changes: 0 additions & 3 deletions doc/en/getting-started.rst
@@ -7,9 +7,6 @@ Installation and Getting Started

**PyPI package name**: `pytest <https://pypi.org/project/pytest/>`_

-**Dependencies**: `py <https://pypi.org/project/py/>`_,
-`colorama (Windows) <https://pypi.org/project/colorama/>`_,
-
**Documentation as PDF**: `download latest <https://media.readthedocs.org/pdf/pytest/latest/pytest.pdf>`_

``pytest`` is a framework that makes building simple and scalable tests easy. Tests are expressive and readable—no boilerplate code required. Get started in minutes with a small unit test or complex functional test for your application or library.
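A minimal first test in that spirit is the classic example used later on this page:

.. code-block:: python

    # content of test_sample.py
    def func(x):
        return x + 1

    def test_answer():
        assert func(3) == 5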
72 changes: 66 additions & 6 deletions doc/en/usage.rst
@@ -147,7 +147,7 @@ Detailed summary report

.. versionadded:: 2.9

-The ``-r`` flag can be used to display test results summary at the end of the test session,
+The ``-r`` flag can be used to display a "short test summary info" at the end of the test session,
making it easy in large test suites to get a clear picture of all failures, skips, xfails, etc.

Example:
@@ -158,9 +158,34 @@ Example:
=========================== test session starts ============================
platform linux -- Python 3.x.y, pytest-4.x.y, py-1.x.y, pluggy-0.x.y
rootdir: $REGENDOC_TMPDIR, inifile:
-collected 0 items
-
-======================= no tests ran in 0.12 seconds =======================
+collected 7 items
+
+test_examples.py ..FEsxX [100%]
+
+==================================== ERRORS ====================================
+_________________________ ERROR at setup of test_error _________________________
+file /Users/chainz/tmp/pytestratest/test_examples.py, line 17
+  def test_error(unknown_fixture):
+E       fixture 'unknown_fixture' not found
+>       available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_xml_attribute, record_xml_property, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
+>       use 'pytest --fixtures [testpath]' for help on them.
+
+/Users/chainz/tmp/pytestratest/test_examples.py:17
+=================================== FAILURES ===================================
+__________________________________ test_fail ___________________________________
+
+    def test_fail():
+>       assert 0
+E       assert 0
+
+test_examples.py:14: AssertionError
+=========================== short test summary info ============================
+FAIL test_examples.py::test_fail
+ERROR test_examples.py::test_error
+SKIP [1] test_examples.py:21: Example
+XFAIL test_examples.py::test_xfail
+XPASS test_examples.py::test_xpass
+= 1 failed, 2 passed, 1 skipped, 1 xfailed, 1 xpassed, 1 error in 0.07 seconds =
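For orientation, a hypothetical ``test_examples.py`` consistent with the report above could look like this; it is reconstructed from the summary lines, so exact line positions are approximate:

.. code-block:: python

    import pytest

    def test_pass():
        pass

    def test_pass_with_output():
        print("Passing test")

    def test_fail():
        assert 0

    def test_error(unknown_fixture):  # errors at setup: no such fixture exists
        pass

    def test_skip():
        pytest.skip("Example")

    @pytest.mark.xfail
    def test_xfail():
        assert 0

    @pytest.mark.xfail
    def test_xpass():
        pass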

The ``-r`` option accepts a number of characters after it, with ``a`` used above meaning "all except passes".

@@ -183,9 +208,44 @@ More than one character can be used, so for example to only see failed and skipp
=========================== test session starts ============================
platform linux -- Python 3.x.y, pytest-4.x.y, py-1.x.y, pluggy-0.x.y
rootdir: $REGENDOC_TMPDIR, inifile:
-collected 0 items
-
-======================= no tests ran in 0.12 seconds =======================
+collected 2 items
+
+test_examples.py Fs [100%]
+
+=================================== FAILURES ===================================
+__________________________________ test_fail ___________________________________
+
+    def test_fail():
+>       assert 0
+E       assert 0
+
+test_examples.py:14: AssertionError
+=========================== short test summary info ============================
+FAIL test_examples.py::test_fail
+SKIP [1] test_examples.py:21: Example
+===================== 1 failed, 1 skipped in 0.09 seconds ======================
+
+Using ``p`` lists the passing tests, whilst ``P`` adds an extra section "PASSES" with those tests that passed but had
+captured output:
+
+.. code-block:: pytest
+
+$ pytest -rpP
+=========================== test session starts ============================
+platform linux -- Python 3.x.y, pytest-4.x.y, py-1.x.y, pluggy-0.x.y
+rootdir: $REGENDOC_TMPDIR, inifile:
+collected 2 items
+
+test_examples.py .. [100%]
+=========================== short test summary info ============================
+PASSED test_examples.py::test_pass
+PASSED test_examples.py::test_pass_with_output
+
+==================================== PASSES ====================================
+____________________________ test_pass_with_output _____________________________
+----------------------------- Captured stdout call -----------------------------
+Passing test
+=========================== 2 passed in 0.04 seconds ===========================

.. _pdb-option:
