
Integration between twister and pytest: tracking issue. #58288

Open · 6 of 9 tasks
PerMac opened this issue May 25, 2023 · 12 comments
Labels: area: Twister, Enhancement

Comments

@PerMac (Member)

PerMac commented May 25, 2023

Integration between twister and pytest: tracking issue.

An integration between pytest and twister was added with #57117. More details about the integration can be found at https://docs.zephyrproject.org/latest/develop/test/pytest.html. This issue is meant to track the current limitations of the integration and to provide a space to discuss what is to be added and/or improved, and how. For bigger tasks, individual issues will be created and linked in the list.

Known limitations:

Requested functionality:

Ideas for pytest usage:

  • MCUboot + mcumgr tests. Comment: we have already started developing those; work in progress. See tests: mcuboot: pytest: Add image swap test with mcumgr #58393.
  • End-to-end tests. Comment: @SeppoTakalo already has a prototype for LwM2M with a Leshan virtual server. Work in progress.
  • Porting bash-based babblesim tests to pytest. Comment: we have already started looking into those and have some ideas.
PerMac added the Enhancement label May 25, 2023
PerMac mentioned this issue May 25, 2023
@SeppoTakalo (Collaborator)

Pytest doesn't seem to follow the timeout value from testcase.yaml.

I had to modify twister_harness/device/simulator_adapter.py.

@PerMac (Member, Author)

PerMac commented Jun 2, 2023

@SeppoTakalo Thanks for pointing this out! We already have a fix waiting to be accepted: #58491 (comment). Does it fix what you are referring to?

@SeppoTakalo (Collaborator)

@PerMac
No, that does not fix the timeout.

When the simulation starts, device.flash_and_run() is called without any timeout parameter.

So even if I define my test case like this:

tests:
  sample.net.lwm2m:
    harness: pytest
    timeout: 300

That timeout does not affect how long the simulation can run. I would expect the test case timeout parameter to flow into flash_and_run() as well.

@PerMac (Member, Author)

PerMac commented Jun 19, 2023

Indeed, thanks for pointing this out. I think the timeout from the yaml might not be the best value to pass, since it applies to the whole scenario (one built image), i.e. the time for all related pytest tests combined. If we want to have 5 tests, each with a 100 s timeout, then the timeout in the yaml should be 500 s. However, as you pointed out, it is not yet possible to set a timeout for each individual test. We are thinking of adding timeout parametrization to the dut fixture, so that a developer will be able to define the needed timeout on a test level when asking for the dut fixture.
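
A minimal sketch of how such per-test timeouts could look from a test author's perspective (the launch_timeout marker and fixture below are hypothetical illustrations, not an existing twister_harness API):

# conftest.py - hypothetical sketch, not the actual twister_harness implementation
import pytest

DEFAULT_LAUNCH_TIMEOUT = 60.0  # assumed default, mirroring the hardcoded 60 s flash timeout


def pytest_configure(config):
    # register the custom marker so pytest does not warn about it
    config.addinivalue_line(
        'markers', 'launch_timeout(seconds): per-test timeout for running the DUT')


@pytest.fixture
def launch_timeout(request) -> float:
    """Return the timeout requested by the test via the marker, or the default."""
    marker = request.node.get_closest_marker('launch_timeout')
    return float(marker.args[0]) if marker else DEFAULT_LAUNCH_TIMEOUT


# test_sample.py - a test asking for a 300 s run, independently of other tests
@pytest.mark.launch_timeout(300)
def test_lwm2m_registration(launch_timeout):
    # a real dut-like fixture would forward this value into flash_and_run()
    assert launch_timeout == 300.0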

@gchwier (Collaborator)

gchwier commented Jun 19, 2023

@SeppoTakalo
The flash timeout is currently hardcoded to 60 s. There is no separate timeout for the simulation (I agree adding one would be valuable).
The timeout from testcase.yaml is used to avoid blocking the CI:

reader_t.join(timeout)

This timeout is applied per test entry in testcase.yaml (flash + simulation), so if you have a couple of tests in a pytest file, you should consider extending it. If you run pytest directly (without calling Twister), the timeout from the yaml is not used.

@SeppoTakalo (Collaborator)

@gchwier In PR #58791 I'm working on test cases that run the LwM2M client against a real server. Many tests defined in the LwM2M specification require waiting for more than 30 seconds, so I cannot execute those without significantly modifying the simulation time.

@gopiotr (Collaborator)

gopiotr commented Aug 7, 2023

Pytest doesn't seem to follow the timeout value from testcase.yaml.

I had to modify twister_harness/device/simulator_adapter.py.

@SeppoTakalo I created a PR that addresses the issue of not respecting the timeout from testcase.yaml: #61224. If you have time, please verify whether it works for you.

@SeppoTakalo (Collaborator)

Feature request

When testing the LwM2M client, it would help me a lot if I could share a DUT between test cases.
Currently the DUT and Shell fixtures are function scoped. If they could optionally be session scoped, I could run the setup phase once for the DUT and then run multiple test cases in sequence.

@gopiotr (Collaborator)

gopiotr commented Oct 19, 2023

Feature request

When testing the LwM2M client, it would help me a lot if I could share a DUT between test cases. Currently the DUT and Shell fixtures are function scoped. If they could optionally be session scoped, I could run the setup phase once for the DUT and then run multiple test cases in sequence.

@SeppoTakalo - thank you for letting us know. We discussed this briefly; it is possible to implement, but it is not a "one line change" and we need some time to take a look at it. We have to rethink how to deal with the initialization of logging files regardless of the dut fixture scope.

In the meantime, if you would like to achieve something like what you described, you can create your own dut_session_scope fixture based on the device_object fixture (which has session scope). For example:

import logging
from typing import Generator

import pytest
from twister_harness import DeviceAdapter, Shell

logger = logging.getLogger(__name__)


@pytest.fixture(scope='session')
def dut_session_scope(device_object: DeviceAdapter) -> Generator[DeviceAdapter, None, None]:
    """Return launched device - with run application."""
    try:
        device_object.launch()
        yield device_object
    finally:  # to make sure we close all running processes after the tests
        device_object.close()


@pytest.fixture(scope='session')
def shell_session_scope(dut_session_scope: DeviceAdapter) -> Shell:
    """Return ready to use shell interface."""
    shell = Shell(dut_session_scope, timeout=20.0)
    logger.info('Wait for prompt')
    assert shell.wait_for_prompt()
    return shell
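
A minimal usage sketch, assuming the tested image enables the Zephyr shell and its kernel commands: both tests below reuse the same running DUT through the session-scoped fixtures above, so the application is launched only once.

def test_shell_prints_version(shell_session_scope: Shell):
    # reuses the session-scoped DUT started by dut_session_scope
    lines = shell_session_scope.exec_command('kernel version')
    assert any('Zephyr version' in line for line in lines)


def test_shell_prints_uptime(shell_session_scope: Shell):
    # assumes the 'kernel uptime' shell command is enabled in the image
    lines = shell_session_scope.exec_command('kernel uptime')
    assert len(lines) > 0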

@gchwier (Collaborator)

gchwier commented Oct 25, 2023

@SeppoTakalo please find, test, and review the requested feature: #64356

@SeppoTakalo (Collaborator)

@PerMac I have another feature request:

I'm planning to mark some testcases using Pytest markers, like slow/fast/smoke, etc.

@pytest.mark.slow
def test_LightweightM2M_1_1_int_310(shell: Shell, leshan: Leshan, endpoint: str):
    ...

Then in testcase.yaml I could use those to filter tests:

tests:
  net.lwm2m.interop.smoke:
    harness: pytest
    timeout: 60
    harness_config:
      pytest_dut_scope: module
      pytest_args: ['-k not slow']
  net.lwm2m.interop.slow:
    harness: pytest
    timeout: 600
    slow: true
    harness_config:
      pytest_dut_scope: module
      pytest_args: ['-k slow']

However, that fails because if I allow slow tests to run, both test configurations start at the same time, and the Zephyr network (TUN/TAP) is only capable of running one instance at a time.

So this needs to be solved in one of two ways:

  • Allow me to feed pytest_args from Twister command line
  • Allow me to limit which tests can run in parallel.

@gchwier (Collaborator)

gchwier commented Nov 15, 2023

  • Allow me to limit which tests can run in parallel.

@SeppoTakalo it is not easy to limit that. This is rather a request for Twister core functionality, not related to pytest.
For now, the simplest solution / workaround for you is to run Twister with --jobs 1 - it limits the number of jobs used for building / executing tests (but it increases the build time, so you can run Twister separately for building with --build-only on all jobs and for executing with --test-only --jobs 1).

  • Allow me to feed pytest_args from Twister command line

I will try to add this next week. We thought about that as well, to have the possibility of filtering tests using a pytest command,
e.g. to run only one test:
-T zephyr/tests/net/lib/lwm2m/interop -s tests/net/lib/lwm2m/interop/net.lwm2m.interop --pytest-args='-k verify_LightweightM2M_1_1_int_0'
but when someone already uses pytest_args in testcase.yaml, it will be overwritten.

Using markers in pytest scripts is great, but it is 'hidden' from Twister. Maybe it would be better to add some filtering to testcase.yaml.
For now, you can mark the test configuration in testcase.yaml as slow, select which scenario / module is used, and then run Twister with the --enable-slow flag.

slow: true
harness_config:
  pytest_root:
    - pytest/test_slow_scenarios.py
    - pytest/test_lwm2m.py::test_LightweightM2M_1_1_int_102
    - pytest/test_lwm2m.py::test_LightweightM2M_1_1_int_104

(you can find some examples in the Twister docs)
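
As a side note, if markers like slow/smoke are used for filtering via pytest_args, a generic pytest option (not a Twister feature) is to register them in the suite's conftest.py so the filtering stays explicit and warning-free; a minimal sketch:

# conftest.py - generic pytest sketch, unrelated to Twister itself
def pytest_configure(config):
    config.addinivalue_line('markers', 'slow: long-running LwM2M interop tests')
    config.addinivalue_line('markers', 'smoke: quick sanity tests')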
