Integration between twister and pytest: tracking issue. #58288
Pytest doesn't seem to follow the timeout value from testcase.yaml. I had to modify the
@SeppoTakalo Thanks for pointing this out! We already have a fix for this waiting to be accepted: #58491 (comment). Is this fixing what you are referring to?
@PerMac When the simulation starts, it does not seem to be bounded by any timeout.
So even if I define my test case like this:
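A minimal sketch of such a scenario definition (the scenario name and the timeout value are illustrative):

```yaml
net.lwm2m.interop:   # hypothetical scenario name
  harness: pytest
  timeout: 300       # expected to bound the whole run, including the simulation
```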
That timeout does not affect how long the simulation can run. I would assume that the testcase timeout parameter should flow into the pytest execution as well.
Indeed, thanks for pointing this out. I think the timeout from the yaml might not be the best value to pass, since it is a value for the whole scenario (an image built), i.e. the time for all related pytest tests combined. If we want to have 5 tests, each with a 100 s timeout, then the timeout in the yaml should be 500 s. However, as you pointed out, it is not yet possible to set timeouts for each individual test. We are thinking of adding timeout parametrization to the dut fixture, so a developer will be able to define the needed timeout at the test level when asking for the dut fixture.
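To illustrate the proposal, a hypothetical shape for such per-test parametrization, using pytest's standard indirect parametrization (the `{'timeout': ...}` request format is an assumption, not an existing twister_harness API):

```python
import pytest

# hypothetical: request a per-test timeout from the dut fixture via indirect
# parametrization; the {'timeout': ...} request format is assumed here and is
# not part of the current twister_harness API
@pytest.mark.parametrize('dut', [{'timeout': 120.0}], indirect=True)
def test_long_running_scenario(dut):
    ...
```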
@SeppoTakalo This timeout is used per one test scenario in testcase.yaml (flash + simulation), so if you have a couple of tests in a pytest file, you should consider extending this timeout. If you run pytest directly (without a twister call), the timeout from the yaml is not used.
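For example, a sketch with assumed numbers: a pytest file with 5 tests, each needing up to 100 s, should get a scenario timeout of roughly 500 s:

```yaml
sample.pytest.scenario:   # hypothetical scenario name
  harness: pytest
  timeout: 500            # flash + all 5 pytest tests (~100 s each) combined
```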
@SeppoTakalo I created a PR where I solved this issue connected with "not respecting `timeout` from testcase.yaml".
Feature request: in testing the LwM2M client, it would help me a lot if I could share a DUT between test cases.
@SeppoTakalo - thank you for letting us know. We discussed this briefly: it is possible to implement, but it is not a "one line change" and we need some time to take a look at it. We have to rethink how to deal with the initialization of logging files no matter what the fixture scope is. Temporarily, if you would like to achieve something like what you described, you can create your own session-scoped fixtures:

```python
import logging
from typing import Generator

import pytest
from twister_harness import DeviceAdapter, Shell

logger = logging.getLogger(__name__)


@pytest.fixture(scope='session')
def dut_session_scope(device_object: DeviceAdapter) -> Generator[DeviceAdapter, None, None]:
    """Return launched device - with running application."""
    try:
        device_object.launch()
        yield device_object
    finally:  # to make sure we close all running processes
        device_object.close()


@pytest.fixture(scope='session')
def shell_session_scope(dut_session_scope: DeviceAdapter) -> Shell:
    """Return ready-to-use shell interface."""
    shell = Shell(dut_session_scope, timeout=20.0)
    logger.info('Wait for prompt')
    assert shell.wait_for_prompt()
    return shell
```
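A test can then request the session-scoped shell instead of the default function-scoped `shell` fixture; a minimal sketch, assuming the kernel shell commands are enabled in the image:

```python
def test_kernel_version(shell_session_scope: Shell):
    # the same launched DUT is reused by every test requesting this fixture
    lines = shell_session_scope.exec_command('kernel version')
    assert any('Zephyr version' in line for line in lines), 'version banner not found'
```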
@SeppoTakalo please find, test, and review the requested feature: #64356
@PerMac I have another feature request: I'm planning to mark some test cases using pytest markers, like slow/fast/smoke, etc. In the tests:

```python
@pytest.mark.slow
def test_LightweightM2M_1_1_int_310(shell: Shell, leshan: Leshan, endpoint: str):
    ...
```

Then in the `tests:` section of testcase.yaml:
```yaml
net.lwm2m.interop.smoke:
  harness: pytest
  timeout: 60
  harness_config:
    pytest_dut_scope: module
    pytest_args: ['-k not slow']
net.lwm2m.interop.slow:
  harness: pytest
  timeout: 600
  slow: true
  harness_config:
    pytest_dut_scope: module
    pytest_args: ['-k slow']
```

However, that fails: if I allow slow tests to run, both scenarios start at the same time, and the Zephyr network (TUN/TAP) is only capable of running one instance at a time. So this needs to be solved, either by:

- limiting Twister so that these two scenarios never run in parallel, or
- letting Twister itself filter tests by marker, so that a single scenario suffices.
@SeppoTakalo it is not easy to limit that. This is rather a request for Twister core functionality, not related to pytest.
I will try to add this next week. We thought about that also, just to have the possibility to filter tests using the pytest command. Using markers in pytest scripts is great, but it is 'hidden' from Twister. Maybe it would be better to add some filtering on the Twister side (you can find some examples in the docs: twister).
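For comparison, Twister's existing tag mechanism already allows a similar split at the Twister level; a sketch, assuming the slow scenarios are tagged (names illustrative):

```yaml
net.lwm2m.interop.slow:
  harness: pytest
  timeout: 600
  tags: slow
# select or exclude whole scenarios at invocation time, e.g.:
#   west twister --tag slow ...
#   west twister --exclude-tag slow ...
```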
Integration between twister and pytest: tracking issue.
An integration between pytest and twister was added with #57117. More details about the integration can be found at https://docs.zephyrproject.org/latest/develop/test/pytest.html. This issue is meant to track the current limitations of the integration and to provide a space to discuss what is to be added and/or improved, and how. For bigger tasks, individual issues will be created and linked in the list.
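As those docs describe, a scenario with `harness: pytest` drives a pytest script that reaches the device through `twister_harness` fixtures; a minimal sketch, closely following the documented shell example:

```python
# test_sample.py, run by twister for a scenario declaring `harness: pytest`
from twister_harness import Shell


def test_shell_print_help(shell: Shell):
    # `shell` is provided by the twister_harness pytest plugin
    lines = shell.exec_command('help')
    assert 'Available commands:' in lines, 'expected response not found'
```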
Known limitations:
Requested functionality:
Ideas for pytest usage: