
Intermittent test failures from pcds-envs #169

Open
ZLLentz opened this issue Oct 27, 2022 · 0 comments

ZLLentz commented Oct 27, 2022

Expected Behavior

Tests should always pass!

Current Behavior

See pcdshub/pcds-envs#253

lightpath/tests/test_gui.py::test_upstream_check occasionally hits the 10-minute timeout.

________________________________ test_filtering ________________________________
lightapp = <lightpath.ui.gui.LightApp object at 0x7faf975e80d0>
monkeypatch = <_pytest.monkeypatch.MonkeyPatch object at 0x7faf9c1b37f0>
    def test_filtering(lightapp: LightApp, monkeypatch):
        lightapp.destination_combo.setCurrentIndex(4)  # set current to MEC
        # Create mock functions
        for row in lightapp.rows:
            monkeypatch.setattr(row[0], 'setHidden', Mock())
        # Initialize properly with nothing hidden
        lightapp.filter()
        for row in lightapp.rows:
            row[0].setHidden.assert_called_with(False)
        # Insert at least one device then hide
        device_row = lightapp.rows[2][0]
        device_row.device.insert()
        lightapp.remove_check.setChecked(False)
        # Reset mock
        for row in lightapp.rows:
            row[0].setHidden.reset_mock()
        lightapp.filter()
        for row in lightapp.rows:
            if row[0].device.get_lightpath_state().removed:
>               row[0].setHidden.assert_called_with(True)
lightpath/tests/test_gui.py:90: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
self = <Mock id='140392215288080'>, args = (True,), kwargs = {}
expected = call(True), actual = call(False)
_error_message = <function NonCallableMock.assert_called_with.<locals>._error_message at 0x7faf9762c9d0>
cause = None
    def assert_called_with(self, /, *args, **kwargs):
        """assert that the last call was made with the specified arguments.
    
        Raises an AssertionError if the args and keyword args passed in are
        different to the last call to the mock."""
        if self.call_args is None:
            expected = self._format_mock_call_signature(args, kwargs)
            actual = 'not called.'
            error_message = ('expected call not found.\nExpected: %s\nActual: %s'
                    % (expected, actual))
            raise AssertionError(error_message)
    
        def _error_message():
            msg = self._format_mock_failure_message(args, kwargs)
            return msg
        expected = self._call_matcher(_Call((args, kwargs), two=True))
        actual = self._call_matcher(self.call_args)
        if actual != expected:
            cause = expected if isinstance(expected, Exception) else None
>           raise AssertionError(_error_message()) from cause
E           AssertionError: expected call not found.
E           Expected: mock(True)
E           Actual: mock(False)
../../../miniconda/envs/pcds-next-incr/lib/python3.9/unittest/mock.py:907: AssertionError
________________________________ test_callback _________________________________

path = BeamPath(range=(0.0, 30.0), devices=11)

    def test_callback(path: BeamPath):
        # Create mock callback
        cb = Mock()
        # Subscribe to event changes
        path.subscribe(cb, run=False)
        # Change state of beampath
        path.devices[4].insert()
        # Assert callback has been run
>       assert cb.called
E       AssertionError: assert False
E        +  where False = <Mock id='140651858531136'>.called

lightpath/tests/test_path.py:207: AssertionError
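Both failures look consistent with callbacks firing asynchronously relative to the test's assertions. As a minimal sketch (the background-thread dispatch here is an assumption about how the subscription machinery might behave, not lightpath's confirmed implementation), a Mock triggered from another thread can legitimately report `called == False` when checked immediately:

```python
import threading
import time
from unittest.mock import Mock

cb = Mock()

def subscribe_and_trigger(callback):
    # Stand-in for a subscription system that dispatches callbacks on a
    # background thread; purely illustrative, not lightpath's actual code.
    threading.Timer(0.05, callback).start()

subscribe_and_trigger(cb)
print(cb.called)   # may still be False: the callback has not fired yet
time.sleep(0.2)    # give the background thread time to run
print(cb.called)   # True once the dispatch completes
```

If the real dispatch is similarly asynchronous, whether `assert cb.called` passes depends entirely on scheduling, which would explain why the failures are intermittent.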

Possible Solution

Reproduce these failures offline, or otherwise determine why they happen, and mitigate the likely race conditions.
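One common mitigation is to replace bare assertions with a bounded poll. This is only a sketch: `wait_until_called` is a hypothetical helper, not part of lightpath, and for the Qt-based GUI tests pytest-qt's `qtbot.waitUntil` provides similar behavior.

```python
import threading
import time
from unittest.mock import Mock

def wait_until_called(mock, timeout=5.0, interval=0.01):
    """Poll the mock until it records a call, failing after `timeout` seconds."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if mock.called:
            return
        time.sleep(interval)
    raise AssertionError(f"mock was not called within {timeout} s")

# Usage: replace a bare `assert cb.called` with a bounded wait.
cb = Mock()
threading.Timer(0.02, cb).start()   # simulate an asynchronous state change
wait_until_called(cb, timeout=1.0)
```

The wait makes the test tolerant of scheduling jitter while still failing deterministically if the callback never arrives.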

Steps to Reproduce (for bugs)

  1. Run the test suite repeatedly until a failure appears
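A brute-force way to do step 1 (paths taken from the tracebacks above; if the pytest-repeat plugin is installed, `pytest --count=200 -x` achieves the same more conveniently):

```shell
# Re-run the two flaky test modules until one run fails.
for i in $(seq 1 50); do
    echo "--- run $i ---"
    pytest lightpath/tests/test_gui.py lightpath/tests/test_path.py || break
done
```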

Context

Probably low priority: these failures mainly add noise to the pcds-envs unit test suite.

Your Environment

pcds-5.5.0
