
Fix test failing due to fx files chosen differently on different OSes #1169

Merged 1 commit on Jun 11, 2021



@valeriupredoi commented on Jun 10, 2021


The strategy is still to fix #1159 properly (one way or another), but so that we do not have failing tests during the current release procedure, I am skipping the two failing tests, conditioned on version < 2.4.0 (if we have not fixed the issue by then, those two tests will start failing again). I copied the functionality of those two tests into a couple of `_incomplete` tests that run everything else that does not fail and that we should still test for (once the issue is fixed, we should remove those and reinstate the complete, currently skipped tests).
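The skip-until-fixed pattern described above could be sketched roughly as follows, using stdlib `unittest.skipIf`; the version numbers, test names, and class layout here are illustrative stand-ins, not the actual code from this pull request.

```python
import unittest

CURRENT_VERSION = (2, 3, 0)  # stand-in for the package's own version
FIXED_IN = (2, 4, 0)         # release by which the issue should be fixed


class TestFxFileSelection(unittest.TestCase):
    @unittest.skipIf(
        CURRENT_VERSION < FIXED_IN,
        "fx files are selected differently per OS; see the linked issue",
    )
    def test_fx_selection_full(self):
        # The complete test, skipped for now; once the version reaches
        # the gate, it runs again (and fails, if the issue is unfixed).
        pass

    def test_fx_selection_incomplete(self):
        # Temporary copy covering everything that still passes on all
        # OSes; to be removed when the full test is reinstated.
        self.assertTrue(True)
```

The version gate means the skip expires automatically at release time, so the temporarily disabled test cannot be forgotten indefinitely.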

Related to (but does not close) #1159



@sloosvel left a comment


Looks fine to me as a temporary solution.

@valeriupredoi (Contributor, Author) commented

cheers @sloosvel 🍺 @zklaus you want to include this in the release, mate? πŸ‘


Successfully merging this pull request may close these issues.

Fx files are selected differently depending on what OS the code runs on