
Visualization tests with image regression testing #11433

Open
banesullivan opened this issue Jan 30, 2023 · 5 comments

banesullivan commented Jan 30, 2023

Describe the new feature or enhancement

PyVista has contract testing with MNE to make sure we don't break anything you all are using. I'm wondering whether we want to take this a step further with image regression testing, to further validate that we aren't changing the visualizations you all create here.

AFAIK, that isn't implemented here, but do let me know if I'm wrong.

Describe your proposed implementation

Now that pytest-pyvista is implemented, this should be straightforward to set up for the visualization tests. Though... when I looked at the tests here, it wasn't immediately obvious to me where/how PyVista is used in them.

https://github.com/pyvista/pytest-pyvista
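
For reference, here is a minimal sketch of what an image-regression test looks like with the plugin's verify_image_cache fixture (names per the pytest-pyvista README; the error thresholds and cache directory are configurable):

import pyvista as pv


def test_sphere_rendering(verify_image_cache):
    # verify_image_cache is the fixture provided by pytest-pyvista; every
    # Plotter.show() in the test is compared against a cached baseline image
    plotter = pv.Plotter()
    plotter.add_mesh(pv.Sphere(), color="tan")
    plotter.show()  # fails if the image error exceeds the allowed threshold

Baseline images live in a cache directory committed to the repo and can be (re)generated with the plugin's --reset_image_cache option.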

Describe possible alternatives

Or just leave it as is, which only validates API usage and not actual rendering results.

Additional context

No response

@larsoner
Member

IIRC our testing was too slow when we actually rendered everything, so we hacked our tests to SetVisibility(False) on almost all visuals. We could add an env var to control this on main, though, and then it would produce correct images...

@banesullivan
Author

> our testing was too slow when we actually rendered everything

Oh... oof, it may be good to run this before releases (for PyVista and/or MNE), but that slowdown would probably prove problematic then.

@mmagnuski
Member

@larsoner Could this be tested on the rendered docs then?

@banesullivan
Author

PyVista does provide a pyvista.compare_images method, which I recently used to compare different versions of built documentation images. My very rough script was:

import shutil
from tqdm import tqdm
from pathlib import Path
import os
import pyvista as pv
import warnings


# Baseline ("truth") images from the published docs vs. images from a fresh local build
truth_path = "/home/local/KHQ/bane.sullivan/Software/pyvista/pyvista-docs/_images/"
verify_path = (
    "/home/local/KHQ/bane.sullivan/Software/pyvista/pyvista/doc/_build/html/_images/"
)

# All images present in the baseline ("truth") docs
images = [
    f for f in os.listdir(truth_path) if os.path.isfile(os.path.join(truth_path, f))
]

# Newly built PyVista-generated PNGs to verify, excluding gallery thumbnails
new_images = [
    f
    for f in os.listdir(verify_path)
    if os.path.isfile(os.path.join(verify_path, f))
    and f.lower().startswith("pyvista")  # or f.lower().startswith('sphx')
    and f.lower().endswith(".png")
    and "_thumb" not in f
]
print(f"Number of images to compare: {len(new_images)}")

if os.path.exists("dump"):
    shutil.rmtree("dump")  # os.rmdir only works on empty directories
os.makedirs("dump")

n_warnings = 0
for image in tqdm(new_images):
    # Only compare images that exist in both the baseline and the new build
    if image in images:
        error = pv.compare_images(
            os.path.join(truth_path, image),
            os.path.join(verify_path, image),
        )

        # Rough heuristics for the error metric returned by pv.compare_images
        allowed_error = 1000.0
        allowed_warning = 500.0

        if error > allowed_error:
            # warnings.warn(
            #     f"{image} Exceeded image regression error of "
            #     f"{allowed_error} with an image error equal to: {error}"
            # )
            n_warnings += 1

            # Dump both versions side by side for manual inspection
            shutil.copyfile(
                os.path.join(truth_path, image),
                os.path.join("./dump", image.replace(".png", "-0.37.png")),
            )
            shutil.copyfile(
                os.path.join(verify_path, image),
                os.path.join("./dump", image.replace(".png", "-0.38.png")),
            )

        # elif error > allowed_warning:
        #     warnings.warn(
        #         f"{image} Exceeded image regression warning of "
        #         f"{allowed_warning} with an image error of "
        #         f"{error}"
        #     )
        #     n_warnings += 1

print(f"Total warnings: {n_warnings}")

@larsoner
Member

larsoner commented Feb 2, 2023

Yes in principle, but that would be way slower than just re-enabling visibility in our unit tests :)

Looking into it on my laptop, time pytest mne/viz/_brain (which should be the slowest part) takes 94 sec on main. If I tweak the tests to make mne/viz/backends/_pyvista.py:_hide_testing_actor just return instead of hiding the actors, it actually takes the same amount of time. This isn't a fair representation of what will happen on CIs, though, because I have a proper GPU. It might be worth trying a monkey-patch of this function in PyVista to see how long it takes CIs there. @banesullivan I could try this if you want. If it's promising and we want to keep it, we could add a _MNE_TESTING_SHOW_ACTORS env var to do this short-circuiting...
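
A rough sketch of what that short-circuit could look like (hypothetical, using the _MNE_TESTING_SHOW_ACTORS name and the _hide_testing_actor hook mentioned above; the real patch would be wired through MNE's usual config handling):

import os


def _hide_testing_actor(actor):
    # Hypothetical env-var gate: when _MNE_TESTING_SHOW_ACTORS is set,
    # leave actors visible so tests exercise real rendering and produce
    # images that are meaningful for regression comparison
    if os.getenv("_MNE_TESTING_SHOW_ACTORS", "").lower() in ("1", "true"):
        return
    actor.SetVisibility(False)  # current behavior: keep CI rendering cheap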
