
MEP001 - Integrating NiiVue as plotting backend for visualization requiring meshes and volumes #12030

Open
christian-oreilly opened this issue Sep 29, 2023 · 9 comments
@christian-oreilly
Contributor

Describe the new feature or enhancement

Following a discussion with @larsoner, I am proposing this "MNE-Python Enhancement Proposal" for discussion. The implementation of this work would be driven and supported by a 2-year supplement to a current NIH-R01 (#1RF1MH133701-01; PI: Chris Rorden). The submission deadline for the supplement is November 30th.

The parent R01 is on the NiiVue project and its Python wrapper IPyNiiVue. A crude first version of the objectives of the supplement (these may shift as the writing of the proposal progresses) is:

  • Aim 1: Improve MNE-BIDS and EEGNET to support BIDS derivatives datasets of EEG/MEG source data.
  • Aim 2: Integrate NiiVue as a backend for MNE-Python plotting functions.
  • Aim 3: Integrate NiiVue as a Plotly Dash component for easy reuse in Python web dashboards.

Aim 2 is the one directly relevant to this MEP; I list all three aims only because they may end up interacting with it.

Describe your proposed implementation

In short, through this MEP, I propose better integrating the NiiVue visualizer with MNE-Python for EEG/MEG processing by adding it to the supported backends for the plotting functions that work with volumes and meshes. I think this viewer has multiple advantages, the main one for me being its integration with web technologies, freeing MNE plotting functions from dependencies on C++ bindings (e.g., Qt), which often cause installation issues.

Describe possible alternatives

NiiVue could be integrated using the already established plotting backend abstraction layer (mne/viz/backends/*.py) and the IPyNiiVue approach. Alternatively, it could first be wrapped as a Plotly component, and Plotly support could be integrated more thoroughly. A small demo of Plotly integration was provided a while back: https://mne.tools/mne-dash/. I think we pushed this concept much further with the Quality Control Review board we coded for PyLossless, and these efforts could be combined to offer much better dashboard capabilities in MNE-Python.

A measure of success would be the ability to easily plot source results in virtualized environments like Google Colab, i.e., not with 20 lines of code loading volumes and meshes, co-registering them, and so on, but with something like the one-liner sketched below. This would make teaching, training, and research with MNE-Python significantly easier.
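As a purely illustrative sketch of that goal, the snippet below assumes a hypothetical "niivue" 3D backend name (it does not exist today); mne.viz.set_3d_backend and SourceEstimate.plot are existing MNE-Python APIs, and the file path is only a placeholder.

```python
import mne

# Hypothetical: select NiiVue as the 3D plotting backend once such a backend
# exists. "niivue" is an assumed name; currently only "pyvistaqt" and
# "notebook" are available through mne.viz.set_3d_backend.
mne.viz.set_3d_backend("niivue")

# Existing API: load and plot a source estimate. With a web-based backend,
# this could render inline in environments such as Google Colab.
stc = mne.read_source_estimate("sample_audvis-meg")  # placeholder path
brain = stc.plot(subject="sample", subjects_dir="subjects", hemi="both")
```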

Additional context

No response

@larsoner
Member

A somewhat related issue would be #11990, which I should maybe also convert to an MEP. FYI, after some brief preliminary discussion with @christian-oreilly, I suggested this "MEP" idea since it seems to be used successfully by other projects to help frame discussions of large enhancements/changes.

@agramfort
Member

Having a good web-based method would be awesome, yet do we know if it will play well with transparency when many objects are shown? Think plot_alignment, e.g., like here: https://mne.tools/stable/auto_tutorials/forward/25_automated_coreg.html#sphx-glr-auto-tutorials-forward-25-automated-coreg-py

Past attempts showed that this is where the pain starts. VTK automatically infers which object is in front and how transparency should operate. What may happen here is that any object behind a transparent surface becomes invisible, or something along those lines.

@christian-oreilly
Contributor Author

Thanks for the feedback @agramfort! I am not worried about transparency with NiiVue because the viewer has been developed explicitly to support co-registration of meshes and volumes for neuroimaging applications. If you'd like to have a peek, here is the demo page for this viewer: https://niivue.github.io/niivue/

A few potentially relevant demos:

Hopefully, these and the rest of the demos make a good argument that transparency and co-registration are well managed in NiiVue itself, so an MNE-Python wrapper should not struggle with them. Note that the full list of demos also shows a range of features that could eventually be useful for extending some MNE features. For example, I can think of the pesky error about intersecting meshes when computing BEM models. NiiVue has annotation capability (https://niivue.github.io/niivue/features/freesurfer.html) that could potentially be used to visualize and edit meshes and manually correct intersecting-mesh issues. The NiiVue project is in heavy development at this time, as we are at the start of the 3-year R01 grant, so MNE-Python needs could also inform the features being developed in NiiVue itself.

@agramfort
Member

agramfort commented Oct 1, 2023 via email

@larsoner
Member

larsoner commented Oct 2, 2023

@christian-oreilly from a brief discussion with @agramfort and @drammock this seems reasonable to try, so if you can get funding to do it that would be great! If you want to try playing around, you could open a WIP PR to have NiiVue added as a renderer starting with _AbstractRenderer, e.g.:

class _PyVistaRenderer(_AbstractRenderer):
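A minimal sketch of what such a subclass might look like, assuming a hypothetical _NiivueRenderer class; _AbstractRenderer is the existing base class in mne/viz/backends/_abstract.py, but the constructor arguments and method signatures below are assumptions that would need to be checked against the actual abstract interface:

```python
# Hypothetical sketch of a NiiVue renderer living in mne/viz/backends/.
# The import path for _AbstractRenderer is assumed; method names and
# signatures below are illustrative, not the actual abstract interface.
from mne.viz.backends._abstract import _AbstractRenderer


class _NiivueRenderer(_AbstractRenderer):
    def __init__(self, fig=None, size=(600, 600), bgcolor="black",
                 name=None, show=False, **kwargs):
        # Create/attach an ipyniivue widget here instead of a PyVista plotter.
        ...

    def scene(self):
        # Return the underlying figure/widget object.
        ...

    def mesh(self, x, y, z, triangles, color, opacity=1.0, **kwargs):
        # Convert vertices and faces to a NiiVue mesh and add it to the canvas.
        ...
```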

The stuff about checkboxes and layouts etc. would hopefully go away with #11990 so don't worry about those bits

@christian-oreilly
Contributor Author

christian-oreilly commented Oct 15, 2023

@larsoner If we are serious about the MEP structure for this, maybe we can take inspiration from the process outlined for PEPs (https://peps.python.org/pep-0001/); it looks like a nice text describing the whole process. I can make sure to go over this and document an MEP process for MNE as part of this work. Maybe we should establish the MEP process in a separate ticket so that the MEP structure can be debated separately? I'll wait for this project to be funded and kicked off before opening such a ticket, but I'll put it in my planning.

Other useful reference:

@larsoner
Member

If we are serious about the MEP structure for this, maybe we can take inspiration from the process outlined for PEP: https://peps.python.org/pep-0001/ Looks like a nice text describing the whole process.

I don't think we need to take it so seriously yet. Really my idea behind calling this a "MEP" is to make it clear that the scope is much larger than a typical ENH-style issue. Maybe in the future these will be more controversial and we'll need to set up some decision scheme / process beyond our standard one, but I'm okay with waiting until that day comes (and maybe YAGNI!).

@christian-oreilly
Contributor Author

@larsoner Fair enough!

@jbednar

jbednar commented Nov 17, 2023

Just so everyone interested in this issue is also aware of a parallel effort, please check out #12217, which focuses on stacked time series plots rather than meshes and volumes. Note that we're using HoloViews and Panel there rather than Plotly and Dash. I'm not sure whether Dash can include HoloViews/Bokeh plots cleanly, but Panel can easily include Plotly plots, so there would be a way forward if people wanted to make a web app that included both the mesh and volume plots from this proposal and the stacked time series plots from #12217.
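For context, combining the two in one web app is plausible because Panel can embed Plotly figures directly; here is a minimal sketch using existing panel and plotly APIs, with a toy figure standing in for an actual mesh/volume view:

```python
import panel as pn
import plotly.graph_objects as go

pn.extension("plotly")  # enable Plotly rendering in Panel

# Toy Plotly figure standing in for a NiiVue/mesh view (illustrative only).
fig = go.Figure(go.Scatter(x=[0, 1, 2], y=[0, 1, 0]))

# A Panel layout can mix Plotly panes with Bokeh/HoloViews components,
# so mesh/volume plots and stacked time series could share one dashboard.
app = pn.Column("## Combined dashboard (sketch)", pn.pane.Plotly(fig))
app.servable()  # serve with `panel serve app.py`
```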
