MEP001 - Integrating NiiVue as plotting backend for visualization requiring meshes and volumes #12030
A somewhat related issue would be #11990, which I should maybe also convert to a MEP. FYI, after some brief preliminary discussion with @christian-oreilly, I suggested this "MEP" idea since it seems to be used successfully by other projects to help frame discussions of large enhancements/changes.
Having a good web-based method would be awesome, but do we know whether it will play well with transparency across many objects? Think of plot_alignment, e.g., as here: https://mne.tools/stable/auto_tutorials/forward/25_automated_coreg.html#sphx-glr-auto-tutorials-forward-25-automated-coreg-py. Past attempts showed that's where the pain starts. VTK automatically infers which object is in front and how transparency should operate. What may happen here is that any object behind a transparent surface becomes invisible, or something along these lines.
Thanks for the feedback @agramfort! I am not worried about transparency with NiiVue because the viewer has been developed explicitly to support co-registration of meshes and volumes for neuroimaging applications. If you'd like to have a peek, here is the demo page for this viewer: https://niivue.github.io/niivue/ A few potentially relevant demos:
- saving of a scene as plain HTML (note that the mesh there is partly transparent to show the voxelized ROI): https://niivue.github.io/niivue/features/save.html.html
- also somewhat showing transparency (x-ray slider): https://niivue.github.io/niivue/features/clipplanes.html
- overlaying of tracts and sliced MRI: https://niivue.github.io/niivue/features/tracts.html
- shader options for meshes: https://niivue.github.io/niivue/features/meshes.html
- multiple synchronized views: https://niivue.github.io/niivue/features/sync.mesh.html

Hopefully, these and the rest of the demos make a good argument that transparency and co-registration are well managed in NiiVue itself, so an MNE-Python wrapper should not struggle with them. Note that the full list of demos also shows a range of features that could eventually be useful for extending some MNE features. For example, I can think of the pesky error about intersecting meshes when building BEM models. NiiVue has annotation capability (https://niivue.github.io/niivue/features/freesurfer.html) that could potentially be used to visualize and edit meshes and manually correct intersecting-mesh issues. The NiiVue project is in heavy development at this time, with us being at the start of a 3-year R01 grant, so MNE-Python needs could also inform the features being developed in NiiVue itself.

ok I have faith now :)
@christian-oreilly from a brief discussion with @agramfort and @drammock, this seems reasonable to try, so if you can get funding to do it, that would be great! If you want to try playing around, you could open a WIP PR to add NiiVue as a renderer starting with AbstractRenderer, e.g.: mne-python/mne/viz/backends/_pyvista.py, line 199 in 110947f
The stuff about checkboxes and layouts etc. would hopefully go away with #11990, so don't worry about those bits.
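To make the suggestion above concrete, here is a minimal sketch of what "adding NiiVue as a renderer starting with AbstractRenderer" could look like. All names here are illustrative assumptions, not MNE-Python's actual interface: the real contract is defined by the abstract renderer in mne/viz/backends, and a real implementation would dispatch to NiiVue through IPyNiiVue rather than collecting scene objects in a list.

```python
# Hypothetical sketch of a NiiVue backend following MNE-Python's
# backend-abstraction pattern (mne/viz/backends/). The _AbstractRenderer
# stand-in below is simplified; the real abstract class has many more
# methods (volumes, text, cameras, etc.).
from abc import ABC, abstractmethod


class _AbstractRenderer(ABC):
    """Simplified stand-in for MNE's abstract 3D renderer interface."""

    @abstractmethod
    def mesh(self, x, y, z, triangles, color, opacity=1.0):
        """Add a triangulated surface to the scene."""

    @abstractmethod
    def show(self):
        """Display the scene."""


class _NiiVueRenderer(_AbstractRenderer):
    """Hypothetical NiiVue-backed renderer (stub, not real IPyNiiVue code)."""

    def __init__(self):
        self._objects = []  # scene objects that would be handed to NiiVue

    def mesh(self, x, y, z, triangles, color, opacity=1.0):
        # A real implementation would build a NiiVue mesh object here;
        # this stub only records the request so the control flow is visible.
        self._objects.append(
            {"type": "mesh", "n_vertices": len(x), "color": color,
             "opacity": opacity}
        )
        return self._objects[-1]

    def show(self):
        # Real code would render in the notebook front end via IPyNiiVue.
        return f"scene with {len(self._objects)} object(s)"


renderer = _NiiVueRenderer()
renderer.mesh([0, 1, 0], [0, 0, 1], [0, 0, 0], [(0, 1, 2)],
              color="gray", opacity=0.5)
print(renderer.show())  # → scene with 1 object(s)
```

The point of the pattern is that plotting functions talk only to the abstract interface, so swapping PyVista for NiiVue would not require touching the calling code.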
@larsoner If we are serious about the MEP structure for this, maybe we can take inspiration from the process outlined for PEPs: https://peps.python.org/pep-0001/. It looks like a nice text describing the whole process. I can make sure to go over it and document an MEP process for MNE as part of this work. Maybe we should establish the MEP process in a separate ticket so that the MEP structure can be debated separately? I'll wait for this project to be funded and kicked off before opening such a ticket, but I'll put it in my planning. Other useful reference:
I don't think we need to take it so seriously yet. Really, my idea behind calling this a "MEP" is to make it clear that the scope is much larger than a typical
@larsoner Fair enough!
Just so everyone interested in this issue is also aware of a parallel effort, please check out #12217, which focuses on stacked time-series plots rather than meshes and volumes. Note that we're using HoloViews and Panel there rather than Plotly and Dash. I'm not sure whether Dash can include HoloViews/Bokeh plots cleanly, but Panel can easily include Plotly plots, so there would be a way forward if people wanted to make a web app that included both the mesh and volume plots from this proposal and the stacked time-series plots from #12217.
Describe the new feature or enhancement
Following a discussion with @larsoner, I am proposing this "MNE-Python Enhancement Proposal" (MEP) for discussion. The implementation of this work would be driven and supported by a 2-year supplement to a current NIH R01 (#1RF1MH133701-01; PI: Chris Rorden). The submission deadline for the supplement is November 30th.
The parent R01 is on the NiiVue project and its Python wrapper, IPyNiiVue. A crude first version of the objectives of the supplement (these may shift as the writing of the proposal progresses) is:
Directly relevant to this MEP is Aim 2. I provide the full list of three aims only because they may end up interacting with Aim 2.
Describe your proposed implementation
In short, through this MEP I propose better integrating the NiiVue visualizer with MNE-Python for EEG/MEG processing by adding it as a supported backend for the plotting functions that work with volumes and meshes. I think this viewer will have multiple advantages, the main one for me being its integration with web technologies, freeing MNE plotting functions from dependencies on C++ bindings (e.g., Qt), which often cause installation issues.
Describe possible alternatives
NiiVue could be integrated using the already established plotting backend abstraction layer (mne/viz/backends/*.py) and the IPyNiiVue approach. Alternatively, it could first be wrapped as a Plotly component, and Plotly support could be more thoroughly integrated. A small demo of Plotly integration was provided a while back: https://mne.tools/mne-dash/. I think we pushed this concept much further with the Quality Control Review board we coded for PyLossless. I think these efforts can be synergized to offer much better dashboard capability in MNE-Python.
A measure of success would be the ability to easily plot source results (i.e., not with 20 lines of code loading volumes and meshes, co-registering them, and so on, but with something like a one-liner) in virtualized environments like Google Colab. This would make teaching, training, and research with MNE-Python significantly easier.
Additional context
No response