FIX: Update README for dev install #132

Merged
merged 3 commits into widgetti:master on May 17, 2018

Conversation

larsoner
Contributor

Following the advice in #93 helped, so I figured it should be added to the README.

Now, when running jupyter notebook in the notebooks directory and running example.ipynb, I get stuck on the second line with the following warnings:

[I 12:32:55.718 NotebookApp] Adapting to protocol v5.1 for kernel 9bd2798f-e1f5-4669-9c5c-e8e10cb28878
[W 12:32:56.202 NotebookApp] 404 GET /static/jupyter-threejs.js?v=20180517123250 (127.0.0.1) 20.48ms referer=http://localhost:8889/notebooks/example.ipynb
[W 12:32:56.273 NotebookApp] 404 GET /static/three.js?v=20180517123250 (127.0.0.1) 0.96ms referer=http://localhost:8889/notebooks/example.ipynb

@maartenbreddels any ideas on that one? My jupyter nbextension list gives:

Known nbextensions:
  config dir: /home/larsoner/.jupyter/nbconfig
    notebook section
      ipyvolume/extension  enabled 
      - Validating: OK
      jupyter-js-widgets/extension  enabled 
      - Validating: OK

This is with a pip --user-installed Jupyter (IPython 6.3.1) on Ubuntu 18.04 with system Python 3.6.5. I can try removing and upgrading the stack if you think that would help.

@maartenbreddels
Collaborator

Hi Eric,

thanks for your (first) contribution! Actually, ipywebrtc and pythreejs are also widget dependencies now (pythreejs provides the three.js file). However, on notebook 5.3+ the extensions are automatically installed and enabled when these packages are pip-installed, so you may want to try that out. I wouldn't ever use pip install --user anymore; use virtual envs or conda instead. --user-installed packages are seen by all Python environments, which is a Bad Thing :)

cheers,

Maarten

@larsoner
Contributor Author

Indeed, uninstalling/reinstalling ipywebrtc and pythreejs, then running jupyter nbextension enable --py --user pythreejs (and the same for ipywebrtc), made the examples work!

Do you think the extra two nbextension enable lines should be added, too?
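
For reference, here is a sketch of what those extra lines could look like in the README (exact wording and placement in the merged version may differ):

```
# Extra enables for the widget dependencies; per the discussion above these
# appear to be needed only on notebook 5.2 and below (5.3+ auto-enables
# pip-installed extensions).
jupyter nbextension enable --py --user pythreejs
jupyter nbextension enable --py --user ipywebrtc
```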

Don't worry, when I use conda I set ENABLE_USER_SITE = False in site.py :)

@maartenbreddels
Collaborator

Good to hear! Yes, please put that in the instructions (with a note that it is only needed for notebook 5.2 and below).
I didn't even know you could disable that, thanks for sharing!

@larsoner
Contributor Author

Okay I added those lines and unified where this is mentioned, instead of having it in several places.

@maartenbreddels merged commit 408f365 into widgetti:master on May 17, 2018
@maartenbreddels
Collaborator

Thanks! Hope to see more ;)

@larsoner
Contributor Author

Hope to see more ;)

Be careful what you ask for :) We have a GSoC student looking into ipyvolume for 3D viz of brain meshes, basically we want this:

http://pysurfer.github.io/auto_examples/plot_meg_inverse_solution.html#sphx-glr-auto-examples-plot-meg-inverse-solution-py

Our student has mentioned that there does not seem to be control over:

  • camera scaling factor
  • light direction and color
  • options for shading

And at a more advanced level we will need:

  • vertex value -> colormap interpolation
  • multiple overlays on the same mesh (most likely via multiple translucent mesh replicates with proper WebGL rendering modes set)

Perhaps these exist somewhere at the pythreejs level, or we missed them at the ipyvolume level. But in any case, yes, we will probably make some upstream contributions in the coming months to try to get our viz where we want it to be :)

@larsoner
Contributor Author

(Actually I looked again and the student said they are looking for some of those options, not necessarily that they don't exist.)

@maartenbreddels
Collaborator

Ah yes, @choldgraf mentioned that to me, excellent, and looking forward to all this!
Lighting is definitely something that needs improvement. ipyvolume is currently more integrated with pythreejs; for instance, the Camera and ShaderMaterial objects are now fully exposed. I think the same needs to happen with Lights.

A vertex value -> colormap mapping is possible by doing the transformation on the kernel side (and thus sending all RGB values, which is slower). However, I/we (@SylvainCorlay) plan to move the scales/colormaps that are in bqplot into a separate library, which ipyvolume can then use. This means we can change the colormap or its range/limits on the frontend, and it will be interactive/fast/snappy.
Multiple overlays should already be possible now: since I'm exposing the ShaderMaterial, all transparencies/blending rules can be set.
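
To make the kernel-side route above concrete, here is a minimal sketch (it assumes matplotlib for the colormap and that plot_trisurf accepts a per-vertex RGB array via its color argument; the mesh data is made up):

```python
import numpy as np
import matplotlib.cm as cm
import ipyvolume as ipv

# Made-up mesh: vertex positions, triangle indices, and one scalar per vertex.
x, y, z = np.random.rand(3, 1000).astype(np.float32)
triangles = np.random.randint(0, 1000, size=(500, 3))
scalars = z  # pretend the per-vertex data is just the height

# Kernel-side colormap lookup: normalize, map to RGB, and send the full
# per-vertex color array to the frontend (the "slower" path described above).
normed = (scalars - scalars.min()) / (scalars.max() - scalars.min())
colors = cm.viridis(normed)[:, :3]  # (N, 3) RGB, alpha dropped

ipv.figure()
mesh = ipv.plot_trisurf(x, y, z, triangles=triangles, color=colors)
ipv.show()
```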

Also, I'm regularly thinking about being able to inject arbitrary shader snippets and custom attributes (vertex values) into the mesh and scatter objects, to have more programmable coloring options. Ideas on that are of course welcome.

@larsoner
Contributor Author

A vertex value -> colormap is possible by doing the transformation at the kernel side (and thus sending all rgb values, which is slower).

When you say "sending", do you mean uploading a varying vec4-style vertex buffer array from Python to JavaScript/WebGL via serialization, or something else?

I ask because we often want to interactively visualize data as a function of time (with a slider) for ~300k vertices and many time points (e.g., 1 sec of data at 1000 Hz), so this will hurt us. We usually have data actually defined at just ~20k of these points, alongside a sparse (300k, 20k) array that does the upsampling to the high-resolution mesh. So for sufficient speed I'm assuming that we will need to do this upsampling step in JavaScript. And if we're already doing that in JS, we can live with doing the LUT translation ourselves while we are at it.
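
For context, a rough sketch of the Python-side step I mean (made-up shapes and a toy upsampling operator; our real operator comes from the source-space data):

```python
import numpy as np
from scipy import sparse

n_high, n_low = 300_000, 20_000

# Toy sparse (n_high, n_low) upsampling operator: each high-resolution vertex
# just copies the value of one (random) low-resolution source point.
rows = np.arange(n_high)
cols = np.random.randint(0, n_low, size=n_high)
upsampler = sparse.csr_matrix(
    (np.ones(n_high), (rows, cols)), shape=(n_high, n_low))

data_low = np.random.rand(n_low)   # the ~20k actual data values (one time point)
data_high = upsampler @ data_low   # ~300k per-vertex values for the display mesh
# data_high would then go through the colormap/LUT step before rendering.
```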

At least that's how I think of it, let me know if you see a better, preferably less annoying solution. (I don't know how easy it will be to mix these Python and JS operations while keeping pythreejs/ipyvolume happy, either.)

Also, I'm regularly thinking about being able to inject arbitrarily shader snippets, and custom attributes (vertex values) into the mesh and scatter objects to have more programmable coloring options. Ideas on that are welcome ofcourse.

Over in vispy this is handled by letting users apply arbitrary transforms at certain points for each object. For example, there is a color_transform that is a pass-through by default but can be altered to do a colormap lookup:

https://github.com/vispy/vispy/blob/master/vispy/visuals/mesh.py#L35

@maartenbreddels
Collaborator

When you say "sending" you mean uploading a varying vec4 type of vertex buffer array from Python to JavaScript / WebGL via serialization, or something else?

Yes. Instead of sending all the RGB values, you would just send the min/max values or the colormap (a few bytes over the wire, or it may even all happen at the frontend) and let the shader do all the work.

I ask because we often want to interactively visualize data as a function of time (with a slider) for ~300k vertices and many time points (e.g., 1 sec of data at 1000 Hz), so this will hurt us.

300k vertices is possible; sending N times that may get uncomfortable for the browser, so it might need a custom solution. However, before going there I think it's best to try to squeeze the best performance out of ipyvolume and the notebook.

That vispy solution may be something to look into!
