Mesh Visualization #896
Did this work at some point previously? We could probably show the same problem with one of our examples. Do you know if this worked at all in a previous version?
That's my second try with vispy, so I don't know about previous states ;) I got the model from turbosquid.
Yeah, and in case we need a simpler example:
I get the same behavior there. @campagnola any idea what might be causing this?
@SimonDanisch, thanks for pointing out that the tutorials are broken! The screenshot you posted looks to me like the mesh is being drawn with face culling enabled; make sure culling is disabled when the mesh is drawn. @Eric89GXL, your example looks correct on my machine. Can you verify that culling is being disabled in your MeshVisual (this has been tampered with recently, so check with latest master)?
That looks like mine... While we're at it, a Python beginner question: why are my vispy scripts only working in the example folder?
Ok, I can enable culling and it does not generate the effect you see (so the normal vectors are correct for this mesh). However, if I disable the depth test (search for 'depth_test' in visuals/mesh.py), then I see the same artifact you have. So you are correct that this is a depth-testing issue. The question then is why it is only broken on some platforms.
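A quick way to sanity-check the winding/culling interaction described above is to test, for a convex mesh centered at the origin, whether every face's winding produces an outward-pointing normal. This is a sketch in plain NumPy (`inconsistent_faces` is a hypothetical helper, not part of vispy):

```python
import numpy as np

def inconsistent_faces(vertices, faces):
    """For a convex mesh centered at the origin, return indices of faces
    whose winding makes the face normal point inward (toward the origin).
    Those are the faces that back-face culling would wrongly discard."""
    tris = vertices[faces]                      # (n_faces, 3, 3)
    normals = np.cross(tris[:, 1] - tris[:, 0], tris[:, 2] - tris[:, 0])
    centroids = tris.mean(axis=1)
    # an outward normal has a positive dot product with the centroid direction
    dots = np.einsum('ij,ij->i', normals, centroids)
    return np.where(dots < 0)[0]
```

If this returns a non-empty array for a convex model, enabling culling will punch holes in it regardless of depth testing. For non-convex meshes a real check would need winding propagation across shared edges; this sketch only covers the convex case.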
@SimonDanisch Python has an order in which it looks at directories to import things. When you run a script, the directory containing that script is searched first, so inside the vispy examples folder `import vispy` picks up the local source tree, while from anywhere else it picks up whatever copy of vispy is installed (if any). You can always check where something is being imported from by doing:

>>> import vispy
>>> vispy
<module 'vispy' from '/Users/larsoner/custombuilds/vispy/vispy/__init__.pyc'>

The short answer to your question is that you should be able to run the examples from anywhere, as long as vispy itself is installed.
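A small sketch of how to check which copy of a package Python would pick up, without actually importing it (`which_module` is an illustrative helper, not a vispy utility):

```python
import importlib.util

def which_module(name):
    """Report the file a module would be loaded from, without importing it."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

# Python walks sys.path in order; when you run a script, the script's own
# directory is searched first, so a local source tree shadows an installed
# copy of the same package.
print(which_module("json"))  # e.g. /usr/lib/python3.x/json/__init__.py
```

Running this from inside and outside a source checkout shows immediately which copy shadows which.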
@campagnola I'm not sure why it would be platform dependent. I'll take a look on my Linux machine and see if it's fixed there tomorrow.
Thanks a lot for the explanation!
I can confirm this works on Linux, so it does appear to be platform-dependent somehow. Argh...
I think I am running into the same problem and I have a hypothesis: as I understand it, the normals may need to be transformed by the inverse transpose of the model-view matrix. I haven't found a way to quickly test this because I haven't found how to compute an inverse transpose using the vispy transforms pipeline.
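The inverse transpose in the hypothesis above can be computed directly with NumPy, outside the vispy transform pipeline (a sketch, not vispy API):

```python
import numpy as np

def normal_matrix(model_view):
    """Matrix that transforms normals, given a 4x4 model-view matrix.

    Normals must be multiplied by the inverse transpose of the upper-left
    3x3 block; using the model-view matrix itself skews the normals under
    non-uniform scaling, which breaks lighting."""
    return np.linalg.inv(model_view[:3, :3]).T
```

For pure rotations the inverse transpose equals the matrix itself, which is why this particular bug only shows up once scaling or shearing is involved.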
Maybe the normal computation is simply wrong. Could you check with the glumpy normal-computation code instead? It's available at https://github.com/glumpy/glumpy/blob/master/glumpy/geometry/normals.py
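For comparison, the usual area-weighted per-vertex normal computation (the same idea as the glumpy code linked above) fits in a few lines of NumPy. This is an independent sketch, not the glumpy or vispy implementation:

```python
import numpy as np

def vertex_normals(vertices, faces):
    """Area-weighted per-vertex normals for a triangle mesh.

    vertices: (n, 3) float array, faces: (m, 3) int array."""
    tris = vertices[faces]
    # un-normalized face normals; the cross-product length is twice the
    # triangle area, which provides the area weighting for free
    fn = np.cross(tris[:, 1] - tris[:, 0], tris[:, 2] - tris[:, 0])
    vn = np.zeros_like(vertices)
    for i in range(3):
        np.add.at(vn, faces[:, i], fn)   # accumulate onto each corner vertex
    # normalize, guarding against zero-length sums
    lengths = np.linalg.norm(vn, axis=1, keepdims=True)
    return vn / np.maximum(lengths, 1e-12)
```

Comparing the output of something like this against the normals stored in the .obj is a quick way to rule the computation in or out.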
@rougier The normals computed by vispy look correct.
Are the normals inside the .obj file, or are they computed by vispy?
I am comparing the normals in the .obj to the ones computed by vispy and they are the same. I've made a gist with the code I'm using.

Without culling I get heavy artifacts (screenshot omitted). If I show the normals in the fragment shader, they look reasonable (screenshot omitted).

Now, if I enable cull_face for the mesh (it defaults to False), this looks much better, but there are still some artifacts depending on the viewpoint. Finally, here is the smooth rendering I get with cull_face=True; for some reason, this doesn't look like per-fragment lighting (screenshots omitted).

I am not sure why enabling culling reduces the number of artifacts. Is it that depth testing isn't working correctly, and culling removes some of the faces that should have been removed by the depth test? Regarding the lighting issue, it looks like some of the varyings passed to the fragment shader are not interpolated correctly.
I fixed it for my case by requesting a 24-bit depth buffer and using a smaller value for far.
@julienr so the normal calculation seems correct, and it was a depth buffer + cull_face issue?
@Eric89GXL yes. Mostly a depth buffer issue (cull_face is somehow a workaround for the depth buffer issue). Maybe vispy should default to a 24-bit depth buffer on OSX and use a slightly smaller value than 1000000.0 for far by default? Or figure out the far value somewhat automatically from the scene?
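The 24-bit suggestion can be sanity-checked with a back-of-the-envelope calculation. For a standard perspective projection, the smallest world-space depth step resolvable at distance z is roughly z²·(far − near)/(near·far·2^bits). A sketch (`depth_resolution` is an illustrative helper, not vispy API):

```python
def depth_resolution(z, near, far, bits):
    """Approximate world-space size of one depth-buffer step at distance z,
    for a standard perspective projection quantized to 2**bits levels."""
    return z * z * (far - near) / (near * far * 2 ** bits)

# an object 1000 units away, with near=1 and far=1000000.0:
print(depth_resolution(1000, 1.0, 1e6, 16))   # ~15 units: massive z-fighting
print(depth_resolution(1000, 1.0, 1e6, 24))   # ~0.06 units
print(depth_resolution(1000, 10.0, 1e6, 16))  # ~1.5: pushing near out helps too
```

By this estimate, when far >> near the result barely depends on far at all; the precision is dominated by the near plane and the bit depth, so more bits (or a larger near) helps far more than a smaller far.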
It seems like we could try to infer the right camera depth value from the depths of the objects in the scene, but I wonder if it would be fragile, or perhaps slow, e.g. if objects are moved via transforms often.
I definitely think this is something that should just work. But you're right that this is more complicated than it looks.
@almarklein any ideas for how to improve the depth buffer?
Better defaults, perhaps. We should certainly aim for a default depth buffer of at least 24 bits.
@astrofrog Yes definitely
@astrofrog PR? :)
I saw that many issues can be fixed by setting the depth buffer size explicitly.
… 16 bits depth (see vispy#1174, vispy#1175, vispy#896)
Hi,
the following example isn't working (snippet omitted).
This definitely looks like wrong normals.
Is this a similar problem to the one in #892?
Best,
Simon