Support WebGL2 and GLSL 3.00 in the scheme suggested by Mr Doob #13701
Conversation
I would still prefer to go for the approach discussed in #12250. As currently the WebGL(1)Renderer is the stable one, and WebGL2Renderer will be unstable for a while, I think it's better to keep untouched everything we can on WebGLRenderer and move the conversion to the new one.
@fernandojsg I'm kind of curious to see how much we can simplify this PR.
The scheme implemented here is in line with "Include some kind of postprocessing of the whole shader like https://github.com/Jam3/glsl-100-to-300 so this could help also converting users' WebGL1 shaders.", except that the post-processing goes from 3.00 to 1.00. I wondered a bit why the suggestion was to go from 3.00 to 1.00, but the good news there is that the migration path is smoother, as only shader code we take the time to rewrite has to deal with 3.00, and everything else can stay at 1.00.
src/renderers/WebGLRenderer.js
Outdated
```diff
@@ -65,7 +65,9 @@ function WebGLRenderer( parameters ) {
 	_antialias = parameters.antialias !== undefined ? parameters.antialias : false,
 	_premultipliedAlpha = parameters.premultipliedAlpha !== undefined ? parameters.premultipliedAlpha : true,
 	_preserveDrawingBuffer = parameters.preserveDrawingBuffer !== undefined ? parameters.preserveDrawingBuffer : false,
-	_powerPreference = parameters.powerPreference !== undefined ? parameters.powerPreference : 'default';
+	_powerPreference = parameters.powerPreference !== undefined ? parameters.powerPreference : 'default',
+	_webglversion = parameters.webgl2 !== undefined ? 'webgl2' : 'webgl';
```
Instead of adding a new parameter, can we try doing this for now?
var canvas = document.createElement( 'canvas' );
var context = canvas.getContext( 'webgl2' );
var renderer = new THREE.WebGLRenderer( { canvas: canvas, context: context } );
Then, inside the renderer we can do:
var isWebGL2 = _context instanceof WebGL2RenderingContext;
Passing a context created outside does work already. And you don't even have to pass a canvas. A canonical check for isWebGL2 in one place, presumably in WebGLRenderer, would be good. (How do we proceed? Would you prefer a fresh pull request, or is it okay for me to push a new commit to this branch and you pull them? Or is this close enough to be pulled, and we can fix issues over time?)
> How do we proceed? Would you prefer a fresh pull request or it is okay for me to push a new commit to this branch and you pull them? Or this is close enough to be pulled, and we can fix issues over time?
Yeah, push changes to this branch.
I'm trying to see what's the minimum amount of changes needed in the current code to make this work.
And I now remember why those isWebGL2 tests are not as simple as (_context instanceof WebGL2RenderingContext): it did not work on browsers that don't support WebGL2. Also, the renderer could potentially have a method called isWebGL2 or such, but the other classes such as WebGLExtensions.js, WebGLTextures.js, etc. only receive the actual context object, so they still have to check it themselves anyway. (And actually, it'd be very helpful if you agree to keep the webgl2 parameter to WebGLRenderer. It won't get in the way of others, right?)
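A guarded form of that check avoids the ReferenceError on browsers where WebGL2RenderingContext is undefined. A minimal sketch (the helper name is illustrative, not three.js API):

```javascript
// Check for WebGL2 without throwing on browsers where
// WebGL2RenderingContext does not exist: guard the instanceof
// test behind a typeof check. (Hypothetical helper, not part of three.js.)
function isWebGL2( context ) {

	return typeof WebGL2RenderingContext !== 'undefined' &&
		context instanceof WebGL2RenderingContext;

}
```

On a browser without WebGL2 the typeof guard short-circuits and the function simply returns false instead of throwing.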
I pushed a commit with the depth_frag error fixed and just a cosmetic change to make the isWebGL2 check the same everywhere.
```diff
@@ -32,11 +34,11 @@ void main() {
 	#if DEPTH_PACKING == 3200
-		gl_FragColor = vec4( vec3( 1.0 - gl_FragCoord.z ), opacity );
+		gl_FragColor = vec4( vec3( gl_FragCoord.z ), opacity );
```
Is the `1.0 -` removal intentional?
Wow. Definitely not. Thanks for spotting.
@mrdoob so with the current approach on this PR we'll have webgl1 and webgl2 on the main renderer. Is that the desired approach instead of having two different renderers?
I think the change will be smaller if we convert from GLSL 1.0 to GLSL 3.0.
Ahh, using #define. It just didn't occur to me to use that! As you can see, the only changes to support WebGL2 features so far are setRenderTarget and (partial combinations of) some other types for the frame buffer. (Just those were what we needed for our project.) For that, it was good not to have to use WebGL2Renderer just for that. As we keep adding those "little changes" to support more WebGL2 features, there may be a point where it starts to look complicated. But then, it is going to be not only WebGLRenderer.js but many other files such as WebGLTextures.
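For illustration, the #define trick can be sketched roughly like this: prepend a prefix so legacy GLSL 1.00 source compiles under `#version 300 es`. (The prefix contents here are an assumption for illustration; the prefix three.js actually builds differs in detail.)

```javascript
// Sketch of the #define-based GLSL 1.00 -> 3.00 upgrade: a prefix maps
// legacy keywords onto their 3.00 equivalents at preprocessing time.
// (Assumed prefix contents, not the exact three.js implementation.)
var prefix300 = [
	'#version 300 es',
	'#define attribute in',
	'#define varying out',
	'#define texture2D texture'
].join( '\n' ) + '\n';

// Example legacy (GLSL 1.00) vertex shader, unchanged by the approach:
var legacyVertexShader = [
	'attribute vec3 position;',
	'varying vec2 vUv;',
	'void main() { gl_Position = vec4( position, 1.0 ); }'
].join( '\n' );

var upgradedSource = prefix300 + legacyVertexShader;
```

The appeal is exactly what's noted above: the shader library itself stays untouched, and only the prefix differs between WebGL1 and WebGL2 programs.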
I opened a PR for the GLSL 1 to 3 conversion approach: #13717
I agree with @bhouston. It would be great if WebGLRenderer supported both. TBH, I'm not a fan of a separate WebGL2Renderer.
Ok, so we are all in the same boat for a single renderer then.
About messiness: I have (maybe) almost done initial WebGL 2.0 support in #13717, but it doesn't look too messy yet. And I suppose even if we add all the WebGL 2.0 features it won't be too messy. Personally I wanna keep considering both modularity and WebGL 2.0 in the renderer, and probably we might wanna think about that.

About auto WebGL 2.0 detection as the default: it'd be nice once Three.js + WebGL 2.0 is tested well and becomes stable. So doing it now is a bit too early. Maybe r93 (or later) would be good?
From what I have seen here, I have a sense that detecting WebGL2 and using it when possible is not what mr doob has in mind. It'd rather be to use WebGL2 only when specified explicitly.
How about making it part of the configuration options passed into WebGLRenderer, e.g. new THREE.WebGLRenderer( { webgl2: true } )? So for now it is off by default, but once our WebGL 2 support is fully baked and good, we make it the default. :)
Passing in a flag to use WebGL2 is in this pull request, in case you missed it.
It depends on how the future turns out, but automatic fallback means that the same application running in different environments may use different versions of the context. It might be okay, or it might even be desirable: but it could be pretty bad for developers. The difference is bigger than a few extensions.
As I wrote in #13717, I still prefer the GLSL 1 to 3 conversion because, for us, WebGL 2.0 support is still unstable and experimental. So I don't think it's a good idea to affect Three.js + WebGL 1.0 stability by changing shader code (even if we convert to GLSL 1 at runtime).
Just to get closure, and to do what is supposed to happen, I merged @takahirox's changes (which were a merge of his and mine) into this pull request. I believe the end result is identical, but this one has a slightly longer commit history. (And yes, it does work with our project out of the box: https://gist.github.com/yoshikiohshima/d25fb351cb226aa9aa1c6a62c553d43b So that is great for us.)
Well, I didn't want you to remove the GLSL 3 to 1 conversion from this PR. We are no longer able to compare between "3 to 1" and "1 to 3" now.
I worked on it too a few days ago: https://github.com/DavidPeicho/three.js @mrdoob maybe you don't want to bloat the WebGLRenderer 1 with new features yet?
@DavidPeicho I don't know how @mrdoob and people decide what to do, but as @takahirox has made more changes, his PR would carry more weight? If that one gets merged, I'd say that you could make a PR that is meant to go on top of it. In my mind, too many ideas in one PR make it harder to understand, so mixing your changes and this one may not be what I would suggest. (And, as long as we get a better three.js in the end, I should not complain that the idea and code here was simply cannibalized without giving me github karma...)
Yes, obviously the proposed PR is a lot more robust than mine!
Should we close this in favor of #13717?
@yoshikiohshima okay! let us know.
Closing in favour of #13717. |
Another attempt to support WebGL2RenderingContext and GLSL 3.00. This is a follow-up to #13692.
A WebGLRenderer parameter may specify to call getContext() with the argument 'webgl2'. Without the parameter, the behavior is identical to before and compatible with existing code.
When webgl2 is specified, there is an option to use GLSL 3.00 by setting needsGLSL300 on material objects.
Multiple render targets can be specified from the client code.
Implementation:
The shared code in ShaderChunk and ShaderLib is written with "in/out" instead of "varying/attribute".
There is a function called convertToGLSL1() that takes such code snippets and replaces in/out with varying/attribute (and does a few other things).
extensions natively supported by WebGL2RenderingContext are handled.
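The in/out rewriting described above can be sketched roughly as follows (a deliberate simplification: the actual convertToGLSL1() in this patch also handles the "few other things", such as fragment outputs and indentation, which are omitted here):

```javascript
// Simplified sketch of convertToGLSL1(): rewrite GLSL 3.00 in/out
// qualifiers back to GLSL 1.00 attribute/varying. In a vertex shader,
// "in" means an attribute and "out" a varying; in a fragment shader,
// "in" means a varying. (Illustrative only; not the PR's full function.)
function convertToGLSL1( source, isVertexShader ) {

	if ( isVertexShader ) {

		source = source
			.replace( /^[ \t]*in /gm, 'attribute ' )
			.replace( /^[ \t]*out /gm, 'varying ' );

	} else {

		source = source.replace( /^[ \t]*in /gm, 'varying ' );

	}

	return source;

}
```

For example, a vertex-shader snippet `in vec3 position;` would come out as `attribute vec3 position;` under this scheme.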
(For our project, we have a GPGPU framework that uses GLSL 3.00 features such as texelFetch and new buffer types such as R32F. This patch allows integrating the framework with Three.js. It's been working well for our purpose, and hopefully it is useful for others attempting something similar.)