
OSX and GL2.1 and 64bit #330

Open
bilgili opened this issue Aug 31, 2016 · 3 comments

Comments

@bilgili
Contributor

bilgili commented Aug 31, 2016

Because of the integer texture sampler restrictions, we cannot use the same code path that we are using with the current GL4 renderer. Sampling of integer textures is only supported from GLSL 1.30 onwards.

For GL2.1 to work, the textures would have to be in normalized form, and this means a substantial amount of code change in TexturePool, Renderer and the fragment shader.
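To illustrate the restriction, here is a minimal sketch of the two sampler paths; the sampler and varying names are hypothetical, not Livre's actual shader code. GLSL 1.30 can sample an integer texture directly:

#version 130
uniform usampler3D volumeTexture;  // integer sampler, GLSL 1.30+
in vec3 texCoord;
out vec4 fragColor;

void main()
{
    // Returns the raw, unnormalized voxel value.
    uint raw = texture( volumeTexture, texCoord ).r;
    fragColor = vec4( float( raw ) / 255.0 );  // e.g. 8-bit data to grey
}

whereas GLSL 1.20, the baseline for GL2.1, only has float samplers, so the values must already be normalized on upload:

#version 120
uniform sampler3D volumeTexture;  // float sampler only
varying vec3 texCoord;

void main()
{
    // The value arrives already normalized to [0,1].
    float v = texture3D( volumeTexture, texCoord ).r;
    gl_FragColor = vec4( v );
}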

bilgili mentioned this issue Aug 31, 2016
@eile
Contributor

eile commented Aug 31, 2016

I'm not fully understanding: by integer and normalized, do you mean GL_TEXTURE_RECTANGLE vs. GL_TEXTURE_2D 'addressing', or do the GL2 texture values need to be normalized? I.e., what does the integer in integer textures refer to, and what does the normalized form refer to?

@bilgili
Contributor Author

bilgili commented Aug 31, 2016

Simply put, sampler3D in the shaders cannot be used with integer textures. With the latest changes we are uploading the absolute data values as textures, so that we can use the transfer functions as they are. I recommend reading up on the glTexImage3D texture formats: depending on the internal format, the texture values are either normalized when uploaded to GPU memory or kept as-is. We are uploading them as they are.
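For reference, a minimal sketch of the two upload paths (the variable names are hypothetical); the integer path requires GL3.0, while GL_LUMINANCE8 is the classic normalized format available on GL2.1:

// Integer path (GL3.0+): values are kept as-is and must be read
// with an integer sampler (usampler3D) in the shader.
glTexImage3D( GL_TEXTURE_3D, 0, GL_R8UI, width, height, depth, 0,
              GL_RED_INTEGER, GL_UNSIGNED_BYTE, voxels );

// Normalized path (works on GL2.1): 8-bit values are converted to
// [0,1] floats and read with a plain sampler3D.
glTexImage3D( GL_TEXTURE_3D, 0, GL_LUMINANCE8, width, height, depth, 0,
              GL_LUMINANCE, GL_UNSIGNED_BYTE, voxels );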

@hernando
Contributor

hernando commented Sep 1, 2016

Try adding

#extension GL_EXT_gpu_shader4 : enable

in the GLSL shaders using integer texture samplers.
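For example, a minimal sketch of a GLSL 1.20 fragment shader with the extension enabled (the sampler name is hypothetical); GL_EXT_gpu_shader4 adds the integer sampler types and overloads texture3D() for them, so the raw voxel values become accessible without requiring GLSL 1.30:

#version 120
#extension GL_EXT_gpu_shader4 : enable

uniform usampler3D volumeTexture;  // integer sampler from the extension
varying vec3 texCoord;

void main()
{
    // With the extension, texture3D() returns the raw, unnormalized value.
    unsigned int raw = texture3D( volumeTexture, texCoord ).r;
    gl_FragColor = vec4( float( raw ) / 255.0 );
}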
