set texture size and position #92
Comments
gl-react currently focuses on the "I want to make effects over content" use case rather than "I want to do 3D / polygon things". So the current limitation is that the whole vertex data and vertex shader part is hardcoded. It's not that it isn't solvable, it would just take a lot of time we don't have yet ;) It's also a bit challenging in the React Native architecture, with the bridge and the different threads.

One question about your feature request: why not do the work in the fragment shader, like it's done in https://github.com/gre/gl-react-image ? Is it less performant, though? (I don't have a good knowledge of how OpenGL works at a low level, but I'd guess it is, because it means uploading the whole texture to the GPU vs. uploading only the rectangle of the image you want.) Which basically means you are better off cropping and resizing your image before dealing with it (of course it depends on what your use case is). You can use ImageEditor.cropImage in React Native if you want to prepare your image at the proper crop/size.
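The gl-react-image approach boils down to remapping the normalized texture coordinate into a sub-rectangle of the texture before sampling, instead of uploading a pre-cropped image. A minimal sketch of that mapping in Python (the name `crop_uv` and its parameters are illustrative, not part of any gl-react API; in GLSL this would just be `origin + uv * size`):

```python
def crop_uv(uv, origin, size):
    """Map a normalized view coordinate (0..1) into a crop rectangle
    of the texture: origin=(0.25, 0.25), size=(0.5, 0.5) samples the
    centered half of the image. Mirrors a fragment-shader remap."""
    return (origin[0] + uv[0] * size[0],
            origin[1] + uv[1] * size[1])

# The view's center maps to the center of the crop rectangle:
print(crop_uv((0.5, 0.5), (0.25, 0.25), (0.5, 0.5)))  # (0.5, 0.5)
```

The trade-off discussed above is that the full image still gets uploaded to the GPU; the remap only changes which texels are read, which is why cropping beforehand (e.g. with ImageEditor.cropImage) can save upload bandwidth.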
I'm a total noob at GLSL, so I didn't realize what was possible in the fragment shader until recently, after I asked this question. I've figured out how to translate, scale, crop, and stretch textures independently in the fragment shader; you basically just mess around with the texture coordinates. A handy function crops around textures that are smaller than the view, and you could adjust its limits to dynamically crop the image.
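The snippet being referred to wasn't preserved in this thread, but the idea can be sketched: after translating/scaling the coordinate, anything that falls outside the texture's [0, 1] range is rendered transparent instead of sampled (in GLSL you would return `vec4(0.0)` there). A Python sketch with made-up names, mirroring what such a fragment-shader function would compute:

```python
def sample_clamped(uv, translate, scale, sample):
    """Apply a translate/scale to the view coordinate, then either
    sample the texture or return transparent black when the mapped
    coordinate lies outside the texture's [0, 1] range."""
    u = (uv[0] - translate[0]) / scale[0]
    v = (uv[1] - translate[1]) / scale[1]
    if not (0.0 <= u <= 1.0 and 0.0 <= v <= 1.0):
        return (0.0, 0.0, 0.0, 0.0)  # outside the texture: transparent
    return sample(u, v)

# A dummy "texture" that is solid red everywhere:
red = lambda u, v: (1.0, 0.0, 0.0, 1.0)
print(sample_clamped((0.9, 0.9), (0.5, 0.5), (0.25, 0.25), red))
# outside -> (0.0, 0.0, 0.0, 0.0)
```

Adjusting `translate` and `scale` per frame is what makes the texture appear to move and resize inside a fixed surface.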
And this Stack Overflow answer is really useful for fixing the aspect ratio of images whose dimensions don't match the view/surface, making it easy to use multiple images of any size. It's probably less performant, though, seeing as vertex shaders are meant to build shapes and set dimensions. Today I made an image blender where you can use swipe and pinch gestures to move linear and radial gradient alpha masks around. Moving linear gradients is super snappy, but stretching and altering the intensity of radial gradients is pretty laggy; I can probably do some optimizing, though. Anyway, it would be really great to expose as many APIs as possible :) I forked this repo and started attempting to add vertex shader support on Android.
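The usual aspect-ratio fix amounts to scaling the texture coordinate about the center by the ratio between the surface's and the image's aspect ratios, so the image is letterboxed ("contain") rather than stretched; coordinates pushed outside [0, 1] are then discarded as in the cropping function above. A sketch of the "contain" math with hypothetical names (this is the general technique, not necessarily the exact formula from the linked answer):

```python
def contain_uv(uv, view_aspect, image_aspect):
    """Scale the coordinate about the center (0.5, 0.5) so the image
    keeps its aspect ratio inside the view ('contain' behavior)."""
    r = view_aspect / image_aspect
    if r > 1.0:        # view is wider than the image: pad left/right
        su, sv = r, 1.0
    else:              # view is taller than the image: pad top/bottom
        su, sv = 1.0, 1.0 / r
    return (0.5 + (uv[0] - 0.5) * su,
            0.5 + (uv[1] - 0.5) * sv)

# Square image in a 2:1 view: the image spans view x in [0.25, 0.75].
print(contain_uv((0.25, 0.5), 2.0, 1.0))  # (0.0, 0.5)
```

Inverting the scale factors (shrinking the sampled range instead of expanding it) gives "cover" behavior, where the image fills the view and overflow is cropped.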
Yeah, eventually we should think about supporting vertex shaders. See also https://github.com/ProjectSeptemberInc/gl-react/issues/6 . For now, this can be solved by ImageEditor.cropImage or by GLSL fragment mapping. Doing it in a vertex shader is probably roughly the same performance as in a fragment shader: in both cases you have to upload the image, and the texture lookup is done on the GPU.
I highly vote for this. While I have no experience with iOS programming, I do have Linux/Android experience, so I can try to implement a VBO interface for react-native. Looking at https://developer.apple.com/library/ios/documentation/3DDrawing/Conceptual/OpenGLES_ProgrammingGuide/TechniquesforWorkingwithVertexData/TechniquesforWorkingwithVertexData.html it seems the iOS part shouldn't be much harder.
Right now, when a texture is loaded into a shader, it automatically fills the surface.
Looking at other OpenGL examples, it seems the standard way to set texture size/position is before the texture is loaded into the shader, but with gl-react-native I don't see a way to do this.
I'm not entirely sure what I'm doing: do I have to use a vertex shader to manipulate the texture size/position, or is this even possible in the current implementation?
If this is not trivial, can you point me towards the part of the codebase where textures are loaded natively?
Thanks.