
Add a way to texture meshes in Draw API #79

Closed

mitchmindtree opened this issue Mar 25, 2018 · 3 comments
mitchmindtree commented Mar 25, 2018

  • Provide some way for images to be accessible to the draw API.
  • Refactor draw.mesh() to support:
    • .colored_tris(tris)
    • .textured_tris(tex, tris)
    • .colored_points(pts) calls colored_tris internally.
    • .textured_points(pts) calls textured_tris internally.
    • .colored_indexed(verts, indices)
    • .textured_indexed(tex, verts, indices)
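The task list above can be sketched as a builder, roughly as follows. This is a hypothetical stand-in to illustrate the proposed method shapes only; the `Vertex`, `Texture`, and `Mesh` types here are placeholders, not nannou's actual implementation.

```rust
// Sketch of the proposed `draw.mesh()` surface. All types are stand-ins.

#[derive(Clone, Copy)]
pub struct Vertex {
    pub position: [f32; 3],
    pub color: [f32; 4],      // used by the `colored_*` methods
    pub tex_coords: [f32; 2], // used by the `textured_*` methods
}

pub struct Texture; // placeholder for e.g. Arc<vk::ImmutableImage<_>>

#[derive(Default)]
pub struct Mesh {
    vertices: Vec<Vertex>,
    indices: Vec<u32>,
    texture: Option<Texture>,
}

impl Mesh {
    /// Triangles given as vertex triples; colours come from the vertices.
    pub fn colored_tris<I>(mut self, tris: I) -> Self
    where
        I: IntoIterator<Item = [Vertex; 3]>,
    {
        for tri in tris {
            let base = self.vertices.len() as u32;
            self.vertices.extend_from_slice(&tri);
            self.indices.extend([base, base + 1, base + 2]);
        }
        self
    }

    /// As above, but sampling `tex` via each vertex's `tex_coords`.
    pub fn textured_tris<I>(mut self, tex: Texture, tris: I) -> Self
    where
        I: IntoIterator<Item = [Vertex; 3]>,
    {
        self.texture = Some(tex);
        self.colored_tris(tris)
    }

    /// Raw buffers, mirroring `colored_indexed` / `textured_indexed`.
    pub fn colored_indexed(mut self, verts: Vec<Vertex>, indices: Vec<u32>) -> Self {
        self.vertices = verts;
        self.indices = indices;
        self
    }

    pub fn vertex_count(&self) -> usize {
        self.vertices.len()
    }

    pub fn index_count(&self) -> usize {
        self.indices.len()
    }
}
```

The `*_points` variants from the list would then be thin wrappers that triangulate their input and call the `*_tris` methods internally.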
mitchmindtree (Member Author) commented:
It's not yet clear to me whether we should just rely on users providing an Arc<vk::ImmutableImage<vk::Format>> (perhaps behind some ImageRef type alias) to the above textured methods, or whether we should use an index into some other map of IDs to images.

One issue to consider on this topic is that images must be accessible via the fragment shader. This means providing samplers for each of the specified images along with some way of determining which sampler is being referred to in the shader code itself. If we allow users to pass pre-loaded images via the methods above, then we must support the case where the number of images changes during runtime. This suggests to me that we either need to:

  1. generate a new fragment shader each time a new set of images is seen and reconstruct the graphics pipeline,
  2. copy all the received images into a single Image2dArray, defeating the purpose of the users loading images onto the GPU in the first place, but allowing them to be indexed dynamically via the fragment shader, or
  3. create a new draw command for every new image (slow).

On the other hand, if we were to take ownership over the image creation process, we could load images directly into an Image2dArray ready for runtime indexing.

It's worth noting that the Image2dArray approach for dynamic image loading would be perfect in the case that many images are loaded at once, though would be quite inefficient in the case that many images were added or removed at runtime. This is because the entire Image2dArray would have to be re-loaded every time an image was added or removed.
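The re-load cost described above can be made concrete with a small sketch. This is not nannou or vulkano code; it uses plain `Vec<u8>` "images" as stand-ins for GPU layers, and an `uploads` counter in place of a real transfer, to show that a packed array must be rebuilt in full on every insertion or removal:

```rust
// Hypothetical sketch: a map of IDs to images backed by one packed layer
// array. Because all layers share a single contiguous allocation, any
// change to the set of images forces a full repack (re-upload).

use std::collections::BTreeMap;

type ImageId = u64;
type ImageData = Vec<u8>; // stand-in for one layer's pixel data

#[derive(Default)]
struct TextureMap {
    images: BTreeMap<ImageId, ImageData>,
    next_id: ImageId,
    uploads: usize, // counts full re-uploads of the packed array
}

impl TextureMap {
    fn insert(&mut self, img: ImageData) -> ImageId {
        let id = self.next_id;
        self.next_id += 1;
        self.images.insert(id, img);
        self.repack();
        id
    }

    fn remove(&mut self, id: ImageId) {
        if self.images.remove(&id).is_some() {
            self.repack();
        }
    }

    // Rebuild the contiguous layer array -- the expensive step described
    // above. Every insert/remove pays for *all* images again.
    fn repack(&mut self) -> Vec<u8> {
        self.uploads += 1;
        self.images.values().flatten().copied().collect()
    }
}
```

The up-front variant (loading many images at once, then never changing the set) pays this cost exactly once, which is why the Image2dArray approach suits batch loading but not churn.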

I wonder what other approaches exist for addressing dynamic numbers of images in shader code.

@mitchmindtree mitchmindtree changed the title Add texture::Map for supporting textures in Draw API Add a way to texture meshes in Draw API Aug 17, 2019
MacTuitui (Contributor) commented Aug 18, 2019

I would also consider FBOs in the process from the start (with multiple render targets/sources); they might impose some restrictions on what you can use.
[Edit] Also, looking at how I would want the API to work, I think something along the lines of draw.mesh().with_texture() and draw.mesh().with_textures() might be interesting to have. (I don't know exactly what to pass to those, but a reference to some image/buffer might be the most intuitive?)
It boils down to how you add your own shader in the end. Ideally I would want to swap one of the default shaders for my own and still get the same info in an intuitive way, where I would expect the textures to be available as samplers. It might mean a lot of work, though, to have this flexibility.
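The chainable shape suggested in this comment could look something like the following. `Draw`, `MeshBuilder`, and `TextureRef` are hypothetical stand-ins for illustration, not nannou's actual types:

```rust
// Loose sketch of a chainable `with_texture` / `with_textures` API.
// All types here are invented placeholders.

struct TextureRef(&'static str); // stand-in for a reference to a GPU image

#[derive(Default)]
struct MeshBuilder {
    textures: Vec<&'static str>,
}

impl MeshBuilder {
    /// Bind a single texture, which a custom fragment shader would then
    /// see as a sampler.
    fn with_texture(mut self, tex: TextureRef) -> Self {
        self.textures.push(tex.0);
        self
    }

    /// Bind several textures at once (multiple render sources).
    fn with_textures<I: IntoIterator<Item = TextureRef>>(mut self, texs: I) -> Self {
        self.textures.extend(texs.into_iter().map(|t| t.0));
        self
    }
}

struct Draw;

impl Draw {
    fn mesh(&self) -> MeshBuilder {
        MeshBuilder::default()
    }
}
```

The appeal of this shape is that swapping in a custom shader only changes how the bound samplers are consumed, not how they are attached to the mesh.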

mitchmindtree (Member Author) commented:

Closed via #484.

See the draw_texture* examples for reference.
