
Implement a low-level renderer #8

Closed
hecrj opened this issue Sep 5, 2019 · 7 comments · Fixed by #22
Labels: feature (New feature or request), help wanted (Extra attention is needed)

@hecrj
Member

hecrj commented Sep 5, 2019

Both the renderer in coffee and the ggez one in the examples use a high-level graphics abstraction. This makes it really easy to build a somewhat simple renderer, but it makes interesting features (like scrollables) complicated.

I would like to implement a renderer using a lower-level API. This renderer could accept different kinds of configuration, and also support widgets with many styling options.

I am familiar with wgpu, and supporting all the platforms natively and also the web sounds great! In any case, I think there are many interesting similar efforts that may be worth investigating and contributing to!

hecrj added the feature (New feature or request) and help wanted (Extra attention is needed) labels Sep 5, 2019
@lachlansneff

I'd like to help with this. I'm writing an application that is essentially a form of CAD, and I'd like to use iced as the GUI for it.

@olanod

olanod commented Sep 9, 2019

What would be the approach here: draw widgets in terms of the low-level API directly, or create higher-level 2D drawing abstractions first that are then used to draw the widgets? I don't know the wgpu API, but it might be a bit too low-level.

Some time ago I read discussions (like here) about this kind of stuff and the need for 2D drawing APIs. I have little idea about low-level graphics programming, but what I got from @nical, creator of lyon (btw, a dependency of ggez), is that games and UIs may have different needs in how they draw 2D.

iced would mainly be used for UIs, but it would still be good to consider both cases by allowing widgets that have different renderers to be composed, hopefully sharing common ground. For example, I'd like to create a map widget and render tiles on a 2D/3D surface. The kind of APIs needed seems closer to that of a game, but such a widget mixes in dialogs and is embedded in a normal UI. Creators of such widgets usually bundle a lot of code because they build everything in terms of the low-level API, when it seems several things could be shared with the UI rendering code (e.g. text!).

Anyway, maybe a good start could be something like lyon? Apparently it can render using the wgpu backend, though more layers might need to go in the middle. Nical's approach is more bottom-up, creating solid foundations; iced is starting from the top, with an opinionated but elegant design that UI developers are already used to and comfortable with. Now we are missing badass APIs that can fit in the middle, solve different use cases, and be used by widget developers in a composable way to create great experiences 😃

@hecrj
Member Author

hecrj commented Sep 9, 2019

What would be the approach here: draw widgets in terms of the low-level API directly, or create higher-level 2D drawing abstractions first that are then used to draw the widgets? I don't know the wgpu API, but it might be a bit too low-level.

The approach I have in mind is to build/reuse a flexible 2D renderer abstraction, backed first by wgpu. Widgets shouldn't have to deal with pipeline setup and shaders unless strictly necessary.

Some time ago I read discussions (like here) about this kind of stuff and the need for 2D drawing APIs,

The declarative command list approach is the one I had in mind. Widget::draw would only record renderer commands, completely side-effect free. Then, Renderer::flush would compare the last frame's commands with the new ones, diff them, and re-render only what is necessary.

This approach works really well with Iced, because it completely decouples the generation of commands (widgets and widget state) from the actual rendering operations (graphics backend). It fits naturally.

Additionally, a declarative, side-effect free approach makes everything easier to test, and provides a lot of information to the renderer to perform many optimizations and improve over time.
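To make the idea concrete, here is a minimal sketch of the command-list approach in plain Rust. The names `Command`, `Renderer`, and `flush` are illustrative only, not iced's actual API, and the diff is a naive positional one:

```rust
// Widgets would record values of this type instead of issuing draw calls.
#[derive(Clone, PartialEq, Debug)]
pub enum Command {
    FillRect { x: f32, y: f32, w: f32, h: f32, color: [u8; 4] },
    DrawText { x: f32, y: f32, content: String },
}

#[derive(Default)]
pub struct Renderer {
    last_frame: Vec<Command>,
}

impl Renderer {
    /// Accept a newly recorded frame, diff it against the previous one
    /// position by position, and return only the commands that changed
    /// (i.e. the ones that actually need re-rendering).
    pub fn flush(&mut self, frame: Vec<Command>) -> Vec<Command> {
        let dirty: Vec<Command> = frame
            .iter()
            .enumerate()
            .filter(|(i, cmd)| self.last_frame.get(*i) != Some(*cmd))
            .map(|(_, cmd)| cmd.clone())
            .collect();
        self.last_frame = frame;
        dirty
    }
}
```

A real renderer would diff more cleverly (e.g. keyed by widget), but even this naive version shows how side-effect-free draw calls hand the renderer enough information to skip unchanged work entirely.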

but what I got from @nical, creator of lyon (btw, a dependency of ggez)

Coffee uses lyon too! It's great, and it is definitely something we could use to generate meshes.

it would still be good to consider both cases by allowing widgets that have different renderers to be composed, hopefully sharing common ground. For example, I'd like to create a map widget and render tiles on a 2D/3D surface

I think it's a bit too early to see how this would work, but I don't see why it would be a problem to have different renderer abstractions as long as they use the same graphics backend, given a layer of coordination (i.e. a UI widget providing you access to 3D graphics).
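One way to picture that coordination layer: several renderer abstractions record passes into one shared backend, and a coordinating function decides their order. A hedged sketch; none of these names come from iced or wgpu:

```rust
// The shared graphics backend both abstractions target.
trait Backend {
    fn submit(&mut self, pass: &str);
}

// A toy backend that just records which passes were submitted, in order.
#[derive(Default)]
struct RecordingBackend {
    passes: Vec<String>,
}

impl Backend for RecordingBackend {
    fn submit(&mut self, pass: &str) {
        self.passes.push(pass.to_string());
    }
}

// A 2D UI renderer and a 3D map renderer, each with its own abstraction,
// both drawing through the same backend.
fn render_ui(backend: &mut dyn Backend) {
    backend.submit("ui");
}

fn render_map_tiles(backend: &mut dyn Backend) {
    backend.submit("map-3d");
}

// The coordination layer: 3D content first, then the UI on top of it.
fn render_frame(backend: &mut dyn Backend) {
    render_map_tiles(backend);
    render_ui(backend);
}
```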

Now we are missing badass APIs that can fit in the middle, solve different use cases, and be used by widget developers in a composable way to create great experiences

Indeed! This is where the fun begins!

@nical

nical commented Sep 9, 2019

If I may: for UI-type things (as for everything, really), it's important to look at what your main primitives are, i.e. what constitutes 90% of the things you'll be rendering.
In general, for UIs these are mostly text; axis-aligned (potentially rounded) rectangles filled with a simple pattern (solid color, image, gradient); borders; and maybe shadows. Lyon isn't the most straightforward or optimized way to deal with these simple (but important) primitives. I think lyon can help with some of the other 10% of fancier primitives, but I'd suggest designing around the common 90% first.
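That "common 90%" maps naturally onto a small primitive enum. A hedged sketch; all names here are invented for illustration, not taken from iced, lyon, or any real crate:

```rust
pub struct Rect {
    pub x: f32,
    pub y: f32,
    pub width: f32,
    pub height: f32,
}

// The simple fill patterns nical lists: solid color, image, gradient.
pub enum Fill {
    Solid([u8; 4]),
    Image { atlas_id: u32 },
    LinearGradient { start: [u8; 4], end: [u8; 4] },
}

pub enum Primitive {
    // Axis-aligned, optionally rounded rectangle with a fill and border.
    Quad {
        bounds: Rect,
        fill: Fill,
        corner_radius: f32,
        border_width: f32,
        border_color: [u8; 4],
    },
    Text {
        content: String,
        bounds: Rect,
        size: f32,
        color: [u8; 4],
    },
    Shadow {
        bounds: Rect,
        blur_radius: f32,
        color: [u8; 4],
    },
}
```

A renderer built around an enum like this can batch the overwhelmingly common quads and text into a handful of specialized pipelines, and fall back to a tessellator like lyon only for the fancier 10%.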

I'm super happy to see lyon used in more places in any case. Keep it up!

hecrj added this to the 0.1.0 milestone Sep 24, 2019
@dylanede

dylanede commented Oct 2, 2019

It's probably worth keeping pathfinder in mind as well.

hecrj mentioned this issue Oct 14, 2019
@hecrj
Member Author

hecrj commented Oct 15, 2019

I have implemented a first version of a very simple wgpu renderer in #22. As always, feedback is greatly appreciated.

I think exploring other options, like pathfinder and piet, and contributing to them could be a faster way to get a more feature-complete renderer while the wgpu one matures.

hecrj closed this as completed in #22 Oct 23, 2019
@nmsmith

nmsmith commented Dec 1, 2019

I'll just name-drop WebRender as a higher-level alternative to wgpu. It provides optimized 2D rendering while abstracting away the raw GPU concerns. You'd still need Pathfinder for the vector graphics, though.

ids1024 pushed a commit to ids1024/iced that referenced this issue Dec 23, 2022
feat: various improvements to window commands and event handling
CurryPseudo pushed a commit to CurryPseudo/iced that referenced this issue Apr 25, 2023
7 participants: @nical, @hecrj, @nmsmith, @olanod, @dylanede, @lachlansneff, and others