
Isometric support #1729

Open
DragonMoffon opened this issue Apr 26, 2023 · 6 comments

Comments

@DragonMoffon
Collaborator

DragonMoffon commented Apr 26, 2023

Isometric support has repeatedly come up on the Discord, so I'm putting this here as a reminder and checklist.

This requires:

  • Tilemap Loading
  • Correct Sprite Ordering
  • Logical Hitboxes

Some cool extra things would be:

  • Rotatable camera (See the game Don't Starve)
  • Shadows?? This is a bonkers idea I have; if anyone wants to pick it up, please do, and otherwise ignore this.
@pushfoo
Member

pushfoo commented Apr 27, 2023

tl;dr: I think we should focus on adding a good sprite-ordering abstraction instead of polishing isometric support.

We currently don't have an easy way to efficiently update sprite draw order, and it's important for more than just isometric games. For example, a more top-down game like Undertale also needs to draw sprites in order based on their bottom edge's Y value.

The most accessible option right now is calling SpriteList.sort, but it's very slow. Would the best way be to use a different shader with a SpriteList? My rudimentary understanding of GLSL is as follows:

  1. Shaders have access to the depth value somehow
  2. This may not work when alpha is involved
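To make the bottom-edge ordering concrete, here is a minimal sketch of the sort key. FakeSprite and draw_order are hypothetical stand-ins, not arcade's API; only the ordering rule is the point:

```python
from dataclasses import dataclass

@dataclass
class FakeSprite:
    # Stand-in for an arcade.Sprite; only the attributes the sort needs.
    name: str
    bottom: float  # y coordinate of the sprite's bottom edge

def draw_order(sprites):
    """Return sprites back-to-front: with y-up coordinates, a higher
    bottom edge means farther from the camera, so it draws first."""
    return sorted(sprites, key=lambda s: s.bottom, reverse=True)

sprites = [
    FakeSprite("player", 120.0),
    FakeSprite("tree", 300.0),
    FakeSprite("rock", 40.0),
]
print([s.name for s in draw_order(sprites)])  # ['tree', 'player', 'rock']
```

With real arcade sprites, I believe the equivalent is spritelist.sort(key=lambda s: s.bottom, reverse=True), which is exactly the per-frame sort that is too slow to rely on.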

@Cleptomania
Member

Cleptomania commented Apr 27, 2023

We have depth testing in Arcade 3.0 now, I thought? There is a depth parameter on Sprite that, to my knowledge, should be working.

I could be wrong about that and the shaders/SpriteList might not be updated yet, @einarf would know better as I believe he added the parameter. If this isn't working yet, I think that was the intended path forward.

It may be the case that this was left undone, with the plan to have a way to enable it optionally. Ordering based on the z-index can cause unnecessary performance problems: while it is obviously less problematic than sorting the SpriteList, using a depth buffer to sort the sprites is probably an unnecessary performance hit. Shader-wise, I think this would be about the heaviest operation the sprite shader performs.

@einarf
Member

einarf commented Apr 27, 2023 via email

@pushfoo
Member

pushfoo commented Apr 27, 2023

There is also an example in experimental.

On the development branch, run it with:

python -m arcade.experimental.sprite_depth

I've also made a follow-up PR linked above, and fixed some bugs in a related example (#1732).

While it is obviously less problematic than sorting the SpriteList, it's probably an unnecessary performance impact to use a depth buffer to sort the sprites

I agree that it's worth looking into optimized alternatives for specific common behaviors. However, we already have the feature partially implemented. It's worth considering ways to abstract it in a user-friendly way which limits performance impacts.

To my understanding, one way could be variants of SpriteList which wrap the original draw with gl.Context.enabled internally. The following names are ungainly examples, but they get the point across:

  • DepthSortingSpriteList
  • BottomEdgeSortingSpriteList

They also seem like acceptable short-term solutions compared to the SpriteList.sort approach, as long as we do the following in their docstrings:

  • Label them as best reserved for advanced users
  • Include warnings about performance

If there are better backing implementations, we can plan on replacing the insides later after the prototype is working with a clean API.
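As a sketch of the wrapper idea, the variant class would just bracket the parent's draw call with the context manager. FakeContext and FakeSpriteList below are stand-ins (arcade's real objects are gl.Context with its enabled() context manager, and SpriteList); only the subclass pattern is the point:

```python
from contextlib import contextmanager

class FakeContext:
    """Stand-in for arcade's gl Context; tracks which flags are enabled."""
    DEPTH_TEST = "depth_test"

    def __init__(self):
        self.flags = set()

    @contextmanager
    def enabled(self, flag):
        # Enable the flag for the duration of the with-block, then restore.
        self.flags.add(flag)
        try:
            yield
        finally:
            self.flags.discard(flag)

class FakeSpriteList:
    """Stand-in for SpriteList; draw() is where the GL call would go."""
    def __init__(self, ctx):
        self.ctx = ctx

    def draw(self):
        print("drawing with flags:", sorted(self.ctx.flags))

class DepthSortingSpriteList(FakeSpriteList):
    """Variant that enables depth testing only while drawing itself."""
    def draw(self):
        with self.ctx.enabled(self.ctx.DEPTH_TEST):
            super().draw()

ctx = DepthSortingSpriteList.__mro__ and FakeContext()
DepthSortingSpriteList(ctx).draw()  # depth test on only inside this call
assert ctx.flags == set()           # restored afterwards for other lists
```

The appeal of this shape is that other sprite lists drawn in the same frame are unaffected, which keeps the performance cost opt-in per list rather than global.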

@Cleptomania
Member

I think the only real performance concern I have about depth testing is: are there GPUs that don't have a hardware depth buffer? I don't think OpenGL actually makes any guarantee about whether a GPU has to implement something at the hardware level versus emulating it in software in the driver.

If a GPU has a hardware depth buffer (and most do; I'm worried about the edge cases here), then enabling depth testing by default probably has a negligible to literally nonexistent drawback. I'm not even sure how to answer this question unless @einarf has a better idea of what hardware support for this is like.

My GUESS would be that we are fine to enable it by default; even the Raspberry Pi 4, to my knowledge, has a hardware depth buffer (though there are some cases where you can't have a depth buffer depending on how many render targets you have, or their size; I can't remember the details, but I don't think it really affects us).

@DragonMoffon
Collaborator Author

The other issue with the depth buffer is that it currently does not support semi-transparent sprites.
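To spell out why semi-transparent sprites break under depth testing, here is a toy single-pixel pipeline (plain Python, not real GL code): a semi-transparent fragment still writes its depth, so a fragment behind it drawn afterwards is rejected by the depth test instead of being blended:

```python
def draw_fragment(framebuffer, depthbuffer, x, color, alpha, depth):
    """Toy one-pixel pipeline: depth test first, then alpha blend on pass."""
    if depth >= depthbuffer[x]:       # at or behind the stored depth
        return                        # discarded: no blending happens
    depthbuffer[x] = depth            # even alpha < 1 fragments write depth
    src = tuple(c * alpha for c in color)
    dst = tuple(c * (1 - alpha) for c in framebuffer[x])
    framebuffer[x] = tuple(s + d for s, d in zip(src, dst))

fb = {0: (0.0, 0.0, 0.0)}   # black background
db = {0: 1.0}               # far plane

# Draw a 50%-transparent red sprite NEAR the camera first...
draw_fragment(fb, db, 0, (1.0, 0.0, 0.0), 0.5, depth=0.2)
# ...then an opaque green sprite BEHIND it.
draw_fragment(fb, db, 0, (0.0, 1.0, 0.0), 1.0, depth=0.8)

# The green fragment was depth-rejected, so it never shows through the
# transparent red: we get red over black instead of red over green.
print(fb[0])  # (0.5, 0.0, 0.0)
```

This is why depth-tested transparency usually requires sorting the translucent sprites back-to-front anyway, which brings back the original ordering problem.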
