Dropping frames: Trying to use optimizations. Can't determine my issue #1130
Hi, thank you for using Ebiten! I'd like to know what kind of machine you are using (Linux and Windows). I tried your game on my MacBook Pro 2018, and it kept 60 FPS. I also tried
OK, I understood the reason why the app was slow on your machine: your application is drawing too many objects per frame: over about 300K vertices (= 75K rectangles). I don't think your game needs to draw all of them at the same time. Would it be possible to reduce the number by rendering only the objects visible in the current region?
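The visible-region culling suggested above can be sketched in plain Go. This is a minimal, hypothetical illustration (the `visibleTiles` name, tile size, and screen size are mine, not from the project), and it assumes a plain rectangular grid; an isometric map like gollercoaster's would need a diamond-shaped bound, but the clamping idea is the same:

```go
package main

import "fmt"

// visibleTiles returns the inclusive tile-index range that overlaps the
// camera viewport, so only those tiles need to be drawn each frame.
// camX/camY are the world-space coordinates of the viewport's top-left corner.
func visibleTiles(camX, camY, screenW, screenH, tileSize, mapW, mapH int) (x0, y0, x1, y1 int) {
	x0 = camX / tileSize
	y0 = camY / tileSize
	x1 = (camX + screenW - 1) / tileSize
	y1 = (camY + screenH - 1) / tileSize
	// Clamp to the map bounds so scrolling past the edge stays safe.
	if x0 < 0 {
		x0 = 0
	}
	if y0 < 0 {
		y0 = 0
	}
	if x1 >= mapW {
		x1 = mapW - 1
	}
	if y1 >= mapH {
		y1 = mapH - 1
	}
	return
}

func main() {
	// A 640x480 viewport over 64px tiles covers only a 10x8 window of tiles,
	// no matter how large the full 100x100 map is.
	x0, y0, x1, y1 := visibleTiles(128, 64, 640, 480, 64, 100, 100)
	fmt.Println(x0, y0, x1, y1) // → 2 1 11 8
}
```

Iterating only over that range keeps the draw count proportional to the screen, not the map.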
@hajimehoshi it certainly would help! An optimization I made in the Pixel game engine was to only redraw when I knew something changed, like camera movement: https://github.com/jrcichra/gollercoaster/blob/62570d8957c033fe121c111bc883e7df919eb24a/game/game.go#L83 The game would play at a high framerate, even with camera movement; only when I placed blocks did the framerate dip significantly.

I'm running a new Ryzen 9 3900X with an AMD WX2100 for my host OS (Linux, Ubuntu 19.04 with the latest Mesa drivers). I also do IOMMU passthrough with Linux and pass all cores to a Windows 10 VM; that VM has a dedicated RX 580 in it, so it should be near-native performance. I also tried on a Windows 10 laptop with Intel integrated graphics and had the same slowness. I have not tried on a Mac yet. Windows may be locking the FPS to 30, with a TPS of 60; I need to do more testing.

In its current state the game pushes a single core to 100%, even on a 3900X, if my mouse is focused on the game window. I can limit the number of draw calls I make at my game.go level, but I was hoping I could leverage some optimizations done in Ebiten to write less complicated "when-to-draw" logic.

If you want to try the difference between Ebiten and Pixel with my code, here are the builds from GitHub Actions with the associated change numbers:

Pixel: https://github.com/jrcichra/gollercoaster/releases/tag/45
Ebiten: https://github.com/jrcichra/gollercoaster/releases/tag/47

I think you'll find Pixel is faster with those optimizations. FYI, when you place a tile in Pixel, I re-render only the tiles from Tx-1,Ty-1 to the front to satisfy the painter's method in a simple way, so placing a block closer to the front is faster than placing one near 0,0. I could do something similar for Ebiten.
Thank you for the info!
So the issue is that Ebiten doesn't have a built-in notion of 'render only when necessary', whereas Pixel has one. In the Ebiten version of your code, all the tiles are rendered every frame, right? I'd use offscreen rendering: you can create an Ebiten image and use it as a render target, and update that offscreen image only when necessary. Does this make sense?
This sounds similar to what I suggested at #1130 (comment). Both approaches (this one and the offscreen one) are fine :-)
Correct, I'm rendering every tile every frame, where every tile has a variable N sprites/images on it. If I'm understanding, I'll make a global Could I then always draw that buffer to the screen in
You should use
Yes, that's correct! This is generally cheap compared to the current approach, since the number of draw calls would be much smaller.
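The render-only-when-dirty pattern discussed above can be sketched headlessly. In a real Ebiten game the buffer would be an *ebiten.Image used as a render target; here a counter stands in for the expensive tile pass so the sketch runs without a display (the `world` type and its fields are illustrative, not from the project):

```go
package main

import "fmt"

// world caches its expensive tile rendering in an offscreen buffer and only
// rebuilds that buffer when marked dirty (camera moved, a tile was placed, etc.).
type world struct {
	dirty       bool
	fullRedraws int // how many times the offscreen buffer was rebuilt
}

// MarkDirty is called whenever something visible changes.
func (w *world) MarkDirty() { w.dirty = true }

// Draw rebuilds the offscreen buffer only when needed, then "blits" it.
// The per-frame cost drops from thousands of tile draws to one image draw.
func (w *world) Draw() {
	if w.dirty {
		w.fullRedraws++ // expensive: redraw every tile into the buffer
		w.dirty = false
	}
	// cheap: draw the cached buffer to the screen every frame
}

func main() {
	w := &world{dirty: true}
	for frame := 0; frame < 60; frame++ {
		if frame == 30 {
			w.MarkDirty() // e.g. the player placed a block
		}
		w.Draw()
	}
	fmt.Println(w.fullRedraws) // → 2 (two rebuilds across 60 frames)
}
```

Sixty frames cost only two full tile passes; everything else is a single cached-image draw.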
Thanks @hajimehoshi! I'm working on a prototype change and I'm already seeing vast performance increases. Once I have a few more optimizations in I'll post a diff and close this issue. Thanks!
This (about one frame skip per second) is usual. Rendering happens at the display's refresh rate, while updating happens on the OS's timer, and there is a slight difference between them.
@jrcichra I've noticed similar problems too. As Hajimehoshi mentioned, it's a problem with the refresh rate and the OS timer de-syncing. Here are some articles that might give you ideas on how to reduce stutter with heuristics: https://medium.com/@alen.ladavac/the-elusive-frame-timing-168f899aec92
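One common remedy discussed in frame-timing articles like the one above is a fixed-timestep accumulator: step the simulation at a constant rate (e.g. 60 TPS) regardless of how irregularly render frames arrive. This is a generic sketch of that idea, not Ebiten's internal loop (the `stepUpdates` name and the sample frame times are mine):

```go
package main

import "fmt"

// stepUpdates runs fixed-size simulation steps against irregular frame
// durations. Leftover time stays in the accumulator, so no simulation
// time is lost or double-counted across frames.
func stepUpdates(frameTimes []float64, dt float64) int {
	acc, updates := 0.0, 0
	for _, frame := range frameTimes {
		acc += frame
		for acc >= dt {
			updates++ // step the simulation by exactly dt
			acc -= dt
		}
		// render here, optionally interpolating by acc/dt to hide jitter
	}
	return updates
}

func main() {
	const dt = 1.0 / 60.0 // fixed update step, seconds
	// Simulated irregular frame times: some fast frames, one slow one.
	frames := []float64{0.016, 0.016, 0.034, 0.012, 0.020}
	fmt.Println(stepUpdates(frames, dt)) // → 5
}
```

The slow 34 ms frame simply runs two updates to catch up, instead of stretching one update and causing visible stutter.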
@silbinarywolf Wow, I didn't take frame timing into as much consideration as I should have. These are great resources. Thanks! |
I think the cause and the solutions are clear now. Let me close this. Feel free to reopen when you find any other issues. |
Hi @hajimehoshi!
I'm in the process of porting my 2D isometric game from Pixel to Ebiten, and I'm struggling with performance: I'm dropping frames on Linux, and it's much worse on Windows.
I used the SubImage feature on a single ebiten.NewImageFromImage (using that one source) for all my sprites.
My code (at the time of writing): https://github.com/jrcichra/gollercoaster/tree/d5288f1812a16655b995b44e8eb9668d18bba789
This is my render() function, which comes at the end of update()
t.Draw() is here:
How I load textures:
I am using pointers so I am not sure what the issue is.
Thanks for your help and an awesome Go 2D library!
EDIT: Attached profiler chart: