SDL: Do not reset window size when engines update rendering surface #1007
Conversation
Another possibility to avoid using

I have not looked at the code in this PR yet, and I don't know if that will help, but you do have a pair of virtual functions in OSystem that are called when an engine starts and finishes. It sounds like they could be reimplemented in the SDL backend to do what you want.
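The suggested pattern can be sketched in miniature: a base class exposes engine lifecycle hooks, and the SDL backend overrides them to decide when window resets are allowed. The class and member names below are illustrative stand-ins, not ScummVM's actual declarations.

```cpp
#include <cassert>

// Sketch of the suggested approach. Names are hypothetical, not
// the real OSystem/SdlGraphicsManager API.
class BaseBackend {
public:
	virtual ~BaseBackend() {}
	virtual void engineInit() {} // called when an engine starts
	virtual void engineDone() {} // called when an engine finishes
};

class SdlBackendSketch : public BaseBackend {
public:
	bool allowWindowReset = true; // launcher is in control initially

	void engineInit() override {
		// Once an engine is running, later surface updates should
		// no longer reset the user's chosen window size.
		allowWindowReset = false;
	}

	void engineDone() override {
		// Back in the launcher: window resets are permitted again.
		allowWindowReset = true;
	}
};
```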
Force-pushed from f8532bd to 5ad2cbf
As indicated to snore on IRC, this change causes changing the graphics mode in the GUI options (from the launcher) to misbehave. The window is not resized when switching between modes that use a different scaler (for example 2x to 3x) in SurfaceSdl. I can understand not resizing the window when switching between modes that use the same scaler, but even that might not be the expected behaviour. So I would suggest always resizing the window (if needed) when changing the graphics mode from the GUI options.
😴
I think the clearest way to fix this will probably be to just always allow programmatic window resizes whenever the launcher is open. This is trivial with gh-1009 since the launcher’s visibility state bool becomes shared, so SdlGraphicsManager can use it directly. To avoid that extra dependency, I tried just unlocking the window size from
I would not mind that, though I’m not sure offhand if the added code complexity would be worth it, since one probably would only want to do this when going from Surface to OpenGL right now. Maybe it makes more sense to revisit this later if/when the Surface renderer is updated to render GUI at whatever the current window dimensions are, since then we’d want to persist dimensions bidirectionally?
Sorry @csnover. I will blame the macOS auto-correct feature. Right now Windows tried to auto-correct it to crossover. Maybe you prefer that one?

Allowing programmatic window resizes whenever the launcher is open seems fine to me. I am also wondering about the case where games change graphics mode. Not resizing looks fine in the case of DreamWeb because it starts with the biggest size (640x480) and then switches to 320x200, which is thus upscaled. Upscaling usually looks OK. Downscaling usually does not, though (especially when there is text that becomes difficult to read, and it might be an issue to find small hotspots). So I am worried the change might be an issue with other engines that start with a lower resolution (e.g. 320x200) and then switch to a higher one (e.g. 640x480). Supernova does that currently, but I have not tested the actual behaviour with the changes from this PR; I am just guessing that it would result in the 640x480 bitmap being downscaled to a 320x200 window (when playing in "Normal" graphics mode, and downscaled to 640x400 if playing with a 2x mode and no AR). And even for games that start with a higher resolution intro, it might look strange to get a different window size depending on whether we start from the beginning or load a game from the launcher.

Initially I was wondering if resizing should always be allowed when the requested size is bigger than the current window size. That might look strange though, as we would get asymmetric behaviour: if a game starts in 320x200 and then switches to 640x480, the window would grow, but when it switches back to 320x200 the window would not go back to its initial size. And it would not solve the different behaviour depending on how we start the game. Maybe we should simply document somewhere that engines that use different graphics modes should always do an initial initGraphics() with the biggest size they are going to use.
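The asymmetric "only grow the window" idea described above can be stated as a tiny policy function, which makes the objection concrete: the window never shrinks back, so a 320x200 game that briefly switches to 640x480 is stuck in the larger window. This is an illustrative sketch, not code from the PR.

```cpp
#include <algorithm>
#include <cassert>

// Hypothetical "grow-only" window policy: resize only when the
// requested output is larger than the current window; otherwise
// keep the window and let the backend upscale.
struct Size { int w, h; };

Size growOnlyWindowSize(Size current, Size requested) {
	return { std::max(current.w, requested.w),
	         std::max(current.h, requested.h) };
}
```

Running the DreamWeb-style sequence through it shows the asymmetry: 320x200 → 640x480 grows the window, but the subsequent switch back to 320x200 leaves it at 640x480.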
I mean… I think they are both much more fun choices than spelling it correctly. :)
I like where you are going with this, but just this change is not sufficient to fix the problem. A game like DreamWeb which uses both 320x200 and 640x480 +

To really solve this, I think the

Let me know your thoughts.
Force-pushed from 5ad2cbf to 56a79e9
The 'defaultTo1x' does not mean "force the scale to 1x". It means that if no graphics mode is specified for this game, it should default to 1x instead of using the graphics mode specified in the global options. But the user can specify a 2x or 3x scaler in the game settings, and this will always be respected. In the case of DreamWeb it actually causes the game to switch between 960x600 and 1920x1440 resolutions, which is not that great.

Asking the engines to give a list of the resolutions they are going to use, and letting the backend choose a window size from this, seems fine to me. Of course the backend would need to take into account the graphics mode specified in the game settings, if any, and otherwise the one specified in the ScummVM options. But the defaultTo1x flag could indeed possibly be removed, and the backend could have more liberty to choose the actual window size. And indeed we could consider making the display resolution configurable by the user (maybe only as an advanced option, by editing scummvm.ini, at least initially).

In the case of the SurfaceSdl graphics modes, resizing the window (as made possible by this PR) and allowing users to configure the window size for games raises the question of how to make the graphics mode play nicely with the window size, since each graphics mode, coupled with the original game resolution, results in a preferred size (where SDL does not do any additional scaling). For example, should the graphics mode, and thus the scaling (1x, 2x or 3x), be chosen automatically to use the one that best fits the window size? The fact that we don't separate the scaler from the graphics mode, and that some modes are only available with 2x scaling (e.g. DotMatrix or TV2x), will be an issue here. Maybe the future is to abandon the SurfaceSdl backend and go to OpenGL + shaders. Then those questions would go away! :P (Seriously, it would be great if the work LordHoto started a while ago to support the libretro shaders was completed. I tried a few times and never got very far before the lack of free time caught up with me.)
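The automatic scaler selection raised above ("choose the scaling that best fits the window") could look something like the following sketch. The helper is hypothetical and ignores the complication just mentioned, that some SurfaceSdl modes are tied to a fixed 2x scale.

```cpp
#include <cassert>

// Hypothetical helper: pick the largest integral scale (1x..maxScale)
// whose output still fits the current window, falling back to 1x.
int pickScaleFactor(int windowW, int windowH,
                    int gameW, int gameH, int maxScale) {
	int best = 1;
	for (int s = 2; s <= maxScale; ++s) {
		if (gameW * s <= windowW && gameH * s <= windowH)
			best = s;
	}
	return best;
}
```

For a 320x200 game this picks 3x in a 960x600 window, 2x in a 640x480 window, and 1x when the window matches the game resolution.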
Force-pushed from 56a79e9 to 0c554ae
This is a good question. For the moment I think the best approach is to just scale the surface, as this patch does, in the interest of getting this feature done and landed for 1.10. Things like integral scaling and automatic scaling mode selection can come in a future PR, since the former really needs the backends refactor PR to be viable and the latter has potentially undesirable performance ramifications.
Probably, but I don’t think I have the time/energy to invest in that in the short-to-medium-term. :) Reviewing the comment history here, in order to finish and land this, it looks like the only critical change is to adjust the initGraphics API to receive a list of used hardware modes so that backends can more intelligently select the correct initial window size, so I will work on making that change and update this PR once it is ready.
This change allows:

* Engines to update their target rendering surface/size and pixel format with the backend multiple times during gameplay;
* Users to resize the ScummVM window without having it reset size/position every time an engine updates its target surface format;
* Conversions/scaling to continue to run efficiently in hardware, instead of requiring engines to pick their maximum possible output format once and upscale inefficiently in software;
* The window to reset size once when an engine calls to set its initial output size, and to reset again once ScummVM returns to the launcher.

This is relevant for at least the SCI32 and DreamWeb engines, which perform graphics mode switches during games.
This flag is removed for a few reasons:

* Engines universally set this flag to true for widths > 320, which made it redundant everywhere;
* This flag functioned primarily as a "force 1x scaler" flag, since its behaviour was almost completely undocumented and users would need to figure out that they'd need an explicit non-default scaler set to get a scaler to operate at widths > 320;
* (Most importantly) engines should not be in the business of deciding how the backend may choose to render its virtual screen. The choice of rendering behaviour belongs to the user, and the backend, in that order.

A nearby future commit restores the default1x scaler behaviour in the SDL backend code for the moment, but in the future it is my hope that there will be a better configuration UI to allow users to specify how they want scaling to work for high resolutions.
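With the flag gone, the backend can instead derive an initial window size from the list of modes an engine declares up front. A simplified sketch of that idea, with an illustrative `Mode` struct rather than the real `Graphics::Mode` declaration:

```cpp
#include <cassert>
#include <vector>

// Illustrative stand-in for the mode list an engine might declare.
struct Mode { int width, height; };

// Pick one stable window size that can contain every mode the
// engine says it will use, so later mode switches need no resize.
Mode largestMode(const std::vector<Mode> &modes) {
	Mode best = { 0, 0 };
	for (const Mode &m : modes) {
		if (m.width > best.width)   best.width = m.width;
		if (m.height > best.height) best.height = m.height;
	}
	return best;
}
```

A DreamWeb-style engine declaring 320x200 and 640x480 would get a 640x480 base size once, before any scaler is applied.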
Almost the entire file does not use the aliased PixelFormat except for a single function, so just make that function work like everything else already in the TU.
Force-pushed from 0c554ae to 69f09e1
O-KAY. I’ve updated this PR with some new commits that allow engines to opt in to sending additional mode information to the backend using a new

I’ve also removed the default1x flag and reimplemented that logic in backend code until we can decide what to do with that behaviour later. It’s been difficult for me to manage some of the state-complexity, and so I don’t love some of the contortions in the backends code. As always I am open to feedback on how this might be done less poorly.
Oh, I forgot to mention, I was able to test DreamWeb, but I don’t have Lost In Time or the CD version of Future Wars so I was not able to verify that the updated mode handling is working properly for those. |
I played the first part of Lost in Time, the boat, and I didn't run into any issues. The main window stayed the same size while videos within it changed resolution. I tried both HQ2x and OpenGL.
The changes look good to me, and I like the new behaviour. Also, the introduction of the Graphics::Mode struct should help with adding proper AR handling.

It does look a bit weird with DreamWeb that if we override the graphics setting for the game to use a 2x scaler when the default is 3x, we end up with a bigger window (or that if we override the graphics mode to use the same mode as the default, we end up with a different window size), but I understand why that is (and this is a residual behaviour from the defaultTo1x, so removing that is a step in the right direction to maybe one day have a less bizarre behaviour :P ). And for those poor souls with an old platform, it's a shame that the SDL 1.2 backend does not benefit from these changes :'(
This looks good to me. It may be useful to add a no-op default implementation to
Thank you both for your excellent reviews and feedback! I’ve made initSizeHint an optional extension by giving it a default empty implementation and this feature is now landed. |
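The "default empty implementation" pattern that landed can be sketched as follows; the class names and the simplified `Mode` struct are illustrative, with only `initSizeHint` taken from the discussion above.

```cpp
#include <cassert>
#include <vector>

struct Mode { int width, height; };

// Base class provides an empty virtual, so backends that do not
// care about mode hints need no changes at all.
class OSystemSketch {
public:
	virtual ~OSystemSketch() {}
	virtual void initSizeHint(const std::vector<Mode> &) {}
};

// A backend that opts in simply overrides the hook.
class HintAwareBackend : public OSystemSketch {
public:
	int hintedModes = 0;
	void initSizeHint(const std::vector<Mode> &modes) override {
		hintedModes = (int)modes.size();
	}
};
```

Engines can call the hint unconditionally; on backends that ignore it, the call is a harmless no-op.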
This is a not-awesome hack, but it seems to be the least terrible way I could find to enable a satisfactory user experience for resizing without making more extensive changes to the graphics manager API, and without forcing engines to opt in to not breaking the user experience with their calls to change screen modes (which should not be the responsibility of game engines).

One potential alternative to having the graphics manager track the global engine pointer would be to instead use a flag that gets set and cleared by the main ScummVM loop before it calls to run an engine, and after the engine run call returns.
Please let me know thoughts or suggestions on making this less gross. Thanks!
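The flag-based alternative described in the commit message could be sketched like this; all names are hypothetical, and the point is only that the main loop, not the graphics manager, owns the engine-running state.

```cpp
#include <cassert>

// Hypothetical state the graphics code would consult instead of
// the global engine pointer.
struct GraphicsState {
	bool engineRunning = false;
};

// Sketch of the main-loop wrapper: set the flag before running an
// engine, clear it after the run call returns.
template<typename EngineFn>
void runEngine(GraphicsState &state, EngineFn engine) {
	state.engineRunning = true;  // window size is now locked
	engine();
	state.engineRunning = false; // launcher may reset the window again
}
```

This keeps the graphics manager decoupled from engine internals: it only ever reads a boolean the loop maintains.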