
SDL: Do not reset window size when engines update rendering surface #1007

Closed
wants to merge 5 commits

Conversation

@csnover
Member

csnover commented Sep 3, 2017

This change allows:

  • Engines to update their target rendering surface/size and pixel
    format with the backend multiple times during gameplay;
  • Users to resize the ScummVM window without having it reset
    size/position every time an engine updates its target surface
    format;
  • Conversions/scaling to continue to run efficiently in hardware,
    instead of requiring engines to pick their maximum possible
    output format once and upscale inefficiently in software;
  • The window to reset size once when an engine calls to set its
    initial output size, and to reset again once ScummVM returns to
    the launcher.

This is relevant for at least SCI32 and DreamWeb engines, which
perform graphics mode switches during games.
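
For concreteness, the pattern being accommodated is an engine re-running graphics initialisation mid-game. A minimal engine-side sketch (the engine class is hypothetical, and the scaler/pixel-format arguments to initGraphics are elided):

```cpp
// Sketch: an engine that switches screen modes during play calls the
// graphics init path more than once. Previously, each call snapped the
// window back to a backend-chosen size and position.
#include "engines/util.h"

void MyEngine::playHighResVideo() {
	initGraphics(640, 480); // hi-res mode for the video
	// ... play the video ...
	initGraphics(320, 200); // back to the gameplay resolution
}
```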

This is a not-awesome hack, but it seems to be the least terrible
way I could find to enable a satisfactory user experience for
resizing without making more extensive changes to the graphics
manager API, and without forcing engines to opt in to not breaking
the user experience with their calls to change screen modes (which
should not be the responsibility of game engines).

One potential alternative to having the graphics manager track the
global engine pointer would be to instead use a flag that gets set
and cleared by the main ScummVM loop before it calls to run an
engine, and after the engine run call returns.

Please let me know thoughts or suggestions on making this less gross. Thanks!

@bgK
Member

bgK commented Sep 9, 2017

Another possibility to avoid using g_engine could be to send an event when starting / returning from an engine. That way anything that watches events could know when to set up / tear down engine specific state.
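
A rough sketch of that idea; the event types here are hypothetical (not existing Common::EventType values), and the dispatch point would be the engine-run code in base/main.cpp:

```cpp
// Hypothetical sketch: bracket the engine run with lifecycle events so
// any event observer can set up / tear down engine-specific state.
Common::Event event;

event.type = Common::EVENT_ENGINE_STARTED; // hypothetical event type
g_system->getEventManager()->pushEvent(event);

// ... run the engine ...

event.type = Common::EVENT_ENGINE_FINISHED; // hypothetical event type
g_system->getEventManager()->pushEvent(event);
```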

@criezy
Member

criezy commented Sep 9, 2017

I have not looked at the code in this PR yet, and I don't know if that will help, but you do have a pair of virtual functions in OSystem that are called when an engine starts and finishes. It sounds like they could be reimplemented in the SDL backend to do what you want.
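
For reference, these are presumably the OSystem::engineInit() / OSystem::engineDone() pair. A minimal sketch of overriding them in the SDL backend (setEngineRunning is a hypothetical helper, standing in for whatever state the graphics manager actually needs):

```cpp
// Sketch: let the SDL backend learn about engine lifetime via the
// existing OSystem virtual hooks instead of polling g_engine.
void OSystem_SDL::engineInit() {
	// An engine is about to run: allow one programmatic resize for its
	// initial mode, then keep the user's window size afterwards.
	dynamic_cast<SdlGraphicsManager *>(_graphicsManager)->setEngineRunning(true);
}

void OSystem_SDL::engineDone() {
	// Back to the launcher: window sizing returns to the default rules.
	dynamic_cast<SdlGraphicsManager *>(_graphicsManager)->setEngineRunning(false);
}
```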

@csnover
Member Author

csnover commented Sep 13, 2017

Thanks, @bgK and @criezy, for your invaluable feedback! I’ve updated this to use the OSystem hooks, which feels much better now. Let me know if there are any other ideas on improvements for this or if it looks OK to land.

@criezy
Member

criezy commented Sep 17, 2017

As indicated to snore on IRC, this change causes changing the graphics mode in the GUI options (from the launcher) to misbehave. The window is not resized when switching between modes that use a different scaler (for example, 2x to 3x) in SurfaceSdl. I can understand not resizing the window when switching between modes that use the same scaler, but even that might not be the expected behaviour. So I would suggest always resizing the window (if needed) when changing the graphics mode from the GUI options.
Although maybe it would be nice, when going from SurfaceSdl to OpenGL, to keep the window size, since the OpenGL graphics mode does not really have to specify a size.

@csnover
Member Author

csnover commented Sep 19, 2017

snore

😴

this change causes changing the graphics mode in the GUI options (from the launcher) to misbehave. The window is not resized when switching between modes that use a different scaler (for example, 2x to 3x) in SurfaceSdl. I can understand not resizing the window when switching between modes that use the same scaler, but even that might not be the expected behaviour. So I would suggest always resizing the window (if needed) when changing the graphics mode from the GUI options.

I think the clearest way to fix this will probably be to just always allow programmatic window resizes whenever the launcher is open. This is trivial with gh-1009, since the launcher’s visibility state bool becomes shared, so SdlGraphicsManager can use it directly. To avoid that extra dependency, I tried just unlocking the window size from setGraphicsMode, but that interferes with playback of games like DreamWeb CD, which mode-switch using the “force normal1x” flag, so it doesn’t immediately seem like a viable path; but I might be a little blind to the possibilities at the moment.
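
As a sketch of that direction (all names here are hypothetical; the shared visibility state is the one proposed in gh-1009):

```cpp
// Hypothetical sketch: programmatic window resizes are permitted
// whenever no engine is running, or the launcher is the active GUI.
bool SdlGraphicsManager::canResetWindowSize() const {
	// _launcherVisible stands in for the shared launcher visibility
	// state from gh-1009; _engineRunning for the engine lifetime flag.
	return !_engineRunning || _launcherVisible;
}
```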

Although maybe it would be nice, when going from SurfaceSdl to OpenGL, to keep the window size, since the OpenGL graphics mode does not really have to specify a size.

I would not mind that, though I’m not sure offhand if the added code complexity would be worth it, since one probably would only want to do this when going from Surface to OpenGL right now. Maybe it makes more sense to revisit this later if/when the Surface renderer is updated to render GUI at whatever the current window dimensions are, since then we’d want to persist dimensions bidirectionally?

@criezy
Member

criezy commented Sep 19, 2017

Sorry @csnover. I will blame the macOS X auto-correct feature. Right now, Windows tried to auto-correct it to crossover. Maybe you prefer that one?

Allowing programmatic window resizes whenever the launcher is open seems fine to me.

I am also wondering about the case where games change graphics mode. Not resizing looks fine in the case of DreamWeb, because it starts with the biggest size (640x480) and then switches to 320x200, which is thus upscaled. Upscaling usually looks OK; downscaling usually does not (especially when there is text that becomes difficult to read, and it might be an issue to find small hotspots). So I am worried the change might be an issue with other engines that start with a lower resolution (e.g. 320x200) and then switch to a higher one (e.g. 640x480). Supernova does that currently, but I have not tested the actual behaviour with the changes from this PR; I am just guessing that it would result in the 640x480 bitmap being downscaled to a 320x200 window (when playing in "Normal" graphics mode, and downscaled to 640x400 if playing with a 2x mode and no AR). And even for games that start with a higher-resolution intro, it might look strange to get a different window size depending on whether we start from the beginning or load a game from the launcher.

Initially I was wondering if resizing should always be allowed when the requested size is bigger than the current window size. That might look strange though, as we would get asymmetric behaviour: if a game starts in 320x200 and then switches to 640x480, the window would grow, but when it switches back to 320x200 the window wouldn't go back to its initial size. And it would not solve the different behaviour depending on how we start the game. Maybe we should simply document somewhere that engines that use different graphics modes should always first do an initGraphics() with the biggest size they are going to use.

@csnover
Member Author

csnover commented Sep 20, 2017

Sorry @csnover. I will blame the macOS X auto-correct feature. Right now, Windows tried to auto-correct it to crossover. Maybe you prefer that one?

I mean… I think they are both much more fun choices than spelling it correctly. :)

Maybe we should simply document somewhere that engines that use different graphics modes should always first do an initGraphics() with the biggest size they are going to use.

I like where you are going with this, but that change alone is not sufficient to fix the problem. A game like DreamWeb, which uses both 320x200 and 640x480 + defaultTo1xScaler, can’t give the correct answer, because 640x480 is the largest resolution if the user picks a 1x or 2x scaler, but 320x200 becomes the largest resolution if a 3x scaler is used (320x200 at 3x is 960x600, which is larger than 640x480 at the forced 1x). This is just a fundamental problem with the way we allow the engines to force the 1x scaler, and with how we intentionally limit scalers to only certain resolutions.

To really solve this, I think the defaultTo1xScaler parameter should be removed entirely, and engines should just pass a list of all the resolutions they are going to use at graphics init time. The backend can then decide how to calculate the correct resolution for the window. This should also provide more flexibility in terms of user-defined scaler operation: we’re at the point now, with powerful hardware and high-resolution screens, where running an advanced scaler on content >320x240 actually makes sense in some situations, but it’s impossible for users to do this at all because we let engines force the scaler to 1x (which right now usually just means >320x240 = force 1x). Making this fully user-configurable is probably a 1.11 feature at this point, but we could at least lay the groundwork now with this API change, which will also resolve this window-resizing stuff.

Let me know your thoughts.

@criezy
Member

criezy commented Sep 20, 2017

This should also provide more flexibility in terms of user-defined scaler operation: we’re at the point now, with powerful hardware and high-resolution screens, where running an advanced scaler on content >320x240 actually makes sense in some situations, but it’s impossible for users to do this at all because we let engines force the scaler to 1x (which right now usually just means >320x240 = force 1x).

The 'defaultTo1x' flag does not mean "force the scaler to 1x". It means that if no graphics mode is specified for this game, it should default to 1x instead of using the graphics mode specified in the global options. But the user can specify a 2x or 3x scaler in the game settings, and this will always be respected. In the case of DreamWeb, it actually causes the game to switch between 960x600 and 1920x1440 resolutions, which is not that great.

Asking the engines to give a list of the resolutions they are going to use, and letting the backend choose a window size from this, seems fine to me. Of course the backend would need to take into account the graphics mode specified in the game settings, if any, and otherwise the one specified in the ScummVM options. But the defaultTo1x flag could indeed possibly be removed, and the backend could have more liberty to choose the actual window size. And indeed we could consider making the display resolution configurable by the user (maybe only as an advanced option, by editing scummvm.ini, at least initially).

In the case of the SurfaceSdl graphics modes, though, resizing the window (as made possible by this PR) and allowing the user to configure the window size for games raises the question of how to make the graphics mode play nicely with the window size, since each graphics mode, coupled with the original game resolution, results in a preferred size (where SDL does not do any additional scaling). For example, should the graphics mode, and thus the scaling (1x, 2x or 3x), be chosen automatically to best fit the window size? The fact that we don't separate the scaler from the graphics mode, and that some modes are only available with 2x scaling (e.g. DotMatrix or TV2x), will be an issue here though.

Maybe the future is to abandon the SurfaceSdl backend and go to OpenGL + shaders. Then those questions would go away! :P (seriously, it would be great if the work LordHoto started a while ago to support the libretro shaders was completed - I tried a few times and never went very far before the lack of free time caught up with me)

@csnover
Member Author

csnover commented Sep 30, 2017

For example, should the graphics mode, and thus the scaling (1x, 2x or 3x), be chosen automatically to best fit the window size?

This is a good question. For the moment I think the best approach is to just scale the surface, as this patch does, in the interest of getting this feature done and landed for 1.10. Things like integral scaling and automatic scaling mode selection can come in a future PR, since the former really needs the backends refactor PR to be viable and the latter has potentially undesirable performance ramifications.

Maybe the future is to abandon the SurfaceSdl backend and go to OpenGL + shaders. Then those questions would go away! :P (seriously, it would be great if the work LordHoto started a while ago to support the libretro shaders was completed - I tried a few times and never went very far before the lack of free time caught up with me)

Probably, but I don’t think I have the time/energy to invest in that in the short-to-medium-term. :)

Reviewing the comment history here, it looks like the only critical change needed to finish and land this is to adjust the initGraphics API to receive a list of used hardware modes, so that backends can more intelligently select the correct initial window size. I will work on making that change and update this PR once it is ready.

The defaultTo1xScaler flag is removed for a few reasons:

* Engines universally set this flag to true for widths > 320,
  which made it redundant everywhere;
* This flag functioned primarily as a "force 1x scaler" flag,
  since its behaviour was almost completely undocumented and users
  would have to figure out that they needed to set an explicit
  non-default scaler to get scaling at widths > 320;
* (Most importantly) engines should not be in the business of
  deciding how the backend may choose to render its virtual screen.
  The choice of rendering behaviour belongs to the user, and the
  backend, in that order.

A nearby future commit restores the default1x scaler behaviour in
the SDL backend code for the moment, but in the future it is my
hope that there will be a better configuration UI to allow users
to specify how they want scaling to work for high resolutions.
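
As an illustration of the API change this describes (declarations simplified; the real ones live in engines/util.h, and the overload set is larger):

```cpp
// Before: engines passed a flag that defaulted the backend to the
// 1x scaler at widths > 320 unless the user chose a mode explicitly.
void initGraphics(int width, int height, bool defaultTo1xScaler,
                  const Graphics::PixelFormat *format);

// After: the flag is gone; scaler choice is left to the user and the
// backend, in that order.
void initGraphics(int width, int height,
                  const Graphics::PixelFormat *format);
```
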
A final cleanup commit: almost the entire file does not use the
aliased PixelFormat except for a single function, so just make that
function work like everything else already in the TU.
@csnover
Member Author

csnover commented Oct 1, 2017

O-KAY. I’ve updated this PR with some new commits that allow engines to opt in to sending additional mode information to the backend, using a new initGraphicsModes free function that mirrors initGraphics and calls through to OSystem::initSizeHint/GraphicsManager::initSizeHint in the backends. Use of this function is optional, so only engines that actually switch between multiple screen sizes during a game need to call it; everyone else can continue to just use initGraphics as before.
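
A minimal sketch of the opt-in usage (the engine class and resolution values are illustrative; Graphics::Mode is assumed to be a simple width/height pair, per the struct mentioned below):

```cpp
// Sketch: register every screen size the game will use up front, so
// the backend can choose a sensible window size once, then switch
// modes during play without the window being reset.
#include "engines/util.h"

Common::Error MyEngine::run() {
	Graphics::ModeList modes;
	modes.push_back(Graphics::Mode(320, 200));
	modes.push_back(Graphics::Mode(640, 480));
	initGraphicsModes(modes); // optional hint; forwards to OSystem::initSizeHint

	initGraphics(640, 480); // initial mode, exactly as before
	// ... gameplay, including later initGraphics() mode switches ...
	return Common::kNoError;
}
```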

I’ve also removed the default1x flag and reimplemented that logic in backend code until we can decide what to do with that behaviour later.

It’s been difficult for me to manage some of the state complexity, so I don’t love some of the contortions in the backends code. As always, I am open to feedback on how this might be done less poorly.

@csnover
Member Author

csnover commented Oct 2, 2017

Oh, I forgot to mention: I was able to test DreamWeb, but I don’t have Lost in Time or the CD version of Future Wars, so I was not able to verify that the updated mode handling is working properly for those.

@dafioram
Contributor

dafioram commented Oct 4, 2017

I played the first part of Lost in Time (the boat) and didn't run into any issues. The main window stayed the same size while the videos changed within it. I tested both HQ2x and OpenGL.

@criezy
Member

criezy commented Oct 4, 2017

The changes look good to me, and I like the new behaviour. Also the introduction of the Graphics::Mode struct should help with adding proper AR handling.

It does look a bit weird with DreamWeb that if we override the graphics setting for the game to use a 2x scaler when the default is 3x, we end up with a bigger window (or that if we override the graphics mode to use the same mode as the default, we end up with a different window size), but I understand why that is (this is residual behaviour from defaultTo1x, so removing that flag is a step in the right direction to maybe one day have less bizarre behaviour :P ).

And for those poor souls with an old platform it's a shame that the SDL 1.2 backend does not benefit from these changes :'(

@bgK
Member

bgK commented Oct 6, 2017

This looks good to me. It may be useful to add a no-op default implementation of initSizeHint, since it is kind of optional for backends to implement (and to avoid breaking the build of the non-SDL backends).
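
Something like the following, presumably (a sketch of the declaration, using the mode-list type from elsewhere in this PR):

```cpp
// Sketch: a no-op default makes the hint an optional extension point,
// so backends that ignore mode lists still compile and behave as before.
virtual void initSizeHint(const Graphics::ModeList &modes) {}
```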

@csnover
Member Author

csnover commented Oct 7, 2017

Thank you both for your excellent reviews and feedback! I’ve made initSizeHint an optional extension by giving it a default empty implementation and this feature is now landed.
