
Conversation

@bluegr (Member) commented Oct 31, 2024

This is a continuation of the work done by @ccawley2011 in PR #4939, addressing the pending issues raised in that PR by @sluicebox and @bluegr, and bringing the code up to date with the latest GFX driver changes in SCI.

The logic has been changed to first try the best screen format for playing the QT video. Failing that, it either converts the decoded video to the screen format (for non-indexed screen modes) or enables QT dithering (for CLUT/indexed screen modes).
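For readers who want the gist of that flow in code, here is a rough, illustrative sketch (not the literal PR code): it assumes the decoder and the game palette are already set up elsewhere, omits the format-selection step, and relies on VideoDecoder's setOutputPixelFormat()/setDitheringPalette() helpers.

static void setupVideoOutput(Video::VideoDecoder *videoDecoder, const byte *palette) {
	const Graphics::PixelFormat screenFormat = g_system->getScreenFormat();

	if (screenFormat.bytesPerPixel > 1) {
		// Non-indexed screen mode: have the decoder convert its frames to the screen format.
		videoDecoder->setOutputPixelFormat(screenFormat);
	} else {
		// CLUT/indexed screen mode: fall back to QuickTime dithering against the game palette.
		videoDecoder->setDitheringPalette(palette);
	}
}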

With the changes here, the videos in KQ6 Mac will still play as before on normal desktop systems (in both SCI RGB and non-RGB modes), and they will be dithered on systems that only support indexed screen resolutions.

There are two small issues / things to discuss:

  • The bottom of the screen shows the Mac UI while videos play, but it appears distorted if the resolution changes; it returns to normal after the video is done. Fixed by @athrxx in 9339b3b.
  • For some reason, I needed to initialize the screen again, even if the screen format is the same, when the RGB checkbox is checked. This seems related to the internal logic in the SCI GFX drivers - perhaps @athrxx could help here? Update: this is expected behavior, as normally the game sends paletted data - we need to initialize the screen again, even if the RGB checkbox is checked, to display the video's RGB data.

@athrxx (Contributor) commented Oct 31, 2024

> • For some reason, I needed to initialize the screen again, even if the screen format is the same, when the RGB checkbox is checked. This seems related to the internal logic in the SCI GFX drivers - perhaps @athrxx could help here?

If you pass a format argument to GfxDriver::initScreen(), this specifies the format of the input data. For normal SCI graphics you should leave it empty, which means the driver will expect 8-bit input (this is regardless of the RGB checkbox; the input data format never changes). So, if you call GfxDriver::initScreen() with a 32-bit RGB format (for the data from the video decoder), you'll have to reset it to normal 8-bit input afterwards by calling it again without specifying any input format.
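A minimal sketch of that contract, based on the call pattern quoted later in this thread; the playback loop itself is omitted and the variable names are illustrative.

GfxDriver *driver = g_sci->_gfxScreen->gfxDriver();

// Before playback: tell the driver that the incoming data uses the RGB screen format.
const Graphics::PixelFormat rgbFormat = g_system->getScreenFormat();
driver->initScreen(&rgbFormat);

// ... decode and blit the 32-bit RGB video frames ...

// After playback: calling initScreen() without a format resets the driver to the
// default 8-bit paletted input that normal SCI rendering always sends.
driver->initScreen();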

@bluegr (Member, Author) commented Oct 31, 2024

> • For some reason, I needed to initialize the screen again, even if the screen format is the same, when the RGB checkbox is checked. This seems related to the internal logic in the SCI GFX drivers - perhaps @athrxx could help here?

> If you pass a format argument to GfxDriver::initScreen(), this specifies the format of the input data. For normal SCI graphics you should leave it empty, which means the driver will expect 8-bit input (this is regardless of the RGB checkbox; the input data format never changes). So, if you call GfxDriver::initScreen() with a 32-bit RGB format (for the data from the video decoder), you'll have to reset it to normal 8-bit input afterwards by calling it again without specifying any input format.

Thanks @athrxx, the issue I am having is that although the screen is already initialized to an RGB format (because the RGB SCI game option is checked), we need to initialize the screen again with the same pixel format. Check lines 165-169:

// Init the screen again
const Graphics::PixelFormat format = g_system->getScreenFormat();
g_sci->_gfxScreen->gfxDriver()->initScreen(&format);
videoDecoder->setOutputPixelFormat(g_system->getScreenFormat());
switchedGraphicsMode = true;

@athrxx (Contributor) commented Oct 31, 2024

> • For some reason, I needed to initialize the screen again, even if the screen format is the same, when the RGB checkbox is checked. This seems related to the internal logic in the SCI GFX drivers - perhaps @athrxx could help here?

> If you pass a format argument to GfxDriver::initScreen(), this specifies the format of the input data. For normal SCI graphics you should leave it empty, which means the driver will expect 8-bit input (this is regardless of the RGB checkbox; the input data format never changes). So, if you call GfxDriver::initScreen() with a 32-bit RGB format (for the data from the video decoder), you'll have to reset it to normal 8-bit input afterwards by calling it again without specifying any input format.

> Thanks @athrxx, the issue I am having is that although the screen is already initialized to an RGB format (because the RGB SCI game option is checked), we need to initialize the screen again with the same pixel format. Check lines 165-169:

> // Init the screen again
> const Graphics::PixelFormat format = g_system->getScreenFormat();
> g_sci->_gfxScreen->gfxDriver()->initScreen(&format);
> videoDecoder->setOutputPixelFormat(g_system->getScreenFormat());
> switchedGraphicsMode = true;

Yes, because otherwise the driver will not accept the RGB data (regardless of whether the RGB checkbox is checked or not, it always expects normal 8-bit data; the RGB rendering is a completely driver-internal thing). The specific situation where we actually send RGB data only happens for the video.

@bluegr (Member, Author) commented Oct 31, 2024

> Yes, because otherwise the driver will not accept the RGB data (regardless of whether the RGB checkbox is checked or not, it always expects normal 8-bit data; the RGB rendering is a completely driver-internal thing). The specific situation where we actually send RGB data only happens for the video.

You're absolutely right :) Thanks!

@athrxx (Contributor) commented Oct 31, 2024

> Yes, because otherwise the driver will not accept the RGB data (regardless of whether the RGB checkbox is checked or not, it always expects normal 8-bit data; the RGB rendering is a completely driver-internal thing). The specific situation where we actually send RGB data only happens for the video.

> You're absolutely right :) Thanks!

This would probably be less confusing if setting up the screen and setting up the source data pixel format were not the same function. But since the Mac video playback is the only situation that actually needs a source format other than 8-bit, I kept the minimalistic approach.
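Purely to illustrate the idea, a hypothetical split might look like this; this is not the existing GfxDriver interface, just a sketch of how the two responsibilities could be separated.

class GfxDriver {
public:
	virtual ~GfxDriver() {}
	// Hypothetical: sets up the output screen (resolution, palette/RGB rendering mode).
	virtual void initScreen() = 0;
	// Hypothetical: declares the pixel format of the data the engine will send;
	// 8-bit paletted would remain the default if this is never called.
	virtual void setSourcePixelFormat(const Graphics::PixelFormat &format) = 0;
};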

@bluegr (Member, Author) commented Nov 1, 2024

Pushed an updated, cleaner version, now that the screen initialization question has been cleared up.
I've also pushed @ccawley2011's commits which support playing videos with higher pixel formats in scale2x(), since these are quite straightforward, in order to help clear out the noise in this PR.

This is needed for the QT videos of the Mac version of KQ6.
@sluicebox (Member) commented Nov 3, 2024

Looks good to me, I tested quite a few configurations and didn't see any regressions. Also, KQ6-Mac no longer crashes when RGB mode is enabled.

I noticed two minor glitches that are unrelated to this change, but I'm mentioning them in case someone here wants to take a look. Both occurred on Windows.

  1. With RGB mode enabled, the cursor glitches before and after HalfDome. It recovers afterwards.
  2. During one test, the HalfDome movie switched me from full screen to windowed mode. I was in SDL render mode at the time, with the KQ6 high resolution option disabled. It may be related to the confusion surrounding the fullscreen config. It doesn't seem like that should happen under any circumstances in the middle of a game. (I also got it to occur in a nightly build, so it's unrelated to this PR.)

@bluegr Thanks for seeing this through! =)

@ccawley2011 (Member) commented:

Personally, I still think dithering should be enabled or disabled purely based on the RGB mode setting, rather than trying to switch from paletted to true colour mode when RGB mode is off - both for cases where performance is an issue and for greater accuracy with the original. That said, I'm not convinced the code as I left it in the previous PR was actually all that accurate (I wouldn't have expected the original to try to dither an already dithered video), but since I don't have a way of testing the original executable myself, I can't really confirm how it behaves with a 256 colour display.

If someone else can provide screenshots and/or videos of the original in action on a 256 colour Mac (real or emulated), I might be able to give this another look.

@sluicebox (Member) commented:

@ccawley2011 Here's a YouTube video of both KQ6 movies running on a Mac: https://www.youtube.com/watch?v=2-0usTnaoXE

Compression artifacts aside, that video is how it looked for me in an emulator.

@bluegr (Member, Author) commented Nov 4, 2024

Thank you all! I'm merging this, since it resolves issues with KQ6 Mac (e.g. the crash when enabling the "RGB" checkbox). It also allows the QT videos to be dithered if the backend in use doesn't support RGB resolutions.

The QT KQ6 videos are encoded in 32-bit color, so enabling dithering on them makes them look awfully ugly. I disagree with @ccawley2011 that we can use the RGB checkbox for the dithering feature, as the default settings on a desktop platform would then enable dithering, which isn't what we want - check #4939 for screenshots of how ugly dithering looks with these videos.

Merging; we can revisit the dithering functionality in a future PR, if needed.

@bluegr merged commit 5972e1b into scummvm:master on Nov 4, 2024. 8 checks passed.