
BACKENDS: OPENGL: Fix getScreenFormat while running in 3D #6675


Merged: 2 commits into scummvm:master on Jun 1, 2025

Conversation

lephilousophe (Member)

This is a small fix that allows engines to use the optimal texture formats when rendering in 3D.

I also removed the format argument from loadVideoMode, since it was never used in either the 2D or the 3D path.
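
As a rough illustration of what such a change can look like, here is a minimal sketch, not the actual patch; the _renderer3d and _currentState member names are assumptions based on the error log further down this thread and on the usual OpenGLGraphicsManager layout:

Graphics::PixelFormat OpenGLGraphicsManager::getScreenFormat() const {
	if (_renderer3d) {
		// While a 3D renderer is active, report the native 32-bit RGBA
		// framebuffer format so engines pick optimal texture formats.
		// Exact channel shifts depend on endianness; big-endian shown.
		return Graphics::PixelFormat(4, 8, 8, 8, 8, 24, 16, 8, 0);
	}
	// 2D path: keep reporting the current game surface format.
	return _currentState.gameFormat;
}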

@lephilousophe requested a review from aquadran on June 1, 2025 at 10:43
@lephilousophe merged commit c6b7cea into scummvm:master on June 1, 2025
8 checks passed
@lephilousophe deleted the fix-3d-format branch on June 1, 2025 at 11:32
ccawley2011 (Member)

It might be a good idea to move the pixel-format-to-texture-format mapping into common code so that it is easier to reuse.
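
A hypothetical sketch of what such a shared helper could look like; the name getGLTextureFormat, its location, and the exact format list are assumptions, not the actual ScummVM API:

#include "graphics/pixelformat.h"
// GL types and enums come from the backend's usual OpenGL headers.

struct GLTextureFormat {
	GLint internalFormat;
	GLenum format;
	GLenum type;
};

// Map a ScummVM pixel format to matching GL upload parameters.
// Returns false when there is no direct match and the caller has to
// fall back to a conversion blit.
static bool getGLTextureFormat(const Graphics::PixelFormat &pf, GLTextureFormat &out) {
	// RGBA8888; the shifts shown are the big-endian variant, a real
	// implementation would also match the endian-swapped layout.
	if (pf == Graphics::PixelFormat(4, 8, 8, 8, 8, 24, 16, 8, 0)) {
		out = { GL_RGBA, GL_RGBA, GL_UNSIGNED_BYTE };
		return true;
	}
	// RGB565 maps directly onto a packed GL pixel type.
	if (pf == Graphics::PixelFormat(2, 5, 6, 5, 0, 11, 5, 0, 0)) {
		out = { GL_RGB, GL_RGB, GL_UNSIGNED_SHORT_5_6_5 };
		return true;
	}
	return false;
}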

tunnelsociety (Contributor) commented Jun 1, 2025

Build failure; need ifdef?

./configure ... --disable-opengl-game ...
C++      backends/graphics/opengl/opengl-graphics.o
backends/graphics/opengl/opengl-graphics.cpp:240:6: error: use of undeclared identifier '_renderer3d'
        if (_renderer3d) {
            ^
1 error generated.
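
For reference, the kind of guard this points to might look like the following; the macro names are assumptions tied to the --disable-opengl-game and related configure switches, not taken from the actual fix:

// Around backends/graphics/opengl/opengl-graphics.cpp:240, compile the
// 3D branch only when a 3D-capable OpenGL build is enabled.
#if defined(USE_OPENGL_GAME) || defined(USE_OPENGL_SHADERS)
	if (_renderer3d) {
		return Graphics::PixelFormat(4, 8, 8, 8, 8, 24, 16, 8, 0);
	}
#endif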

lephilousophe (Member, Author)

Indeed, fixing ASAP.

lephilousophe (Member, Author)

@tunnelsociety Should be fixed in master.

tunnelsociety (Contributor)

@lephilousophe Mercy buckets 😸
