
crt-auto should auto-switch to VGA shaders in EGA modes with 18-bit VGA palettes #2819

Merged 36 commits into main from jn/cga-ega-on-vga-detection on Sep 6, 2023

Conversation

@johnnovak (Member) commented Sep 2, 2023

Note: I will use parts of this writing in my upcoming mega-article about PC monitors, hence I've put quite some effort into it. You can safely skip the details in the second post and focus on this first post only.


Overview

Demos and many DOS ports of Amiga action/platformer games "repurpose" the 320x200 0Dh EGA mode: instead of the default 16-colour palette, they reprogram the palette to use 18-bit VGA colours. These programs prefer the planar 320x200 16-colour 0Dh EGA mode because certain effects and operations are more efficient in planar modes than in the "chunky" (1 byte per pixel) 320x200 256-colour 13h mode: for example, smooth scrolling on low-end hardware, such as 286 and slow 386 machines equipped with VGA adapters (a popular low-cost option in early-90s Europe, where most of these Amiga ports originated).

These programs need a VGA card; in most cases they don't even work on an EGA card, or they do but display the wrong colours. Therefore, the authentic choice for the crt-auto adaptive CRT shader in these cases is to use a VGA shader.
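For the curious, this is roughly what such palette reprogramming looks like from the program's side. A minimal sketch only: `set_video_mode` and `outportb` are hypothetical stand-ins for an INT 10h call and an x86 OUT instruction, and real programs may also need to point the attribute controller's palette registers at DAC entries 0-15.

```cpp
#include <array>
#include <cstdint>

struct Rgb666 {
	uint8_t r, g, b; // 6-bit components (0-63), i.e. 18 bits per colour
};

constexpr uint16_t DacWriteIndexPort = 0x3c8;
constexpr uint16_t DacDataPort       = 0x3c9;

void set_dac_entry(const uint8_t index, const Rgb666 colour)
{
	// Select the DAC entry, then write the three 6-bit components
	outportb(DacWriteIndexPort, index);
	outportb(DacDataPort, colour.r);
	outportb(DacDataPort, colour.g);
	outportb(DacDataPort, colour.b);
}

void setup_repurposed_ega_mode(const std::array<Rgb666, 16>& palette)
{
	set_video_mode(0x0d); // planar 320x200 16-colour EGA mode

	// Replace the default 16 colours; on VGA, the EGA palette
	// registers ultimately index into the 256-entry 18-bit DAC
	for (uint8_t i = 0; i < palette.size(); ++i) {
		set_dac_entry(i, palette[i]);
	}
}
```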

200-line EGA modes

The most important difference between using EGA and VGA shaders for these 200-line games is that the VGA shaders are double-scanned, resulting in a very different-looking picture (big chunky pixels, as you'd expect from low-res VGA games). The single-scanned EGA look resembles a really sharp Commodore monitor (which never existed 😅) and is thus completely inauthentic. It can be a cool effect, though, but that's what the "fantasy" crt-arcade shader is for.

In 99% of cases, programs that repurpose EGA modes with VGA colours use the low-res 320x200 0Dh EGA mode. I know of a single obscure game, Rusty, that does it with the 640x200 0Eh EGA mode, and only in-game. This game is handled correctly by the detection logic and gets double-scanned with a VGA shader too (see the sketch below).
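To make the rule concrete, the decision boils down to something like this. An illustrative sketch only, not the actual DOSBox Staging code; all names are made up:

```cpp
enum class CrtShader { Ega, Vga };

// 350-line EGA modes are single-scanned on both EGA and VGA, so the
// EGA shader is always fine there. 200-line EGA modes (0Dh, 0Eh) with
// a reprogrammed 18-bit VGA palette need a VGA card, so they get the
// double-scanned VGA shader.
CrtShader pick_crt_shader(const int num_lines, const bool has_canonical_ega_palette)
{
	if (num_lines == 350) {
		return CrtShader::Ega;
	}
	return has_canonical_ega_palette ? CrtShader::Ega : CrtShader::Vga;
}
```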

350-line EGA modes

I'm not bothering with the 640x350 10h EGA mode, as it is single-scanned on both EGA and VGA, so there's minimal difference between the EGA and VGA CRT-shaded output. For 640x350 EGA modes we continue to always pick EGA shaders.

I know of only two games that use the 640x350 mode with VGA colours: Darkseed (the whole game) and Daughter of Serpents (only the character creation screen). Of the two, Darkseed is the really important one, so we're effectively talking about a single game, and it looks completely fine with EGA shaders.

The main reason is that I couldn't work out the VGA palette detection for 350-line modes, but as it turns out, this luckily affects only Darkseed and would make very little difference anyway.

cga_colors

Naturally, the palette detection is compatible with any custom CGA palette configured via cga_colors (built-in or user-provided). JustWorks(tm), just like you'd expect! 😎
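Conceptually, the check just compares the 16 active DAC entries against the expected colours derived from the active cga_colors setting. Another made-up sketch, assuming both sides are available as 18-bit values:

```cpp
#include <algorithm>
#include <array>
#include <cstdint>

struct Rgb666 {
	uint8_t r = 0, g = 0, b = 0; // 6-bit components (0-63)
};

constexpr bool operator==(const Rgb666& lhs, const Rgb666& rhs)
{
	return lhs.r == rhs.r && lhs.g == rhs.g && lhs.b == rhs.b;
}

using Palette16 = std::array<Rgb666, 16>;

// True if the program left the palette at the canonical
// (cga_colors-derived) values; false if it programmed custom
// 18-bit VGA colours
bool is_canonical_ega_palette(const Palette16& current, const Palette16& expected)
{
	return std::equal(current.begin(), current.end(), expected.begin());
}
```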

Example

Gods – double-scanned VGA shader, correct

gods-vga

Gods – single-scanned EGA shader, incorrect

gods-ega

@johnnovak force-pushed the jn/cga-ega-on-vga-detection branch 8 times, most recently from 650c54d to 5139724 on September 2, 2023 12:13
@johnnovak johnnovak self-assigned this Sep 2, 2023
@johnnovak added the video, enhancement, and shaders labels Sep 2, 2023
@johnnovak (Member, Author) commented Sep 3, 2023

Test cases

Testing graphics is always fun, especially edge-case-y stuff like this, for a person with mild OCD tendencies. Like me 😎 (I'd say I'm a solid 5 on a 1-to-10 OCD scale 😅)

Note that some games look like EGA games, but many of them are these weird low-effort "VGA conversions". They slapped a 256-colour start screen onto the VGA release, then reused the exact same 16-colour graphics in-game in the 320x200 256-colour 13h VGA mode so they could proudly put the "now with VGA support!" badge on the box... 🤦🏻 Windwalker is a prime example: the VGA intro doesn't look any better than the EGA version (I'd say it's worse with those ghastly gradients), and the in-game graphics are 100% the same EGA art. Yeah, there are many games like that.

Naturally, we cannot and should not treat those as EGA games, but it's quite confusing. Some other games tweak the default EGA palette in quite insignificant ways when run on a VGA card, e.g., they just change the purple colour to a skin tone (Might and Magic 2 does that, and only in-game; the people on the start screen and the hand in the intro are still purple... talk about low effort!). These "technically VGA games" pretty much still look like EGA games to the untrained eye, and in fact, the graphics are still effectively 90%+ EGA!

You wouldn't believe how many games do this low-effort "VGA conversion" thing, and technically that makes them "VGA games" from the point of view of the detection algo. People who bought them back in the day, thinking they'd get some nice VGA game... well, at least they got a double-scanned EGA game on their VGA monitor with minor palette tweaks, so we're emulating their experience rather accurately, including the confusion and the disappointment 😎

Anyway, these are the cards we're dealt 🤷 (these puns will never end...)

320x200 0Dh EGA mode with default colours

These games use the default 16-colour CGA palette, so single-scanned EGA shaders are selected for them. This is basically regression testing to make sure true EGA games are not misdetected as VGA games.

To be clear, all these games should appear single-scanned, as they would on an EGA monitor at the time of release.

Success rate is 100% for all these games ✅

  • 2400 A.D.
  • Arachnophobia
  • Bard's Tale 1, The - Tales Of The Unknown
  • Bard's Tale 2, The - The Destiny Knight
  • Bard's Tale 3, The - Thief Of Fate
  • Bard's Tale I
  • California Games
  • Castle Master
  • Catacomb 3D
  • Champions of Krynn
  • Chip's Challenge
  • Commander Keen - Keen Dreams
  • Commander Keen 1 - Marooned on Mars
  • Commander Keen 2 - The Earth Explodes
  • Commander Keen 3 - Keen Must Die!
  • Commander Keen 4 - Secret of the Oracle
  • Cosmo's Cosmic Adventure - Forbidden Planet
  • Death Knights of Krynn
  • Defender of the Crown
  • Dragon Wars
  • Drakkhen
  • Duke Nukem - Episode 1 - Shrapnel City
  • Eye of the Beholder I
  • Fountain of Dreams
  • It Came From The Desert
  • King's Bounty
  • Leisure Suit Larry 2
  • Out of This World (EGA mode)
  • Pool of Radiance
  • Quest for Glory I
  • Quest for Glory II
  • Rings of Medusa
  • Space Quest 3
  • Street Rod
  • Total Eclipse
  • Ultima IV - Quest of the Avatar
  • Ultima V - Warriors of Destiny
  • Wasteland
  • Wizardry VI - Bane of the Cosmic Forge
  • Zak McKracken and the Alien Mindbenders (Enhanced)

Dragon Wars

dragon-wars

Street Rod

street-rod

Champions of Krynn

krynn

320x200 0Dh EGA mode with VGA colours

These games set up 16 custom 18-bit VGA colours. They require a VGA card, therefore they must appear double-scanned.

Most of these are Amiga conversions, and ironically, single-scanned is exactly how their originals looked on the Amiga. But that's what the crt-arcade adaptive shader is for, exactly for these games!

100% success rate ✅

  • Blues Brothers
  • Cadaver
  • Cadaver - The Payoff
  • Duke Nukem II
  • Gods
  • Heimdall
  • Immortal, The
  • Knightmare
  • Magic Pockets
  • Metal Mutant
  • Might and Magic II (on VGA it redefines the skin tone colour, in-game only)
  • Out of this World (Another World)
  • Prehistorik 2
  • Rastan
  • Rick Dangerous 2
  • Speedball 2
  • Starblade
  • Xenon 2 - Megablast
  • Zool

Cadaver

cadaver

Duke Nukem II

duke-nukem-ii

Gods

gods-vga

Heimdall

heimdall

Metal Mutant

metal-mutant

Out of This World (Another World)

out-of-this-world

Speedball 2

speedball-2

640x200 0Eh EGA mode with VGA colours

These games set up 16 custom 18-bit VGA colours. They require VGA cards, therefore double-scanned VGA shaders are in order.

100% success rate ✅ (yes, that's 1 out of 1! 😎 )

  • Rusty (in-game only)

Double-scanned, correct (VGA shader)

rusty-vga

Single-scanned, incorrect (EGA shader)

rusty-ega

640x350 10h EGA mode with default colours

These games pick 16 colours from the standard 64-colour EGA palette, so EGA shaders are selected for them (single-scanned, but it makes no difference as 350-line modes are always single-scanned on VGA anyway).
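For reference, each of those 64 EGA colours is a 6-bit rgbRGB palette-register value, and expanding one to 18-bit RGB works roughly like this. A sketch assuming the common 2-bit to 6-bit scaling; real DACs may round slightly differently:

```cpp
#include <cstdint>

struct Rgb666 {
	uint8_t r, g, b; // 6-bit components (0-63)
};

// Bit layout of an EGA palette register value: bits 2-0 are the
// primary (high-intensity) R/G/B halves, bits 5-3 the secondary
// (low-intensity) r/g/b halves
constexpr Rgb666 ega_colour_to_rgb666(const uint8_t val)
{
	const auto expand = [](const uint8_t high, const uint8_t low) {
		// Combine into a 2-bit channel value, then scale 0-3 to 0-63
		const auto two_bit = static_cast<uint8_t>((high << 1) | low);
		return static_cast<uint8_t>(two_bit * 21); // 0, 21, 42, 63
	};
	return {expand((val >> 2) & 1, (val >> 5) & 1),
	        expand((val >> 1) & 1, (val >> 4) & 1),
	        expand((val >> 0) & 1, (val >> 3) & 1)};
}
```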

100% success rate ✅

  • Chuck Yeager's Advanced Flight Trainer 2.0
  • Corruption
  • EGA Trek
  • Fish!
  • Gateway
  • Jet
  • Jinxter
  • Pawn, The
  • Sierra's 3-D Helicopter Simulator
  • SimAnt
  • SimCity
  • SimFarm
  • Spellcasting 101
  • Spellcasting 201
  • Sub Battle Simulator The Masters Collection
  • Timequest
  • UFO

Spellcasting 201

spellcasting201-1

spellcasting201-2

Jinxter

jinxter

SimCity

simcity

640x350 EGA mode with custom VGA palette

These games set up 16 custom 18-bit VGA colours. We just use EGA shaders for these, as the difference compared to VGA shaders would be small in this mode (because of single scanning). I can't be bothered to figure out the detection for these modes; it's not straightforward, and the reward is quite low...

100% success rate, according to the above simplified criteria ✅ (all 2! 😅 )

  • Darkseed
  • Daughter of Serpents (character creation screen only)

Darkseed

darkseed1

darkseed2

@johnnovak (Member, Author) commented Sep 3, 2023

Confusing "enhanced" VGA versions

Might and Magic 2 - Gates to Another World

On VGA adapters, you can run the game with loadfix mm2 M to get EGA graphics with a minimally enhanced palette in the 320x200 256-colour 13h VGA mode, or with loadfix mm2 E to run it in pure EGA mode.

Character creation – EGA

Zero difference between the two modes in terms of colours; skin tones are always purple.

mm2-char-ega

Character creation – VGA

mm2-char-vga

In-game – EGA

The purple used for skin tones in the EGA version is replaced with a proper skin tone colour in VGA mode. That's the only difference! I hope the team did not have to work overtime to create the "enhanced VGA version" 😛

mm2-ingame-ega

In-game – VGA

mm2-ingame-vga

Windwalker

EGA

This is how the game looks with machine = ega:

windwalker-ega

VGA

On a VGA adapter, the intro screens are a bit different because they use a 256-colour palette, but the actual game uses the same EGA graphics.

But there's lots of weirdness here:

  • Running the game with WIND.EXE or WIND.EXE EGA produces identical-looking output, but the former uses mode 13h (VGA) and the latter 0Dh (EGA).
  • Both seem to use the standard EGA palette, but there must be some slight change in EGA mode on VGA, as it triggers double-scanning.
  • Note that the brown colour is mapped to pink in the machine = ega output, but that seems to be the standard EGA brown! So something super weird is going on here... You don't get the canonical palette with machine = ega because brown is mapped to pink, but you do get it on VGA... when you run it in EGA mode... except it's still slightly different from the EGA palette. Wat???!?

windwalker-vga

It Came From the Desert

Run DESERT /I to reconfigure the game.

Title screen – VGA

This might look like the bog-standard EGA palette... but it's actually not. Compare it to the pure EGA version to see the difference; they tweaked the colours a bit but used the exact same graphics otherwise.

desert-title-vga

Title screen – EGA

desert-title-ega

In-game – VGA

In-game, they made the dark blue a bit darker and called it a day. VGA version, yay! Put the badge on the box, let the boss know, then let's head to the pub! 😅 🍻

desert-game-vga

In-game – EGA

desert-game-ega

@johnnovak marked this pull request as ready for review September 3, 2023 07:58
@johnnovak changed the title from "DON'T REVIEW" to "crt-auto should auto-switch to VGA shaders in EGA modes with 18-bit VGA palettes" Sep 3, 2023
@johnnovak (Member, Author) commented Sep 3, 2023

PVS Studio wants me to pass unions that wrap uint8_ts by reference 🥶 I refuse to do that, that's crazy...

[screenshot of the PVS Studio warning]

@FeralChild64 (Collaborator) commented:

> PVS Studio wants me to pass unions that wrap uint8_ts by reference 🥶 I refuse to do that, that's crazy...

Always happens with the bit_view style unions. I'm afraid at some point we will get overwhelmed with such warnings…

@johnnovak (Member, Author) commented Sep 3, 2023

> > PVS Studio wants me to pass unions that wrap uint8_ts by reference 🥶 I refuse to do that, that's crazy...
>
> Always happens with the bit_view style unions. I'm afraid at some point we will get overwhelmed with such warnings…

Yeah, I'd turn this warning off then. Bitviews are super useful, and I can see them being used more and more going forward, and yes, also as function arguments. No way I'm gonna pass a 64-bit pointer (reference) instead of a byte...
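For context, the unions in question look roughly like this. Member names are made up; bit_view comes from the project's bit_view.h, assuming its <start_bit, width> interface:

```cpp
#include <cstdint>

#include "bit_view.h"

union AttributeByte {
	uint8_t data = 0;
	bit_view<0, 4> palette_index; // bits 0-3
	bit_view<4, 1> is_blinking;   // bit 4
};

// Passing by value is deliberate: the whole union is a single byte,
// whereas a reference is a pointer (8 bytes on x86-64) under the hood
void apply_attribute(const AttributeByte attr);
```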

@kcgen (Member) commented Sep 3, 2023

Yup, no harm in disabling this specific V-number.

@johnnovak , in the pvs-studio CI yaml, add V801 to the list:

disable_warnings="V002,V1042,V826,V802,V2008,V1071"
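With V801 appended, that line would read:

disable_warnings="V002,V1042,V826,V802,V2008,V1071,V801"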

Should bring down the count by a significant number.

My guess is it's seeing the enum class objects as real classes at the object/symbol level (and with strong type guarantees), and so probably has a hard time differentiating.

Whereas the C-enums are just integers, so those appear as scalars.

Our coding style passes real class objects as const-ref: Function(const ClassType& class_instance), so we just have to catch pass-by-value at review time (although I think we're all pretty good about that).

@weirddan455 (Collaborator) commented:

Looks fine as far as I can tell, but those vga* files look like black magic to me 😆

The bulk of the changes look like refactors, which seem to help readability. Other than that, nice job improving the shader auto-detection for people like me who wouldn't know the right one to choose 👍

@johnnovak (Member, Author) commented Sep 6, 2023

Ok @kcgen, a forward declaration in int10.h has obviated the need for including rgb666.h, but I've added some industrial-strength check_casting to the RGB helpers for good measure 😎 That will do for now.
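For the curious, a checked narrowing cast is essentially this. A sketch only; the project's actual check_cast helper may differ in the details:

```cpp
#include <cassert>
#include <cstdint>

template <typename To, typename From>
constexpr To checked_cast_sketch(const From value)
{
	// In debug builds, assert the value survives the round-trip
	// through the narrower type unchanged
	assert(static_cast<From>(static_cast<To>(value)) == value);
	return static_cast<To>(value);
}

// e.g. scaling a 6-bit DAC component (0-63) up to 8-bit (0-255);
// an out-of-range input would trip the assert instead of silently
// wrapping around
constexpr uint8_t rgb6_to_rgb8(const uint8_t c6)
{
	return checked_cast_sketch<uint8_t>((c6 * 255 + 31) / 63);
}
```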

@kcgen (Member) commented Sep 6, 2023

> Related read @kcgen, written by a Clang / LLVM contributor 😄
>
> C Is Not a Low-level Language: Your computer is not a fast PDP-11.
> https://queue.acm.org/detail.cfm?id=3212479

(not sure I need to read it 😅 I worked with guys who wrote machine code on mainframes... and trust me, they told me I was basically spoiled for having a C compiler that wrote the machine code for me: write once, compile many! C and then C++ was a universal high-level language that finally let people stop writing actual machine code... compilers were truly a marvel!)

@johnnovak (Member, Author) commented Sep 6, 2023

> (not sure I need to read it 😅 I worked with guys who wrote machine code on mainframes... and trust me, they told me I was basically spoiled for having a C compiler that wrote the machine code for me: write once, compile many! C and then C++ was a universal high-level language that finally let people stop writing actual machine code... compilers were truly a marvel!)

Up to you, but it's not about that 😄 The TL;DR is that back in the day C was arguably a cross-platform assembler, but these days it's so far removed from the actual hardware with optimising compilers that it's basically quite a bad abstraction of the underlying machine. Kind of "worst of both worlds": neither high-level enough nor low-level enough, just archaic. The same goes for C++.

The important takeaway for me is that the "illusion of control" C gives you just gets in the way these days; it would be far better to describe "intent" using much higher-level abstractions and then let the optimising compiler do its job properly. There is an interesting analogy with Fortran, and that's one of the reasons C never reached Fortran speeds: with all that micro-management of "illusory low-levelness" you just make the optimising compiler's job a lot harder.

Anyway, I found it interesting, and integer promotion in C makes me cry every single time I run into it...

@johnnovak (Member, Author) commented:

Can I get the tick please, @kcgen?

@kcgen (Member) commented Sep 6, 2023

> Up to you, but it's not about that 😄 The TL;DR is that back in the day C was arguably a cross-platform assembler, but these days it's so far removed from the actual hardware with optimising compilers that it's basically quite a bad abstraction of the underlying machine. Kind of "worst of both worlds": neither high-level enough nor low-level enough, just archaic. The same goes for C++.
>
> The important takeaway for me is that the "illusion of control" C gives you just gets in the way these days; it would be far better to describe "intent" using much higher-level abstractions and then let the optimising compiler do its job properly. There is an interesting analogy with Fortran, and that's one of the reasons C never reached Fortran speeds: with all that micro-management of "illusory low-levelness" you just make the optimising compiler's job a lot harder.
>
> Anyway, I found it interesting, and integer promotion in C makes me cry every single time I run into it...

Ok, read (most of) it.

If we went back in time to when C was authored and those memory layout guarantees were put in place... would the article still be true? I don't think so. Surely those guarantees were trivial and mapped 1:1 to mainframes and their non-parallelized CPUs' inner workings.

I agree: I'd also lament that C's old memory state model isn't a good fit for instruction-hungry processors that are digesting hundreds (!) of adjacent instructions all in parallel (speculatively, even!) to maximize performance.

But is that concern due to the circa 1970's C language spec? It was a perfect fit at the time. To me, it's the C language committee's fault for not evolving the language to suit our security needs and the evolving hardware architectures and how they want to behave to get maximum performance.

The C language committee has too weak a backbone to strongly deprecate older features and firmly mandate new ones through a language-spec "firewall" that guarantees one or the other.

It should be like Python: there's 2.x and 3.x; you can't use the inferior 2.x language features in 3.x, full stop. But if the C language committee were managing Python, it would be a big ball of backward-compatible mud!

He might have been careful to only bash C and not C++, though, because C++ is trying hard to free developers from these issues by moving to code-by-intent, which is the purpose of <algorithm>: it lets optimizing compilers "achieve the requested action" using any means possible, in any order (or in parallel at the SIMD or thread level).

There have been lots of attempts at this, like Intel's SIMD and parallel pragmas, and hybrid C languages like ISPC.

So it's super exciting that C++ is finally at this state, and it will get even more of it; perhaps even GPU/compute offload.

Fully agree though: people just need to stop using it altogether, especially the old C language constructs, and move to <algorithm> and the other nice new stuff!
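A tiny example of that code-by-intent style, in case anyone wants to play with it (C++17 parallel algorithms; with GCC/libstdc++ this typically needs linking against TBB):

```cpp
#include <algorithm>
#include <execution>
#include <vector>

// Describe *what* to do and let the implementation decide *how*:
// the execution policy allows vectorised and/or multi-threaded
// execution in whatever order the hardware likes
std::vector<float> scale_all(std::vector<float> values, const float factor)
{
	std::transform(std::execution::par_unseq, values.begin(),
	               values.end(), values.begin(),
	               [factor](const float v) { return v * factor; });
	return values;
}
```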

@kcgen kcgen self-requested a review September 6, 2023 08:14
@johnnovak (Member, Author) commented Sep 6, 2023

> But is that concern due to the circa-1970s C language spec? It was a perfect fit at the time. To me, it's the C language committee's fault for not evolving the language to suit our security needs and the evolving hardware architectures. The C language committee has too weak a backbone to strongly deprecate older features and firmly mandate new ones through a language-spec "firewall" that guarantees one or the other.

Oh yeah, in a way it's a victim of its own success; machines kept evolving, but the C language not so much, to the point that CPU manufacturers actually started to design their CPUs around the vast amounts of existing compiled C code. Which is a bit comical, but I guess it was the practical choice.

I guess what I find frustrating is when mankind collectively gets stuck with something that's just "good enough". But that's my problem, I know 😄

> It should be like Python: there's 2.x and 3.x; you can't use the inferior 2.x language features in 3.x, full stop. But if the C language committee were managing Python, it would be a big ball of backward-compatible mud!

I'm quite certain of that 😄

Ultimately, we have the benefit of hindsight as armchair language critics (like me 😎). Plus maybe it's better to be blissfully unaware of how the sausage is made... 😛

@johnnovak (Member, Author) commented:

Hey, and thanks for the review, @kcgen 😄

@weirddan455 (Collaborator) commented:

There's also the fact that x86 assembly language is arguably not low-level anymore. The "registers" don't map directly to hardware registers anymore; they're more like variables to the CPU. All the instructions get broken down into micro-ops and then executed in a completely different order than you typed them. Even if C does map neatly to assembly, it doesn't necessarily map to the hardware.

I do wish they would add better SIMD support to the language though.

@kcgen (Member) left a review:

These VGA overhauls are making a marked difference in readability, @johnnovak!

Just a couple of minor comments; merge away when ready!

Review threads (all resolved):

  • .github/workflows/pvs-studio.yml
  • include/rgb666.h (outdated)
  • include/rgb888.h (outdated)
  • include/vga.h
  • src/hardware/vga_attr.cpp (two threads)
@johnnovak (Member, Author) commented Sep 6, 2023

> There's also the fact that x86 assembly language is arguably not low-level anymore. The "registers" don't map directly to hardware registers anymore; they're more like variables to the CPU. All the instructions get broken down into micro-ops and then executed in a completely different order than you typed them. Even if C does map neatly to assembly, it doesn't necessarily map to the hardware.
>
> I do wish they would add better SIMD support to the language though.

100%. When even the x86 instruction set is a facade and the underlying microcode is something completely different and alien, it becomes a bit of a joke to try to "hand-optimise" code. Sure, sometimes you can achieve modest benefits and outsmart the compiler, but the future is machine-driven optimisation. I guess that was my "point", if I had any 😄

That's part of the charm of retro-coding on old 8-bit or 16/32-bit processors for me. Even a 486 is quite old-school compared to what we have now; I was counting instruction latencies when writing x86 asm, and it was an easy-to-understand model. Even more so on the Motorola 68k, MOS 6502, or the Z80: there is no pipelining, no completely ridiculous L2/L3/.../L100 caches, just serial execution. There's a beauty in that, a warm fuzzy feeling that you as a single person can hold the workings of the machine fully in your head. I doubt there are more than, say, 10 people alive today who 100% understand what happens when you execute a program on a modern CPU monster, down to the lowest levels...

...but try encoding H.265 on a Motorola 68k or viewing a JPEG on a C64. You can't win at everything. 😎

@kcgen (Member) commented Sep 6, 2023

> I guess what I find frustrating is when mankind collectively gets stuck with something that's just "good enough". But that's my problem, I know 😄

Lol... yeah... ever since it was formed, the steering committee's revolving door of sponsors and corporate members have had millions (and now billions?) of LOC of old C and C++ between them.

A language firewall like Python 3's would strand most of their products and require massive rewrite efforts... so they've (always) been 100% "backward-compatible friendly". 🤦 To all of our detriment!

The good news is the NSA has declared C and C++ unsafe, and it recommends that people stop using them.

https://www.nsa.gov/Press-Room/News-Highlights/Article/Article/3215760/nsa-releases-guidance-on-how-to-protect-against-software-memory-safety-issues/

There's no wiggle room here. New "big-gov" contracts for sensitive materiel are simply going to block C and C++.

The only way to get around that wording is to release a new language: C++2. We'll finally get that strong C++ language firewall!

@johnnovak merged commit 712deac into main Sep 6, 2023 (50 checks passed)

@johnnovak deleted the jn/cga-ega-on-vga-detection branch October 24, 2023