crt-auto
should auto-switch to VGA shaders in EGA modes with 18-bit VGA palettes
#2819
Conversation
Test cases

Testing graphics is always fun, especially edge-case-y stuff like this for a person with mild OCD tendencies. Like me 😎 (I'd say I'm a solid 5 on a 1 to 10 OCD scale 😅)

Note that some games look like EGA games, but many of them are these weird low-effort "VGA conversions". They slapped a 256-colour start screen onto the VGA release, then reused the exact same 16-colour graphics in-game in the 320x200 VGA 13h mode so they could proudly put the "now with VGA support!" badge on the box... 🤦🏻 Windwalker is a prime example; the VGA intro doesn't look any better than the EGA version (I'd say it's worse with those ghastly gradients), and the in-game graphics are 100% the same EGA art. Yeah, there are many games like that. Naturally, we cannot and should not treat those as EGA games, but it's quite confusing.

Some other games tweak the default EGA palette in quite insignificant ways when you run them on a VGA card, e.g., they just change the purple colour to a skin tone (Might and Magic 2 does that, and only in-game; people in the start screen and the hand in the intro are still purple... talk about low-effort!). These "technically VGA games" pretty much still look like EGA games to the untrained eye, and in fact, the graphics are still effectively 90%+ EGA!

You won't believe how many games do this low-effort "VGA conversion" thing, and technically that makes them "VGA games" from the point of view of the detection algo. People who bought them back in the day, thinking they'd get some nice VGA game... well, at least they got a double-scanned EGA game on their VGA monitor with minor palette tweaks, so we're emulating their experience rather accurately—including their confusion and the disappointment 😎

Anyway, these are the cards we're dealt 🤷 (these puns will never end...)

320x200 0Dh EGA mode with default colours

These games use the default 16-colour CGA palette, so single-scanned EGA shaders are selected for them.
This is basically regression testing to make sure true EGA games are not misdetected as VGA games. To be clear, all these games should appear single-scanned, as they would on an EGA monitor at the time of release. Success rate is 100% for all these games ✅
Dragon Wars
Street Rod
Champions of Krynn

320x200 0Dh EGA mode with VGA colours

These games set up 16 custom 18-bit VGA colours. They require a VGA card, therefore they must appear double-scanned. Most of these are Amiga conversions, and ironically, displaying them single-scanned is how their originals looked on the Amiga. But that's another story. 100% success rate ✅
Cadaver
Duke Nukem II
Gods
Heimdall
Metal Mutant
Out of This World (Another World)
Speedball 2

640x200 0Eh EGA mode with VGA colours

These games set up 16 custom 18-bit VGA colours. They require VGA cards, therefore double-scanned VGA shaders are in order. 100% success rate ✅ (yes, that's 1 out of 1! 😎)
Double-scanned, correct (VGA shader)
Single-scanned, incorrect (EGA shader)

640x350 10h EGA mode with default colours

These games pick 16 colours from the standard 64-colour EGA palette, hence EGA shaders will be picked (single-scanned, but it makes no difference as 350-line modes are always single-scanned on VGA anyway). 100% success rate ✅
Spellcasting 201
Jinxter
SimCity

640x350 EGA mode with custom VGA palette

These games set up 16 custom 18-bit VGA colours. We just use EGA shaders for these, as the difference would be small compared to VGA shaders in this mode (because of single scanning). I can't be bothered to figure out the detection for these modes; it's not straightforward, and the reward is quite low... 100% success rate, according to the above simplified criteria ✅ (all 2 of them! 😅)

Darkseed
Confusing "enhanced" VGA versions

Might and Magic 2 - Gates to Another World

On VGA adapters, you can run the game in VGA mode.

Character creation – EGA

Zero difference between the two modes in terms of colours; skin tones are always purple.

Character creation – VGA

In-game – EGA

The purple colour used for skin tones in the EGA version has a proper skin tone colour in VGA mode. That's the only difference! I hope the team did not have to work overtime to create the "enhanced VGA version" 😛

In-game – VGA

Windwalker

EGA

This is how the game looks with VGA

On a VGA adapter, the intro screens are a bit different because they use a 256-colour palette, but the actual game uses the same EGA graphics. But there's lots of weirdness here.
It Came From the Desert

Title screen – VGA

This might look like the bog-standard EGA palette... but it's actually not. Compare it to the pure EGA version to see the difference; they tweaked the colours a bit but used the exact same graphics otherwise.

Title screen – EGA

In-game – VGA

In-game, they made the dark blue a bit darker and called it a day. VGA version, yay! Put the badge on the box, let the boss know, then let's head to the pub! 😅 🍻

In-game – EGA
Always happens with the bit_view style unions. I'm afraid at some point we will get overwhelmed with such warnings…
Yeah, I'd turn this warning off then. Bitviews are super useful, and I can see them being used more and more going forward, and yes, also as function arguments. No way I'm gonna pass a 64-bit pointer (reference) instead of a byte...
Yup, no harm in disabling this specific V-number. @johnnovak , in the pvs-studio CI yaml, add V801 to the list:
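For illustration, a hedged sketch of what that could look like (the workflow layout, step names, and report filenames below are assumptions, not this repo's actual CI config; only the V801 diagnostic code itself comes from the discussion). PVS-Studio's report post-processor, plog-converter, takes a comma-separated list of diagnostics to disable via its `-d` option:

```yaml
# Hypothetical excerpt of a PVS-Studio GitHub Actions workflow;
# the real file in this repo may be structured differently.
jobs:
  pvs-studio:
    steps:
      - name: Convert analysis report
        run: |
          # plog-converter's -d flag disables a comma-separated list of
          # diagnostics; append V801 to whatever is already suppressed.
          plog-converter -a GA:1,2 -d V801 -t errorfile \
              pvs-report.log -o pvs-report.err
```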
Should bring down the count by a significant number. My guess is it's seeing the bit_views as class objects, whereas the C-enums are just integers, so those appear as scalars. Our coding style passes real class objects as const-ref.
Looks fine as far as I can tell, but those vga* files look like black magic to me 😆 The bulk of the changes look like refactors which seem to help readability. Other than that, nice job improving shader auto-detection for people like me who wouldn't know the right one to choose 👍
Ok @kcgen, a forward declaration in
(not sure I need to read it 😅 I worked with guys who wrote machine code on mainframes.. and trust me, they told me I was basically spoiled for having a C compiler that wrote the machine code for me: write once, compile many! C and then C++ was a universal high-level language that finally let people stop writing actual machine code.. compilers were truly a marvel!)
Up to you, but it's not about that 😄 The TL;DR is that back in the day C was arguably a cross-platform assembler, but these days it's so far removed from the actual hardware with optimising compilers that it's basically a quite bad abstraction of the underlying machine. Kind of "worst of both worlds"; neither high-level enough nor low-level enough, just archaic. Same goes for C++. The important takeaway for me is that the "illusion of control" of C just gets in the way these days; it would be far better to just describe "intent" using much higher-level abstractions, then let the optimising compiler do its job properly. There is an interesting analogy with Fortran, and that's one of the reasons why C never reached Fortran speeds; with all that micro-management of "illusory low-levelness" you just make the job of the optimising compiler a lot harder. Anyway, I found it interesting, and integer promotion in C makes me cry every single time I run into it...
Can I get the tick please, @kcgen?
Ok - read (most) of it. If we went back in time to when C was authored and those memory layout guarantees were put in place.. would the article still be true? I don't think so. Surely those guarantees were trivial and mapped 1:1 to mainframes and their non-parallelized CPUs' inner workings. I agree: I'd also lament that C's old memory state model isn't a good fit for instruction-hungry processors that are digesting (hundreds!) of adjacent instructions all in parallel (speculatively trying!) to maximize performance. But is that concern due to the circa-1970s C language spec? It was a perfect fit at the time. To me, it's the C language committee's fault for not evolving the language to suit our security needs and the evolving hardware architectures and how they want to behave to get maximum performance. The C language committee has too weak a backbone to strongly deprecate older features and firmly mandate new ones through a language-spec "firewall" that guarantees one or the other. It should be like Python: there's 2.x and 3.x: you can't use the inferior 2.x language features in 3.x, full stop. But if the C language committee was managing Python, it would be a big ball of backward-compatible mud! He might have been careful to only bash C and not C++ though, because C++ is trying hard to free developers from these issues by moving to code-by-intent. There have been lots of attempts at this, like Intel's SIMD and parallel pragmas, and hybrid C languages like ISPC. So this is super exciting that C++ is finally at this state - and will get even more of it; perhaps even GPU/compute offload. Fully agree though - people just need to stop using it altogether, especially the old C language constructs, and move to modern C++.
Oh yeah, in a way it's a victim of its own success; machines kept evolving, but the C language not so much—to the point that CPU manufacturers actually started to need to design their CPUs around the vast amounts of existing compiled C code. Which is a bit comical, but I guess yeah, the practical choice. I guess what I find frustrating is when mankind collectively gets stuck with something that's just "good enough". But that's my problem, I know 😄
I'm quite certain of that 😄 Ultimately, we have the benefit of hindsight as armchair language critics (like me 😎). Plus maybe it's better to be blissfully unaware of how the sausage is made... 😛
Hey, and thanks for the review @kcgen 😄
There's also the fact that x86 assembly language is arguably not low-level anymore. The "registers" don't map directly to hardware registers anymore; they're more like variables to the CPU. All the instructions get broken down into micro-ops and then executed in a completely different order than you typed them. Even if C does map neatly to assembly, it doesn't necessarily map to the hardware. I do wish they would add better SIMD support to the language, though.
These VGA overhauls are making marked difference in readability, @johnnovak!
Just a couple minor comments, merge away when ready!
100%; when even the x86 instruction set is a facade and the underlying microcode is something completely different and alien to it, it becomes a bit of a joke to try to "hand-optimise" code. Sure, sometimes you can achieve modest benefits and out-smart the compiler, but the future is machine-driven optimisation. I guess that was my "point", if I had any 😄 That's part of the charm of retro-coding for me on old 8-bit or 16/32-bit processors. Even a 486 is quite oldschool compared to what we have now; I was counting instruction latencies when writing x86 asm, and it was an easy-to-understand model. Even more so for the Motorola 68k, MOS 6502 or the Z80—there is no pipelining, no completely ridiculous L2/L3/.../L100 caches, just serial execution. There's a beauty in that, a warm fuzzy feeling that you as a single person can hold the workings of the machine fully in your head. I doubt there are more than, say, 10 people alive in the world today who 100% understand what happens when you execute a program on a modern CPU monster, down to the lowest levels... ...but try to encode H.265 on a Motorola 68k or view a JPEG on a C64. You can't win at everything. 😎
Lol.. yeah.. ever since it was formed, the steering committee's revolving door of sponsors and corporate members have had millions (and now billions?) of LOC of old C and C++ between all of them. A language firewall like Python 3 would strand most of their products and require massive re-write efforts.. so they've (always) been 100% "backward-compatible friendly". 🤦 To all of our detriment! The good news is the NSA has declared C and C++ unsafe - and recommends people stop using them. There's no wiggle room here. New "big-gov" contracts for sensitive materiel are simply going to block C and C++. The only way to get around that wording is to release a new language, C++2. We'll finally get that strong C++ language firewall!
Note: I will use parts of this writing in my upcoming mega-article about PC monitors. Hence I put quite some effort into it, but you can safely skip the details in the second post and only focus on this first post.
Overview
Demos and many DOS ports of Amiga action/platformer games "repurpose" the 320x200 0Dh EGA mode: instead of the default 16-colour palette, they reprogram the palette to use 18-bit VGA colours. The planar 320x200 16-colour 0Dh EGA mode is preferred by these programs because certain effects/operations are more efficient in planar modes than in the "chunky" (1 byte per pixel) 320x200 256-colour 13h mode (e.g., smooth-scrolling on low-end hardware, such as 286 and slow 386 machines equipped with VGA adapters; a popular low-cost option in the early 90s in Europe where most of these Amiga ports originated from).
These programs need a VGA card; in most cases, they don't even work on an EGA card, or they do but display wrong colours. Therefore, the authentic choice in these cases for the crt-auto adaptive CRT shader is to use a VGA shader.

200-line EGA modes
The most important difference between using EGA vs VGA shaders for these 200-line games is that the VGA shaders are double-scanned, resulting in a very different-looking picture (big chunky pixels, as is expected from low-res VGA games). The single-scanned EGA look is like a really sharp Commodore monitor (which never existed 😅) and is thus completely inauthentic. It can be a cool effect, though, but that's what the "fantasy" crt-arcade shader is for.

In 99% of cases, such programs that repurpose EGA modes with VGA colours use the 320x200 0Dh low-res EGA mode. I know of a single obscure game called Rusty that does it with the 640x200 0Eh EGA mode, and only in-game. This game is handled correctly by the detection logic and gets double-scanned with a VGA shader too.
350-line EGA modes
I'm not bothering with 640x350 10h EGA modes as those are single-scanned both on EGA and VGA, so there's minimal difference between the EGA and VGA CRT shaded output. For 640x350 EGA modes we continue to always pick EGA shaders.
I know only of two games that use the 640x350 mode with VGA colours: Darkseed (the whole game) and Daughter of Serpents (only the character creation screen). Of these two, Darkseed is the really important one, so we're really talking about a single game, and it looks completely fine with EGA shaders.
The main reason is that I couldn't work out the VGA palette detection for 350-line modes, but as it turns out, luckily this affects only Darkseed and would make very little difference.
cga_colors

Naturally, the palette detection is compatible with any custom CGA palette configured via cga_colors (built-in or user-provided). JustWorks(tm), just like you'd expect! 😎

Example
Gods – double-scanned VGA shader, correct
Gods – single-scanned EGA shader, incorrect