Implementation of hardware-based scaling #246

Closed
chocolate-import opened this Issue Dec 8, 2013 · 37 comments

The following bug was originally reported on Sourceforge by twipley, 2012-08-23 23:02:51:

fraggle has written: "Ideally what I ought to be doing is investigating how to do [scaling] using OpenGL," adding, some months later, "that's the route I want to go down."

Such an objective, indeed, has grown from fraggle's long-ripened reflection on a definite issue with modern monitors: while the original resolution is 320x200, the intended aspect ratio is 4:3, which leaves no room at all* for adequate integer-factor scaling.

As the mastermind notes, the solution best able to achieve both proper scaling and proper aspect-ratio output may well lie in a future OpenGL-based hardware scaler.

* outside of resorting to 1600x1200 monitors or the like.

Owner

fragglet commented Mar 10, 2014

First prototype of this is now on gl-branch.

Contributor

AXDOOMER commented Mar 10, 2014

I played on Linux. It glitched with two video cards but I got it to work with a third one. Looks great! The screen is now stretched to its maximum, which makes it a bit blurrier than software mode, but it still feels perfectly blocky. Nice job!

Owner

fragglet commented Mar 11, 2014

Thanks for the praise, but I'd be more appreciative of bug reports :)

You said it "glitched" on two cards: can you tell me what video cards those were, and what the nature of the "glitch" was?

You say it's blurrier when it's stretched to maximum; how noticeable is this? If you increase gl_max_scale in chocolate-doom.cfg (to 5 or 6 for example), does it improve?

Contributor

AXDOOMER commented Mar 15, 2014

I didn't want to file bug reports about this at first because it's only a prototype. There are no official builds, and when I tried to build it on Windows, it wouldn't compile. If you happen to build a Windows version successfully, then upload it somewhere and send me a link, that would be appreciated.

The extra blurriness is because OpenGL mode also stretches horizontally, whereas software scaling only stretches vertically. As long as gl_max_scale is over 2, I can't see any real difference (at 1920x1080), so 4 is the right default value. Even though it's still blocky, I get a different feeling when I play. When I said "the screen is now stretched to the maximum", I meant it's now fully stretched to 1080 vertically, whereas software scaling only scaled it to 1000, but with no blurred pixels.

Chocolate-Doom won't let me take screenshots using print screen; is there any way to fix this on Linux? I can't even take screenshots in windowed mode when the focus is on Choco.

twipley commented Mar 16, 2014

Wouldn't it be best for it to default to 1000 in this case?

Because then, once vertical integer scaling has reached its maximum value for the monitor resolution, horizontal scaling can take place to reach the intended ratio. In this case, the horizontal axis would scale to 1333. Would that help reduce the blurriness?

Does this layman's opinion make sense? :)

Owner

fragglet commented Mar 16, 2014

@twipley Sorry, that doesn't make sense. I think you might have misunderstood.

I'll give an explanation of how this works, to help you guys understand better.

Chocolate Doom's software scaling code is hand-written with a separate function for each screen size that is supported. The way it's implemented relies on particular scaling ratios that let it use pregenerated lookup tables for speed. That's why (for example) there's a mode that scales it to 1000 pixels high, but not 1080 pixels high. You can see the code here:

https://github.com/chocolate-doom/chocolate-doom/blob/master/src/i_scale.c
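(For illustration, here is a toy version of that lookup-table idea. This is not the actual i_scale.c code, just a sketch of the approach; the 1600x1000 target is only an example of a fixed size whose tables can be precomputed.)

```c
/* Toy sketch of the lookup-table approach, not the real i_scale.c:
 * for one fixed target size, the source row and column that each
 * destination pixel samples from are precomputed once, so the
 * per-frame work is just table lookups and copies. */
#define SRC_W 320
#define SRC_H 200
#define DST_W 1600          /* example fixed target size */
#define DST_H 1000

static int src_col[DST_W];
static int src_row[DST_H];

static void init_scale_tables(void)
{
    int x, y;

    for (x = 0; x < DST_W; ++x)
        src_col[x] = x * SRC_W / DST_W;
    for (y = 0; y < DST_H; ++y)
        src_row[y] = y * SRC_H / DST_H;
}

static void scale_frame(const unsigned char *src, unsigned char *dst)
{
    int x, y;

    for (y = 0; y < DST_H; ++y)
    {
        const unsigned char *srcline = src + src_row[y] * SRC_W;
        unsigned char *dstline = dst + y * DST_W;

        for (x = 0; x < DST_W; ++x)
            dstline[x] = srcline[src_col[x]];
    }
}
```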

The GL code tries to reproduce the behavior in hardware. If you do a normal OpenGL scale from 320x200 to e.g. 1440x1080, you'll get an interpolated "blurry" look that isn't what we want (blocky). So instead, Chocolate Doom renders "blockily" from the original 320x200 to an offscreen buffer that is an exact multiple of the original screen: for example, 960x600. Then it renders from that buffer to the screen using the interpolation. So the original 320x200 "pixels" should still be large and blocky, but the edges between them should be interpolated.
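(A rough sketch of that two-pass idea in fixed-function OpenGL is below. The names and structure are illustrative only, not the gl-branch code; the texture, framebuffer-object and context setup are assumed to have been done elsewhere, and aspect-ratio letterboxing is omitted.)

```c
/* Illustrative two-pass scale: blocky (nearest) into an offscreen
 * buffer that is an exact multiple of 320x200, then smooth (linear)
 * from that buffer to the real screen.  Assumes an SDL/OpenGL context
 * and the EXT_framebuffer_object entry points are already available;
 * texture orientation handling is omitted. */
#include <SDL_opengl.h>

#define ORIG_W 320
#define ORIG_H 200

static GLuint game_texture;      /* the 320x200 frame, uploaded as a texture */
static GLuint fbo, fbo_texture;  /* intermediate buffer, e.g. 960x600 */

static void draw_fullscreen_quad(void)
{
    glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex2f(-1, -1);
    glTexCoord2f(1, 0); glVertex2f( 1, -1);
    glTexCoord2f(1, 1); glVertex2f( 1,  1);
    glTexCoord2f(0, 1); glVertex2f(-1,  1);
    glEnd();
}

static void present_frame(int factor, int screen_w, int screen_h)
{
    glEnable(GL_TEXTURE_2D);

    /* Pass 1: nearest-neighbor scale into the offscreen buffer, so the
     * original pixels stay as big, sharp-edged blocks. */
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    glViewport(0, 0, ORIG_W * factor, ORIG_H * factor);
    glBindTexture(GL_TEXTURE_2D, game_texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    draw_fullscreen_quad();

    /* Pass 2: linear scale from the buffer to the screen, so only the
     * edges between those big pixels get interpolated. */
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
    glViewport(0, 0, screen_w, screen_h);
    glBindTexture(GL_TEXTURE_2D, fbo_texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    draw_fullscreen_quad();
}
```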

If you have the Gimp, you can get the same effect by playing around with its resizing feature: take a 320x200 Doom screenshot, convert to RGB, scale up to a large size (e.g. 1280x800) with interpolation "None", then scale down to a smaller size (e.g. 640x480) with interpolation "Linear". Should look similar to how Chocolate Doom's GL scaling code looks.

There can still be some blurriness. The larger the size of the intermediate buffer, the better the result is likely to look. For example, if you go from 320x200->640x400->1440x1080, the results aren't likely to look too good. You want something more like 320x200->1280x800->1440x1080 at least. That's what gl_max_scale controls. The larger that value goes, the larger the size of the intermediate buffer. A value of 2 means 640x400 maximum for example. A value of at least 4 is probably necessary for decent results.
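(Purely as a guess at the shape of it, not the actual gl-branch code, the cap could work something like this.)

```c
/* Hypothetical sketch of how gl_max_scale caps the intermediate
 * buffer: grow the integer factor until the buffer already covers the
 * window or the configured maximum is reached. */
static int choose_scale_factor(int window_w, int window_h, int gl_max_scale)
{
    int factor = 1;

    while (factor < gl_max_scale
           && (320 * factor < window_w || 200 * factor < window_h))
    {
        ++factor;
    }

    return factor;   /* e.g. gl_max_scale=4 gives at most a 1280x800 buffer */
}
```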

twipley commented Mar 17, 2014

Alright. Thanks for the explanation! I understand it much better now.

(I am still wondering what horizontal-only interpolation from a 1600x1000 intermediate buffer to a 1333x1000 screen output would look like, since that way integer scaling would still be in effect on one of the axes. Perhaps it would not be better, but I am still wondering -- hehe.)

Screenshot comparisons at, for example, 1440x1080 or 1333x1000, between the software-scaling module on the one hand and the hardware-scaling one on the other, would be welcome, just for the sake of catching a glimpse of the overall improvement being achieved.

I would be eager to test and lend a hand, but I most likely will not be able to do so before the end of April, so, sorry guys, until then I will mostly be taking the role of a passive observer in all of this. Although I suspect you guys are natural testers as well.

twipley commented Mar 23, 2014

For the record, I'll be testing this around the 29th of April. (I would then benefit from a prebuilt Windows executable, as I am more of an end user than anything else, hehe.) I'll post back! 💯

twipley commented Apr 7, 2014

I was wondering, as an option that would probably go well with this new scaling module, whether you think it would be appropriate to implement (some amount of) scanlines in order to match the original experience?

Owner

fragglet commented Apr 10, 2014

That is indeed something that has occurred to me.

Owner

fragglet commented Apr 10, 2014

Now implemented on gl-branch, though I don't see the appeal of this feature myself.

twipley commented Apr 10, 2014

Now implemented on gl-branch

I have downloaded the zip file over at https://github.com/chocolate-doom/chocolate-doom/tree/gl-branch , and followed the instructions over at http://www.chocolate-doom.org/wiki/index.php/Building_Chocolate_Doom_on_Windows to install Cygwin.

However, I am stuck there. I have tried dragging and dropping the autogen.sh file onto the bash window, and it says there is no "configure" folder. Creating one and moving the "configure.ac" file into it does not help either.

Following the commands listed in the linked-to page, I do not know how to specifically download the GL build, presuming "sh build-chocolate-doom -git" only deals with the master branch. I would need some help, here. :)

though I don't see the appeal of this feature myself.

No? Not even a philosophical one?

twipley commented Apr 10, 2014

Even running "sh build-chocolate-doom -git" from bash tells me I have to install "gcc." I believe I have installed every listed package (I checked every one of them twice.) I'm really eager to test, but would rather not be turned off by such technicalities. Would it be possible for you to (make an exception and) provide a link to a Windows build so I can test without such hurdles? It would really be appreciated.

Thanks in advance for the help, in whatever form it may come!

Owner

fragglet commented Apr 10, 2014

However, I am stuck there. I have tried dragging-and-dropping the autogen.sh file over to the bash window, and it says there is no "configure" folder. Creating such and moving in the "configure.ac" file does not help either.

It's a command line; dragging and dropping things isn't likely to do anything useful. I don't know what that will even do.

Even running "sh build-chocolate-doom -git" from bash tells me I have to install "gcc." I believe I have installed every listed package (I checked every one of them twice.)

That's weird. Try installing the Cygwin gcc package as well. It shouldn't be needed/used for the build, but it might get things going.

Following the commands listed in the linked-to page, I do not know how to specifically download the GL build, presuming "sh build-chocolate-doom -git" only deals with the master branch. I would need some help, here. :)

Sure. Run the script to build the master branch first. Then cd to the chocolate-doom-git directory and type "git checkout gl-branch". Run the build script again and it should build.

Would it be possible for you to (make an exception and) provide a link to a Windows build so I can test without such hurdles? It would really be appreciated.

Understood, but unfortunately I don't have any Windows builds to give you!

So far I've only tested this on Mac OS X and Linux.

Owner

fragglet commented Apr 10, 2014

Note to self: One problem with the current implementation is that it doesn't always fall back to software mode if the GL setup fails. For example on the Raspberry Pi it fails to start with the error "Couldn't find matching GLX visual" - probably because it couldn't set up a GL display mode.
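(Something along these lines, presumably -- the function names below are hypothetical placeholders, not the real ones, just a sketch of the intended fallback.)

```c
#include <stdio.h>

/* Hypothetical init functions -- placeholders for whatever the real
 * GL and software setup paths are called. */
int I_GL_InitGraphics(void);
void I_SW_InitGraphics(void);

void I_InitGraphics(void)
{
    /* Sketch of the fallback described above: if any part of the GL
     * setup fails, report it and drop back to the software scaler
     * instead of aborting with an error. */
    if (!I_GL_InitGraphics())
    {
        fprintf(stderr, "Failed to initialize in OpenGL mode. "
                        "Falling back to software mode instead.\n");
        I_SW_InitGraphics();
    }
}
```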

twipley commented Apr 11, 2014

Thanks fraggle for the hand. I have downloaded almost all the packages related to gcc, but it tells me "Your compiler (gcc) does not produce Win32 executables!" on running the build command ("sh build-chocolate-doom -git").

I am up for testing, but might have to wait until someone provides Windows builds (or until someone else who succeeds in building it guides me through the process). AXDOOMER is also in the same boat.

EDIT: building on Linux is also not working for me -- following the instructions, it halts on "make install" with:
make[3]: Nothing to be done for `install-exec-am'. make[3]: Nothing to be done for `install-data-am'.

Contributor

AXDOOMER commented Apr 11, 2014

I tried to make a Windows build using Code::Blocks and Visual Studio some weeks ago, but I encountered several errors, and they were different in each compiler. I'll try again this weekend and I hope I'll be able to make it work.

Owner

fragglet commented Apr 11, 2014

Sorry to hear you're still having problems with the Windows build. I'll see if I can maybe do a cross-compile this weekend.

EDIT: building on Linux is also not working for me -- following the instructions, it halts on "make install" with:
make[3]: Nothing to be done for `install-exec-am'.

Sounds like that build was successful? Go to ~/chocolate-doom/build/chocolate-doom-git/src and type ./chocolate-doom

twipley commented Apr 12, 2014

I tried running "sudo make" instead of a plain "make" (and the same for "make install"), and it got further. Combined with what you suggested -- putting doom1.wad in /src and running "./chocolate-doom" -- it now works under Linux!

However, it says (using default Ubuntu open-source xserver-xorg-video-nouveau drivers):
I_InitGraphics: Pillarboxed (1400x1050 within 1680x1050)
Failed to set up framebuffer.
Failed to initialize in OpenGL mode. Falling back to software mode instead.

Although, after installing proprietary NVIDIA drivers (that is, for the GeForce GTX 550 Ti card) and rebooting, the error message is gone. Perhaps it is using OpenGL, now? (Any way to know, besides fiddling with the gl_max_scale value to see if there is any difference in screen output?)

Also, I have found the "chocolate-doom.cfg" file after quite a long search (note to self: the file lies hidden in "~/.chocolate-doom").

Also, could you explain how to use scanlines?

I will be testing when I next get some time. In the meanwhile, I, too, would be interested in a Windows build, so that I can test on my Windows box.

Owner

fragglet commented Apr 12, 2014

Simply run with chocolate-doom -scanline
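(For anyone curious how such an effect can be drawn: one plausible approach -- a guess, not necessarily what gl-branch does -- is to blend a translucent dark line over every other row of original pixels in the intermediate buffer, between the two passes sketched earlier; the code below assumes the same headers and coordinate setup as that sketch.)

```c
/* Guess at a scanline overlay, not the actual gl-branch code: after
 * the blocky pass into the intermediate buffer, darken one line per
 * original 200-line row before the final linear scale to the screen. */
static void draw_scanlines(int factor)
{
    int buffer_h = 200 * factor;
    int y;

    glDisable(GL_TEXTURE_2D);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glColor4f(0.0f, 0.0f, 0.0f, 0.5f);

    glBegin(GL_LINES);
    for (y = 0; y < buffer_h; y += factor)
    {
        /* Buffer row mapped into the -1..1 coordinates used above. */
        float ndc_y = -1.0f + 2.0f * (float) y / (float) buffer_h;
        glVertex2f(-1.0f, ndc_y);
        glVertex2f( 1.0f, ndc_y);
    }
    glEnd();

    glDisable(GL_BLEND);
    glEnable(GL_TEXTURE_2D);
    glColor4f(1.0f, 1.0f, 1.0f, 1.0f);
}
```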

twipley commented Apr 12, 2014

Oh, nice. I am getting them, which means I am definitely on OpenGL (which, thus, works).

I am noticing it's quite different with scanlines on. It's been too long since I last played on mid-90's CRT monitors -- do you think it kind of mimics what it was like back then?

Furthermore, I would be interested to know whether you consider them to be aligned with the philosophical system of the project. (And, if it is not too indiscreet to ask, why the lack of appeal mentioned above?)

twipley commented Apr 12, 2014

As a side note, I can only take screenshots like vanilla does; that is, they are being rendered in 320x200. Linux Mint's print screen button does not work while inside the game, so there is no way for me to currently compare software against hardware rendering under Linux.

Testing under Windows in the future might solve the problem, though.

Owner

fragglet commented Apr 12, 2014

Maybe print screen will work if you run with -nograbmouse, or try it when the game is paused or menu is activated?

I am noticing it's quite different with scanlines on. It's been too long since I last played on mid-90's CRT monitors -- do you think it kind of mimics what it was like back then?

Furthermore, I would be interested to know whether you consider them to be aligned with the philosophical system of the project. (And, if it is not too indiscreet to ask, why the lack of appeal mentioned above?)

As I mentioned, I don't see the appeal. I originally added this as a hack in software mode just because someone requested it and it was easy to implement.

When people ask for scanline emulation it's usually for retro consoles that ran on TVs. But although the CRT technology is the same, TVs and monitors show things very differently. Remember that unlike TVs, CRT monitors were designed to be able to render things in high resolution at a variety of different screen resolutions.

To see what I mean, take a look at these:

http://imagizer.imageshack.us/a/img809/2500/hp14popclose1.jpg [zoom in, no scanlines]
http://microvga.com/images/uvga_plasma.jpg [no scanlines]
Wikipedia pic, close-up of a CRT monitor: https://commons.wikimedia.org/wiki/File:CRT_screen._closeup.jpg

Maybe some monitors were better than others and some really did have scanlines. I'm happy to be proved wrong. But I don't have any memory of scanlines on PCs. I certainly don't find them essential to an accurate retro experience. Maybe people just associate "scan lines = retro" because we have that memory of old TVs. I don't know.

Owner

fragglet commented Apr 12, 2014

Another example: go watch "A visit to id Software" on Youtube (it's a great video to watch anyway if you haven't seen it):

https://www.youtube.com/watch?v=Q65xJfVkiaI

There are lots of gameplay segments with close-ups of the screen, and although there's lots of flicker (caused by mismatch in refresh rates between monitor and camera), there's no sign of any scan line pattern. And this is a video from before Doom was even released.

Owner

fragglet commented Apr 12, 2014

Here's a Windows build.

http://bluebell.soulsphere.org/~fraggle/gl-branch-ge364787-win32.zip

Let me know how it goes.

twipley commented Apr 12, 2014

Maybe print screen will work if you run with -nograbmouse, or try it when the game is paused or menu is activated?

Negative. However, I have found the "Shutter" screenshot tool in the software manager, which takes desktop shots with a delay and seems to be a viable alternative.

Hey! Great news for the Windows build.

I have found the "Lightscreen" program for Windows, which works for taking Chocolate-Doom screenshots (it seems the delay feature is needed). However, the -scanline parameter does not add scanlines, so I am assuming it is reverting to software rendering and leaving OpenGL out (this is under an x86 XP VirtualBox machine, but I have tested that it is able to run some programs requiring OpenGL).


I don't have any memory of scanlines on PCs.

OK! Me neither.

The linked Doom-development video adds even further weight to your argument. Now I do not see any appeal in scanlines within the scope of this project, either.

Contributor

AXDOOMER commented Apr 12, 2014

Here's a Windows build. Let me know how it goes.

Thanks for making a build! Unfortunately, it doesn't work.

GeForce 310: The screen turns black for a fraction of a second and goes back to normal. Then a dialog box appears saying "Chocolate-Doom 2.0.0 has stopped working". Once the box is closed, the screen goes black again for a fraction of a second and goes back to normal.

Intel 4000: The screen turns black for a fraction of a second and goes back to normal. Then it does the same thing a second time.

EDIT: I remember there were scanlines on my CRT, and I could only see them at resolutions of 320x200 and 640x480. I think scanlines help to reproduce a retro experience on an LCD, because CRT monitors with scanlines produced a darker image.

twipley commented Apr 13, 2014

It has been my pleasure to do some testing!

I am reporting that setting gl_max_scale to values higher than 4 changes nothing on 1080p monitors. Is that a bug? In other words, it seems the intermediate buffer currently does not go over 1280x800. (Since the game seems light on resources, one might not mind benefiting -- to a certain degree -- from a larger intermediate buffer.)

A comparison-shots album between different renderers and resolutions is available over at http://imgur.com/a/G4kEd -- note, though, that there are no interpretations; only facts left over to interpret. (One should click the "download" button to view all pictures full-sized.)

One can then see what GL horizontal stretching to 4:3 does while maintaining a 1000-pixel (integer-factor) vertical resolution.

One can also see changes between 1333x1000 and 1400x1050 scalings, both being 4:3 renderings performed through OpenGL.

twipley commented Apr 16, 2014

It's hard to judge as of now, because of the limited gl_max_scale values. It's a little more blurry -- as intended -- but perhaps with a bigger intermediate buffer things would get a little crisper.

Looking at the face gives a nice idea of the induced blurriness. Besides, it's great for everyone to finally have access to 4:3 ratios.

Owner

fragglet commented Apr 16, 2014

With scaling it's very much a case of diminishing returns. I think once the scale factor goes past about 4 (=1280x800) it's very hard to notice any difference.

If you set gl_max_scale higher, it will go higher (provided your hardware supports it, of course). But I'm skeptical as to how many people - if any - can tell any difference.

twipley commented Apr 16, 2014

If you set gl_max_scale higher, it will go higher (provided your hardware supports it, of course).

Oh! Well I guess mine doesn't support anything higher than 4 then (although I haven't found any information to that effect). It is a GeForce GTX 550 Ti. If it's not on your side though, then it's on mine! :)

EDIT: I was wondering principally because moving from 3 to 4 did produce a significant difference. It appeared to me that moving from 4 to 5 or 6 might also produce such a difference. But again, I am deferring to your judgement, master. (haha)

Contributor

AXDOOMER commented Apr 16, 2014

There is a difference when gl_max_scale is bigger than 4. I could see it with 5 and 6, but the difference is almost unnoticeable. I guess it has to do with the screen resolution, mine was 1920x1080.

twipley commented Apr 17, 2014

Okay.

I guess it also depends on whether one tries to notice it while playing or while comparing screenshots.

And, in any case, it's a great thing this scaling module has been implemented! 👍

EDIT: @fragglet: I was also wondering about what you said above, that linear interpolation is akin to what the code uses. However, the Gimp suggests either cubic or sinc. Are those modes available to us, too? I am assuming that some drawbacks make them a bad choice for our purposes, since you said linear is used. I was curious as to the reasons, though.

twipley commented Apr 17, 2014

I would need external perspectives here -- is it just me or is the output crisper going (for example) from a 1600x1000 intermediate buffer (gl_max_scale set to 5) to a 1333x1000 output?

That way, there is integer-factor scaling on the vertical axis. I do not know whether in theory this changes anything, but the screenshots I have taken seem to suggest that it does.

fragglet referenced this issue Jul 7, 2014

Closed

CPU consumption #418

jmtd added the graphics label Sep 8, 2015

Contributor

jmtd commented Jun 9, 2016

Fixed in sdl2-branch I believe.

jmtd closed this Jun 9, 2016

mfrancis95 pushed a commit to mfrancis95/chocolate-doom that referenced this issue Jan 21, 2018

brightmaps: make brightmapping independent of sector light level
In my previous implementation, I had the brightmapping of a texture
depend on the light level of the texture's line's front sector. This
looked quite realistic, as highlights were never fully brightmapped for
textures in very dark sectors, but on the other hand the brightmapping
would also oscillate for sectors with variable light levels and even
extinguish (!) when sectors go full dark. This was unacceptable. Now,
texture highlights are always brightmapped with full intensity.

With this change I consider my brightmaps implementation complete! \o/

What an unexpectedly atmospheric feature that I won't want to miss
anymore, thanks to all involved (especially @Nechaevsky for your
impressive support!). Fixes #246.