
Variance Shadow Mapping - Breaks rendering #8

Closed
ForeverTPS opened this issue May 26, 2014 · 16 comments

Comments

@ForeverTPS

The latest commit, which implements variance shadow mapping, appears to have broken functionality that worked before shadows were added.

The first soft shadowing code worked perfectly, but the variance shadow mapping code appears to break all the lighting, normal mapping etc., although basic textures still render.

This was run using an unmodified checkout of the latest code from the repository. Maybe another nVidia/Intel/ATI issue?

[screenshot: broken rendering]

@pythoneer

Same results here. Arch Linux 3.14.4-1, nvidia 337.19, GTX 750 Ti.

@Colt-Zero

@Zammalad

Hmm... While I have not downloaded and checked the latest commit, I can say that after watching the latest addition to the tutorial, I am not experiencing this problem with variance shadow mapping. Having typed out the code myself whilst following Benny's video, I am not sure about the commit itself.

This is a screenshot of my current scene, showing that I don't seem to be experiencing the problem you are.

[screenshot: Colt-Zero's scene rendering correctly]

*Cough* Yes, ponies *Cough* (These are actually not my pony models/textures, but I am planning to make my own.)

Anyway, because it seems important: I'm running Windows 7 Home with an nVidia GTX 650.

@pythoneer

@Colt-Zero what happens if you build from the latest git commit?

@pythoneer

Benny merged a pull request ("Merge pull request #7 from pseudosoftware/master: Fix a problem with Intel integrated graphic cards.") which added two lines of code:

glDrawBuffer(GL_NONE);
glReadBuffer(GL_NONE);

If I remove them, everything seems to work fine. In my case the code breaks at this point; can you check this, @Zammalad? The code is in texture.cpp at lines 97-98.
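
For context, those two calls normally belong in a depth-only framebuffer setup. Here is a minimal sketch of that pattern (hypothetical names and structure, not the repository's actual texture.cpp; assumes GLEW):

#include <GL/glew.h>
#include <cstdio>

// Illustrative depth-only shadow-map framebuffer, the case the Intel fix
// is presumably aimed at.
GLuint CreateDepthOnlyFramebuffer(int width, int height, GLuint* depthTex)
{
    glGenTextures(1, depthTex);
    glBindTexture(GL_TEXTURE_2D, *depthTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, width, height,
                 0, GL_DEPTH_COMPONENT, GL_FLOAT, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    GLuint fbo;
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                           GL_TEXTURE_2D, *depthTex, 0);

    // With no color attachment, some drivers (Intel in particular) require
    // the draw/read buffers to be disabled for the framebuffer to be
    // considered complete; that is what the two lines in question do.
    glDrawBuffer(GL_NONE);
    glReadBuffer(GL_NONE);

    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
        fprintf(stderr, "Depth-only framebuffer is incomplete\n");

    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    return fbo;
}

A variance shadow map, though, renders depth and depth² into a color attachment; with the draw buffer set to GL_NONE those color writes are discarded, which would be consistent with the broken shadows reported here.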

@Colt-Zero

@pythoneer

Hmm... Yes, there does seem to be an issue with the commit. Interesting, though, because this commit seems to have some extra stuff in it that Benny has yet to cover in the latest video.

My code is only up to date as far as the video goes, so I'm not exactly up to date with the commit.

@pythoneer

@Colt-Zero Benny added this fix right after he committed the variance shadow code. The code is up to date with the video, I guess; this fix has nothing to do with the material covered in the videos.

@Colt-Zero

@pythoneer

I'm perhaps a little confused. Are you saying that all this extra filtering stuff that was not in the latest video is not going to be a part of the video series?

@pythoneer

@Colt-Zero No! Sorry if I'm confusing you :)

glDrawBuffer(GL_NONE);
glReadBuffer(GL_NONE);

is not, and probably never will be, part of the video series, I guess.
This is a commit (a pull request merge) that was made right after the commit of the variance shadow code, which was part of the latest video. So it is on par with the code you typed directly from the video, except for the little two-line fix which broke the code.

@Colt-Zero

@pythoneer

Ah... I get you now. Well... does that mean the issue has been solved? That fix did work for me, at least.
Perhaps we should get more opinions first.

@pythoneer

@Colt-Zero I don't know; the code wasn't there for nothing. Like the commit message says: "Fix a problem with Intel integrated graphic cards." It was not intended to break the nVidia side, and I guess it worked well on AMD, because Benny merged it and he is using an AMD card if I recall correctly.

@Colt-Zero

@pythoneer

Ouch... one of those problems where fixing one thing breaks the other. It would probably be a bad move to make the code check which kind of graphics card you have to determine whether the fix should be applied or not.

A non-universal fix for the problem, such as a graphics card dependency, could lead to future trouble.
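
For illustration only, such a check would have to sniff the free-form GL vendor string, which differs from driver to driver and is easy to get wrong (a hypothetical sketch):

#include <GL/glew.h>
#include <cstring>

// Hypothetical vendor sniffing; strings vary by driver, e.g.
// "Intel Open Source Technology Center" vs. "NVIDIA Corporation".
bool IsIntelGpu()
{
    const char* vendor = (const char*)glGetString(GL_VENDOR);
    return vendor != NULL && strstr(vendor, "Intel") != NULL;
}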

@pythoneer

At this point we can't do much. Benny and pseudosoftware have to look at this, since I don't really know what the fix is trying to do. Maybe

glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);

can help, but I can only guess here :)
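
Roughly, that guess would look like this during the shadow pass (hypothetical names, untested):

#include <GL/glew.h>

// Sketch of the glColorMask idea: keep the draw buffer enabled but mask
// out all color writes while rendering the shadow map, then restore them.
void RenderShadowPass(GLuint shadowFbo, void (*drawCasters)())
{
    glBindFramebuffer(GL_FRAMEBUFFER, shadowFbo);
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE); // depth writes only
    drawCasters();
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);     // restore
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
}
// Caveat: this would also suppress writes to a variance (color) shadow map,
// so it could only replace the GL_NONE calls for a pure depth-only pass.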

@ForeverTPS
Author

I can confirm that removal of

glDrawBuffer(GL_NONE);
glReadBuffer(GL_NONE);

resolves the issue on my system (nVidia 770, 337.50 driver). As has been stated, this needs to be looked at by Benny/pseudosoftware, since they were handling that specific commit for the Intel fix.
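
One vendor-independent possibility, sketched below on the assumption that the conflict is between GL_NONE and the variance map's color attachment (not necessarily the fix Benny or pseudosoftware will choose), would be to disable the draw/read buffers only for framebuffers that truly have no color attachment:

#include <GL/glew.h>
#include <cstdio>

// Hypothetical helper: configure draw/read buffers based on whether the
// framebuffer carries a color attachment (as a variance shadow map does).
void ConfigureShadowFramebuffer(GLuint fbo, bool hasColorAttachment)
{
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    if (hasColorAttachment) {
        // Variance shadow maps write depth/depth^2 into color attachment 0.
        GLenum buffer = GL_COLOR_ATTACHMENT0;
        glDrawBuffers(1, &buffer);
    } else {
        // Depth-only framebuffers need this on some Intel drivers.
        glDrawBuffer(GL_NONE);
        glReadBuffer(GL_NONE);
    }
    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
        fprintf(stderr, "Framebuffer is incomplete\n");
}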

@BennyQBD
Owner

Thanks for bringing this up everyone! For the time being, I've commented out those lines of code. I won't be adding them back until I receive more input from pseudosoftware.

@pseudosoftware
Contributor

I'll have a look.

@DCubix

DCubix commented Jan 29, 2015

Having the same problem with an Intel processor. #30
