Default SpriteBatch shader fails to compile #3559
Update: I created a minimal project from the gdx setup wizard with the same configuration, and again I receive the same exception. This is both with config.useGL30 set to true and false.
It appears as though I cannot easily find an earlier Java version, so I am trying update 66 in the hope that it helps. If that fails, I will compile from source and add a version tag to the shader.
Possible duplicate of #3273
I am indeed using an AMD graphics card. But it is strange that this has only become an issue now. Other than possibly a Java version bump, I can't think what has materially changed in my setup.
This shader workaround fixes the ShapeRenderer issue for glsl #version 150:

public class ImmediateModeShader30 {

    static private String createVertexShader (boolean hasNormals, boolean hasColors, int numTexCoords) {
        String shader = "#version 150\n"
            + "attribute vec4 " + ShaderProgram.POSITION_ATTRIBUTE + ";\n"
            + (hasNormals ? "attribute vec3 " + ShaderProgram.NORMAL_ATTRIBUTE + ";\n" : "")
            + (hasColors ? "attribute vec4 " + ShaderProgram.COLOR_ATTRIBUTE + ";\n" : "");
        for (int i = 0; i < numTexCoords; i++) {
            shader += "attribute vec2 " + ShaderProgram.TEXCOORD_ATTRIBUTE + i + ";\n";
        }
        shader += "uniform mat4 u_projModelView;\n";
        shader += (hasColors ? "varying vec4 v_col;\n" : "");
        for (int i = 0; i < numTexCoords; i++) {
            shader += "varying vec2 v_tex" + i + ";\n";
        }
        shader += "void main() {\n"
            + "   gl_Position = u_projModelView * " + ShaderProgram.POSITION_ATTRIBUTE + ";\n"
            + (hasColors ? "   v_col = " + ShaderProgram.COLOR_ATTRIBUTE + ";\n" : "");
        for (int i = 0; i < numTexCoords; i++) {
            shader += "   v_tex" + i + " = " + ShaderProgram.TEXCOORD_ATTRIBUTE + i + ";\n";
        }
        shader += "   gl_PointSize = 1.0;\n";
        shader += "}\n";
        return shader;
    }

    static private String createFragmentShader (boolean hasNormals, boolean hasColors, int numTexCoords) {
        String shader = "#version 150\n"
            + "#ifdef GL_ES\n" + "precision mediump float;\n" + "#endif\n";
        if (hasColors) shader += "varying vec4 v_col;\n";
        for (int i = 0; i < numTexCoords; i++) {
            shader += "varying vec2 v_tex" + i + ";\n";
            shader += "uniform sampler2D u_sampler" + i + ";\n";
        }
        shader += "void main() {\n"
            + "   gl_FragColor = " + (hasColors ? "v_col" : "vec4(1, 1, 1, 1)");
        if (numTexCoords > 0) shader += " * ";
        for (int i = 0; i < numTexCoords; i++) {
            if (i == numTexCoords - 1) {
                shader += " texture2D(u_sampler" + i + ", v_tex" + i + ")";
            } else {
                shader += " texture2D(u_sampler" + i + ", v_tex" + i + ") *";
            }
        }
        shader += ";\n}";
        return shader;
    }

    /** Returns a new instance of the default shader used by SpriteBatch for GL2 when no shader is specified. */
    static public ShaderProgram createDefaultShader (boolean hasNormals, boolean hasColors, int numTexCoords) {
        String vertexShader = createVertexShader(hasNormals, hasColors, numTexCoords);
        String fragmentShader = createFragmentShader(hasNormals, hasColors, numTexCoords);
        ShaderProgram program = new ShaderProgram(vertexShader, fragmentShader);
        if (!program.isCompiled()) throw new IllegalArgumentException("Error compiling ShapeRenderer shader: " + program.getLog());
        return program;
    }
}

It would be nice for
This is an issue tracker. If you need help, use the forum or IRC instead. If you think there is actually a new issue, then please include the information required to reproduce the issue you're reporting. If this is indeed a duplicate, then please close this.
I'm sorry. Maybe I'm misreading your tone (one of flustered impatience with one of many "noobish" issues), but I feel that this is a terrible attitude to have. I am someone who is becoming familiar enough with the libgdx codebase that I feel comfortable browsing around the source. I don't think you want to turn such people away.

I was not looking for help. I am trying to make it clear that a) on a fresh install of Windows 10, b) with an AMD card, c) the default SpriteBatch shader is failing to compile; moreover, the ShapeRenderer's default shader is failing to compile, but does so in a silent manner. This is a serious issue and a major failing in libgdx's introductory user experience. The university I am at uses libgdx as its engine of choice for its introductory game design course. At that point in my education, I would have had absolutely no clue how to debug such a problem; it would have been a major turn-off from an ecosystem whose modular approach I have major respect for.

I have reproduced my problem in as minimal a way as possible, by creating a desktop project with no extraneous libs, on a dev environment almost as clean as it could be. This IS a new issue, at least for me: I did not encounter it prior to today, when I reformatted and installed Oracle JDK 1.8.65. But naturally, because this involves GL, this is a hardware-specific problem. I do not know whether this problem applies to all AMD graphics cards, but with 18 percent of development environments potentially affected (AMD desktop share), it should be of concern to you.

If you give me a bit of time (I'm currently writing finals), I can create a branch to try to architect a solution in a platform-agnostic manner. I ask that you please do not close this; it may be more widespread, and simply hitting the silent and inexperienced, than you think.
I'm sorry if you misunderstood me. What I meant is that it is unclear what you're trying to achieve by creating this issue. A new issue is an issue that hasn't been reported before. As said, it looks like this is already reported and discussed. If you had searched the issue tracker, as you were asked to do when you created this issue, then you'd have seen that. So either:

Don't get me wrong: if there is an actual new issue, then it is very well appreciated that you report it. However, the issue you've reported is that you receive the shader compilation error "Implicit version number 110 not supported by GL3 forward compatible context", which implies that you've set config.useGL30 to true. You also said that "This is both with config.useGL30 set to true and false", which would imply that somehow gles 3 is enabled while you didn't specify it. That would be a new issue, but you haven't provided enough information for that (you didn't even include your config). If that's the case, then please include the information required to reproduce it.

Please note that the gles3 shader issue is known (e.g. see this, this and this, and I'm sure you'll find more of it when you search the issue tracker). There is no easy fix (although it is arguable whether this is an actual issue). If you decide to use gles 3, then you will have to provide gles 3 compatible shaders. But then again, why would you enable gles 3 in the first place if you don't want to use it?
Like I said, the error occurs in a vanilla result from the GDX setup wizard. I saw issue #3273; however, I believed it to be a separate issue due to it being closed and #3273 seemingly working with the use of GL30. There is little point in including the code because you could generate it yourself; however, here it is in a single-file code snippet. The problem also occurs using BarebonesBatch.

package com.mygdx.game.desktop;

import com.badlogic.gdx.backends.lwjgl.LwjglApplication;
import com.badlogic.gdx.backends.lwjgl.LwjglApplicationConfiguration;
import com.mygdx.game.MyGdxGame;
import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;

public class DesktopLauncher extends ApplicationAdapter {
    public static void main (String[] arg) {
        LwjglApplicationConfiguration config = new LwjglApplicationConfiguration();
        config.useGL30 = false; // or true; irrelevant either way, added just to test
        new LwjglApplication(new DesktopLauncher(), config);
    }

    SpriteBatch batch;
    Texture img;

    @Override
    public void create () {
        batch = new SpriteBatch();
        img = new Texture("badlogic.jpg");
    }

    @Override
    public void render () {
        Gdx.gl.glClearColor(1, 0, 0, 1);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        batch.begin();
        batch.draw(img, 0, 0);
        batch.end();
    }
}

I want to support Android, so I'm concurrently developing alternative versions of effect shaders for different GL versions. Please let me know if you need any more information.
So the issue you're reporting is that gles 3 is enabled even when you set config.useGL30 to false?
I am using Gdx version
Modifying the create method above to:

public void create () {
    System.out.println("Is Gdx30 null?: " + (Gdx.gl30 == null ? "Yes" : "No"));
    batch = new SpriteBatch();
    img = new Texture("badlogic.jpg");
}

produces:
I added #version 110 to the top of my modified default shader, and it compiled. My guess is that AMD have added some form of 'strict mode'. Could be completely off though.
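For anyone hitting the same error: rather than editing each shader file, libgdx exposes static ShaderProgram.prependVertexCode and ShaderProgram.prependFragmentCode fields whose contents are prepended to every shader source. The sketch below shows only the string logic, with no libgdx dependency; the class and method names are hypothetical, and the one assumption is that you want to add the directive unless the source already declares one.

```java
// Sketch: prepend "#version 110" to a GLSL source unless it already has
// a #version directive. This mirrors, at the string level, what assigning
// "#version 110\n" to ShaderProgram.prependVertexCode/prependFragmentCode
// would do in libgdx. The helper name is hypothetical, not a libgdx API.
public class ShaderVersionFix {
    static String withVersion110(String glslSource) {
        if (glslSource.trim().startsWith("#version")) {
            return glslSource; // already declares a version; leave untouched
        }
        return "#version 110\n" + glslSource;
    }

    public static void main(String[] args) {
        String src = "attribute vec4 a_position;\n"
            + "void main() { gl_Position = a_position; }\n";
        System.out.println(withVersion110(src));
    }
}
```

With libgdx itself, the equivalent would be assigning "#version 110\n" to both prepend fields before the first SpriteBatch or ShapeRenderer is created.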
Thanks. I'm not sure whether this is a libgdx, lwjgl or driver issue. If it is a libgdx issue (which is quite possible), then it is not something to easily fix. Since I don't have an AMD gpu myself, I would like some verification that this only happens with (what I assume is) this specific driver version. There haven't been any other issues reported on this specifically before, so perhaps it has only happened since the latest AMD driver update. To reproduce (if I understand correctly):
It might only be restricted to the R9 270X, the R9 series, a subset of the AMD cards, or, for some frustrating reason, only my PC, but yes. Would simply adding #version 110 to the default shaders break older configurations? If not, that appears to be an easy fix. Thank you for the patience, BTW.
I doubt they use a different (pre)compiler implementation for every gpu, but it could be. Indeed, as you've read in the issues I referred to, the #version directive is not compatible between gles 2, gles 3 and gl 3+.
From the OpenGL ES Reference:
#version 100 works on desktop. It may be possible that
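To make the incompatibility concrete, here is a small sketch mapping each rendering context to the version directive commonly paired with it. The enum and helper are hypothetical illustrations, not part of libgdx; the mapping itself follows the GLSL and GLSL ES specifications (GLES2 shaders declare 100, GLES3 declare 300 es, desktop GL2-era GLSL is 110, and GL 3.2 core is 150).

```java
// Sketch of picking a #version prelude per rendering context.
// GlContext and versionDirective are hypothetical helpers; the version
// numbers come from the GLSL / GLSL ES specs, which is why a shader
// written for one context fails to compile verbatim in another.
public class GlslVersionPrelude {
    enum GlContext { GLES2, GLES3, DESKTOP_GL2, DESKTOP_GL3_CORE }

    static String versionDirective(GlContext ctx) {
        switch (ctx) {
            case GLES2:            return "#version 100\n";
            case GLES3:            return "#version 300 es\n";
            case DESKTOP_GL2:      return "#version 110\n";
            case DESKTOP_GL3_CORE: return "#version 150\n";
            default: throw new IllegalArgumentException("unknown context");
        }
    }

    public static void main(String[] args) {
        for (GlContext ctx : GlContext.values()) {
            System.out.println(ctx + " -> " + versionDirective(ctx).trim());
        }
    }
}
```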
Hello, I'm not an expert, but I am getting the same error as @ncthbrt when I try to run the desktop project from cmd with: gradlew desktop:run. I tried to create a new libgdx project and used the same command, but it failed as well. I don't understand, however, why my project runs perfectly fine in Eclipse ("useGL30 = true;" also works just fine there), while running and building it with gradle doesn't. I am posting this here to confirm the issue that @ncthbrt stated, and I must add that I am also running on Windows 10 and I also have an AMD GPU with the latest drivers installed. Finally, my GPU is not from the R9 series but rather the HD6000 series (it's an AMD Radeon HD6670 specifically), so I guess this issue applies to all systems with the newest AMD drivers installed.
Same issue here, only on Linux. OS: Ubuntu x64 16.04. This issue appeared only after the installation of the proprietary Catalyst driver; using the open source driver works fine. If you need more info, don't hesitate to ask!
Could you try adding the following code to your desktop starter?

config.useGL30 = true;
ShaderProgram.prependVertexCode = "#version 140\n#define varying out\n#define attribute in\n";
ShaderProgram.prependFragmentCode = "#version 140\n#define varying in\n#define texture2D texture\n#define gl_FragColor fragColor\nout vec4 fragColor;\n";
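For illustration, the prelude above works by #define-ing the legacy GLES2 keywords into their GL3 equivalents. The sketch below approximates that substitution with plain string replacement on a legacy fragment shader, so you can see roughly what source the driver ends up compiling. It is only an approximation (the real GLSL preprocessor is token-based, while String.replace is not) and the class name is hypothetical.

```java
// Sketch: approximate what the #define prelude above does to a legacy
// GLES2 fragment shader. varying -> in, texture2D -> texture, and
// gl_FragColor -> fragColor, matching the fragment-shader defines.
// Plain string replacement stands in for the real token-based
// GLSL preprocessor; illustration only.
public class PreludeEffect {
    static String applyGl3FragmentDefines(String legacySource) {
        return legacySource
            .replace("varying", "in")
            .replace("texture2D", "texture")
            .replace("gl_FragColor", "fragColor");
    }

    public static void main(String[] args) {
        String legacy = "varying vec4 v_col;\n"
            + "uniform sampler2D u_tex;\n"
            + "void main() {\n"
            + "  gl_FragColor = v_col * texture2D(u_tex, vec2(0.0));\n"
            + "}\n";
        System.out.println(applyGl3FragmentDefines(legacy));
    }
}
```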
I can confirm that this does indeed fix the mentioned problem with an AMD card (which I was experiencing myself before).
Thanks @kibertoad! I just want to say, in case someone else comes across this issue, that the code/settings posted by @kibertoad fix the issue (I have an AMD card: Radeon 5770). It also solves another issue I had with Box2DDebugRenderer not rendering the debug geometry. Before this fix, I had to use a custom default shader and use SpriteBatch like so:
Is this issue still affecting anyone?
It's still crashing for me on 1.9.3 when I'm not using the aforementioned ShaderProgram code.
Hello, I just had this issue running a hello-world kind of program on the latest libgdx. The fix mentioned by @xoppa seems to work. I also have an AMD Radeon GPU, an R9 280X.
I've also recently started having this issue on libgdx 1.9.3 on Ubuntu 14.04, using an Nvidia GeForce GTX 285 card with proprietary 304.132 drivers. I didn't change anything that I know of: no driver changes, no hardware changes, no code changes. It just stopped working from one day to the next. I have tried the ShaderProgram code suggested as well as
I suddenly noticed that I'm having this issue, and I'm using libGDX 1.9.4 on Windows 10. Graphics hardware is an AMD Radeon HD 7900 + AMD Radeon HD 5670. This is a weird one... I'm getting the error ONLY when I run my games from the command line. Here's the confusing part: I have some old builds of my game that work via the command line, and some old builds that don't. I swear they all worked before, and I don't recall updating my graphics drivers. I can confirm that adding the code posted by @xoppa resolves the issue in 1.9.4.
@Schyfis the latest version is 1.9.5; maybe try it: https://libgdx.badlogicgames.com/versions.html. Start from scratch to have a clean base: uninstall every version of Java you have, then reinstall and reboot.
One of my friends attempted to test my game and had a problem, so I'm back. @ghost, here are the steps I took:

I can absolutely, positively confirm that I now have only ONE version of Java installed, but the problem I noted in my previous post is the same. Double-clicking on a jar file to run it works; running it from the command line does not, and presents me with the implicit version number error when creating the SpriteBatch.

The other thing I'm here to report is that I'm kind of in a pickle. I've sent my friend a version that includes the useGL30 line, and I'll report back when he tests it. The trouble is, if it works, I can't use it as a solution, because when I include that line it significantly impacts the game's performance on my computer.
From @crykn: the issue seems to be related to specific hardware, so there’s nothing libGDX can do on its end; the Mac issue mentioned in the last post is a separate problem (only core profile support on Mac). |
Hi there:
The following code is crashing with the error code posted at the bottom of this issue:
I'm on a fresh install of Windows 10, in IntelliJ 15.0, running Oracle JDK version 1.8.0_65. This project was working the day before yesterday with an existing Windows 10 installation, but, I believe, with an older version of Java 1.8.0 installed. This is possibly the source of my troubles. I'll install the previous release and report back.