Default SpriteBatch shader fails to compile #3559

Closed
ncthbrt opened this issue Nov 12, 2015 · 29 comments

Comments

@ncthbrt

ncthbrt commented Nov 12, 2015

Hi there:

The following code is crashing with the error code posted at the bottom of this issue:

public class Iteration3Main extends Game {
    SpriteBatch batch;
    private static TextureAtlas atlas;
    private static final float colourCycleTime = 30f;
    private static HSVColour currentColour = new HSVColour(0, 0.81f, 0.7f, 1f);

    public static Colour currentColour() {
        return currentColour.toRGB();
    }

    private static TitleScreen titleScreen;
    private static Iteration3Main main;

    public static TextureAtlas textureAtlas() {
        return atlas;
    }

    @Override
    public void create() {
        batch = new SpriteBatch();
        atlas = new TextureAtlas(Gdx.files.internal("global.atlas"));
        titleScreen = new TitleScreen(this, atlas);
        main = this;
        setScreen(titleScreen);
    }


    @Override
    public void render() {
        float nextHue = currentColour.hue() + Gdx.graphics.getDeltaTime() / colourCycleTime;
        nextHue = nextHue > 1 ? 0 : nextHue;
        currentColour.hue(nextHue);
        Gdx.gl.glClearColor(0, 0, 0, 1);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        super.render();
    }
}

I'm on a fresh install of Windows 10, in IntelliJ 15.0, running Oracle JDK 1.8.0_65. This project was working the day before yesterday on the previous Windows 10 installation, though I believe with an older Java 1.8.0 update installed; that is possibly the source of my troubles. I'll install the previous release and report back.

Exception in thread "LWJGL Application" java.lang.IllegalArgumentException: Error compiling shader: Vertex shader failed to compile with the following errors:
ERROR: error(#272) Implicit version number 110 not supported by GL3 forward compatible context
ERROR: error(#273) 1 compilation errors.  No code generated

Fragment shader failed to compile with the following errors:
ERROR: error(#272) Implicit version number 110 not supported by GL3 forward compatible context
ERROR: error(#273) 1 compilation errors.  No code generated


    at com.badlogic.gdx.graphics.g2d.SpriteBatch.createDefaultShader(SpriteBatch.java:157)
    at com.badlogic.gdx.graphics.g2d.SpriteBatch.<init>(SpriteBatch.java:120)
    at com.badlogic.gdx.graphics.g2d.SpriteBatch.<init>(SpriteBatch.java:73)
    at com.deepwallgames.quantumhue.Iteration3Main.create(Iteration3Main.java:29)
    at com.badlogic.gdx.backends.lwjgl.LwjglApplication.mainLoop(LwjglApplication.java:143)
    at com.badlogic.gdx.backends.lwjgl.LwjglApplication$1.run(LwjglApplication.java:120)
@ncthbrt
Author

ncthbrt commented Nov 12, 2015

Update: I created a minimal project from the gdx-setup wizard with the same configuration, and again I receive the same exception. This happens with config.useGL30 set to both true and false.
It also occurs outside the IDE when running from cmd via the gradlew wrapper.

@ncthbrt
Author

ncthbrt commented Nov 12, 2015

It appears I cannot easily find an earlier Java version, so I am trying update 66 in the hope that it helps. If that fails, I will compile from source and add a version directive to the shader.

@xoppa
Member

xoppa commented Nov 12, 2015

Possible duplicate of #3273

@ncthbrt
Author

ncthbrt commented Nov 12, 2015

I am indeed using an AMD graphics card. But it is strange that this has only become an issue now; other than possibly a Java version bump, I can't think of anything that has materially changed in my setup.

@ncthbrt
Author

ncthbrt commented Nov 12, 2015

This shader workaround fixes the ShapeRenderer issue for GLSL #version 150:

public class ImmediateModeShader30 {

    static private String createVertexShader (boolean hasNormals, boolean hasColors, int numTexCoords) {
        String shader = "#version 150\n" +
                "attribute vec4 " + ShaderProgram.POSITION_ATTRIBUTE + ";\n"
                + (hasNormals ? "attribute vec3 " + ShaderProgram.NORMAL_ATTRIBUTE + ";\n" : "")
                + (hasColors ? "attribute vec4 " + ShaderProgram.COLOR_ATTRIBUTE + ";\n" : "");

        for (int i = 0; i < numTexCoords; i++) {
            shader += "attribute vec2 " + ShaderProgram.TEXCOORD_ATTRIBUTE + i + ";\n";
        }

        shader += "uniform mat4 u_projModelView;\n";
        shader += (hasColors ? "varying vec4 v_col;\n" : "");

        for (int i = 0; i < numTexCoords; i++) {
            shader += "varying vec2 v_tex" + i + ";\n";
        }

        shader += "void main() {\n" + "   gl_Position = u_projModelView * " + ShaderProgram.POSITION_ATTRIBUTE + ";\n"
                + (hasColors ? "   v_col = " + ShaderProgram.COLOR_ATTRIBUTE + ";\n" : "");

        for (int i = 0; i < numTexCoords; i++) {
            shader += "   v_tex" + i + " = " + ShaderProgram.TEXCOORD_ATTRIBUTE + i + ";\n";
        }
        shader += "   gl_PointSize = 1.0;\n";
        shader += "}\n";
        return shader;
    }

    static private String createFragmentShader (boolean hasNormals, boolean hasColors, int numTexCoords) {
        String shader = "#version 150\n"
                + "#ifdef GL_ES\n" + "precision mediump float;\n" + "#endif\n";

        if (hasColors) shader += "varying vec4 v_col;\n";
        for (int i = 0; i < numTexCoords; i++) {
            shader += "varying vec2 v_tex" + i + ";\n";
            shader += "uniform sampler2D u_sampler" + i + ";\n";
        }

        shader += "void main() {\n" + "   gl_FragColor = " + (hasColors ? "v_col" : "vec4(1, 1, 1, 1)");

        if (numTexCoords > 0) shader += " * ";

        for (int i = 0; i < numTexCoords; i++) {
            if (i == numTexCoords - 1) {
                shader += " texture2D(u_sampler" + i + ",  v_tex" + i + ")";
            } else {
                shader += " texture2D(u_sampler" + i + ",  v_tex" + i + ") *";
            }
        }

        shader += ";\n}";
        return shader;
    }

    /** Returns a new instance of the default shader used by SpriteBatch for GL2 when no shader is specified. */
    static public ShaderProgram createDefaultShader (boolean hasNormals, boolean hasColors, int numTexCoords) {
        String vertexShader = createVertexShader(hasNormals, hasColors, numTexCoords);
        String fragmentShader = createFragmentShader(hasNormals, hasColors, numTexCoords);
        ShaderProgram program = new ShaderProgram(vertexShader, fragmentShader);
        if (!program.isCompiled()) throw new IllegalArgumentException("Error compiling shapeRenderer shader: " + program.getLog());
        return program;
    }
}

It would be nice for ImmediateModeRenderer20 and the default SpriteBatch shader to add a version directive when useGL30 is set to true. Alternatively, ShapeRenderer and SpriteBatch could fall through a gauntlet of default shaders until one compiles successfully (though that might slow down startup considerably).
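
For illustration, one possible shape of such a conditional prefix (a minimal sketch only; it leans on the ShaderProgram.prependVertexCode/prependFragmentCode hooks and a GL3-compatible define block, and the class name is hypothetical):

import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.glutils.ShaderProgram;

public class DefaultShaderVersionFix {
    /** Prepends a GLSL version directive (plus compatibility defines) to every shader
     * compiled afterwards. Must run before the first SpriteBatch or ShapeRenderer is
     * created, because the prepended strings are read when the shader is compiled. */
    public static void apply () {
        if (!Gdx.graphics.isGL30Available()) return; // GLES2-style GLSL compiles as-is
        ShaderProgram.prependVertexCode =
            "#version 140\n#define varying out\n#define attribute in\n";
        ShaderProgram.prependFragmentCode =
            "#version 140\n#define varying in\n#define texture2D texture\n"
            + "#define gl_FragColor fragColor\nout vec4 fragColor;\n";
    }
}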

@xoppa
Member

xoppa commented Nov 12, 2015

This is an issue tracker. If you need help then use the forum or irc instead. If you think there is actually a new issue, then please include the required information to reproduce the issue you're reporting. If this is indeed a duplicate then please close this.
https://github.com/libgdx/libgdx/blob/master/CONTRIBUTING.md

@ncthbrt
Author

ncthbrt commented Nov 12, 2015

I'm sorry. Maybe I'm misreading your tone (one of flustered impatience with one of many "noobish" issues) but I feel that this is a terrible attitude to have. I am someone who is becoming familiar enough with the libGDX codebase such that I feel comfortable browsing around the source. I don't think you want to turn such people away.

I was not looking for help. I am trying to make it clear that a) on a fresh install of Windows 10, b) with an AMD card, c) the default SpriteBatch shader is failing to compile; moreover, the ShapeRenderer's default shader is also failing to compile, but does so silently.

This is a serious issue and a major failing in libGDX's introductory user experience. The university I am at uses libGDX as the engine of choice for its introductory game design course. At that point in my education I would have had absolutely no clue how to debug such a problem, and it would have been a major turn-off from an ecosystem whose modular approach I have great respect for.

I have reproduced my problem in as minimal a way as possible, by creating a desktop project with no extraneous libs on a dev environment almost as clean as it could be. This IS a new issue, at least for me: I did not encounter it prior to today, when I reformatted and installed Oracle JDK 1.8.0_65.

But naturally, because this involves GL, this is a hardware-specific problem. I do not know whether it applies to all AMD graphics cards, but with 18% of development environments potentially affected (AMD's desktop share), it should be of concern to you.

If you give me a bit of time (I'm currently writing finals), I can create a branch to try and architect a solution in a platform-agnostic manner.

I ask that you please do not close this; it may be more widespread than you think, simply hitting the silent and inexperienced.

@xoppa
Member

xoppa commented Nov 12, 2015

I'm sorry if you misunderstood me. What I meant is that it is unclear what you're trying to achieve by creating this issue. A new issue is an issue that hasn't been reported before. As said, it looks like this has already been reported and discussed. If you had searched the issue tracker, as you were asked to do when you created this issue, you'd have seen that. So either:
a) this is a duplicate issue,
b) you are asking for help instead of reporting an issue with libgdx, or
c) you found a new issue with libgdx that hasn't been reported before

Don't get me wrong, if there is an actual new issue then it is much appreciated that you report it. However, the issue you've reported is that you receive the shader compilation error: "Implicit version number 110 not supported by GL3 forward compatible context". That implies that you've set useGL30 to true, in which case this is not a new issue.

You also said that "This is both with config.useGL30 set to true and false", which would imply that somehow GLES 3 is enabled even though you didn't specify it. That would be a new issue, but you haven't provided enough information for that (you didn't even include your config). If that's the case then please include the required information to reproduce it.

Please note that the GLES 3 shader issue is known (e.g. see this, this and this, and I'm sure you'll find more when you search the issue tracker). There is no easy fix (although it is arguable whether this is an actual issue). If you decide to use GLES 3, you will have to provide GLES 3 compatible shaders. But then again, why would you enable GLES 3 in the first place if you don't want to use it?

@ncthbrt
Author

ncthbrt commented Nov 12, 2015

Like I said, the error occurs in the vanilla project generated by the gdx-setup wizard. I saw issue #3273; however, I believed this to be a separate issue because it was closed and #3273 seemingly worked with the use of GL30.

There is little point in including the code because you could generate it yourself, but here it is as a single-file code snippet. The problem also occurs using BarebonesBatch.

package com.mygdx.game.desktop;

import com.badlogic.gdx.backends.lwjgl.LwjglApplication;
import com.badlogic.gdx.backends.lwjgl.LwjglApplicationConfiguration;
import com.mygdx.game.MyGdxGame;
import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;

public class DesktopLauncher extends ApplicationAdapter {
    public static void main (String[] arg) {
        LwjglApplicationConfiguration config = new LwjglApplicationConfiguration();
        config.useGL30 = false; // tried both true and false; irrelevant either way, added just to test
        new LwjglApplication(new DesktopLauncher(), config);
    }


    SpriteBatch batch;
    Texture img;

    @Override
    public void create () {
        batch = new SpriteBatch();
        img = new Texture("badlogic.jpg");
    }

    @Override
    public void render () {
        Gdx.gl.glClearColor(1, 0, 0, 1);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        batch.begin();
        batch.draw(img, 0, 0);
        batch.end();
    }
}

I want to support Android, so I'm concurrently developing alternative versions of effect shaders for different GL versions.

Computer Specs:
Graphics card: AMD Radeon R9 270X
RAM: 16 GB
OS: Windows 10 Pro N
Processor: Intel Core i5-4690K
Architecture: 64 bit
AMD Catalyst Control Center version 2015.0804.21.41908
OpenGL Version: 4.3 [More details](http://feedback.wildfiregames.com/report/opengl/device/ASUS%20R9%20270X)

Please let me know if you need any more information.

@xoppa
Member

xoppa commented Nov 12, 2015

So the issue you're reporting is that GLES 3 is enabled even when you set useGL30 to false? Can you verify that by checking whether Gdx.gl30 is not null? Also, is this the latest nightly, or which version of libGDX are you using (there was an issue which would cause that, but it was fixed a release or two ago)?

@ncthbrt
Author

ncthbrt commented Nov 12, 2015

I am using Gdx version 1.7.1

Modifying the create method above to:

public void create () {
    System.out.println("Is Gdx30 null?: " + (Gdx.gl30 == null ? "Yes" : "No"));
    batch = new SpriteBatch();
    img = new Texture("badlogic.jpg");
}

Produces:
Is Gdx30 null?: Yes

@ncthbrt
Author

ncthbrt commented Nov 12, 2015

I added #version 110 to the top of my modified default shader, and it compiled. My guess is that AMD has added some form of 'strict mode', but I could be completely off.

@xoppa
Member

xoppa commented Nov 12, 2015

Thanks. I'm not sure whether this is a libGDX, LWJGL or driver issue. If it is a libGDX issue (which is quite possible) then it is not something that is easy to fix. Since I don't have an AMD GPU myself, I would like some verification that this only happens with (what I assume is) this specific driver version. There haven't been any other issues reported on this specifically before, so perhaps it only occurs since the latest AMD driver update.

To reproduce (if I understand correctly):
On Windows, with an AMD GPU on the latest driver, create a new project using gdx-setup and run it without any modification. It should give the above error.

@ncthbrt
Author

ncthbrt commented Nov 12, 2015

It might be restricted to the R9 270X, to the R9 series, to a subset of AMD cards, or (for some frustrating reason) only to my PC, but yes.

Would simply adding #version 110 to the default shaders break older configurations? If not, that appears to be an easy fix.

Thank you for the patience BTW.

@xoppa
Member

xoppa commented Nov 12, 2015

I doubt they use a different (pre)compiler implementation for every GPU, but it could be.

Indeed, as you've read in those issues I referred to, the #version directive is not compatible between GLES 2, GLES 3 and GL 3+.

@ncthbrt
Author

ncthbrt commented Nov 12, 2015

From the OpenGL ES reference:

Each compilation unit should declare the version of the language it is written to using the #version directive ("#version number"), where number must be 100 for this specification's version of the language (following the same convention as VERSION above), in which case the directive will be accepted with no errors or warnings. Any number less than 100 will cause an error to be generated. Any number greater than the latest version of the language a compiler supports will also cause an error to be generated. Version 100 of the language does not require compilation units to include this directive, and compilation units that do not include a #version directive will be treated as targeting version 100.

#version 100 works on desktop. It may be possible that #version 100 would work across all three of these GL targets.
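
As a quick experiment (a sketch only; whether a desktop GL3 forward-compatible context actually accepts an ES-style #version 100 is exactly what would need verifying), the directive could be prepended globally before the first SpriteBatch is created:

import com.badlogic.gdx.graphics.glutils.ShaderProgram;

public class Version100Experiment {
    /** Hypothetical helper: prepend an explicit ES-style version directive to all
     * shaders compiled after this call (call it before creating any SpriteBatch). */
    public static void apply () {
        ShaderProgram.prependVertexCode = "#version 100\n";
        ShaderProgram.prependFragmentCode = "#version 100\n";
    }
}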

@AndrazCepic

Hello, I'm not an expert but I am getting the same error as @ncthbrt when I try to run the desktop project from cmd with: gradlew desktop:run

I tried to create a new libGDX project and used the same command, but it failed as well. What I don't understand, however, is why my project runs perfectly fine in Eclipse ("useGL30 = true;" also works fine there), while building and running it with Gradle doesn't.

I am posting this here to confirm the issue that @ncthbrt reported, so I must add that I am also running Windows 10 and also have an AMD GPU with the latest drivers installed. Finally, my GPU is not from the R9 series but rather the HD 6000 series (an AMD Radeon HD 6670, specifically), so I guess this issue applies to all systems with the newest AMD drivers installed.

@ghost

ghost commented Jan 17, 2016

Same issue here, only on Linux

OS: Ubuntu x64 16.04
HD 7770
gdxVersion = '1.8.1-SNAPSHOT'

This issue appeared only after installing the proprietary Catalyst driver; using the open-source driver works fine.

If you need more info, don't hesitate to ask!

@xoppa
Member

xoppa commented Jan 18, 2016

Could you try adding the following code to your desktop starter?

config.useGL30 = true;
ShaderProgram.prependVertexCode = "#version 140\n#define varying out\n#define attribute in\n";
ShaderProgram.prependFragmentCode = "#version 140\n#define varying in\n#define texture2D texture\n#define gl_FragColor fragColor\nout vec4 fragColor;\n";
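
(For context, a complete desktop starter with those lines in place might look like the sketch below; the class names are taken from the gdx-setup template.)

package com.mygdx.game.desktop;

import com.badlogic.gdx.backends.lwjgl.LwjglApplication;
import com.badlogic.gdx.backends.lwjgl.LwjglApplicationConfiguration;
import com.badlogic.gdx.graphics.glutils.ShaderProgram;
import com.mygdx.game.MyGdxGame;

public class DesktopLauncher {
    public static void main (String[] arg) {
        LwjglApplicationConfiguration config = new LwjglApplicationConfiguration();
        config.useGL30 = true;
        // Prepend GL3-compatible prefixes before any ShaderProgram is compiled.
        ShaderProgram.prependVertexCode = "#version 140\n#define varying out\n#define attribute in\n";
        ShaderProgram.prependFragmentCode = "#version 140\n#define varying in\n#define texture2D texture\n#define gl_FragColor fragColor\nout vec4 fragColor;\n";
        new LwjglApplication(new MyGdxGame(), config);
    }
}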

@kibertoad

Could you try to add the following code to your desktop starter.

config.useGL30 = true;
ShaderProgram.prependVertexCode = "#version 140\n#define varying out\n#define attribute in\n";
ShaderProgram.prependFragmentCode = "#version 140\n#define varying in\n#define texture2D texture\n#define gl_FragC

I can confirm that this does indeed fix the mentioned problem with an AMD card (which I was experiencing myself).

@gordon13

gordon13 commented Apr 2, 2016

Thanks @kibertoad! I just want to say, in case someone else comes across this issue, that the code/settings posted by @kibertoad fix the issue (I have an AMD card: a Radeon 5770).

It also solves another issue I had with Box2DDebugRenderer not rendering the debug geometry. Before this fix, I had to use a custom default shader and construct the SpriteBatch like so:
sb = new SpriteBatch(100, createDefaultShader());
That worked around the version issue, but the Box2D debug renderer would not display anything. This new fix solves everything!

@kerberjg
Contributor

Is this issue still affecting anyone?

@kibertoad

Is this issue still affecting anyone?

It's still crashing for me on 1.9.3 when I'm not using the aforementioned ShaderProgram code.

@viftodi

viftodi commented Jun 24, 2016

Hello,

I just hit this issue running a hello-world kind of program on the latest libGDX.

The fix mentioned by @xoppa seems to work.

I also have an AMD Radeon GPU, an R9 280X.

@kennethjor

I've also recently started having this issue on libGDX 1.9.3 on Ubuntu 14.04, using an NVIDIA GeForce GTX 285 card with proprietary 304.132 drivers. I didn't change anything that I know of: no driver changes, no hardware changes, no code changes. It just stopped working from one day to the next.

I have tried the suggested ShaderProgram code as well as config.useGL30 = true; no luck so far.

@untiedgames

I suddenly noticed that I'm having this issue, and I'm using libGDX 1.9.4 on Windows 10. Graphics hardware is AMD Radeon HD 7900 + AMD Radeon HD 5670.

This is a weird one... I'm getting the error ONLY when I run my games from the command line (e.g. java -jar your_jar_here.jar).
Double-clicking the jar works. I thought it might be that I have multiple versions of Java installed: one associated with double-clicking, and one on the command line. This was indeed the case, but unfortunately, when I tested, neither version would run the jar from the command line. It crashes in a libGDX internal method when it tries to create the default shader.

Here's the confusing part: I have some old builds of my game that work via the command line, and some that don't. I swear they all worked before, and I don't recall updating my graphics drivers.

I can confirm that adding the code posted by @xoppa resolves the issue in 1.9.4.

@ghost

ghost commented Jan 23, 2017

@Schyfis the latest version is 1.9.5; maybe try it: https://libgdx.badlogicgames.com/versions.html

Start from scratch to have a clean base: uninstall every version of Java you have, then reinstall and reboot.

@untiedgames

One of my friends attempted to test my game and had a problem, so I'm back.

@ghost here are the steps I took:

  • Uninstalled ALL older versions of Java
  • Installed Java version 8 update 121 (latest version) and rebooted
  • Verified that the symlinks in the Java path folder are correct
  • Verified that my path points to the newly-installed version of Java
  • Updated libGDX to 1.9.6 (latest version) and rebuilt my project, re-exported as jar
  • Checked the Java version with java -version (output is 1.8.0_121 as expected)
  • Checked the Java version from my game's log file when run via double clicking the jar (output is 1.8.0_121 as expected)

I can absolutely, positively confirm that I now only have ONE version of Java installed, but the problem I noted in my previous post is the same: double-clicking a jar file to run it works, while running it from the command line does not, and presents me with the implicit version number error when creating the SpriteBatch.

The other thing I'm here to report is that I'm kind of in a pickle.
The code from @xoppa, minus the useGL30 line, solves the issue on my computer, but crashes when my friend tries to run it on his Mac, with the following error.

java.lang.IllegalArgumentException: Error compiling shader: ERROR: 0:1: '' :  version '140' is not supported
ERROR: 0:1: '' :  version '140' is not supported
    at com.badlogic.gdx.graphics.g2d.SpriteBatch.createDefaultShader(SpriteBatch.java:160)
    at com.badlogic.gdx.graphics.g2d.SpriteBatch.<init>(SpriteBatch.java:123)
    at com.badlogic.gdx.graphics.g2d.SpriteBatch.<init>(SpriteBatch.java:84)

I've sent my friend a version that includes the useGL30 line, and I'll report back when he tests it. Trouble is, if it works, I can't use it as a solution, because when I include that line it significantly impacts the game's performance on my computer.

@NathanSweet
Member

From @crykn: the issue seems to be related to specific hardware, so there’s nothing libGDX can do on its end; the Mac issue mentioned in the last post is a separate problem (only core profile support on Mac).
