
Fixed bug 1145 (GL Context creation fails for OpenGL 3.2 + Alpha buffer with X11 BadMatch)

 Matthias      2011-02-23 09:37:51 PST

Please view the attached source file. This minimal program creates an OpenGL 2.0
context and clears the color buffer. If I set the OpenGL version to 3.2,
SDL_GL_CreateContext fails (or, more specifically, glXMakeCurrent fails) with an
X11 BadMatch error:

X Error of failed request:  BadMatch (invalid parameter attributes)
  Major opcode of failed request:  128 (GLX)
  Minor opcode of failed request:  5 (X_GLXMakeCurrent)
  Serial number of failed request:  153
  Current serial number in output stream:  153

Also note that if I do not specify the alpha buffer size, the program works for
OpenGL 2.0 and OpenGL 3.2.

After some further analysis, I believe I have found the problem. The specific
issue is in:


Note that for OpenGL 3.2 contexts, the GLXFBConfig to use is chosen as the best
match from glXChooseFBConfig. However, the OpenGL attributes originally set with
SDL_GL_SetAttribute are never mapped to GLX attributes and passed to
glXChooseFBConfig. According to the GLX 1.4 specification, any attribute that is
not specified falls back to its default, and in this particular case the default
for GLX_ALPHA_SIZE is 0, which prefers configs with no alpha channel.

For testing purposes, I modified the call to glXChooseFBConfig to look
something like this (attribute list abbreviated here; it requests an 8-bit
alpha channel):

int glxAttribs[] = { GLX_ALPHA_SIZE, 8, None };

if (!glXChooseFBConfig ||
    !(framebuffer_config = glXChooseFBConfig(display, DefaultScreen(display),
                                             glxAttribs, &fbcount)))

The best-match GLXFBConfig then supports an 8-bit alpha channel, and the program
works as intended.

Hope this helps!
slouken committed Feb 25, 2011
1 parent 6779c29 commit f3eb4daad73efdd2cb3edae5c390857cb21e7b93
Showing with 22 additions and 7 deletions.
  1. +22 −7 src/video/x11/SDL_x11opengl.c
@@ -284,15 +284,14 @@ X11_GL_InitExtensions(_THIS)
 
-XVisualInfo *
-X11_GL_GetVisual(_THIS, Display * display, int screen)
+int
+X11_GL_GetAttributes(_THIS, Display * display, int screen, int * attribs, int size)
 {
-    XVisualInfo *vinfo;
-
-    /* 64 seems nice. */
-    int attribs[64];
     int i = 0;
 
+    /* assert buffer is large enough to hold all SDL attributes. */
+    /* assert(size >= 32);*/
+
     /* Setup our GLX attributes according to the gl_config. */
     attribs[i++] = GLX_RGBA;
     attribs[i++] = GLX_RED_SIZE;
@@ -366,6 +365,18 @@ X11_GL_GetVisual(_THIS, Display * display, int screen)
 
     attribs[i++] = None;
 
+    return i;
+}
+
+XVisualInfo *
+X11_GL_GetVisual(_THIS, Display * display, int screen)
+{
+    XVisualInfo *vinfo;
+
+    /* 64 seems nice. */
+    int attribs[64];
+    int i = X11_GL_GetAttributes(_this,display,screen,attribs,64);
 
     vinfo = _this->gl_data->glXChooseVisual(display, screen, attribs);
     if (!vinfo) {
@@ -422,6 +433,8 @@ X11_GL_CreateContext(_THIS, SDL_Window * window)
             SDL_SetError("GL 3.x is not supported");
             context = temp_context;
         } else {
+            int glxAttribs[64];
+
             /* Create a GL 3.x context */
             GLXFBConfig *framebuffer_config = NULL;
             int fbcount = 0;
@@ -436,10 +449,12 @@ X11_GL_CreateContext(_THIS, SDL_Window * window)
                               int *)) _this->gl_data->
                 glXGetProcAddress((GLubyte *) "glXChooseFBConfig");
 
+            X11_GL_GetAttributes(_this,display,DefaultScreen(display),glxAttribs,64);
             if (!glXChooseFBConfig
                 || !(framebuffer_config =
-                     glXChooseFBConfig(display, DefaultScreen(display), NULL,
+                     glXChooseFBConfig(display, DefaultScreen(display), glxAttribs,
                                        &fbcount))) {
                 SDL_SetError("No good framebuffers found. GL 3.x disabled");
