Support for AMD hybrid technology (PowerXpress) #52

Open
Lekensteyn opened this Issue Jan 21, 2012 · 124 comments

@Lekensteyn
Member

Lekensteyn commented Jan 21, 2012

It's confirmed that the concept for Bumblebee also works for AMD Hybrid graphics: http://forums.gentoo.org/viewtopic-t-909802.html (via http://phoronix.com/forums/showthread.php?68327-Bumblebee-Has-Tumbleweed-For-NVIDIA-Optimus-On-Linux&p=247659#post247659)

For supporting AMD hybrids, we need to:

  • modify the PCI Bus ID detection to check for both the NVIDIA and AMD vendor IDs (add an extra field to the bb_status struct)
  • modify documentation, texts, comments to refer to "discrete video card" or "discrete %s video card" instead of "nvidia card". Replace "Optimus" with "Hybrid Graphics" where applicable
  • adjust the switching methods in switch/ (mainly switcheroo) to detect radeon drivers (bbswitch won't detect a card and does not get loaded anyway)
  • separate xorg error log analysis?
  • add xorg.conf for radeon (and fglrx), extend bumblebee.conf
  • Look for ways to extend bbswitch supporting AMD, possibly helpful: http://git.kernel.org/?p=linux/kernel/git/next/linux-next-history.git;a=blob;f=drivers/gpu/drm/radeon/radeon_atpx_handler.c;hb=HEAD
  • (edit if I forgot something)

I don't have AMD hardware to play with, but it shouldn't be difficult to extend bbswitch for AMD PowerXpress. The other changes aren't difficult either.
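For reference, the two discrete vendor IDs involved are 10de (NVIDIA) and 1002 (AMD/ATI); a quick way to see which discrete vendor a machine has, sketched from a shell:

lspci -nn -d 10de:             # NVIDIA devices, if any
lspci -nn -d 1002:             # AMD/ATI devices, if any
lspci -nn | grep -E 'VGA|3D'   # all display controllers, vendor ID shown in brackets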

@Thulinma
Member

Thulinma commented Jan 22, 2012

Looks like all we really need is victi.... ehh... volunteers, to test stuff on.

@hirakendu

hirakendu commented Jan 23, 2012

Following up on the howto I wrote on the Gentoo forums, sharing some system logs (Xorg.log and dmesg) at http://hirakendu.mooo.com/powerexpress-stuff/logs/ for various combinations of Intel X.org drivers (2.15 vs 2.17), OpenGL libraries (mesa's vs fglrx's libGL.so), and first or second X server (like bumblebee).

Currently, the only working combination with fglrx is Intel X.org 2.15 (thanks to the tip in http://forums.gentoo.org/viewtopic-t-881115.html), with fglrx's libGL.so, on the primary server.

@Lekensteyn
Member

Lekensteyn commented Jan 28, 2012

There seem to be two kinds of AMD hybrid graphics (thanks for making it more confusing, AMD...). Relevant documentation:

Marketing:

From the Radeon 6000M series onward, it'll be called "Radeon" instead of "ATI Radeon". Not sure if this has implications for the PCI Vendor ID.

Lekensteyn added a commit that referenced this issue Jan 28, 2012

NVIDIA/Optimus -> AMD/hybrid graphics (GH-52)
Remaining textual changes: init scripts, bugreport

Lekensteyn added a commit that referenced this issue Jan 28, 2012

Add driver detection for fglrx+radeon (GH-52)
This breaks driver detection for optirun, which will be fixed when the protocol
is improved.

Lekensteyn added a commit that referenced this issue Jan 28, 2012

Adds support for radeon in vga_switcheroo method (GH-52)
Whether this method works for you still depends on switcheroo in kernel!
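(For context, the vga_switcheroo interface this method relies on lives in debugfs; a usage sketch, as root, assuming a kernel built with CONFIG_VGA_SWITCHEROO and debugfs mounted:)

cat /sys/kernel/debug/vgaswitcheroo/switch          # lists IGD (integrated) and DIS (discrete) with power state
echo OFF > /sys/kernel/debug/vgaswitcheroo/switch   # power off the card not driving a display
echo ON > /sys/kernel/debug/vgaswitcheroo/switch    # power it back on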
@Lekensteyn
Member

Lekensteyn commented Jan 28, 2012

Driver detection is now broken in optirun; to fix that, fix #31. The power management work is not complete yet.

@kravemir

kravemir commented May 23, 2012

I'm an Arch Linux user with an Intel/AMD VGA setup, and on Arch it's really hard to install Catalyst because it's not compatible with new xorg versions. So I'm open to helping you with this project and testing stuff.

@klausenbusk

klausenbusk commented Aug 1, 2012

Any update on this? Could I help in any way? I have an HP laptop with AMD/AMD hybrid graphics.

@Lekensteyn
Member

Lekensteyn commented Aug 2, 2012

No update on this, though AMD seems to be working on it. I saw some ATPX ACPI patches on the dri-devel mailing list. For you it means that there may be a way to switch between cards later.

Lekensteyn added a commit that referenced this issue Oct 5, 2012

NVIDIA/Optimus -> AMD/hybrid graphics (GH-52)
Remaining textual changes: init scripts, bugreport

Lekensteyn added a commit that referenced this issue Oct 5, 2012

Add driver detection for fglrx+radeon (GH-52)
The nvidia change is white-space only.
@Lekensteyn
Member

Lekensteyn commented Oct 5, 2012

Rebased on the development branch.

Todo:

  • separate xorg error log analysis?
  • extend bumblebee.conf (the driver-radeon and driver-fglrx sections; a sketch follows below)
  • no PM support yet, so use PMMethod=none. If you are using radeon, you could try switcheroo, but this does not work for suspend/resume. Work is in progress to fix that s/r issue (https://bugzilla.kernel.org/show_bug.cgi?id=44391)
  • update documentation with proprietary fglrx instructions

AMD users, please test the common-amd branch! When reporting, please mention whether you have an amd/amd or amd/intel setup.
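For testers, a minimal sketch of what those sections might look like in /etc/bumblebee/bumblebee.conf, assuming the same keys as the existing [driver-nvidia] section (the key names here are illustrative, not a confirmed format):

[driver-radeon]
# kernel module and PM method for the open-source driver (names assumed)
KernelDriver=radeon
PMMethod=none
# PMMethod=switcheroo also powers the card on/off, but breaks suspend/resume as noted above

[driver-fglrx]
KernelDriver=fglrx
PMMethod=none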

@klausenbusk

klausenbusk commented Oct 5, 2012

Can't get X to start on the discrete graphics card; it crashes with a "Segmentation fault". (open-source radeon driver)

See Xorg.log: https://gist.github.com/3842851

00:01.0 VGA compatible controller: Advanced Micro Devices [AMD] nee ATI BeaverCreek [Radeon HD 6520G]
01:00.0 VGA compatible controller: Advanced Micro Devices [AMD] nee ATI Whistler [AMD Radeon HD 6600M Series]

The hardware is an HP Pavilion dv6-6145eo with a Radeon HD 6755G2 (integrated AMD Radeon HD 6520G APU graphics plus a discrete AMD Radeon HD 6750M), and I can't get the X server to start on the "AMD Radeon HD 6750M". The setup is muxless.

uname -a 
Linux arch 3.5.4-1-ARCH #1 SMP PREEMPT Sat Sep 15 08:12:04 CEST 2012 x86_64 GNU/Linux

pacman -Q | grep ati
ati-dri 8.0.4-3
lib32-ati-dri 8.0.4-4
xf86-video-ati 1:6.14.6-1
@Lekensteyn
Member

Lekensteyn commented Oct 6, 2012

Can you rebuild xf86-video-ati and xorg-server with debugging symbols enabled? Use export CFLAGS='-g -O1' (or even -O0 for no optimization and better debugging).

Then run (adjust paths accordingly):

# gdb --args X :8 -config /etc/bumblebee/xorg.conf.radeon -sharevts -nolisten tcp -noreset -isolateDevice PCI:01:00:00
(gdb) r
... crash here ...
(gdb) bt
... or if it is not a segfault, just run ...
(gdb) c

(r = run, bt = backtrace, c = continue).

@Lekensteyn
Member

Lekensteyn commented Oct 7, 2012

@klausenbusk Please take it to the upstream developers (radeon and/or xorg-server). As far as I can see, it segfaults because no outputs are found in xf86Crtc.c:2362:

   2362     ret = xf86CollectEnabledOutputs(scrn, config, enabled);
   2363     if (ret == FALSE && canGrow) {
   2364         xf86DrvMsg(i, X_WARNING,
   2365                    "Unable to find connected outputs - setting %dx%d initial framebuffer\n",
   2366                    NO_OUTPUT_DEFAULT_WIDTH, NO_OUTPUT_DEFAULT_HEIGHT);
   2367         have_outputs = FALSE;
   2368     }

Then, at the end, a function is called that ultimately tries to access config->output (according to your backtrace), which is NULL:

   2503     xf86SetScrnInfoModes(scrn);

I think the radeon people cannot do much about it, but I might be wrong. The crash when output is NULL should be fixed by the xorg-server devs, though.

(If it helps, there is a thread on the Arch Linux forums with the same error message: https://bbs.archlinux.org/viewtopic.php?id=141616)

@klausenbusk

klausenbusk commented Oct 7, 2012

Thanks for your time!!
I have posted a bug at https://bugs.freedesktop.org/show_bug.cgi?id=55731 but I think I filed it under the wrong "Component" :)

Do you yourself have some ATI/AMD hardware to test with?

And how can I help with getting Bumblebee to work with Intel/AMD/ATI? Just write here if it crashes, or how?
I also have an older laptop with Intel/ATI that I could test with, and maybe I could test with fglrx on the laptop where radeon does not work.

@Lekensteyn
Member

Lekensteyn commented Oct 7, 2012

@klausenbusk Unfortunately I have no AMD hardware on a laptop, so I need to rely on users like you who want to get their stuff working.

Currently this bug is focused on getting optirun to run at all on AMD hardware (either intel/amd or amd/amd). Later on, power control should be made available as an extension to bbswitch (which would work for fglrx too) or as a fix to vga_switcheroo (which will only work for open-source drivers).

Please test it with all hybrid Intel/ATI/AMD+ATI/AMD laptops you have available. To help with fixing vga_switcheroo, please answer the questions given in https://bugzilla.kernel.org/show_bug.cgi?id=45061#c3

@klausenbusk

klausenbusk commented Oct 8, 2012

I have just tested a bit on my Intel/ATI laptop.
First, switcheroo seems to work (suspend/resume not tested; resume from suspend doesn't work for some reason on this laptop (black screen)).
I just needed to make a "[driver-radeon]" section and add "PMMethod=switcheroo"; maybe you could add "[driver-radeon]" and "[driver-fglrx]" sections to bumblebee.conf in git. Edit: there seems to be some problem with it, a little hard to explain. (Don't have time to write it up right now.)

Now the problems begin: I can't start oilrush with optirun ([VGL] ERROR: glXCreateContextAttribsARB symbol not loaded). It starts fine on the integrated card and fine on the discrete card (the laptop has a multiplexer), but I can't get it started with optirun.
Logs: (say if you need more info)
bumblebeed --debug
Too long to paste here: https://gist.github.com/3854709
optirun --debug oilrush

[kristian@arch ~]$ optirun --debug oilrush
[  285.121137] [DEBUG]optirun version 3.0-51-ga00f533 starting...
[  285.121211] [DEBUG]Active configuration:
[  285.121249] [DEBUG] bumblebeed config file: /etc/bumblebee/bumblebee.conf
[  285.121277] [DEBUG] X display: :8
[  285.121303] [DEBUG] LD_LIBRARY_PATH: 
[  285.121330] [DEBUG] Socket path: /var/run/bumblebee.socket
[  285.121357] [DEBUG] VGL Compression: proxy
[  297.755905] [INFO]Response: Yes. X is active.

[  297.755930] [INFO]Running application through vglrun.
[  297.756071] [DEBUG]Process vglrun started, PID 8304.
QGtkStyle was unable to detect the current GTK+ theme.
Loading "/home/kristian/.OilRush/oilrush.cfg"...
Loading "libNetwork_x86.so"...
Loading "libGL.so.1"...
Loading "libopenal.so.1"...
Set 1366x768 fullscreen video mode
[VGL] ERROR: glXCreateContextAttribsARB symbol not loaded
AL lib: ALc.c:1879: exit(): closing 1 Device
AL lib: ALc.c:1808: alcCloseDevice(): destroying 1 Context(s)
[  303.904476] [DEBUG]SIGCHILD received, but wait failed with No child processes
[  303.904705] [DEBUG]Socket closed.
[  303.904873] [DEBUG]Killing all remaining processes.

I have looked at #124 but there seems to be no fix.

Minecraft won't start with "optirun java -jar minecraft.jar" but works with "optirun --vgl-options -nodl java -jar minecraft.jar"; does something need to be fixed somewhere?

@Lekensteyn
Member

Lekensteyn commented Oct 9, 2012

-nodl prevents VGL from intercepting libGL library calls, possibly preventing optirun from having any effect.

switcheroo is supposed to work for ON/OFF until you do a suspend. When you get a black screen, you can try ssh'ing into your machine and looking for errors in your dmesg and Xorg.0.log.

Do optirun glxgears and optirun glxspheres work? glXCreateContextAttribsARB exists in the Mesa GL library; not sure why it does not work here.

Oh, and can you compare /var/log/Xorg.0.log against Xorg.8.log? The integrated GPU is also listed in Xorg.8.log, which looks a bit strange.

@klausenbusk

klausenbusk commented Oct 9, 2012

I'll report back when I have tested with Mesa 9.0, where the Intel driver has OpenGL 3.1 and radeon has 3.0:
http://www.phoronix.com/scan.php?page=news_item&px=MTIwMjE
http://www.phoronix.com/scan.php?page=news_item&px=MTIwMjQ
It should work with Mesa 9.0, because glXCreateContextAttribsARB requires OpenGL 3.0 or later.
The current intel and radeon drivers only have OpenGL 2.1.
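A quick way to check that on a given machine (glxinfo is in mesa-demos on Arch):

glxinfo | grep "OpenGL version"            # version exposed by the default (integrated) driver
optirun glxinfo | grep "OpenGL version"    # version on the Bumblebee X server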

@klausenbusk

klausenbusk commented Oct 13, 2012

Mesa 9.0 fixes nothing, same error. Don't know if it needs kernel 3.6?

@BlueCase

BlueCase commented Oct 15, 2012

We already had the glXCreateContextAttribsARB problem with the first bumblebee version:
MrMEEE/bumblebee-Old-and-abbandoned#314

@klausenbusk

klausenbusk commented Oct 15, 2012

Thanks BlueCase, I believe you. I will try Primus as an alternative to VirtualGL.

@BlueCase

BlueCase commented Oct 15, 2012

I will check the VirtualGL package tomorrow. Last time MrMEEE solved it, and games like OilRush were playable.

@HariSeldon85

HariSeldon85 commented Nov 3, 2012

Hi,
I'm an Arch Linux user with a Dell Inspiron 15R SE: HD4000 + 7730M.
I would like to help with testing and, if I can, with development.
What is the current state of the project?

Thanks,
HS

@Lekensteyn
Member

Lekensteyn commented Nov 3, 2012

@HariSeldon85 The state is still the same as a month ago. Does switcheroo ON/OFF work for you? Is the system initially booted on the more powerful chip or the power-efficient one?
Does Bumblebee from the common-amd branch work for you?

@HariSeldon85

HariSeldon85 commented Nov 3, 2012

Hi,
Thanks for the answer.
I've just cloned the common-amd branch and I'm trying to compile and install.
After I've generated the configure file with "autoreconf -fi", how should I "adjust the library and set the module path"?

The readme says:
"./configure CONF_DRIVER=nvidia CONF_DRIVER_MODULE_NVIDIA=nvidia-current
CONF_LDPATH_NVIDIA=/usr/lib/nvidia-current:/usr/lib32/nvidia-current
CONF_MODPATH_NVIDIA=/usr/lib/nvidia-current/xorg,/usr/lib/xorg/modules"

I'm using the radeonhd open driver.

Sorry for asking, but these days I don't have much time to investigate myself.
Thanks

HS

@Lekensteyn
Member

Lekensteyn commented Nov 3, 2012

For the open-source drivers it is sufficient to use ./configure --sysconfdir /etc CONF_DRIVER=radeon; no special library or module paths are necessary.
Then run make && sudo make install. If you prefer, you can use this PKGBUILD. The only changes you need are the ./configure line I mentioned, plus changing _gitbranch=develop to _gitbranch=common-amd.
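Putting the whole sequence together for the open-source radeon setup (repository URL assumed; adjust to wherever you clone from):

git clone -b common-amd https://github.com/Bumblebee-Project/Bumblebee.git
cd Bumblebee
autoreconf -fi
./configure --sysconfdir /etc CONF_DRIVER=radeon
make
sudo make install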

@HariSeldon85

HariSeldon85 commented Nov 4, 2012

Here we are.
I'll try to do a little report as best as I can.

-- ASSUMPTIONS:

I have an Ivy Bridge i7 CPU with Intel HD4000 integrated graphics and a Radeon HD 7730M (Dell Inspiron 15R SE, a.k.a. Inspiron 7520).

After I install Arch Linux (or any other distro), the system starts on the HD4000 GPU.
This is because my system is a muxless one, so Xorg can't do the rendering on the Radeon GPU because it is not attached to any display (the same behaviour as Nvidia/Optimus, I think).

The reason above is why switcheroo doesn't work out of the box for me and I cannot activate/use the discrete GPU.
The only working feature of switcheroo is powering off the Radeon card: if I do an "echo OFF > /sys/kernel/debug/vgaswitcheroo/switch" in /etc/rc.local, it really powers off the Radeon GPU (checked with "lspci -vnnn | grep VGA").

-- BUMBLEBEE BEHAVIOUR:

Now on to Bumblebee.
I installed the Bumblebee common-amd branch with the PKGBUILD you linked. Installation was fine.
Then I rebooted.

The system boots up correctly with the intel driver (checked with glxinfo).
So I tried "optirun glxgears", but I noticed that the bumblebee service was not running and my user wasn't added to the "bumblebee" group.

So I had to manually:

  1. start the bumblebee daemon with the systemd service
  2. add my user to the "bumblebee" group.

After that I tried "optirun glxgears" again, but I got this error:

................
[gigi@sirius ~]$ optirun -vv glxgears
[ 137.139406] [DEBUG]Reading file: /etc/bumblebee/bumblebee.conf
[ 137.139838] [INFO]Configured driver: radeon
[ 137.140190] [DEBUG]optirun version 3.0-51-ga00f533 starting...
[ 137.140216] [DEBUG]Active configuration:
[ 137.140224] [DEBUG] bumblebeed config file: /etc/bumblebee/bumblebee.conf
[ 137.140231] [DEBUG] X display: :8
[ 137.140237] [DEBUG] LD_LIBRARY_PATH:
[ 137.140244] [DEBUG] Socket path: /var/run/bumblebee.socket
[ 137.140250] [DEBUG] VGL Compression: proxy
[ 137.157851] [INFO]Response: No - error: XORG Failed to load module "mouse" (module does not exist, 0)

[ 137.157883] [ERROR]Cannot access secondary GPU - error: XORG Failed to load module "mouse" (module does not exist, 0)

[ 137.157893] [DEBUG]Socket closed.
[ 137.157923] [ERROR]Aborting because fallback start is disabled.
[ 137.157933] [DEBUG]Killing all remaining processes.
................

So I googled around and found this post of yours:
#123 (comment)

But in the /etc/X11/xorg.conf.d/ folder I have only 3 files: 10-synaptics.conf, 10-quirks.conf, 10-evdev.conf,
and none of these files forces the intel driver.

Now I'm stuck here. I don't know what to do.

Hope this little report helps.

Really, thanks for your hard work, and let me know how I could help.

Thanks,

HS

@klausenbusk

klausenbusk commented Nov 5, 2012

HariSeldon85: Try installing xf86-input-mouse to get the "mouse" module (that's what I do :)), and remove the changes you have made to the xorg.conf.d files.
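On Arch that amounts to, for example:

sudo pacman -S xf86-input-mouse   # provides the X.org "mouse" input driver module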

@HariSeldon85

HariSeldon85 commented Nov 5, 2012

Ok,
Done :)

Now the error is the following:

[gigi@sirius ~]$ optirun -vv glxgears
[ 23.653104] [DEBUG]Reading file: /etc/bumblebee/bumblebee.conf
[ 23.653587] [INFO]Configured driver: radeon
[ 23.653900] [DEBUG]optirun version 3.0-51-ga00f533 starting...
[ 23.653926] [DEBUG]Active configuration:
[ 23.653934] [DEBUG] bumblebeed config file: /etc/bumblebee/bumblebee.conf
[ 23.653941] [DEBUG] X display: :8
[ 23.653947] [DEBUG] LD_LIBRARY_PATH:
[ 23.653954] [DEBUG] Socket path: /var/run/bumblebee.socket
[ 23.653960] [DEBUG] VGL Compression: proxy
[ 23.678535] [INFO]Response: No - error: XORG No devices detected.

[ 23.678554] [ERROR]Cannot access secondary GPU - error: XORG No devices detected.

[ 23.678558] [DEBUG]Socket closed.
[ 23.678573] [ERROR]Aborting because fallback start is disabled.
[ 23.678578] [DEBUG]Killing all remaining processes.

@klausenbusk

klausenbusk commented Nov 5, 2012

What does bumblebeed say? (Try running it in a terminal in debug mode.) Is your radeon card enabled in vgaswitcheroo?

@HariSeldon85

HariSeldon85 commented Nov 5, 2012

The debug output is the following:

......................
[gigi@sirius ~]$ optirun --debug glxgears
[ 74.372345] [DEBUG]optirun version 3.0-51-ga00f533 starting...
[ 74.372388] [DEBUG]Active configuration:
[ 74.372397] [DEBUG] bumblebeed config file: /etc/bumblebee/bumblebee.conf
[ 74.372404] [DEBUG] X display: :8
[ 74.372410] [DEBUG] LD_LIBRARY_PATH:
[ 74.372417] [DEBUG] Socket path: /var/run/bumblebee.socket
[ 74.372424] [DEBUG] VGL Compression: proxy
[ 74.396452] [INFO]Response: No - error: XORG No devices detected.

[ 74.396483] [ERROR]Cannot access secondary GPU - error: XORG No devices detected.

[ 74.396493] [DEBUG]Socket closed.
[ 74.396524] [ERROR]Aborting because fallback start is disabled.
[ 74.396535] [DEBUG]Killing all remaining processes.
........................

The radeon card is powered on, as vgaswitcheroo says:

........................
[gigi@sirius ~]$ cat /sys/kernel/debug/vgaswitcheroo/switch
0:DIS: :Pwr:0000:01:00.0
1:IGD:+:Pwr:0000:00:02.0
........................

and the radeon module is loaded, as shown by get_module:
........................
[gigi@sirius ~]$ sudo get_module radeon
coresize : 859398
initsize : 0
initstate : live
refcnt : 0
taint :
uevent : (null)
Parameters:
agpmode : 0
audio : 0
benchmark : 0
connector_table : 0
disp_priority : 0
dynclks : -1
gartsize : 512
hw_i2c : 0
lockup_timeout : 10000
modeset : 1
msi : -1
no_wb : 0
pcie_gen2 : -1
r4xx_atom : 0
test : 0
tv : 1
vramlimit : 0
Sections:
.bss : 0xffffffffa0872be0
.data : 0xffffffffa0869500
.data.unlikely : 0xffffffffa0872810
.devinit.text : 0xffffffffa0842c50
.exit.text : 0xffffffffa0842d09
.fixup : 0xffffffffa0842d27
.gnu.linkonce.this_module : 0xffffffffa0872980
.init.text : 0xffffffffa0961000
.note.gnu.build-id : 0xffffffffa0842f78
.parainstructions : 0xffffffffa08693e0
.rodata : 0xffffffffa0842fa0
.rodata.str1.1 : 0xffffffffa08568c1
.rodata.str1.8 : 0xffffffffa085cda8
.smp_locks : 0xffffffffa0869098
.strtab : 0xffffffffa0974470
.symtab : 0xffffffffa09610f0
.text : 0xffffffffa07b4000
__bug_table : 0xffffffffa08692f0
__ex_table : 0xffffffffa0869118
__jump_table : 0xffffffffa0872780
__mcount_loc : 0xffffffffa0866728
__param : 0xffffffffa0866508
__tracepoints_ptrs : 0xffffffffa0869420
__tracepoints_strings : 0xffffffffa0869450
__tracepoints : 0xffffffffa0872840
_ftrace_events : 0xffffffffa0872818
...................

This is a clean Arch installation, just for Bumblebee testing purposes :).
No rc.local script, no radeon blacklisting or modprobing. Just the intel and radeon driver packages installed, and then Bumblebee.

@amonakov
Contributor

amonakov commented Jul 3, 2013

I suppose you're using the open-source radeon driver, not fglrx, right?

Please try with /etc/X11/xorg.conf and /opt/bumblebee/bumblebee/xorg.conf.radeon modified to explicitly set the BusID for each card.

Again, I'd appreciate it if you explained why you're trying to use Bumblebee rather than PRIME offloading.
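For anyone trying this, a sketch of such an explicit Device section (the BusID value is an example; take the bus address from your own lspci output, in the decimal PCI:bus:device:function form X.org expects):

Section "Device"
    Identifier "DiscreteCard"
    Driver     "radeon"
    BusID      "PCI:1:0:0"
EndSection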

@Ekleog

Ekleog commented Jul 3, 2013

Yes, I'm using radeon.

I don't use PRIME offloading because, according to this page, PRIME requires a compositing manager, which I do not use (i3 is not compositing), and a second screen for the slave card (not sure if I understand correctly, but I did not manage to make X accept running without a screen -- and an almost empty screen section did not do it). The real reason being I did not know about it, though.

If I explicitly set the radeon PCI bus ID in /opt/bumblebee/bumblebee/xorg.conf.radeon, I get the following part (up to the KMS line everything is just the same):

[ 5376.523715] [DEBUG][XORG] (II) [KMS] Kernel modesetting enabled.
[ 5376.523735] [DEBUG][XORG] (II) RADEON(0): Creating default Display subsection in Screen section
[ 5376.523755] [DEBUG][XORG]    "Screen0" for depth/fbbpp 24/32
[ 5376.523784] [DEBUG][XORG] (==) RADEON(0): Depth 24, (--) framebuffer bpp 32
[ 5376.523804] [DEBUG][XORG] (II) RADEON(0): Pixel depth = 24 bits stored in 4 bytes (32 bpp pixmaps)
[ 5376.523822] [DEBUG][XORG] (==) RADEON(0): Default visual is TrueColor
[ 5376.523844] [DEBUG][XORG] (==) RADEON(0): RGB weight 888
[ 5376.523866] [DEBUG][XORG] (II) RADEON(0): Using 8 bits per RGB (8 bit DAC)
[ 5376.523888] [DEBUG][XORG] (--) RADEON(0): Chipset: "TURKS" (ChipID = 0x6840)
[ 5376.523905] [DEBUG][XORG] (II) Loading sub module "dri2"
[ 5376.523927] [DEBUG][XORG] (II) LoadModule: "dri2"
[ 5376.523948] [DEBUG][XORG] (II) Module "dri2" already built-in
[ 5376.523967] [DEBUG][XORG] (II) Loading sub module "exa"
[ 5376.523987] [DEBUG][XORG] (II) LoadModule: "exa"
[ 5376.524007] [DEBUG][XORG] (II) Loading /usr/lib/xorg/modules/libexa.so
[ 5376.524028] [DEBUG][XORG] (II) Module exa: vendor="X.Org Foundation"
[ 5376.524048] [DEBUG][XORG]    compiled for 1.14.2, module version = 2.6.0
[ 5376.524076] [DEBUG][XORG]    ABI class: X.Org Video Driver, version 14.1
[ 5376.524103] [DEBUG][XORG] (II) RADEON(0): KMS Color Tiling: enabled
[ 5376.524122] [DEBUG][XORG] (II) RADEON(0): KMS Color Tiling 2D: enabled
[ 5376.524143] [DEBUG][XORG] (II) RADEON(0): KMS Pageflipping: enabled
[ 5376.524166] [DEBUG][XORG] (II) RADEON(0): SwapBuffers wait for vsync: enabled
[ 5376.524189] [DEBUG][XORG] (WW) RADEON(0): No outputs definitely connected, trying again...
[ 5376.524209] [DEBUG][XORG] (WW) RADEON(0): Unable to find connected outputs - setting 1024x768 initial framebuffer
[ 5376.524230] [DEBUG][XORG] (II) RADEON(0): Using default gamma of (1.0, 1.0, 1.0) unless otherwise stated.
[ 5376.524251] [DEBUG][XORG] (II) RADEON(0): mem size init: gart size :1fdef000 vram size: s:80000000 visible:7fcc0000
[ 5376.524271] [DEBUG][XORG] (II) RADEON(0): EXA: Driver will allow EXA pixmaps in VRAM
[ 5376.524291] [DEBUG][XORG] (==) RADEON(0): DPI set to (96, 96)
[ 5376.524310] [DEBUG][XORG] (II) Loading sub module "fb"
[ 5376.524329] [DEBUG][XORG] (II) LoadModule: "fb"
[ 5376.524350] [DEBUG][XORG] (II) Loading /usr/lib/xorg/modules/libfb.so
[ 5376.524370] [DEBUG][XORG] (II) Module fb: vendor="X.Org Foundation"
[ 5376.524390] [DEBUG][XORG]    compiled for 1.14.2, module version = 1.0.0
[ 5376.524418] [DEBUG][XORG]    ABI class: X.Org ANSI C Emulation, version 0.4
[ 5376.524445] [DEBUG][XORG] (II) Loading sub module "ramdac"
[ 5376.524464] [DEBUG][XORG] (II) LoadModule: "ramdac"
[ 5376.524483] [DEBUG][XORG] (II) Module "ramdac" already built-in
[ 5376.524503] [ERROR][XORG] (EE) RADEON(0): No modes.
[ 5376.524522] [DEBUG][XORG] (II) UnloadModule: "radeon"
[ 5376.524543] [DEBUG][XORG] (II) UnloadSubModule: "fb"
[ 5376.524563] [DEBUG][XORG] (II) Unloading fb
[ 5376.524582] [DEBUG][XORG] (II) UnloadSubModule: "exa"
[ 5376.524601] [DEBUG][XORG] (II) Unloading exa
[ 5376.524621] [ERROR][XORG] (EE) Screen(s) found, but none have a usable configuration.
[ 5376.524641] [ERROR][XORG] (EE) 
[ 5376.524658] [DEBUG][XORG] Fatal server error:
[ 5376.524680] [ERROR][XORG] (EE) no screens found(EE) 
[ 5376.524701] [ERROR][XORG] (EE) 
[ 5376.524720] [DEBUG][XORG] Please consult the The X.Org Foundation support 
[ 5376.524740] [DEBUG][XORG]     at http://wiki.x.org
[ 5376.524767] [DEBUG][XORG]  for help. 
[ 5376.524789] [ERROR][XORG] (EE) Please also check the log file at "/var/log/Xorg.8.log" for additional information.
[ 5376.524812] [ERROR][XORG] (EE) 
[ 5376.524833] [ERROR][XORG] (EE) Server terminated with error (1). Closing log file.
[ 5376.524853] [ERROR]X did not start properly
[ 5376.524942] [DEBUG]Socket closed.

Looks like it went a bit farther!

@zette

zette commented Jul 3, 2013

To be 100% honest, I must say that I missed PRIME's introduction. Now I have tested it, and the offloading works fine.
There are, however, problems with switcheroo PM - the discrete card has to be ON while the X server starts (so xrandr sees both display adapters), and turning it OFF when not needed sometimes breaks the X server.

My setup builds up heat quickly while the discrete card is turned on, so running it all the time is not possible.
Maybe you know how to force xrandr to see both cards after turning the discrete card ON while already in X?

@klausenbusk

klausenbusk commented Jul 3, 2013

You all talk about PRIME, but I can't really find anything about it, just some blog posts. Where can I get it from and "install" it?

@zette

zette commented Jul 3, 2013

PRIME is included in xorg 1.13 and the corresponding graphics card drivers.
Verification:
xrandr --listproviders
should return a list of integrated and discrete cards with indexes 0 and 1.

Running an application with the environment variable DRI_PRIME set to 0 or 1 will run it on the corresponding engine,
e.g.: DRI_PRIME=1 glxinfo

There can be a problem with the default settings of the main and offload engines; in that case the selection must be made manually with 'xrandr --setprovideroffloadsink' followed by the ids from --listproviders.

For more information check this: http://phoronix.com/forums/showthread.php?73649-Nouveau-Releases-New-Driver-With-PRIME-Support
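Condensed into a runnable sequence (the provider names "radeon" and "Intel" are the ones reported later in this thread; use whatever --listproviders prints on your system):

xrandr --listproviders                         # a working setup lists two providers
xrandr --setprovideroffloadsink radeon Intel   # let the radeon provider offload to the Intel sink
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"   # should now report the discrete card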

@klausenbusk

klausenbusk commented Jul 3, 2013

I say Thanks zette! :)

@Ekleog

Ekleog commented Jul 3, 2013

Well... So... I use xorg 1.14, with xorg-video-{intel,ati} installed, xrandr version 1.4, and...

$ xrandr --listproviders                       
Providers: number : 1
Provider 0: id: 0x46 cap: 0xb, Source Output, Sink Output, Sink Offload crtcs: 3 outputs: 4 associated providers: 0 name:Intel

So it looks like PRIME is not yet supported for my card (as hinted by post number 8 of your link).

@zette

zette commented Jul 4, 2013

Do:
sudo cat /sys/kernel/debug/vgaswitcheroo/switch
and check whether both cards are powered on after X starts. If not, you will probably need to check your kernel boot parameters and/or startup scripts to see whether they enable power management on start, and possibly force the discrete card to power ON during the startup sequence.

@Ekleog

Ekleog commented Jul 4, 2013

Oh, it needs to be powered on since boot?
I had just powered it on manually with # echo ON > /sys/kernel/debug/vgaswitcheroo/switch to test.

OK... Just tried stopping X, enabling the radeon, and starting X again, and I now have two providers!

After xrandr --setprovideroffloadsink radeon Intel, I can run glxspheres (e.g.) with both DRI_PRIME=0 and DRI_PRIME=1 (and get a 3x speedup).

However, I can see the result only in full screen. I suppose this is due to i3 not being compositing?

@ChristophHaag

ChristophHaag commented Jul 4, 2013

  1. This is the wrong place to discuss this.
  2. A "lightweight" compositing manager like xcompmgr should give you compositing for any window manager (I don't know if it actually works together with i3, but it should).
@Ekleog

Ekleog commented Jul 4, 2013

Yes, sorry.

Anyway, if I can do anything to help Bumblebee on an intel/AMD setup, please tell me.

@quantax0

This comment has been minimized.

Show comment
Hide comment
@quantax0

quantax0 Jul 18, 2013

I /think/ i was finally able to get this to work -- I did some very rudimentary benchmarks, and it /appears/ that virtualgl is offloading to my 7970m -- the performance on vglrun glxspheres is showing 2x as many FPS as just running glxspheres directly

however, i do notice, when i run vglrun glxspheres it shows the following:

OpenGL Renderer: Mesa DRI Intel(R) Ivybridge Mobile

is this normal, that the program would detect the intel card, even though it's getting processed by the radeon via vgl?

@Hohahiu

Hohahiu commented Jul 18, 2013

What works? intel+fglrx or intel+radeon?

@YoRyan

YoRyan commented Jul 20, 2013

Hello,

I got the common-amd branch to work with my HP dv7 laptop, which has a quad-core Intel i7-2630QM CPU and a Radeon HD 6400M GPU. I used a combination of fglrx (with PowerXpress support) and i915. With some hacks I got direct rendering working on both cards simultaneously, and I can call the ATI card using optirun/VirtualGL.

My question is whether AMD hybrid support is still in the works; I am interested in using the latest Bumblebee and primus instead of VirtualGL.

@amonakov

Contributor

amonakov commented Jul 20, 2013

I don't think anybody has recently been working on merging the common-amd branch.

primus should work; it's not nVidia-specific.

@YoRyan

YoRyan commented Jul 20, 2013

I have never gotten primus to run.

> optirun glxinfo | head
name of display: :0
display: :0  screen: 0
direct rendering: Yes
server glx vendor string: VirtualGL
server glx version string: 1.4
server glx extensions:
    GLX_ARB_create_context, GLX_ARB_create_context_profile, 
    GLX_ARB_get_proc_address, GLX_ARB_multisample, GLX_EXT_visual_info, 
    GLX_EXT_visual_rating, GLX_SGIX_fbconfig, GLX_SGIX_pbuffer, 
    GLX_SGI_make_current_read, GLX_SUN_get_transparent_index
> primusrun glxinfo | head
Xlib:  extension "NV-GLX" missing on display ":8".
XIO:  fatal IO error 11 (Resource temporarily unavailable) on X server ":8"
      after 28 requests (28 known processed) with 0 events remaining.
name of display: :0

Presumably I need Bumblebee with primus support, which the amd branch lacks.
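As an aside, primus can be pointed at a different accelerated libGL through its environment variables (PRIMUS_libGLa is documented in the primus README; the fglrx path below is an assumption based on the Catalyst layout discussed later in this thread):

PRIMUS_libGLa=/usr/lib/catalystpxp/fglrx/libGL.so.1 primusrun glxinfo | head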

@ChristophHaag

ChristophHaag Aug 1, 2013

OK, I can confirm that fglrx + Bumblebee works now, but not out of the box. Let me try to gather everything I did. This is specific to a certain package in the Arch Linux AUR, modified for AMD's beta driver: http://www2.ati.com/drivers/beta/amd-catalyst-13.15.100.1-linux-x86.x86_64.zip

So this is only a rough checklist of things to watch out for if it doesn't work for you.

bumblebee.conf needs this added to the config:

[driver-fglrx]
KernelDriver=fglrx
PMMethod=none
LibraryPath=/usr/lib/catalystpxp/fglrx:/usr/lib32/catalystpxp/fglrx
XorgModulePath=/usr/lib/xorg,/usr/lib/xorg/modules,/usr/lib/xorg/modules/updates/extensions/fglrx/
XorgConfFile=/etc/bumblebee/xorg.conf.fglrx

Probably doesn't need /usr/lib/xorg in the XorgModulePath, but whatever.

/usr/lib/xorg/modules/updates/extensions/fglrx/ contains fglrx-libglx.so, and I symlinked it to libglx.so in the same directory because the linker wouldn't pick it up otherwise. You'll notice this if you run bumblebeed --debug -C /your/config and get a symbol lookup error.
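For anyone hitting the same symbol lookup error, the symlink step amounts to this (paths are those of the AUR package mentioned above and may differ on other setups):

cd /usr/lib/xorg/modules/updates/extensions/fglrx/
ln -s fglrx-libglx.so libglx.so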

/etc/bumblebee/xorg.conf.fglrx is the one from the repository I think:

Section "ServerLayout"
    Identifier "Layout0"
    Option "AutoAddDevices" "false"
EndSection

Section "Device"
    Identifier "Device1"
    Driver "fglrx"
EndSection

To make your integrated GPU work as well, make sure that LIBGL_DRIVERS_PATH and LD_LIBRARY_PATH are not set to fglrx's paths by any script, e.g. in /etc/profile.d.

Then make sure that in /etc/X11/xorg.conf.d/ there is no ModulePath set to any of fglrx's paths.
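A quick way to check for stray entries:

grep -r ModulePath /etc/X11/xorg.conf.d/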

I found that I had to create 00-libglxModulePath.conf with content

Section "Files"
         ModulePath   "/usr/lib/xorg/modules/extensions"
         ModulePath   "/usr/lib/xorg/modules"
EndSection

because X would pick fglrx's libglx first for some reason.

That's pretty much it.

In /usr/lib/xorg/modules/dri I have fglrx_dri.so and the intel stuff. In /usr/lib/xorg/modules/drivers/ I have fglrx_drv.so and the intel stuff. I also have /usr/lib/xorg/modules/linux/libfglrxdrm.so, and lastly, as already mentioned, fglrx's libglx in /usr/lib/xorg/modules/updates/extensions/fglrx/.

Edit: Oh yes, in /usr/lib/catalystpxp there is fglrx's libGL.so (and the symlinks libGL.so.1 and libGL.so.1.2).

I'm not completely sure how everything finds the correct paths with this strange setup, but it seems to work.

I have only one little problem: after the X server with fglrx started by optirun exits, my main X server gets all messed up, and I have to switch to a tty and back to X for it to render correctly again; but then my second screen shows only black...? Probably more of an intel problem, I would guess...

Last question: Catalyst doesn't seem to power off the card when it's not in use. What would be an appropriate method for radeon cards?

@amonakov

Contributor

amonakov commented Aug 1, 2013

What would be an appropriate method for radeon cards?

Unload Catalyst, load the open-source driver, and power down the card via vgaswitcheroo; a rough sketch follows below. The relevant ACPI method is ATPX. On nVidia systems, bbswitch can take care of invoking the ACPI method (_DSM), but nobody has worked on adding ATPX handling to bbswitch.
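A sketch of that manual procedure (as root, and assuming no X server is using the card at the time):

# unload the proprietary driver and load the open-source one
modprobe -r fglrx
modprobe radeon
# power down the now-inactive discrete card
echo OFF > /sys/kernel/debug/vgaswitcheroo/switch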

It looks like you haven't attempted to test primus? The conversation about it continued here: amonakov/primus#104

@YoRyan

YoRyan commented Aug 2, 2013

@ChristophHaag I put together some scripts and config files to get Bumblebee to play nicely with fglrx:
https://github.com/YoRyan/bumblebee-amd-hacks

The X server corruption can be avoided by sending kill -s 9 to the secondary X server instead of a normal termination (lol). The radeon card can be turned off using a single ACPI call; however, I have not found a way to turn it back on without rebooting.

@ChristophHaag

ChristophHaag Aug 14, 2013

@amonakov That is a possibility, but I have had severe problems with kernel panics when unloading and reloading radeon/fglrx. I think for radeon this has finally been solved in 3.10 or so, but with fglrx I still get them; for example, this one was just today:

[screenshot: kernel panic]

@YoRyan Yes, kill -9 does kill the other X server without affecting my primary one. It took a few seconds to track down: if people want that behaviour from Bumblebee by default, it is in bb_stop_wait in src/bbrun.c.

I just commented out the other stuff:

    //the first 10 attempts, use SIGTERM
//    if (i < 10) {
//      kill(proc, SIGTERM);
//    } else {
      //after that, use SIGKILL
      kill(proc, SIGKILL);
//    }

But it would be better to have it fixed upstream in the Xorg server... this does qualify as a bug, doesn't it?

@f-r-i-t-z

f-r-i-t-z Sep 14, 2013

Hello everyone. Please forgive me if I'm posting this in the wrong place.
So, let's go...
I'm an unhappy and rather annoyed owner of an HP Envy with an Intel HD 3000 and an AMD 6850M (maybe the only notebook using this card).
I'm trying to use Bumblebee + primus in the faint hope of getting my discrete card working under Linux (since under Windows the card doesn't really work either).

First I tried the radeon Xorg driver. At first sight everything seems to work (tested under Arch and Ubuntu 13.10), and under Arch at least it does work, but I'm concerned about the performance when comparing it with DRI_PRIME.
Here is an example:


LIBGL_DEBUG=verbose optirun -vvvvv --debug  glxspheres --info
[  188.447936] [DEBUG]Reading file: /etc/bumblebee/bumblebee.conf
[  188.448354] [INFO]Configured driver: radeon
[  188.448708] [DEBUG]optirun version 3.0-51-ga00f533 starting...
[  188.448738] [DEBUG]Active configuration:
[  188.448748] [DEBUG] bumblebeed config file: /etc/bumblebee/bumblebee.conf
[  188.448757] [DEBUG] X display: :8
[  188.448765] [DEBUG] LD_LIBRARY_PATH: 
[  188.448773] [DEBUG] Socket path: /var/run/bumblebee.socket
[  188.448781] [DEBUG] VGL Compression: proxy
[  188.448922] [INFO]Response: Yes. X is active.
[  188.448946] [INFO]Running application through vglrun.
[  188.449098] [DEBUG]Process vglrun started, PID 1237.
Polygons in scene: 62464
libGL: OpenDriver: trying /usr/lib/xorg/modules/dri/tls/r600_dri.so
libGL: OpenDriver: trying /usr/lib/xorg/modules/dri/r600_dri.so
libGL: Can't open configuration file /home/nash/.drirc: No such file or directory.
libGL: Can't open configuration file /home/nash/.drirc: No such file or directory.
libGL: OpenDriver: trying /usr/lib/xorg/modules/dri/tls/i965_dri.so
libGL: OpenDriver: trying /usr/lib/xorg/modules/dri/i965_dri.so
Error: nConfigOptions (12) does not match the actual number of options in
       __driConfigOptions (13).
libGL: Can't open configuration file /home/nash/.drirc: No such file or directory.
Visual ID of window: 0x69
libGL: Can't open configuration file /home/nash/.drirc: No such file or directory.
Context is Direct
OpenGL Renderer: Gallium 0.4 on AMD JUNIPER
libGL: Can't open configuration file /home/nash/.drirc: No such file or directory.
59.872598 frames/sec - 66.817819 Mpixels/sec
59.997185 frames/sec - 66.956858 Mpixels/sec
59.998107 frames/sec - 66.957887 Mpixels/sec

When I try the same thing using DRI_PRIME:


LIBGL_DEBUG=verbose DRI_PRIME=1  glxspheres --info
Polygons in scene: 62464
libGL: OpenDriver: trying  /usr/lib/xorg/modules/dri/tls/r600_dri.so
libGL: OpenDriver: trying  /usr/lib/xorg/modules/dri/tls/r600_dri.so
libGL: Can't open configuration file /home/nash/.drirc: No such file or directory.
libGL: Can't open configuration file /home/nash/.drirc: No such file or directory.
Visual ID of window: 0x59
libGL: Can't open configuration file /home/nash/.drirc: No such file or directory.
Context is Direct
OpenGL Renderer: Gallium 0.4 on AMD JUNIPER
315.433255 frames/sec - 352.023513 Mpixels/sec
271.609452 frames/sec - 381.825452 Mpixels/sec
306.515778 frames/sec - 366.256866 Mpixels/sec

So why does Bumblebee seem to load the r600 driver and then also the i965 driver? Am I missing something?
Note: it's impossible to run Bumblebee under Ubuntu 13.10; I always receive some DRI version error.

Using Catalyst...
On the other hand, when I try to use the Catalyst driver instead of the radeon driver, this 6850M card seems to be blacklisted inside fglrx, or fglrx loads the FireGL driver (since the 6850M is rebranded), or fglrx loads the plain desktop HD 6850 driver and hangs with the infamous

 PowerXpress feature is not supported on A+I Mux platform Please uninstall fglrx driver. 

error. No matter what I do or how, whenever Bumblebee loads I receive this message. (If there is a way to circumvent this message, or to change the card's vendor:device ID to try other tests, I'm open to suggestions, please...)

Under Ubuntu it's the same: using intel and unloading/blacklisting fglrx and doing all kinds of sorcery with the Intel/fglrx libGL drivers gives a BLACK SCREEN, so I can't even get far enough to test Bumblebee.

I'd be glad if someone could comment on this.

@Hohahiu

Hohahiu commented Sep 17, 2013

The fglrx error means that your laptop has muxed hybrid graphics. But I would suggest you switch to the open-source radeon drivers. First, support for VDPAU through UVD was merged in kernel 3.10. Dynamic power management (DPM) was included in 3.11. In your case the discrete GPU is supported by the r600g driver, which is mature enough. As for DRI_PRIME, did you run

xrandr --setprovideroffloadsink radeon Intel

before running the application? Also, since your system is muxed, IIRC you need to restart the X server after enabling the discrete GPU through vgaswitcheroo.

Is there any success story with Bumblebee+fglrx on non-Arch systems?
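As a side note on the DPM mentioned above: on kernel 3.11 it still has to be enabled explicitly with a kernel boot parameter (to the best of my knowledge it is not yet on by default):

radeon.dpm=1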

@f-r-i-t-z

f-r-i-t-z Sep 17, 2013

"The fglrx error means that your laptop has muxed hybrid graphics." NO.. This card/configuration is muxless ( under windows act/work being a muxless and doing some hacks works on demand like last cards, per app) (this is almost the same problem from some 7xx/67xx/69xx cards).

http://www.amd.com/us/products/notebook/graphics/amd-radeon-6000m/amd-radeon-6800m/Pages/amd-radeon-6800m.aspx#2
http://sites.amd.com/us/game/shop/Pages/hp-envy-17-3d-and-radeon-6850m.aspx

Seems or the FGLRX already have this id:manufaturer configuration blacklisted or is loading the wrong card id since 6850m is a re-branded "5850hd" or the hp bios ( always fucked) setup by default in igpu not dgpu. i'm not speaking about muxed output here) "..

When using aticonfig --px-dgpu the fglrx driver load himself and unload the previous loaded intel module trying to load by himself the intel module and then this message. Because this we CANT load fglrx driver + intel same time.
Again, if someone know how to change or fake the card id fglrx under linux or bypass this, i'm open minded.

I'm already trying use the r600g driver..
DRI_PRIME i'm using for long. In fact last UBUNTU 13.10 releases already is doing this "xrandr --setprovideroffloadsink radeon Intel" by default and seems some portions from UBUNTU UNITY already is using offloading ( need a further investigation, when blacklisting radeon, lots of gl extensions doesn't load in unity).

No.. vgaswitcheroo doesnt work in this intel/amd configuration because in fact they are MUXLESS.

"... Is there any success story with bumblebbe+fglrx on non Arch systems? ..."
Under ubuntu with radeon driver being used/loaded in offloadmode by default almost everything work until receive some DRI version error.


LIBGL_DEBUG=verbose optirun -vvvvv --debug  glxspheres --info
[  657.166106] [DEBUG]Reading file: /etc/bumblebee/bumblebee.conf
[  657.166464] [INFO]Configured driver: radeon
[  657.166617] [DEBUG]optirun version 3.2-19-g1b3f8b8 starting...
[  657.166631] [DEBUG]Active configuration:
[  657.166637] [DEBUG] bumblebeed config file: /etc/bumblebee/bumblebee.conf
[  657.166643] [DEBUG] X display: :8
[  657.166648] [DEBUG] LD_LIBRARY_PATH: 
[  657.166653] [DEBUG] Socket path: /var/run/bumblebee.socket
[  657.166659] [DEBUG] Accel/display bridge: virtualgl
[  657.166664] [DEBUG] VGL Compression: proxy
[  657.166670] [DEBUG] VGLrun extra options: 
[  657.166675] [DEBUG] Primus LD Path: /usr/lib/x86_64-linux-gnu/primus:/usr/lib/i386-linux-gnu/primus
[  657.179838] [INFO]Response: No - error: [XORG] (EE) RADEON(0): [drm] failed to set drm interface version.
'
[  657.179870] [ERROR]Cannot access secondary GPU - error: [XORG] (EE) RADEON(0): [drm] failed to set drm interface version.
[  657.179878] [DEBUG]Socket closed.
[  657.179906] [ERROR]Aborting because fallback start is disabled.
[  657.179917] [DEBUG]Killing all remaining processes.

It would be good for the Bumblebee team to take a look at this default offloading in Ubuntu, since it will cause a lot of bug reports when Ubuntu 13.10 is released.

"The fglrx error means that your laptop has muxed hybrid graphics." NO.. This card/configuration is muxless ( under windows act/work being a muxless and doing some hacks works on demand like last cards, per app) (this is almost the same problem from some 7xx/67xx/69xx cards).

http://www.amd.com/us/products/notebook/graphics/amd-radeon-6000m/amd-radeon-6800m/Pages/amd-radeon-6800m.aspx#2
http://sites.amd.com/us/game/shop/Pages/hp-envy-17-3d-and-radeon-6850m.aspx

Seems or the FGLRX already have this id:manufaturer configuration blacklisted or is loading the wrong card id since 6850m is a re-branded "5850hd" or the hp bios ( always fucked) setup by default in igpu not dgpu. i'm not speaking about muxed output here) "..

When using aticonfig --px-dgpu the fglrx driver load himself and unload the previous loaded intel module trying to load by himself the intel module and then this message. Because this we CANT load fglrx driver + intel same time.
Again, if someone know how to change or fake the card id fglrx under linux or bypass this, i'm open minded.

I'm already trying use the r600g driver..
DRI_PRIME i'm using for long. In fact last UBUNTU 13.10 releases already is doing this "xrandr --setprovideroffloadsink radeon Intel" by default and seems some portions from UBUNTU UNITY already is using offloading ( need a further investigation, when blacklisting radeon, lots of gl extensions doesn't load in unity).

No.. vgaswitcheroo doesnt work in this intel/amd configuration because in fact they are MUXLESS.

"... Is there any success story with bumblebbe+fglrx on non Arch systems? ..."
Under ubuntu with radeon driver being used/loaded in offloadmode by default almost everything work until receive some DRI version error.


LIBGL_DEBUG=verbose optirun -vvvvv --debug  glxspheres --info
[  657.166106] [DEBUG]Reading file: /etc/bumblebee/bumblebee.conf
[  657.166464] [INFO]Configured driver: radeon
[  657.166617] [DEBUG]optirun version 3.2-19-g1b3f8b8 starting...
[  657.166631] [DEBUG]Active configuration:
[  657.166637] [DEBUG] bumblebeed config file: /etc/bumblebee/bumblebee.conf
[  657.166643] [DEBUG] X display: :8
[  657.166648] [DEBUG] LD_LIBRARY_PATH: 
[  657.166653] [DEBUG] Socket path: /var/run/bumblebee.socket
[  657.166659] [DEBUG] Accel/display bridge: virtualgl
[  657.166664] [DEBUG] VGL Compression: proxy
[  657.166670] [DEBUG] VGLrun extra options: 
[  657.166675] [DEBUG] Primus LD Path: /usr/lib/x86_64-linux-gnu/primus:/usr/lib/i386-linux-gnu/primus
[  657.179838] [INFO]Response: No - error: [XORG] (EE) RADEON(0): [drm] failed to set drm interface version.
'
[  657.179870] [ERROR]Cannot access secondary GPU - error: [XORG] (EE) RADEON(0): [drm] failed to set drm interface version.
[  657.179878] [DEBUG]Socket closed.
[  657.179906] [ERROR]Aborting because fallback start is disabled.
[  657.179917] [DEBUG]Killing all remaining processes.

Seems good to bumblebee team take a look in this default offloading in ubuntu, since this will cause a lot of bug requests when ubuntu 13/10 release.

@ChristophHaag

ChristophHaag Oct 15, 2013

Can you do something about OpenCL? It sucks a bit on Catalyst because you need an X server that was started with the fglrx DDX.

hello_world from git://people.freedesktop.org/~tstellar/opencl-example

$ optirun ./hello_world
There are 1 platforms.
clGetDeviceIDs() failed: CL_DEVICE_NOT_FOUND

$ DISPLAY=:8 optirun ./hello_world
There are 1 platforms.
There are 1 GPU devices.
clCreateContext() succeeded.
clCreateCommandQueue() succeeded.
clCreateProgramWithSource() suceeded.
clBuildProgram() suceeded.
clCreateKernel() suceeded.
clCreateBuffer() succeeded.
clSetKernelArg() succeeded.
clEnqueueNDRangeKernel() suceeded.
clFinish() succeeded.
clEnqueueReadBuffer() suceeded.
pi = 3.141590

@amonakov

Contributor

amonakov commented Oct 15, 2013

@ChristophHaag, are you asking for an optirun command-line flag to reset $DISPLAY to :8?

(It's quite unfortunate that fglrx needs that; does it mean you can't run OpenCL programs without starting an X session?)

@ChristophHaag

ChristophHaag Oct 15, 2013

The X session started by Bumblebee is fine. But you can only use OpenCL if you set DISPLAY to that X session. That's no problem for command-line programs, but if you have a GUI program that makes use of OpenCL, it will be displayed on the invisible X server, and you have to use VNC or xpra or the like to see it.

I have no idea how fglrx's OpenCL works, so I don't know whether some trickery could make it work with optirun without the need to set DISPLAY to :8.

@Lekensteyn

Member

Lekensteyn commented Oct 15, 2013

@ChristophHaag For OpenCL you do not even need the X server. Support for OpenCL on radeon (using the r600 Gallium driver) is experimental (expect bugs!) and you need the following:

  • A kernel with the radeon driver.
  • A recent LLVM version
  • Mesa built with OpenCL support for r600
  • Permission to read/write to /dev/dri/cardN. For example, if you have identified that /dev/dri/card0 is the Intel device and /dev/dri/card1 is the AMD graphics device, then you can use sudo setfacl -m u:$USER:rw /dev/dri/card1 to grant yourself access.

I have not tried fglrx, so perhaps read/write access to /dev/dri/cardN is sufficient to get OpenCL to work for it too. You can also join #radeon on Freenode if you need more help with OpenCL and radeon.
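A minimal sketch of that identification step (the sysfs paths are standard, but the card numbering is machine-specific, so card1 being the AMD device is an assumption here):

# 0x8086 = Intel, 0x1002 = AMD/ATI
grep . /sys/class/drm/card*/device/vendor
# grant yourself access to the AMD node once identified
sudo setfacl -m u:$USER:rw /dev/dri/card1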

@Lekensteyn Lekensteyn referenced this issue in Bumblebee-Project/bbswitch Jan 15, 2014

Closed

No suitable _DSM call found. #84

@natashaDnepr

natashaDnepr Dec 14, 2014

Good afternoon.

I obtained the link to this project from http://forums.amd.com/game/messageview.cfm?catid=488&threadid=176217&enterthread=y and am trying to use it to enable my dedicated card (AMD Radeon HD 8750M) with the fglrx driver on a Lenovo IdeaPad G500A:

#  lspci | grep 'VGA\|ATI'
00:02.0 VGA compatible controller: Intel Corporation 3rd Gen Core processor Graphics Controller (rev 09)
01:00.0 Display controller: Advanced Micro Devices, Inc. [AMD/ATI] Mars [Radeon HD 8670A/8670M/8750M]

The OS is openSUSE 13.1 (I want to use the card for OpenCL programming).

I have downloaded the common-amd branch of the project and installed it, adding

#define PCI_CLASS_DISPLAY_OTHER  0x0380

into Bumblebee-common-amd/src/pci.h, and adding PCI_CLASS_DISPLAY_OTHER to this line of the Bumblebee-common-amd/src/pci.c file:

if (pci_class == PCI_CLASS_DISPLAY_VGA ||
                pci_class == PCI_CLASS_DISPLAY_3D || pci_class == PCI_CLASS_DISPLAY_OTHER)

I also added the lines recommended by @ChristophHaag to the bumblebee.conf file (adapted to my system):

[driver-fglrx]
KernelDriver=fglrx
PMMethod=none
LibraryPath=/usr/lib64/fglrx:/usr/lib/fglrx
XorgModulePath=/usr/lib64/xorg,/usr/lib64/xorg/modules,/usr/lib64/xorg/modules/extensions/fglrx/
XorgConfFile=/etc/bumblebee/xorg.conf.fglrx

I start Bumblebee as follows.

boot with Intel card
> su
# export PATH=/usr/local/sbin:$PATH
# export LD_LIBRARY_PATH=/usr/local/lib64:$LD_LIBRARY_PATH
# bumblebeed --daemon
# bumblebeed --debug
[  216.942812] [INFO]PM is disabled, not performing detection.
[  216.942844] [DEBUG]Active configuration:
[  216.942854] [DEBUG] bumblebeed config file: /etc/bumblebee/bumblebee.conf
[  216.942881] [DEBUG] X display: :8
[  216.942891] [DEBUG] LD_LIBRARY_PATH: /usr/lib64/fglrx:/usr/lib/fglrx
[  216.942899] [DEBUG] Socket path: /var/run/bumblebee.socket
[  216.942910] [DEBUG] pidfile: /var/run/bumblebeed.pid
[  216.942919] [DEBUG] xorg.conf file: /etc/bumblebee/xorg.conf.fglrx
[  216.942928] [DEBUG] ModulePath: /usr/lib64/xorg,/usr/lib64/xorg/modules,/usr/lib64/xorg/modules/extensions/fglrx/
[  216.942937] [DEBUG] GID name: bumblebee
[  216.942945] [DEBUG] Power method: none
[  216.942954] [DEBUG] Stop X on exit: 1
[  216.942962] [DEBUG] Driver: fglrx
[  216.942977] [DEBUG] Driver module: fglrx
[  216.942986] [DEBUG] Card shutdown state: 1
[  216.943102] [DEBUG]Process /sbin/modinfo started, PID 2063.
[  216.943164] [DEBUG]Hiding stderr for execution of /sbin/modinfo
[  216.944668] [DEBUG]SIGCHILD received, but wait failed with No child processes
[  216.944705] [DEBUG]Configuration test passed.
[  216.944747] [ERROR]Daemon already running, pid 2059
# aticonfig --pxl
PowerXpress: Integrated GPU is active (Power-Saving mode).
# optirun --debug glxgears
[ 1349.078985] [DEBUG]optirun version 3.0.1-2012-10-05-Format:%h$ starting...
[ 1349.079022] [DEBUG]Active configuration:
[ 1349.079025] [DEBUG] bumblebeed config file: /etc/bumblebee/bumblebee.conf
[ 1349.079029] [DEBUG] X display: :8
[ 1349.079033] [DEBUG] LD_LIBRARY_PATH: /usr/lib64/fglrx:/usr/lib/fglrx
[ 1349.079036] [DEBUG] Socket path: /var/run/bumblebee.socket
[ 1349.079039] [DEBUG] VGL Compression: proxy
[ 1350.908494] [INFO]Response: No - error: [XORG] (EE) fglrx(0): Failed to open CMMQS connection.

[ 1350.908512] [ERROR]Cannot access secondary GPU - error: [XORG] (EE) fglrx(0): Failed to open CMMQS connection.

[ 1350.908516] [DEBUG]Socket closed.
[ 1350.908539] [ERROR]Aborting because fallback start is disabled.
[ 1350.908544] [DEBUG]Killing all remaining processes.
# aticonfig --px-dgpu
PowerXpress: Discrete GPU is selected (High-Performance mode), please restart Xserver(s) for changes to take effect!
# optirun --debug glxgears

It returns the first 7 lines of the output this command gives with the Intel card; after this the machine hangs. Sometimes the command completed execution before the computer hung. There was one error in the output: it said that display :8 cannot be opened. Unfortunately, I couldn't reproduce this to take a photo. The Xorg.8.log file is empty.

Could anybody help get this trick working, please?

Thank you in advance,
Natalia

@ChristophHaag

ChristophHaag Dec 14, 2014

It's an interesting question whether it actually works at all anymore. By now there have been many changes, like the "megadrivers" in Mesa, that could be a complication.

I haven't tried to use this in a looong time. But I remember that it was always a bit tricky to load the right libraries in the right order.

Do not use aticonfig --px-dgpu with Bumblebee. It shuffles libraries around, e.g. replacing the system's libGL with fglrx's libGL. That's not what you want; you want Bumblebee to handle it. I think I remember my graphical output hanging when I attempted stuff like that too.

But when starting with optirun you might need to set LD_LIBRARY_PATH and/or LIBGL_DRIVERS_PATH, and maybe even LD_PRELOAD, to load fglrx's libraries. Sorry, I don't know much more right now. I can only say that after a bit of tinkering it becomes clear that aticonfig --px-dgpu doesn't actually do anything "interesting" except move a few files around, which will break the X session currently running on intel.

fglrx(0): Failed to open CMMQS connection.

That's an error I have never seen before; maybe it's new.

Do you only need OpenCL? You might not need Bumblebee at all for it. Just try to start your OpenCL program as root; with a bit of luck it "just works". Maybe you need to set LD_LIBRARY_PATH for it to use fglrx's libraries first. At least I believe that using fglrx's OpenCL support as a normal user required an X session, but as root it would work without an X session, e.g. from a tty.

@natashaDnepr

natashaDnepr Dec 15, 2014

Thank you for the answer!

Yes, this card is mainly meant to be used as a compute device. The graphics capabilities of the Intel card are enough for me.

I have tried to execute clinfo as root, but it returns a segmentation fault. With my own program I found that this happens on the line

clGetPlatformIDs(0,NULL,&platform_number);

I have tried to do

# export LD_LIBRARY_PATH=/usr/lib64/fglrx:/usr/lib/fglrx

but nothing changes.

Help me, please.

I understand this is not exactly the topic of this discussion, but I really want to get the AMD GPU working...

@Lekensteyn

Member

Lekensteyn commented Dec 16, 2014

If you just need it for compute purposes, maybe you will have more success with the open-source radeon driver. It can be competitive with the fglrx driver.

@natashaDnepr

natashaDnepr Dec 16, 2014

I tried the radeon driver. The driver itself works, but the GalliumCompute SDK supports cards only up to Tahiti. My GPU is Oland, which is too new for that SDK.

@baudouinroullier

baudouinroullier Dec 17, 2014

Hey everyone,
I'd be glad to help with testing this branch of Bumblebee (and getting some help using my discrete GPU would be nice as well :D ).
For now it is not working properly. I have followed the Arch wiki on Bumblebee (https://wiki.archlinux.org/index.php/Bumblebee) and I get the error systemd-logind: failed to get session: PID XXX does not belong to any known session.
On the wiki they give a fix for Intel chipsets (i915), but I have AMD/AMD GPUs, so I do not really know how to adapt this fix to my computer.
I am using the open-source drivers. Should I put something like MODULES="radeon bbswitch" in mkinitcpio.conf, with radeon instead of i915? (A sketch of what I mean follows below.)
Thanks for your help, and I hope I'll be able to help in return with data or anything you might need.
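A sketch of the adapted mkinitcpio fix (an untested assumption on my part; bbswitch is left out, since as noted in the next comment it only targets nvidia hardware):

# /etc/mkinitcpio.conf
MODULES="radeon"
# regenerate the initramfs afterwards
mkinitcpio -p linux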

@klausenbusk

klausenbusk Dec 17, 2014

Maybe a stupid question, but how is Bumblebee different from DRI_PRIME if you are using the open-source driver?

@Lekensteyn

Member

Lekensteyn commented Dec 17, 2014

@baudouinroullier There is no point in using bbswitch for radeon hardware; bbswitch is written for nvidia hardware.

The main advantage of Bumblebee over PRIME is that you can disable the card when you decide not to use it. Since that is not possible using bbswitch, I suggest trying PRIME instead. If you enable runtime power management for the PCI device, then it will disable itself after the last user (an application via PRIME) is gone (after a delay of 5 seconds).
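A sketch of enabling runtime power management for the discrete card (the PCI address 0000:01:00.0 is an assumption; check lspci for the real one):

echo auto > /sys/bus/pci/devices/0000:01:00.0/power/control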

@baudouinroullier

baudouinroullier Dec 17, 2014

Thank you for your answers.

@klausenbusk Sorry, I didn't really understand your question. I'm quite a beginner with hybrid graphics and drivers and whatnot.
@Lekensteyn If I understand you correctly, Bumblebee works for AMD but not radeon? Or am I completely wrong?
I've tried PRIME briefly without success (libGL errors), which is why I decided to try Bumblebee (quoting the Arch wiki: "You can also use bumblebee with radeon, there is a bumblebee-amd-git package on AUR.").
Well, I don't think this is the place to ask for help with PRIME, so I'll be off.

If you do need data or anything from my hardware, I'd be glad to help.

@Hohahiu

Hohahiu commented Dec 17, 2014

Maybe a stupid question, but how is Bumblebee different from DRI_PRIME if you are using the open-source driver?

DRI_PRIME is a proper implementation of switchable graphics based on kernel mechanisms, and as such it is only supported by the free drivers. It can also automatically power the discrete GPU on and off. At this point it is a much better solution for radeon GPUs. In addition, due to AMD's decision to use the free kernel driver as a base for future versions of the userspace fglrx, it will be in an even better state.

@ArchangeGabriel ArchangeGabriel modified the milestone: Bumblebee Future Jan 2, 2015
