Bumblebee 3.2 breaks Multimonitor with screenclone #401
Comments
The guide mentions replacing …
I diff'd the Xorg output between 3.1 and 3.2 and noticed that Xorg was doing a lot more in 3.1, because it was using my global xorg.conf.d folder instead of the new one containing only the empty dummy conf file. I tried using some of my own config files instead of the dummy file, and it seems the VGA output on the NVIDIA card is only active if the Device section used in xorg.conf.nvidia is also present in the configs of the config folder. It doesn't actually need to contain any Options; it just needs to be there. I opened a pull request for this here: #402
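For reference, a minimal Device section of the kind described above might look like this (the identifier and driver are illustrative; what matters is that the Identifier matches the one in your xorg.conf.nvidia):

```
Section "Device"
    # Identifier must match the Device section in xorg.conf.nvidia
    Identifier "DiscreteNvidia"
    Driver     "nvidia"
    # No Options are required; the section just has to exist
EndSection
```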
Then we should add this to the instructions about multi-monitor, because as I've said in #402, we can't merge this.
It would be nice if Bumblebee queried the outputs for external monitors and set the appropriate configuration for Xorg itself. For now I think I'll update the script in the guide that enables the external monitors, so that it creates the config file and removes it when the monitor is disconnected.
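A create-on-connect/remove-on-disconnect script along those lines could be sketched as follows. This is an untested illustration, not the guide's actual script: the config path and Device section contents are made up, and a real version would pipe live `xrandr --query` output into the parser instead of the canned text used here to keep the sketch self-contained.

```shell
#!/bin/sh
# Sketch (assumption): regenerate the extra Device config when an external
# monitor is attached, remove it when detached.
CONF="${TMPDIR:-/tmp}/20-external-monitor.conf"  # stand-in for the real config dir

# Print the first output that xrandr-style text reports as connected.
first_connected() {
    awk '$2 == "connected" { print $1; exit }'
}

# A real script would pipe `xrandr --query` here; canned text for the demo:
OUT=$(printf 'VGA-0 connected 1024x768+0+0\nHDMI-0 disconnected\n' | first_connected)

if [ -n "$OUT" ]; then
    # Monitor attached: create the extra Device config.
    printf 'Section "Device"\n    Identifier "DiscreteNvidia"\nEndSection\n' > "$CONF"
    echo "created $CONF for $OUT"
else
    # Monitor gone: remove the config again.
    rm -f "$CONF"
fi
```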
Please provide …
with config: http://pastebin.com/DfBXBH3v The diff of these looks like having the Device section in the config folder overrides the Device section from xorg.conf.nvidia completely. The Xorg run with the empty config seems to just autoconfigure everything, and the settings from xorg.conf.nvidia are completely missing. Maybe it is possible to set up xorg.conf.nvidia in a way that lets it autoconfigure external monitors when there are any, but forces it to run without a display device if there is none?
What about this line: ab5f409#L2L33?
Yeah. I've asked this at the very beginning:
The replacement does not have that option.
@amonakov @ArchangeGabriel I played around a bit with xorg.conf.nvidia, though, and found a configuration that seems to work.
optirun works with and without a connected display this way, and the VGA output seems to always be on. If I have a monitor connected while running e.g. optirun glxspheres (without enabling multiple monitors), the monitor turns on and shows a black screen. The same happens when I start optirun without a monitor connected and attach one while optirun is running. Actually using the monitor with the screenclone script also works as expected. The UseEDID line can also stay uncommented, but then the output on the external monitor will have the wrong resolution. The biggest problem with this setup is that the ConnectedMonitor line depends on the outputs the system's NVIDIA card actually has. I found an issue that dealt with the same problem by adding the UseDisplayDevice line to the config (#21), but that is exactly what prevents the NVIDIA card from sending a signal to the VGA output. Autodetection for this would be nice, but I guess I'll change the guide so that people can set this themselves for now.
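Pieced together from the description above, the relevant part of such an xorg.conf.nvidia might look roughly like this. This is a reconstruction, not the exact file from the thread: the identifier and the "CRT-0" value are assumptions, and the ConnectedMonitor value in particular has to match an output your NVIDIA card actually has.

```
Section "Device"
    Identifier "DiscreteNvidia"
    Driver     "nvidia"
    # Value depends on which outputs your NVIDIA card actually has:
    Option "ConnectedMonitor" "CRT-0"
    # Commented out so EDID is read and the external monitor gets the
    # right resolution:
    # Option "UseEDID" "false"
    # Must stay removed; this is what cuts the signal to the external output:
    # Option "UseDisplayDevice" "none"
EndSection
```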
Well, of course, if you do have an external monitor connected to nVidia, you need to remove …
Closing then, because the feature request is a duplicate, and everything else has been addressed where possible.
I just found this issue when my setup using screenclone failed after upgrading Bumblebee. The multi-monitor setup on the wiki doesn't mention anything about disabling "UseDisplayDevice" "none" (which fixed the problem for me), but I'm not entirely clear on whether that is only needed with the screenclone method or whether everyone should disable it. If the latter, please give the word and I'll edit that wiki page.
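Concretely, the fix described here amounts to commenting out (or deleting) one line in the xorg.conf.nvidia shipped by Bumblebee (the path is the usual Bumblebee location; adjust if your distribution puts it elsewhere):

```
# in /etc/bumblebee/xorg.conf.nvidia, disable:
# Option "UseDisplayDevice" "none"
```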
Everybody wishing to use an external display that is actually connected. |
I've successfully been using the screenclone method (see Optimus Archlinux Graphics Setup for Thinkpads or Optimal Ubuntu Graphics Setup for Thinkpads) to easily use a second display for almost a year now, but after the update to 3.2 my external display no longer gets a signal from the NVIDIA card.
My setup for this is exactly as described in the guide for Archlinux. I'm running Archlinux on a ThinkPad W520 (VGA and HDMI wired to the NVIDIA card). I don't seem to get any error messages, and running programs through optirun works the same for me in 3.1 and 3.2. The only difference seems to be that the NVIDIA card no longer sends any signal to the external monitor.
I'm not sure whether this is even a bug or whether my config files are just wrong for what I want to do here, but I've tried every possible configuration I could come up with.