
Bumblebee 3.2 breaks Multimonitor with screenclone #401

Closed
Gordin opened this issue Apr 29, 2013 · 13 comments

@Gordin

Gordin commented Apr 29, 2013

I've successfully been using the screenclone method (see Optimus Archlinux Graphics Setup for Thinkpads or Optimal Ubuntu Graphics Setup for Thinkpads) to easily use a second display for almost a year now, but after the update to 3.2 my external display does not get a signal from the NVIDIA card any more.
My setup for this is exactly as described in the guide for Archlinux. I'm using Archlinux on a ThinkPad W520 (VGA and HDMI wired to the NVIDIA card). I don't get any error messages, and running programs through optirun works the same for me in 3.1 and 3.2. The only difference seems to be that the NVIDIA card no longer sends any signal to the external monitor.
I'm not sure whether this is even a bug or my config files are just wrong for what I want to do here, but I tried every configuration I could come up with.

@amonakov
Contributor

The guide mentions replacing xorg.conf.nvidia — have you done that after the upgrade? If yes, pastebin your /var/log/Xorg.8.log

@Gordin
Author

Gordin commented Apr 29, 2013

I diff'd the xorg output from 3.1 and 3.2 and noticed that xorg was doing a lot more in 3.1, because it was using my global xorg.conf.d folder instead of the new one that contains only the empty dummy conf file. I tried using some of my own config files instead of the dummy file, and it seems the VGA output on the NVIDIA card is only active if the Device section used in xorg.conf.nvidia is also present in the configs in that folder. It doesn't actually need to contain any Options; it just has to be there. I opened a pull request for this here: #402
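For illustration, such a file could be as minimal as this (the path and the "DiscreteNvidia" identifier are just examples, they have to match whatever your xorg.conf.nvidia uses):

# e.g. /etc/bumblebee/xorg.conf.d/10-dummy.conf (path assumed)
# Only the presence of a Device section matching xorg.conf.nvidia matters;
# no Options are required.
Section "Device"
    Identifier "DiscreteNvidia"
    Driver     "nvidia"
EndSection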

@ArchangeGabriel
Member

Then we should add this to the instructions about multi-monitor, because as I've said in #402, we can't merge this.

@Gordin
Author

Gordin commented Apr 29, 2013

It would be nice if bumblebee would query the outputs for external monitors and set the appropriate configuration for xorg. For now I think I'll update the script in the guide that enables the external monitors so that it creates the config file and removes it when the monitor is disconnected.
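Roughly, the script change I have in mind would look something like this (the path, file name, and identifier are placeholders, not the actual script from the guide):

#!/bin/bash
# Rough sketch: create or remove a Device section for the NVIDIA card,
# depending on whether the external monitor should be used.
# Path, file name, and identifier are placeholders.
CONF=/etc/bumblebee/xorg.conf.d/20-external-monitor.conf

case "$1" in
    on)
        # Write a Device section matching the one in xorg.conf.nvidia so
        # the VGA output gets driven by the secondary X server.
        cat > "$CONF" <<'EOF'
Section "Device"
    Identifier "DiscreteNvidia"
    Driver     "nvidia"
EndSection
EOF
        ;;
    off)
        # Remove it again so optirun keeps working without a monitor attached.
        rm -f "$CONF"
        ;;
    *)
        echo "usage: $0 on|off" >&2
        exit 1
        ;;
esac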

@amonakov
Contributor

Please provide /var/log/Xorg.8.log files with and without the modified 10-dummy.conf

@Gordin
Author

Gordin commented Apr 29, 2013

with config: http://pastebin.com/DfBXBH3v
with empty config: http://pastebin.com/wGDhMTKY

The diff of these looks like having the Device section in the config folder overwrites the Device section from xorg.conf.nvidia completely. The xorg with the empty config seems to just autoconfigure everything, and the settings from xorg.conf.nvidia are completely missing. Maybe it is possible to set up xorg.conf.nvidia in a way that lets it autoconfigure external monitors when there are any, but forces it to run without a display device if there is none?
edit: Or to force xorg to always send a signal to the outputs without failing when there is nothing attached to them.

@ArchangeGabriel
Member

What about this line: ab5f409#L2L33?

@amonakov
Contributor

Yeah. You have Option "UseDisplayDevice" "none" in the stock config file. If you read the log, you'd see the nvidia driver saying it is entering NoScanout mode.

I've asked this at the very beginning:

The guide mentions replacing xorg.conf.nvidia — have you done that after the upgrade?

The replacement does not have that option.

@Gordin
Author

Gordin commented Apr 29, 2013

@amonakov
If I replace it, I have a working external monitor, but optirun crashes when there is nothing connected to the VGA output. I guess this is exactly the behavior you were fixing with the change in 3.2. (When I place the empty Device section in the config folder it has the same effect as replacing xorg.conf.nvidia, as the xorg.conf.nvidia Device section is overwritten and both configs autoconfigure almost everything.)

@ArchangeGabriel
Still no signal. I also tried "CRT", as I'm using the VGA output.

I played around a bit with xorg.conf.nvidia though, and I found a configuration that seems to work: I comment out the UseEDID and UseDisplayDevice lines AND add the ConnectedMonitor line, like this:

#Option "UseEDID" "false"
#Option "UseDisplayDevice" "none"
Option "ConnectedMonitor" "CRT"

optirun works with and without a connected display this way. Also, the VGA output seems to always be on this way. If I have a monitor connected while running e.g. optirun glxspheres (without enabling multiple monitors), the monitor will turn on and show a black screen. This is also the case when I start optirun without a monitor connected and connect it while optirun is running. Actually using the monitor with the screenclone script also works as expected. The UseEDID line can also stay uncommented, but then the output on the external monitor will have the wrong resolution.

The biggest problem with this setup is that the ConnectedMonitor line depends on the outputs the NVIDIA card of the system actually has. I found an issue that dealt with the same problem by adding the UseDisplayDevice line to the config (#21), but that is exactly what prevents the NVIDIA card from sending a signal to the VGA output. Autodetection for this would be nice, but I guess I'll change the guide so that people can set this themselves for now.
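For reference, the relevant Device section in xorg.conf.nvidia then looks roughly like this (identifier and BusID are placeholders and have to match your system; "CRT" here stands for the VGA output):

Section "Device"
    Identifier "DiscreteNvidia"
    Driver     "nvidia"
    BusID      "PCI:01:00:0"           # placeholder, must match your card
    #Option "UseEDID" "false"          # commented out
    #Option "UseDisplayDevice" "none"  # commented out
    Option "ConnectedMonitor" "CRT"    # the VGA output on this card
EndSection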

@amonakov
Contributor

Well, of course if you do have an external monitor connected to nVidia, you need to remove "UseDisplayDevice" "none" in order to make use of it. The problem is, when you don't, the only option that works for all users is to put that option in. So in the end you need two different configs for two different use cases. The request to implement config switching from optirun is issue #373.
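Until something like #373 is implemented, a possible workaround is to keep two prepared variants of xorg.conf.nvidia and swap them in before starting optirun. A rough sketch (the variant file names are made up, not something Bumblebee ships):

#!/bin/bash
# Sketch only: pick one of two prepared xorg.conf.nvidia variants before
# launching optirun. The variant file names are assumptions.
CONF_DIR=/etc/bumblebee
if [ "$1" = "external" ]; then
    shift
    cp "$CONF_DIR/xorg.conf.nvidia.external" "$CONF_DIR/xorg.conf.nvidia"
else
    cp "$CONF_DIR/xorg.conf.nvidia.headless" "$CONF_DIR/xorg.conf.nvidia"
fi
exec optirun "$@"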

@ArchangeGabriel
Member

Then closing, because the feature request is a duplicate, and everything else has been addressed where possible.

@Confusion

I just found this issue when my setup using 'screenclone' failed after upgrading bumblebee. The multimonitor setup on the wiki doesn't mention anything about disabling "UseDisplayDevice" "none" (which fixed the problem for me), but I'm not entirely clear on whether you only need that when using the screenclone method or whether everyone should disable that. If the latter, please give the word and I'll edit that wiki page.

@amonakov
Contributor

amonakov commented May 5, 2013

Everybody wishing to use an external display that is actually connected.
