This is a feature request.
The goal is to add a --vdpau option to optirun.
The most promising approach for now looks to be hybrid-windump.
The old discussion is available here: https://github.com/MrMEEE/bumblebee-Old-and-abandoned/issues/531.
I will make a summary of it here as soon as possible.
About a week ago, I accidentally succeeded in accessing a pure Nvidia desktop using hybrid-windump.
I cannot recall exactly how it happened, but I had the dual-head xorg.conf setup with gebart's windump and was screwing around with mplayer2.
The next thing I knew, my mouse was trapped in the dumped Nvidia screen and couldn't escape. This didn't seem like much of a problem; in fact, I had inadvertently solved the problem of not being able to interact with the Nvidia desktop once the window was dumped.
Do note that metacity must be started from the terminal or tty in order for the mouse to work on the Intel screen.
Further research is needed.
Okay, I've figured out how to control the Nvidia screen on demand.
Assuming that the Nvidia screen is set to match the Intel screen's native resolution in the xorg.conf, moving my mouse to the left side of the screen will bring it into the dumped Nvidia screen.
Going forward, I have a few proposals:
1. Find a way to achieve dumping without hosing the Intel 3D.
2. A better dumping mechanism to replace XShm.
3. Assuming the first point, an additional Bumblebee mode specifically for VDPAU or for people who want an always-on GPU. This mode would probably need a dedicated xorg.conf that can be loaded as needed and would require an X restart, like classic graphics switching.
Right now, the windump approach still requires the Nvidia vanilla binary. We should be able to achieve dumping using distro-packaged versions if all the necessary hooks are made in a safe way. The vanilla binary only works because it makes destructive changes to gl linkages for the Intel card that are hard to restore without a reinstall.
Also, something of note.
The windump approach does expose the NV17 Video Texture output over Xv in SMPlayer.
Some big developments.
Thanks to Fabio Giovagnini's post on the mailing list, I've been able to trim the requisite xorg.conf to something far simpler and manageable: http://paste.ubuntu.com/695277/
This configuration allows windump to work with distro-packaged versions of the nvidia binary OR nouveau (no OpenGL with nvc3 but X11 works).
Edit: NV17 is exposed to the Intel display in dual-head xorg configurations such as the one linked. Windump is not necessary. Also, bumblebee will load the above xorg.conf if you use it as xorg.conf.nvidia.
I was just about to try it, but don't want to fry anything...
Is it safe to use?
Like this: /etc/bumblebee/xorg.conf.nvidia
I should clarify that using the xorg.conf as xorg.conf.nvidia does not enable VDPAU. I was just saying that Bumblebee loads it with no problems.
So it's safe to use for enabling an external HDMI monitor.
I will try...
I tried a few different ways; it didn't work. The HDMI TV was recognized in the nvidia-settings panel, but shown as disabled. There was an option in the config to choose a single X screen (of which there is only one) or TwinView. I'm afraid to click the "Save settings to xconfig" option as I'm not sure which xconfig it will change. It would be handy to have two monitors going, or even just a big one.
mplayer2 output: http://pastebin.com/6D6m99D0
What is that? It's VDPAU through Bumblebee+Windump!
Now, there are still a few problems that need to be addressed. The main one is the presence of two cursors that don't always align, and no keyboard support. The second is that Bumblebee doesn't have a comprehensive list of modelines (1080p especially) like Ironhide's xorg.conf.nvidia does, so the dumped window may be limited to 1024x768 or 1360x768. Even if the dumped window matches your LVDS resolution, there are still issues of window overlap, window priority, and Compiz being a bit fickle. gnome-panel and gnome-shell also seem to interfere with fullscreening the dumped window. There are also minor cosmetic issues like theming, but whatever.
Want to try? Don't worry! This shouldn't hose your system like previous approaches.
1. Build hybrid-windump, but don't use the included xorg.conf!
   - https://github.com/harp1n/hybrid-windump (draws borders around the entire dumped window)
   - https://github.com/gebart/hybrid-windump (better cursor support, but the dumped window covers a portion of the Intel desktop and can't be moved)
2. Start the Bumblebee X server.
3. Run Compiz (or Metacity) on the Nvidia display so the dumped window can be moved around. If you skip this, VDPAU playback will be green.
   DISPLAY=:8 compiz --replace &
4. Start playback:
   DISPLAY=:8 mplayer -vo vdpau videofilename
5. Dump the window:
   ./windump :8 :0.0 &
You now have VDPAU output!
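The steps above can be sketched as a small shell function (the name start_vdpau_dump is mine, not part of Bumblebee or windump). DRY_RUN=1 prints the commands instead of running them, since a real run needs the Bumblebee X server on :8 and the windump binary in the current directory:

```shell
start_vdpau_dump() {
    video=$1
    run=${DRY_RUN:+echo}
    # Window manager first, or VDPAU playback comes out green.
    $run env DISPLAY=:8 compiz --replace
    # The player renders on the Nvidia X server...
    $run env DISPLAY=:8 mplayer -vo vdpau "$video"
    # ...and windump copies its output onto the Intel display :0.0.
    $run ./windump :8 :0.0
}
```

In a real session each of these would be backgrounded with `&`, as in the steps above.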
NB: This also works with Ironhide. Also, unlike the previous post, dumping is needed to expose NV17.
Video of progress that highlights the dumping of single windows using the latest version of hybrid-windump
I think we can start thinking about using hybrid-windump as an alternative backend for Bumblebee. It seems closer to the hardware than the network approach. Thanks for the progress on this one, @LLStarks. We'll try to catch up.
To that end, I've forked harp1n's windump to make rendering through Bumblebee possible again. The latest commits broke that functionality.
I'm finding Bumbledump to be far more fickle than Windump by itself, and a lot of work needs to be done to get the xorg.conf(.nvidia) to correctly handle mouse/keyboard events and offer a wide array of resolutions. The modelines Ironhide uses should suffice for the latter. There's also the question of whether the two-screen behavior is appropriate. I see little need to have the Nvidia screen to the left or right of the Intel screen.
At any rate, here's a wiki page featuring demos of Windump by itself and through Bumblebee.
About modelines, we should not use those from IronHide; we can instead get them entirely from the intel driver. Read this: Bumblebee-Project/Bumblebee#67 (comment)
I'm trying to find an external monitor to play with; I may have found one.
A few heads-ups on this one:
I tried @LLStarks' hybrid-windump, and I have really great news: it improves performance, I mean A LOT! Although the current windump is really unusable for any practical purpose, it will serve as a proof of concept.
Running on Nouveau, all the tests look great.
$ optirun glxspheres
Polygons in scene: 62464
Visual ID of window: 0x21
Context is Direct
OpenGL Renderer: Gallium 0.4 on NVA8
30.888309 frames/sec - 27.538780 Mpixels/sec
Not good, huh? Well, that's because VirtualGL "renders" twice: once on the server side (where we want graphics offloaded) and once as 2D rendering on the client side, which unfortunately loads the main CPU core and is awful at it. But now look what happens when using windump:
Polygons in scene: 62464
Visual ID of window: 0x11c
Context is Direct
OpenGL Renderer: Gallium 0.4 on NVA8
147.114528 frames/sec - 164.179813 Mpixels/sec
Here the window is not dumped yet. When I dump it:
137.518458 frames/sec - 153.470599 Mpixels/sec
So, minimum overhead on the main core and the rendering is pretty good. And this is under Gallium3D!!!!
As it stands, hybrid-windump is unusable, but it will get better, I think :)
In @LLStarks we trust
I will take a look at all these interesting things here, but I'm very busy at the moment.
@LLStarks, just to note something: to be able to use the keyboard on the dumped window you need to comment out the AutoAddDevices option or set it to true. That will add all the control devices, including keyboard and mouse. The option is set to false so that the X server starts faster.
# Option "AutoAddDevices" "false"
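For context, that option lives in the ServerFlags section of the xorg.conf; a minimal sketch of the change:

```
Section "ServerFlags"
    # Commented out (or set to "true"), X auto-adds input devices, so the
    # keyboard and mouse work on the dumped window. Bumblebee ships it as
    # "false" to make the X server start faster.
    # Option "AutoAddDevices" "false"
EndSection
```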
Yup. Keyboard now works.
I haven't been able to get the mouse working for the past few days though.
Has anyone been able to get glxgears, glxspheres, or mplayer gl output to work?
@LLStarks: I did, the problems I found:
On the bright side: it improves performance by 4 to 5 times
LLStarks found an interesting and somewhat complete solution in xpra. The performance is the same as windump, but with one problem: the transport is via JPEG. The frames are rendered faster than they are displayed, which causes some graphics corruption/flickering. For those who want to test this:
1. Start the Bumblebee X server
2. Start the Xpra server:
   xpra start :8
3. Attach :8 to :0:
   xpra attach :8
4. Start applications on the Nvidia screen
Repeat step 4 as necessary.
The trunk svn also has png and rgb24 modes, but they don't seem to help so far.
Also, I think it might be more appropriate to do xpra upgrade :8 instead of start.
Despite the unusable JPEG problem, I found a way to set the Virtual screen the Nouveau driver uses as the monitor to render to:
adding a minimal Screen section and matching it to the resolution of the LVDS improves performance a little (at least in the numbers):
Virtual 1366 768
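A minimal Screen section along those lines might look like this (the Identifier name and the 1366x768 size are illustrative; match your own LVDS resolution):

```
Section "Screen"
    Identifier "Screen0"
    SubSection "Display"
        # Match the LVDS native resolution so Nouveau renders at panel size.
        Virtual 1366 768
    EndSubSection
EndSection
```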
A thought occurred.
It's already very easy to screw up gl_conf assignments by installing/removing/reinstalling mesa glx, nvidia glx, or Bumblebee. If we replace VirtualGL as a backend we'd need to ensure that GL works on both X servers otherwise we'd encounter "Error: couldn't get an RGB, Double-buffered visual" messages for the Nvidia one.
I think we should test this stuff in a less problematic environment (i.e. not nvidia driver). Nouveau performs well at least to make things work.
Btw, to ensure hardware acceleration, use "xpra start :8 --use-display".
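The Xpra steps above, with this flag included, can be sketched as a shell function (the name xpra_offload is mine). DRY_RUN=1 prints the commands; a real run needs the Bumblebee X server already running on :8:

```shell
xpra_offload() {
    app=$1
    run=${DRY_RUN:+echo}
    # Take over the existing Bumblebee display rather than spawning a
    # fresh virtual one, so rendering stays hardware-accelerated.
    $run xpra start :8 --use-display
    # Mirror the Nvidia display onto the Intel one.
    $run xpra attach :8
    # Launch the application on the Nvidia screen; repeat for each app.
    $run env DISPLAY=:8 "$app"
}
```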
I'm a bit curious as to how VirtualGL handles cursors and resizing.
The two primary pitfalls of using Xpra and Windump are that neither of the above are handled properly.
How do you create a nested X server that can handle input (mouse AND touchpad) yet not create a second cursor?
Also, I really like Windump's -i flag. Not knowing the window hex ID until you run the dumped app isn't much of a problem, though it would be more ideal to refer to the app by its window name or the PID of its root window. What I would like to implement, if I ever have time, is a way for Windump to account for changes in window size or hex ID as programs like Mplayer2 go in and out of fullscreen.
As for Xpra, it simply doesn't have enough memory bandwidth, even with the new mmap transport. Using it as I did was a novel idea, but it is too VNC-like and suffers from the drawbacks of that design.
Btw, a proof-of-concept status update. Most of this is just me playing around with no care for implementation.
One important thing to note is that I didn't need to break Intel 3D to achieve any of them. All I did was use the Windump/Xpra approach and the requisite library paths.
VA-API with Nouveau: Works with VLC.
VP8 VDPAU with Nouveau: Emeric's patches need some more work, but yes, it works with Mplayer/Mplayer2.
VDPAU in Flash with Nvidia: Works. Requires OverrideGPUValidation=true and EnableLinuxHWVideoDecode=1 in /etc/adobe/mms.cfg, as well as running the browser with LD_LIBRARY_PATH=/usr/lib/nvidia-current/:/usr/lib/nvidia-current/vdpau/. Both paths are needed in order not to break WebGL.
VDPAU in Mplayer/Mplayer2 with Nvidia: Works. Requires LD_LIBRARY_PATH=/usr/lib/nvidia-current/vdpau/.
VA-API in VLC with Nvidia: Works. Requires LD_LIBRARY_PATH=/usr/lib/nvidia-current/vdpau/.
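For reference, the Flash settings above go into /etc/adobe/mms.cfg like this (a minimal sketch; keep any other lines your file already has):

```
# /etc/adobe/mms.cfg
# Needed for VDPAU decoding in Flash through the dumped Nvidia screen.
OverrideGPUValidation=true
EnableLinuxHWVideoDecode=1
```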
I was wondering if there have been any advances in the support of VDPAU through Windump and its integration with Bumblebee. I browsed the web looking at the different walkthroughs you detailed, but none of them seems up to date, and so far I have failed to decode video with VDPAU via Windump. Could you recap what there is to know on the matter as of now?
Is there any progress on that?