
segfault #2

Closed
karolherbst opened this issue Sep 2, 2012 · 6 comments
@karolherbst
Contributor

./primusrun glxinfo
loading /usr/lib64/opengl/nvidia/lib/libGL.so.1: 0x7f853b8fe000
loading /usr/lib/libGL.so.1: 0xfd5ab0
name of display: :0.0
glXCreateContext
Segmentation fault

sys libs:
mesa-master
xf86-video-intel-2.20.5 (on sna)
libdrm-2.4.39
libX11-1.5.0

uname -a:
Linux karols-gentoo 3.5.3-gentoo #1 SMP PREEMPT Mon Aug 27 00:17:22 CEST 2012 x86_64 Intel(R) Core(TM) i7-3610QM CPU @ 2.30GHz GenuineIntel GNU/Linux

Nvidia GPU: GT630M

Do you need more information?
It would be nice to know which system you are using, so it's easier to look for the error, since it seems to work on your system.

EDIT:
this line is failing:

GLXContext actx = primus.afns.glXCreateNewContext(primus.adpy, *acfgs, GLX_RGBA_TYPE, shareList, direct);
@amonakov
Owner

amonakov commented Sep 2, 2012

I'm using:
mesa master: commit a3685544e1e88828c4931059686cf3acc199079c
xf86-video-intel-2.20.3 (sna)
libdrm-2.4.38
libX11-1.5.0
nvidia binary drivers version 304.37

What is the value of acfgs prior to executing that statement, is it NULL?

What is your testing procedure? Note that you are responsible for starting secondary X, primus will not do that for you.
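To answer the `acfgs` question above, one possible approach is to run the crashing program under gdb. This is only a sketch: it assumes `primusrun` merely sets up environment variables (so a debugger can be wrapped by it), and the gdb session shown is illustrative, not verified against this setup.

```shell
# Hypothetical debugging sketch: run the crashing program under gdb
# inside the primusrun environment (assumes primusrun only sets env vars).
primusrun gdb glxinfo
# At the (gdb) prompt:
#   run            -- reproduce the segfault
#   bt             -- backtrace; find the frame calling glXCreateNewContext
#   frame <n>      -- select that frame
#   print acfgs    -- NULL here would mean no matching FBConfigs were found
```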

@karolherbst
Contributor Author

Ah okay, I didn't know that I have to start a second X server. With that done, glxgears and also glxspheres are running.

primusrun glxspheres
loading /usr/lib64/opengl/nvidia/lib/libGL.so.1: 0x7fa3a491d000
loading /usr/lib/libGL.so.1: 0x21a0c00
Polygons in scene: 62464
Visual ID of window: 0x20
glXCreateContext
ATTENTION: default value of option vblank_mode overridden by environment.
ATTENTION: default value of option vblank_mode overridden by environment.
glXIsDirect
Context is Direct
glXMakeCurrent
OpenGL Renderer: GeForce GT 630M/PCIe/SSE2
primus: sorry, not implemented: glXUseXFont
1006.130730 frames/sec - 1122.841894 Mpixels/sec
995.853913 frames/sec - 1111.372967 Mpixels/sec
995.801018 frames/sec - 1111.313936 Mpixels/sec
1002.985652 frames/sec - 1119.331988 Mpixels/sec
990.192813 frames/sec - 1105.055179 Mpixels/sec
995.331831 frames/sec - 1110.790324 Mpixels/sec

Seems to work, but primus should handle starting the second X server by itself, like bumblebee does. Or maybe primus could be used as a middle layer in place of vglrun?
I will try to do some extended tests, though. Should I post bug reports for applications that won't start, or do you want to implement some functionality first?

@Lekensteyn

@karolherbst Keep in mind that Primus is still in its very early stages of development. There is no nice integration of it at the moment.

When testing primus, I ran optirun bash in a different terminal tab. Then I went back to a free terminal tab (not in the optirun bash one!) and ran primus.
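The two-terminal workflow described above can be sketched as follows (program names taken from the discussion; the exact terminal setup is up to you):

```shell
# Terminal 1: keep the secondary X server (and the NVIDIA card) alive.
optirun bash      # leave this shell open for the whole session

# Terminal 2: a plain shell -- NOT the bash started by optirun above.
primusrun glxgears
```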

@karolherbst
Contributor Author

I just modified the bumblebee configuration:

KeepUnusedXServer=true
TurnCardOffAtExit=false

so the NVIDIA card is still loaded after an optirun call.
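For reference, these settings would go in bumblebee's configuration file; the path below is the standard install location and may differ per distro.

```shell
# Assumed default location; adjust for your distro.
# In /etc/bumblebee/bumblebee.conf:
#   KeepUnusedXServer=true    # leave the secondary X server running
#   TurnCardOffAtExit=false   # keep the discrete card powered after optirun exits
```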

I tried starting some games and other things. Wine is not working, of course.

@amonakov
Owner

amonakov commented Sep 3, 2012

@karolherbst, thanks for your interest. There are still some known unimplemented bits, but you can post bug reports for simple applications (mesa-demos would be very appropriate). I'm leaving for a week-long vacation.

Did you enable PRIMUS_DROPFRAMES to get 1000 fps above?
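The variable mentioned above would presumably be enabled per invocation like this (the variable name comes from the discussion; the accepted value is an assumption):

```shell
# Sketch: let the application render unthrottled, dropping frames the
# display side cannot keep up with (assumed semantics of the variable).
PRIMUS_DROPFRAMES=1 primusrun glxspheres
```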

@amonakov closed this as completed Sep 3, 2012
@karolherbst
Contributor Author

Yes. I guessed that it would let rendering run unthrottled. Via optirun I get "only" around 200 fps.
The same behaviour shows up with glxgears:
intel => 6000 fps
optirun => 1200 fps
primusrun => 50000 fps

I am interested in this test scenario because optirun seems to lose a lot of fps at high frame rates. In glxspheres this effect is less pronounced:

intel => 220 fps
optirun => 200 fps
primusrun => 1000 fps
