Unset pixel buffer when x0vncserver client disconnects. #515
Conversation
The changes in
I should have described it properly.
Alright. Could you update the commit message? Or put that part in a separate commit?
Originally, calling VNCServerST::setPixelBuffer(PixelBuffer* pb_) with pb_=0 would do nothing. With this change, pb will be set to 0 and the deferred update timer will be stopped.
In XDesktop::start() we allocate pixel buffer and set it as the backend to the given VNCServer. In XDesktop::stop() we deallocate the buffer, so we must unset it from the VNCServer as well. Otherwise the VNCServer could try to access it and crash, for example in deferred update.
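The described behavior can be sketched with a minimal stand-in class. This is an illustration only, not the real TigerVNC code: the class name `MiniServer`, the boolean frame-clock flag, and the simplified `setPixelBuffer` signature (the real one also takes a `ScreenSet` layout) are all assumptions made for the sketch.

```cpp
#include <cassert>
#include <cstddef>

// Simplified stand-in for rfb::PixelBuffer.
struct PixelBuffer { int w, h; };

// Hypothetical miniature of VNCServerST, modeling only the patched
// setPixelBuffer() behavior discussed in this PR.
class MiniServer {
public:
  MiniServer() : pb(nullptr), frameClockRunning(false) {}

  // With the patch, a null buffer clears the stored pointer and stops
  // the deferred-update (frame) clock instead of doing nothing.
  void setPixelBuffer(PixelBuffer* pb_) {
    pb = pb_;
    if (!pb) {
      stopFrameClock();   // no buffer: nothing left to defer updates for
      return;
    }
    startFrameClock();    // buffer present: (re)start deferred updates
  }

  PixelBuffer* buffer() const { return pb; }
  bool clockRunning() const { return frameClockRunning; }

private:
  void startFrameClock() { frameClockRunning = true; }
  void stopFrameClock()  { frameClockRunning = false; }

  PixelBuffer* pb;
  bool frameClockRunning;
};
```

After `setPixelBuffer(nullptr)`, later deferred-update callbacks no longer dereference a freed buffer, which is the crash scenario described above.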
Force-pushed from b1d7c2c to bd9f89b
Ok, split into two commits now.
@@ -313,6 +313,8 @@ void VNCServerST::setPixelBuffer(PixelBuffer* pb_, const ScreenSet& layout)
   screenLayout = layout;
 
+  if (!pb) {
+    stopFrameClock();
Perhaps it is better to do this when the desktop stops? I.e. in VNCServerST::removeSocket().
I have placed it there because it is symmetric with startFrameClock() - that is called when the pixel buffer is set, so stopFrameClock() is called when the pixel buffer is unset.
Currently the call chain is VNCServerST::removeSocket() -> rfb::SDesktop::stop() (in this case x0vncserver's XDesktop::stop) -> VNCServerST::setPixelBuffer(0). But other implementations of rfb::SDesktop::stop may not want to unset the pixel buffer there, so the timer would stop and not start again for the next connection? (Not sure.)
Right, other servers may not call VNCServerST::setPixelBuffer(0). However, that won't wedge things, as the clock will start again once new changes happen. The only downside is that the clock will continue to run for a bit after the desktop is "stopped", which might be a bit confusing. My idea was to make it more symmetrical with SDesktop::start()/stop().
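The call chain discussed above can be sketched roughly as follows. The class bodies here are hypothetical stand-ins modeling only the control flow from the comments (real TigerVNC `removeSocket`, `stop`, and `setPixelBuffer` do much more):

```cpp
#include <cassert>

struct MiniServer;

// Stand-in for rfb::SDesktop: the platform desktop interface.
struct SDesktop {
  virtual ~SDesktop() {}
  virtual void stop(MiniServer& server) = 0;
};

// Stand-in for VNCServerST, tracking only whether a buffer is set and
// whether the deferred-update frame clock is running.
struct MiniServer {
  bool hasBuffer = false;
  bool frameClockRunning = false;

  // Patched behavior: unsetting the buffer also stops the frame clock.
  void setPixelBuffer(bool present) {
    hasBuffer = present;
    frameClockRunning = present;
  }

  // removeSocket() ends up stopping the desktop when the last client
  // disconnects.
  void removeSocket(SDesktop& desktop) { desktop.stop(*this); }
};

// Stand-in for x0vncserver's XDesktop: it frees its buffer on stop(),
// so it must also unset it from the server (the fix in this PR).
struct XDesktop : SDesktop {
  void stop(MiniServer& server) override {
    server.setPixelBuffer(false);  // setPixelBuffer(0) in the real code
  }
};
```

As noted in the discussion, a desktop implementation that does not call `setPixelBuffer(0)` in its `stop()` would simply leave the clock running a bit longer; it restarts on the next change, so nothing wedges.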
   // Check that the screen layout is still valid
-  if (!layout.validate(pb_->width(), pb_->height())) {
+  if (pb_ && !layout.validate(pb_->width(), pb_->height())) {
We should probably reset the screen layout if we are removing the pixel buffer.
Ok, I'll add that.
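The suggested layout reset can be sketched like this. Again a simplified stand-in, not the real code: the real `rfb::ScreenSet` and `setPixelBuffer` are richer, and the reset shown here only models the idea that a stale layout must not outlive its buffer.

```cpp
#include <cassert>
#include <vector>

// Simplified stand-ins for rfb::Screen / rfb::ScreenSet / rfb::PixelBuffer.
struct Screen { int w, h; };
struct ScreenSet {
  std::vector<Screen> screens;
  bool empty() const { return screens.empty(); }
};
struct PixelBuffer { int w, h; };

// Hypothetical miniature server: when the pixel buffer is removed, the
// cached screen layout is cleared too, so it can never be validated
// against a buffer that no longer exists.
struct MiniServer {
  PixelBuffer* pb = nullptr;
  ScreenSet screenLayout;

  void setPixelBuffer(PixelBuffer* pb_, const ScreenSet& layout) {
    pb = pb_;
    screenLayout = layout;
    if (!pb)
      screenLayout = ScreenSet();  // reset the layout along with the buffer
  }
};
```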
I noticed that WinVNC calls
Ping @michalsrb
Updated and merged to master now.
In XDesktop::start() we allocate pixel buffer and set it as the backend to the given VNCServer.
In XDesktop::stop() we deallocate the buffer, so we must unset it from the VNCServer as well.
Otherwise the VNCServer could try to access it and crash, for example in deferred update.
This prevents crashes after a client disconnects from x0vncserver. A backtrace of such a crash looks like this: