
Screen tearing and input delay #10

Open
Advait1306 opened this issue Jan 19, 2024 · 11 comments
Advait1306 commented Jan 19, 2024

I made changes to the example given in the library, but I'm facing a lot of tearing and input lag issues. I have attached a video as well so you can check it out.

In the draw function I changed the way bytes are written to the video buffer, but that doesn't seem to improve anything.

use anyhow::{Context, Result};
use minifb::{Key, Window, WindowOptions};
use tokio::{self, net::TcpStream};
use tracing::Level;
use vnc::{PixelFormat, Rect, VncConnector, VncEvent, X11Event};

struct CanvasUtils {
    window: Window,
    video: Vec<u32>,
    width: u32,
    height: u32,
}

impl CanvasUtils {
    fn new() -> Result<Self> {
        Ok(Self {
            window: Window::new(
                "mstsc-rs Remote Desktop in Rust",
                800_usize,
                600_usize,
                WindowOptions::default(),
            )
            .with_context(|| "Unable to create window".to_string())?,
            video: vec![],
            width: 800,
            height: 600,
        })
    }

    fn init(&mut self, width: u32, height: u32) -> Result<()> {
        let mut window = Window::new(
            "mstsc-rs Remote Desktop in Rust",
            width as usize,
            height as usize,
            WindowOptions::default(),
        )
        .with_context(|| "Unable to create window")?;
        window.limit_update_rate(Some(std::time::Duration::from_micros(16600)));
        self.window = window;
        self.width = width;
        self.height = height;
        self.video.resize(height as usize * width as usize, 0);
        Ok(())
    }

    fn draw(&mut self, rect: Rect, data: Vec<u8>) -> Result<()> {
        println!("width: {} height: {}", rect.width, rect.height);

        let bytes_per_pixel = 4;
        let grouped_pix: Vec<_> = data.chunks_exact(bytes_per_pixel).collect();
        let converted_data = grouped_pix
            .iter()
            .map(|x| u32::from_le_bytes(x[0..bytes_per_pixel].try_into().unwrap()) & 0x00_ff_ff_ff)
            .collect::<Vec<_>>();

        for y in 0..rect.height as usize {
            let start = (rect.y as usize + y) * self.width as usize + rect.x as usize;
            let converted_slice =
                &converted_data[y * rect.width as usize..(y + 1) * rect.width as usize];

            self.video[start..start + rect.width as usize].copy_from_slice(converted_slice);
        }

        Ok(())
    }

    fn flush(&mut self) -> Result<()> {
        self.window
            .update_with_buffer(&self.video, self.width as usize, self.height as usize)
            .with_context(|| "Unable to update screen buffer")?;
        Ok(())
    }

    fn copy(&mut self, dst: Rect, src: Rect) -> Result<()> {
        println!("Copy");
        let mut tmp = vec![0; src.width as usize * src.height as usize];
        let mut tmp_idx = 0;
        for y in 0..src.height as usize {
            let mut s_idx = (src.y as usize + y) * self.width as usize + src.x as usize;
            for _ in 0..src.width {
                tmp[tmp_idx] = self.video[s_idx];
                tmp_idx += 1;
                s_idx += 1;
            }
        }
        tmp_idx = 0;
        for y in 0..src.height as usize {
            let mut d_idx = (dst.y as usize + y) * self.width as usize + dst.x as usize;
            for _ in 0..src.width {
                self.video[d_idx] = tmp[tmp_idx];
                tmp_idx += 1;
                d_idx += 1;
            }
        }
        Ok(())
    }

    fn close(&self) {}

    fn test(&mut self) {
        self.init(1920, 1080);
    }

    fn hande_vnc_event(&mut self, event: VncEvent) -> Result<()> {
        match event {
            VncEvent::SetResolution(screen) => {
                tracing::info!("Resize {:?}", screen);
                self.init(screen.width as u32, screen.height as u32)?
            }
            VncEvent::RawImage(rect, data) => {
                self.draw(rect, data)?;
            }
            VncEvent::Bell => {
                tracing::warn!("Bell event got, but ignore it");
            }
            VncEvent::SetPixelFormat(_) => unreachable!(),
            VncEvent::Copy(dst, src) => {
                self.copy(dst, src)?;
            }
            VncEvent::JpegImage(_rect, _data) => {
                tracing::warn!("Jpeg event got, but ignore it");
            }
            VncEvent::SetCursor(rect, data) => {
                if rect.width != 0 {
                    self.draw(rect, data)?;
                }
            }
            VncEvent::Text(string) => {
                tracing::info!("Got clipboard message {}", string);
            }
            _ => tracing::error!("{:?}", event),
        }
        Ok(())
    }
}

#[tokio::main]
async fn main() -> Result<()> {
    // Create tracing subscriber
    #[cfg(debug_assertions)]
    let subscriber = tracing_subscriber::fmt()
        .pretty()
        .with_max_level(Level::TRACE)
        .finish();

    #[cfg(not(debug_assertions))]
    let subscriber = tracing_subscriber::fmt()
        .pretty()
        .with_max_level(Level::INFO)
        .finish();

    tracing::subscriber::set_global_default(subscriber).expect("setting default subscriber failed");

    let tcp = TcpStream::connect("35.244.38.3:5901").await?;
    let vnc = VncConnector::new(tcp)
        .set_auth_method(async move { Ok("123456".to_string()) })
        .add_encoding(vnc::VncEncoding::Tight)
        .add_encoding(vnc::VncEncoding::Zrle)
        .add_encoding(vnc::VncEncoding::CopyRect)
        .add_encoding(vnc::VncEncoding::Raw)
        .allow_shared(true)
        .set_pixel_format(PixelFormat::bgra())
        .build()?
        .try_start()
        .await?
        .finish()?;

    let mut canvas = CanvasUtils::new()?;
    // canvas.test();
    let mut now = std::time::Instant::now();
    let mut pressed_keys = Vec::<u32>::new();

    loop {
        let mut events = Vec::<X11Event>::new();

        match vnc.poll_event().await {
            Ok(Some(e)) => {
                let _ = canvas.hande_vnc_event(e);
            }
            Ok(None) => (),
            Err(e) => {
                tracing::error!("{}", e.to_string());
                break;
            }
        }

        if now.elapsed().as_millis() > 16 {
            canvas
                .window
                .get_keys_pressed(minifb::KeyRepeat::No)
                .iter()
                .for_each(|key| {
                    let converted_key = convert_key_to_u32(*key);

                    if !pressed_keys.contains(&converted_key) {
                        println!("Key pressed: {:?}", key);
                        pressed_keys.push(converted_key);
                        events.push(X11Event::KeyEvent((converted_key, true).into()))
                    }
                });

            canvas.window.get_keys_released().iter().for_each(|key| {
                let converted_key = convert_key_to_u32(*key);

                if pressed_keys.contains(&converted_key) {
                    println!("Key released: {:?}", key);
                    pressed_keys.retain(|&x| x != converted_key);
                    events.push(X11Event::KeyEvent((convert_key_to_u32(*key), false).into()))
                }
            });

            let mut x = 0;
            let mut y = 0;

            canvas
                .window
                .get_mouse_pos(minifb::MouseMode::Clamp)
                .map(|mouse| {
                    // println!("Mouse position: x {} y {}", mouse.0 as u16, mouse.1 as u16);
                    x = mouse.0 as u16;
                    y = mouse.1 as u16;
                });

            // canvas.window.get_scroll_wheel().map(|scroll| {
            //     println!("scrolling - x {} y {}", scroll.0, scroll.1);
            // });

            let left_down = canvas.window.get_mouse_down(minifb::MouseButton::Left);
            // println!("is left down? {}", left_down);

            let right_down = canvas.window.get_mouse_down(minifb::MouseButton::Right);
            // println!("is right down? {}", right_down);

            let middle_down = canvas.window.get_mouse_down(minifb::MouseButton::Middle);
            // println!("is middle down? {}", middle_down);

            let mut mask: u8 = 0;

            if left_down {
                mask |= 1;
            }

            if middle_down {
                mask |= 1 << 1
            }

            if right_down {
                mask |= 1 << 2
            }

            events.push(X11Event::PointerEvent((x, y, mask).into()));

            // Add code for receiver of input events
            for event in events {
                let _ = vnc.input(event).await;
            }

            let _ = canvas.flush();
            let _ = vnc.input(X11Event::Refresh).await;
            now = std::time::Instant::now();
        }
    }
    canvas.close();
    let _ = vnc.close().await;
    Ok(())
}
Screen.Recording.2024-01-19.at.2.16.57.PM.mov
@HsuJv
Owner

HsuJv commented Jan 19, 2024

Hi @Advait1306

I'll be available in about a day and will have a look at this issue then.

BRs

@HsuJv
Owner

HsuJv commented Jan 20, 2024

Hi @Advait1306 ,

Several steps to improve the streaming quality:

  1. Run with the release build
  2. Remove the println!("width: {} height: {}", rect.width, rect.height); in the CanvasUtils::draw method
  3. Change the render interval from 16 ms to 1 ms
  4. If possible, use structs like MouseUtil and KeyUtil to record the state of the mouse & keys, and only send an event when it changes.

Hope that will be of help to you

BRs

@HsuJv
Owner

HsuJv commented Jan 20, 2024

A simple example of the MouseUtil:

Note that it's also recommended to use one thread to handle input events and one to handle output events,
but that would make the code more complex, so I didn't implement it here.

use anyhow::{Context, Result};
use minifb::{Key, Window, WindowOptions};
use tokio::{self, net::TcpStream};
use tracing::Level;
use vnc::{PixelFormat, Rect, VncConnector, VncEvent, X11Event};

struct CanvasUtils {
    window: Window,
    video: Vec<u32>,
    width: u32,
    height: u32,
}

impl CanvasUtils {
    fn new() -> Result<Self> {
        Ok(Self {
            window: Window::new(
                "mstsc-rs Remote Desktop in Rust",
                800_usize,
                600_usize,
                WindowOptions::default(),
            )
            .with_context(|| "Unable to create window".to_string())?,
            video: vec![],
            width: 800,
            height: 600,
        })
    }

    fn init(&mut self, width: u32, height: u32) -> Result<()> {
        let mut window = Window::new(
            "mstsc-rs Remote Desktop in Rust",
            width as usize,
            height as usize,
            WindowOptions::default(),
        )
        .with_context(|| "Unable to create window")?;
        window.limit_update_rate(Some(std::time::Duration::from_micros(16600)));
        self.window = window;
        self.width = width;
        self.height = height;
        self.video.resize(height as usize * width as usize, 0);
        Ok(())
    }

    fn draw(&mut self, rect: Rect, data: Vec<u8>) -> Result<()> {
        // println!("width: {} height: {}", rect.width, rect.height);

        let bytes_per_pixel = 4;
        let grouped_pix: Vec<_> = data.chunks_exact(bytes_per_pixel).collect();
        let converted_data = grouped_pix
            .iter()
            .map(|x| u32::from_le_bytes(x[0..bytes_per_pixel].try_into().unwrap()) & 0x00_ff_ff_ff)
            .collect::<Vec<_>>();

        for y in 0..rect.height as usize {
            let start = (rect.y as usize + y) * self.width as usize + rect.x as usize;
            let converted_slice =
                &converted_data[y * rect.width as usize..(y + 1) * rect.width as usize];

            self.video[start..start + rect.width as usize].copy_from_slice(converted_slice);
        }

        Ok(())
    }

    fn flush(&mut self) -> Result<()> {
        self.window
            .update_with_buffer(&self.video, self.width as usize, self.height as usize)
            .with_context(|| "Unable to update screen buffer")?;
        Ok(())
    }

    fn copy(&mut self, dst: Rect, src: Rect) -> Result<()> {
        println!("Copy");
        let mut tmp = vec![0; src.width as usize * src.height as usize];
        let mut tmp_idx = 0;
        for y in 0..src.height as usize {
            let mut s_idx = (src.y as usize + y) * self.width as usize + src.x as usize;
            for _ in 0..src.width {
                tmp[tmp_idx] = self.video[s_idx];
                tmp_idx += 1;
                s_idx += 1;
            }
        }
        tmp_idx = 0;
        for y in 0..src.height as usize {
            let mut d_idx = (dst.y as usize + y) * self.width as usize + dst.x as usize;
            for _ in 0..src.width {
                self.video[d_idx] = tmp[tmp_idx];
                tmp_idx += 1;
                d_idx += 1;
            }
        }
        Ok(())
    }

    fn close(&self) {}

    fn test(&mut self) {
        self.init(1920, 1080);
    }

    fn hande_vnc_event(&mut self, event: VncEvent) -> Result<()> {
        match event {
            VncEvent::SetResolution(screen) => {
                tracing::info!("Resize {:?}", screen);
                self.init(screen.width as u32, screen.height as u32)?
            }
            VncEvent::RawImage(rect, data) => {
                self.draw(rect, data)?;
            }
            VncEvent::Bell => {
                tracing::warn!("Bell event got, but ignore it");
            }
            VncEvent::SetPixelFormat(_) => unreachable!(),
            VncEvent::Copy(dst, src) => {
                self.copy(dst, src)?;
            }
            VncEvent::JpegImage(_rect, _data) => {
                tracing::warn!("Jpeg event got, but ignore it");
            }
            VncEvent::SetCursor(rect, data) => {
                if rect.width != 0 {
                    self.draw(rect, data)?;
                }
            }
            VncEvent::Text(string) => {
                tracing::info!("Got clipboard message {}", string);
            }
            _ => tracing::error!("{:?}", event),
        }
        Ok(())
    }
}

struct MouseUtil {
    pub mask: u8,
    pub x: u16,
    pub y: u16,
}

impl MouseUtil {
    fn new() -> Self {
        Self {
            mask: 0,
            x: 0,
            y: 0,
        }
    }

    fn changed(&mut self, window: &Window) -> bool {
        let mut x = 0;
        let mut y = 0;

        window.get_mouse_pos(minifb::MouseMode::Clamp).map(|mouse| {
            // println!("Mouse position: x {} y {}", mouse.0 as u16, mouse.1 as u16);
            x = mouse.0 as u16;
            y = mouse.1 as u16;
        });

        // canvas.window.get_scroll_wheel().map(|scroll| {
        //     println!("scrolling - x {} y {}", scroll.0, scroll.1);
        // });

        let left_down = window.get_mouse_down(minifb::MouseButton::Left);
        // println!("is left down? {}", left_down);

        let right_down = window.get_mouse_down(minifb::MouseButton::Right);
        // println!("is right down? {}", right_down);

        let middle_down = window.get_mouse_down(minifb::MouseButton::Middle);
        // println!("is middle down? {}", middle_down);

        let mut mask: u8 = 0;

        if left_down {
            mask |= 1;
        }

        if middle_down {
            mask |= 1 << 1
        }

        if right_down {
            mask |= 1 << 2
        }

        if (self.x, self.y, self.mask) != (x, y, mask) {
            (self.x, self.y, self.mask) = (x, y, mask);
            true
        } else {
            false
        }
    }
}

#[tokio::main]
async fn main() -> Result<()> {
    // Create tracing subscriber
    #[cfg(debug_assertions)]
    let subscriber = tracing_subscriber::fmt()
        .pretty()
        .with_max_level(Level::ERROR)
        .finish();

    #[cfg(not(debug_assertions))]
    let subscriber = tracing_subscriber::fmt()
        .pretty()
        .with_max_level(Level::INFO)
        .finish();

    tracing::subscriber::set_global_default(subscriber).expect("setting default subscriber failed");

    let tcp = TcpStream::connect("192.168.1.185:5900").await?;
    let vnc = VncConnector::new(tcp)
        .set_auth_method(async move { Ok("123456".to_string()) })
        .add_encoding(vnc::VncEncoding::Tight)
        .add_encoding(vnc::VncEncoding::Zrle)
        .add_encoding(vnc::VncEncoding::CopyRect)
        .add_encoding(vnc::VncEncoding::Raw)
        .allow_shared(true)
        .set_pixel_format(PixelFormat::bgra())
        .build()?
        .try_start()
        .await?
        .finish()?;

    let mut canvas = CanvasUtils::new()?;
    let mut mouse = MouseUtil::new();
    // canvas.test();
    let mut now = std::time::Instant::now();
    let mut pressed_keys = Vec::<u32>::new();

    loop {
        let mut events = Vec::<X11Event>::new();

        match vnc.poll_event().await {
            Ok(Some(e)) => {
                let _ = canvas.hande_vnc_event(e);
            }
            Ok(None) => (),
            Err(e) => {
                tracing::error!("{}", e.to_string());
                break;
            }
        }

        // canvas
        //     .window
        //     .get_keys_pressed(minifb::KeyRepeat::No)
        //     .iter()
        //     .for_each(|key| {
        //         let converted_key = convert_key_to_u32(*key);

        //         if !pressed_keys.contains(&converted_key) {
        //             println!("Key pressed: {:?}", key);
        //             pressed_keys.push(converted_key);
        //             events.push(X11Event::KeyEvent((converted_key, true).into()))
        //         }
        //     });

        // canvas.window.get_keys_released().iter().for_each(|key| {
        //     let converted_key = convert_key_to_u32(*key);

        //     if pressed_keys.contains(&converted_key) {
        //         println!("Key released: {:?}", key);
        //         pressed_keys.retain(|&x| x != converted_key);
        //         events.push(X11Event::KeyEvent((convert_key_to_u32(*key), false).into()))
        //     }
        // });

        if mouse.changed(&canvas.window) {
            let _ = vnc
                .input(X11Event::PointerEvent(
                    (mouse.x, mouse.y, mouse.mask).into(),
                ))
                .await;
        }

        if now.elapsed().as_millis() > 16 {
            // Add code for receiver of input events

            let _ = canvas.flush();
            let _ = vnc.input(X11Event::Refresh).await;
            now = std::time::Instant::now();
        }
    }
    canvas.close();
    let _ = vnc.close().await;
    Ok(())
}

@Advait1306
Author

Hey, thank you so much for these improvements. I'll look into implementing them over the weekend and get back to you with the results.

@Advait1306
Author

Hey, sorry for the late update. I finally moved the keyboard and button press events to another thread, and that improved the performance a lot.

Is there a way for me to set the output quality?

@HsuJv
Owner

HsuJv commented Mar 7, 2024

I guess you want to change the resolution of the VNC client?
If so, the answer is no: the resolution is unilaterally announced by the VNC server (via the ServerInit message). You can use a command like vncserver :1 -geometry 1920x1080 to get 1080p output, or vncserver :1 -geometry 4096x2160 to get 4K output.

BRs.

@Advait1306
Author

So when I use noVNC for my client, I get an option to set qualityLevel.

This is the description of that property:

Is an int in range [0-9] controlling the desired JPEG quality. Value 0 implies low quality and 9 implies high quality. Default value is 6.

What could be the equivalent in this library?

@HsuJv
Owner

HsuJv commented Mar 7, 2024

Nope; actually, from my point of view, that option only affects the "JPEG" encoding, which we currently don't support.
Note that we currently support only the Raw, Trle, Zrle, and Tight encodings.
Maybe I can add support for JPEG encoding later.

BRs

@Advait1306
Author

Okay, got it; that's the option I was looking for. I'll try implementing it myself and raise a PR. If I have any doubts, I'll create an issue and we can continue the conversation there.

Any links to documentation that can help me?

@HsuJv
Owner

HsuJv commented Mar 7, 2024

Well, it might not be so difficult if you want to have a try.

  1. Add the related encodings Jpeg, JpegQuality0, JpegQuality1, ..., JpegQuality9 to the VncEncoding enum (in main/src/config.rs), referring to https://github.com/novnc/noVNC/blob/master/core/encodings.js
  2. Add a jpeg.rs under the codec folder, referring to raw.rs
  3. Implement the Decoder structure and functions, referring to https://github.com/novnc/noVNC/blob/master/core/decoders/jpeg.js
  4. Add the JpegDecoder to the asycn_vnc_read_loop method in src/client/connection.rs
  5. Add APIs such as SetJpegQuality6 that call add_encoding(vnc::VncEncoding::JpegQuality6), just as we do when adding the other encodings.

If you have any questions, please feel free to ask.

BRs.

@HsuJv
Owner

HsuJv commented Mar 7, 2024

And there's one thing I'm not sure about: the uncompressed image data in JPEG encoding may need to be rendered differently from the other encodings.

For all the other encodings, the uncompressed data represents a sequence of 32-bit pixels, but the JPEG encoding probably carries an actual JPEG picture, which may need further handling on the client side.

You may need to test whether VncEvent::JpegImage or VncEvent::RawImage should be used when generating output events.
