Constant CPU usage per instance when idle #1782

Closed
Zapeth opened this issue Jul 6, 2019 · 5 comments

Comments


Zapeth commented Jul 6, 2019

I'm seeing a constant 2.7-3% CPU usage in top whenever a kitty instance is running; other terminals that I have tested (urxvt, alacritty) do not show this behavior.

Output of kitty --debug-config:

kitty 0.14.2 created by Kovid Goyal
Linux nichijou 4.19.57-1-lts #1 SMP Wed Jul 3 16:05:59 CEST 2019 x86_64
Arch Linux \r (\l)
Running under: X11

Config options different from defaults:

Hardware:

CPU:       Topology: Dual Core model: Intel Core i5-6300U bits: 64 type: MT MCP arch: Skylake rev: 3 L2 cache: 3072 KiB 
           flags: avx avx2 lm nx pae sse sse2 sse3 sse4_1 sse4_2 ssse3 vmx bogomips: 19968 
           Speed: 906 MHz min/max: 400/3000 MHz Core speeds (MHz): 1: 986 2: 983 3: 981 4: 994 
Graphics:  Device-1: Intel Skylake GT2 [HD Graphics 520] vendor: Lenovo driver: i915 v: kernel bus ID: 00:02.0 
           Display: tty server: X.org 1.20.5 driver: i915 resolution: <xdpyinfo missing> 
           OpenGL: renderer: Mesa DRI Intel HD Graphics 520 (Skylake GT2) v: 4.5 Mesa 19.1.1 direct render: Yes 

I also have Gentoo running on the same machine with Wayland, and a quick check seemed to show the same problem, so I guess it's either something in kitty's code or something device/driver-specific that causes this behavior.

I tried changing a couple of settings as well:

sync_to_monitor no -> had no visible effect on the usage

repaint_delay 100
input_delay 30

-> led to a reduced CPU usage of about 0.3-0.4%, but it is still a constant factor (whereas other terminals only occasionally show some non-zero usage)
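
For reference, the relevant kitty.conf lines for this experiment look roughly like this (the values are the ones from above, not the defaults; the comments are just my reading of what the options do):

# experimental values used for the test above
sync_to_monitor no    # had no visible effect on idle CPU usage
repaint_delay   100   # delay (in ms) between screen updates
input_delay     30    # delay (in ms) before output from the child program is processed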

Last but not least, I saw references in a couple of issues to a README with more detailed steps on how to debug this, but I'm unable to find it, so a link would be appreciated.

@kovidgoyal
Owner

Presumably whatever you are running in the terminal is constantly generating output; check with --dump-commands. You can also build from source with make debug-event-loop, which will print out a bunch of things per event loop tick.
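
Concretely, the two suggestions amount to something like this (the launcher path assumes you are running from a source checkout):

# show any commands the programs running inside the terminal send to kitty
kitty --dump-commands

# or, from a kitty source checkout: build with event loop debugging enabled
# and run the resulting launcher
make debug-event-loop
kitty/launcher/kitty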


Zapeth commented Jul 6, 2019

I have nothing running in the terminal; the cursor blink is the only visual change over time (and even that is disabled after the automatic timeout).

The debug-event-loop output seems to indicate that the program is constantly checking for new events with pretty much no timeout; I'm not sure whether that's intentional, but it would explain the constant usage -> https://pastebin.com/WajnnsCL
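
For illustration only (a sketch of the pattern the debug output suggests, not kitty's actual event loop code), the difference between polling with an effectively zero timeout and blocking until an event arrives:

/* Sketch only -- not kitty's code. A poll() with a zero timeout returns
 * immediately, so the loop spins and burns CPU even when idle; a negative
 * timeout blocks until a file descriptor is ready, so an idle loop costs
 * essentially nothing. */
#include <poll.h>

void busy_loop(struct pollfd *fds, nfds_t nfds) {
    for (;;) {
        poll(fds, nfds, 0);   /* returns at once -> constant CPU usage */
        /* handle any ready fds, redraw, ... */
    }
}

void blocking_loop(struct pollfd *fds, nfds_t nfds) {
    for (;;) {
        poll(fds, nfds, -1);  /* sleeps until an event arrives -> ~0% CPU when idle */
        /* handle ready fds, ... */
    }
}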


Luflosi commented Jul 6, 2019

I also see non-zero CPU usage on my machine, usually less than 1%, sometimes higher. Running make debug-event-loop and then launching kitty via kitty/launcher/kitty does not print any debug information on my machine, except for the occasional render frame re-request. Once App Nap, a macOS energy-saving feature that kicks in after an app has been inactive for a while, suspends kitty, the CPU usage drops to 0%.

@kovidgoyal
Owner

@Luflosi On macOS there will always be some CPU activity because of the way CVDisplayLink works, since it fires periodically at the monitor refresh rate. However, in master, kitty turns it off automatically after 30 seconds of inactivity. And of course, App Nap will also disable it automatically.
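
For context, the pattern being described looks roughly like this (a simplified sketch, not kitty's actual code; the helper function names are made up, only the CoreVideo calls are real):

/* Simplified sketch of the CVDisplayLink pattern -- not kitty's code. */
#include <CoreVideo/CoreVideo.h>

static CVDisplayLinkRef display_link;

/* Called by CoreVideo once per monitor refresh while the link is running,
 * which is why some CPU activity is always visible. */
static CVReturn on_refresh(CVDisplayLinkRef link, const CVTimeStamp *now,
                           const CVTimeStamp *output_time, CVOptionFlags flags_in,
                           CVOptionFlags *flags_out, void *ctx) {
    /* wake the main loop / request a render frame here */
    return kCVReturnSuccess;
}

void start_refresh_callbacks(void) {        /* hypothetical helper name */
    CVDisplayLinkCreateWithActiveCGDisplays(&display_link);
    CVDisplayLinkSetOutputCallback(display_link, on_refresh, NULL);
    CVDisplayLinkStart(display_link);       /* fires at the display refresh rate */
}

void stop_refresh_callbacks(void) {         /* e.g. after ~30s of inactivity */
    CVDisplayLinkStop(display_link);
}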


Zapeth commented Jul 7, 2019

I can confirm that this fixes the issue, thanks.

Though this led me to notice that even when events are happening (cursor blink, typing, mouse movement, etc.) kitty's CPU usage is abnormally high, even with increased delays, at least compared to urxvt (1-3% vs 0.3-0.7%).

Maybe I'll investigate further at a later point, now that I'm more familiar with the event loop and with building from source.

kovidgoyal added a commit that referenced this issue Jul 15, 2019
This matches behavior on macOS. Had initially set the code to process
on every loop tick in an attempt to work around the issue of the event
loop freezing on X11 until an X event is delivered. However, in light
of #1782 that workaround was incorrect anyway. Better to have similar
behavior across platforms. This also has the advantage of reducing CPU
consumption.

Also add a simple program to test event loop wakeups.
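
A minimal sketch of the idea behind such a wakeup test (an illustration, not the program actually added by the commit): count how often an otherwise idle poll()-based loop wakes up over a fixed interval.

/* Illustration only: a loop that blocks indefinitely should report almost no
 * wakeups, while one with a short timeout will report many. */
#include <poll.h>
#include <stdio.h>
#include <time.h>

int main(void) {
    struct pollfd pfd = { .fd = 0, .events = POLLIN };  /* watch stdin only */
    long wakeups = 0;
    time_t start = time(NULL);
    while (time(NULL) - start < 5) {
        poll(&pfd, 1, 10);  /* 10 ms timeout -> roughly 100 wakeups per second */
        wakeups++;
    }
    printf("%ld wakeups in 5 seconds\n", wakeups);
    return 0;
}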