Description
When I first launch a kitty terminal window and then run a python script to get the screen dimensions, the returned height and width are twice the true values. If I then increase the font size and decrease back to its original size, the reported dimensions are correct.
See below for the get_screen_size.py script I am using. Just as an example, the initial output on launching kitty into an empty workspace is:
```
s.rows=58
s.cols=212
s.width=3816
s.height=2088
```
After I increase the kitty font size (bound to ctrl+shift+equal) and decrease it back to original size (ctrl+shift+minus) I get:
```
s.rows=58
s.cols=212
s.width=1908
s.height=1044
```
And thereafter all the results are correct.
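The doubled numbers look consistent with a HiDPI scale factor of 2 (an assumption on my part; my Sway output is scaled): the pixel fields are exactly twice their later values while rows and cols never change, so the implied cell size also comes out doubled. A quick sanity check of the arithmetic:

```python
# Values reported by TIOCGWINSZ, copied from the outputs above.
rows, cols = 58, 212
initial = (3816, 2088)  # (width, height) right after launching kitty
after = (1908, 1044)    # (width, height) after the font-size round trip

# Pixel fields are exactly doubled; rows/cols are unchanged.
print(initial[0] / after[0], initial[1] / after[1])  # 2.0 2.0

# Implied cell size in pixels for each report:
print(initial[0] // cols, initial[1] // rows)  # 18 36 (doubled)
print(after[0] // cols, after[1] // rows)      # 9 18 (a plausible real cell size)
```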
I am running kitty on Arch Linux under Wayland with the sway window manager. It is possible that the problem lies outside kitty, but a kitty issue seemed most likely, so I am wondering whether anyone can reproduce this.
```python
# get_screen_size.py
import sys
import array
import fcntl
import termios
from dataclasses import dataclass


@dataclass
class ScreenSize:
    # @dataclass generates an __init__ with these fields
    rows: int
    cols: int
    width: int
    height: int


def get_screen_size():
    # TIOCGWINSZ fills a struct winsize: ws_row, ws_col, ws_xpixel, ws_ypixel
    buf = array.array('H', [0, 0, 0, 0])
    fcntl.ioctl(sys.stdout, termios.TIOCGWINSZ, buf)
    rows, cols, width, height = tuple(buf)
    return ScreenSize(rows, cols, width, height)


if __name__ == "__main__":
    s = get_screen_size()
    print(f"{s.rows=}\n{s.cols=}\n{s.width=}\n{s.height=}")
```
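As a cross-check that bypasses the ioctl, the terminal can also be asked for its text-area size in pixels via the XTWINOPS query `CSI 14 t`, which kitty should answer with a reply of the form `ESC [ 4 ; height ; width t`. A rough sketch, under the assumption that the terminal supports this query (the raw-mode plumbing only matters when run interactively):

```python
import os
import re
import sys
import termios
import tty


def parse_xtwinops_reply(reply: bytes):
    """Parse an ESC[4;<height>;<width>t reply into (height, width)."""
    m = re.fullmatch(rb'\x1b\[4;(\d+);(\d+)t', reply)
    if m is None:
        raise ValueError(f"unexpected reply: {reply!r}")
    return int(m.group(1)), int(m.group(2))


def query_pixel_size():
    """Send CSI 14 t and read the reply; requires a real terminal."""
    fd = sys.stdin.fileno()
    old = termios.tcgetattr(fd)
    try:
        tty.setraw(fd)  # keep the reply out of the line buffer / echo
        os.write(sys.stdout.fileno(), b'\x1b[14t')
        reply = b''
        while not reply.endswith(b't'):
            reply += os.read(fd, 1)
        return parse_xtwinops_reply(reply)
    finally:
        termios.tcsetattr(fd, termios.TCSADRAIN, old)


if __name__ == "__main__":
    if sys.stdin.isatty():
        height, width = query_pixel_size()
        print(f"{width=} {height=}")
```

If this query returns the correct pixel size while TIOCGWINSZ reports the doubled one (or vice versa), that would help narrow down where the stale value lives.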