
Font sizes don't respect the monitor DPI on Linux #2596

Closed
fschutt opened this issue Mar 31, 2018 · 7 comments

fschutt (Contributor) commented Mar 31, 2018

Whenever I create a FontInstanceKey with, let's say, 10px, the actual font that ends up on the screen is only 6 or 7 px tall.

For reference, I set the font in the title bar of the window to the same font at 10px:

Webly Sleeky UI:

[screenshot: azul_webrender_bug]

Droid Sans:

[screenshot: pic_azul_wrong_droidsans]

Both times, I set the font size to exactly 10 px * AU_PER_PX. The problem is that I don't know what kind of scale factor I have to apply to get the actual, correct 10px font size on the screen. I originally noticed this as a bug in limn here (last bullet point), so it's not only my code.

What kind of scaling does add_font_instance() apply, and how do I get the font to be actually 10 pixels high? To be clear, the HiDPI factor is set to 1 in both examples: the high_dpi factor reports 1.0, and my screen is set to 96 DPI.

@fschutt fschutt changed the title FontInstanceKey doesn't create the correct font size FontInstanceKey doesn't use the correct font size Mar 31, 2018
fschutt (Contributor, Author) commented Mar 31, 2018

I see where the issue is:

```rust
FT_Set_Char_Size(
    face.face,
    (req_size * x_scale * 64.0 + 0.5) as FT_F26Dot6,
    (req_size * y_scale * 64.0 + 0.5) as FT_F26Dot6,
    0, // horizontal resolution (DPI); 0 falls back to FreeType's default
    0, // vertical resolution (DPI); 0 falls back to FreeType's default
)
```

The DPI values (the last two parameters) are set incorrectly. The FreeType reference says:

> Value of 0 for the horizontal resolution means ‘same as vertical resolution’, value of 0 for the vertical resolution means ‘same as horizontal resolution’. If both values are zero, 72 dpi is used for both dimensions.

So, FreeType assumes that the DPI is always 72. If I set the values to 96, I get the correct size without having to calculate a scale factor. For now, I think I can hack around this on Unix by multiplying the font size by (system DPI / 72). However, this should really be fixed.
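The workaround described above might be sketched like this (assumed names, not WebRender's actual code): pre-scale the requested size by (system DPI / 72) before converting it into the 26.6 fixed-point value that FT_Set_Char_Size expects.

```rust
// FreeType defaults to 72 DPI when the resolution arguments of
// FT_Set_Char_Size are 0, so compensate by scaling the requested size.
const FT_DEFAULT_DPI: f32 = 72.0;

/// Convert a requested size (in px at `screen_dpi`) into the 26.6
/// fixed-point value FreeType expects, assuming the resolution
/// arguments stay 0 (i.e. FreeType's 72 DPI default).
fn char_size_26dot6(req_size_px: f32, screen_dpi: f32) -> i64 {
    let scaled = req_size_px * (screen_dpi / FT_DEFAULT_DPI);
    (scaled * 64.0 + 0.5) as i64
}

fn main() {
    // At 96 DPI, a requested 10px size becomes 10 * 96/72 ≈ 13.33px,
    // i.e. 853 in 26.6 fixed point.
    println!("{}", char_size_26dot6(10.0, 96.0)); // prints 853
}
```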

@fschutt fschutt changed the title FontInstanceKey doesn't use the correct font size Font sizes don't respect the monitor DPI on Linux Mar 31, 2018
fschutt (Contributor, Author) commented Mar 31, 2018

The questions are:

  • Does this appear on other operating systems, too? (I can't test this currently.)
  • How do we get the actual DPI of the monitor? Pure OpenGL doesn't provide this.

Also, if I multiply the font size by 96 / 72, I don't get the same result as if I put 96 into the FT_Set_Char_Size function (for whatever reason). Correction: the previous sentence is wrong, you actually do get the same result. I would propose adding the DPI to the Device struct. Then implementations can calculate the DPI for their backend (window, offscreen) however they wish.

fschutt (Contributor, Author) commented May 24, 2018

Ah, I think I realize what the error was:

The font in the title bar is 10pt, but the webrender font is 10px. I was assuming that 10px equals 10 pixels on the screen. This doesn't seem to be the case: if you go to any site (even in Chrome), set the font size of something to 20px, then take a screenshot and measure the height, it won't actually be 20 pixels high.

So in short, if you want webrender to create a font that is exactly X pixels high, the size should be [font size] multiplied by 96.0 / 72.0 (pt-to-px conversion). Maybe this could be expressed with a newtype struct.
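The newtype idea mentioned above could look like this (hypothetical `Pt`/`Px` types, not part of WebRender's API): keep point and pixel sizes as distinct types so the 96/72 conversion happens in exactly one place.

```rust
/// A size in points (1 pt = 1/72 inch).
#[derive(Debug, Clone, Copy, PartialEq)]
struct Pt(f32);

/// A size in CSS pixels (1 px = 1/96 inch).
#[derive(Debug, Clone, Copy, PartialEq)]
struct Px(f32);

impl Px {
    /// px to pt: 1 px = 72/96 pt.
    fn to_pt(self) -> Pt {
        Pt(self.0 * 72.0 / 96.0)
    }
}

impl Pt {
    /// pt to px: 1 pt = 96/72 px.
    fn to_px(self) -> Px {
        Px(self.0 * 96.0 / 72.0)
    }
}

fn main() {
    // A 10 px target corresponds to 7.5 pt, and converting back is lossless.
    println!("{:?}", Px(10.0).to_pt()); // Pt(7.5)
    println!("{:?}", Pt(7.5).to_px()); // Px(10.0)
}
```

With this, an API taking `Pt` instead of a bare `f32` makes the pt-vs-px confusion a compile error rather than a rendering bug.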

kvark (Member) commented May 29, 2018

> How do we get the actual DPI of the monitor? Pure OpenGL doesn't provide this.

winit provides it: https://docs.rs/winit/0.15.0/winit/struct.Window.html#method.hidpi_factor

> So in short, if you want webrender to create a font with exactly X pixels high, it should be [font size] multiplied by 96.0 / 72.0 (pt-to-px conversion). Maybe this could be expressed with a newtype struct.

Don't you think it should be fixed on our side? We could compute the DPI for font rasterization from the base of 72 and our device-pixel ratio (which we get from winit).

gw3583 (Collaborator) commented May 29, 2018

I'd be hesitant to change how WR calculates this internally unless someone on Gecko agrees it's a good change.

gw3583 (Collaborator) commented Aug 2, 2018

@fschutt Is this still relevant?

fschutt (Contributor, Author) commented Aug 2, 2018

For me, the 96.0 / 72.0 conversion is "good enough", and I also incorporate winit's "physical size / virtual size" ratio of the window. Maybe it should be mentioned more explicitly that webrender expects points, not pixels. But this was a bug on my part, I think, and there shouldn't be any change needed on the webrender side.

I was just concerned because webrender didn't render things as large as I expected, so I thought something was broken inside the renderer / rasterizer.
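The final approach described above might be sketched as follows (assumed names; winit's physical and logical window sizes are passed in as plain numbers rather than taken from a real window):

```rust
/// Convert a desired on-screen pixel size into the size handed to
/// webrender, applying the pt-to-px factor (96/72) and the window's
/// physical/logical size ratio (winit's hidpi factor).
fn wr_font_size(desired_px: f32, physical_size: f32, logical_size: f32) -> f32 {
    let hidpi = physical_size / logical_size;
    desired_px * (96.0 / 72.0) * hidpi
}

fn main() {
    // A 10 px target on a 2x HiDPI screen (e.g. 2560 physical vs 1280 logical).
    println!("{}", wr_font_size(10.0, 2560.0, 1280.0));
}
```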

fschutt closed this Aug 2, 2018