[css-fonts] Proposal to extend CSS font-optical-sizing #4430
@Lorp thanks for raising this, and for the detailed and clear write-up. @litherum it would be interesting to hear the WebKit perspective on this. Was this simply an oversight, so it should be treated as a spec-compliance browser bug, or was this a deliberate decision and if so, what was the rationale? |
Laurence and Adam, thanks for the proposal and what sounds like a generally reasonable approach to me. However, I have some questions on the Controversy section.
Could you clarify for which versions and environments you arrived at this conclusion? When I implemented font-optical-sizing, I found that latest Safari tip of tree uses CSS px (1, 2), and so does Firefox last time I checked (so I disagree with "Most browser implementers interpret"). Agree that, still, there is potentially an interoperability issue between, say, a printing use of a font vs. its use as a web font, and there is no affordance for mapping to the intended |
The problem is that all browser implementations ignored the OpenType Spec from 2016 and did something different, all in the same way, but we also now have a lot of fonts that were made with opsz axes according to the OpenType Spec. A CSS px is device pixels per inch / 96, and a CSS pt is device ppi / 72. This is a big deal and imho the default value of this property should be 0.75. |
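The unit relationships above can be sketched numerically (a minimal illustration using CSS's fixed definitions of 1in = 96px = 72pt; the 0.75 ratio is the proposal under discussion, not current shipping behaviour, and the function names are invented here):

```python
# CSS absolute lengths: 1in = 96px = 72pt, so 1px = 72/96 pt = 0.75pt.

def css_px_to_pt(px: float) -> float:
    """Convert CSS px to CSS pt using the fixed 96:72 ratio."""
    return px * 72 / 96

def opsz_for_font_size(font_size_px: float, ratio: float = 0.75) -> float:
    """Proposed mapping: opsz = font-size in px multiplied by a ratio.
    ratio=1.0 models current browser behaviour (1px : 1opsz);
    ratio=0.75 models the OpenType pt-based definition."""
    return font_size_px * ratio

# A 16px font-size is 12pt; a 0.75 default ratio would select opsz=12,
# whereas today's 1.0 ratio selects opsz=16.
```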
And even if we don't end up adding this property, the spec should clarify that |
I notice that the one rendering-based WPT test for optical sizing checks that the result of
matches. In other words, the test checks that |
@svgeesus agreed @davelab6 agreed, with the caveat that your “pixels per inch” should be in quotes, since these pixels are of course based on the visual angle subtended by an idealized median desktop pixel from the year 2000, and thus the physical measure of 1pt in CSS varies significantly between devices/user agents :) But that’s a whole nuther discussion. @drott thanks for that important correction. As Dave and Chris confirm, the fixed opsz:px ratio value of 1.0 is definitely too high, and even if it were set at 0.75 in all browsers I would resubmit the proposal for the benefits of varying it. |
Minor comment: Why use a float value for PS: A |
The observation that we made when we implemented optical sizing is that 1 typographic point = 1 CSS pixel. This is clearly true: Safari, Firefox, Chrome, Pages, Microsoft Word, raw Core Text, and TextEdit all agree. Here is an image of Ahem rendered in all of those using a size of 48 typographic points: You can see that the rendered size is the same in all of these different apps. In addition, Pages even puts units on the font size: "pt". The documentation for raw Core Text (the second to rightmost app in the above image) also indicates that its size is in points:
So, when we apply an optical-sizing value of X typographic points, that is equal to CSS pixels, and we apply this correctly. I'm not making a point (no pun intended) about what should be true, or what the designers of desktop typographic systems intended to be true. Instead, I'm making a point about what is, de facto, true in reality. |
@litherum please could you show us this on windows? |
This is absolutely not true. See #614 |
We are absolutely applying it correctly. See #4430 (comment) |
@litherum when you say “typographic point” you seem to be referring to a very Apple-centric measurement, whose dimensions (measured with a ruler off a screen) started off in the 1980s as exactly 1/72 inches, conveniently aligning with early Macs having exactly 72 pixels to the inch. However since then, the size of the Apple 100% zoom screen point has varied significantly depending on the device. At the same time, Apple UIs and documentation continue to refer to this measure as “points” without very often reminding users or developers that each device applies its own scale factor such that these are no longer the 1/72 inch points defined in dictionaries. As Dave implies, Windows specified its own definition for “standard screen resolution” of 96ppi, and, respecting the idea that a real-world point is 1/72 inches, observed a 4:3 ratio for a font size: in a traditional Windows app, 9 points is 12 actual pixels. Higher device resolutions meant that these pixels became virtual, and that Microsoft, like Apple, could gradually shrink the size of that virtual pixel and, along with it, the physical size of the Windows point. In CSS, the idea of the “pt” unit is de facto standardized on the Windows relationship of points to pixels, and CSS px are based on the idea of the visual angle subtended by a single pixel on a ~2000-era computer. Thus modern browsers including Safari treat 3pt = 4px. It was therefore natural that font developers assumed browsers would adopt whatever the user agent defined a CSS pt to be as the scale for the opsz axis. BTW it is regrettable there is no online reference for what 1px (CSS) measures on modern devices. Here are two classic pt measurements and three data points I just measured with my ruler. In all cases, 1pt (CSS) is 4/3 the size of 1px.
|
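The complaint above that there is no online reference for what 1px (CSS) measures can be illustrated with a back-of-envelope calculation (the device numbers below are hypothetical, not the ruler measurements referred to in the comment):

```python
def css_px_physical_mm(device_ppi: float, device_pixel_ratio: float) -> float:
    """Physical size of one CSS px: a CSS px spans device_pixel_ratio
    hardware pixels, each 1/device_ppi inches wide (25.4 mm per inch)."""
    return 25.4 * device_pixel_ratio / device_ppi

# A hypothetical 218ppi laptop panel at 2x scaling: each CSS px is
# about 0.23mm, smaller than the nominal 1/96in (~0.26mm), which is
# why the physical size of 1px varies between devices.
```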
Yes, according to CSS Values and Units, 3 CSS points = 4 CSS pixels. We agree here. I'm claiming something different, about how CSS measurements relate to non-CSS measurements.
CSS px is intentionally divorced from physical measurements. Again, see #614 |
I'm not as nuanced in the CSS unit system, but it doesn't look to me like 1 typographic point == 1 CSS px, at least not on my Windows 10 machine. @litherum, am I misinterpreting your comment above?
The concern I have relates to how fonts are built. Microsoft Sitka has the following styles:
When a web developer sets font-size to 12pt (or 16px), it should be using the Text style of Sitka as that's optimized for body sizes (i.e. opsz=12). As I understand it, though, Safari, Firefox, and Chrome will pass 16 for the opsz in both these cases, resulting in Subheading being displayed, degrading the legibility of the font somewhat and deviating from the intention of the font designers. (Matthew Carter and John Hudson spent hours staring at different sizes and styles to determine these numbers, which is partly why they're strange numbers like 9.7). If my interpretation of how browsers are working is correct, I worry that font designers will be struck with a difficult choice: build your font for the web, or for print - because you'll need different values of opsz for each to get exactly the results you'd like (type designers being a picky lot). They may choose to ship two versions (much to the confusion of their customers), or set values based on web or print depending on what their particular customers tend to use (thus you'll have customer confusion when one type studio caters to print media and another to web). I hope, however, I'm just thoroughly confused and everything is fine (i.e. 12pt or 16px == opsz 12). |
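The Sitka concern can be made concrete with a toy style-selection table (the 9.7 break-point is from the comment above; the other break-points and the selection logic are invented for illustration and are not the actual Sitka values):

```python
# Hypothetical optical-size ranges for a Sitka-like family:
# each style covers opsz values up to its upper bound.
STYLES = [
    ("Small", 9.7),
    ("Text", 13.5),
    ("Subheading", 18.5),
    ("Display", float("inf")),
]

def style_for_opsz(opsz: float) -> str:
    """Return the first style whose range contains the opsz value."""
    for name, upper in STYLES:
        if opsz <= upper:
            return name
    return STYLES[-1][0]

# 12pt body text: a pt-based mapping passes opsz=12 -> "Text" (intended),
# while a px-based mapping passes opsz=16 -> "Subheading" (unintended).
```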
Yes, thank you for this suggestion, it was quite illuminating. Here, no apps state any units, but you can see that the size in the native apps is different than CSS pixels in the browsers. The blue app feeds “48” directly into
From this result, it appears that the size of a typographical point is different between Windows and macOS / iOS. This is a very interesting result, and I didn't realize it or try on Windows when implementing this. Thanks @davelab6 for the suggestion! |
This is very interesting. When a web developer sets font-size to 12pt (or 16px) on San Francisco on macOS & iOS, it should be using the optical sizing value of opsz=16. So, this seems to agree that the size of a typographical point is different depending on which platform you're using. We're making progress! |
Indeed. My “point” is that the virtual “points” used by Apple and Microsoft have real-world measurements that not only vary just as much as the CSS px, but are also defined differently from each other (with a ratio of 4:3) — and web browsers adopted the Microsoft definition. |
Font makers are going to be pretty consistent in interpreting a point to = 1/72 inch — that's how a point is defined in the mental space in which we operate, and has been for a long time — and that's the unit in which we specify values on the opsz axis. If there's a notion of a 'typographical point' in use in CSS or other environments that is different from the 1/72 inch point, then a) that seems a bad idea, and b) y'all are going to need to make scaling calculations to get the correct optical size design from the opsz axis. If it helps, we could add an explicit statement to the opsz axis spec that 'point' in that context = 1/72 inch. |
Am I reading this thread correctly, in that basically the only fonts that assume |
Thanks Laurence and Adam for bringing this up. That it comes up again and again is, I think, the result of not taking on issues of web typography in real time, but waiting for things to go wrong and then trying to fix them.
@tiroj That’s agreed, FB and others join you in making opsz decisions based on typographic points. And we make a series of decisions inside the em, about how points are going to be distributed among glyph measures. This relationship between what is inside of the em, and what was going to happen outside, within 1/10,000”, used to be known and proudly used to make a vast range of things people wanted to read, or see. That craft has not evolved very well, as we can see. Right now, if type needs a small size and a W3C presence, a rut pretty clearly exists where it’s best if everything opaque inside the em is around just one measure that rounds to a minimum of little more than two px (see the default sans serif fonts of the world), and that rut is swallowing the design of fonts, logos, and icons. So I’d like to be on board en route to discarding the tortured histories of non-standard rounding of 72, non-standard presentation of what was purported to be 72, personal opinions of other people’s opsz implementations, distance of the user for whatever reason, false reporting of ppi by device manufacturers, and adoption of any of the above by W3C, or in practice there. What to do to provoke a path to addressing the users’ stated device ppi, via ppi/72 = pts per pixel or pixels per point? That is the question I want answered that I think “px” alone, or associated in some magical way with an actual size like opsz, does not. |
I think I see what's happening.
Yes, we agree. Let's take a trip back in time, before the Web existed, when early Apple computers were being designed. Here, the OS was designed for 72dpi devices, such that one typographical point = 1 pixel on the screen. I don't think this is true for Windows (though someone can correct me if I'm wrong). This design has continued forward to today, and even into iOS, even being generalized from the concept of a pixel into the modern concept of "Cocoa point." Different physical devices are shipped with different physical DPIs because of physical constraints, but the design of the OS has followed this design from the beginning. Then, the Web was invented, and CSS along with it. Now, we fast forward to today, where we are discussing optical sizing. This is a feature that is defined to be represented in typographical points - specifically not physical pixels or CSS points. macOS and iOS are still designed on the principle that one Cocoa point = 1 typographic point. So, if someone was trying to achieve a measurement of 1 typographical point = 1/72 inches (not CSS inches!) on macOS or iOS, the correct way to achieve that would be to use a value of one Cocoa point, and the way of representing one Cocoa point in every browser on macOS & iOS is to use 1 CSS pixel. I can't speak about any other specific OSes, but we can consider a hypothetical OS designed such that 1 pixel = 1/96 inch = 3/4 typographical points. If someone was trying to achieve a measurement of 1 typographical point = 1/72 inches (not CSS inches!) on this hypothetical OS, the correct way to achieve that would be to use a value of 4/3 pixels, and the way of representing 4/3 pixels in the browser might be to use 4/3 CSS pixels. |
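The platform arithmetic in the comment above can be summarised in a small sketch (names are invented; "cocoa" models a 72-units-per-inch design, "win96" the hypothetical 96dpi OS):

```python
# OS design-unit densities: how many OS-level units per physical inch.
UNITS_PER_INCH = {"cocoa": 72, "win96": 96}

def os_units_per_typographic_point(os: str) -> float:
    """How many OS-level units make one typographic point (1/72 inch)."""
    return UNITS_PER_INCH[os] / 72

# macOS: 1 typographic point == 1 Cocoa point (72/72), and Mac browsers
# map 1 CSS px onto 1 Cocoa point.
# A 96dpi design: 1 typographic point == 4/3 pixels (96/72).
```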
Um, actually, some browsers violated the original spec and authors relied on their behavior, so CSS was changed to accommodate them and the rest of the browsers had to follow suit. Originally, |
If everyone agrees that Open Type for Only as a fallback, they may assume one of the classic values, i.e. 25400 µm = 1 inch = 72 or 96 device pixels, or an integer multiple thereof like 216 (Retina Unlike |
Hi Myles,
Isn't this where it falls apart? You've defined a Cocoa point as = 1/72 inch, but a CSS pixel is defined as 1/96 inch. So treating one typographic point as = one Cocoa point, but using one CSS pixel to represent one Cocoa point, is going to mess up the sizing of anything specified in typographic points. I'm almost afraid to ask what a 'CSS inch' is. Are you referring to the fact that at lower resolutions there is rounding in display of absolute measurements? Otherwise a CSS inch is the same as a standard inch, no? [The whole question of how best to implement resolution- and other device-dependent adjustments in OT variations design space is something most people are praying will go away. It may yet need to be better addressed.] |
Not at all. CSS pixels are not defined to have any physical length. A CSS inch is defined to be equal to 96 CSS pixels. It’s up to each UA to pick a size for 1 CSS pixel. All major browsers on the Mac agree to set 1 CSS pixel equal to 1 Cocoa point. The design of the OS models this as equal to 1/72 physical inch (though if you get your ruler out, you’ll find that the physical pixels don’t exactly match this). Changing Mac browsers to treat 1 CSS pixel as 3/4 Cocoa point would introduce the behavior the OP is asking for. However, that would 1) change the rendering of every website on the web, confusing users, 2) remove interop that is already present, and 3) cause integral-px borders (which are common) to get fuzzy. Changing every website because of optical sizing, which is only used on a few websites, doesn’t seem worth it. If the optical sizing value was defined to be set in CSS points, there would be a different story. However, it is defined to be set in typographic points, and macOS and iOS are correctly honoring that definition. |
Can you point me to the specification for this, because everything I've found so far suggests the opposite, that a CSS pixel is 1/96 of a standard inch (which is precisely how I recall it being defined when the move was made to make px a non device pixel measurement). I've not found anything that suggests that a CSS inch is derived from 96 CSS pixels, rather than the other way around.
Let me see if I get this straight: You're standardising 1 CSS px = 1 Cocoa pt = 1/72 standard inch. Yes? But 1 CSS px = 1/96 of a CSS inch. Yes? So, for your OS, 1 CSS inch = 1⅓ standard inch. Yes? |
https://drafts.csswg.org/css-values-3/#absolute-lengths
Yes! This is a better result than having most borders end up being fuzzy. |
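The arithmetic in this exchange can be checked directly under the stated premises (1 CSS px = 1 Cocoa point, modelled as 1/72 physical inch, and 1 CSS inch defined as 96 CSS px):

```python
# Premises from the exchange above: on Mac browsers,
# 1 CSS px = 1 Cocoa point, modelled by the OS as 1/72 physical inch;
# per CSS Values and Units, 1 CSS inch = 96 CSS px.
cocoa_point_in_inches = 1 / 72
css_px_in_inches = cocoa_point_in_inches            # Mac browsers' choice
css_inch_in_physical_inches = 96 * css_px_in_inches

# 96/72 = 4/3: under this model a "CSS inch" measures 1 1/3 physical inches.
```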
What should anyone do to move this forwards? |
I submit that, with the improvements from @fantasai and @jfkthame, we can assemble a proposal that:
Recall that @fantasai’s suggestion to add
Recall @jfkthame’s suggestion to specify a length rather than a number. Lengths will typically be in
By combining the two suggestions we have a way of selecting a particular opsz axis setting. Here is a list of examples:

font-optical-sizing: 14pt; /* use whatever opsz the browser wants to use for 14pt text */
font-optical-sizing: 14px; /* use whatever opsz the browser wants to use for 14px text */
font-optical-sizing: 14pt pt; /* use 14 opsz */
font-optical-sizing: 14px px; /* use 14 opsz */
font-optical-sizing: 14; /* use 14 opsz */
font-optical-sizing: pt; /* use pt as the unit for matching font-size to opsz */
font-optical-sizing: px; /* use px as the unit for matching font-size to opsz */
font-optical-sizing: 0.5em; /* use 0.5× whatever opsz the browser would have chosen for this font-size */
font-optical-sizing: 2em; /* use 2× whatever opsz the browser would have chosen for this font-size */
font-optical-sizing: 0.5em pt; /* use 0.5× current font-size in pt as the opsz value */
font-optical-sizing: 0.5em px; /* use 0.5× current font-size in px as the opsz value */
font-optical-sizing: 0.5rem pt; /* use 0.5× root font-size in pt as the opsz value */
font-optical-sizing: 0.5rem px; /* use 0.5× root font-size in px as the opsz value */

It may be useful, as @davelab6 suggests, to allow the components of the value to be specified separately:

font-optical-sizing-length: <length> | <number>;
font-optical-sizing-unit: auto | px | pt;

Note that without the
What we have not done above is to specify what the browser default behaviour should be, e.g.:

font-optical-sizing: 1.0em px; /* for screen media (led by Safari, followed by others) */
font-optical-sizing: 1.0em pt; /* for print media */ |
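The semantics of the proposed values can be modelled with a small evaluator (a non-normative sketch; the function and parameter names are invented, and the browser's own "auto" choice is simplified to a plain 1:1 mapping):

```python
def effective_opsz(font_size_px: float,
                   multiplier: float = 1.0,
                   unit: str = "px") -> float:
    """Model of the proposed font-optical-sizing behaviour.

    multiplier -- the scaling factor (e.g. 0.5 for a `0.5em` value)
    unit       -- "px" or "pt": which CSS unit of the font-size
                  is mapped 1:1 onto the opsz axis
    """
    # Express the font-size in the chosen unit (1px = 0.75pt in CSS).
    size_in_unit = font_size_px if unit == "px" else font_size_px * 72 / 96
    return multiplier * size_in_unit

# `font-optical-sizing: px`       at 16px text -> opsz 16
# `font-optical-sizing: pt`       at 16px text -> opsz 12
# `font-optical-sizing: 0.5em pt` at 16px text -> opsz 6
```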
That suggestion was made before the opsz axis definition in the OT spec was revised and clarified, when it seemed that different environments might interpret the unit scale differently when using |
I believe @fantasai’s suggestion still makes sense. The OT spec definition of opsz makes no mention of CSS, but CSS and other implementations must base their opsz unit scale on something, and that’s what we’re trying to clarify, and offer control of, here for CSS. That something — in current shipping browsers — is two 1:1 mappings:
Thus text in a fixed font-size set in a font with an opsz axis uses different glyph shapes on screen and in print. Having the option to fix the browser to one of these scales, so as to guarantee identical glyph shapes for screen and print, seems to be a reasonable desire. Not default, but something that a CSS author should have available to them. BTW nothing in the proposal prevents a future or special-purpose user agent (such as a browser optimized for accessibility) having a completely different default from Safari. In fact the proposal provides a flexible model for defining defaults. |
Right. The reason the opsz definition doesn’t mention CSS is that CSS doesn’t have a unit that reliably corresponds to the physical 1/72 inch typographic point that is referenced there and elsewhere in the OpenType specification. So the opsz definition is worded in such a way as to make clear that if implementers want to make accurate selection of opsz instance as designed then they need to calculate the value—possibly taking into account known or presumed distance from reader—and not just presume a 1:1 mapping from CSS units. And I stressed that ‘if’ because I wrote the opsz definition text in the knowledge that implementers were going to have different priorities, and that while some would opt for accuracy, some would seek expedience, and some settle for ‘close enough’. While I can see the usefulness you describe, my concern with CSS syntax that bolts opsz axis units to CSS units is that it perpetuates the idea that such 1:1 mappings are a reasonable way—or even a good way—to make opsz instance selection. |
User agents, in the context of their OS and device, already make significant decisions based on presumed distance. A fundamental result of this is the physical size of the
It may well be that, longer term, CSS needs an improved mechanism for determining apparent text size, in which case a new unit, say
I suggest we move forward with this proposal, and in parallel consider whether something like |
I think so. Having worked to get that clarification made, I'm aware of absolutely no effect of that work. It would be ideal if the default auto behavior was as the OpenType spec describes. But I'll settle for being able to configure to achieve the specced behavior, and being able to configure for other behavior.
Kindly this seems totally backwards. The browsers have unfortunately hard coded a 1:1 relationship, which is incorrect per OpenType, and this proposal is exactly about undoing that and allow different ratios. Different units helps. |
I also wonder if it's time for another axis/axes, like "ptsz" and or "pxsz", which can be more carefully implemented, and opsz can be deprecated. I also hesitate to suggest that, despite the work to reify the opsz definition as real physical pt, at some point the OpenType spec has to concede that implementations did something else and document what is actually implemented. |
I wish we were aiming for defining intended arc minutes in the eye. (Amin or arcm anyone?) But we are not looking at trying to calculate that and especially not responsively. So we are arriving at a best guess through assumed reader distance + reproduced scale. The fact that we are working to what is ultimately experienced in arc minutes in a such an oblique way seems like a big part of the reason discrepancies in terminology and definition are making progress hard. My comment is I admit a bit too casual. But I don’t like the idea of settling for pixel or reproduced size definitions that don’t include the lived reading experience except by implication.
On May 15, 2021, at 11:12 AM, Dave Crossland ***@***.***> wrote:
I also wonder if it's time for another axis/axes, like "ptsz" and or "pxsz", which can be more carefully implemented, and opsz can be deprecated.
|
We are aiming towards that. Device makers already fix the true length of 1px in their browsers using predicted reading distance as the key metric (with adjustments for actual pixel size to preserve sharpness), hence roughly constant arc minutes per px across devices. The proposal before us lets us tune this existing automation with custom formulae for opsz, in particular in case the 1px:1opsz default ratio benefits from some adjustment (it probably does!). I’d love this thing to move forward so that device makers and practising web typographers can set opsz’d type on real websites and measure reading experiences in ways that were not possible before, working with fonts including several high quality variable opsz fonts that have recently been published with libre licenses. Researchers can use actual reading distance and true px size as additional signals in their font-optical-sizing formulae. This proposal is not a panacea. It’s a significant step towards a panacea, and — with use & research — will help us define that panacea better. |
The problem with an arc minute unit or axis is that it implies correct knowledge of distance from eye to surface, which is not commonly available. |
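For what it's worth, the arc-minute framing is simple to compute once a reading distance is assumed, which is exactly the input that is not commonly available (the 28-inch arm's-length distance below is an illustrative assumption):

```python
import math

def arcmin_subtended(size_mm: float, distance_mm: float) -> float:
    """Visual angle, in arc minutes, subtended by an object of size_mm
    viewed from distance_mm."""
    return math.degrees(2 * math.atan(size_mm / (2 * distance_mm))) * 60

# A nominal 1/96in CSS px (~0.265mm) viewed from an assumed arm's length
# of 28in (~711mm) subtends roughly 1.28 arc minutes; change the assumed
# distance and the apparent size changes proportionally.
```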
We’ve been down this road before with every optical size selection mechanism introduced into OpenType: the size GPOS feature, the OS/2 size range values, and now opsz axis. We’ve been down similar roads with vertical metrics and several other aspects of fonts, adding bits and flags and new structures to work around failures to implement correctly. And then everyone complains about how messy the OT spec is.
See earlier discussion about impossibility of designing size-specific glyphs for unspecific sizes. So long as ‘what is actually implemented’ is 1:1 mapping of opsz values to non-physical units—instead of calculating between those units and the physical units of the type design space—it cannot be designed for. That is the issue for type designers and font makers. Now maybe it is simply going to be the case that CSS opsz is going to be a kind of endlessly adjustable fuzzy implementation, in which there isn’t actually any expected level of accuracy in design size selection, just something that might be somewhere in the ballpark, and type designers will stick to making size-specific designs primarily for print. And I am, perhaps surprisingly, sort of okay with that, because the unreliability of scaling anything on the web to be the same size in two different places seems a characteristic of the medium, so size-specific design was always going to be a problem. |
Came across https://docs.microsoft.com/en-us/archive/blogs/fontblog/where-does-96-dpi-come-from-in-windows while looking for something else - a nice note from 2005 on a topic related to this :) |
This is an excellent example of how nonsense ages into legacy nonsense.
On May 21, 2021, at 1:28 PM, Dave Crossland ***@***.***> wrote:
Came across https://docs.microsoft.com/en-us/archive/blogs/fontblog/where-does-96-dpi-come-from-in-windows while looking for something else - a nice note from 2005 on a topic related to this :)
|
@Lorp in your May 12 proposal am I correctly understanding that you are proposing no changes to all current browser implementations? Assuming so, I think there's 2 things to do. First, I think @litherum 's issue 1102532 on the Chromium bug tracker, proposing Chromium use CSS pt on Windows instead of px, be closed as "WAI". (While @litherum chose to make Webkit/Safari on macOS apply Second, I now advocate updating the OpenType spec to document what is actually implemented. @Lorp put it concisely:
|
No, I’m proposing no changes to default behaviour in current Apple browser implementations.
I’m no expert on cross-platform philosophy, but it seems a pity if content creators have to bear in mind a sporadic 4/3 factor if they care about precise glyph shapes, even when they specify font size and know the size of device they’re dealing with.
I’m not sure about it going into the spec like that, even if it is universal behaviour in browsers. It’s very much a CSS thing, and OpenType is not beholden to CSS, is it? It makes sense for such a compromised implementation to remain browsers’ default only if it’s easy to customize. And BTW there’s no real reason it needs to remain the default forever. |
You advocate making design for optical size impossible? Type designers have to be able to target specific sizes in a reliable unit, and to assign those designs to values on a reliable scale. How that scale gets interpreted and translated into other units is outside our control, but something needs to be in our control so we can do our job. We cannot design for a specific size and map it to e.g. opsz=9 if that means 9 px in screen media and 9 pt in print media. Trying to do so is no longer optical size design, it’s vaguely-bigger–or–smaller design. As I wrote earlier, I’m okay with fuzzy interpretation of opsz in browsers, because it’s characteristic of the medium that things at the same nominal size are different sizes in different places, but that fuzziness belongs at the interpretation and implementation level—where it can be more or less precise, more or less faithful to the opsz spec and the type designer’s intent as the implementor chooses—and not baked into that spec in a way that makes the whole concept of opsz design meaningless. |
The systems are implemented the way they are implemented. It seems irresponsible to pretend otherwise. |
I’m not pretending otherwise. I am saying your proposal doesn’t work as a solution. You proposed using different opsz scale units for web and for print. That isn’t workable because 9pt and 9px are different sizes in very many places, so the type designer cannot design something at ‘9’ on the opsz axis scale that is optically tuned for a specific size if that means 9pt in one medium and 9px — variously rendered — in a whole bunch of other media. A solution might be to register a second axis, so there are dedicated optical size axes for pt sizes in print and px sizes in web (and yes, there is an argument to be made viz existing implementations to redefine the existing opsz axis in terms of px, and hence to define a new axis in terms of pt). But the problem remains that px is not a size unit, so from my perspective this whole issue isn’t about the axis scale definition but about interpretation of the axis scale. Even if you have an axis scale that you define in terms of e.g. the CSS reference pixel — which is as close as this stuff gets to a size unit, so something a type designer could try to target — you still have interpretation of that scale in rendering to px in various environments that will be only more or less accurate in terms of hitting what the type designer intended. As I said previously, I am okay with that uncertainty, so long as it is at the interpretation level and not something that is forced down onto the level where the type designer is trying to do optical size design. |
As a user I definitely prefer "same system" consistency to cross-platform consistency. As I don't use Safari or own a Mac, being consistent with that for me really means "inconsistent with every other app on my device". Why not be inconsistent with macOS, then, to follow other platforms? I also wonder how this is going to interact with things like Electron and Windows 11 using a variable UI font: are we going to end up with every Electron app using the wrong instance of the UI font on Windows because it's trying to follow the type-sizing notions of macOS? If Android eventually gains one, the problem would just spread there as well. |
The main issue that you’re pointing at, davelab6, is that the systems were implemented in a way so humans must adjust to that implementation, which is different from the way humans can adjust to print. Humans adjust to small type by moving it closer or farther away, and to large type, often by moving their feet. Humans were expected to move their chairs farther from windows, and closer to the Mac, when those systems each assumed their own ideal distance, without specifying what that was. On the other hand, corresponding with @tiroj on this last year, during the discussion of his heroic rewrite of the spec, I found we disagreed fundamentally on opsz. John wrote to me, “…if I have to move closer to read 6 point, opsz has failed.” This is a point of disagreement I now have time to address, as I believe the user is the final arbiter of accessibility, and it’s always been thus. We, type designers, do not control the user distance, but rather plan for what’s normal, and count on the OS to present choices the user can adjust to. That same correspondence yielded the request that I agree with the principle that we, type designers, view all the opsz we design from one distance, or people will be confused by the spec’s documentation. This too is impossible to agree with, as small opsz have small ranges of accessibility while larger sizes have large ranges of accessibility. I look at type designed for opsz from many distances. When Apple left the 1px = 1pt and did not document the change, as an Apple developer, I complained. When MS deprecated the 72px per inch option in Vista, as a Windows developer I complained. And when W3C adopted that “96”, and its view-angle nonsense, I objected. I continue to object because print exists and is not going to change. Accessibility for users of uncommon vision cannot be forced forever to zoom, and users of world scripts with a different opsz scale from Latin will not be encouraged by such a crappy spec or its twisted implementations.
Three such groups, print users, accessibility users, and world-script users, should’ve been enough a long time ago to make this not about “same system” or “some system”, because it’s about one human audience. |
To clarify, I was talking specifically about myself in that context, and not about users in general. I can’t account for everyone elses’ eyesight, or the lighting conditions, or other factors that may affect a decision to adjust distance in the interest of comfort. I was talking about the same ‘normal’ as you: that if I have planned for that normal correctly, designing to an optical size at a presumed ‘normal’ distance, then I should not have to change that distance in order to read the designed optical size at that size. If I do need to change the distance, then I think probably I have designed some different optical size and given it the wrong number. |
@litherum I was happy to see this; it suggests there is hope for sizing with real points on macOS :) https://twitter.com/nikitonsky/status/1539636230371123200?t=TB6A1ELGpnaCFQBpNVbexQ&s=19
Hello all, I'm expressing support for this proposal on behalf of Dalton Maag, in particular for the option of a numerical ratio value.

To give a concrete example of why we believe this matters: we recently released the typeface Marble Arch, and the designers who defined and tested the behaviour of the opsz axis found that the observed difference between web browsers and print was so significant that we considered releasing two versions of the font, one specifically for use on the web and another for use in desktop applications. In the end we decided it would be too confusing and inconvenient for users to go down that route, and we released only a single version, whose axis uses the definition of the OT spec and looks best in desktop publishing apps. However, this means that the font currently looks "more delicate" than intended in most browsers, and that conditional glyphs (such as the ear of the lowercase /g) trigger at a lower font size than the designers intended.

Another argument for consistency between browsers and desktop applications is Figma and other web-design apps, which run as desktop applications and can automatically export HTML and CSS. The inconsistent treatment of the opsz axis means that the exported designs would render inaccurately in browsers.
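The discrepancy described in this thread can be made concrete with a short calculation. Below is a minimal illustrative sketch (the opsz design-switch threshold of 14 is a hypothetical value, not taken from Marble Arch or any real font) showing how a browser that feeds CSS px into opsz crosses a designer's point-based threshold earlier than a desktop app that uses points:

```python
# Illustrative sketch of the px-vs-pt discrepancy in opsz interpretation.
# The threshold of 14 is hypothetical, not taken from any real font.

PT_PER_CSS_PX = 0.75  # 1px = 1/96 in and 1pt = 1/72 in, so 1px = 0.75pt


def opsz_px_interpretation(font_size_px):
    """Current browser behaviour: opsz is set to the font size in CSS px."""
    return font_size_px


def opsz_pt_interpretation(font_size_px):
    """OpenType-spec behaviour: opsz is the font size in points."""
    return font_size_px * PT_PER_CSS_PX


# A hypothetical font switches to a more delicate design at opsz >= 14.
THRESHOLD = 14

# For the same 16px body text the two interpretations disagree:
assert opsz_px_interpretation(16) == 16   # crosses the threshold in browsers
assert opsz_pt_interpretation(16) == 12   # stays below it in print/desktop
```

For the same 16px setting, the two interpretations land on opposite sides of the hypothetical threshold, which matches the report of conditional glyphs triggering at a smaller size than designed.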
Introduction

This proposal extends the CSS Fonts Module Level 4 `font-optical-sizing` property by allowing numerical values to express the ratio of CSS `px` units to the units used in the `opsz` OpenType Font Variation axis. The ratio is intended to be multiplied by `font-size`, measured in `px`, allowing control over the automatic selection of particular optical size designs in variable fonts.

The proposal resolves the conflicting implementations of the `font-optical-sizing: auto` behaviour, and provides additional benefits for font makers, CSS authors, and end-users.

Examples
- `font-optical-sizing: 1.0;` – current Apple Safari behaviour, where 1px = 1 opsz unit
- `font-optical-sizing: 0.75;` – Apple TrueType and OpenType behaviour, where 1px = 0.75 opsz units (1px = 0.75pt in many user agents)
- `font-optical-sizing: 0.5;` – custom behaviour where 2px = 1 opsz unit, to “beef up” the text (e.g. in an accessibility mode for visually impaired end-users)
- `font-optical-sizing: 2.0;` – custom behaviour where 1px = 2 opsz units, to reduce the “beefiness” of the text (suitable for large devices)
- `font-optical-sizing: auto;` – use the `font-optical-sizing` ratio defined in the user agent stylesheet

Background
OpenType Font Variations in CSS
When the OpenType Font Variations extension of the OpenType spec was being developed in 2015–2016, Adam Twardoch and Behdad Esfahbod proposed the addition of the low-level `font-variation-settings` property to the CSS Fonts Module Level 4 specification, modeled after `font-feature-settings`.

For higher-level control of font variations, there was general consensus that the `font-weight` property would be tied to the `wght` font axis registered in the OpenType specification, `font-stretch` would be tied to `wdth`, while `font-style` would be tied to `ital` and `slnt`.

OpenType opsz variation axis and CSS font-size
The consensus was that the CSS `font-size` property could be tied to the axis registered for optical size, `opsz`. The `opsz` axis provides different designs for different sizes. Commonly, a lower value on the `opsz` axis yields a design that has wider glyphs and spacing, thicker horizontal strokes, and a taller x-height. The OpenType spec suggests that “applications may choose to select an optical-size variant automatically based on the text size”, and states: “The scale for the Optical size axis is text size in points”. Apple’s TrueType Variations specification (on which OpenType Font Variations is based) also mentions point size as the scale for interpreting the `opsz` axis: “'opsz', Optical Size, Specifies the optical point size.” It is notable that neither the OpenType spec nor Apple’s TrueType spec addresses the interpretation of `opsz` values in environments where the typographic point is not usefully defined.

Optical sizing introduces a new factor in handling text boxes in web documents. If the font size of a text box changes, the proportions of the box do not remain constant, because of the non-linear scaling of the font: typically the width grows at a slower rate than the height, owing to the optical compensations in typeface design mentioned above. Realizing that many web documents may rely on the assumption of linear scaling, Twardoch proposed an additional CSS property, `font-optical-sizing`:

- `auto`: “enables” optical sizing by tying the selection of a value on the `opsz` axis to the font size change
- `none`: “disables” optical sizing by untying that selection, so font size change happens linearly

The `font-optical-sizing` property is currently part of the CSS Fonts Module Level 4 working draft.

Controversy: opsz axis and CSS font-size (px vs. pt)
Unfortunately, recent browser developments introduced ambiguity in how `opsz` values should be interpreted:

~~Most browser implementers interpret `opsz` as expressed in CSS `pt` units (points). If optical sizing is enabled, all text has its `opsz` axis set to the value of the font size in `pt`.~~ [In fact, Chrome and Firefox, as well as Safari, interpret `opsz` in `px` units. Updated thanks to @drott’s comment below.]

Apple in Safari has decided to interpret `opsz` as expressed in CSS `px` units (pixels). If optical sizing is enabled, all text has its `opsz` axis set to the value of the font size in `px`.

Font makers and typographers are upset at Apple’s decision. They design fonts with the assumption that `opsz` is expressed in points. Since `px` values are commonly higher than `pt` values (typically at a ratio of 4:3), interpreting `opsz` in `px` means that a higher optical size will be chosen than intended. For 9pt/12px text, the `opsz` design `12` will be chosen, which will yield text that is too thin, too tightly spaced, and potentially illegible. They argue that the user experience will degrade, and optical sizing will actually yield worse results than no optical sizing, effectively defeating the whole purpose and unjustly giving variable fonts a bad reputation. Inconsistent behaviour with the same font will cause problems for font makers and CSS authors.

Apple defends this decision, suggesting that CSS authors can simply set `font-variation-settings: 'opsz' n`.

CSS authors object that using `font-variation-settings` breaks the cascade for font styling and, because of the nature of optical size, is unsuitable for application at the document root level. Therefore it will not get used.

Proposed resolution: numerical values in font-optical-sizing
The CSS `font-optical-sizing` property currently controls the relationship between `font-size` and `opsz` by means of a simple switch (`auto`/`none`). We propose to allow a numeric value for `font-optical-sizing`. This value expresses the ratio of `opsz` units to CSS `px`. Examples:

- `font-optical-sizing: 1.0;` – current Apple Safari behaviour, where 1px = 1 opsz unit
- `font-optical-sizing: 0.75;` – Apple TrueType and OpenType behaviour, where 1px = 0.75 opsz units (1px = 0.75pt in many user agents)
- `font-optical-sizing: 0.5;` – custom behaviour where 2px = 1 opsz unit, which would “beef up” the text (suitable for very small devices)
- `font-optical-sizing: 2.0;` – custom behaviour where 1px = 2 opsz units, which would reduce the “beefiness” of the text (suitable for large devices)
- `font-optical-sizing: auto;` – use the `font-optical-sizing` ratio defined in the user agent stylesheet

Results
User agents can ship with a default `font-optical-sizing` other than 1.0. (The CSS specification might recommend 0.75 as a reasonable default for most situations.)

Font makers can ship a single font whose `opsz` axis works as intended in browsers as well as in print.

CSS authors can change the value whenever they like, independently of the choices made by browser vendors and font makers.

CSS authors can specify a different `font-optical-sizing` ratio for different media queries, including print, or for aesthetic purposes.

End-users can be offered accessibility modes that choose low values for `font-optical-sizing` to ensure lower-than-default `opsz` values and more legible text.

Note
Proposers
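The proposed mapping from font size to `opsz` can be summarized in a minimal sketch. This is an illustration only: `resolve_opsz` and `UA_DEFAULT_RATIO` are hypothetical names, and the user agent default of 0.75 is the value the proposal suggests the specification might recommend, not a shipped behaviour:

```python
# Minimal model of the proposed numeric font-optical-sizing values:
# opsz = ratio * font-size (in CSS px); 'auto' uses the UA stylesheet
# ratio, assumed here to be the 0.75 default suggested in the proposal.

UA_DEFAULT_RATIO = 0.75  # hypothetical user agent stylesheet value


def resolve_opsz(font_size_px, optical_sizing="auto"):
    """Return the opsz axis value selected for a given font size in px."""
    if optical_sizing == "auto":
        ratio = UA_DEFAULT_RATIO
    else:
        ratio = float(optical_sizing)
    return ratio * font_size_px


# 12px text under the example values from the proposal:
assert resolve_opsz(12, 1.0) == 12    # current Safari behaviour
assert resolve_opsz(12, 0.75) == 9    # OpenType/TrueType point-based scale
assert resolve_opsz(12, 0.5) == 6     # lower opsz: sturdier, "beefed up" text
assert resolve_opsz(12, 2.0) == 24    # higher opsz: more delicate text
assert resolve_opsz(12) == 9          # 'auto' falls back to the UA ratio
```

Note that `font-optical-sizing: none` (which disables optical sizing entirely) is deliberately omitted from this sketch, since it maps to no `opsz` value at all rather than to a ratio.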