
Should EnforceRange be allowed on consts and return types? #842

Open
kainino0x opened this issue Feb 11, 2020 · 23 comments

Comments

@kainino0x

WebIDL specifies that [EnforceRange] types are not valid for readonly attributes. However, it doesn't seem to specify whether they are valid for:

- consts
- return types

It seems like all 3 cases should be treated the same.

That said, I would argue that [EnforceRange] should be valid in all of these cases, just with no effect. This makes sense because: (a) it allows a typedef of an [EnforceRange] integer type to be used consistently, and (b) it matches the behavior of reading from a non-readonly attribute (where [EnforceRange] is meaningless/ignored).
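
As a sketch, the typedef pattern in question looks roughly like this (illustrative only, not WebGPU's actual IDL; the interface and member names are made up):

```webidl
typedef [EnforceRange] unsigned long GPUBufferUsageFlags;

interface GPUExample {
  // Input position: [EnforceRange] changes the JS-to-IDL conversion.
  undefined setUsage(GPUBufferUsageFlags flags);

  // Output positions: under this proposal the typedef would still be
  // accepted here, with [EnforceRange] simply having no effect.
  readonly attribute GPUBufferUsageFlags usage;
  GPUBufferUsageFlags getUsage();
};
```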

@annevk
Member

annevk commented Feb 11, 2020

If specification text returns an out-of-range number for a getter or method, would that trigger an assert or should it be coerced?

@Ms2ger
Member

Ms2ger commented Feb 11, 2020

I think that should be a spec error in any case, even if EnforceRange isn't present.

@annevk
Member

annevk commented Feb 11, 2020

That seems fine for this case (and let's assert for it in IDL). (I think sometimes coercion or some flexibility around types makes sense, but I agree that for clearly erroneous cases it would be better to assert.)

@domenic
Member

domenic commented Feb 11, 2020

I would strongly prefer not to allow these sorts of no-op extended attributes. They muddle the purpose of the extended attribute, which is to operate on inputs by changing how JS type conversions are done.

@annevk
Member

annevk commented Feb 11, 2020

Well, that's still the use case, but they want to reuse that "type" for other purposes.

@domenic
Member

domenic commented Feb 11, 2020

I don't quite understand; can you give an example of a case where you cannot remove the extended attribute when using it as a return value or const?

@annevk
Member

annevk commented Feb 11, 2020

OP gave an example of GPUBufferUsageFlags which seems somewhat reasonable to me. (Assuming that WebGL's rather exceptional usage of IDL is okay.)

@domenic
Member

domenic commented Feb 11, 2020

In that case I would suggest using [EnforceRange] GPUBufferUsageFlags for inputs, and GPUBufferUsageFlags for outputs. This also makes the spec a lot easier to read, IMO.
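
Concretely (a sketch only; the member names are hypothetical), that suggestion amounts to:

```webidl
typedef unsigned long GPUBufferUsageFlags;

interface GPUExample {
  // Input: apply [EnforceRange] at the use site.
  undefined setUsage([EnforceRange] GPUBufferUsageFlags flags);

  // Output: the bare typedef.
  readonly attribute GPUBufferUsageFlags usage;
};
```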

@kainino0x
Author

kainino0x commented Feb 11, 2020

In WebGPU, we want the strongest possible typing on every input integer, because any kind of coercion would be extremely confusing to developers. I don't find it easier to read if we have to pepper [EnforceRange] on EVERY integer type throughout the entire spec, which is why I proposed the typedefs in WebGPU (linked above). Plus, it is error-prone, as it's never obvious when an [EnforceRange] is missing - whereas, with the typedefs, any usage of a bare long is an error.

If WebIDL won't allow [EnforceRange] on output types, then I would like to keep the typedefs and cherry-pick the places where we have to use undecorated integers.

Could we get text in the spec about consts/return types/anything else, and ideally more validation checks (that would run in bikeshed) for invalid usages of [EnforceRange]?

@kainino0x
Author

We are trying to figure out whether we should comply with this hypothetical rule (that [EnforceRange] shouldn't be allowed on consts). Could we get some resolution on this in the WebIDL spec?

FWIW, I think that we (the WebGPU editors) still would prefer if typedefs containing [EnforceRange] were allowed on all output types (readonly attribute, const, and return types). cc @kvark

@kainino0x
Author

Here's a PR where we were making the change: gpuweb/gpuweb#928

@domenic
Member

domenic commented Aug 3, 2020

Consider it resolved! When we have editing bandwidth we can add it to the spec proper. (A pull request would also be lovely.)

@kainino0x
Author

Thanks for the quick reply!

@kainino0x
Author

Just to clarify, we should consider it resolved that [EnforceRange] is disallowed on consts and return types. Is that correct?

@litherum

litherum commented Aug 3, 2020

> Consider it resolved!

I can't determine from this thread which way it was resolved. Do you think you could clarify?

As an aside: Having to have two sibling typedefs, one with [EnforceRange] and one without it, seems bad for clarity. And the alternative of having one typedef without it, but sprinkling [EnforceRange] around the spec in various places seems even worse (because it's non-obvious when you've missed one). IMO, allowing no-op [EnforceRange]s is valuable for type consistency and spec simplicity. Being able to put [EnforceRange] in a typedef, and use the typedef everywhere and have [EnforceRange] do its enforcement where it makes sense and do nothing where it doesn't, seems like the best outcome.

@domenic
Member

domenic commented Aug 3, 2020

> Just to clarify, we should consider it resolved that [EnforceRange] is disallowed on consts and return types. Is that correct?

Correct!

@domenic
Member

domenic commented Aug 3, 2020

> As an aside: Having to have two sibling typedefs, one with [EnforceRange] and one without it, seems bad for clarity.

In general I find the extensive use of typedefs in the Web GL specs pretty bad for clarity. They're working against the Web IDL type system, which assumes everyone will use a common vocabulary for things like unsigned long, instead of typedefing primitives. So it's understandable that you'll run into trouble when working in that type of situation.

@litherum

litherum commented Aug 3, 2020

> In general I find the extensive use of typedefs in the Web GL specs pretty bad for clarity.

Is there a better solution to represent bitsets?

@domenic
Member

domenic commented Aug 3, 2020

Using the original types, e.g. unsigned long, directly, is what I would suggest.

@litherum

litherum commented Aug 3, 2020

That breaks down when there are multiple bitsets. A function that takes an unsigned long does not indicate which bitset that unsigned long represents. That's the purpose of the typedefs.
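
For example (a sketch; GPUTextureUsageFlags is used here purely as a second illustrative bitset name):

```webidl
typedef [EnforceRange] unsigned long GPUBufferUsageFlags;
typedef [EnforceRange] unsigned long GPUTextureUsageFlags;

interface GPUExample {
  // The typedef names document which bitset each parameter expects,
  // even though both are unsigned long underneath.
  undefined setBufferUsage(GPUBufferUsageFlags flags);
  undefined setTextureUsage(GPUTextureUsageFlags flags);
};
```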

@domenic
Member

domenic commented Aug 3, 2020

I think that's pretty misleading, because in JavaScript, there's no distinction between those different types of bitsets. You could easily write code that assigns a GPUSize32 bitset value to a GPUSampleMask bitset property. The IDL shouldn't hide this fact behind typedefs.

@kvark

kvark commented Aug 3, 2020

Not a JS dev by any means, but I thought you can write a lot of weird code in JS. Can't you pass the same object where different WebIDL dictionary types are required, for example? Does that make it less valuable to define different dictionary types then?
My understanding is that WebIDL spec defines what is expected in objects passed by the user. It can't enforce what these things are, since JS is weakly and dynamically typed. From this perspective, a typedef for an integer/flag does increase clarity and quality of a spec.

@domenic
Member

domenic commented Aug 3, 2020

> Can't you pass the same object where different WebIDL dictionary types are required, for example? Does that make it less valuable to define different dictionary types then?

If you made multiple dictionaries with the exact same members, then yes, I agree that would be just as misleading and against the grain of Web IDL.

> My understanding is that WebIDL spec defines what is expected in objects passed by the user. It can't enforce what these things are, since JS is weakly and dynamically typed

I wouldn't really put it that way. In particular, Web IDL defines, and allows specs to define, conversion rules for how to turn JavaScript values into Web IDL values. Having, e.g., different dictionary types (which generate different conversion rules), or different integer types (e.g. long vs. unsigned long, which generate different conversion rules), is helpful.

But if you define two exact same conversion rules under different names, that's generally confusing for readers, causing them to dig through your indirection to realize what conversion rules are actually in place. Sometimes there's a brevity benefit, especially around typedefing large unions or deeply-nested generic structures, which outweighs this confusion; that's generally why typedefs exist. But they don't exist so that you can express "perform ToNumber, round to an integer, and take the value modulo 2**32" in several different ways. We have a canonical way to do that, namely unsigned long.

To be clear, I'm not saying that we should disallow typedefing primitives, or other "simple" types, in the Web IDL spec. Where to use typedefs, and make the indirection tradeoff, is a choice that individual specification editors should make. Instead, I'm just explaining why when you attempt to use them in this way, you'll run into issues like the ones discussed in this thread.
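
The conversion rule named above ("perform ToNumber, round to an integer, and take the value modulo 2**32") and its [EnforceRange] counterpart can be sketched in JavaScript. This is a simplified model of Web IDL's ConvertToInt algorithm for unsigned long, not a complete implementation (it ignores [Clamp] and some edge cases):

```javascript
// Rough model of Web IDL's unsigned long conversion, with and
// without [EnforceRange]. Simplified sketch, not the exact spec text.
function toUnsignedLong(value, { enforceRange = false } = {}) {
  let x = Number(value); // ToNumber
  if (enforceRange) {
    // [EnforceRange]: non-finite or out-of-range values throw.
    if (!Number.isFinite(x)) throw new TypeError("not a finite number");
    x = Math.trunc(x);
    if (x < 0 || x > 2 ** 32 - 1) throw new TypeError("out of range");
    return x;
  }
  // Default behavior: NaN, ±Infinity, and ±0 become 0; otherwise
  // truncate toward zero and wrap modulo 2**32.
  if (!Number.isFinite(x) || x === 0) return 0;
  x = Math.trunc(x);
  return ((x % 2 ** 32) + 2 ** 32) % 2 ** 32;
}
```

For example, `toUnsignedLong(-1)` silently wraps to 4294967295, while `toUnsignedLong(-1, { enforceRange: true })` throws a TypeError, which is exactly the difference WebGPU wants to surface to developers on inputs.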
