
Conversation

@dcodeIO (Member) commented Dec 21, 2019

An initial implementation of inferring array literals in auto contexts from their first element, as proposed in #1007. Does not yet handle nulls.

@dcodeIO (Member, Author) commented Dec 21, 2019

Regarding nulls, there are two obvious edge cases:

  1. The array starts with one or more nulls
  2. The array has a late null, changing its element type to nullable

Plus a not-so-obvious one:

  3. The array starts with an expression that has a nullable type (like a local) but evaluates to non-null, which we can't reliably detect
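The cases can be illustrated with a small sketch (the `Ref` class and variable names here are hypothetical, chosen only to show the shapes involved, not taken from the compiler):

```typescript
class Ref {}

// case 1: the array starts with one or more nulls
const startsWithNull = [null, new Ref()];        // would need (Ref | null)[]

// case 2: a late null changes the element type to nullable
const lateNull = [new Ref(), new Ref(), null];   // would need (Ref | null)[]

// case 3: a nullable-typed local that happens to be non-null at runtime,
// which static inference cannot reliably detect
let local: Ref | null = new Ref();
const fromLocal = [local];                       // statically (Ref | null)[]
```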

@dcodeIO (Member, Author) commented Dec 21, 2019

The last commit now first resolves all elements to their common type, so something like this can be inferred as well:

var arr = [1, 2, 3.0]; // f64[]
var ref = new Ref();
var arr = [ref, null]; // (Ref | null)[]

@MaxGraey (Member) commented Dec 21, 2019

Nice! This is even better. But how much more expensive is this approach?

@dcodeIO (Member, Author) commented Dec 21, 2019

It doesn't make any difference if the type is annotated, as is the case for all of our existing code. If it isn't annotated, every element needs to be resolved, which is simple for arrays of literals but can become costly when there are lots of non-literal elements. That's uncommon, though.

One benchmark could be a large array of integer literals, as in math code, comparing annotated vs. inferred. I'd expect up to a 2x slowdown from first evaluating the elements before compiling them.
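The resolve-then-fold step can be sketched over a toy numeric lattice; `commonDenominator` here is a simplified stand-in for the compiler's `Type.commonDenominator`, not the real implementation:

```typescript
// Toy lattice for literals: i32 < i64 < f64.
type Ty = "i32" | "i64" | "f64";
const rank: Record<Ty, number> = { i32: 0, i64: 1, f64: 2 };

// Simplified common-type computation: pick the wider of the two types.
function commonDenominator(a: Ty, b: Ty): Ty {
  return rank[a] >= rank[b] ? a : b;
}

// Resolve every element first, then fold them to their common type.
function inferElementType(elems: Ty[]): Ty {
  return elems.reduce((acc, t) => commonDenominator(acc, t));
}

console.log(inferElementType(["i32", "i32", "f64"])); // "f64"
```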

@MaxGraey (Member) commented Dec 21, 2019

Btw, one possible optimization: check whether the type has reached the upper limit of subtyping — for literals we know type i32 < type i64 < type f64. If we reach f64 at some array position, we can skip the Type.commonDenominator computation and just resolve the remaining elements as f64. In that case [1.0, 0, ..., 1] would simply be the same as inferring from the first element.
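The proposed shortcut could look like this sketch (same toy i32 < i64 < f64 lattice as above; `TOP` marking f64 as the upper limit is an assumption of this illustration, not compiler code):

```typescript
type Ty = "i32" | "i64" | "f64";
const rank: Record<Ty, number> = { i32: 0, i64: 1, f64: 2 };
const TOP: Ty = "f64"; // upper limit of the literal subtyping chain

function commonDenominator(a: Ty, b: Ty): Ty {
  return rank[a] >= rank[b] ? a : b;
}

function inferWithEarlyExit(elems: Ty[]): Ty {
  let acc = elems[0];
  for (let i = 1; i < elems.length; i++) {
    if (acc === TOP) return TOP; // top of the lattice reached: skip the rest
    acc = commonDenominator(acc, elems[i]);
  }
  return acc;
}

console.log(inferWithEarlyExit(["f64", "i32", "i32"])); // "f64"
```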

@dcodeIO (Member, Author) commented Dec 21, 2019

And if we reached f64 in some places we could skip Type.commonDenominator computation

The implementation currently steps up the types, so if there's an f64 at one place it will compile the next literal with an f64 context again yielding an f64. Perhaps a fast path doing nothing if left == right is already enough.
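That fast path would be a one-line check at the top of the common-type computation; a minimal sketch, again over the toy lattice rather than the real `Type.commonDenominator`:

```typescript
type Ty = "i32" | "i64" | "f64";
const rank: Record<Ty, number> = { i32: 0, i64: 1, f64: 2 };

function commonDenominator(left: Ty, right: Ty): Ty {
  if (left === right) return left; // fast path: nothing to compute
  return rank[left] >= rank[right] ? left : right;
}
```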

@MaxGraey (Member) commented Dec 22, 2019

Hmm, I see one disadvantage of the common denominator approach. When people explicitly write something like [2.0 as f32, 1.0, 3], they might expect arr to be inferred as f32[] instead of f64[]. In that case, though, it's probably better to explicitly declare the type for the whole array, of course.

@dcodeIO (Member, Author) commented Dec 22, 2019

I think this should produce f32[] already, unless it's not working? Only the first element uses contextualType=auto; the second then uses contextualType=f32 if the first resolved to f32, so 1.0 would be interpreted as f32, as would 3.
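That propagation can be sketched like so (the `Elem` shape and `resolve` function are hypothetical simplifications for illustration, not the compiler's API):

```typescript
type Ty = "f32" | "f64";

// A literal may carry an explicit annotation like `2.0 as f32`.
interface Elem { annotated?: Ty; }

// Only the first element sees an `auto` (null) context; later literals
// adopt the contextual type resolved so far.
function resolve(e: Elem, ctx: Ty | null): Ty {
  if (e.annotated) return e.annotated; // explicit cast wins
  return ctx !== null ? ctx : "f64";   // default float literal type is f64
}

function inferArray(elems: Elem[]): Ty {
  let ctx: Ty | null = null;
  for (const e of elems) ctx = resolve(e, ctx);
  return ctx as Ty;
}

// [2.0 as f32, 1.0, 3] → f32
console.log(inferArray([{ annotated: "f32" }, {}, {}])); // "f32"
```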

@MaxGraey (Member)

Could you add a test for this case?

@MaxGraey (Member) commented Dec 29, 2019

Found some problematic cases:

let arr1 = [null, "a"]; // should compile but produces an error
let arr2 = [null];      // should produce an error but compiles
let arr3 = [1, null];   // should produce an error but compiles for now

https://webassembly.studio/?f=wavzyboc05
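For comparison, TypeScript itself (with strictNullChecks enabled) infers these same literals as nullable unions rather than erroring:

```typescript
const arr1 = [null, "a"]; // (string | null)[]
const arr2 = [null];      // null[]
const arr3 = [1, null];   // (number | null)[]
```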

@dcodeIO dcodeIO deleted the infer-array branch January 1, 2020 14:21
@MaxGraey (Member) commented Jan 4, 2020

Another potential improvement:

// nested arrays
let arr1 = [[1], [2]]; // currently produces a compiler error
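Nested literals could plausibly be inferred recursively: infer each inner array's element type first, then fold the resulting inner types the same way. A sketch over the toy lattice (not the compiler's implementation):

```typescript
type Ty = "i32" | "i64" | "f64";
const rank: Record<Ty, number> = { i32: 0, i64: 1, f64: 2 };

function commonDenominator(a: Ty, b: Ty): Ty {
  return rank[a] >= rank[b] ? a : b;
}

// Infer the element type of a nested literal by recursing into each
// inner array, then folding the inner element types together.
function inferNested(arr: Ty[][]): string {
  const inner = arr.map(xs => xs.reduce((a, b) => commonDenominator(a, b)));
  return `Array<${inner.reduce((a, b) => commonDenominator(a, b))}>`;
}

console.log(inferNested([["i32"], ["i32"]])); // "Array<i32>"
```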
