Why does 'with' throw on out-of-bounds access #75
Comments
I'm not sure I understand. They do throw a RangeError, not a TypeError - in step 6 of https://tc39.es/proposal-change-array-by-copy/#sec-array.prototype.with. Is the question why they don't cycle around, at/slice-style? |
Related data points:

[0].at(99); // undefined

ImmutableJS:
[...Immutable.List([]).set(3, 'v')]; // [undefined, undefined, undefined, 'v']
[...Immutable.List([]).set(-1, 'v')]; // ['v'] |
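For reference, the spec'd behavior from step 6 can be approximated with a small standalone sketch (`arrayWith` is a hypothetical stand-in for `Array.prototype.with`, not the actual spec algorithm):

```javascript
// Minimal sketch of the proposal's behavior: negative indices count
// from the end, and anything still out of bounds throws a RangeError
// rather than clamping, ignoring, or expanding.
function arrayWith(arr, index, value) {
  const len = arr.length;
  const actual = index < 0 ? len + index : index;
  if (actual < 0 || actual >= len) {
    throw new RangeError('Invalid index');
  }
  const copy = Array.from(arr);
  copy[actual] = value;
  return copy;
}

console.log(arrayWith([0], 0, 'v'));       // ['v']
console.log(arrayWith(['a', 'b'], -1, 'v')); // ['a', 'v']
```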
My mistake, but the intention is: why do they throw an error at all? Arrays in particular are super sloppy; I can't think of another method that throws. Usually, they clamp the value to be between 0 and the length. |
Some alternative options:

// A) Ignore
[0].with(5, 'v'); // [0]
// B) Clamp
[0].with(5, 'v'); // ['v']
// C) Expand
[0].with(5, 'v'); // [0, undefined, undefined, undefined, undefined, 'v']
[0].with(-5, 'v'); // ['v'] // still clamp Math.max(0, index) |
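The three alternatives above can be sketched as standalone helpers for comparison (`withIgnore`, `withClamp`, and `withExpand` are made-up names for illustration, not proposed APIs):

```javascript
// A) Ignore: an out-of-bounds write returns an unchanged copy.
function withIgnore(arr, i, v) {
  const copy = [...arr];
  if (i >= 0 && i < arr.length) copy[i] = v;
  return copy;
}

// B) Clamp: force the index into [0, length - 1] before writing.
function withClamp(arr, i, v) {
  const copy = [...arr];
  copy[Math.min(Math.max(i, 0), arr.length - 1)] = v;
  return copy;
}

// C) Expand: grow the array for large indices, clamping negatives to 0.
function withExpand(arr, i, v) {
  const out = [...arr];
  out[Math.max(0, i)] = v;
  return [...out]; // a second spread converts any holes to undefined
}

console.log(withIgnore([0], 5, 'v')); // [0]
console.log(withClamp([0], 5, 'v'));  // ['v']
console.log(withExpand([0], 5, 'v')); // [0, undefined, undefined, undefined, undefined, 'v']
console.log(withExpand([0], -5, 'v')); // ['v']
```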
Thanks! Yes, corrected that now 😅 Another data point:

(new Uint8Array(1)).set([], 99); // RangeError! |
There’s really not another method like with on arrays or typed arrays anywhere to compare with. I think it’s different enough from the other array methods that it would be fine for it to be slightly different. Is there any non-consistency reason why the current behavior isn’t preferred? |
That’s the == case; what about if it’s larger? Like the length is 2 and the index is 5. |
Expanding behavior would make the most sense to me, rather than clamping. I'm not saying we need to change the behavior; it was just something I noticed during my review. |
So there are these options; I'm not sure what else there would be. Silently clamping seems like it'd cause lots of bugs. |
It would probably just be ignored, returning a new array slice. |
I appreciate you bringing it up; it's good to ensure these things are discussed rather than silently slipping into the spec. It could make sense to match Array's behavior of expanding the length when assigning beyond the end, but filling with undefined rather than holes. For comparison, regular assignment beyond the end creates holes, which spread then converts to undefined:

const a = [0, 1];
const a2 = [...a];
a2[3] = 3;
a2; // [0, 1, empty, 3]
const a3 = [...a2];
a3; // [0, 1, undefined, 3] |
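The hole/undefined distinction in the snippet above can be made explicit with the `in` operator (a minimal self-contained sketch of the same steps):

```javascript
const a2 = [0, 1];
a2[3] = 3; // plain assignment past the end leaves a hole at index 2
console.log(2 in a2); // false: index 2 is a hole, not undefined

const a3 = [...a2]; // spreading converts holes to real undefineds
console.log(2 in a3); // true: index 2 now holds the value undefined
console.log(a3[2]);   // undefined
```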
I think that code would be the same as:

arr.toSpliced(/* at: */ index, /* delete: */ 1, /* insert: */ item);

let a = ['a', 'b', 'c'];
a.toSpliced(0, 1, 'v'); // ['v', 'b', 'c']
a.toSpliced(1, 1, 'v'); // ['a', 'v', 'c']
a.toSpliced(2, 1, 'v'); // ['a', 'b', 'v']
a.toSpliced(a.length, 1, 'v'); // ['a', 'b', 'c', 'v']
a.toSpliced(99, 1, 'v'); // ['a', 'b', 'c', 'v']
a.toSpliced(-1, 1, 'v'); // ['a', 'b', 'v']
a.toSpliced(-99, 1, 'v'); // ['v', 'b', 'c'] |
I'm going to close this issue, based on my assessment of this design in #75 (comment) and #75 (comment). |
Opening issue to discuss a point @jridgewell raised here.