FWIW, I don't think this is a bug per se. It's undefined what happens when you do something like that. I'll grant you the implementation should be smarter than it currently is and not allocate more space than is required for the slice. It should limit each end of the range to reasonable values before using those values to construct a new array. Obviously the reasonable limits are from one to the current length of the variable being sliced.
Of course it's not really expected that we have lists with that many elements. I'm just objecting to the mode of failure here: it's not just unresponsive, it's actually bringing down the system.
This actually happens with even larger values:
fish: Invalid index value
I have found the issue: parse_slice builds a list of all indices and doesn't check array_size, even though it has it. I'll incorporate a change in my fix for #826, since that is affected by this anyway (with the current behavior we'd want this to be an error; with #826 we'd want it to be silently ignored).
While trying to solve #826, one of the things I ended up testing was a huge slice.
It turns out that
will eat all your memory, even with $a undefined.