Fish eats memory with huge slices #4127

Closed
faho opened this Issue Jun 15, 2017 · 2 comments

Comments

@faho
Member

faho commented Jun 15, 2017

While trying to solve #826, one of the things I ended up testing was a huge slice.

It turns out that

echo $a[1..11111152323232222]

will eat all your memory, even with $a undefined.
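
For a rough sense of scale (assuming each expanded index is stored as an 8-byte value, which is an assumption about the internals):

11111152323232222 indices × 8 bytes ≈ 8.9 × 10^16 bytes, i.e. on the order of 90 petabytes

so any attempt to materialize that range up front exhausts memory long before it finishes.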

@faho faho added the bug label Jun 15, 2017

@faho faho added this to the fish 2.7.0 milestone Jun 15, 2017

@krader1961
Contributor

krader1961 commented Jun 16, 2017

FWIW, I don't think this is a bug per se. It's undefined what happens when you do something like that. I'll grant you the implementation should be smarter than it currently is and not allocate more space than is required for the slice. It should limit each end of the range to reasonable values before using those values to construct a new array. Obviously the reasonable limits are one to the current length of the var being sliced.
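
A minimal sketch of that clamping idea, with made-up names (not fish's actual slice code, and ignoring fish's negative from-the-end indices): clamp both endpoints to 1..array_size before generating any indices, so the expansion can never allocate more entries than the variable has.

// Hypothetical sketch only; needs C++17 for std::clamp.
#include <algorithm>
#include <cstdio>
#include <vector>

static std::vector<long long> slice_indices_clamped(long long start, long long stop,
                                                    long long array_size) {
    std::vector<long long> idx;
    if (array_size < 1) return idx;                   // empty/undefined var: nothing to slice
    start = std::clamp(start, 1LL, array_size);       // limit each end of the range...
    stop = std::clamp(stop, 1LL, array_size);         // ...to 1..array_size before expanding
    const long long step = (start <= stop) ? 1 : -1;  // fish ranges may run backwards
    for (long long i = start; ; i += step) {
        idx.push_back(i);
        if (i == stop) break;
    }
    return idx;
}

int main() {
    // The oversized request from this issue now yields at most array_size indices.
    const auto idx = slice_indices_clamped(1, 11111152323232222LL, 3);
    std::printf("%zu indices\n", idx.size());  // prints "3 indices"
    return 0;
}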

@faho
Member

faho commented Jun 16, 2017

Of course it's not really expected that we have lists with that many elements. I'm just objecting to the failure mode here - it's not just that fish becomes unresponsive, it actually brings down the system.

It should limit each end of the range to reasonable values before using those values to construct a new array. Obviously the reasonable limits are one to the current length of the var being sliced.

This actually happens with even larger values:

$ echo $a[1..111111523232322222222222222]
fish: Invalid index value

I have found the issue (parse_slice builds a list of all indices and doesn't check array_size even though it has it), and I'll incorporate a change in my fix for #826, since it's affected by this anyway (with the current behavior we'd want this to be an error; with #826 we'd want it to be silently ignored).
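
A simplified sketch of that diagnosis, with made-up names (not the real parse_slice, and ignoring negative and backwards ranges): the buggy shape expands every index in the range eagerly (~1.1e16 pushes for the command above) and only validates afterwards; checking against array_size while the indices are generated keeps the expansion bounded.

#include <cstdio>
#include <vector>

// Hypothetical sketch only: consult the known array size while expanding, so an
// oversized range stops after at most array_size entries.
static std::vector<long long> expand_range_checked(long long start, long long stop,
                                                   long long array_size) {
    std::vector<long long> idx;
    for (long long i = start; i <= stop && i <= array_size; ++i) {
        if (i >= 1) idx.push_back(i);  // skip an out-of-range low end as well
    }
    return idx;
}

int main() {
    // With $a undefined the array size is 0, so no indices are generated at all
    // (whether that's silently ignored or an error is the #826 policy question).
    const auto idx = expand_range_checked(1, 11111152323232222LL, 0);
    std::printf("%zu indices generated\n", idx.size());  // prints "0 indices generated"
    return 0;
}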

@faho faho self-assigned this Jun 16, 2017

faho added a commit to faho/fish-shell that referenced this issue Jun 20, 2017

faho added a commit to faho/fish-shell that referenced this issue Jun 20, 2017

faho added a commit to faho/fish-shell that referenced this issue Jun 20, 2017

@faho faho referenced this issue Jun 20, 2017

Closed

No index out of bounds #4151

3 of 3 tasks complete