Maybe it's not a bug actually. Just really... odd.
So imagine I'm weird and I put a signed and an unsigned number in the same array. Now shape returns the maximum of the widths and the maximum of the signedness. This means that if I have signed 0xff = -1 and unsigned 0xff = 255 in the same array, then I think removing the "extra" bit would make the unsigned one come out as -1 instead of 255.
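To make that concrete, here's a plain-Python sketch (not real HDL code; `merged_shape` and `as_int` are made-up helper names) of the unification rule described above: any signed member forces the merged shape signed, and an unsigned member then needs one extra bit so its value survives reinterpretation.

```python
def merged_shape(shapes):
    """Merge (width, signed) shapes: any signed member forces the result
    signed, and unsigned members then need one extra bit for the sign."""
    signed = any(s for _, s in shapes)
    width = 0
    for w, s in shapes:
        if signed and not s:
            w += 1  # unsigned value needs a sign bit in a signed container
        width = max(width, w)
    return width, signed

def as_int(bits, width, signed):
    """Interpret a raw bit pattern under a given shape (two's complement)."""
    if signed and bits & (1 << (width - 1)):
        return bits - (1 << width)
    return bits

# The same bit pattern 0xff reads differently under each 8-bit shape:
print(as_int(0xFF, 8, True))   # signed(8):   -1
print(as_int(0xFF, 8, False))  # unsigned(8): 255

# Merging [signed(8), unsigned(8)] gives signed(9), and 255 still
# reads as 255 there; dropping the extra bit (signed(8)) flips it to -1.
print(merged_shape([(8, True), (8, False)]))  # (9, True)
print(as_int(0xFF, 9, True))                  # 255
```

So the extra bit isn't padding; it's the only thing keeping the unsigned 255 from being read back as -1.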
So this raises a larger question: are inhomogeneous elements in an array supported at all?
This seems like a very weird thing to want in an HDL, and in this particular case it makes signed array values one bit too big.