This is because, in `operator=(const AbstractTensor<Derived,2> &a)` of the tensor view, `eval` on the right-hand-side expression generates a boolean SIMD vector of width 4 (under SSE), i.e. `SIMDVector<bool,simd_abi::fixed<4>>`, but the SIMD vector of the left-hand side (the view itself) is chosen when the view is constructed, before the incoming rhs is known, and is therefore `SIMDVector<bool,simd_abi::scalar>`. As a result the loop calls the rhs 4x4=16 times in a scalar fashion, and the rhs, being a `SIMDVector<bool,simd_abi::fixed<4>>`, reads 4 items ahead on each call, so memory is accessed beyond its bounds.
To be consistent we need to implement the same strategy as the one implemented in the `Tensor` constructors, that is:
```cpp
FASTOR_IF_CONSTEXPR(is_boolean_expression_v<Derived>) {
    // do sequential access
}
```
Of course, this only holds until `SIMDMask` is implemented.
The following example illustrates the issue