[lite] Add check for bias_size being zero to avoid division by zero.

This shouldn't happen for properly converted models; it is just a safety check.

PiperOrigin-RevId: 416383645
Change-Id: If8e508bf696ae8ecfb927e69c139a8ccf7fe60cb
karimnosseir authored and tensorflower-gardener committed Dec 14, 2021
1 parent c8dafc9 commit 8c6f391
Showing 1 changed file with 1 addition and 0 deletions.
tensorflow/lite/kernels/internal/common.h: 1 addition & 0 deletions
@@ -75,6 +75,7 @@ float ActivationFunction(float x) {
inline void BiasAndClamp(float clamp_min, float clamp_max, int bias_size,
const float* bias_data, int array_size,
float* array_data) {
if (bias_size == 0) return;
// Note: see b/132215220: in May 2019 we thought it would be OK to replace
// this with the Eigen one-liner:
// return (array.colwise() + bias).cwiseMin(clamp_max).cwiseMax(clamp_min).
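For context, here is a minimal, self-contained sketch of the scalar path that this guard protects. It is an approximation for illustration only: BiasAndClampSketch is a hypothetical name, the real kernel also has an optimized NEON path, and std::min/std::max stand in for its clamping helper. It shows why bias_size == 0 is dangerous without the early return: the modulo check divides by zero, and the outer loop's stride-0 increment would never terminate.

#include <algorithm>
#include <cassert>

inline void BiasAndClampSketch(float clamp_min, float clamp_max, int bias_size,
                               const float* bias_data, int array_size,
                               float* array_data) {
  if (bias_size == 0) return;  // the safety check added by this commit
  // Stands in for TFLITE_DCHECK_EQ; modulo by zero is UB without the guard.
  assert(array_size % bias_size == 0);
  // Walk the array in blocks of bias_size; a zero stride would loop forever.
  for (int offset = 0; offset < array_size; offset += bias_size) {
    for (int i = 0; i < bias_size; ++i) {
      float v = array_data[offset + i] + bias_data[i];
      array_data[offset + i] = std::min(std::max(v, clamp_min), clamp_max);
    }
  }
}

// Example: 4 values, bias of size 2, clamped to [0, 6] (ReLU6-style).
int main() {
  float data[] = {-1.f, 2.f, 3.f, 10.f};
  const float bias[] = {0.5f, 0.5f};
  BiasAndClampSketch(0.f, 6.f, 2, bias, 4, data);
  // data is now {0.f, 2.5f, 3.5f, 6.f}
}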
