Fix wrong relu implementation on non-ARM devices
daquexian committed Oct 10, 2019
1 parent 8a98287 commit 55d7534
Showing 1 changed file with 4 additions and 1 deletion.
dabnn/layers/Relu.cpp (4 additions, 1 deletion)

@@ -20,7 +20,10 @@ void Relu::forward_impl() const {
     }
 #else
     float *ptr = static_cast<float *>(*data_mat);
-    FORZ(i, data_mat->total()) { *ptr = std::max(*ptr, 0.f); }
+    FORZ(i, data_mat->total()) {
+        *ptr = std::max(*ptr, 0.f);
+        ptr++;
+    }
 #endif  // __ARM_NEON
 }
 }  // namespace bnn
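The bug being fixed: in the non-NEON fallback path the original one-line loop never advanced ptr, so std::max was applied to the first element data_mat->total() times while the rest of the buffer was left untouched. Below is a minimal sketch of the corrected scalar fallback, assuming FORZ(i, n) expands to a plain zero-based for loop (the stand-in macro and the relu_fallback helper here are illustrative, not dabnn's actual definitions) and using a raw float buffer in place of the Mat wrapper:

    #include <algorithm>
    #include <cstddef>

    // Hypothetical stand-in for dabnn's FORZ(i, n) macro, assumed here to be a
    // simple zero-based counting loop; the real definition lives in dabnn's headers.
    #define FORZ(i, n) for (std::size_t i = 0; i < (n); ++i)

    // Scalar ReLU over a raw float buffer, mirroring the fixed fallback path:
    // the pointer is advanced each iteration so every element gets clamped.
    void relu_fallback(float *ptr, std::size_t total) {
        FORZ(i, total) {
            *ptr = std::max(*ptr, 0.f);
            ptr++;  // without this increment, only ptr[0] would be updated
        }
    }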
