Merge pull request opencv#24510 from asmorkalov:as/softmax_rvv
Enable softmax layer vectorization on RISC-V RVV opencv#24510 

Related: opencv#24466

### Pull Request Readiness Checklist

See details at https://github.com/opencv/opencv/wiki/How_to_contribute#making-a-good-pull-request

- [x] I agree to contribute to the project under Apache 2 License.
- [x] To the best of my knowledge, the proposed patch is not based on code under the GPL or another license that is incompatible with OpenCV
- [x] The PR is proposed to the proper branch
- [x] There is a reference to the original bug report and related work
- [x] There are accuracy tests, performance tests and test data in the opencv_extra repository, if applicable.
      The patch to opencv_extra has the same branch name.
- [ ] The feature is well documented and sample code can be built with the project CMake
asmorkalov authored and IskXCr committed Dec 20, 2023
1 parent 09efa3c commit c7ad145
Showing 1 changed file with 3 additions and 3 deletions.
modules/dnn/src/layers/cpu_kernels/softmax.cpp (3 additions, 3 deletions)

@@ -35,7 +35,7 @@ void softmax(Mat &dst, const Mat &src, int axis, int axisBias, int axisStep){
     // make the channel axis to be multiple of 8
     size_t channelAxis = (axisStep + 7) & -8;
 
-#if CV_SIMD
+#if (CV_SIMD || CV_SIMD_SCALABLE)
     const int nlanes = VTraits<v_float32>::vlanes();
     // the number of redundant dimension
     size_t redundantDim = nlanes - axisStep % nlanes;
@@ -54,7 +54,7 @@ void softmax(Mat &dst, const Mat &src, int axis, int axisBias, int axisStep){
             axisBuf[cnDim] = srcPtr[srcOffset + (cnDim + axisBias) * cnStep];
 
             float s = 0.f;
-#if CV_SIMD
+#if (CV_SIMD || CV_SIMD_SCALABLE)
             // make the value of the redundant dimension to be -FLT_MAX
             if (redundantDim != nlanes) {
                 for (size_t j = axisStep; j < axisStep + redundantDim; j++)
@@ -121,7 +121,7 @@ void softmax(Mat &dst, const Mat &src, int axis, int axisBias, int axisStep){
             s = v_reduce_sum(vs);
             // subtract the value of the redundant dimension
             if (redundantDim != nlanes) {
-                float* _val = new float[nlanes];
+                float _val[VTraits<v_float32>::max_nlanes];
                 v_store(_val, val);
                 for (size_t j = nlanes - redundantDim; j < nlanes; j++)
                     s -= _val[j];
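The hunks above rely on a pad-then-correct scheme: the softmax axis is padded up to a whole number of SIMD lanes with `-FLT_MAX` (so the padding can never win the max reduction), and the padding lanes' contribution is then subtracted back out of the exponential sum. The last hunk also fixes a leak: the old `new float[nlanes]` was never freed, while the new stack array sized by `max_nlanes` needs no allocation. Below is a scalar sketch of the pattern in plain C++; the function name and the scalar loops standing in for the intrinsics are illustrative, not OpenCV code:

```cpp
#include <cfloat>
#include <cmath>
#include <vector>

// Scalar stand-in for the vectorized pad-then-correct reduction:
// returns the softmax denominator sum(exp(x[i] - max)) over axisStep values,
// computed on a buffer padded up to a whole number of `nlanes` lanes.
float padded_expsum(const std::vector<float>& x, int axisStep, int nlanes) {
    int redundantDim = nlanes - axisStep % nlanes;
    if (redundantDim == nlanes)   // axis already a lane multiple:
        redundantDim = 0;         // no padding needed (cf. the guard above)

    std::vector<float> buf(x.begin(), x.begin() + axisStep);
    buf.resize(axisStep + redundantDim, -FLT_MAX);  // padding never wins max

    float m = -FLT_MAX;           // vectorized max reduction in the real code
    for (float v : buf) m = std::fmax(m, v);

    float s = 0.f;                // vectorized exp + sum in the real code
    for (float v : buf) s += std::exp(v - m);
    for (int j = axisStep; j < axisStep + redundantDim; j++)
        s -= std::exp(buf[j] - m);  // subtract the redundant lanes' share
    return s;
}
```

For `x = {1, 2, 3, 4, 5, 6}` with `nlanes = 4` this pads two lanes and returns the same denominator as an unpadded scalar loop, since `exp(-FLT_MAX - m)` underflows to zero and the correction removes whatever the padding lanes contributed.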
