
DM-14725: Eliminate explicit use of ndarray::EigenView in C++ code #58

Merged (2 commits) on Jun 8, 2018
1 change: 1 addition & 0 deletions .gitignore
@@ -18,6 +18,7 @@ version.py
.cache
.pytest_cache
pytest_session.txt
.coverage
tests/.cache/*
tests/.tests
tests/reference/bvn
25 changes: 12 additions & 13 deletions src/CModel.cc
@@ -613,9 +613,7 @@ struct WeightSums {
) : fluxInner(0.0), fluxVar(0.0), norm(0.0)
{
assert(modelMatrix.getSize<1>() == 1);
-        run(modelMatrix.transpose()[0].asEigen<Eigen::ArrayXpr>(),
-            data.asEigen<Eigen::ArrayXpr>(),
-            variance.asEigen<Eigen::ArrayXpr>());
+        run(modelMatrix.transpose()[0], data, variance);
}

WeightSums(
@@ -624,20 +622,21 @@ struct WeightSums {
ndarray::Array<Pixel const,1,1> const & variance
) : fluxInner(0.0), fluxVar(0.0), norm(0.0)
{
-        run(model.asEigen<Eigen::ArrayXpr>(),
-            data.asEigen<Eigen::ArrayXpr>(),
-            variance.asEigen<Eigen::ArrayXpr>());
+        run(model, data, variance);
}

void run(
-        ndarray::EigenView<Pixel const,1,1,Eigen::ArrayXpr> const & model,
-        ndarray::EigenView<Pixel const,1,1,Eigen::ArrayXpr> const & data,
-        ndarray::EigenView<Pixel const,1,1,Eigen::ArrayXpr> const & variance
+        ndarray::Array<Pixel const,1,1> const & model,
+        ndarray::Array<Pixel const,1,1> const & data,
+        ndarray::Array<Pixel const,1,1> const & variance
Review comment (Member): 👍 for separation of interface and implementation.
) {
-        double w = model.sum();
-        double wd = (model*data).sum();
-        double ww = model.square().sum();
-        double wwv = (model.square()*variance).sum();
+        auto modelEigen = model.asEigen<Eigen::ArrayXpr>();
+        auto dataEigen = data.asEigen<Eigen::ArrayXpr>();
+        auto varianceEigen = variance.asEigen<Eigen::ArrayXpr>();
+        double w = modelEigen.sum();
+        double wd = (modelEigen*dataEigen).sum();
+        double ww = modelEigen.square().sum();
+        double wwv = (modelEigen.square()*varianceEigen).sum();
norm = w/ww;
fluxInner = wd*norm;
fluxVar = wwv*norm*norm;
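The WeightSums::run hunk above accumulates four weighted sums and then derives the normalization, inner flux, and flux variance. A plain-C++ sketch of the same arithmetic (hypothetical free function on std::vector; the real code evaluates these as Eigen array expressions):

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Sketch of the sums computed in WeightSums::run. Names mirror the diff;
// Sums and weightSums are illustrative stand-ins, not the real API.
struct Sums { double norm, fluxInner, fluxVar; };

Sums weightSums(std::vector<double> const& model,
                std::vector<double> const& data,
                std::vector<double> const& variance) {
    double w = 0.0, wd = 0.0, ww = 0.0, wwv = 0.0;
    for (std::size_t i = 0; i < model.size(); ++i) {
        w   += model[i];                           // model.sum()
        wd  += model[i] * data[i];                 // (model*data).sum()
        ww  += model[i] * model[i];                // model.square().sum()
        wwv += model[i] * model[i] * variance[i];  // (model.square()*variance).sum()
    }
    Sums s;
    s.norm = w / ww;
    s.fluxInner = wd * s.norm;
    s.fluxVar = wwv * s.norm * s.norm;
    return s;
}
```

The PR changes only where the Eigen views are created, not this arithmetic: the sums are identical before and after the refactor.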
22 changes: 12 additions & 10 deletions src/DoubleShapeletPsfApprox.cc
@@ -283,22 +283,24 @@ class ProfileObjective : public OptimizerObjective {
Scalar oAlpha = parameters[1] * _normalization;
Scalar iR2 = parameters[2] * parameters[2];
Scalar oR2 = parameters[3] * parameters[3];
-        residuals.asEigen<Eigen::ArrayXpr>() = _data
-            - (iAlpha/iR2)*(_arg/iR2).exp()
-            - (oAlpha/oR2)*(_arg/oR2).exp();
+        auto argEigen = _arg.asEigen<Eigen::ArrayXpr>();
+        residuals.asEigen<Eigen::ArrayXpr>() = _data.asEigen<Eigen::ArrayXpr>()
+            - (iAlpha/iR2)*(argEigen/iR2).exp()
+            - (oAlpha/oR2)*(argEigen/oR2).exp();
}
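The residual expression above subtracts two exponential profile components from the data. A per-pixel plain-C++ sketch of that formula (hypothetical free function; the real code evaluates it as one vectorized Eigen expression over all pixels):

```cpp
#include <cassert>
#include <cmath>

// Per-pixel form of the double-profile residual in the hunk above.
// Parameter names follow the diff (iAlpha/oAlpha amplitudes, iR2/oR2
// squared radii); profileResidual itself is an illustrative stand-in.
double profileResidual(double data, double arg,
                       double iAlpha, double iR2,
                       double oAlpha, double oR2) {
    return data
        - (iAlpha / iR2) * std::exp(arg / iR2)
        - (oAlpha / oR2) * std::exp(arg / oR2);
}
```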

virtual bool differentiateResiduals(
ndarray::Array<Scalar const,1,1> const & parameters,
ndarray::Array<Scalar,2,-2> const & derivatives
) const {
-        ndarray::EigenView<Scalar,2,-2,Eigen::ArrayXpr> d(derivatives);
+        auto d = derivatives.asEigen<Eigen::ArrayXpr>();
+        auto argEigen = _arg.asEigen<Eigen::ArrayXpr>();
Review comment (Member): You end up doing these conversions a lot. Consider using private getters or simply letting the types of _data and _arg break when you switch to Eigen::Map.

Reply (@r-owen, Contributor Author, Jun 8, 2018): The plan is as follows: add asEigenArray and asEigenMatrix to ndarray::Array; these will both return Eigen::Map. Then eliminate all use of asEigen in our code, which will eliminate all use of ndarray::EigenView. We can then remove EigenView from ndarray (at least conditionally, when ndarray is built against too new a version of Eigen; as far as LSST is concerned it will go away).

Reply (Member): How does that reduce the code duplication of translating _data and _arg to Eigen::Map every time you want to use them?
Scalar iR2 = parameters[2] * parameters[2];
Scalar oR2 = parameters[3] * parameters[3];
-        d.col(0) = - (_normalization/iR2)*(_arg/iR2).exp();
-        d.col(1) = - (_normalization/oR2)*(_arg/oR2).exp();
-        d.col(2) = -2.0*d.col(0)*(_arg/iR2 + 1.0)*parameters[0]/parameters[2];
-        d.col(3) = -2.0*d.col(1)*(_arg/oR2 + 1.0)*parameters[1]/parameters[3];
+        d.col(0) = - (_normalization/iR2)*(argEigen/iR2).exp();
+        d.col(1) = - (_normalization/oR2)*(argEigen/oR2).exp();
+        d.col(2) = -2.0*d.col(0)*(argEigen/iR2 + 1.0)*parameters[0]/parameters[2];
+        d.col(3) = -2.0*d.col(1)*(argEigen/oR2 + 1.0)*parameters[1]/parameters[3];
return true;
}
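The radius columns of the Jacobian (d.col(2), d.col(3)) reuse the amplitude columns via the chain rule. A plain-C++ sketch checking the inner-radius derivative against a central finite difference (hypothetical helpers; assumes the same parametrization as the diff, iR2 = p2*p2 and inner amplitude p0 times _normalization):

```cpp
#include <cassert>
#include <cmath>

// Inner-component term of the residual model from the diff:
// -(p0*norm/p2^2) * exp(arg/p2^2), as a function of the radius parameter p2.
double innerTerm(double p0, double p2, double norm, double arg) {
    double iR2 = p2 * p2;
    return -(p0 * norm / iR2) * std::exp(arg / iR2);
}

// Analytic derivative of innerTerm with respect to p2, in the same form
// as the diff: d.col(2) = -2 * d.col(0) * (arg/iR2 + 1) * p0 / p2.
double analyticRadiusDeriv(double p0, double p2, double norm, double arg) {
    double iR2 = p2 * p2;
    double d0 = -(norm / iR2) * std::exp(arg / iR2);  // d.col(0): d/d(p0)
    return -2.0 * d0 * (arg / iR2 + 1.0) * p0 / p2;   // d.col(2): d/d(p2)
}
```

Differentiating -(p0*norm/p2^2)*exp(arg/p2^2) by p2 reproduces the -2*d.col(0)*(arg/iR2 + 1)*p0/p2 form exactly, which a finite-difference comparison confirms.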

@@ -325,8 +327,8 @@ class ProfileObjective : public OptimizerObjective {
Scalar _maxRadius;
Scalar _minRadiusDiff;
Scalar _normalization;
-    ndarray::EigenView<Scalar,1,1,Eigen::ArrayXpr> _data;
-    ndarray::EigenView<Scalar,1,1,Eigen::ArrayXpr> _arg;
+    ndarray::Array<Scalar,1,1> _data; // ArrayXpr
+    ndarray::Array<Scalar,1,1> _arg; // ArrayXpr
Review comment (Member): I don't understand these comments.

Reply (@r-owen, Contributor Author, Jun 8, 2018): I think they were placeholders. I'll remove them on DM-14625 (since I jumped the gun and merged this branch before I realized you had any suggested changes).
};

} // anonymous
2 changes: 1 addition & 1 deletion src/optimizer.cc
@@ -362,7 +362,7 @@ Optimizer::Optimizer(
}

void Optimizer::_computeDerivatives() {
-    ndarray::EigenView<Scalar,2,-2> resDer(_residualDerivative);
+    auto resDer = _residualDerivative.asEigen();
resDer.setZero();
_next.parameters.deep() = _current.parameters;
if (!_objective->differentiateResiduals(_current.parameters, _residualDerivative)) {
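Both the review discussion of Eigen::Map and the optimizer.cc change rely on the same property: the object returned by asEigen is a non-owning view, so resDer.setZero() clears the underlying _residualDerivative array rather than a copy. A minimal plain-C++ stand-in for that view concept (hypothetical View type; not ndarray's or Eigen's actual API):

```cpp
#include <cassert>
#include <cstddef>

// Illustrative stand-in for an Eigen::Map-style view: it holds a pointer
// into an existing buffer, makes no copy, and writes go through to the
// original storage -- the behavior resDer.setZero() depends on.
struct View {
    double* data;
    std::size_t size;
    double sum() const {
        double s = 0.0;
        for (std::size_t i = 0; i < size; ++i) s += data[i];
        return s;
    }
    void setZero() {
        for (std::size_t i = 0; i < size; ++i) data[i] = 0.0;
    }
};
```

Because the view is cheap to construct, creating one locally wherever it is needed (as the PR does) costs only a pointer copy, which is part of why EigenView members can be replaced by plain Array members plus on-demand views.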