DM-14740: Stop using ndarray::EigenView indirectly in C++ code #360

Merged (1 commit, Jun 18, 2018)
18 changes: 9 additions & 9 deletions include/lsst/afw/math/LeastSquares.h
@@ -122,8 +122,8 @@ class LeastSquares {
// the weird C++ syntax required for calling a templated member function
// called "cast" in this context; see
// http://eigen.tuxfamily.org/dox-devel/TopicTemplateKeyword.html
- _getDesignMatrix() = design.asEigen().template cast<double>();
- _getDataVector() = data.asEigen().template cast<double>();
+ _getDesignMatrix() = ndarray::asEigenMatrix(design).template cast<double>();
+ _getDataVector() = ndarray::asEigenMatrix(data).template cast<double>();
Member:
What if design and data are temporaries? Or is cast<double> guaranteed to make a defensive copy (which would also explain how the parameters can be const)? I looked up the documentation and it didn't say anything about object independence.

Same question about other methods that use this implementation pattern.

Contributor Author:
I hope @TallJimbo will say more, but my understanding is that this makes a copy because the target is (a reference to) an Eigen::MatrixXd.

Member:
The assignment operation is where the (only) copy is made. The call to cast is safe only because the temporaries returned by asEigenMatrix are not destroyed until the end of the full expression that creates them (see e.g. https://stackoverflow.com/questions/3041249/when-are-temporaries-created-as-part-of-a-function-call-destroyed).

Member:
Ah, ok, I didn't realize the assignment copied the data. Maybe I've spent too long staring at ndarray and afw::image... 🙃
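To make the semantics in this thread concrete, here is a minimal standalone sketch in plain Eigen (illustrative only, not code from this PR or from ndarray): a Map is just a view, cast<double>() builds a lazy expression, and the single copy happens when that expression is assigned to a concrete Eigen::MatrixXd. The Map temporary lives until the end of the full expression, so evaluating through it is safe.

```cpp
#include <Eigen/Core>
#include <iostream>

int main() {
    float buffer[4] = {1.0f, 2.0f, 3.0f, 4.0f};
    Eigen::MatrixXd target;

    // The Map temporary is only a view of `buffer`, and cast<double>() is a lazy
    // expression; both survive until the end of this full expression. The copy is
    // made when the expression is assigned into `target`, a concrete MatrixXd.
    target = Eigen::Map<Eigen::Matrix2f>(buffer).cast<double>();

    buffer[0] = 99.0f;            // does not affect `target`, which owns its own storage
    std::cout << target << "\n";  // still prints the values captured by the assignment
    return 0;
}
```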

_factor(false);
}

@@ -138,7 +138,7 @@
/// Reset the design matrix given as an ndarray; dimension and data are not changed.
template <typename T1, int C1>
void setDesignMatrix(ndarray::Array<T1, 2, C1> const& design) {
- _getDesignMatrix() = design.asEigen().template cast<double>();
+ _getDesignMatrix() = ndarray::asEigenMatrix(design).template cast<double>();
_factor(false);
}

@@ -173,11 +173,11 @@
template <typename T1, typename T2, int C1, int C2>
void setNormalEquations(ndarray::Array<T1, 2, C1> const& fisher, ndarray::Array<T2, 1, C2> const& rhs) {
if ((C1 > 0) == bool(Eigen::MatrixXd::IsRowMajor)) {
- _getFisherMatrix() = fisher.asEigen().template cast<double>();
+ _getFisherMatrix() = ndarray::asEigenMatrix(fisher).template cast<double>();
} else {
- _getFisherMatrix() = fisher.asEigen().transpose().template cast<double>();
+ _getFisherMatrix() = ndarray::asEigenMatrix(fisher).transpose().template cast<double>();
}
- _getRhsVector() = rhs.asEigen().template cast<double>();
+ _getRhsVector() = ndarray::asEigenMatrix(rhs).template cast<double>();
_factor(true);
}

@@ -223,7 +223,7 @@
* by future calls to LeastSquares member functions, so it's best to promptly copy the
* result elsewhere.
*
- * If you want an Eigen object instead, just use getSolution().asEigen().
+ * If you want an Eigen object instead, just use ndarray::asEigenMatrix(getSolution()).
Member:
👍 for updating the documentation.

*
* The solution is cached the first time this member function is called, and will be
* reused unless the matrices are reset or the threshold is changed.
@@ -237,7 +237,7 @@
* by future calls to LeastSquares member functions, so it's best to promptly copy the
* result elsewhere.
*
- * If you want an Eigen object instead, just use getCovariance().asEigen().
+ * If you want an Eigen object instead, just use ndarray::asEigenMatrix(getCovariance()).
*
* The covariance is cached the first time this member function is called, and will be
* reused unless the matrices are reset or the threshold is changed.
@@ -254,7 +254,7 @@
* by future calls to LeastSquares member functions, so it's best to promptly copy the
* result elsewhere.
*
- * If you want an Eigen object instead, just use getFisherMatrix().asEigen().
+ * If you want an Eigen object instead, just use ndarray::asEigenMatrix(getFisherMatrix()).
*/
ndarray::Array<double const, 2, 2> getFisherMatrix();

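As a usage note on the documentation changes in this header: a caller who wants an owning Eigen object rather than a view can assign the Map into a concrete matrix. The sketch below is illustrative only; it is not part of the diff, the helper name copyFisherMatrix is made up, and the ndarray/eigen.h include location is assumed, but it only uses getFisherMatrix() and ndarray::asEigenMatrix as documented above.

```cpp
#include <Eigen/Core>

#include "lsst/afw/math/LeastSquares.h"
#include "ndarray/eigen.h"  // assumed header providing ndarray::asEigenMatrix

// Hypothetical helper (not part of afw): copy the cached Fisher matrix out of a solver.
Eigen::MatrixXd copyFisherMatrix(lsst::afw::math::LeastSquares& lstsq) {
    // asEigenMatrix returns an Eigen::Map view of the array owned by `lstsq`;
    // constructing a MatrixXd from it makes the prompt copy the documentation recommends.
    return ndarray::asEigenMatrix(lstsq.getFisherMatrix());
}
```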
2 changes: 1 addition & 1 deletion src/detection/GaussianPsf.cc
@@ -140,7 +140,7 @@ std::shared_ptr<GaussianPsf::Image> GaussianPsf::doComputeKernelImage(lsst::geom
sum += row[xIndex] = std::exp(-0.5 * (x * x + y * y) / (_sigma * _sigma));
}
}
- array.asEigen() /= sum;
+ ndarray::asEigenMatrix(array) /= sum;
Member:
This looks a bit weird. How is ndarray::asEigenMatrix an lvalue? I found the definition, but the return value doesn't look like a reference.

Contributor Author:
Again, @TallJimbo may be able to explain it better, but I believe the fact that it is an Eigen::Map means it can be read or written. It's just a view into array data.

Member:
Correct. Eigen::Map is basically a "smart reference", something C++ is notoriously bad at providing intuitive syntax for.
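A minimal standalone sketch (plain Eigen, independent of ndarray) of that point: an Eigen::Map wraps external storage without copying, so compound assignments like the `/= sum` above write straight through to the underlying buffer, which is what makes the call usable on the left-hand side.

```cpp
#include <Eigen/Core>
#include <iostream>

int main() {
    double data[4] = {2.0, 4.0, 6.0, 8.0};

    // Constructing the Map copies nothing; it is a view over `data`.
    Eigen::Map<Eigen::Vector4d> view(data);

    // Writing through the Map modifies the original buffer in place, which is
    // the same mechanism that lets ndarray::asEigenMatrix(array) /= sum work.
    view /= 2.0;

    std::cout << data[0] << " " << data[3] << "\n";  // prints "1 4"
    return 0;
}
```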

return r;
}

2 changes: 1 addition & 1 deletion src/geom/transformFactory.cc
@@ -81,7 +81,7 @@ ndarray::Array<typename Eigen::MatrixBase<Derived>::Scalar, 2, 2> toNdArray(
Eigen::MatrixBase<Derived> const &matrix) {
ndarray::Array<typename Eigen::MatrixBase<Derived>::Scalar, 2, 2> array =
ndarray::allocate(ndarray::makeVector(matrix.rows(), matrix.cols()));
- array.asEigen() = matrix;
+ ndarray::asEigenMatrix(array) = matrix;
return array;
}

42 changes: 22 additions & 20 deletions src/math/LeastSquares.cc
@@ -165,9 +165,9 @@ class EigensystemSolver : public LeastSquares::Impl {
"Cannot compute NORMAL_CHOLESKY diagnostic from NORMAL_EIGENSYSTEM factorization.");
}
if (_eig.info() == Eigen::Success) {
- diagnostic.asEigen() = _eig.eigenvalues().reverse();
+ ndarray::asEigenMatrix(diagnostic) = _eig.eigenvalues().reverse();
} else {
- diagnostic.asEigen() = _svd.singularValues();
+ ndarray::asEigenMatrix(diagnostic) = _svd.singularValues();
}
if (whichDiagnostic == LeastSquares::DIRECT_SVD) {
diagnostic.asEigen<Eigen::ArrayXpr>() = diagnostic.asEigen<Eigen::ArrayXpr>().sqrt();
@@ -176,33 +176,35 @@

virtual void updateSolution() {
if (rank == 0) {
- solution.asEigen().setZero();
+ ndarray::asEigenMatrix(solution).setZero();
return;
}
if (_eig.info() == Eigen::Success) {
_tmp.head(rank) = _eig.eigenvectors().rightCols(rank).adjoint() * rhs;
_tmp.head(rank).array() /= _eig.eigenvalues().tail(rank).array();
- solution.asEigen() = _eig.eigenvectors().rightCols(rank) * _tmp.head(rank);
+ ndarray::asEigenMatrix(solution) = _eig.eigenvectors().rightCols(rank) * _tmp.head(rank);
} else {
_tmp.head(rank) = _svd.matrixU().leftCols(rank).adjoint() * rhs;
_tmp.head(rank).array() /= _svd.singularValues().head(rank).array();
- solution.asEigen() = _svd.matrixU().leftCols(rank) * _tmp.head(rank);
+ ndarray::asEigenMatrix(solution) = _svd.matrixU().leftCols(rank) * _tmp.head(rank);
}
}

virtual void updateCovariance() {
if (rank == 0) {
- covariance.asEigen().setZero();
+ ndarray::asEigenMatrix(covariance).setZero();
return;
}
if (_eig.info() == Eigen::Success) {
- covariance.asEigen() = _eig.eigenvectors().rightCols(rank) *
- _eig.eigenvalues().tail(rank).array().inverse().matrix().asDiagonal() *
- _eig.eigenvectors().rightCols(rank).adjoint();
+ ndarray::asEigenMatrix(covariance) =
+ _eig.eigenvectors().rightCols(rank) *
+ _eig.eigenvalues().tail(rank).array().inverse().matrix().asDiagonal() *
+ _eig.eigenvectors().rightCols(rank).adjoint();
} else {
- covariance.asEigen() = _svd.matrixU().leftCols(rank) *
- _svd.singularValues().head(rank).array().inverse().matrix().asDiagonal() *
- _svd.matrixU().leftCols(rank).adjoint();
+ ndarray::asEigenMatrix(covariance) =
+ _svd.matrixU().leftCols(rank) *
+ _svd.singularValues().head(rank).array().inverse().matrix().asDiagonal() *
+ _svd.matrixU().leftCols(rank).adjoint();
}
}

@@ -229,13 +231,13 @@ class CholeskySolver : public LeastSquares::Impl {
pex::exceptions::LogicError,
"Can only compute NORMAL_CHOLESKY diagnostic from NORMAL_CHOLESKY factorization.");
}
- diagnostic.asEigen() = _ldlt.vectorD();
+ ndarray::asEigenMatrix(diagnostic) = _ldlt.vectorD();
}

- virtual void updateSolution() { solution.asEigen() = _ldlt.solve(rhs); }
+ virtual void updateSolution() { ndarray::asEigenMatrix(solution) = _ldlt.solve(rhs); }

virtual void updateCovariance() {
- auto cov = covariance.asEigen();
+ auto cov = ndarray::asEigenMatrix(covariance);
cov.setIdentity();
cov = _ldlt.solve(cov);
}
@@ -270,27 +272,27 @@
pex::exceptions::LogicError,
"Can only compute NORMAL_CHOLESKY diagnostic from DIRECT_SVD factorization.");
case LeastSquares::DIRECT_SVD:
- diagnostic.asEigen() = _svd.singularValues();
+ ndarray::asEigenMatrix(diagnostic) = _svd.singularValues();
break;
}
}

virtual void updateSolution() {
if (rank == 0) {
- solution.asEigen().setZero();
+ ndarray::asEigenMatrix(solution).setZero();
return;
}
_tmp.head(rank) = _svd.matrixU().leftCols(rank).adjoint() * data;
_tmp.head(rank).array() /= _svd.singularValues().head(rank).array();
- solution.asEigen() = _svd.matrixV().leftCols(rank) * _tmp.head(rank);
+ ndarray::asEigenMatrix(solution) = _svd.matrixV().leftCols(rank) * _tmp.head(rank);
}

virtual void updateCovariance() {
if (rank == 0) {
- covariance.asEigen().setZero();
+ ndarray::asEigenMatrix(covariance).setZero();
return;
}
- covariance.asEigen() =
+ ndarray::asEigenMatrix(covariance) =
_svd.matrixV().leftCols(rank) *
_svd.singularValues().head(rank).array().inverse().square().matrix().asDiagonal() *
_svd.matrixV().leftCols(rank).adjoint();
2 changes: 1 addition & 1 deletion tests/test_tableArchives.cc
@@ -476,7 +476,7 @@ std::vector<double> makeRandomVector(int size) {

ndarray::Array<double, 2, 2> makeRandomArray(int width, int height) {
ndarray::Array<double, 2, 2> array(ndarray::allocate(height, width));
- array.asEigen().setRandom();
+ ndarray::asEigenMatrix(array).setRandom();
return array;
}
