Gradient for dot(x,A,y) #261

Merged: 8 commits from the dot branch into JuliaDiff:master on Oct 23, 2020
Conversation

@mcabbott (Member) commented on Sep 4, 2020

Like it says: this adds a gradient (an rrule) for the three-argument dot(x, A, y).

It can be made about 15% faster by using things like lmul!(conj(dz), Ay) instead of broadcasting.
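For reference, here is a minimal sketch of what such an rrule can look like. It is written against current ChainRulesCore names (NoTangent, @thunk) and is an illustration of the rule being added, not necessarily the exact code merged in this PR:

```julia
import ChainRulesCore: rrule, NoTangent, @thunk
using LinearAlgebra

# Sketch of an rrule for z = dot(x, A, y) = x' * A * y,
# i.e. z = sum over i,j of conj(x[i]) * A[i,j] * y[j].
function rrule(::typeof(dot), x::AbstractVector{<:Number},
               A::AbstractMatrix{<:Number}, y::AbstractVector{<:Number})
    Ay = A * y              # computed once, shared by primal and pullback
    z = adjoint(x) * Ay
    function dot_pullback(dz)
        dx = @thunk conj(dz) .* Ay          # x̄[i] = conj(dz) * (A*y)[i]
        dA = @thunk dz .* x .* adjoint(y)   # Ā[i,j] = dz * x[i] * conj(y[j])
        dy = @thunk dz .* (adjoint(A) * x)  # ȳ[j] = dz * (A'*x)[j]
        return (NoTangent(), dx, dA, dy)
    end
    return z, dot_pullback
end
```

The speed-up mentioned above would come from swapping an allocating broadcast such as conj(dz) .* Ay for LinearAlgebra's in-place scalar scaling, lmul!(conj(dz), Ay), which reuses the Ay buffer already computed for the primal.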

@willtebbutt (Member) left a comment:

Thanks for the contribution. Broadly LGTM -- just a couple of points.

Review threads on src/rulesets/LinearAlgebra/dense.jl and test/rulesets/LinearAlgebra/dense.jl (outdated, resolved).
@willtebbutt (Member) left a comment:

I'm happy with this. @mcabbott, if you could bump the patch version, I would be happy to merge and create a release. Thanks for the contribution!
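For anyone unfamiliar with the release process: bumping the patch version means editing the version field in the package's Project.toml before a release is tagged. A sketch, with hypothetical version numbers:

```toml
# Project.toml -- version numbers here are hypothetical
name = "ChainRules"
version = "0.7.26"   # e.g. bumped from "0.7.25" for a patch release
```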

@mcabbott closed this on Oct 23, 2020
@mcabbott reopened this on Oct 23, 2020
@willtebbutt merged commit cf36ac6 into JuliaDiff:master on Oct 23, 2020
@mcabbott deleted the dot branch on Oct 23, 2020 at 09:02
@oxinabox (Member) commented:

Oops, I totally lost track of this PR.
Thanks, both.
