add beta keyword to conv #672

Merged: 4 commits merged into JuliaGPU:master on Jan 27, 2021

Conversation

@jw3126 (Contributor) commented Jan 24, 2021

No description provided.

codecov bot commented Jan 24, 2021

Codecov Report

Merging #672 (77ce900) into master (1045963) will decrease coverage by 0.04%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master     #672      +/-   ##
==========================================
- Coverage   77.83%   77.79%   -0.05%     
==========================================
  Files         117      117              
  Lines        7035     7035              
==========================================
- Hits         5476     5473       -3     
- Misses       1559     1562       +3     
Impacted Files       Coverage Δ
lib/cudnn/nnlib.jl   69.42% <100.00%> (ø)
lib/nvml/error.jl    71.42% <0.00%> (-21.43%) ⬇️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@DhairyaLGandhi (Member) commented
Thank you for the contribution, that's super helpful!

We do support it in NNlib, so that's consistent. We might want to expose this better in Flux, though, so that the implementation matches on both CPU and GPU and has consistent APIs to boot.

One comment I would have is that we should retain the default alpha, since it makes the code more readable, and we might want to test a couple more cases of using beta together with the other keyword toggles we can pull.

Otherwise LGTM!
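
For readers following along, here is a minimal sketch of what the new keyword enables, assuming the usual cuDNN-style scaling semantics y .= alpha .* conv(x, w) .+ beta .* y and that the CuArray method of NNlib.conv! accepts alpha and beta keywords after this change; the array shapes are made up for illustration:

```julia
using CUDA, NNlib

# Hypothetical shapes for illustration: W × H × C_in × N input and
# kW × kH × C_in × C_out filter.
x = CUDA.rand(Float32, 8, 8, 3, 1)
w = CUDA.rand(Float32, 3, 3, 3, 4)
cdims = DenseConvDims(x, w)

# Plain convolution: allocates y and fills it with conv(x, w).
y = NNlib.conv(x, w, cdims)

# Accumulating convolution: the existing y is scaled by beta and the new
# result, scaled by alpha, is added on top, i.e. y <- alpha * conv(x, w) + beta * y.
# After this call y holds twice the plain convolution result.
NNlib.conv!(y, x, w, cdims; alpha=1, beta=1)
```

With beta left at its (presumed) default of 0, the output buffer is simply overwritten, so existing callers should be unaffected.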

@jw3126 (Contributor, Author) commented Jan 26, 2021

> One comment I would have is that we should retain the default alpha, since it makes the code more readable, and we might want to test a couple more cases of using beta together with the other keyword toggles we can pull.

done

@DhairyaLGandhi (Member) left a review comment

Thanks!

lib/cudnn/nnlib.jl: two review comments (outdated, resolved)
jw3126 and others added 2 commits January 26, 2021 09:32
Co-authored-by: Dhairya Gandhi <dhairya@juliacomputing.com>
Co-authored-by: Dhairya Gandhi <dhairya@juliacomputing.com>
@maleadt merged commit f7ea311 into JuliaGPU:master on Jan 27, 2021
@maleadt mentioned this pull request on Jan 27, 2021 (25 tasks)
maleadt added a commit to denizyuret/CUDA.jl that referenced this pull request Feb 1, 2021
maleadt added a commit that referenced this pull request Feb 2, 2021
Labels: none yet
Projects: none yet
Linked issues: none yet
3 participants