Fix deprecation warnings with new Nx #454

Merged: 3 commits into main from sm-fix-deprecations on Jan 19, 2023
Conversation

seanmor5 (Contributor) commented:

Cleaning up final blockers before next release

@josevalim (Contributor) commented Jan 18, 2023:

The warnings for Nx.tensor should be fixed by passing those as arguments instead of opts. That's easy for everything except for layers. I assume we don't want to make the opts[:rate] an argument, right? So the current fix is fine. I will push a commit for the initializers ones. :)
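
To make that pattern concrete, here is a minimal sketch (not the actual Axon code; the function names are illustrative) of moving a value out of opts and into a positional argument, so the number is converted to a tensor at the defn boundary and no Nx.tensor/1 call is needed inside the body:

  import Nx.Defn

  # Before: the value travels in opts, so the defn body has to call
  # Nx.tensor/1 on it, which newer Nx warns about inside defn.
  defn scale_old(x, opts \\ []) do
    opts = keyword!(opts, factor: 2.0)
    x * Nx.tensor(opts[:factor])
  end

  # After: the value is a positional argument; defn converts numbers
  # to tensors at the call boundary, so the body uses it directly.
  defn scale(x, factor) do
    x * factor
  end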

@@ -166,7 +165,7 @@ defmodule Axon.Schedules do

   defnp apply_constant(_step, opts \\ []) do
     opts = keyword!(opts, init_value: 0.01)
-    Nx.tensor(opts[:init_value])
+    opts[:init_value]
Review comment on this change:
I wonder if we should implement constant as literally:

def constant(init_value, _opts \\ []) do
  Nx.tensor(init_value)
end

?
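
For illustration, a hypothetical call under that suggested signature (assuming it lives in Axon.Schedules): because constant would be a plain def rather than a defn, the Nx.tensor/1 call happens outside defn and sidesteps the deprecation entirely.

  # Hypothetical usage of the suggested constant/2 (illustrative only):
  schedule = Axon.Schedules.constant(0.01)
  # schedule is a scalar f32 tensor holding 0.01, returned for every step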

@josevalim left a comment:

Besides the two minor comments, it looks good to me. I pushed a commit to address the initializers bit. :)

seanmor5 merged commit 3a81f8f into main on Jan 19, 2023.
seanmor5 deleted the sm-fix-deprecations branch on January 19, 2023 at 11:14.