Refactor #3
Conversation
loss(𝐱, 𝐲) = sum(abs2, 𝐲 .- m(𝐱)) / size(𝐱)[end]
data = [(𝐱[:, :, 1:5], 𝐲[:, 1:5])]
Flux.train!(loss, params(m), data, Flux.ADAM())
@test true
This `@test true` is redundant.
Because `Flux.train!` is a nondeterministic process, `@test true` is added to make sure the training loop runs to completion.
The test case will still pass even if `train!` fails.
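A more informative check than `@test true` would assert something about the result of training. A minimal sketch, reusing the `m`, `𝐱`, and `𝐲` names from the diff above (the finiteness check is one possible assertion, not the project's actual test; a strict loss-decrease check could be flaky after a single stochastic update, so it is left as a comment):

```julia
loss(𝐱, 𝐲) = sum(abs2, 𝐲 .- m(𝐱)) / size(𝐱)[end]
data = [(𝐱[:, :, 1:5], 𝐲[:, 1:5])]
loss_before = loss(data[1]...)
Flux.train!(loss, params(m), data, Flux.ADAM())
# At minimum, verify the trained model still produces a finite loss,
# so the test fails if train! errors or the parameters blow up.
@test isfinite(loss(data[1]...))
# A stricter, but stochastic, check would be:
# @test loss(data[1]...) ≤ loss_before
```

This way a thrown error or NaN/Inf loss fails the test, while `@test true` would only fail if the preceding lines threw before reaching it.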
loss(𝐱, 𝐲) = sum(abs2, 𝐲 .- m(𝐱)) / size(𝐱)[end]
data = [(𝐱[:, :, 1:5], 𝐲[:, 1:5])]
Flux.train!(loss, params(m), data, Flux.ADAM())
@test true
This `@test true` is redundant.
loss(x, y) = Flux.mse(real.(m(x)), y)
data = [(T.(𝐱[:, :, 1:5]), rand(T, 64, 1024, 5))]
Flux.train!(loss, params(m), data, Flux.ADAM())
@test true
This `@test true` is redundant.
Thanks in advance