Network modelling prototype 1 #9

Open · rzietal wants to merge 12 commits into master
Conversation

@rzietal commented Aug 28, 2020

No description provided.

@DhairyaLGandhi (Owner) commented

Most of it looks decent; there are a few places where the formatting needs to be fixed, however.

@DhairyaLGandhi (Owner) left a comment

Thanks for looking into this!

src/model.jl Outdated
```diff
@@ -82,7 +89,8 @@ function (u::Unet)(x::AbstractArray)
     up_x2 = u.up_blocks[2](up_x1, x2)
     up_x3 = u.up_blocks[3](up_x2, x1)
     up_x5 = u.up_blocks[4](up_x3, op)
-    tanh.(u.up_blocks[end](up_x5))
+    u.up_blocks[end](up_x5)
```
@DhairyaLGandhi (Owner) commented

Why remove the activation here? It would be better to hold it as a parameter to Unet that can be set by the user, with the default being tanh. Anyone who doesn't need it can set it to identity.

@rzietal (Author) commented Sep 4, 2020

As far as I know there is no tanh activation in the last UNET layer, only softmax (see sect. 3 of the paper). Also, Flux recommends using logitcrossentropy for numerical stability instead of a separate crossentropy and softmax, so I removed the softmax altogether since it is folded into the loss function itself.
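For reference, the intended pattern looks roughly like this sketch (the `loss`, `model`, `x`, `y` names are placeholders, not code from this PR):

```julia
using Flux

# With no softmax on the final layer, the model returns raw logits and the
# loss fuses softmax + crossentropy, which is numerically more stable than
# Flux.crossentropy(softmax(model(x)), y) computed in two steps.
loss(model, x, y) = Flux.logitcrossentropy(model(x), y)
```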

@DhairyaLGandhi (Owner) commented

In that case, I think it would be nice if we could have the default be identity and a parameter in Unet that controls the activation.
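A rough sketch of what that could look like (the `activation` field and keyword are assumptions, not code from this PR, and the real struct has more fields):

```julia
# Hypothetical sketch: keep the activation as a field, with identity as the
# default, so `Unet(...; activation = tanh)` restores the old behaviour.
struct Unet{B,A}
    up_blocks::B
    activation::A
end

Unet(up_blocks; activation = identity) = Unet(up_blocks, activation)

# The forward pass would then end with a broadcast over the final block:
#     u.activation.(u.up_blocks[end](up_x5))
```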

```diff
+using Base.Iterators
+using FileIO
+using Images
+using JLD
```
@DhairyaLGandhi (Owner) commented

These need to be added to Project.toml under the test dependencies.
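That is, roughly the following (a sketch; Base.Iterators ships with Base and needs no entry, and the UUIDs should be double-checked against the General registry rather than copied from here):

```toml
[extras]
FileIO = "5789e2e9-d7fb-5bc7-8068-2c6fae9b9549"
Images = "916415d5-f1e6-5110-898d-aaa5f9f070e0"
JLD = "4138dd39-2aa7-5051-a626-17a0bb65d9c8"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"

[targets]
test = ["Test", "FileIO", "Images", "JLD"]
```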

```julia
ip = rand(Float32, 256, 256, 1, 1)
gs = gradient(Flux.params(u)) do
    sum(u(ip))
end
nEpochs = 1000
```
@DhairyaLGandhi (Owner) commented

This would be very slow to run on CI.
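For example, a smoke test that still exercises the forward and backward pass could shrink the input and drop the epoch loop entirely; one gradient call is enough to catch breakage (a sketch, assuming the package's default Unet() constructor):

```julia
using Flux
using UNet

u = Unet()
ip = rand(Float32, 64, 64, 1, 1)   # much smaller than 256×256
gs = gradient(Flux.params(u)) do
    sum(u(ip))
end
```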

@rzietal (Author) commented

What's CI?

@DhairyaLGandhi (Owner) commented

Continuous integration: it runs the test suite with every commit. As you can see, we run ours on GitHub Actions.
