No train code #1
Hi~ Thank you for the wonderful work. I found your work on alphamatting.com, and I am very glad that you open-sourced your code so quickly. However, I cannot find the training code in the repository. Would you also open-source the training code?

Comments
Hi, unfortunately I cannot release the training code at this time. I've updated the README with some key tips. I encourage people to try applying these, and our network architecture, to existing open-source training code. For example,
Thanks for everything you have provided. Very impressive. What do you mean by clipping the alpha? What do you do?
torch.clamp(alpha, 0, 1)
OK, I'm unclear about where that would happen. Is it applied to the output alpha before the loss is calculated?
Hi, yes, it's applied to the network's output for the alpha channel before the loss is calculated. Occasionally this can cause issues at the beginning of training, where the output collapses to either all zeros or all ones. If this happens, disable the clamping for the first few hundred iterations and then switch it back on.
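For what it's worth, a minimal sketch of how that could look in a generic PyTorch training loop (the `model`, `loader`, `criterion`, and `optimizer` names and the 300-iteration warm-up are all assumptions, not the authors' actual code):

```python
import torch

def train(model, loader, criterion, optimizer, clamp_warmup_iters=300):
    """Sketch of a matting training loop with delayed alpha clamping.

    clamp_warmup_iters is an assumption based on "a few hundred
    iterations"; tune it if the output still collapses.
    """
    for it, (image, trimap, gt_alpha) in enumerate(loader):
        pred_alpha = model(image, trimap)  # raw alpha-channel prediction
        if it >= clamp_warmup_iters:
            # Clip the predicted alpha to [0, 1] before computing the loss.
            pred_alpha = torch.clamp(pred_alpha, 0, 1)
        loss = criterion(pred_alpha, gt_alpha)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```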
Thank you. I see the clamping and group normalization are already incorporated into your network code, which is great. Is the weight standardization happening too? I can't tell at first glance where that would/should happen, if it is.
Actually, I see clamping in a few places. Which one(s) do you disable?
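For reference, weight standardization (Qiao et al., 2019), which is usually paired with group normalization, normalizes each convolution filter's weights just before the forward pass. A generic PyTorch sketch of the technique, not necessarily how this repository implements it:

```python
import torch.nn as nn
import torch.nn.functional as F

class WSConv2d(nn.Conv2d):
    """Conv2d with weight standardization: each output filter is
    normalized to zero mean and unit variance before the convolution."""
    def forward(self, x):
        w = self.weight
        # Per-output-filter statistics over (in_channels, kH, kW).
        mean = w.mean(dim=(1, 2, 3), keepdim=True)
        std = w.std(dim=(1, 2, 3), keepdim=True) + 1e-5
        return F.conv2d(x, (w - mean) / std, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)
```

If it is present in the repository, it would typically appear as this kind of reparameterization inside the conv layers' forward pass rather than as a separate module.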