
About the reltol parameter #4

Closed

MayuOshima opened this issue Mar 5, 2021 · 1 comment

Comments

@MayuOshima
Hello, thanks for your excellent work!
But I have one question.
In model/pred_illuDecomp_layer.py, there is a function pinv(A, reltol=1e-6). I found that if I keep this line:

`s = tf.boolean_mask(s, s>atol)`

the shape of s may become (2,), and then the shape of s_inv becomes (2, 2), which raises an error in

`tf.matmul(s_inv, tf.transpose(u))`

So I want to know: why do the entries lower than reltol * s_max need to be cleared? Does it matter if I don't clear the entries lower than reltol * s_max?

Thank you!
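For context, my understanding of the pinv function is roughly the following (a TF 1.x-style paraphrase from memory, not the exact code in the repository):

```python
import tensorflow as tf

def pinv(A, reltol=1e-6):
    # SVD: A = u @ diag(s) @ v^T
    s, u, v = tf.svd(A)
    # Threshold relative to the largest singular value
    atol = tf.reduce_max(s) * reltol
    # Dropping small singular values shrinks s; if its length no longer
    # matches the number of columns of u, the matmul below fails
    s = tf.boolean_mask(s, s > atol)
    s_inv = tf.diag(1.0 / s)
    # Pseudo-inverse: v @ diag(1/s) @ u^T
    return tf.matmul(v, tf.matmul(s_inv, tf.transpose(u)))
```

The error I see happens exactly when the boolean mask removes some singular values, so s_inv ends up smaller than tf.transpose(u) expects.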

@MayuOshima
Author

Oh, it may be because the mask I fed into the network was in 0/1 form. After I converted it to 0-255 form, the problem no longer happens.
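For anyone hitting the same thing, the fix on my side was just rescaling the binary mask before feeding it into the network, something like this (hypothetical variable names, adapt to however the mask is loaded):

```python
import numpy as np

# mask currently holds values in {0, 1}; rescale it to {0, 255}
mask_255 = mask.astype(np.float32) * 255.0
```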
