some code question #8

Open
z-z-zhao opened this issue Nov 29, 2023 · 0 comments

Comments


z-z-zhao commented Nov 29, 2023

Hello, I have been reading through your code and have a few questions.

1. Regarding `self.filters_fine` in the MLP_DIF code: does `fine_y`, the output of this MLP, correspond to ΔOcc or to Fine Occ in the paper?
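
To make the question concrete, here is a minimal sketch of the two readings I am asking about (only the name `fine_y` and the idea of a coarse occupancy come from the repository; everything else below is hypothetical):

        import torch

        # Hypothetical occupancy tensors for a batch of query points;
        # only the name fine_y is taken from the repository.
        coarse_occ = torch.rand(1, 5000)   # coarse occupancy from the first stage
        fine_y     = torch.rand(1, 5000)   # output of self.filters_fine

        # Reading (a): fine_y is the residual ΔOcc from the paper,
        # so Fine Occ is reconstructed as Coarse Occ + ΔOcc.
        fine_occ_as_residual = coarse_occ + fine_y

        # Reading (b): fine_y is already Fine Occ, predicted directly.
        fine_occ_direct = fine_y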

2. Regarding the loss computation: the 3D reconstruction loss in the paper is $L_{rec} = \|\widetilde{O}_s(p) - O_{gt}(p)\|^2$, where $\widetilde{O}_s(p)$ is Fine Occ (Coarse Occ + ΔOcc). But the code uses

            error_if += self.error_term(pred_if, occ_labels)
            error_if += self.error_term(miu_if, occ_labels)

https://github.com/psyai-net/D-IF_release/blob/f5b82ff5e18ca42115741c3f808520bf664329d0/lib/net/HGPIFuNet.py#L403C1-L405C60

It looks like an L2 loss is applied both to `pred_if`, the output of the second MLP (the Occupancy Rectifier), and to $\mu$, one of the outputs of the first MLP (the Distribution Predictor), and the two terms are summed. This seems inconsistent with the paper.
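
To make the apparent mismatch concrete, here is a minimal sketch of the two formulations, assuming `self.error_term` is the L2/MSE loss that the paper's formula suggests (all tensors below are hypothetical stand-ins; only the names `pred_if`, `miu_if`, and `occ_labels` come from the repository):

        import torch
        import torch.nn.functional as F

        # Hypothetical stand-ins for one batch of query points.
        occ_labels = torch.rand(1, 5000)   # O_gt(p), ground-truth occupancy
        miu_if     = torch.rand(1, 5000)   # mu from the Distribution Predictor (first MLP)
        pred_if    = torch.rand(1, 5000)   # output of the Occupancy Rectifier (second MLP)

        # Loss as written in the paper: a single L2 term on the fine occupancy
        # \tilde{O}_s(p), i.e. only the rectified prediction is supervised.
        loss_paper = F.mse_loss(pred_if, occ_labels)

        # Loss as it appears in HGPIFuNet.py: two terms, one on pred_if and one on miu_if.
        loss_code = F.mse_loss(pred_if, occ_labels) + F.mse_loss(miu_if, occ_labels)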

3. Finally, in `train_step` the loss is multiplied by 0. Why is that?

        error_G = error_G * 0

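For reference, multiplying a differentiable loss by zero before `backward()` removes its gradient contribution entirely, so that term no longer trains anything. A tiny sketch (hypothetical, not the repository's `train_step`):

        import torch

        # Hypothetical one-parameter example, not the repository's train_step.
        w = torch.ones(3, requires_grad=True)
        error_G = (w * 2.0).sum()      # some differentiable loss; d(error_G)/dw = 2

        (error_G * 0).backward()       # multiplying by zero kills the gradient
        print(w.grad)                  # tensor([0., 0., 0.]) -> this term updates nothing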
