Some questions about the code and reproducing the results #3

Closed
AlexYangLi opened this issue Jul 1, 2018 · 3 comments
Comments

AlexYangLi commented Jul 1, 2018

  • Code question 1
    In model.py, unmasked_attr_loss on line 125 has shape [batch_size,] and attr_mask on line 126 has shape [batch_size, 1], so after tf.multiply the resulting attr_loss has shape [batch_size, batch_size]. Shouldn't attr_mask first be reshaped to [batch_size,]? (See the sketch after this list.)

  • Code question 2
    In model.py, should line 71 use tf.reduce_mean rather than tf.reduce_sum?

  • Result reproduction
    Here are the results I reproduced: Acc: 93.4; MP: 56.9; MR: 57.7; F1: 55.6. The F1 score does not reach the 64.9 reported in the paper; what might be causing this? Thanks!
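
A minimal sketch of the broadcasting behavior described in the first point, assuming TensorFlow 2.x eager execution; the tensor names mirror those mentioned above, but the shapes and values are purely illustrative:

```python
import tensorflow as tf

batch_size = 4
# Illustrative stand-ins for the tensors described above.
unmasked_attr_loss = tf.random.uniform([batch_size])   # shape [batch_size]
attr_mask = tf.ones([batch_size, 1])                   # shape [batch_size, 1]

# tf.multiply broadcasts [batch_size] against [batch_size, 1],
# producing a [batch_size, batch_size] matrix instead of a per-example loss.
attr_loss_broadcast = tf.multiply(unmasked_attr_loss, attr_mask)
print(attr_loss_broadcast.shape)   # (4, 4)

# Reshaping the mask to [batch_size] keeps the loss per-example.
attr_loss = tf.multiply(unmasked_attr_loss, tf.reshape(attr_mask, [-1]))
print(attr_loss.shape)             # (4,)
```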

zig-kwin-hu (Collaborator) commented

We refactored our code to improve readability, which introduced these problems. Thank you for your feedback; we will push the corrected code soon.

zig-kwin-hu (Collaborator) commented

Hello, we have uploaded the corrected code. Due to some randomness, the F1 score may fall between 62 and 64.9.

tucunchao (Member) commented

We have fixed the bugs in the previous code; the current version can reach the results reported in the paper.
