
Test rsgcn #146

Merged
merged 6 commits into from
Apr 23, 2018

Conversation

corochann
Member

  • Added a dropout_ratio attribute to control the dropout ratio in RSGCN.
  • Added tests for RSGCN:
    • forward
    • backward
    • graph isomorphism invariance
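The graph-isomorphism-invariance test listed above can be sketched as follows. This is a minimal illustration using a stand-in model (one adjacency-based propagation step followed by a sum readout), not RSGCN itself: permuting the atom order, and permuting the adjacency matrix's rows and columns consistently, must leave the graph-level output unchanged.

```python
import numpy

def sum_readout(atom_features, adj):
    # One propagation step (aggregate neighbor features),
    # then a permutation-invariant sum readout over atoms.
    h = adj @ atom_features          # (n_atoms, n_features)
    return h.sum(axis=0)             # (n_features,)

rng = numpy.random.RandomState(0)
n_atoms, n_features = 5, 3
x = rng.rand(n_atoms, n_features).astype(numpy.float32)
adj = rng.randint(0, 2, size=(n_atoms, n_atoms)).astype(numpy.float32)
adj = numpy.maximum(adj, adj.T)      # make the graph undirected

# Apply the same atom permutation to features and adjacency.
perm = rng.permutation(n_atoms)
x_p = x[perm]
adj_p = adj[perm][:, perm]

y = sum_readout(x, adj)
y_p = sum_readout(x_p, adj_p)
numpy.testing.assert_allclose(y, y_p, rtol=1e-5)
```

The same pattern applies to RSGCN: build two isomorphic inputs with a random permutation and assert the model outputs agree.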

@codecov-io

codecov-io commented Apr 21, 2018

Codecov Report

Merging #146 into master will increase coverage by 1.38%.
The diff coverage is 83.05%.

@@            Coverage Diff             @@
##           master     #146      +/-   ##
==========================================
+ Coverage   75.44%   76.82%   +1.38%     
==========================================
  Files          77       78       +1     
  Lines        3213     3271      +58     
==========================================
+ Hits         2424     2513      +89     
+ Misses        789      758      -31

).astype(numpy.int32)
# adj_data = numpy.random.randint(
# 0, high=2, size=(batch_size, atom_size, atom_size)
# ).astype(numpy.float32)
Remove unused code.

0, high=1, size=(batch_size, atom_size, atom_size)
).astype(numpy.float32)
adj_data = adj_data + adj_data.swapaxes(-1, -2)
# adj_data = (adj_data > 1.5).astype(numpy.float32)

Remove unused code.
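The remaining line in this hunk, `adj_data + adj_data.swapaxes(-1, -2)`, symmetrizes each adjacency matrix in the batch so the test graphs are undirected. A minimal sketch of that construction (the `uniform` sampling is an assumption about the elided call above the fragment):

```python
import numpy

rng = numpy.random.RandomState(0)
batch_size, atom_size = 2, 4

# Sample random edge weights per batch element, then symmetrize each
# adjacency matrix by adding its transpose over the last two axes.
adj = rng.uniform(0, 1, size=(batch_size, atom_size, atom_size))
adj = adj.astype(numpy.float32)
adj = adj + adj.swapaxes(-1, -2)

# Every matrix in the batch is now symmetric (undirected graph).
assert numpy.allclose(adj, adj.swapaxes(-1, -2))
```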

@delta2323
Member

delta2323 commented Apr 21, 2018

Regarding the stochastic behavior of dropout in RSGCN, we can fix Dropout's mask by holding an instance of F.Dropout in RSGCN and reusing it in every call. The test of Dropout in Chainer (link) would be helpful. As we do not want such deterministic behavior in ordinary use cases, one solution is to add an option to RSGCN that switches between stochastic and deterministic dropout.
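The idea of holding a dropout instance to fix its mask can be sketched like this. This is an illustrative stand-in, not Chainer's actual F.Dropout: the mask is sampled once on the first call and reused afterwards, so repeated forward passes are deterministic and testable.

```python
import numpy

class FixedDropout:
    """Dropout whose mask is sampled once and reused on every call,
    making the forward pass deterministic (a sketch of the idea in
    the comment above, not Chainer's F.Dropout)."""

    def __init__(self, ratio=0.5, seed=0):
        self.ratio = ratio
        self.seed = seed
        self.mask = None

    def __call__(self, x):
        if self.mask is None:
            rng = numpy.random.RandomState(self.seed)
            self.mask = (rng.rand(*x.shape) >= self.ratio).astype(x.dtype)
        # Scale kept units so the expected value matches training-free mode.
        return x * self.mask / (1.0 - self.ratio)

drop = FixedDropout(ratio=0.5)
x = numpy.ones((4, 3), dtype=numpy.float32)
assert numpy.array_equal(drop(x), drop(x))  # same mask, same output
```

A flag on the model (the "option" mentioned above) could then choose between this and ordinary per-call dropout.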

@delta2323 delta2323 self-assigned this Apr 21, 2018
@corochann
Member Author

Is this change enough?

@delta2323
Member

delta2323 commented Apr 23, 2018

Regarding my previous comment, it is OK with me to address the deterministic behavior of dropout in a separate PR.

@corochann
Member Author

Could you merge this if there are no requested changes?

@delta2323
Member

Example tests passed.

@delta2323
Member

I created #153 for the dropout issue.

@corochann corochann deleted the test_rsgcn branch April 24, 2018 07:12
@mottodora mottodora added this to the 0.3.0 milestone Apr 24, 2018