Porting DTNNEmbedding Layer #3415
Conversation
@GreatRSingh The PR looks good overall; just address the minor review comments I have given.
@shaipranesh2 Ready for review.
LGTM!
init: str, optional
    Weight initialization for filters.
Need to specify what options are possible. Also, this parameter needs a more descriptive name than 'init'.
I think the name 'init' is apt here. Can you suggest what I should change it to?
I have listed the options available for 'init'. I skipped some of them, as there seem to be too many; do you want me to write out all of them?
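For context, here is a minimal sketch of how a string-valued 'init' parameter can be resolved against the initializers in torch.nn.init. The initializer names below are real torch.nn.init functions, but the class shape and defaults are assumptions for illustration, not the merged code:

```python
import torch
import torch.nn as nn


class DTNNEmbedding(nn.Module):
    """Sketch of the embedding layer under review.

    Parameters
    ----------
    n_embedding: int
        Dimension of each atom embedding vector.
    periodic_table_length: int
        Number of atom types, i.e. rows of the embedding matrix.
    init: str, optional (default 'xavier_uniform_')
        Weight initialization for the embedding matrix. Any in-place
        initializer from torch.nn.init can be named here, for example
        'xavier_uniform_', 'xavier_normal_', 'kaiming_uniform_' or
        'kaiming_normal_'.
    """

    def __init__(self,
                 n_embedding: int = 30,
                 periodic_table_length: int = 30,
                 init: str = 'xavier_uniform_'):
        super().__init__()
        # Resolve the initializer by name and apply it in place to a
        # freshly allocated (periodic_table_length, n_embedding) matrix.
        init_func = getattr(nn.init, init)
        self.embedding_list = nn.Parameter(
            init_func(torch.empty(periodic_table_length, n_embedding)))
```

Listing a few representative names and pointing readers to torch.nn.init for the full set is one way to avoid enumerating every option in the docstring.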
def forward(self, inputs):
    """
    parent layers: atom_number
    """
    atom_number = inputs
Needs type hints. What is 'inputs'?
The docstring needs to be updated as well, it seems.
I have made some changes here. I think something better can be done; do you have any suggestions?
def forward(self, inputs: torch.Tensor):
    """
    parent layers: atom_number
I'm not clear on what this is supposed to mean.
It means that the layer receives a tensor called 'atom_number' as input from the parent layers.
Here, 'parent layers' refers to the place where this layer is used.
atom_number is used to choose the required embeddings.
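To make the lookup concrete, here is a small sketch using torch.nn.functional.embedding, one standard way to select rows of an embedding matrix by index. The matrix here is random and purely illustrative:

```python
import torch
import torch.nn.functional as F

# Illustrative embedding matrix: one 30-dimensional row per atom type.
embedding_list = torch.randn(30, 30)

# atom_number holds atomic numbers; each entry selects one row.
atom_number = torch.tensor([1, 6, 8])  # H, C, O
atom_embeddings = F.embedding(atom_number, embedding_list)
print(atom_embeddings.shape)  # torch.Size([3, 30])
```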
@tonydavis629 @shaipranesh2 Ready for review.
"""parent layers: atom_number | ||
|
||
Parameters | ||
---------- | ||
inputs: torch.Tensor | ||
Tensor containing the Atom number of Atoms whose embeddings are requested. | ||
|
||
""" |
Add more explanation of what the function actually does, and remove 'parent layers' if it has no explanation. The docstring also needs a Returns section giving the return value's type, shape, and meaning.
You have a yapf error in layers.py
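A sketch of what the forward method might look like once a Returns section is added, assuming the class sketched earlier; the shapes and wording are illustrative, not the merged text:

```python
def forward(self, inputs: torch.Tensor) -> torch.Tensor:
    """Look up the embedding vector for each atom.

    Parameters
    ----------
    inputs: torch.Tensor
        Atom numbers of the atoms whose embeddings are requested,
        of shape (n_atoms,).

    Returns
    -------
    atom_embeddings: torch.Tensor
        Embedding vector of each atom, of shape (n_atoms, n_embedding).
    """
    atom_number = inputs
    # Each atom number indexes one row of the embedding matrix.
    atom_embeddings = torch.nn.functional.embedding(atom_number,
                                                    self.embedding_list)
    return atom_embeddings
```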
LGTM from my end. Will merge once @tonydavis629 and @shaipranesh2 approve.
Approved
Description
This PR ports the DTNNEmbedding layer to PyTorch.
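As a quick illustration, a hypothetical usage of the ported layer, combining the class and forward sketches above (argument names are assumptions):

```python
import torch

layer = DTNNEmbedding(n_embedding=30,
                      periodic_table_length=30,
                      init='xavier_uniform_')
atom_number = torch.tensor([1, 6, 8])  # H, C, O
embeddings = layer(atom_number)        # shape: (3, 30)
```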
Type of change
Please check the option that is related to your PR.
Checklist
- Run yapf -i <modified file> and check no errors (yapf version must be 0.32.0)
- Run mypy -p deepchem and check no errors
- Run flake8 <modified file> --count and check no errors
- Run python -m doctest <modified file> and check no errors