Node embedding projections for ForceRegressionTask
#204
Conversation
This explicitly passes node level embeddings to the output head and performs a sum reduction by default.
Signed-off-by: Lee, Kin Long Kelvin <kin.long.kelvin.lee@intel.com>
The force task tests pass.
LGTM! Just out of curiosity, but what would the use case be for using a readout other than 'sum' for a graph level energy?
You can also do a mean reduction, or if you want to be super fancy, an attention-weighted one (like in graph attention). Sum is more physically meaningful than a mean reduction IMO.
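The sum-versus-mean distinction discussed above can be sketched as follows. This is a minimal illustration (not the repository's actual implementation): given per-node energy contributions and a batch index mapping each node to its graph, a `"sum"` readout adds node contributions per graph, while `"mean"` divides by the node count. The function name `reduce_node_energies` is hypothetical.

```python
import torch

def reduce_node_energies(
    node_energies: torch.Tensor, batch: torch.Tensor, reduction: str = "sum"
) -> torch.Tensor:
    """Pool per-node energy contributions into per-graph energies.

    node_energies: shape [num_nodes], scalar contribution per node
    batch: shape [num_nodes], graph index for each node
    """
    num_graphs = int(batch.max()) + 1
    out = torch.zeros(num_graphs, dtype=node_energies.dtype)
    # out[batch[i]] += node_energies[i] for every node i
    out.scatter_add_(0, batch, node_energies)
    if reduction == "mean":
        counts = torch.bincount(batch, minlength=num_graphs).clamp(min=1)
        out = out / counts
    return out

# Two graphs: nodes 0-1 belong to graph 0, nodes 2-4 to graph 1.
energies = torch.tensor([1.0, 2.0, 3.0, 4.0, 5.0])
batch = torch.tensor([0, 0, 1, 1, 1])
print(reduce_node_energies(energies, batch, "sum"))   # tensor([ 3., 12.])
print(reduce_node_energies(energies, batch, "mean"))  # tensor([1.5000, 4.0000])
```

A sum readout is extensive (a system with twice the atoms gets roughly twice the energy), which is why it is the more physically meaningful default for total energies.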
This PR refactors `ForceRegressionTask` to regress energies by reducing node-level contributions, as opposed to regressing from a system-level embedding. It adds `readout` functions whose behavior depends on whether a graph structure is passed and on which framework is used. The reduction can be configured with the task hyperparameter `embedding_reduction_type`, and `process_embeddings` will pack the node energy contributions into the output dictionary.
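A hedged sketch of how a reduction keyword like `embedding_reduction_type` might be dispatched and how node contributions could be packed alongside the reduced embedding. Only `embedding_reduction_type` and the `process_embeddings` name come from the PR description; the dictionary keys and internals here are assumptions, not the toolkit's actual API.

```python
import torch

# Map the hyperparameter string to a reduction over the node dimension.
REDUCTIONS = {
    "sum": lambda e: e.sum(dim=0),
    "mean": lambda e: e.mean(dim=0),
}

def process_embeddings(
    node_embeddings: torch.Tensor, embedding_reduction_type: str = "sum"
) -> dict:
    """Reduce node embeddings [num_nodes, dim] to a graph-level vector,
    keeping the node-level embeddings in the output dictionary as well."""
    reduce_fn = REDUCTIONS[embedding_reduction_type]
    return {
        "graph_embedding": reduce_fn(node_embeddings),   # shape [dim]
        "node_embeddings": node_embeddings,              # shape [num_nodes, dim]
    }

out = process_embeddings(torch.ones(4, 8), embedding_reduction_type="mean")
print(out["graph_embedding"].shape)  # torch.Size([8])
```

Keeping the node-level tensor in the output dictionary is what allows a downstream output head to produce per-node energy contributions (and hence forces) rather than only a single system-level prediction.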