Currently the GCN and GAT layers allow specifying the activation at each layer of the network. GraphSAGE and HinSAGE do not; instead, they hard-code the activations to "relu" for all layers except the final one, which is "linear".
Let's change the GraphSAGE class to accept an "activations" argument that specifies the activations at all levels. The default behaviour should match the current implementation.
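As a sketch of the proposed behaviour, the defaulting logic could look something like the following. The function name `default_activations` and its signature are hypothetical, for illustration only, and are not stellargraph's actual API:

```python
def default_activations(layer_sizes, activations=None):
    """Resolve per-layer activations for a GraphSAGE-style model.

    Hypothetical helper: if no activations are given, fall back to the
    current hard-coded behaviour ("relu" for all hidden layers, "linear"
    for the final layer), so the default matches the existing implementation.
    """
    n_layers = len(layer_sizes)
    if activations is None:
        activations = ["relu"] * (n_layers - 1) + ["linear"]
    if len(activations) != n_layers:
        raise ValueError(
            f"expected {n_layers} activations, got {len(activations)}"
        )
    return activations

print(default_activations([32, 32, 16]))                  # ['relu', 'relu', 'linear']
print(default_activations([32, 16], ["elu", "softmax"]))  # ['elu', 'softmax']
```

Validating the list length up front keeps a mismatched user-supplied list from failing silently deep inside model construction.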
User Story
As a: user of the stellargraph library
I want: to have similar configuration options for each graph ML class
So that: I can switch between models easily
Done Checklist (Development)
Assumptions of the user story met
Produced code for required functionality
Branch and Pull Request build on CI
Branch and Pull Request pass unit tests on CI
Version number reflects new status
Peer Code Review Performed
Code well commented
Documentation in repo
CHANGELOG.md updated
Team demo