
WIP: 0 GNN layers on last IPU + Fingerprinting #488

Open
wants to merge 6 commits into main

Conversation

callumm-graphcore
Collaborator

Changelogs

  • Allow having 0 GNN layers (i.e. only the task heads) on the last IPU in a pipeline split
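The changelog entry above can be illustrated with a small sketch. This is a hypothetical validator, not Graphium's actual API: it shows the kind of check that must be relaxed so that only the last pipeline stage may hold zero GNN layers (leaving that IPU for the task heads), while empty intermediate stages remain invalid.

```python
def validate_pipeline_split(layers_per_ipu):
    """Check a per-IPU GNN layer allocation (illustrative, not Graphium's API).

    Before this change, a stage with 0 layers would be rejected; the PR
    allows the *last* stage to be empty (task heads only). Returns the
    total number of GNN layers.
    """
    if any(n < 0 for n in layers_per_ipu):
        raise ValueError("Layer counts must be non-negative")
    # Every stage except the last must hold at least one GNN layer.
    if any(n == 0 for n in layers_per_ipu[:-1]):
        raise ValueError("Only the last IPU may have 0 GNN layers")
    return sum(layers_per_ipu)

# A 16-layer GNN over 4 IPUs, with the final IPU reserved for heads:
total_layers = validate_pipeline_split([6, 5, 5, 0])
```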

Checklist:

  • Was this PR discussed in an issue? It is recommended to first discuss a new feature in a GitHub issue before opening a PR.
  • Add tests to cover the fixed bug(s) or the newly introduced feature(s) (if appropriate).
  • Update the API documentation if a new function is added, or an existing one is deleted.
  • Write concise and explanatory changelogs above.
  • If possible, assign one of the following labels to the PR: feature, fix or test (or ask a maintainer to do it for you).

Discussion related to this PR

@callumm-graphcore callumm-graphcore changed the title Account for possibility of having 0 GNN layers on last IPU WIP: Account for possibility of having 0 GNN layers on last IPU Dec 4, 2023
Collaborator

@DomInvivo DomInvivo left a comment


Approved changes; can merge when all tests pass.

Contributor

@s-maddrellmander s-maddrellmander left a comment


One tiny question, apologies if I'm missing anything here.
But otherwise the logic looks good to me. Interested to see the results of this.

What exactly is the motivation for having no GNN layers on the final IPU, making more space for heads / fine-tuning?


return final_output, extras_to_return

# Keep original code if no extra_return_names
Contributor


Maybe I'm missing something obvious, in which case apologies.
Aren't you missing the original forward pass in the case where extra_return_names is missing?

Suggested change:
# Keep original code if no extra_return_names
else:
    g = self.gnn.forward(g)

My suspicion is that this will fix the failing tests.
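For readers following along, here is a minimal runnable sketch of the control flow the suggestion addresses. The class and stub GNN below are illustrative stand-ins, not the library's actual classes; only the names `extra_return_names`, `self.gnn`, and `extras_to_return` come from the diff context.

```python
class _StubGNN:
    """Stand-in for the GNN module (illustrative only)."""
    def forward(self, g):
        return g + 1  # stand-in for message passing

class ModelSketch:
    """Illustrative model wrapper, not Graphium's actual class."""
    def __init__(self, extra_return_names=None):
        self.gnn = _StubGNN()
        self.extra_return_names = extra_return_names

    def forward(self, g):
        extras_to_return = {}
        if self.extra_return_names:
            # New branch added by the PR: run the GNN and capture the
            # requested intermediate outputs by name.
            g = self.gnn.forward(g)
            extras_to_return = {name: g for name in self.extra_return_names}
        else:
            # The suggested fix: without this else-branch, the GNN
            # forward pass is skipped entirely when no extra returns
            # are requested, which would break the original code path.
            g = self.gnn.forward(g)
        return g, extras_to_return
```

With the else-branch in place, both paths run the GNN: `ModelSketch().forward(0)` returns `(1, {})`, and `ModelSketch(["h"]).forward(0)` returns `(1, {"h": 1})`.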

@callumm-graphcore
Collaborator Author

Sorry Sam, this was more about me sharing some extra code with Kerstin than making a change to the library; I will look at your comments tomorrow.

@s-maddrellmander
Contributor

s-maddrellmander commented Dec 7, 2023 via email


codecov bot commented Dec 20, 2023

Codecov Report

Merging #488 (5ee8bc1) into main (a8e715c) will decrease coverage by 0.96%.
Report is 68 commits behind head on main.
The diff coverage is 3.27%.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #488      +/-   ##
==========================================
- Coverage   71.42%   70.46%   -0.96%     
==========================================
  Files          93       95       +2     
  Lines        8527     8845     +318     
==========================================
+ Hits         6090     6233     +143     
- Misses       2437     2612     +175     
Flag Coverage Δ
unittests 70.46% <3.27%> (-0.96%) ⬇️

Flags with carried forward coverage won't be shown.

Components Coverage Δ
ipu 49.14% <ø> (ø)

@callumm-graphcore callumm-graphcore changed the title WIP: Account for possibility of having 0 GNN layers on last IPU WIP: 0 GNN layers on last IPU + Fingerprinting Dec 20, 2023

4 participants