
add pytorch/torchvision porting and comparison script #225

Merged 5 commits into master on Apr 24, 2023

Conversation

CarloLucibello
Member

No description provided.

@theabhirath
Member

This script and an older version that @darsnack had posted as a gist both work, but I find that loadmodel! takes an enormous amount of memory for even small models like VGG-11 or ResNet-18. However, I have only tried this on my PC and cannot be sure. Can someone else confirm whether this is an issue?

@darsnack
Member

Here is also a script for porting weights.

@theabhirath
Member

theabhirath commented Apr 23, 2023

> Here is also a script for porting weights.

Yep, that's the one I mentioned. Here too, loadmodel! seems to eat up a ton of memory for me on my PC.

@ToucheSir
Member

Can you take a quick allocation profile and show the flame graph here, using PProf or the VS Code profile view?
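For reference, an allocation profile of the loading step might look roughly like this. This is a sketch, not a command from the thread; it assumes Julia ≥ 1.8 (for `Profile.Allocs`), the PProf.jl package, and that `model` and `weights` are already defined from the porting script.

```julia
using Profile, PProf

# Clear any previous allocation data, then record every allocation
# (sample_rate = 1) made while loading the ported weights.
Profile.Allocs.clear()
Profile.Allocs.@profile sample_rate=1 Flux.loadmodel!(model, weights)

# Export the results and open the interactive flame graph in a browser.
PProf.Allocs.pprof()
```

`PProf.Allocs.pprof()` serves the flame graph locally, so this needs an interactive session rather than a CI run.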

@theabhirath
Member

Sure, got this with PProf. It's a screenshot of the flame graph, but if you want the raw results file I can try to upload it somewhere else (I don't think GitHub supports sharing that).

[Screenshot: PProf allocation flame graph, 2023-04-24]

@CarloLucibello CarloLucibello changed the title add pytorch comparison script add pytorch/torchvision porting and comparison script Apr 24, 2023
@CarloLucibello CarloLucibello merged commit ebc9f04 into master Apr 24, 2023
@theabhirath
Member

Ah, @ToucheSir, I neglected to mention something which, on a re-run, I find might be the cause of this. I was using @darsnack's script to port weights, but at the end I added an fmapstructure(identity, weights) before saving, because IIRC we had discussed that this would avoid saving information like the RNGs and allow us to save and load weights independently of the Julia version (as would be ideal). When I remove that line, the memory requirement goes down to something manageable again. Is loadmodel! somehow very bad at loading models stored in this format? Can this be fixed?
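For context, the save-side workflow being described is roughly the following. This is a sketch rather than the actual porting script; the toy model and the use of JLD2 for serialization are assumptions.

```julia
using Flux, Functors, JLD2

model = Chain(Dense(2 => 3, relu), Dense(3 => 1))

# fmapstructure strips the layer types, keeping only a nested
# NamedTuple of plain arrays, so no RNG state or other
# Julia-version-specific objects end up in the saved file.
weights = fmapstructure(identity, model)
jldsave("weights.jld2"; weights)

# Later: rebuild the same architecture and copy the saved arrays in.
model2 = Chain(Dense(2 => 3, relu), Dense(3 => 1))
Flux.loadmodel!(model2, JLD2.load("weights.jld2", "weights"))
```

loadmodel! walks the destination model and the saved structure in lockstep, which is why the nested-NamedTuple form can be loaded back into a freshly constructed model.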

@theabhirath theabhirath mentioned this pull request Apr 25, 2023
@CarloLucibello CarloLucibello deleted the cl/include branch July 17, 2023 05:53