
Error on tensor proto size #154

Closed
rkflygaard opened this issue Aug 27, 2021 · 5 comments
Labels
error report Something isn't working

Comments

@rkflygaard

Hi,
I'm getting an error when trying to run AlphaFold on a particular sequence: a tensor proto cannot exceed 2GB (see the attached error log). I have successfully run AlphaFold on many other sequences without any problem, but for this particular sequence I get this tensor proto error no matter what I try (e.g. running with the full database vs. the reduced one, or using different GPU nodes to avoid running out of memory).

Any help on what I can try would be much appreciated!
Thanks a lot!

AF2_slurm_56393_snoke5_alphafold_56393_4294967294.txt
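For context on why this error appears at all: TensorFlow serializes graph constants as protocol buffers, and a single serialized message is capped at 2 GiB, so a large MSA feature tensor embedded as a constant can blow past the limit. A rough, hypothetical back-of-envelope sketch (the sequence count, feature depth, and dtype below are illustrative assumptions, not values from this log):

```python
# Illustrative estimate of why a deep MSA can exceed TensorFlow's
# 2 GiB tensor-proto serialization limit. All numbers are assumptions.
PROTO_LIMIT = 2 * 1024**3  # 2 GiB protobuf message cap


def msa_feature_bytes(num_seqs, seq_len, channels=49, dtype_bytes=4):
    """Rough size in bytes of a float32 MSA feature tensor of shape
    (num_seqs, seq_len, channels)."""
    return num_seqs * seq_len * channels * dtype_bytes


# A deep alignment for a long target, e.g. 100,000 sequences x 2,500 residues:
size = msa_feature_bytes(100_000, 2_500)
print(f"{size / 1024**3:.1f} GiB")   # ~45.6 GiB
print(size > PROTO_LIMIT)            # True: far over the 2 GiB cap
```

This is only an order-of-magnitude argument, but it shows why the error depends on MSA depth rather than sequence length alone.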

@abridgland abridgland added the error report Something isn't working label Sep 1, 2021
@yzyw0702

yzyw0702 commented Sep 2, 2021

The same issue occurred in my case when the input sequence is very long (>1500 residues).

@rkflygaard
Author

> Same issue took place in my case, when the input sequence is very long (>1500).

The strange thing for me, though, is that I have successfully run sequences of >2000 residues without any errors or problems, but this single input sequence always fails with the tensor proto size error.

@Augustin-Zidek
Collaborator

Hi, this is most likely because the MSA is too big. There is a suggested workaround (reducing the size of the MSA) in: #71
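A minimal sketch of that workaround, i.e. capping the MSA depth before it reaches the model. The helper below is hypothetical (it is not part of the AlphaFold codebase): it keeps only the first `max_seqs` entries of an A3M alignment, relying on the fact that each A3M entry begins with a `>` header line:

```python
# Hypothetical sketch of the #71 workaround: truncate an A3M-format MSA
# to its first `max_seqs` sequences so the resulting feature tensor
# stays under the 2 GiB tensor-proto limit.
def truncate_a3m(text, max_seqs):
    """Return `text` (A3M alignment) limited to its first `max_seqs`
    entries. Each entry starts with a '>' header line."""
    out, kept = [], 0
    for line in text.splitlines():
        if line.startswith(">"):
            kept += 1
            if kept > max_seqs:
                break
        out.append(line)
    return "\n".join(out) + "\n"


a3m = ">query\nMKV\n>hit1\nMKL\n>hit2\nMRV\n"
print(truncate_a3m(a3m, 2))
# >query
# MKV
# >hit1
# MKL
```

Note that truncating this way keeps whichever sequences happen to come first in the file; a filter by e-value or sequence identity may give a more informative reduced MSA.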

@yzyw0702

Thanks Zidek. One thing to add: I found that AF2 uses tf.compat.v1 to create tensors; switching to native TF2 tensors, or to another package, might be an alternative way to solve the issue.

@Augustin-Zidek
Collaborator

Addressed in 0be2b30.
