I'm trying to predict the binding between my peptide sequences and MHC.
It works well when the peptide length is 9,
but when the peptide length is 10
it throws this error:
0%| | 0/1 [00:00<?, ?it/s]
Traceback (most recent call last):
File "/awork10-3/DeepLigand/elmo_embed.py", line 72, in
context_ids = batcher.batch_sentences(tokenized_context, max_length=args.max_len)
TypeError: batch_sentences() got an unexpected keyword argument 'max_length'
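The TypeError says the installed version of `Batcher.batch_sentences` (from the ELMo/bilm code the script imports) does not accept a `max_length` keyword. A minimal sketch of a version-tolerant call, using a hypothetical stand-in for `batch_sentences` since the real `Batcher` is not available here, would check the signature before passing the keyword:

```python
import inspect

# Hypothetical stand-in for Batcher.batch_sentences from a bilm release
# whose signature takes only the sentence list (no max_length keyword).
def batch_sentences(sentences):
    return [list(s) for s in sentences]

# Only forward max_length if the installed function actually accepts it.
params = inspect.signature(batch_sentences).parameters
kwargs = {"max_length": 10} if "max_length" in params else {}

context_ids = batch_sentences([["A", "C", "D"]], **kwargs)
print(context_ids)  # [['A', 'C', 'D']]
```

This is only a compatibility sketch; the underlying fix is likely matching the bilm version that `elmo_embed.py` was written against.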
data embedding
running python /awork10-3/DeepLigand/embed_plusrelation_elmo_massspec.py --mhcfile /awork10-3/DeepLigand/output_x/test.mhc --pepfile /awork10-3/DeepLigand/output_x/test.pep.padded --labelfile /awork10-3/DeepLigand/output_x/test.label --relationfile /awork10-3/DeepLigand/output_x/test.relation --masslabelfile /awork10-3/DeepLigand/output_x/test.masslabel --elmodir /awork10-3/DeepLigand/output_x/test.pep.token --elmotag elmo_embeddingds_alltrain.epitope.elmo --mapper /awork10-3/DeepLigand/data/onehot_first20BLOSUM50 --outfileprefix /awork10-3/DeepLigand/output_x/test.h5.batch --expected_pep_len 9
Traceback (most recent call last):
File "/awork10-3/DeepLigand/embed_plusrelation_elmo_massspec.py", line 107, in
embed_all(f1, f2, f3, f4, f5, args.elmodir, args.elmotag, mapper, args.outfileprefix)
File "/awork10-3/DeepLigand/embed_plusrelation_elmo_massspec.py", line 52, in embed_all
assert(exists(join(elmo_dir, 'batch'+str(elmo_cnt)+'.'+elmotag+'.hdf5')))
AssertionError
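This AssertionError looks like a downstream effect of the first crash: `embed_plusrelation_elmo_massspec.py` asserts that the per-batch ELMo HDF5 file exists, and since `elmo_embed.py` failed earlier, that file was never written. A small sketch of the same filename check (the directory here is a throwaway temp dir, not the real `--elmodir`) illustrates why the assert fires when the embedding step produced no output:

```python
import tempfile
from os.path import exists, join

def batch_file(elmo_dir, elmo_cnt, elmotag):
    # Same naming scheme as the assert in embed_plusrelation_elmo_massspec.py
    return join(elmo_dir, "batch" + str(elmo_cnt) + "." + elmotag + ".hdf5")

with tempfile.TemporaryDirectory() as d:
    p = batch_file(d, 0, "elmo_embeddingds_alltrain.epitope.elmo")
    # The embedding step never wrote the file, so the check fails.
    print(exists(p))  # False
```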
Looking at the training data and the examples, I see no reason
why a 10-mer peptide should not work at all.
Can somebody suggest what kind of mistake I made?
Thanks in advance.
Hi, my name is Jong hui Hong.