Description
When I try to run:
python3 docker/run_docker.py --left_pdb_filepath project/test_data/4heq_l_u.pdb --right_pdb_filepath project/test_data/4heq_r_u.pdb --input_dataset_dir project/datasets/CASP_CAPRI --ckpt_name project/checkpoints/LitGINI-GeoTran-DilResNet.ckpt --hhsuite_db ~/Data/Databases/uniclust30/uniclust30_2018_08/uniclust30_2018_08
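For reference, a quick sanity check that the host-side input files actually exist before Docker mounts their directory into the container (the paths are the ones from the command above; the check itself is generic):

```shell
# Verify the input PDBs exist on the host before run_docker.py mounts
# their parent directory into the container as /mnt/input_pdbs
for f in project/test_data/4heq_l_u.pdb project/test_data/4heq_r_u.pdb; do
  if [ -f "$f" ]; then
    echo "found: $f"
  else
    echo "MISSING: $f"
  fi
done
```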
I get these logs:
I0621 12:54:27.512626 139977373710144 run_docker.py:59] Mounting /home/ryc/pro/DeepInteract/project/test_data -> /mnt/input_pdbs
I0621 12:54:27.512762 139977373710144 run_docker.py:59] Mounting /home/ryc/pro/DeepInteract/project/test_data -> /mnt/input_pdbs
I0621 12:54:27.512836 139977373710144 run_docker.py:59] Mounting /home/ryc/pro/DeepInteract/project/datasets/CASP_CAPRI -> /mnt/Input
I0621 12:54:27.512908 139977373710144 run_docker.py:59] Mounting /home/ryc/pro/DeepInteract/project/checkpoints -> /mnt/checkpoints
I0621 12:54:27.512977 139977373710144 run_docker.py:59] Mounting /home/ryc/Data/Databases/uniclust30/uniclust30_2018_08 -> /mnt/hhsuite_db
I0621 12:54:30.589913 139977373710144 run_docker.py:135] DGL backend not selected or invalid. Assuming PyTorch for now.
I0621 12:54:30.590292 139977373710144 run_docker.py:135] Using backend: pytorch
I0621 12:54:30.594311 139977373710144 run_docker.py:135] I0621 12:54:30.593440 140113106646848 deepinteract_utils.py:1098] Seeding everything with random seed 42
I0621 12:54:30.594596 139977373710144 run_docker.py:135] Global seed set to 42
I0621 12:54:30.643066 139977373710144 run_docker.py:135] cp: cannot stat '/mnt/input_pdbs/4heq_l_u.pdb': No such file or directory
I0621 12:54:30.654789 139977373710144 run_docker.py:135] cp: cannot stat '/mnt/input_pdbs/4heq_r_u.pdb': No such file or directory
I0621 12:54:30.655230 139977373710144 run_docker.py:135] I0621 12:54:30.654651 140113106646848 deepinteract_utils.py:608] Making interim data set from raw data
I0621 12:54:30.675874 139977373710144 run_docker.py:135] I0621 12:54:30.675035 140113106646848 parse.py:43] 62 requested keys, 60 produced keys, 2 work keys
I0621 12:54:30.676792 139977373710144 run_docker.py:135] I0621 12:54:30.675550 140113106646848 parallel.py:46] Processing 2 inputs.
I0621 12:54:30.676914 139977373710144 run_docker.py:135] I0621 12:54:30.676569 140113106646848 parallel.py:62] Sequential Mode.
I0621 12:54:30.677030 139977373710144 run_docker.py:135] I0621 12:54:30.676633 140113106646848 parse.py:63] Reading /mnt/Input/raw/he/4heq_r_u.pdb
I0621 12:54:30.711622 139977373710144 run_docker.py:135] I0621 12:54:30.710961 140113106646848 parse.py:65] Writing /mnt/Input/raw/he/4heq_r_u.pdb to /mnt/Input/interim/parsed/he/4heq_r_u.pdb.pkl
I0621 12:54:30.713438 139977373710144 run_docker.py:135] I0621 12:54:30.712913 140113106646848 parse.py:67] Done writing /mnt/Input/raw/he/4heq_r_u.pdb to /mnt/Input/interim/parsed/he/4heq_r_u.pdb.pkl
I0621 12:54:30.713546 139977373710144 run_docker.py:135] I0621 12:54:30.713084 140113106646848 parse.py:63] Reading /mnt/Input/raw/he/4heq_l_u.pdb
I0621 12:54:30.744368 139977373710144 run_docker.py:135] I0621 12:54:30.743873 140113106646848 parse.py:65] Writing /mnt/Input/raw/he/4heq_l_u.pdb to /mnt/Input/interim/parsed/he/4heq_l_u.pdb.pkl
I0621 12:54:30.745597 139977373710144 run_docker.py:135] I0621 12:54:30.745240 140113106646848 parse.py:67] Done writing /mnt/Input/raw/he/4heq_l_u.pdb to /mnt/Input/interim/parsed/he/4heq_l_u.pdb.pkl
I0621 12:54:30.745825 139977373710144 run_docker.py:135] I0621 12:54:30.745505 140113106646848 complex.py:38] Getting filenames...
I0621 12:54:30.749119 139977373710144 run_docker.py:135] I0621 12:54:30.748700 140113106646848 complex.py:40] Getting complexes...
I0621 12:54:30.770302 139977373710144 run_docker.py:135] I0621 12:54:30.769680 140113106646848 pair.py:79] 31 requested keys, 30 produced keys, 1 work keys
I0621 12:54:30.770423 139977373710144 run_docker.py:135] I0621 12:54:30.769779 140113106646848 parallel.py:46] Processing 1 inputs.
I0621 12:54:30.770527 139977373710144 run_docker.py:135] I0621 12:54:30.769842 140113106646848 parallel.py:62] Sequential Mode.
I0621 12:54:30.770629 139977373710144 run_docker.py:135] I0621 12:54:30.769901 140113106646848 pair.py:97] Working on 4heq
I0621 12:54:30.773638 139977373710144 run_docker.py:135] I0621 12:54:30.773111 140113106646848 pair.py:102] For complex 4heq found 1 pairs out of 2 chains
I0621 12:54:31.086785 139977373710144 run_docker.py:135] I0621 12:54:31.085926 140113106646848 deepinteract_utils.py:689] Generating PSAIA features from PDB files in /mnt/Input/interim/parsed
I0621 12:54:31.090075 139977373710144 run_docker.py:135] I0621 12:54:31.089508 140113106646848 conservation.py:361] 0 PDB files to process with PSAIA
I0621 12:54:31.090215 139977373710144 run_docker.py:135] I0621 12:54:31.089650 140113106646848 parallel.py:46] Processing 1 inputs.
I0621 12:54:31.090428 139977373710144 run_docker.py:135] I0621 12:54:31.089698 140113106646848 parallel.py:62] Sequential Mode.
I0621 12:54:31.090618 139977373710144 run_docker.py:135] I0621 12:54:31.089743 140113106646848 conservation.py:43] PSAIA'ing /mnt/Input/interim/external_feats/PSAIA/INPUT/pdb_list.fls
I0621 12:54:31.114144 139977373710144 run_docker.py:135] I0621 12:54:31.113151 140113106646848 conservation.py:200] For generating protrusion indices, spent 00.02 PSAIA'ing, 00.00 writing, and 00.02 overall.
I0621 12:54:31.114319 139977373710144 run_docker.py:135] I0621 12:54:31.113927 140113106646848 deepinteract_utils.py:706] Generating profile HMM features from PDB files in /mnt/Input/interim/parsed
I0621 12:54:31.125687 139977373710144 run_docker.py:135] I0621 12:54:31.125225 140113106646848 conservation.py:458] 62 requested keys, 60 produced keys, 2 work filenames
I0621 12:54:31.125820 139977373710144 run_docker.py:135] I0621 12:54:31.125341 140113106646848 conservation.py:464] 2 work filenames
I0621 12:54:31.126219 139977373710144 run_docker.py:135] I0621 12:54:31.125793 140113106646848 parallel.py:46] Processing 2 inputs.
I0621 12:54:31.126399 139977373710144 run_docker.py:135] I0621 12:54:31.125915 140113106646848 parallel.py:62] Sequential Mode.
I0621 12:54:31.160443 139977373710144 run_docker.py:135] I0621 12:54:31.159958 140113106646848 conservation.py:152] HHblits'ing /mnt/Input/interim/external_feats/he/work/4heq_l_u.pdb-1-A.fa
I0621 12:55:03.191800 139977373710144 run_docker.py:135] I0621 12:55:03.190688 140113106646848 conservation.py:238] For 1 profile HMMs generated from 4heq_l_u.pdb, spent 32.06 blitsing, 00.00 writing, and 32.06 overall.
I0621 12:55:03.224250 139977373710144 run_docker.py:135] I0621 12:55:03.223448 140113106646848 conservation.py:152] HHblits'ing /mnt/Input/interim/external_feats/he/work/4heq_r_u.pdb-1-B.fa
I0621 12:55:37.966540 139977373710144 run_docker.py:135] I0621 12:55:37.965222 140113106646848 conservation.py:238] For 1 profile HMMs generated from 4heq_r_u.pdb, spent 34.77 blitsing, 00.00 writing, and 34.77 overall.
I0621 12:55:37.966913 139977373710144 run_docker.py:135] I0621 12:55:37.965721 140113106646848 deepinteract_utils.py:722] Starting postprocessing for all unprocessed pairs in /mnt/Input/interim/pairs
I0621 12:55:37.967144 139977373710144 run_docker.py:135] I0621 12:55:37.965833 140113106646848 deepinteract_utils.py:729] Looking for all pairs in /mnt/Input/interim/pairs
I0621 12:55:37.972153 139977373710144 run_docker.py:135] I0621 12:55:37.971457 140113106646848 deepinteract_utils.py:743] Found 1 work pair(s) in /mnt/Input/interim/pairs
I0621 12:55:37.972460 139977373710144 run_docker.py:135] I0621 12:55:37.971827 140113106646848 parallel.py:46] Processing 1 inputs.
I0621 12:55:37.972671 139977373710144 run_docker.py:135] I0621 12:55:37.971918 140113106646848 parallel.py:62] Sequential Mode.
I0621 12:55:41.316108 139977373710144 run_docker.py:135] /opt/conda/lib/python3.8/site-packages/Bio/PDB/vectors.py:357: RuntimeWarning: invalid value encountered in double_scalars
I0621 12:55:41.316489 139977373710144 run_docker.py:135] c = (self * other) / (n1 * n2)
I0621 12:55:41.316720 139977373710144 run_docker.py:135] /opt/conda/lib/python3.8/site-packages/Bio/PDB/vectors.py:357: RuntimeWarning: invalid value encountered in double_scalars
I0621 12:55:41.316918 139977373710144 run_docker.py:135] c = (self * other) / (n1 * n2)
I0621 12:55:41.317103 139977373710144 run_docker.py:135] I0621 12:55:41.314721 140113106646848 dips_plus_utils.py:536] Normal vector missing for df0 residue 9 in chain A in file /mnt/Input/raw/he/4heq_l_u.pdb
I0621 12:55:41.336150 139977373710144 run_docker.py:135] I0621 12:55:41.335281 140113106646848 dips_plus_utils.py:536] Normal vector missing for df0 residue 13 in chain A in file /mnt/Input/raw/he/4heq_l_u.pdb
I0621 12:55:41.356255 139977373710144 run_docker.py:135] I0621 12:55:41.355384 140113106646848 dips_plus_utils.py:536] Normal vector missing for df0 residue 17 in chain A in file /mnt/Input/raw/he/4heq_l_u.pdb
I0621 12:55:41.427843 139977373710144 run_docker.py:135] I0621 12:55:41.426913 140113106646848 dips_plus_utils.py:536] Normal vector missing for df0 residue 30 in chain A in file /mnt/Input/raw/he/4heq_l_u.pdb
I0621 12:55:41.506287 139977373710144 run_docker.py:135] I0621 12:55:41.505459 140113106646848 dips_plus_utils.py:536] Normal vector missing for df0 residue 45 in chain A in file /mnt/Input/raw/he/4heq_l_u.pdb
I0621 12:55:41.526054 139977373710144 run_docker.py:135] I0621 12:55:41.525439 140113106646848 dips_plus_utils.py:536] Normal vector missing for df0 residue 49 in chain A in file /mnt/Input/raw/he/4heq_l_u.pdb
I0621 12:55:41.560204 139977373710144 run_docker.py:135] I0621 12:55:41.559483 140113106646848 dips_plus_utils.py:536] Normal vector missing for df0 residue 56 in chain A in file /mnt/Input/raw/he/4heq_l_u.pdb
I0621 12:55:41.585425 139977373710144 run_docker.py:135] I0621 12:55:41.584762 140113106646848 dips_plus_utils.py:536] Normal vector missing for df0 residue 61 in chain A in file /mnt/Input/raw/he/4heq_l_u.pdb
I0621 12:55:41.686717 139977373710144 run_docker.py:135] I0621 12:55:41.686072 140113106646848 dips_plus_utils.py:536] Normal vector missing for df0 residue 82 in chain A in file /mnt/Input/raw/he/4heq_l_u.pdb
I0621 12:55:41.720789 139977373710144 run_docker.py:135] I0621 12:55:41.720090 140113106646848 dips_plus_utils.py:536] Normal vector missing for df0 residue 89 in chain A in file /mnt/Input/raw/he/4heq_l_u.pdb
I0621 12:55:41.735666 139977373710144 run_docker.py:135] I0621 12:55:41.735008 140113106646848 dips_plus_utils.py:536] Normal vector missing for df0 residue 92 in chain A in file /mnt/Input/raw/he/4heq_l_u.pdb
I0621 12:55:41.745175 139977373710144 run_docker.py:135] I0621 12:55:41.744497 140113106646848 dips_plus_utils.py:536] Normal vector missing for df0 residue 94 in chain A in file /mnt/Input/raw/he/4heq_l_u.pdb
I0621 12:55:41.788900 139977373710144 run_docker.py:135] I0621 12:55:41.788190 140113106646848 dips_plus_utils.py:536] Normal vector missing for df0 residue 103 in chain A in file /mnt/Input/raw/he/4heq_l_u.pdb
I0621 12:55:41.852835 139977373710144 run_docker.py:135] I0621 12:55:41.852125 140113106646848 dips_plus_utils.py:536] Normal vector missing for df0 residue 116 in chain A in file /mnt/Input/raw/he/4heq_l_u.pdb
I0621 12:55:41.909060 139977373710144 run_docker.py:135] I0621 12:55:41.908362 140113106646848 dips_plus_utils.py:536] Normal vector missing for df0 residue 128 in chain A in file /mnt/Input/raw/he/4heq_l_u.pdb
I0621 12:55:42.041210 139977373710144 run_docker.py:135] I0621 12:55:42.040462 140113106646848 dips_plus_utils.py:630] Normal vector missing for df1 residue 9 in chain B in file /mnt/Input/raw/he/4heq_r_u.pdb
I0621 12:55:42.059900 139977373710144 run_docker.py:135] I0621 12:55:42.059195 140113106646848 dips_plus_utils.py:630] Normal vector missing for df1 residue 13 in chain B in file /mnt/Input/raw/he/4heq_r_u.pdb
I0621 12:55:42.079119 139977373710144 run_docker.py:135] I0621 12:55:42.078433 140113106646848 dips_plus_utils.py:630] Normal vector missing for df1 residue 17 in chain B in file /mnt/Input/raw/he/4heq_r_u.pdb
I0621 12:55:42.143947 139977373710144 run_docker.py:135] I0621 12:55:42.143125 140113106646848 dips_plus_utils.py:630] Normal vector missing for df1 residue 30 in chain B in file /mnt/Input/raw/he/4heq_r_u.pdb
I0621 12:55:42.215802 139977373710144 run_docker.py:135] I0621 12:55:42.215100 140113106646848 dips_plus_utils.py:630] Normal vector missing for df1 residue 45 in chain B in file /mnt/Input/raw/he/4heq_r_u.pdb
I0621 12:55:42.234920 139977373710144 run_docker.py:135] I0621 12:55:42.234243 140113106646848 dips_plus_utils.py:630] Normal vector missing for df1 residue 49 in chain B in file /mnt/Input/raw/he/4heq_r_u.pdb
I0621 12:55:42.267936 139977373710144 run_docker.py:135] I0621 12:55:42.267218 140113106646848 dips_plus_utils.py:630] Normal vector missing for df1 residue 56 in chain B in file /mnt/Input/raw/he/4heq_r_u.pdb
I0621 12:55:42.292104 139977373710144 run_docker.py:135] I0621 12:55:42.291366 140113106646848 dips_plus_utils.py:630] Normal vector missing for df1 residue 61 in chain B in file /mnt/Input/raw/he/4heq_r_u.pdb
I0621 12:55:42.392189 139977373710144 run_docker.py:135] I0621 12:55:42.391432 140113106646848 dips_plus_utils.py:630] Normal vector missing for df1 residue 82 in chain B in file /mnt/Input/raw/he/4heq_r_u.pdb
I0621 12:55:42.426449 139977373710144 run_docker.py:135] I0621 12:55:42.425676 140113106646848 dips_plus_utils.py:630] Normal vector missing for df1 residue 89 in chain B in file /mnt/Input/raw/he/4heq_r_u.pdb
I0621 12:55:42.440944 139977373710144 run_docker.py:135] I0621 12:55:42.440204 140113106646848 dips_plus_utils.py:630] Normal vector missing for df1 residue 92 in chain B in file /mnt/Input/raw/he/4heq_r_u.pdb
I0621 12:55:42.450548 139977373710144 run_docker.py:135] I0621 12:55:42.449811 140113106646848 dips_plus_utils.py:630] Normal vector missing for df1 residue 94 in chain B in file /mnt/Input/raw/he/4heq_r_u.pdb
I0621 12:55:42.494798 139977373710144 run_docker.py:135] I0621 12:55:42.493852 140113106646848 dips_plus_utils.py:630] Normal vector missing for df1 residue 103 in chain B in file /mnt/Input/raw/he/4heq_r_u.pdb
I0621 12:55:42.557198 139977373710144 run_docker.py:135] I0621 12:55:42.556405 140113106646848 dips_plus_utils.py:630] Normal vector missing for df1 residue 116 in chain B in file /mnt/Input/raw/he/4heq_r_u.pdb
I0621 12:55:42.614418 139977373710144 run_docker.py:135] I0621 12:55:42.613722 140113106646848 dips_plus_utils.py:630] Normal vector missing for df1 residue 128 in chain B in file /mnt/Input/raw/he/4heq_r_u.pdb
I0621 12:55:42.718224 139977373710144 run_docker.py:135] I0621 12:55:42.717459 140113106646848 deepinteract_utils.py:773] Imputing missing feature values for given inputs
I0621 12:55:42.719177 139977373710144 run_docker.py:135] I0621 12:55:42.718771 140113106646848 parallel.py:46] Processing 31 inputs.
I0621 12:55:42.719329 139977373710144 run_docker.py:135] I0621 12:55:42.718858 140113106646848 parallel.py:62] Sequential Mode.
I0621 12:55:48.405303 139977373710144 run_docker.py:135] I0621 12:55:48.404334 140113106646848 lit_model_predict_docker.py:99] Loading complex for prediction, l_chain: /mnt/input_pdbs/4heq_l_u.pdb, r_chain: /mnt/input_pdbs/4heq_r_u.pdb
I0621 12:55:49.322944 139977373710144 run_docker.py:135] /opt/conda/lib/python3.8/site-packages/torchmetrics/utilities/prints.py:36: UserWarning: Metric AUROC will save all targets and predictions in buffer. For large datasets this may lead to large memory footprint.
I0621 12:55:49.323281 139977373710144 run_docker.py:135] warnings.warn(*args, **kwargs)
I0621 12:55:49.323480 139977373710144 run_docker.py:135] /opt/conda/lib/python3.8/site-packages/torchmetrics/utilities/prints.py:36: UserWarning: Metric AveragePrecision will save all targets and predictions in buffer. For large datasets this may lead to large memory footprint.
I0621 12:55:49.323687 139977373710144 run_docker.py:135] warnings.warn(*args, **kwargs)
I0621 12:55:49.323897 139977373710144 run_docker.py:135] /opt/conda/lib/python3.8/site-packages/pytorch_lightning/trainer/connectors/accelerator_connector.py:792: UserWarning: You are running on single node with no parallelization, so distributed has no effect.
I0621 12:55:49.324064 139977373710144 run_docker.py:135] rank_zero_warn("You are running on single node with no parallelization, so distributed has no effect.")
I0621 12:55:49.324226 139977373710144 run_docker.py:135] GPU available: False, used: False
I0621 12:55:49.324383 139977373710144 run_docker.py:135] TPU available: False, using: 0 TPU cores
I0621 12:55:49.324540 139977373710144 run_docker.py:135] IPU available: False, using: 0 IPUs
I0621 12:55:49.377713 139977373710144 run_docker.py:135] Setting the default backend to "pytorch". You can change it in the ~/.dgl/config.json file or export the DGLBACKEND environment variable. Valid options are: pytorch, mxnet, tensorflow (all lowercase)
I0621 12:55:49.378072 139977373710144 run_docker.py:135] /opt/conda/lib/python3.8/site-packages/pytorch_lightning/trainer/data_loading.py:105: UserWarning: The dataloader, predict dataloader 0, does not have many workers which may be a bottleneck. Consider increasing the value of the num_workers argument (try 64 which is the number of cpus on this machine) in the DataLoader init to improve performance.
I0621 12:55:49.378295 139977373710144 run_docker.py:135] rank_zero_warn(
I0621 12:55:50.089681 139977373710144 run_docker.py:135] I0621 12:55:50.088823 140113106646848 lit_model_predict_docker.py:298] Saved predicted contact probability map for 4heq as /mnt/input_pdbs/4heq_contact_prob_map.npy
I0621 12:55:50.092932 139977373710144 run_docker.py:135] I0621 12:55:50.092391 140113106646848 lit_model_predict_docker.py:307] Saved learned node representations for the first chain graph of 4heq as /mnt/input_pdbs/4heq_graph1_node_feats.npy
I0621 12:55:50.093075 139977373710144 run_docker.py:135] I0621 12:55:50.092499 140113106646848 lit_model_predict_docker.py:308] Saved learned edge representations for the first chain graph of 4heq as /mnt/input_pdbs/4heq_graph1_edge_feats.npy
I0621 12:55:50.093154 139977373710144 run_docker.py:135] I0621 12:55:50.092547 140113106646848 lit_model_predict_docker.py:309] Saved learned node representations for the second chain graph of 4heq as /mnt/input_pdbs/4heq_graph2_node_feats.npy
I0621 12:55:50.093222 139977373710144 run_docker.py:135] I0621 12:55:50.092598 140113106646848 lit_model_predict_docker.py:310] Saved learned edge representations for the second chain graph of 4heq as /mnt/input_pdbs/4heq_graph2_edge_feats.npy
Predicting: 100%|██████████| 1/1 [00:00<00:00, 1.41it/s]
Predicting: 0it [00:00, ?it/s]
As a result, the final generated dill file does not work properly. Note also the "cp: cannot stat" errors near the top of the log, which show that the input PDB files were not found under /mnt/input_pdbs even though that directory was mounted from project/test_data.
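For anyone trying to reproduce the failure, dill files extend the pickle protocol, so a minimal load check can be sketched as below (the commented-out path is a hypothetical example, not the actual output location from this run):

```python
import pickle


def load_pair(path):
    """Load a serialized pair file. Dill extends the pickle protocol,
    so a dill-written file can usually be read back with pickle (or
    dill.load) as long as the classes it references are importable."""
    with open(path, "rb") as f:
        return pickle.load(f)


# Hypothetical output path -- substitute the dill file your run produced:
# pair = load_pair("project/datasets/CASP_CAPRI/final/raw/he/4heq.dill")
```

If this load step raises (e.g. EOFError or UnpicklingError), the file was likely written from incomplete inputs, which would match the earlier "cp: cannot stat" errors.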