try except of max_atoms/bond error #61
I added two print statements to print fbonds.shape[0] and fatoms.shape[0], and they gave me these numbers:
fbonds.shape[0] --> 403
fatoms.shape[0] --> 183
When I checked the MAX_ATOM and MAX_BOND variables they were 200 and 400, respectively. So should I increase the MAX_BOND value to 500, for example?
Yeah. You can increase MAX_ATOM and MAX_BOND. The current setting is only for small molecules. Thanks!
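Since the limits have to cover the largest molecule in the dataset, a reasonable approach is to scan the feature shapes first and pick the limits with some headroom. A minimal pure-Python sketch (the helper name `choose_limits` and the 25% headroom factor are illustrative, not part of DeepPurpose):

```python
def choose_limits(shapes, headroom=1.25):
    """Pick MAX_ATOM / MAX_BOND from observed feature-row counts.

    shapes: iterable of (num_atom_rows, num_bond_rows) pairs, e.g. the
    fatoms.shape[0] / fbonds.shape[0] values printed above.
    headroom: safety factor so slightly larger unseen molecules still fit.
    """
    max_atoms = max(a for a, _ in shapes)
    max_bonds = max(b for _, b in shapes)
    return int(max_atoms * headroom), int(max_bonds * headroom)

# Example with the numbers reported in this thread:
limits = choose_limits([(183, 403)])
print(limits)  # (228, 503) -> e.g. round up to MAX_ATOM=230, MAX_BOND=510
```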
--
Tianfan Fu
PhD candidate
futianfan.github.io
Sunlab, http://sunlab.org/
College of Computing
Georgia Institute of Technology
266 Ferst Dr NW, Atlanta, GA 30332
I got the same issue and I doubled the size of MAX_ATOM and MAX_BOND. That solved the issue, but I think it takes more GPU memory in both the training and test phases, right?
You are right. In the current version, both variables are set as global variables. If you set a large number, the memory will increase. We will fix it soon. Thanks!
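The memory growth is easy to see from the padding: every molecule's feature matrices are zero-padded up to the global limits, so the padded tensors scale linearly with MAX_ATOM and MAX_BOND. A back-of-the-envelope sketch (the feature widths `atom_fdim`/`bond_fdim` here are illustrative placeholders, not the exact DeepPurpose values; float32 assumed):

```python
def padded_feature_bytes(max_atom, max_bond, atom_fdim=39, bond_fdim=50,
                         batch_size=128, dtype_bytes=4):
    """Rough memory for one batch of zero-padded MPNN feature tensors.

    Every molecule is padded to (max_atom, atom_fdim) and
    (max_bond, bond_fdim), so the cost is independent of the actual
    molecule sizes and linear in the two limits.
    """
    per_mol = (max_atom * atom_fdim + max_bond * bond_fdim) * dtype_bytes
    return per_mol * batch_size

before = padded_feature_bytes(200, 400)
after = padded_feature_bytes(400, 800)   # both limits doubled
print(after / before)  # 2.0 -> feature memory scales linearly with the limits
```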
For the KIBA database, I increased MAX_ATOM=270 and MAX_BOND=600, which solved the issue. :)
Thanks for letting us know!
Sorry, I can't find "MAX_ATOM" and "MAX_BOND" in utils.py. I also don't understand why it was okay in "MPNN_CNN_Kiba.ipynb", which used the KIBA dataset as well.
Please see DeepPurpose/DeepPurpose/utils.py, line 27 (commit 5300269).
Can you increase them and try again? We changed the setting to support multi-GPU, so the tutorial may be out of date.
It doesn't work. Do I need to reinstall DeepPurpose? Actually, I installed it again, but it still doesn't work, even after I increased the numbers. Does my own dataset have to be under the DeepPurpose folder? I created a new folder for my own experiments. Is that okay?
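One common reason edits to utils.py seem to have no effect: Python may be importing a separately pip-installed copy of the package rather than the cloned source you edited. Checking the imported module's `__file__` shows which file is actually loaded. A small sketch (it falls back to a stdlib module purely for illustration if DeepPurpose is not installed):

```python
import importlib

# Works for any package; shown here with DeepPurpose's utils module.
try:
    mod = importlib.import_module("DeepPurpose.utils")
except ImportError:
    mod = importlib.import_module("json")  # stand-in for illustration only

print(mod.__file__)  # the file Python actually loads -- edit this one
```

If the printed path points into `site-packages` instead of your clone, either edit that copy or install the clone in editable mode (`pip install -e .`) so your source edits take effect.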
I think this is resolved in #74
Greetings,
I was doing virtual screening using the virtual_screening function when it gave me this error. The same drugs were used with a different protein without giving me this error:
```
Traceback (most recent call last):
File "/lfs01/workdirs/cairo029u1/deeppurpose/DeepPurpose/DeepPurpose/utils.py", line 264, in smiles2mpnnfeature
assert atoms_completion_num >= 0 and bonds_completion_num >= 0
AssertionError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "play_VS.py", line 20, in
play(dest_repur_db, dest_vs_db, dest_save +'/')
File "play_VS.py", line 9, in play
save_dir= dest_save)
File "/lfs01/workdirs/cairo029u1/deeppurpose/DeepPurpose/DeepPurpose/oneliner.py", line 261, in virtual_screening
y_pred = models.virtual_screening(X_repurpose, target, model, drug_names, target_name, convert_y = convert_y, result_folder = result_folder_path, verbose = False)
File "/lfs01/workdirs/cairo029u1/deeppurpose/DeepPurpose/DeepPurpose/DTI.py", line 163, in virtual_screening
model.drug_encoding, model.target_encoding, 'virtual screening')
File "/lfs01/workdirs/cairo029u1/deeppurpose/DeepPurpose/DeepPurpose/utils.py", line 578, in data_process_repurpose_virtual_screening
split_method='repurposing_VS')
File "/lfs01/workdirs/cairo029u1/deeppurpose/DeepPurpose/DeepPurpose/utils.py", line 499, in data_process
df_data = encode_drug(df_data, drug_encoding)
File "/lfs01/workdirs/cairo029u1/deeppurpose/DeepPurpose/DeepPurpose/utils.py", line 364, in encode_drug
unique = pd.Series(df_data[column_name].unique()).apply(smiles2mpnnfeature)
File "/share/apps/conda_envs/DeepPurpose/lib/python3.7/site-packages/pandas/core/series.py", line 3848, in apply
mapped = lib.map_infer(values, f, convert=convert_dtype)
File "pandas/_libs/lib.pyx", line 2329, in pandas._libs.lib.map_infer
File "/lfs01/workdirs/cairo029u1/deeppurpose/DeepPurpose/DeepPurpose/utils.py", line 266, in smiles2mpnnfeature
raise Exception("increase MAX_ATOM and MAX_BOND in utils")
Exception: increase MAX_ATOM and MAX_BOND in utils
```
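The failing assertion comes from the zero-padding step in smiles2mpnnfeature: the feature matrices are padded up to the global limits, so any molecule whose atom or bond rows exceed them cannot be encoded. A simplified, self-contained sketch of that check (names match the traceback, but the body is a reconstruction, not the exact DeepPurpose code):

```python
MAX_ATOM, MAX_BOND = 200, 400  # the defaults discussed in this thread

def check_padding(num_atom_rows, num_bond_rows,
                  max_atom=MAX_ATOM, max_bond=MAX_BOND):
    """Simplified version of the check that fails in smiles2mpnnfeature:
    completion counts are the number of zero rows needed to pad up to
    the limits; a negative count means the molecule doesn't fit."""
    atoms_completion_num = max_atom - num_atom_rows
    bonds_completion_num = max_bond - num_bond_rows
    if atoms_completion_num < 0 or bonds_completion_num < 0:
        raise Exception("increase MAX_ATOM and MAX_BOND in utils")
    return atoms_completion_num, bonds_completion_num

# The molecule from this report (183 atom rows, 403 bond rows)
# overflows MAX_BOND=400, which triggers the exception above:
try:
    check_padding(183, 403)
except Exception as e:
    print(e)  # increase MAX_ATOM and MAX_BOND in utils

check_padding(183, 403, max_bond=600)  # fits once the limit is raised
```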