from model import Attention, GatedAttention #1

Closed

douxWh opened this issue Apr 11, 2023 · 1 comment

Comments


douxWh commented Apr 11, 2023

Dear @hrlalab,
Could you please provide the script file, model.py, if convenient?
Thank you.

Collaborator

ddrrnn123 commented Jun 2, 2023

We have uploaded model.py for the baseline Attention and GatedAttention networks.
For training:
python MIL_main_DeepSurv_batch_Stage1_stack.py
For testing:
python MIL_global_Stage1_Testing_all.py

You may change the root paths in those Python files to load your data and save the outputs to your local folders.
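For orientation, the lines below illustrate the kind of root-path variables one would edit near the top of the training and testing scripts; the variable names and paths are hypothetical placeholders, not the actual identifiers used in the repository.

# Hypothetical root-path variables (placeholder names and paths, not the actual
# identifiers used in MIL_main_DeepSurv_batch_Stage1_stack.py or
# MIL_global_Stage1_Testing_all.py):
data_root = "/path/to/your/data"        # where the input bags / slides are read from
output_root = "/path/to/your/results"   # where trained models and outcomes are saved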

We will provide a more detailed introduction on how to use and develop the model soon. Thanks!
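As a rough orientation for the baseline networks, here is a minimal usage sketch. It assumes Attention and GatedAttention are PyTorch nn.Module subclasses that take a bag of instance feature vectors and return a bag-level output with attention weights; the constructor arguments, the feature dimension, and the exact forward() return values are assumptions, not taken from the uploaded model.py.

import torch
from model import Attention, GatedAttention

# Assumed instance feature size and bag size; adjust to match your data.
FEATURE_DIM = 512
BAG_SIZE = 20

# Build one of the baseline networks; swap in GatedAttention() for the gated variant.
# Whether the constructors take arguments is an assumption here.
net = Attention()
net.eval()

# One bag of instance features with shape [num_instances, feature_dim].
bag = torch.randn(BAG_SIZE, FEATURE_DIM)

with torch.no_grad():
    # Assumed forward interface: the network scores the whole bag at once.
    output = net(bag)

print(output)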
