data problem on Polymorphic Transformers #39
Also, I found this when I ran the second step of the fundus task,
and I found these terms shown before.
Hmm, so what is happening? |
Oh, I saw the previous issue.
Now I've revised it like this. Is that right? Some other errors also occur.
How can I fix them? |
I also have a question about fine-tuning "k".
And in_ator_trans is sub-transformer 1, right? |
Thanks for reporting the bug. I've just corrected "refuge" to "fundus". Also I've simplified the polyformer config. |
"Also I've simplified the polyformer config." So which file should I replace?
In addition, may I ask why q and k are shared during training?
And why is only k of transformer 1 fine-tuned?
Does it reduce computation cost, or improve performance?
|
In addition, the poly dataset I downloaded has different folders... can you please upload your processed data?
|
You can just do a "git pull origin master" to update the code.
Yes correct. It's explained in the IJCAI paper, page 3:
Because empirically, fine-tuning only k of transformer 1 already performs well. I didn't try fine-tuning both, and I'm not sure how to intuitively understand the benefit of fine-tuning both layers for domain adaptation. |
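A minimal PyTorch sketch of this fine-tuning scheme, freezing everything except the key projection of the first sub-transformer. The module and attribute names (Polyformer, SubTransformer, k_proj) are illustrative placeholders, not segtran's actual class names:

```python
import torch.nn as nn

class SubTransformer(nn.Module):
    """Toy attention layer with separate q/k/v projections."""
    def __init__(self, dim):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)

class Polyformer(nn.Module):
    """Stack of sub-transformers; layers[0] plays the role of sub-transformer 1."""
    def __init__(self, dim, num_layers=2):
        super().__init__()
        self.layers = nn.ModuleList([SubTransformer(dim) for _ in range(num_layers)])

model = Polyformer(dim=64)

# Freeze all parameters, then unfreeze only k of sub-transformer 1.
for p in model.parameters():
    p.requires_grad = False
for p in model.layers[0].k_proj.parameters():
    p.requires_grad = True

trainable = [n for n, p in model.named_parameters() if p.requires_grad]
print(trainable)  # only layers.0.k_proj.{weight,bias}
```

The optimizer would then be built over only the unfrozen parameters, e.g. `filter(lambda p: p.requires_grad, model.parameters())`, which keeps the adaptation lightweight compared with updating the whole network.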
You mean polyp? |
Thanks for your help, it works. And may I ask how "Avg" was computed in the tables of your paper? I don't quite understand what it means.
|
In addition, can you update the commands for the polyp dataset?

python3 train2d.py --task polyp --ds CVC-300 --split train --samplenum 5 --maxiter 1600 --saveiter 40 --net unet-scratch --cp ../model/unet-scratch-polyp-CVC-ClinicDB-train,Kvasir-train-06101057/iter_500.pth --polyformer target --targetopt k --bnopt affine --adv feat --sourceds CVC-ClinicDB-train,Kvasir-train --domweight 0.002 --bs 3 --sourcebs 2 --targetbs 2

I am especially unsure about "--sourceds".

python3 test2d.py --gpu 1 --ds CVC-300 --split test --samplenum 5 --bs 6 --task polyp --cpdir .. --net unet-scratch --polyformer target --nosave --iters 40-1600,40

Especially about "--split". |
Also, I have two other questions.
|
Hi~ please help me figure out a few questions.
It should be "--task fundus", otherwise it will report errors.
Many thanks in advance.