
Training problem #12

Open
banxianerr opened this issue May 16, 2022 · 5 comments
Comments

@banxianerr

Hello teacher, this is the command I used for training: python demo.py --dataset='Indian' --epoches=300 --patches=7 --band_patches=3 --mode='CAF' --weight_decay=5e-3, but the accuracy I got is only 75%. Could you help me take a look at where the problem might be?
[image: screenshot of training output]

@danfenghong
Owner

danfenghong commented May 17, 2022 via email

@danfenghong
Owner

danfenghong commented May 17, 2022 via email

@banxianerr
Author

Hello, did you run the code directly after downloading it? You didn't make any changes?

Teacher, here is my entire workflow. I downloaded the code directly and ran it:
(gml-nts) GaoML@work:~/gml-project/test$ git clone https://github.com/danfenghong/IEEE_TGRS_SpectralFormer.git
Cloning into 'IEEE_TGRS_SpectralFormer'...
remote: Enumerating objects: 111, done.
remote: Counting objects: 100% (6/6), done.
remote: Compressing objects: 100% (6/6), done.
remote: Total 111 (delta 2), reused 2 (delta 0), pack-reused 105
Receiving objects: 100% (111/111), 13.87 MiB | 54.00 KiB/s, done.
Resolving deltas: 100% (45/45), done.
(gml-nts) GaoML@work:~/gml-project/test$ ls
IEEE_TGRS_SpectralFormer
(gml-nts) GaoML@work:~/gml-project/test$ cd IEEE_TGRS_SpectralFormer/
(gml-nts) GaoML@work:~/gml-project/test/IEEE_TGRS_SpectralFormer$ ls
data demo.py log README.md SpectralFormer.PNG vit_pytorch.py
(gml-nts) GaoML@work:~/gml-project/test/IEEE_TGRS_SpectralFormer$ python demo.py --dataset='Indian' --epoches=300 --patches=7 --band_patches=3 --mode='CAF' --weight_decay=5e-3
height=145,width=145,band=200
------------------------------
patch is : 7
mirror_image shape : [151,151,200]
------------------------------
x_train shape = (695, 7, 7, 200), type = float64
x_test shape = (9671, 7, 7, 200), type = float64
x_true shape = (21025, 7, 7, 200), type = float64
------------------------------
x_train_band shape = (695, 147, 200), type = float64
x_test_band shape = (9671, 147, 200), type = float64
x_true_band shape = (21025, 147, 200), type = float64
------------------------------
y_train: shape = (695,) ,type = int64
y_test: shape = (9671,) ,type = int64
y_true: shape = (21025,) ,type = int64
------------------------------
start training
/home/GaoML/anaconda3/envs/gml-nts/lib/python3.6/site-packages/torch/optim/lr_scheduler.py:136: UserWarning: Detected call of lr_scheduler.step() before optimizer.step(). In PyTorch 1.1.0 and later, you should call them in the opposite order: optimizer.step() before lr_scheduler.step(). Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
"https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate", UserWarning)
Epoch: 001 train_loss: 2.8026 train_acc: 6.7626
Epoch: 002 train_loss: 2.7212 train_acc: 8.2014
Epoch: 003 train_loss: 2.6631 train_acc: 14.6763
Epoch: 004 train_loss: 2.4813 train_acc: 19.4245
Epoch: 005 train_loss: 2.2231 train_acc: 23.3094
Epoch: 006 train_loss: 2.0518 train_acc: 29.9281
Epoch: 007 train_loss: 1.9562 train_acc: 32.0863
Epoch: 008 train_loss: 1.8885 train_acc: 32.8058
Epoch: 009 train_loss: 1.8270 train_acc: 35.8273
Epoch: 010 train_loss: 1.7620 train_acc: 38.2734
Epoch: 011 train_loss: 1.6822 train_acc: 39.4245
Epoch: 012 train_loss: 1.5872 train_acc: 44.4604
Epoch: 013 train_loss: 1.5285 train_acc: 43.7410
Epoch: 014 train_loss: 1.4653 train_acc: 47.1942
Epoch: 015 train_loss: 1.4087 train_acc: 48.2014
Epoch: 016 train_loss: 1.3532 train_acc: 50.0719
Epoch: 017 train_loss: 1.2949 train_acc: 52.2302
Epoch: 018 train_loss: 1.2636 train_acc: 54.3885
Epoch: 019 train_loss: 1.2534 train_acc: 53.2374
Epoch: 020 train_loss: 1.1967 train_acc: 54.6763
Epoch: 021 train_loss: 1.2165 train_acc: 52.6619
Epoch: 022 train_loss: 1.1935 train_acc: 55.5396
Epoch: 023 train_loss: 1.1406 train_acc: 57.5540
Epoch: 024 train_loss: 1.1320 train_acc: 56.5468
Epoch: 025 train_loss: 1.1173 train_acc: 57.4101
Epoch: 026 train_loss: 1.1210 train_acc: 56.5468
Epoch: 027 train_loss: 1.1173 train_acc: 56.8345
Epoch: 028 train_loss: 1.0976 train_acc: 57.8417
Epoch: 029 train_loss: 1.0393 train_acc: 59.4245
Epoch: 030 train_loss: 1.0321 train_acc: 59.8561
Epoch: 031 train_loss: 1.0060 train_acc: 62.1583
Epoch: 032 train_loss: 1.0072 train_acc: 60.8633
Epoch: 033 train_loss: 1.0057 train_acc: 61.2950
Epoch: 034 train_loss: 0.9799 train_acc: 63.0216
Epoch: 035 train_loss: 0.9426 train_acc: 67.0504
Epoch: 036 train_loss: 0.9399 train_acc: 64.1727
Epoch: 037 train_loss: 0.9494 train_acc: 63.8849
Epoch: 038 train_loss: 0.9657 train_acc: 62.4460
Epoch: 039 train_loss: 0.9423 train_acc: 65.4676
Epoch: 040 train_loss: 0.9430 train_acc: 64.7482
Epoch: 041 train_loss: 0.9671 train_acc: 61.1511
Epoch: 042 train_loss: 0.9029 train_acc: 63.7410
Epoch: 043 train_loss: 0.8931 train_acc: 65.1799
Epoch: 044 train_loss: 0.9375 train_acc: 62.5899
Epoch: 045 train_loss: 0.9465 train_acc: 62.7338
Epoch: 046 train_loss: 0.9501 train_acc: 61.8705
Epoch: 047 train_loss: 0.9081 train_acc: 65.0360
Epoch: 048 train_loss: 0.8632 train_acc: 66.1870
Epoch: 049 train_loss: 0.9113 train_acc: 63.4532
Epoch: 050 train_loss: 0.8629 train_acc: 66.9065
Epoch: 051 train_loss: 0.8375 train_acc: 67.1942
Epoch: 052 train_loss: 0.8520 train_acc: 69.6403
Epoch: 053 train_loss: 0.8167 train_acc: 68.2014
Epoch: 054 train_loss: 0.8239 train_acc: 68.0576
Epoch: 055 train_loss: 0.8760 train_acc: 66.3309
Epoch: 056 train_loss: 0.8432 train_acc: 67.7698
Epoch: 057 train_loss: 0.8148 train_acc: 69.0648
Epoch: 058 train_loss: 0.7857 train_acc: 71.5108
Epoch: 059 train_loss: 0.8108 train_acc: 68.3453
Epoch: 060 train_loss: 0.7942 train_acc: 69.6403
Epoch: 061 train_loss: 0.7868 train_acc: 70.2158
Epoch: 062 train_loss: 0.7558 train_acc: 72.9496
Epoch: 063 train_loss: 0.7878 train_acc: 71.3669
Epoch: 064 train_loss: 0.7764 train_acc: 71.3669
Epoch: 065 train_loss: 0.7436 train_acc: 71.9424
Epoch: 066 train_loss: 0.7538 train_acc: 69.6403
Epoch: 067 train_loss: 0.7628 train_acc: 70.6475
Epoch: 068 train_loss: 0.7292 train_acc: 70.5036
Epoch: 069 train_loss: 0.7443 train_acc: 72.6619
Epoch: 070 train_loss: 0.7410 train_acc: 73.6691
Epoch: 071 train_loss: 0.7131 train_acc: 74.2446
Epoch: 072 train_loss: 0.6870 train_acc: 73.6691
Epoch: 073 train_loss: 0.7287 train_acc: 71.6547
Epoch: 074 train_loss: 0.8080 train_acc: 66.7626
Epoch: 075 train_loss: 0.7769 train_acc: 70.0719
Epoch: 076 train_loss: 0.6967 train_acc: 73.5252
Epoch: 077 train_loss: 0.6612 train_acc: 75.9712
Epoch: 078 train_loss: 0.6892 train_acc: 75.5396
Epoch: 079 train_loss: 0.6764 train_acc: 74.6763
Epoch: 080 train_loss: 0.6704 train_acc: 75.9712
Epoch: 081 train_loss: 0.6446 train_acc: 76.2590
Epoch: 082 train_loss: 0.6633 train_acc: 74.8201
Epoch: 083 train_loss: 0.6934 train_acc: 72.8058
Epoch: 084 train_loss: 0.6892 train_acc: 71.9424
Epoch: 085 train_loss: 0.6701 train_acc: 73.3813
Epoch: 086 train_loss: 0.6569 train_acc: 74.9640
Epoch: 087 train_loss: 0.6166 train_acc: 76.8345
Epoch: 088 train_loss: 0.6207 train_acc: 76.5468
Epoch: 089 train_loss: 0.6254 train_acc: 78.1295
Epoch: 090 train_loss: 0.6029 train_acc: 77.4101
Epoch: 091 train_loss: 0.6522 train_acc: 75.1079
Epoch: 092 train_loss: 0.6247 train_acc: 75.6834
Epoch: 093 train_loss: 0.5812 train_acc: 79.2806
Epoch: 094 train_loss: 0.5894 train_acc: 77.1223
Epoch: 095 train_loss: 0.6022 train_acc: 77.9856
Epoch: 096 train_loss: 0.5708 train_acc: 80.0000
Epoch: 097 train_loss: 0.5900 train_acc: 78.7050
Epoch: 098 train_loss: 0.5914 train_acc: 77.1223
Epoch: 099 train_loss: 0.5767 train_acc: 78.7050
Epoch: 100 train_loss: 0.5974 train_acc: 77.5540
Epoch: 101 train_loss: 0.5525 train_acc: 80.1439
Epoch: 102 train_loss: 0.5511 train_acc: 79.8561
Epoch: 103 train_loss: 0.5534 train_acc: 81.1511
Epoch: 104 train_loss: 0.5484 train_acc: 79.8561
Epoch: 105 train_loss: 0.5108 train_acc: 82.8777
Epoch: 106 train_loss: 0.5144 train_acc: 81.0072
Epoch: 107 train_loss: 0.5461 train_acc: 80.5755
Epoch: 108 train_loss: 0.5080 train_acc: 83.7410
Epoch: 109 train_loss: 0.5013 train_acc: 81.1511
Epoch: 110 train_loss: 0.5130 train_acc: 80.8633
Epoch: 111 train_loss: 0.5016 train_acc: 80.8633
Epoch: 112 train_loss: 0.4976 train_acc: 84.0288
Epoch: 113 train_loss: 0.4889 train_acc: 83.4532
Epoch: 114 train_loss: 0.5491 train_acc: 78.4173
Epoch: 115 train_loss: 0.5535 train_acc: 78.9928
Epoch: 116 train_loss: 0.4967 train_acc: 80.2878
Epoch: 117 train_loss: 0.4734 train_acc: 84.4604
Epoch: 118 train_loss: 0.4810 train_acc: 83.7410
Epoch: 119 train_loss: 0.4544 train_acc: 85.7554
Epoch: 120 train_loss: 0.4862 train_acc: 84.1727
Epoch: 121 train_loss: 0.4977 train_acc: 80.8633
Epoch: 122 train_loss: 0.4506 train_acc: 83.7410
Epoch: 123 train_loss: 0.4349 train_acc: 84.4604
Epoch: 124 train_loss: 0.4409 train_acc: 85.6115
Epoch: 125 train_loss: 0.4298 train_acc: 87.0504
Epoch: 126 train_loss: 0.4384 train_acc: 84.4604
Epoch: 127 train_loss: 0.4050 train_acc: 86.9065
Epoch: 128 train_loss: 0.4319 train_acc: 85.4676
Epoch: 129 train_loss: 0.4201 train_acc: 86.6187
Epoch: 130 train_loss: 0.3926 train_acc: 85.4676
Epoch: 131 train_loss: 0.4054 train_acc: 84.7482
Epoch: 132 train_loss: 0.4109 train_acc: 84.0288
Epoch: 133 train_loss: 0.3931 train_acc: 87.1942
Epoch: 134 train_loss: 0.4001 train_acc: 87.0504
Epoch: 135 train_loss: 0.4284 train_acc: 85.8993
Epoch: 136 train_loss: 0.4334 train_acc: 85.3237
Epoch: 137 train_loss: 0.3835 train_acc: 86.1870
Epoch: 138 train_loss: 0.4041 train_acc: 85.6115
Epoch: 139 train_loss: 0.3936 train_acc: 87.1942
Epoch: 140 train_loss: 0.4148 train_acc: 86.7626
Epoch: 141 train_loss: 0.3784 train_acc: 86.9065
Epoch: 142 train_loss: 0.3477 train_acc: 88.2014
Epoch: 143 train_loss: 0.3728 train_acc: 87.9137
Epoch: 144 train_loss: 0.3487 train_acc: 90.0719
Epoch: 145 train_loss: 0.3473 train_acc: 88.7770
Epoch: 146 train_loss: 0.3521 train_acc: 88.6331
Epoch: 147 train_loss: 0.3403 train_acc: 89.2086
Epoch: 148 train_loss: 0.4209 train_acc: 84.8921
Epoch: 149 train_loss: 0.3766 train_acc: 88.0576
Epoch: 150 train_loss: 0.4179 train_acc: 85.7554
Epoch: 151 train_loss: 0.3921 train_acc: 85.7554
Epoch: 152 train_loss: 0.3338 train_acc: 90.3597
Epoch: 153 train_loss: 0.3174 train_acc: 90.6475
Epoch: 154 train_loss: 0.3225 train_acc: 91.5108
Epoch: 155 train_loss: 0.3466 train_acc: 88.3453
Epoch: 156 train_loss: 0.3500 train_acc: 89.2086
Epoch: 157 train_loss: 0.3596 train_acc: 88.0576
Epoch: 158 train_loss: 0.3281 train_acc: 89.9281
Epoch: 159 train_loss: 0.3289 train_acc: 90.0719
Epoch: 160 train_loss: 0.3189 train_acc: 89.4964
Epoch: 161 train_loss: 0.3102 train_acc: 90.9352
Epoch: 162 train_loss: 0.2979 train_acc: 91.5108
Epoch: 163 train_loss: 0.2798 train_acc: 91.5108
Epoch: 164 train_loss: 0.2891 train_acc: 91.0791
Epoch: 165 train_loss: 0.3358 train_acc: 88.6331
Epoch: 166 train_loss: 0.3216 train_acc: 90.2158
Epoch: 167 train_loss: 0.3456 train_acc: 89.7842
Epoch: 168 train_loss: 0.3664 train_acc: 87.0504
Epoch: 169 train_loss: 0.3508 train_acc: 87.6259
Epoch: 170 train_loss: 0.3234 train_acc: 89.4964
Epoch: 171 train_loss: 0.2797 train_acc: 91.5108
Epoch: 172 train_loss: 0.2651 train_acc: 93.0935
Epoch: 173 train_loss: 0.2886 train_acc: 90.5036
Epoch: 174 train_loss: 0.2518 train_acc: 92.0863
Epoch: 175 train_loss: 0.2634 train_acc: 91.2230
Epoch: 176 train_loss: 0.2718 train_acc: 91.3669
Epoch: 177 train_loss: 0.2945 train_acc: 89.9281
Epoch: 178 train_loss: 0.2711 train_acc: 92.2302
Epoch: 179 train_loss: 0.2721 train_acc: 91.7986
Epoch: 180 train_loss: 0.2767 train_acc: 91.7986
Epoch: 181 train_loss: 0.2502 train_acc: 92.8058
Epoch: 182 train_loss: 0.2626 train_acc: 91.2230
Epoch: 183 train_loss: 0.2774 train_acc: 90.6475
Epoch: 184 train_loss: 0.2583 train_acc: 92.5180
Epoch: 185 train_loss: 0.2687 train_acc: 92.2302
Epoch: 186 train_loss: 0.2641 train_acc: 91.2230
Epoch: 187 train_loss: 0.2673 train_acc: 92.0863
Epoch: 188 train_loss: 0.3005 train_acc: 88.7770
Epoch: 189 train_loss: 0.2642 train_acc: 91.6547
Epoch: 190 train_loss: 0.2427 train_acc: 93.5252
Epoch: 191 train_loss: 0.1970 train_acc: 95.6834
Epoch: 192 train_loss: 0.2024 train_acc: 94.5324
Epoch: 193 train_loss: 0.2051 train_acc: 94.2446
Epoch: 194 train_loss: 0.2177 train_acc: 93.9568
Epoch: 195 train_loss: 0.2230 train_acc: 93.3813
Epoch: 196 train_loss: 0.2175 train_acc: 93.2374
Epoch: 197 train_loss: 0.2205 train_acc: 93.2374
Epoch: 198 train_loss: 0.2714 train_acc: 91.7986
Epoch: 199 train_loss: 0.2891 train_acc: 91.9424
Epoch: 200 train_loss: 0.2700 train_acc: 90.9352
Epoch: 201 train_loss: 0.2510 train_acc: 92.2302
Epoch: 202 train_loss: 0.2557 train_acc: 90.9352
Epoch: 203 train_loss: 0.2439 train_acc: 92.3741
Epoch: 204 train_loss: 0.2199 train_acc: 92.3741
Epoch: 205 train_loss: 0.2061 train_acc: 93.8130
Epoch: 206 train_loss: 0.2101 train_acc: 94.6763
Epoch: 207 train_loss: 0.2261 train_acc: 92.9496
Epoch: 208 train_loss: 0.2675 train_acc: 92.3741
Epoch: 209 train_loss: 0.2212 train_acc: 93.8130
Epoch: 210 train_loss: 0.2251 train_acc: 93.8130
Epoch: 211 train_loss: 0.2255 train_acc: 93.3813
Epoch: 212 train_loss: 0.1906 train_acc: 94.8201
Epoch: 213 train_loss: 0.1993 train_acc: 94.9640
Epoch: 214 train_loss: 0.2196 train_acc: 93.5252
Epoch: 215 train_loss: 0.1859 train_acc: 94.2446
Epoch: 216 train_loss: 0.1960 train_acc: 94.3885
Epoch: 217 train_loss: 0.1975 train_acc: 94.2446
Epoch: 218 train_loss: 0.2274 train_acc: 93.9568
Epoch: 219 train_loss: 0.2261 train_acc: 92.3741
Epoch: 220 train_loss: 0.2016 train_acc: 94.3885
Epoch: 221 train_loss: 0.1827 train_acc: 95.3957
Epoch: 222 train_loss: 0.2437 train_acc: 91.9424
Epoch: 223 train_loss: 0.2060 train_acc: 93.9568
Epoch: 224 train_loss: 0.1887 train_acc: 94.6763
Epoch: 225 train_loss: 0.1982 train_acc: 94.5324
Epoch: 226 train_loss: 0.1992 train_acc: 94.2446
Epoch: 227 train_loss: 0.1813 train_acc: 94.8201
Epoch: 228 train_loss: 0.1900 train_acc: 94.6763
Epoch: 229 train_loss: 0.1830 train_acc: 94.5324
Epoch: 230 train_loss: 0.1804 train_acc: 95.3957
Epoch: 231 train_loss: 0.1598 train_acc: 96.8345
Epoch: 232 train_loss: 0.1606 train_acc: 96.1151
Epoch: 233 train_loss: 0.1919 train_acc: 94.1007
Epoch: 234 train_loss: 0.1649 train_acc: 95.5396
Epoch: 235 train_loss: 0.1981 train_acc: 93.9568
Epoch: 236 train_loss: 0.1800 train_acc: 95.2518
Epoch: 237 train_loss: 0.1698 train_acc: 95.6834
Epoch: 238 train_loss: 0.1808 train_acc: 94.9640
Epoch: 239 train_loss: 0.1985 train_acc: 94.1007
Epoch: 240 train_loss: 0.2011 train_acc: 94.2446
Epoch: 241 train_loss: 0.1652 train_acc: 96.1151
Epoch: 242 train_loss: 0.1733 train_acc: 95.6834
Epoch: 243 train_loss: 0.1889 train_acc: 94.9640
Epoch: 244 train_loss: 0.1638 train_acc: 95.6834
Epoch: 245 train_loss: 0.1528 train_acc: 97.1223
Epoch: 246 train_loss: 0.1507 train_acc: 97.1223
Epoch: 247 train_loss: 0.1499 train_acc: 96.5468
Epoch: 248 train_loss: 0.1634 train_acc: 95.5396
Epoch: 249 train_loss: 0.1469 train_acc: 95.9712
Epoch: 250 train_loss: 0.1839 train_acc: 93.8130
Epoch: 251 train_loss: 0.2340 train_acc: 92.2302
Epoch: 252 train_loss: 0.2215 train_acc: 93.5252
Epoch: 253 train_loss: 0.1855 train_acc: 95.1079
Epoch: 254 train_loss: 0.1560 train_acc: 96.2590
Epoch: 255 train_loss: 0.1420 train_acc: 96.2590
Epoch: 256 train_loss: 0.1640 train_acc: 95.6834
Epoch: 257 train_loss: 0.1531 train_acc: 96.4029
Epoch: 258 train_loss: 0.1295 train_acc: 96.4029
Epoch: 259 train_loss: 0.1438 train_acc: 96.4029
Epoch: 260 train_loss: 0.1526 train_acc: 95.8273
Epoch: 261 train_loss: 0.1775 train_acc: 95.5396
Epoch: 262 train_loss: 0.1445 train_acc: 97.1223
Epoch: 263 train_loss: 0.1511 train_acc: 95.6834
Epoch: 264 train_loss: 0.1831 train_acc: 94.6763
Epoch: 265 train_loss: 0.1407 train_acc: 96.6906
Epoch: 266 train_loss: 0.1613 train_acc: 96.5468
Epoch: 267 train_loss: 0.1330 train_acc: 97.2662
Epoch: 268 train_loss: 0.1262 train_acc: 97.2662
Epoch: 269 train_loss: 0.1472 train_acc: 95.9712
Epoch: 270 train_loss: 0.1363 train_acc: 96.9784
Epoch: 271 train_loss: 0.1484 train_acc: 96.4029
Epoch: 272 train_loss: 0.1383 train_acc: 96.2590
Epoch: 273 train_loss: 0.1619 train_acc: 95.6834
Epoch: 274 train_loss: 0.1311 train_acc: 96.6906
Epoch: 275 train_loss: 0.1389 train_acc: 96.8345
Epoch: 276 train_loss: 0.1248 train_acc: 97.6978
Epoch: 277 train_loss: 0.1246 train_acc: 96.6906
Epoch: 278 train_loss: 0.1371 train_acc: 97.2662
Epoch: 279 train_loss: 0.1516 train_acc: 95.8273
Epoch: 280 train_loss: 0.1434 train_acc: 96.5468
Epoch: 281 train_loss: 0.1295 train_acc: 96.6906
Epoch: 282 train_loss: 0.1356 train_acc: 96.2590
Epoch: 283 train_loss: 0.1280 train_acc: 96.6906
Epoch: 284 train_loss: 0.1354 train_acc: 97.1223
Epoch: 285 train_loss: 0.1521 train_acc: 95.8273
Epoch: 286 train_loss: 0.1545 train_acc: 96.1151
Epoch: 287 train_loss: 0.1636 train_acc: 96.6906
Epoch: 288 train_loss: 0.1540 train_acc: 95.8273
Epoch: 289 train_loss: 0.1237 train_acc: 96.9784
Epoch: 290 train_loss: 0.1246 train_acc: 97.1223
Epoch: 291 train_loss: 0.1395 train_acc: 95.8273
Epoch: 292 train_loss: 0.1313 train_acc: 97.2662
Epoch: 293 train_loss: 0.1327 train_acc: 96.9784
Epoch: 294 train_loss: 0.1272 train_acc: 97.6978
Epoch: 295 train_loss: 0.1237 train_acc: 96.8345
Epoch: 296 train_loss: 0.1127 train_acc: 97.6978
Epoch: 297 train_loss: 0.1164 train_acc: 97.4101
Epoch: 298 train_loss: 0.1421 train_acc: 96.2590
Epoch: 299 train_loss: 0.1458 train_acc: 95.5396
Epoch: 300 train_loss: 0.1333 train_acc: 96.6906
Running Time: 227.97
------------------------------
Final result:
OA: 0.7518 | AA: 0.8448 | Kappa: 0.7196
[0.67919075 0.84311224 0.92391304 0.87248322 0.8651363 0.92027335
0.76579521 0.56823821 0.68439716 0.99382716 0.914791 0.76363636
0.97777778 0.74358974 1. 1. ]
------------------------------
Parameter:
dataset: Indian
flag_test: train
mode: CAF
gpu_id: 0
seed: 0
batch_size: 64
test_freq: 5
patches: 7
band_patches: 3
epoches: 300
learning_rate: 0.0005
gamma: 0.9
weight_decay: 0.005
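
A side note on the UserWarning near the top of this log: since PyTorch 1.1.0, optimizer.step() should be called before lr_scheduler.step(), otherwise the first value of the learning-rate schedule is skipped. A minimal sketch of the recommended ordering follows; the model, dummy data, and step_size are hypothetical stand-ins, not the repo's actual code:

```python
# Minimal sketch of the call ordering the PyTorch UserWarning recommends:
# optimizer.step() before scheduler.step(). lr/gamma/weight_decay match the
# parameter list above; everything else is a hypothetical stand-in.
import torch

model = torch.nn.Linear(200, 16)
optimizer = torch.optim.Adam(model.parameters(), lr=5e-4, weight_decay=5e-3)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.9)

for epoch in range(30):
    x = torch.randn(64, 200)   # dummy batch
    loss = model(x).mean()     # dummy loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()           # update weights first ...
    scheduler.step()           # ... then advance the learning-rate schedule
```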

@danfenghong
Owner

danfenghong commented May 17, 2022 via email

@banxianerr
Author

If you haven't changed any code, it may be an issue with something like the Python version (the versions we used are also stated on GitHub), because we fixed the initialization seed here. Others have asked us about this before, but their results were even higher than what we report.
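
For reference, here is a minimal sketch of the kind of seed fixing referred to above (seed: 0 appears in the parameter list). This is illustrative, not necessarily the repo's exact code, and even a fixed seed does not guarantee identical results across different Python/PyTorch/CUDA versions:

```python
# Illustrative seed fixing for reproducibility (not the repo's exact code).
import random
import numpy as np
import torch

def set_seed(seed: int = 0) -> None:
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)

set_seed(0)
# Results can still drift across library versions because kernel
# implementations and default algorithms change between releases.
```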


Okay, teacher, thank you for your reply.
