How many epochs did you train on webnlg dataset? #2
That's strange. I tried many different settings, and they all work. I have updated a new version that is similar to the original one. Please give it a try with the default settings.
'''
epoch 2 loss: 13.665100 F1: 0.075149 P: 0.089938 R: 0.064537
epoch 3 loss: 12.094102 F1: 0.147541 P: 0.176944 R: 0.126518
epoch 4 loss: 13.580454 F1: 0.198138 P: 0.237500 R: 0.169968
epoch 5 loss: 11.634348 F1: 0.210015 P: 0.252925 R: 0.179553
epoch 6 loss: 10.756370 F1: 0.208622 P: 0.262343 R: 0.173163
epoch 7 loss: 8.445783 F1: 0.230285 P: 0.273768 R: 0.198722
epoch 8 loss: 9.191703 F1: 0.240741 P: 0.303797 R: 0.199361
epoch 9 loss: 6.884187 F1: 0.243792 P: 0.296432 R: 0.207029
epoch 10 loss: 7.504792 F1: 0.244103 P: 0.294756 R: 0.208307
epoch 11 loss: 7.149976 F1: 0.242936 P: 0.285345 R: 0.211502
epoch 12 loss: 6.352170 F1: 0.266135 P: 0.307628 R: 0.234505
epoch 13 loss: 6.124747 F1: 0.267201 P: 0.310491 R: 0.234505
epoch 14 loss: 6.946354 F1: 0.278336 P: 0.317253 R: 0.247923
epoch 15 loss: 5.522422 F1: 0.281497 P: 0.322341 R: 0.249840
epoch 16 loss: 5.934337 F1: 0.267284 P: 0.302176 R: 0.239617
epoch 17 loss: 6.453718 F1: 0.270039 P: 0.305153 R: 0.242173
epoch 18 loss: 4.792238 F1: 0.268985 P: 0.302474 R: 0.242173
epoch 19 loss: 4.421184 F1: 0.276836 P: 0.309392 R: 0.250479
epoch 20 loss: 4.318580 F1: 0.280458 P: 0.306991 R: 0.258147
epoch 21 loss: 4.167426 F1: 0.279696 P: 0.304282 R: 0.258786
'''
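For reference, the P/R/F1 values in these logs are triple-level scores. A minimal sketch of how such micro-averaged metrics can be computed is below; it assumes exact-match scoring over predicted vs. gold `(subject, predicate, object)` sets, and the function name `triple_prf` is hypothetical (the repo's actual matching criterion may differ):

```python
def triple_prf(gold, pred):
    """Micro-averaged precision/recall/F1 over per-example sets of triples.

    gold, pred: lists of sets, one set of (s, p, o) tuples per example.
    """
    # Count exact-match triples shared between gold and prediction.
    correct = sum(len(g & p) for g, p in zip(gold, pred))
    n_pred = sum(len(p) for p in pred)
    n_gold = sum(len(g) for g in gold)
    precision = correct / n_pred if n_pred else 0.0
    recall = correct / n_gold if n_gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```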
Thank you for your work! I tried a new version of the webnlg dataset (after entity masking). I had made some mistakes in my dataset; the results are good now. If you add the data preprocessing method, this will be an excellent general framework for the NRE task.
'''
epoch 2 loss: 13.444241 F1: 0.188377 P: 0.278801 R: 0.142243
epoch 3 loss: 10.825327 F1: 0.220714 P: 0.320303 R: 0.168366
epoch 4 loss: 10.490962 F1: 0.241535 P: 0.339389 R: 0.187480
epoch 5 loss: 10.200238 F1: 0.251348 P: 0.347985 R: 0.196719
epoch 6 loss: 9.493075 F1: 0.258635 P: 0.356983 R: 0.202772
epoch 7 loss: 9.527404 F1: 0.260773 P: 0.355825 R: 0.205798
epoch 8 loss: 9.709828 F1: 0.270671 P: 0.356307 R: 0.218222
epoch 9 loss: 10.282122 F1: 0.291675 P: 0.375879 R: 0.238292
epoch 10 loss: 8.802513 F1: 0.288422 P: 0.367151 R: 0.237496
epoch 11 loss: 8.855803 F1: 0.299773 P: 0.369231 R: 0.252310
epoch 12 loss: 9.465460 F1: 0.295006 P: 0.355411 R: 0.252150
epoch 13 loss: 7.226478 F1: 0.302694 P: 0.361326 R: 0.260433
epoch 14 loss: 7.257793 F1: 0.310519 P: 0.366833 R: 0.269194
epoch 15 loss: 6.049053 F1: 0.310207 P: 0.360794 R: 0.272061
epoch 16 loss: 7.307980 F1: 0.316016 P: 0.361032 R: 0.280981
epoch 17 loss: 6.690179 F1: 0.318442 P: 0.362637 R: 0.283848
epoch 18 loss: 6.352150 F1: 0.319246 P: 0.356890 R: 0.288786
epoch 19 loss: 5.875832 F1: 0.321760 P: 0.357839 R: 0.292291
epoch 20 loss: 6.401268 F1: 0.327978 P: 0.362969 R: 0.299140
epoch 21 loss: 4.804408 F1: 0.330060 P: 0.363167 R: 0.302485
epoch 22 loss: 5.498651 F1: 0.331173 P: 0.361988 R: 0.305193
epoch 23 loss: 6.467267 F1: 0.335626 P: 0.364587 R: 0.310927
epoch 24 loss: 4.964077 F1: 0.330294 P: 0.355140 R: 0.308697
epoch 25 loss: 5.744630 F1: 0.338613 P: 0.361548 R: 0.318414
'''
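The "entity masking" preprocessing mentioned above can be sketched roughly as replacing each entity mention span with a placeholder token. This is only an illustration; `mask_entities`, the `<ENT>` placeholder, and the span format are assumptions, not the repo's actual `data_prepare.py` logic:

```python
def mask_entities(tokens, entity_spans):
    """Replace each entity span with a single placeholder token.

    tokens: list of word tokens.
    entity_spans: list of (start, end) token indices, end exclusive;
                  spans are assumed non-overlapping.
    """
    out, i = [], 0
    for start, end in sorted(entity_spans):
        out.extend(tokens[i:start])  # copy text before the entity
        out.append("<ENT>")          # mask the entity mention
        i = end
    out.extend(tokens[i:])           # copy the tail after the last entity
    return out
```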
My preprocessing in data_prepare.py is almost the same as the official TensorFlow version's, but I haven't found the method that converts WebNLG from .xml to .json.
You can refer to the webnlg2017 website.
Actually, I tried to parse the XML myself. Unfortunately, my preprocessing generated a different JSON from the official one, so in the end I used the tf version directly. To compare your model fairly with the copyre paper, I think:
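For anyone else stuck on the .xml to .json step, a minimal sketch of the conversion is below. The element names (`entry`, `mtriple`, `lex`) follow the public WebNLG 2017 release, but the output keys (`sentText`, `relationMentions`) and the function name are assumptions and may not match the official preprocessing's schema:

```python
import json
import xml.etree.ElementTree as ET

def webnlg_xml_to_json(xml_path, json_path):
    """Flatten a WebNLG 2017 XML file into a list of {text, triples} records."""
    root = ET.parse(xml_path).getroot()
    examples = []
    for entry in root.iter("entry"):
        # Each <mtriple> holds a "Subject | predicate | Object" string.
        triples = [t.text.strip() for t in entry.iter("mtriple")]
        # One record per lexicalisation of the triple set.
        for lex in entry.iter("lex"):
            examples.append({"sentText": lex.text, "relationMentions": triples})
    with open(json_path, "w") as f:
        json.dump(examples, f, indent=2)
```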
Here is the log from training the model on webnlg. The loss is slow to decline, and training starts from a higher loss value than in the log file you have shown.
'''
config filename: ./config.json
load_embedding!
loading ./data/webnlg/entity_end_position/train.json
data size 5019
loading ./data/webnlg/entity_end_position/dev.json
data size 703
epoch 1 loss: 32.915226 F1: 0.002298 P: 0.002868 R: 0.001917
epoch 2 loss: 31.900913 F1: 0.009905 P: 0.012264 R: 0.008307
epoch 3 loss: 27.510458 F1: 0.015719 P: 0.018970 R: 0.013419
epoch 4 loss: 28.100975 F1: 0.025424 P: 0.032008 R: 0.021086
epoch 5 loss: 26.887936 F1: 0.050152 P: 0.061856 R: 0.042173
epoch 6 loss: 24.247564 F1: 0.065084 P: 0.081184 R: 0.054313
epoch 7 loss: 24.565767 F1: 0.071375 P: 0.086600 R: 0.060703
epoch 8 loss: 25.273928 F1: 0.078818 P: 0.096834 R: 0.066454
epoch 9 loss: 20.739996 F1: 0.080451 P: 0.097717 R: 0.068371
epoch 10 loss: 23.133522 F1: 0.081061 P: 0.099535 R: 0.068371
epoch 11 loss: 23.140676 F1: 0.083801 P: 0.101083 R: 0.071565
epoch 12 loss: 23.790190 F1: 0.078331 P: 0.091688 R: 0.068371
epoch 13 loss: 25.107319 F1: 0.099564 P: 0.115417 R: 0.087540
epoch 14 loss: 23.819378 F1: 0.090511 P: 0.105532 R: 0.079233
epoch 15 loss: 22.022694 F1: 0.086194 P: 0.100597 R: 0.075399
epoch 16 loss: 23.545181 F1: 0.090909 P: 0.105485 R: 0.079872
epoch 17 loss: 22.910625 F1: 0.098246 P: 0.108949 R: 0.089457
epoch 18 loss: 19.319550 F1: 0.115789 P: 0.128405 R: 0.105431
epoch 19 loss: 21.152763 F1: 0.099858 P: 0.111994 R: 0.090096
epoch 20 loss: 21.946230 F1: 0.095470 P: 0.104981 R: 0.087540
epoch 21 loss: 19.299545 F1: 0.102300 P: 0.110534 R: 0.095208
epoch 22 loss: 23.689260 F1: 0.097695 P: 0.105812 R: 0.090735
epoch 23 loss: 23.338881 F1: 0.091374 P: 0.097953 R: 0.085623
epoch 24 loss: 21.140081 F1: 0.099003 P: 0.107143 R: 0.092013
epoch 25 loss: 20.936872 F1: 0.100138 P: 0.108941 R: 0.092652
epoch 26 loss: 18.399645 F1: 0.103602 P: 0.111852 R: 0.096486
epoch 27 loss: 21.461807 F1: 0.101880 P: 0.109559 R: 0.095208
epoch 28 loss: 21.085964 F1: 0.098673 P: 0.105531 R: 0.092652
epoch 29 loss: 21.767021 F1: 0.100812 P: 0.107117 R: 0.095208
epoch 30 loss: 21.892426 F1: 0.102096 P: 0.108399 R: 0.096486
epoch 31 loss: 21.119205 F1: 0.089643 P: 0.095652 R: 0.084345
epoch 32 loss: 20.188158 F1: 0.105983 P: 0.113971 R: 0.099042
epoch 33 loss: 22.146685 F1: 0.105727 P: 0.112554 R: 0.099681
epoch 34 loss: 22.138407 F1: 0.093960 P: 0.098940 R: 0.089457
epoch 35 loss: 21.843102 F1: 0.096566 P: 0.103198 R: 0.090735
epoch 36 loss: 19.749903 F1: 0.094840 P: 0.104375 R: 0.086901
epoch 37 loss: 20.443235 F1: 0.099729 P: 0.106291 R: 0.093930
epoch 38 loss: 19.464769 F1: 0.099865 P: 0.105790 R: 0.094569
epoch 39 loss: 22.916927 F1: 0.104223 P: 0.111597 R: 0.097764
epoch 40 loss: 22.332972 F1: 0.096533 P: 0.103123 R: 0.090735
epoch 41 loss: 21.962793 F1: 0.094883 P: 0.101010 R: 0.089457
epoch 42 loss: 22.766172 F1: 0.095174 P: 0.100858 R: 0.090096
epoch 43 loss: 22.344751 F1: 0.097002 P: 0.102564 R: 0.092013
epoch 44 loss: 23.941555 F1: 0.099529 P: 0.105039 R: 0.094569
epoch 45 loss: 22.505873 F1: 0.100633 P: 0.105153 R: 0.096486
epoch 46 loss: 20.527216 F1: 0.105898 P: 0.111346 R: 0.100958
epoch 47 loss: 19.862801 F1: 0.098459 P: 0.103448 R: 0.093930
epoch 48 loss: 20.391645 F1: 0.101604 P: 0.106517 R: 0.097125
epoch 49 loss: 20.863894 F1: 0.104035 P: 0.108787 R: 0.099681
epoch 50 loss: 21.055967 F1: 0.103010 P: 0.108070 R: 0.098403
epoch 51 loss: 20.388582 F1: 0.099599 P: 0.104415 R: 0.095208
epoch 52 loss: 22.958044 F1: 0.101469 P: 0.106219 R: 0.097125
epoch 53 loss: 20.523462 F1: 0.102513 P: 0.107746 R: 0.097764
epoch 54 loss: 20.575268 F1: 0.098732 P: 0.103280 R: 0.094569
epoch 55 loss: 22.237806 F1: 0.101536 P: 0.106368 R: 0.097125
epoch 56 loss: 20.335493 F1: 0.106525 P: 0.111188 R: 0.102236
epoch 57 loss: 20.177532 F1: 0.095365 P: 0.099721 R: 0.091374
epoch 58 loss: 18.787577 F1: 0.094126 P: 0.098532 R: 0.090096
epoch 59 loss: 21.285160 F1: 0.099300 P: 0.103760 R: 0.095208
epoch 60 loss: 21.078987 F1: 0.101672 P: 0.106667 R: 0.097125
epoch 61 loss: 21.739445 F1: 0.098262 P: 0.103013 R: 0.093930
epoch 62 loss: 17.307463 F1: 0.101604 P: 0.106517 R: 0.097125
epoch 63 loss: 20.719521 F1: 0.097724 P: 0.102600 R: 0.093291
epoch 64 loss: 23.640509 F1: 0.106000 P: 0.110801 R: 0.101597
epoch 65 loss: 21.545544 F1: 0.099967 P: 0.104457 R: 0.095847
epoch 66 loss: 19.790474 F1: 0.098997 P: 0.103860 R: 0.094569
epoch 67 loss: 19.295521 F1: 0.104388 P: 0.108801 R: 0.100319
epoch 68 loss: 19.242487 F1: 0.102838 P: 0.107692 R: 0.098403
epoch 69 loss: 21.927759 F1: 0.102632 P: 0.107242 R: 0.098403
epoch 70 loss: 21.096117 F1: 0.097804 P: 0.102012 R: 0.093930
epoch 71 loss: 19.201813 F1: 0.101358 P: 0.105227 R: 0.097764
epoch 72 loss: 20.965561 F1: 0.099933 P: 0.104384 R: 0.095847
epoch 73 loss: 22.830009 F1: 0.098274 P: 0.102281 R: 0.094569
epoch 74 loss: 21.505526 F1: 0.103540 P: 0.108467 R: 0.099042
epoch 75 loss: 19.773630 F1: 0.100399 P: 0.104643 R: 0.096486
epoch 76 loss: 20.365486 F1: 0.099536 P: 0.103520 R: 0.095847
epoch 77 loss: 21.803555 F1: 0.104914 P: 0.109191 R: 0.100958
epoch 78 loss: 19.608545 F1: 0.098525 P: 0.103594 R: 0.093930
epoch 79 loss: 24.478024 F1: 0.079640 P: 0.086924 R: 0.073482
epoch 80 loss: 24.325733 F1: 0.079002 P: 0.086298 R: 0.072843
epoch 81 loss: 21.051134 F1: 0.100105 P: 0.110681 R: 0.091374
epoch 82 loss: 20.675182 F1: 0.103472 P: 0.110706 R: 0.097125
epoch 83 loss: 20.217590 F1: 0.096589 P: 0.102436 R: 0.091374
epoch 84 loss: 23.057375 F1: 0.099596 P: 0.105188 R: 0.094569
epoch 85 loss: 22.329268 F1: 0.096625 P: 0.100206 R: 0.093291
epoch 86 loss: 25.108236 F1: 0.093677 P: 0.098315 R: 0.089457
epoch 87 loss: 20.028948 F1: 0.092852 P: 0.097271 R: 0.088818
epoch 88 loss: 23.089718 F1: 0.089692 P: 0.094167 R: 0.085623
epoch 89 loss: 18.960468 F1: 0.089692 P: 0.094167 R: 0.085623
epoch 90 loss: 22.726799 F1: 0.091122 P: 0.095775 R: 0.086901
epoch 91 loss: 22.099943 F1: 0.099967 P: 0.104457 R: 0.095847
epoch 92 loss: 23.679205 F1: 0.090576 P: 0.094576 R: 0.086901
epoch 93 loss: 22.687222 F1: 0.094220 P: 0.098739 R: 0.090096
epoch 94 loss: 23.637705 F1: 0.097967 P: 0.102368 R: 0.093930
epoch 95 loss: 19.864727 F1: 0.095270 P: 0.099513 R: 0.091374
epoch 96 loss: 25.225853 F1: 0.095143 P: 0.099237 R: 0.091374
epoch 97 loss: 20.020922 F1: 0.103758 P: 0.108183 R: 0.099681
epoch 98 loss: 23.542727 F1: 0.098700 P: 0.103208 R: 0.094569
epoch 99 loss: 22.641554 F1: 0.098065 P: 0.102582 R: 0.093930
epoch 100 loss: 21.837103 F1: 0.101367 P: 0.105997 R: 0.097125
epoch 101 loss: 18.833443 F1: 0.104035 P: 0.108787 R: 0.099681
epoch 102 loss: 21.309443 F1: 0.105263 P: 0.109951 R: 0.100958
epoch 103 loss: 19.390417 F1: 0.101661 P: 0.105882 R: 0.097764
epoch 104 loss: 19.361664 F1: 0.105894 P: 0.110570 R: 0.101597
epoch 105 loss: 20.413847 F1: 0.100633 P: 0.105153 R: 0.096486
epoch 106 loss: 20.095713 F1: 0.104423 P: 0.108877 R: 0.100319
epoch 107 loss: 21.468578 F1: 0.105718 P: 0.110187 R: 0.101597
epoch 108 loss: 20.518288 F1: 0.106870 P: 0.111188 R: 0.102875
epoch 109 loss: 20.153645 F1: 0.109079 P: 0.113731 R: 0.104792
epoch 110 loss: 21.636946 F1: 0.107119 P: 0.111728 R: 0.102875
epoch 111 loss: 18.999245 F1: 0.109854 P: 0.114663 R: 0.105431
epoch 112 loss: 22.469585 F1: 0.112255 P: 0.116874 R: 0.107987
epoch 113 loss: 22.612556 F1: 0.108306 P: 0.112803 R: 0.104153
epoch 114 loss: 21.570963 F1: 0.107534 P: 0.111878 R: 0.103514
epoch 115 loss: 20.076935 F1: 0.102853 P: 0.106970 R: 0.099042
epoch 116 loss: 19.920132 F1: 0.108970 P: 0.113495 R: 0.104792
epoch 117 loss: 21.485884 F1: 0.106136 P: 0.110345 R: 0.102236
epoch 118 loss: 22.931591 F1: 0.112292 P: 0.116955 R: 0.107987
epoch 119 loss: 24.095680 F1: 0.110446 P: 0.115198 R: 0.106070
epoch 120 loss: 20.244862 F1: 0.105859 P: 0.110493 R: 0.101597
epoch 121 loss: 20.845545 F1: 0.104775 P: 0.108890 R: 0.100958
epoch 122 loss: 19.785984 F1: 0.106799 P: 0.111034 R: 0.102875
epoch 123 loss: 17.929617 F1: 0.112330 P: 0.117036 R: 0.107987
epoch 124 loss: 20.878696 F1: 0.104949 P: 0.109267 R: 0.100958
epoch 125 loss: 20.286144 F1: 0.107190 P: 0.111883 R: 0.102875
epoch 126 loss: 21.894135 F1: 0.112330 P: 0.117036 R: 0.107987
epoch 127 loss: 22.584496 F1: 0.106916 P: 0.112045 R: 0.102236
epoch 128 loss: 21.797398 F1: 0.105123 P: 0.109646 R: 0.100958
epoch 129 loss: 23.331161 F1: 0.104354 P: 0.108726 R: 0.100319
epoch 130 loss: 23.423994 F1: 0.112397 P: 0.116438 R: 0.108626
epoch 131 loss: 21.362347 F1: 0.112844 P: 0.117403 R: 0.108626
epoch 132 loss: 22.722223 F1: 0.111406 P: 0.115782 R: 0.107348
epoch 133 loss: 19.547405 F1: 0.110116 P: 0.114483 R: 0.106070
epoch 134 loss: 21.386782 F1: 0.106101 P: 0.110269 R: 0.102236
epoch 135 loss: 19.975340 F1: 0.109333 P: 0.114286 R: 0.104792
epoch 136 loss: 18.630228 F1: 0.108825 P: 0.113182 R: 0.104792
'''