
How many epochs did you train on webnlg dataset? #2

Closed
Nicoleqwerty opened this issue Apr 20, 2019 · 5 comments

Comments

@Nicoleqwerty

Here is the log from training the model on WebNLG. The loss declines only with difficulty, and training starts from a higher loss value than in the log file you have shown.
'''
config filename: ./config.json
load_embedding!
loading ./data/webnlg/entity_end_position/train.json
data size 5019
loading ./data/webnlg/entity_end_position/dev.json
data size 703
epoch 1 loss: 32.915226 F1: 0.002298 P: 0.002868 R: 0.001917
epoch 2 loss: 31.900913 F1: 0.009905 P: 0.012264 R: 0.008307
epoch 3 loss: 27.510458 F1: 0.015719 P: 0.018970 R: 0.013419
epoch 4 loss: 28.100975 F1: 0.025424 P: 0.032008 R: 0.021086
epoch 5 loss: 26.887936 F1: 0.050152 P: 0.061856 R: 0.042173
epoch 6 loss: 24.247564 F1: 0.065084 P: 0.081184 R: 0.054313
epoch 7 loss: 24.565767 F1: 0.071375 P: 0.086600 R: 0.060703
epoch 8 loss: 25.273928 F1: 0.078818 P: 0.096834 R: 0.066454
epoch 9 loss: 20.739996 F1: 0.080451 P: 0.097717 R: 0.068371
epoch 10 loss: 23.133522 F1: 0.081061 P: 0.099535 R: 0.068371
epoch 11 loss: 23.140676 F1: 0.083801 P: 0.101083 R: 0.071565
epoch 12 loss: 23.790190 F1: 0.078331 P: 0.091688 R: 0.068371
epoch 13 loss: 25.107319 F1: 0.099564 P: 0.115417 R: 0.087540
epoch 14 loss: 23.819378 F1: 0.090511 P: 0.105532 R: 0.079233
epoch 15 loss: 22.022694 F1: 0.086194 P: 0.100597 R: 0.075399
epoch 16 loss: 23.545181 F1: 0.090909 P: 0.105485 R: 0.079872
epoch 17 loss: 22.910625 F1: 0.098246 P: 0.108949 R: 0.089457
epoch 18 loss: 19.319550 F1: 0.115789 P: 0.128405 R: 0.105431
epoch 19 loss: 21.152763 F1: 0.099858 P: 0.111994 R: 0.090096
epoch 20 loss: 21.946230 F1: 0.095470 P: 0.104981 R: 0.087540
epoch 21 loss: 19.299545 F1: 0.102300 P: 0.110534 R: 0.095208
epoch 22 loss: 23.689260 F1: 0.097695 P: 0.105812 R: 0.090735
epoch 23 loss: 23.338881 F1: 0.091374 P: 0.097953 R: 0.085623
epoch 24 loss: 21.140081 F1: 0.099003 P: 0.107143 R: 0.092013
epoch 25 loss: 20.936872 F1: 0.100138 P: 0.108941 R: 0.092652
epoch 26 loss: 18.399645 F1: 0.103602 P: 0.111852 R: 0.096486
epoch 27 loss: 21.461807 F1: 0.101880 P: 0.109559 R: 0.095208
epoch 28 loss: 21.085964 F1: 0.098673 P: 0.105531 R: 0.092652
epoch 29 loss: 21.767021 F1: 0.100812 P: 0.107117 R: 0.095208
epoch 30 loss: 21.892426 F1: 0.102096 P: 0.108399 R: 0.096486
epoch 31 loss: 21.119205 F1: 0.089643 P: 0.095652 R: 0.084345
epoch 32 loss: 20.188158 F1: 0.105983 P: 0.113971 R: 0.099042
epoch 33 loss: 22.146685 F1: 0.105727 P: 0.112554 R: 0.099681
epoch 34 loss: 22.138407 F1: 0.093960 P: 0.098940 R: 0.089457
epoch 35 loss: 21.843102 F1: 0.096566 P: 0.103198 R: 0.090735
epoch 36 loss: 19.749903 F1: 0.094840 P: 0.104375 R: 0.086901
epoch 37 loss: 20.443235 F1: 0.099729 P: 0.106291 R: 0.093930
epoch 38 loss: 19.464769 F1: 0.099865 P: 0.105790 R: 0.094569
epoch 39 loss: 22.916927 F1: 0.104223 P: 0.111597 R: 0.097764
epoch 40 loss: 22.332972 F1: 0.096533 P: 0.103123 R: 0.090735
epoch 41 loss: 21.962793 F1: 0.094883 P: 0.101010 R: 0.089457
epoch 42 loss: 22.766172 F1: 0.095174 P: 0.100858 R: 0.090096
epoch 43 loss: 22.344751 F1: 0.097002 P: 0.102564 R: 0.092013
epoch 44 loss: 23.941555 F1: 0.099529 P: 0.105039 R: 0.094569
epoch 45 loss: 22.505873 F1: 0.100633 P: 0.105153 R: 0.096486
epoch 46 loss: 20.527216 F1: 0.105898 P: 0.111346 R: 0.100958
epoch 47 loss: 19.862801 F1: 0.098459 P: 0.103448 R: 0.093930
epoch 48 loss: 20.391645 F1: 0.101604 P: 0.106517 R: 0.097125
epoch 49 loss: 20.863894 F1: 0.104035 P: 0.108787 R: 0.099681
epoch 50 loss: 21.055967 F1: 0.103010 P: 0.108070 R: 0.098403
epoch 51 loss: 20.388582 F1: 0.099599 P: 0.104415 R: 0.095208
epoch 52 loss: 22.958044 F1: 0.101469 P: 0.106219 R: 0.097125
epoch 53 loss: 20.523462 F1: 0.102513 P: 0.107746 R: 0.097764
epoch 54 loss: 20.575268 F1: 0.098732 P: 0.103280 R: 0.094569
epoch 55 loss: 22.237806 F1: 0.101536 P: 0.106368 R: 0.097125
epoch 56 loss: 20.335493 F1: 0.106525 P: 0.111188 R: 0.102236
epoch 57 loss: 20.177532 F1: 0.095365 P: 0.099721 R: 0.091374
epoch 58 loss: 18.787577 F1: 0.094126 P: 0.098532 R: 0.090096
epoch 59 loss: 21.285160 F1: 0.099300 P: 0.103760 R: 0.095208
epoch 60 loss: 21.078987 F1: 0.101672 P: 0.106667 R: 0.097125
epoch 61 loss: 21.739445 F1: 0.098262 P: 0.103013 R: 0.093930
epoch 62 loss: 17.307463 F1: 0.101604 P: 0.106517 R: 0.097125
epoch 63 loss: 20.719521 F1: 0.097724 P: 0.102600 R: 0.093291
epoch 64 loss: 23.640509 F1: 0.106000 P: 0.110801 R: 0.101597
epoch 65 loss: 21.545544 F1: 0.099967 P: 0.104457 R: 0.095847
epoch 66 loss: 19.790474 F1: 0.098997 P: 0.103860 R: 0.094569
epoch 67 loss: 19.295521 F1: 0.104388 P: 0.108801 R: 0.100319
epoch 68 loss: 19.242487 F1: 0.102838 P: 0.107692 R: 0.098403
epoch 69 loss: 21.927759 F1: 0.102632 P: 0.107242 R: 0.098403
epoch 70 loss: 21.096117 F1: 0.097804 P: 0.102012 R: 0.093930
epoch 71 loss: 19.201813 F1: 0.101358 P: 0.105227 R: 0.097764
epoch 72 loss: 20.965561 F1: 0.099933 P: 0.104384 R: 0.095847
epoch 73 loss: 22.830009 F1: 0.098274 P: 0.102281 R: 0.094569
epoch 74 loss: 21.505526 F1: 0.103540 P: 0.108467 R: 0.099042
epoch 75 loss: 19.773630 F1: 0.100399 P: 0.104643 R: 0.096486
epoch 76 loss: 20.365486 F1: 0.099536 P: 0.103520 R: 0.095847
epoch 77 loss: 21.803555 F1: 0.104914 P: 0.109191 R: 0.100958
epoch 78 loss: 19.608545 F1: 0.098525 P: 0.103594 R: 0.093930
epoch 79 loss: 24.478024 F1: 0.079640 P: 0.086924 R: 0.073482
epoch 80 loss: 24.325733 F1: 0.079002 P: 0.086298 R: 0.072843
epoch 81 loss: 21.051134 F1: 0.100105 P: 0.110681 R: 0.091374
epoch 82 loss: 20.675182 F1: 0.103472 P: 0.110706 R: 0.097125
epoch 83 loss: 20.217590 F1: 0.096589 P: 0.102436 R: 0.091374
epoch 84 loss: 23.057375 F1: 0.099596 P: 0.105188 R: 0.094569
epoch 85 loss: 22.329268 F1: 0.096625 P: 0.100206 R: 0.093291
epoch 86 loss: 25.108236 F1: 0.093677 P: 0.098315 R: 0.089457
epoch 87 loss: 20.028948 F1: 0.092852 P: 0.097271 R: 0.088818
epoch 88 loss: 23.089718 F1: 0.089692 P: 0.094167 R: 0.085623
epoch 89 loss: 18.960468 F1: 0.089692 P: 0.094167 R: 0.085623
epoch 90 loss: 22.726799 F1: 0.091122 P: 0.095775 R: 0.086901
epoch 91 loss: 22.099943 F1: 0.099967 P: 0.104457 R: 0.095847
epoch 92 loss: 23.679205 F1: 0.090576 P: 0.094576 R: 0.086901
epoch 93 loss: 22.687222 F1: 0.094220 P: 0.098739 R: 0.090096
epoch 94 loss: 23.637705 F1: 0.097967 P: 0.102368 R: 0.093930
epoch 95 loss: 19.864727 F1: 0.095270 P: 0.099513 R: 0.091374
epoch 96 loss: 25.225853 F1: 0.095143 P: 0.099237 R: 0.091374
epoch 97 loss: 20.020922 F1: 0.103758 P: 0.108183 R: 0.099681
epoch 98 loss: 23.542727 F1: 0.098700 P: 0.103208 R: 0.094569
epoch 99 loss: 22.641554 F1: 0.098065 P: 0.102582 R: 0.093930
epoch 100 loss: 21.837103 F1: 0.101367 P: 0.105997 R: 0.097125
epoch 101 loss: 18.833443 F1: 0.104035 P: 0.108787 R: 0.099681
epoch 102 loss: 21.309443 F1: 0.105263 P: 0.109951 R: 0.100958
epoch 103 loss: 19.390417 F1: 0.101661 P: 0.105882 R: 0.097764
epoch 104 loss: 19.361664 F1: 0.105894 P: 0.110570 R: 0.101597
epoch 105 loss: 20.413847 F1: 0.100633 P: 0.105153 R: 0.096486
epoch 106 loss: 20.095713 F1: 0.104423 P: 0.108877 R: 0.100319
epoch 107 loss: 21.468578 F1: 0.105718 P: 0.110187 R: 0.101597
epoch 108 loss: 20.518288 F1: 0.106870 P: 0.111188 R: 0.102875
epoch 109 loss: 20.153645 F1: 0.109079 P: 0.113731 R: 0.104792
epoch 110 loss: 21.636946 F1: 0.107119 P: 0.111728 R: 0.102875
epoch 111 loss: 18.999245 F1: 0.109854 P: 0.114663 R: 0.105431
epoch 112 loss: 22.469585 F1: 0.112255 P: 0.116874 R: 0.107987
epoch 113 loss: 22.612556 F1: 0.108306 P: 0.112803 R: 0.104153
epoch 114 loss: 21.570963 F1: 0.107534 P: 0.111878 R: 0.103514
epoch 115 loss: 20.076935 F1: 0.102853 P: 0.106970 R: 0.099042
epoch 116 loss: 19.920132 F1: 0.108970 P: 0.113495 R: 0.104792
epoch 117 loss: 21.485884 F1: 0.106136 P: 0.110345 R: 0.102236
epoch 118 loss: 22.931591 F1: 0.112292 P: 0.116955 R: 0.107987
epoch 119 loss: 24.095680 F1: 0.110446 P: 0.115198 R: 0.106070
epoch 120 loss: 20.244862 F1: 0.105859 P: 0.110493 R: 0.101597
epoch 121 loss: 20.845545 F1: 0.104775 P: 0.108890 R: 0.100958
epoch 122 loss: 19.785984 F1: 0.106799 P: 0.111034 R: 0.102875
epoch 123 loss: 17.929617 F1: 0.112330 P: 0.117036 R: 0.107987
epoch 124 loss: 20.878696 F1: 0.104949 P: 0.109267 R: 0.100958
epoch 125 loss: 20.286144 F1: 0.107190 P: 0.111883 R: 0.102875
epoch 126 loss: 21.894135 F1: 0.112330 P: 0.117036 R: 0.107987
epoch 127 loss: 22.584496 F1: 0.106916 P: 0.112045 R: 0.102236
epoch 128 loss: 21.797398 F1: 0.105123 P: 0.109646 R: 0.100958
epoch 129 loss: 23.331161 F1: 0.104354 P: 0.108726 R: 0.100319
epoch 130 loss: 23.423994 F1: 0.112397 P: 0.116438 R: 0.108626
epoch 131 loss: 21.362347 F1: 0.112844 P: 0.117403 R: 0.108626
epoch 132 loss: 22.722223 F1: 0.111406 P: 0.115782 R: 0.107348
epoch 133 loss: 19.547405 F1: 0.110116 P: 0.114483 R: 0.106070
epoch 134 loss: 21.386782 F1: 0.106101 P: 0.110269 R: 0.102236
epoch 135 loss: 19.975340 F1: 0.109333 P: 0.114286 R: 0.104792
epoch 136 loss: 18.630228 F1: 0.108825 P: 0.113182 R: 0.104792
'''
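
As a sanity check on the numbers above, the logged F1 appears to be the plain harmonic mean of the logged P and R; a minimal check in Python, with the values copied from epoch 136 of this log:

'''
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Values taken from epoch 136 of the log above.
print(f1(0.113182, 0.104792))  # ~0.108825, matching the logged F1
'''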

@WindChimeRan
Owner

That's strange. I tried many different settings, and they all worked. I have updated the code to a new version that is closer to the original one. Please give it a try with the default settings.
'''
epoch 1 loss: 20.777309 F1: 0.050310 P: 0.058574 R: 0.044089
relation F1: 0.294568 P: 0.342954 R: 0.258147
entity F1: 0.201969 P: 0.235144 R: 0.176997


epoch 2 loss: 13.665100 F1: 0.075149 P: 0.089938 R: 0.064537
relation F1: 0.357143 P: 0.427427 R: 0.306709
entity F1: 0.287202 P: 0.343722 R: 0.246645


epoch 3 loss: 12.094102 F1: 0.147541 P: 0.176944 R: 0.126518
relation F1: 0.513413 P: 0.615728 R: 0.440256
entity F1: 0.356930 P: 0.428061 R: 0.306070


epoch 4 loss: 13.580454 F1: 0.198138 P: 0.237500 R: 0.169968
relation F1: 0.539292 P: 0.646429 R: 0.462620
entity F1: 0.410428 P: 0.491964 R: 0.352077


epoch 5 loss: 11.634348 F1: 0.210015 P: 0.252925 R: 0.179553
relation F1: 0.580717 P: 0.699370 R: 0.496486
entity F1: 0.449925 P: 0.541854 R: 0.384665


epoch 6 loss: 10.756370 F1: 0.208622 P: 0.262343 R: 0.173163
relation F1: 0.622787 P: 0.783156 R: 0.516933
entity F1: 0.421093 P: 0.529526 R: 0.349521


epoch 7 loss: 8.445783 F1: 0.230285 P: 0.273768 R: 0.198722
relation F1: 0.630137 P: 0.749120 R: 0.543770
entity F1: 0.465013 P: 0.552817 R: 0.401278


epoch 8 loss: 9.191703 F1: 0.240741 P: 0.303797 R: 0.199361
relation F1: 0.655864 P: 0.827653 R: 0.543131
entity F1: 0.459877 P: 0.580331 R: 0.380831


epoch 9 loss: 6.884187 F1: 0.243792 P: 0.296432 R: 0.207029
relation F1: 0.662152 P: 0.805124 R: 0.562300
entity F1: 0.463506 P: 0.563586 R: 0.393610


epoch 10 loss: 7.504792 F1: 0.244103 P: 0.294756 R: 0.208307
relation F1: 0.676900 P: 0.817360 R: 0.577636
entity F1: 0.497941 P: 0.601266 R: 0.424920


epoch 11 loss: 7.149976 F1: 0.242936 P: 0.285345 R: 0.211502
relation F1: 0.694312 P: 0.815517 R: 0.604473
entity F1: 0.484404 P: 0.568966 R: 0.421725


epoch 12 loss: 6.352170 F1: 0.266135 P: 0.307628 R: 0.234505
relation F1: 0.717912 P: 0.829841 R: 0.632588
entity F1: 0.539521 P: 0.623638 R: 0.475399


epoch 13 loss: 6.124747 F1: 0.267201 P: 0.310491 R: 0.234505
relation F1: 0.717874 P: 0.834179 R: 0.630032
entity F1: 0.537313 P: 0.624365 R: 0.471565


epoch 14 loss: 6.946354 F1: 0.278336 P: 0.317253 R: 0.247923
relation F1: 0.738164 P: 0.841374 R: 0.657508
entity F1: 0.544476 P: 0.620605 R: 0.484984


epoch 15 loss: 5.522422 F1: 0.281497 P: 0.322341 R: 0.249840
relation F1: 0.721382 P: 0.826051 R: 0.640256
entity F1: 0.539237 P: 0.617477 R: 0.478594


epoch 16 loss: 5.934337 F1: 0.267284 P: 0.302176 R: 0.239617
relation F1: 0.730577 P: 0.825947 R: 0.654952
entity F1: 0.530292 P: 0.599517 R: 0.475399


epoch 17 loss: 6.453718 F1: 0.270039 P: 0.305153 R: 0.242173
relation F1: 0.741717 P: 0.838164 R: 0.665176
entity F1: 0.541503 P: 0.611916 R: 0.485623


epoch 18 loss: 4.792238 F1: 0.268985 P: 0.302474 R: 0.242173
relation F1: 0.731725 P: 0.822825 R: 0.658786
entity F1: 0.549326 P: 0.617717 R: 0.494569


epoch 19 loss: 4.421184 F1: 0.276836 P: 0.309392 R: 0.250479
relation F1: 0.739407 P: 0.826361 R: 0.669010
entity F1: 0.570621 P: 0.637727 R: 0.516294


epoch 20 loss: 4.318580 F1: 0.280458 P: 0.306991 R: 0.258147
relation F1: 0.748351 P: 0.819149 R: 0.688818
entity F1: 0.583825 P: 0.639058 R: 0.537380


epoch 21 loss: 4.167426 F1: 0.279696 P: 0.304282 R: 0.258786
relation F1: 0.770028 P: 0.837716 R: 0.712460
entity F1: 0.553177 P: 0.601803 R: 0.511821
'''
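
If I read this log correctly, "F1" seems to score full (head, relation, tail) triples, while "relation F1" and "entity F1" score the relation and the entity pair in isolation; this interpretation is an assumption on my part, not something confirmed by the repo. A minimal per-sentence sketch of such a decomposition (the example triples are made up):

'''
# Hypothetical sketch: set-based P/R/F1 at three levels of matching.
def prf(pred, gold):
    pred, gold = set(pred), set(gold)
    tp = len(pred & gold)
    p = tp / len(pred) if pred else 0.0
    r = tp / len(gold) if gold else 0.0
    f = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f

gold = [("Amsterdam_Airport", "runwayLength", "2014.0")]
pred = [("Amsterdam_Airport", "runwayLength", "3800.0")]

# The relation matches (F1 = 1.0) while the full triple and the entity pair do not.
print("triple  :", prf(pred, gold))
print("relation:", prf([t[1] for t in pred], [t[1] for t in gold]))
print("entity  :", prf([(t[0], t[2]) for t in pred], [(t[0], t[2]) for t in gold]))
'''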

@Nicoleqwerty
Author

Thank you for your work! I tried a new version of the WebNLG dataset (after entity masking); I had made some mistakes in my earlier dataset. The results are good. If you could add the data preprocessing method, this would be an excellent general framework for the NRE task.
'''
epoch 1 loss: 16.559380 F1: 0.142799 P: 0.199543 R: 0.111182
relation F1: 0.415098 P: 0.580046 R: 0.323192
entity F1: 0.420008 P: 0.586907 R: 0.327015


epoch 2 loss: 13.444241 F1: 0.188377 P: 0.278801 R: 0.142243
relation F1: 0.560278 P: 0.829223 R: 0.423065
entity F1: 0.439405 P: 0.650328 R: 0.331794


epoch 3 loss: 10.825327 F1: 0.220714 P: 0.320303 R: 0.168366
relation F1: 0.608060 P: 0.882424 R: 0.463842
entity F1: 0.482982 P: 0.700909 R: 0.368429


epoch 4 loss: 10.490962 F1: 0.241535 P: 0.339389 R: 0.187480
relation F1: 0.642315 P: 0.902537 R: 0.498566
entity F1: 0.502976 P: 0.706747 R: 0.390411


epoch 5 loss: 10.200238 F1: 0.251348 P: 0.347985 R: 0.196719
relation F1: 0.663682 P: 0.918850 R: 0.519433
entity F1: 0.533632 P: 0.738800 R: 0.417649


epoch 6 loss: 9.493075 F1: 0.258635 P: 0.356983 R: 0.202772
relation F1: 0.681227 P: 0.940269 R: 0.534087
entity F1: 0.536164 P: 0.740045 R: 0.420357


epoch 7 loss: 9.527404 F1: 0.260773 P: 0.355825 R: 0.205798
relation F1: 0.691694 P: 0.943817 R: 0.545874
entity F1: 0.530427 P: 0.723768 R: 0.418605


epoch 8 loss: 9.709828 F1: 0.270671 P: 0.356307 R: 0.218222
relation F1: 0.715598 P: 0.942003 R: 0.576935
entity F1: 0.567223 P: 0.746684 R: 0.457311


epoch 9 loss: 10.282122 F1: 0.291675 P: 0.375879 R: 0.238292
relation F1: 0.742835 P: 0.957286 R: 0.606881
entity F1: 0.583934 P: 0.752513 R: 0.477063


epoch 10 loss: 8.802513 F1: 0.288422 P: 0.367151 R: 0.237496
relation F1: 0.756746 P: 0.963310 R: 0.623128
entity F1: 0.575878 P: 0.733071 R: 0.474196


epoch 11 loss: 8.855803 F1: 0.299773 P: 0.369231 R: 0.252310
relation F1: 0.766465 P: 0.944056 R: 0.645110
entity F1: 0.614307 P: 0.756643 R: 0.517044


epoch 12 loss: 9.465460 F1: 0.295006 P: 0.355411 R: 0.252150
relation F1: 0.789042 P: 0.950606 R: 0.674419
entity F1: 0.605293 P: 0.729232 R: 0.517362


epoch 13 loss: 7.226478 F1: 0.302694 P: 0.361326 R: 0.260433
relation F1: 0.799963 P: 0.954917 R: 0.688277
entity F1: 0.632232 P: 0.754696 R: 0.543963


epoch 14 loss: 7.257793 F1: 0.310519 P: 0.366833 R: 0.269194
relation F1: 0.805328 P: 0.951378 R: 0.698152
entity F1: 0.648783 P: 0.766442 R: 0.562440


epoch 15 loss: 6.049053 F1: 0.310207 P: 0.360794 R: 0.272061
relation F1: 0.814384 P: 0.947191 R: 0.714240
entity F1: 0.655467 P: 0.762357 R: 0.574865


epoch 16 loss: 7.307980 F1: 0.316016 P: 0.361032 R: 0.280981
relation F1: 0.831243 P: 0.949652 R: 0.739089
entity F1: 0.661233 P: 0.755424 R: 0.587926


epoch 17 loss: 6.690179 F1: 0.318442 P: 0.362637 R: 0.283848
relation F1: 0.833810 P: 0.949532 R: 0.743230
entity F1: 0.663510 P: 0.755596 R: 0.591430


epoch 18 loss: 6.352150 F1: 0.319246 P: 0.356890 R: 0.288786
relation F1: 0.846628 P: 0.946457 R: 0.765849
entity F1: 0.670893 P: 0.750000 R: 0.606881


epoch 19 loss: 5.875832 F1: 0.321760 P: 0.357839 R: 0.292291
relation F1: 0.849378 P: 0.944618 R: 0.771583
entity F1: 0.670174 P: 0.745320 R: 0.608793


epoch 20 loss: 6.401268 F1: 0.327978 P: 0.362969 R: 0.299140
relation F1: 0.853825 P: 0.944917 R: 0.778751
entity F1: 0.684946 P: 0.758021 R: 0.624721


epoch 21 loss: 4.804408 F1: 0.330060 P: 0.363167 R: 0.302485
relation F1: 0.849396 P: 0.934596 R: 0.778433
entity F1: 0.693317 P: 0.762861 R: 0.635393


epoch 22 loss: 5.498651 F1: 0.331173 P: 0.361988 R: 0.305193
relation F1: 0.855760 P: 0.935386 R: 0.788627
entity F1: 0.692594 P: 0.757038 R: 0.638261


epoch 23 loss: 6.467267 F1: 0.335626 P: 0.364587 R: 0.310927
relation F1: 0.857806 P: 0.931827 R: 0.794680
entity F1: 0.702373 P: 0.762981 R: 0.650685


epoch 24 loss: 4.964077 F1: 0.330294 P: 0.355140 R: 0.308697
relation F1: 0.862378 P: 0.927249 R: 0.805989
entity F1: 0.698935 P: 0.751512 R: 0.653234


epoch 25 loss: 5.744630 F1: 0.338613 P: 0.361548 R: 0.318414
relation F1: 0.866435 P: 0.925122 R: 0.814750
entity F1: 0.716185 P: 0.764695 R: 0.673463
'''
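
Regarding the entity masking mentioned above: the exact preprocessing is not shown in this thread, so the following is only an illustrative sketch of the general idea, i.e. replacing each entity mention in the sentence with a placeholder token. The function name and the placeholder format are made up, not the repo's actual code.

'''
def mask_entities(tokens, entity_spans):
    """Replace each entity span (start, end), end exclusive, with one placeholder token."""
    masked, i = [], 0
    spans = sorted(entity_spans)
    while i < len(tokens):
        span = next(((s, e) for s, e in spans if s == i), None)
        if span:
            masked.append("<ENT>")  # hypothetical placeholder token
            i = span[1]
        else:
            masked.append(tokens[i])
            i += 1
    return masked

tokens = "The runway length of Amsterdam Airport Schiphol is 2014.0 .".split()
print(mask_entities(tokens, [(4, 7), (8, 9)]))
# ['The', 'runway', 'length', 'of', '<ENT>', 'is', '<ENT>', '.']
'''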

@WindChimeRan
Owner

My preprocessing script data_prepare.py is almost the same as the official TensorFlow version's. However, I do not have the method that converts WebNLG from .xml to .json.

@Nicoleqwerty
Author

You can refer to the webnlg2017 website.

@WindChimeRan
Owner

Actually, I tried to parse the XML myself. Unfortunately, my preprocessing generated JSON different from the official one, so in the end I used the TF version directly.

To compare your model fairly with the CopyRE paper, I think you can either:

  1. ask the authors for their preprocessing, or

  2. parse the XML yourself and evaluate the models on the new WebNLG for NRE (a rough parser sketch follows below).
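
For option 2, here is a rough sketch of such a parser, assuming the standard WebNLG 2017 challenge XML layout (<entry> elements containing <mtriple> children in "subject | predicate | object" form and <lex> sentences). The output JSON schema here is illustrative only and will differ from the repo's entity_end_position format.

'''
# Sketch only: parse WebNLG 2017 challenge XML into a simple JSON list.
import json
import xml.etree.ElementTree as ET

def webnlg_xml_to_json(xml_path, json_path):
    root = ET.parse(xml_path).getroot()
    examples = []
    for entry in root.iter("entry"):
        triples = []
        for mtriple in entry.iter("mtriple"):
            parts = [p.strip() for p in mtriple.text.split("|")]
            if len(parts) == 3:
                triples.append(dict(zip(("subject", "predicate", "object"), parts)))
        for lex in entry.iter("lex"):
            if lex.text:
                examples.append({"sentence": lex.text.strip(), "triples": triples})
    with open(json_path, "w") as f:
        json.dump(examples, f, indent=2)

# webnlg_xml_to_json("train.xml", "train.json")  # hypothetical file names
'''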
