
Can't reproduce F1 scores #2

Closed
lxy444 opened this issue Mar 23, 2019 · 3 comments
@lxy444 commented Mar 23, 2019

After training the model with the default parameters, the result is:

```
             precision    recall  f1-score   support

        ORG     0.7019    0.7685    0.7337       337
        LOC     0.8190    0.8854    0.8509       419
       MISC     0.8188    0.8278    0.8233       273
        PER     0.7619    0.7453    0.7535       322

avg / total     0.7761    0.8113    0.7929      1351
```

which is different from the README results.
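(For context, the table above is an entity-level report of the kind produced by conlleval/seqeval. A minimal sketch of how per-label precision, recall, and F1 are computed from gold and predicted entity spans; the spans and labels below are illustrative, not the repository's evaluation code:)

```python
# Entity-level precision/recall/F1, comparing entities as exact
# (label, start, end) spans -- a sketch of the metric, not the
# repository's evaluation code.

def prf1(gold, pred):
    """gold, pred: sets of (label, start, end) tuples."""
    tp = len(gold & pred)                      # exact-match true positives
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Illustrative example: one LOC span has a wrong boundary.
gold = {("PER", 0, 1), ("LOC", 4, 5), ("ORG", 8, 10)}
pred = {("PER", 0, 1), ("LOC", 4, 6), ("ORG", 8, 10)}
p, r, f = prf1(gold, pred)
print(round(p, 4), round(r, 4), round(f, 4))  # 0.6667 0.6667 0.6667
```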

@kamalkraj (Owner)

Default parameters? Can you share the command you used for training?

@lxy444 (Author) commented Mar 24, 2019

> Default parameters? Can you share the command you used for training?

Yes, I used the default parameters:

```
python run_ner.py --data_dir=data/ --bert_model=bert-base-cased --task_name=ner --output_dir=out --max_seq_length=128 --do_train --num_train_epochs 5 --do_eval --warmup_proportion=0.4
```

Also, there is a file named model_config.json in the out folder, which contains:

```json
{"bert_model": "/mnt/data/share_data/nlp/bert/model/cased_L-12_H-768_A-12", "do_lower": false, "max_seq_length": 128, "num_labels": 13, "label_map": {"1": "O", "2": "B-MISC", "3": "I-MISC", "4": "B-PER", "5": "I-PER", "6": "B-ORG", "7": "I-ORG", "8": "B-LOC", "9": "I-LOC", "10": "X", "11": "[CLS]", "12": "[SEP]"}}
```
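(As an aside, that `label_map` is what maps the model's integer outputs back to tags at prediction time. A small illustrative sketch, with the map taken from the config above and the predicted id sequence invented for the example:)

```python
import json

# The label_map as written to out/model_config.json (keys are strings
# in JSON, so convert them to ints before indexing).
config = json.loads('''{"num_labels": 13, "label_map": {"1": "O",
  "2": "B-MISC", "3": "I-MISC", "4": "B-PER", "5": "I-PER",
  "6": "B-ORG", "7": "I-ORG", "8": "B-LOC", "9": "I-LOC",
  "10": "X", "11": "[CLS]", "12": "[SEP]"}}''')

label_map = {int(k): v for k, v in config["label_map"].items()}

# Decode a (made-up) sequence of predicted label ids back to tags.
pred_ids = [11, 4, 5, 1, 8, 12]
tags = [label_map[i] for i in pred_ids]
print(tags)  # ['[CLS]', 'B-PER', 'I-PER', 'O', 'B-LOC', '[SEP]']
```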

@kamalkraj (Owner)

I trained on different machines; the results vary by only ~0.3.
For now you can use the pretrained model (link in the README).
