fzehracetin/turkish-question-answering
Turkish Question Answering

This is our Bachelor's graduation thesis from Yildiz Technical University on question-answering systems with deep learning. Betül Ön and I collected 610 paragraphs from Turkish Wikipedia and extracted 5,000 question-answer pairs from them. With this dataset we fine-tuned Turkish BERT, ALBERT, and ELECTRA models for the question-answering task, achieving 68% exact match and 81% F1 with BERT, 49% exact match and 68% F1 with ALBERT, and 66% exact match and 82% F1 with ELECTRA. We are thankful to our supervisor Prof. Dr. Banu Diri for her excellent mentorship. Thanks also to the Bayerische Staatsbibliothek and Loodos for these great pre-trained Turkish models, and to Hugging Face for the Transformers and Tokenizers libraries.

Dataset Format ✏
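
The original format illustration is not reproduced here, but SQuAD-style extractive-QA datasets (the format Hugging Face's question-answering fine-tuning scripts expect) are typically nested JSON where each answer carries a character offset into its context. The field names and the Turkish example below are assumptions for illustration, not the repository's actual records:

```python
# Hypothetical sample in SQuAD v1.1 layout; not taken from the actual dataset.
sample = {
    "data": [{
        "title": "İstanbul",
        "paragraphs": [{
            "context": "İstanbul, Türkiye'nin en kalabalık şehridir.",
            "qas": [{
                "id": "00001",
                "question": "Türkiye'nin en kalabalık şehri hangisidir?",
                "answers": [{"text": "İstanbul", "answer_start": 0}],
            }],
        }],
    }]
}

def check_answer_offsets(dataset):
    """Verify that each answer_start actually indexes the answer text
    inside its context, which span-extraction training depends on."""
    for article in dataset["data"]:
        for paragraph in article["paragraphs"]:
            context = paragraph["context"]
            for qa in paragraph["qas"]:
                for answer in qa["answers"]:
                    start = answer["answer_start"]
                    if context[start:start + len(answer["text"])] != answer["text"]:
                        return False
    return True
```

A validation pass like `check_answer_offsets` is worth running once over the whole file, since a single misaligned `answer_start` silently corrupts the training labels for that question.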

BERT, ALBERT, ELECTRA Architectures ⚙
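
BERT, ALBERT, and ELECTRA differ in pre-training, but for extractive question answering they share the same head: the model scores every token as a potential answer start and as a potential answer end, and the predicted answer is the span whose combined score is highest. A minimal, framework-free sketch of that span selection (the `max_answer_len` cap is a common decoding heuristic, not necessarily what this repository uses):

```python
def best_span(start_logits, end_logits, max_answer_len=30):
    """Return (start, end) token indices maximizing
    start_logits[start] + end_logits[end], with start <= end
    and the span no longer than max_answer_len tokens."""
    best = (0, 0)
    best_score = float("-inf")
    for i, start_score in enumerate(start_logits):
        # only consider ends at or after the start, within the length cap
        for j in range(i, min(i + max_answer_len, len(end_logits))):
            score = start_score + end_logits[j]
            if score > best_score:
                best_score, best = score, (i, j)
    return best
```

The selected token indices are then mapped back to characters in the context to produce the answer string.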

Test Results 📊
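
Exact match counts a prediction as correct only when it equals the gold answer after normalization, while F1 gives partial credit for token overlap; the percentages above are averages of these per-question scores. A sketch of the standard SQuAD-style metrics (note that Python's default `lower()` does not handle Turkish dotted/dotless "i" casing, so a real Turkish evaluation may need locale-aware normalization):

```python
import re
from collections import Counter

def normalize(text):
    # lowercase, strip punctuation, collapse whitespace (SQuAD-style)
    text = re.sub(r"[^\w\s]", "", text.lower())
    return " ".join(text.split())

def exact_match(prediction, ground_truth):
    """1.0 if the normalized strings are identical, else 0.0."""
    return float(normalize(prediction) == normalize(ground_truth))

def f1_score(prediction, ground_truth):
    """Token-overlap F1 between prediction and gold answer."""
    pred_tokens = normalize(prediction).split()
    truth_tokens = normalize(ground_truth).split()
    common = Counter(pred_tokens) & Counter(truth_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(truth_tokens)
    return 2 * precision * recall / (precision + recall)
```

Corpus-level scores are simply the mean of these values over all test questions, which is how figures like 68% EM / 81% F1 are obtained.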

WEB GUI 💻
