Maistros: A Greek Large Language Model Adapted Through Knowledge Distillation From Large Reasoning Models

This repository contains the code needed to reproduce the experiments of the arXiv paper.
The model (full and 4-bit quantized versions) and the dataset are hosted on HuggingFace.

Installation

pip install -r requirements.txt

First-Time Setup

First, set the project directory path in src/config.py.
The processed datasets and generated answers are included for reproducibility.
To reproduce the experiments, run reproduce_experiments.py.
Setting API keys is optional; it is only required to run overall_approach.py.
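As a rough sketch of what the edit to src/config.py might look like (the variable names below are assumptions for illustration; check the actual file for the names it uses):

```python
# src/config.py -- hypothetical sketch; the real variable names may differ.
from pathlib import Path

# Absolute path to your local clone of the repository (set this first).
PROJECT_DIR = Path("/path/to/Maistros")

# API key(s) are only needed when running overall_approach.py (optional).
API_KEY = ""  # leave empty if you only run reproduce_experiments.py
```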

Citation

@misc{giarelis2026maistrosgreeklargelanguage,
  title = {Maistros: A Greek Large Language Model Adapted Through Knowledge Distillation From Large Reasoning Models},
  author = {Nikolaos Giarelis and Charalampos Mastrokostas and Nikos Karacapilidis},
  year = {2026},
  eprint = {2605.01870},
  archivePrefix = {arXiv},
  primaryClass = {cs.CL},
  url = {https://arxiv.org/abs/2605.01870},
}
