This repository includes several training checkpoints of the base and small BERT models for Galician published in Garcia, M. (2021), *Exploring the Representation of Word Meanings in Context: A Case Study on Homonymy and Synonymy* (ACL 2021), and available on the Hugging Face Hub: BERT-base and BERT-small.
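For reference, a minimal sketch of loading one of the Hub models with the `transformers` library; the model identifier used below is an assumption, so check the Hub pages linked above for the exact names:

```python
from transformers import pipeline

# Fill-mask pipeline over the Galician BERT-base model.
# The model ID is an assumption; verify it on the Hugging Face Hub.
fill_mask = pipeline("fill-mask", model="marcosgg/bert-base-gl-cased")

# [MASK] is the mask token used by BERT tokenizers.
for prediction in fill_mask("As [MASK] son un símbolo de Galicia."):
    print(prediction["token_str"], prediction["score"])
```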
The following links contain the checkpoints released by de-Dios-Flores, I. & Garcia, M. (2022), *A computational psycholinguistic evaluation of the syntactic abilities of Galician BERT models at the interface of dependency resolution and training time* (SEPLN 2022):
- BERT-base: Zenodo link.
- BERT-small: Zenodo link.
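Once a checkpoint archive from Zenodo has been downloaded and unpacked, it can be loaded like any local `transformers` checkpoint. A minimal sketch, where the directory name is a placeholder for wherever you extracted the files:

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer

# Path to the unpacked checkpoint directory (placeholder; adjust to your layout).
checkpoint_dir = "./bert-base-gl-checkpoint"

tokenizer = AutoTokenizer.from_pretrained(checkpoint_dir)
model = AutoModelForMaskedLM.from_pretrained(checkpoint_dir)
```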
If you use these checkpoints, please cite the following paper:
```bibtex
@article{dediosflores-garcia-2022-computational,
    title = "A computational psycholinguistic evaluation of the syntactic abilities of Galician BERT models at the interface of dependency resolution and training time",
    author = "Iria {de-Dios-Flores} and Marcos Garcia",
    journal = "Procesamiento del Lenguaje Natural",
    year = "2022",
    publisher = "Sociedad Española para el Procesamiento del Lenguaje Natural",
    volume = "69"
}
```