Galician BERT Checkpoints

This repository contains several training checkpoints of the base and small BERT models for Galician introduced in Garcia (2021), Exploring the Representation of Word Meanings in Context: A Case Study on Homonymy and Synonymy (ACL 2021). The final models are available on the Hugging Face Hub: BERT-base and BERT-small.
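
The final models can be loaded directly from the Hugging Face Hub with the transformers library. The sketch below is a minimal example; the model identifier is an assumption, so check the Hub pages linked above for the exact names.

```python
# Minimal sketch: loading one of the Galician BERT models from the
# Hugging Face Hub. The model ID below is an assumption; verify it on
# the Hub page linked above (an analogous ID would exist for the
# small model).
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

model_id = "marcosgg/bert-base-gl-cased"  # assumed Hub ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Quick sanity check with the fill-mask pipeline (Galician example sentence).
unmasker = pipeline("fill-mask", model=model_id)
print(unmasker("Santiago de Compostela é a capital de [MASK].")[0]["token_str"])
```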

0-425k steps

The following links point to the checkpoints released by de-Dios-Flores and Garcia (2022) in A computational psycholinguistic evaluation of the syntactic abilities of Galician BERT models at the interface of dependency resolution and training time (SEPLN 2022):

  1. BERT-base: Zenodo link.
  2. BERT-small: Zenodo link.
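
Once a Zenodo archive has been downloaded and unpacked, individual checkpoints can be loaded like any local model directory. This is a sketch under the assumption that each checkpoint is stored in the standard Hugging Face format (config.json, weight file, and vocabulary); the directory path is a placeholder.

```python
# Minimal sketch: loading an intermediate training checkpoint from a local
# directory. The path is hypothetical; adjust it to wherever the Zenodo
# archive was unpacked.
from transformers import AutoModelForMaskedLM, AutoTokenizer

ckpt_dir = "./galician-bert-base/step_100k"  # hypothetical local path

tokenizer = AutoTokenizer.from_pretrained(ckpt_dir)
model = AutoModelForMaskedLM.from_pretrained(ckpt_dir)
model.eval()  # inference mode for evaluating the checkpoint
```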

Citation

If you use these checkpoints, please cite the following paper:

@article{dediosflores-garcia-2022-computational,
    title = "A computational psycholinguistic evaluation of the syntactic abilities of Galician BERT models at the interface of dependency resolution and training time",
    author = "Iria {de-Dios-Flores} and Marcos Garcia",
    journal = "Procesamiento del Lenguaje Natural",
    year = "2022",
    publisher = "Sociedad Española para el Procesamiento del Lenguaje Natural",
    volume = "69"
}
