This repository contains the code, data, and models of the paper "CrossSum: Beyond English-Centric Cross-Lingual Summarization for 1,500+ Language Pairs", published in the Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL'23), July 9-14, 2023.
In this work we applied the concept of multilingual zero-shot transfer to the task of toxic comment detection. This approach allows a model trained on a single-language dataset to work in arbitrary languages, even low-resource ones.
This repository contains the code for the experiments on higher-level semantic tasks and meta-learning from "From Zero to Hero: On the Limitations of Zero-Shot Cross-Lingual Transfer".
Improving Quality of Multilingual Question Answering and Cross-Lingual Transfer using Multitask Learning, Knowledge Distillation, and Data Augmentation
This is a project proposal to implement Yan et al.'s (2020) mBERT-Unaligned for cross-lingual reverse dictionaries (RDs), using untranslatable Japanese, German, and Italian terms.
This repository contains the implementation of cross-lingual transfer learning experiments for Named Entity Recognition (NER) between Hindi and Nepali, using pre-trained multilingual BERT models to explore how effectively linguistic knowledge transfers across these languages.