
Few-Shot Learning with Siamese Networks and Label Tuning

A few-shot learning method based on siamese networks.

Code & models for the paper to appear at ACL 2022.

The Symanto Few-Shot Benchmark

symanto-fsb implements the benchmark discussed in the paper.

It can be easily extended to evaluate new models. See the extension section below.

Installation

pip install -e .

CLI

Char-SVM

symanto-fsb evaluate-char-svm output/char_svm --n-trials=1

This will run the specified number of trials on each dataset and write results to the output directory. Afterwards you can create a result table:

symanto-fsb report output /tmp/report.tsv

Sentence-Transformer

Zero-Shot

symanto-fsb \
   evaluate-sentence-transformer \
    output/pml-mpnet \
    --gpu 0 \
    --n-examples=0 \
    --n-trials=1 
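In the zero-shot setting (`--n-examples=0`), a siamese model can classify a text by embedding both the input and the label descriptions in the same vector space and picking the label with the highest cosine similarity. A minimal sketch of that idea, using toy vectors in place of a real sentence encoder (the vectors and label names below are illustrative, not outputs of the released models):

```python
import numpy as np

def zero_shot_predict(text_emb, label_embs):
    """Return the index of the label whose embedding is most cosine-similar."""
    text = text_emb / np.linalg.norm(text_emb)
    labels = label_embs / np.linalg.norm(label_embs, axis=1, keepdims=True)
    return int(np.argmax(labels @ text))

# Toy embeddings standing in for sentence-transformer outputs.
label_embs = np.array([[1.0, 0.1],   # e.g. "positive"
                       [0.0, 1.0]])  # e.g. "negative"
text_emb = np.array([0.9, 0.2])      # embedding of an input text
print(zero_shot_predict(text_emb, label_embs))  # → 0
```

With a real encoder the two toy arrays would come from encoding the input text and the label descriptions with the same model.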

Few-Shot

symanto-fsb \
   evaluate-sentence-transformer \
    output/pml-mpnet \
    --gpu 0 \
    --n-examples=8 \
    --n-trials=1
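With a few labeled examples per class (`--n-examples=8`), one common siamese-network recipe is nearest-prototype classification: represent each class by the mean embedding of its examples and assign new texts to the closest prototype. This is a sketch under that assumption, not necessarily the exact procedure of the paper (which tunes the label embeddings):

```python
import numpy as np

def build_prototypes(example_embs, labels, n_classes):
    """Mean embedding per class, normalized to unit length."""
    protos = np.stack([example_embs[labels == c].mean(axis=0)
                       for c in range(n_classes)])
    return protos / np.linalg.norm(protos, axis=1, keepdims=True)

def predict(text_emb, prototypes):
    """Index of the nearest prototype by cosine similarity."""
    text = text_emb / np.linalg.norm(text_emb)
    return int(np.argmax(prototypes @ text))

# 2 classes with 2 toy example embeddings each (stand-ins for encoder outputs).
embs = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
labels = np.array([0, 0, 1, 1])
protos = build_prototypes(embs, labels, n_classes=2)
print(predict(np.array([0.8, 0.3]), protos))  # → 0
```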

Extension

In general, a new model is added in two steps:

  1. A new implementation of the Predictor interface
  2. A new command to cli.py
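For illustration, an implementation of step 1 might look like the sketch below. The actual Predictor interface lives in the symanto-fsb package; the method names and signatures here (`fit`, `predict`) are assumptions for the sake of a self-contained example, not the real API:

```python
from abc import ABC, abstractmethod
from typing import List

class Predictor(ABC):
    """Stand-in for the symanto-fsb Predictor interface (names are assumed)."""

    @abstractmethod
    def predict(self, texts: List[str]) -> List[int]:
        ...

class MajorityClassPredictor(Predictor):
    """Trivial baseline: always predicts the most frequent training label."""

    def __init__(self) -> None:
        self.majority = 0

    def fit(self, labels: List[int]) -> None:
        self.majority = max(set(labels), key=labels.count)

    def predict(self, texts: List[str]) -> List[int]:
        return [self.majority] * len(texts)

p = MajorityClassPredictor()
p.fit([0, 1, 1, 1, 0])
print(p.predict(["some text", "other text"]))  # → [1, 1]
```

Step 2 then registers the new predictor as a subcommand in cli.py, analogous to the existing `evaluate-char-svm` and `evaluate-sentence-transformer` commands.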

Known Issues

Datasets hosted on Google Drive currently cannot be loaded; see datasets issue/3809.

Testing & Maintenance

pip install -r dev-requirements.txt
dev-tools/format.sh
dev-tools/lint.sh
dev-tools/test.sh

Models

For the sake of comparability, we trained 4 models. The code above can be used with the Siamese network models.

Cross Attention

Siamese Networks

Disclaimer

This is not an official Symanto product!

How to Cite

@inproceedings{labeltuning2022,
	title        = {{Few-Shot} {Learning} with {Siamese} {Networks} and {Label} {Tuning}},
	author       = {M{\"u}ller, Thomas and Pérez-Torró, Guillermo and Franco-Salvador, Marc},
	year         = {2022},
	booktitle    = {ACL (to appear)},
	url          = {https://arxiv.org/abs/2203.14655},
}
