
pet.jl

Pattern-Exploiting Training in Julia and Knet. A replication of "It’s Not Just Size That Matters: Small Language Models Are Also Few-Shot Learners"

Setting up

./download-data.sh
./download-albert.sh

Replicating experiments

Scripts for the PET and iPET versions of each dataset can be found under src/scripts.

To run an experiment, for example BoolQ with PET, run:

cd src && julia scripts/boolq_pet.jl
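PET's core trick is to recast each task example as a cloze-style pattern whose masked slot a language model fills in, with a verbalizer mapping labels to single tokens. As a rough illustration of that idea (the pattern and verbalizer below are hypothetical simplifications for BoolQ, not the ones this repository's scripts use):

```python
# Sketch of PET's pattern/verbalizer idea for a BoolQ-style example.
# The concrete pattern and verbalizer are illustrative only.

def boolq_pattern(passage: str, question: str) -> str:
    """Recast a BoolQ example as a cloze question with a [MASK] slot."""
    return f"{passage} Question: {question}? Answer: [MASK]."

# Verbalizer: maps each label to one token the masked LM can predict.
VERBALIZER = {True: "Yes", False: "No"}

def fill(pattern: str, label: bool) -> str:
    """Substitute the verbalized label into the mask slot."""
    return pattern.replace("[MASK]", VERBALIZER[label])

example = boolq_pattern("The Eiffel Tower is in Paris.",
                        "is the eiffel tower in france")
print(fill(example, True))
```

During training, the model's probability of each verbalizer token at the mask position is used as the score for the corresponding label.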

Running Baselines

All baselines

cd src && julia baselines.jl

Specific baseline

Here are the possible flags:

cd src && julia baselines.jl --dataset BoolQ/CB/COPA/RTE/WiC/WSC/all --method Random/MostCommon/all

For more details, you can always do:

julia src/baselines.jl --help
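The Random and MostCommon baselines amount to predicting either a uniformly sampled label or the majority label of the training split. A minimal sketch of the idea (function names and details here are illustrative, not taken from `baselines.jl`):

```python
import random
from collections import Counter

def random_baseline(train_labels, n, seed=0):
    """Predict a uniformly random label (from the training label set)
    for each of n test examples."""
    rng = random.Random(seed)
    choices = sorted(set(train_labels))
    return [rng.choice(choices) for _ in range(n)]

def most_common_baseline(train_labels, n):
    """Predict the majority label of the training split for every test example."""
    majority = Counter(train_labels).most_common(1)[0][0]
    return [majority] * n

train_labels = [True, True, False, True]
print(most_common_baseline(train_labels, 3))  # -> [True, True, True]
```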

ALBERT

Requirements

  • PyCall

Python Dependencies

  • transformers

Example

cd src && julia albert_example.jl

Input: The capital of France is [MASK].

Output: the capital of france is paris .

To use your own input, uncomment the relevant lines in src/albert_example.jl

Running tests

All ALBERT related tests

cd src/albert && julia albert_tests.jl
