- Past commonsense knowledge should accelerate, or at least not interfere with, the learning of new commonsense knowledge or more sophisticated skills.
- Past commonsense knowledge should be retained and accumulated as the model learns new commonsense knowledge or more sophisticated skills.
A unified language model (e.g., GPT-2) serves as both the learner and the pseudo-sample generator.
Learning tasks:
- generative commonsense question answering.
- discriminative multiple-choice commonsense reasoning.
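Both tasks can be cast as text-to-text problems for a single language model. A minimal sketch of the two input/output formats (the templates and option letters here are illustrative assumptions, not the repository's actual ones):

```python
def format_generative_qa(question):
    # Generative QA: the model must produce the answer as free text.
    return f"Q: {question} A:"

def format_multiple_choice(question, choices, answer_idx):
    # Discriminative reasoning: present the candidates in the prompt and
    # train the model to emit the correct option label.
    letters = "ABCDE"
    opts = " ".join(f"({letters[i]}) {c}" for i, c in enumerate(choices))
    prompt = f"Q: {question} {opts} A:"
    return prompt, f" ({letters[answer_idx]})"

gen = format_generative_qa("Why do people eat?")
mc_prompt, mc_target = format_multiple_choice(
    "Where would you find a bed?", ["kitchen", "bedroom"], 1)
```

Casting both tasks into the same prompt-continuation shape is what lets one language model handle the generative and discriminative settings without task-specific heads.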
Sample generation task:
- generate complete commonsense questions.
- meta-replay with uncertain examples stored in a physical episodic memory module.
- supervised replay with confident examples generated by the model itself.
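The two replay streams above can be sketched as a split of stored examples by model uncertainty. This is a toy illustration: the perplexity scores, the threshold, and the class name are assumptions, not the repository's actual implementation:

```python
class EpisodicMemory:
    """Toy episodic memory that routes examples to two replay streams."""

    def __init__(self, uncertainty_threshold=10.0):
        self.buffer = []          # list of (example, perplexity) pairs
        self.threshold = uncertainty_threshold

    def write(self, example, perplexity):
        self.buffer.append((example, perplexity))

    def meta_replay_batch(self):
        # Uncertain examples (high perplexity) are kept for meta-replay.
        return [ex for ex, ppl in self.buffer if ppl >= self.threshold]

    def supervised_replay_batch(self):
        # Confident examples (low perplexity), e.g. pseudo-samples the
        # model generated itself, are used for supervised replay.
        return [ex for ex, ppl in self.buffer if ppl < self.threshold]

memory = EpisodicMemory(uncertainty_threshold=10.0)
memory.write("Q: why do people eat? A: to survive", perplexity=4.2)
memory.write("Q: what can fly? A: ideas", perplexity=37.5)
```

The split means scarce memory slots are spent on what the model is unsure about, while cheap self-generated samples cover the material it already handles confidently.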
Dependencies:
- pytorch
- transformers
- jsonlines
- pytorch-lightning
Training:
```shell
cd src/train
./train.sh
```