This repository contains the code to train and evaluate models from the paper:
Reciptor: An Effective Pretrained Model for Recipe Representation Learning.
To install the required dependencies, run:
```
pip install -r requirements.txt
```
To prepare training data from scratch, run:
```
python prepare_dataset.py
```
Please make sure you have downloaded
- `data/encs_train_1024.t7`: Skip-instructions train partition
- `data/encs_val_1024.t7`: Skip-instructions val partition
- `data/encs_test_1024.t7`: Skip-instructions test partition

from the original recipe1M, put them under their corresponding folders, and that you have downloaded `data/foodcom_dataset.json` from foodcom_dataset.json.
Alternatively, you can download our preprocessed sample data from reciptor_data and the full data from zenodo, and unzip them under the `data` directory.
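Before training, it can be worth confirming that the expected files are in place. The snippet below is a minimal sketch that checks for the Skip-instructions partitions and the food.com JSON listed above; it assumes `data/foodcom_dataset.json` is a single JSON document loadable with `json.load` (if it turns out to be JSON-lines, read it line by line instead).

```python
import json
import os

# Files the instructions above expect under the data/ directory.
REQUIRED_FILES = [
    "data/encs_train_1024.t7",    # Skip-instructions train partition
    "data/encs_val_1024.t7",      # Skip-instructions val partition
    "data/encs_test_1024.t7",     # Skip-instructions test partition
    "data/foodcom_dataset.json",  # food.com recipes
]

missing = [path for path in REQUIRED_FILES if not os.path.isfile(path)]
if missing:
    print("Missing files:", ", ".join(missing))
else:
    print("All expected data files are present.")
    # Peek at the food.com recipes (assumes a single JSON document).
    with open("data/foodcom_dataset.json") as f:
        recipes = json.load(f)
    if isinstance(recipes, list) and recipes:
        print(f"{len(recipes)} recipes; example record keys: {sorted(recipes[0].keys())}")
    else:
        print(f"Loaded a {type(recipes).__name__} with {len(recipes)} entries")
```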
To run the recipe representation model:
```
bash run_foodcom_reciptor.sh
```
The model will be saved in the `--snapshots` path.
- Notice: to run the baseline models (jm, sjm) described in our paper, please change `--model_type` to `jm`|`sjm` accordingly.
To store the pretrained recipe embeddings:
```
bash run_store_embed.sh
```
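If you want to inspect the stored embeddings outside the provided scripts, the sketch below shows one way to do it. The output path and file format here are assumptions made only for illustration (a pickled mapping from recipe ID to vector at a hypothetical `data/recipe_embeds.pkl`); check `run_store_embed.sh` for where and how the embeddings are actually written.

```python
import pickle

import numpy as np

# Hypothetical output path -- see run_store_embed.sh for the real location and format.
EMBED_PATH = "data/recipe_embeds.pkl"

with open(EMBED_PATH, "rb") as f:
    embeds = pickle.load(f)  # assumed: {recipe_id: 1-D vector}

ids = list(embeds.keys())
matrix = np.asarray([embeds[i] for i in ids], dtype=np.float32)
print(f"{matrix.shape[0]} recipe embeddings of dimension {matrix.shape[1]}")
```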
To evaluate the pretrained recipe embeddings on the category classification task:
```
bash run_evaluation.sh
```
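For context, this evaluation amounts to fitting a classifier on top of the fixed recipe embeddings and measuring accuracy on held-out recipes. The sketch below illustrates that protocol with placeholder data and a scikit-learn logistic regression; it is not the repository's `run_evaluation.sh`, and the embedding dimension, number of categories, and choice of classifier are assumptions made only for the illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Placeholder data standing in for pretrained recipe embeddings and category labels.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(1000, 600)).astype(np.float32)  # assumed embedding dim
labels = rng.integers(0, 10, size=1000)                       # assumed 10 categories

X_train, X_test, y_train, y_test = train_test_split(
    embeddings, labels, test_size=0.2, random_state=0, stratify=labels
)

# Linear probe: the embeddings stay fixed; only the classifier is trained.
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)
print(f"Category classification accuracy: {clf.score(X_test, y_test):.3f}")
```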
The backbone of this framework is based on torralba-lab/im2recipe-Pytorch.
The implementation of Set Transformer is based on TropComplique/set-transformer.