Contains code for the EMNLP paper `Zero-Shot Activity Recognition with Verb Attribute Induction'


This repository contains data and code for the EMNLP 2017 paper Zero-Shot Activity Recognition with Verb Attribute Induction (see the paper for more information). If you use our Verbs with Attributes corpus, or if the paper significantly inspires your work, we request that you cite it:


@inproceedings{zellers2017zeroshot,
  author = {Rowan Zellers and Yejin Choi},
  title = {Zero-Shot Activity Recognition with Verb Attribute Induction},
  url = {},
  booktitle = {Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing (EMNLP)},
  year = {2017},
}

Obtaining the Verb Annotations

Our annotations are available in this folder. See the readme there for more information.

To download the imSitu images, you'll need to follow the instructions from the install script here.


I originally wrote this code with PyTorch 0.1.12, but I've updated it to (hopefully) work with PyTorch 0.3. See requirements.txt for dependencies. For ease of use, I recommend installing everything in a virtualenv. Ping me if a dependency is missing (I tried to prune dependencies not needed for this project in particular).
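The virtualenv setup mentioned above might look like the following sketch (the environment name `venv` is an arbitrary choice, and this assumes requirements.txt sits at the repository root):

```shell
# Create and activate an isolated environment, then install the
# dependencies listed in requirements.txt.
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```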

Installing the data that we used

  • You'll need to download the imSitu dataset from here and update IMSITU_PATH accordingly.
  • If you want to pretrain on the dictionary challenge dataset, download it from here. I renamed the file training_data.pkl (inside the archive training_data.tgz) to dictionary_challenge.pkl and moved it to my data folder.
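
The extract-and-rename step above can be scripted roughly as follows. This is an illustrative sketch, not part of the original codebase: the function name `prepare_dictionary_challenge` and the `data_dir` argument are assumptions, and the pickle's path inside the archive may differ from what's shown.

```python
import shutil
import tarfile
from pathlib import Path

def prepare_dictionary_challenge(archive_path, data_dir):
    """Extract training_data.pkl from the training_data.tgz archive and
    rename it to dictionary_challenge.pkl inside the data folder.
    (Hypothetical helper; names here are not from the original repo.)"""
    data_dir = Path(data_dir)
    data_dir.mkdir(parents=True, exist_ok=True)
    with tarfile.open(archive_path, "r:gz") as tar:
        # The pickle's location within the archive may vary, so search for it.
        member = next(m for m in tar.getmembers()
                      if m.name.endswith("training_data.pkl"))
        tar.extract(member, path=data_dir)
    extracted = data_dir / member.name
    target = data_dir / "dictionary_challenge.pkl"
    shutil.move(str(extracted), str(target))
    return target
```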

Reproducing our results

Here's a rough outline of how to reproduce my results:

# Pretrain on the dictionary challenge
python models/

# Pretrain on imSitu
python models/

# Train the ensembling text model
python models/

# Train the imSitu model
python models/

For evaluation, use the scripts and


This documentation is a work in progress, so file an issue or contact me if you have any questions.