ActorObserverNet code in PyTorch from "Actor and Observer: Joint Modeling of First and Third-Person Videos", CVPR 2018


Contributor: Gunnar Atli Sigurdsson

  • This code implements a triplet network in PyTorch
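The triplet setup can be sketched as follows. This is a minimal illustration of a triplet network in PyTorch, not the repo's actual architecture: a single shared encoder (here a linear layer standing in for a CNN backbone) embeds the anchor, positive, and negative clips, and a triplet margin loss pulls the positive pair together while pushing the negative away.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TripletNet(nn.Module):
    # Minimal sketch; the real model uses a shared video backbone instead
    # of this toy linear encoder.
    def __init__(self, in_dim=16, embed_dim=8):
        super(TripletNet, self).__init__()
        self.encoder = nn.Linear(in_dim, embed_dim)

    def forward(self, anchor, positive, negative):
        # L2-normalize embeddings so pairwise distances are comparable
        return [F.normalize(self.encoder(x), dim=1)
                for x in (anchor, positive, negative)]

net = TripletNet()
a, p, n = (torch.randn(4, 16) for _ in range(3))
fa, fp, fn_ = net(a, p, n)
loss = F.triplet_margin_loss(fa, fp, fn_, margin=0.5)
loss.backward()
```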

The code implements the model found in:

@inproceedings{sigurdsson2018actor,
author = {Gunnar A. Sigurdsson and Abhinav Gupta and Cordelia Schmid and Ali Farhadi and Karteek Alahari},
title = {Actor and Observer: Joint Modeling of First and Third-Person Videos},
booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
year = {2018},
code = {},
}


The Charades-Ego and Charades datasets are available at

Charades-Ego Teaser Video

Technical Overview:

All outputs are stored in the cache-dir. This includes epoch*.txt, which contains the classification outputs. All output files can be scored with the official MATLAB evaluation script provided with the Charades / CharadesEgo datasets.
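The per-epoch outputs described above can be gathered for scoring with a small sketch like this (the cache path is a placeholder, not a real default in the repo):

```python
import glob
import os

# Collect per-epoch classification outputs from the cache directory;
# './cache/my_experiment/' is a hypothetical path for illustration.
cache_dir = './cache/my_experiment/'
score_files = sorted(glob.glob(os.path.join(cache_dir, 'epoch*.txt')))
for path in score_files:
    print(path)  # each file can be passed to the official evaluation script
```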


Requirements:

  • Python 2.7
  • PyTorch

Steps to train your own model on CharadesEgo:

  1. Download the CharadesEgo Annotations (
  2. Download the CharadesEgo RGB frames (
  3. Duplicate and edit one of the experiment files under exp/ with appropriate parameters. For additional parameters, see
  4. Run an experiment by calling python exp/ followed by the name of your experiment file.
  5. The checkpoints/logfiles/outputs are stored in your specified cache directory.
  6. Build off the code, cite our papers, and say hi to us at CVPR.
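An experiment file along the lines of step 3 might look like the sketch below. Every flag name and default here is an assumption for illustration only, not the repo's actual option set; in practice you would copy an existing file under exp/ and edit its parameters.

```python
import argparse

# Hypothetical experiment configuration; all flags and defaults are
# placeholders, not the repository's real command-line options.
def make_args():
    parser = argparse.ArgumentParser(description='CharadesEgo experiment')
    parser.add_argument('--name', default='my_experiment')
    parser.add_argument('--cache-dir', default='./cache/')
    parser.add_argument('--data', default='/path/to/CharadesEgo_rgb/')
    parser.add_argument('--lr', type=float, default=1e-3)
    parser.add_argument('--epochs', type=int, default=20)
    # Parsing an empty list makes the defaults act as the edited parameters
    return parser.parse_args([])

args = make_args()
print(args.name, args.cache_dir)
```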

Good luck!

Pretrained networks:

Charades submission files are available for multiple baselines at