Video Demo

PyTorch Action Anticipation Losses

This repository implements the Verb-Noun Marginal Cross Entropy Loss (VNMCE) proposed in the paper (download here):

A. Furnari, S. Battiato, G. M. Farinella (2018). Leveraging Uncertainty to Rethink Loss Functions and Evaluation Measures for Egocentric Action Anticipation. In International Workshop on Egocentric Perception, Interaction and Computing (EPIC) in conjunction with ECCV.

We also report an implementation of the Truncated Top-5 Entropy Loss proposed in the paper:

Lapin, Maksim, Matthias Hein, and Bernt Schiele. "Loss functions for top-k error: Analysis and insights." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2016.
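The intuition behind the top-k loss is to ignore the k−1 highest-scoring non-ground-truth classes before applying a softmax cross entropy, so that predictions which already place the ground truth within the top k incur little loss. The snippet below is a hedged sketch of one common formulation, not the repository's exact implementation; the helper name `truncated_topk_entropy` and the tensor shapes are illustrative assumptions:

```python
import torch
import torch.nn.functional as F

def truncated_topk_entropy(logits, target, k=5):
    """Illustrative sketch of a truncated top-k entropy loss (after Lapin et al.).

    logits: (B, C) class scores; target: (B,) ground-truth class indices.
    """
    B, C = logits.shape
    gt = logits.gather(1, target.unsqueeze(1))            # (B, 1) ground-truth score
    mask = torch.ones_like(logits, dtype=torch.bool)
    mask.scatter_(1, target.unsqueeze(1), False)
    others = logits[mask].view(B, C - 1)                  # non-target scores
    # Drop the k-1 largest competitors: keep only the C-k smallest non-target scores.
    kept, _ = torch.topk(others, C - k, dim=1, largest=False)
    z = torch.cat([gt, kept], dim=1)
    # Cross entropy over the truncated scores, with the ground truth in column 0.
    return F.cross_entropy(z, torch.zeros(B, dtype=torch.long))
```

For k = 1 no competitor is dropped and the loss reduces to the ordinary cross entropy.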

Please see the code for usage examples.
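To illustrate the idea behind the VNMCE loss (a minimal sketch under assumed shapes, not the repository's exact API): the marginal probability of a verb (or noun) is the sum of the probabilities of all actions sharing that verb (or noun), and a negative log-likelihood is applied to those marginals. The names `marginal_cross_entropy` and `group_index` below are illustrative assumptions:

```python
import torch
import torch.nn.functional as F

def marginal_cross_entropy(action_logits, group_index, target_group):
    """Sketch of a marginal cross entropy over verb (or noun) groups.

    action_logits: (B, A) logits over action classes.
    group_index:   (A,) long tensor mapping each action to its verb/noun id.
    target_group:  (B,) long tensor of ground-truth verb/noun ids.
    """
    probs = F.softmax(action_logits, dim=-1)
    num_groups = int(group_index.max()) + 1
    # Marginal probability of a group = sum of the probabilities of its actions.
    marginals = torch.zeros(
        action_logits.size(0), num_groups, device=action_logits.device
    )
    marginals.index_add_(1, group_index, probs)
    return F.nll_loss(torch.log(marginals + 1e-8), target_group)
```

The full VNMCE loss in the paper combines verb and noun marginal terms; summing two such calls (one with an action-to-verb map, one with an action-to-noun map) mirrors that structure.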

Results on the EPIC-KITCHENS test sets

At the time of publication, the evaluation server for the EPIC-KITCHENS egocentric action recognition/anticipation challenges was not yet available, so the results of the proposed method are not reported in the paper. To allow fair comparisons with other methods, the results on the test sets are reported below.

Action Anticipation

These results are related to the TSN model trained for the task of egocentric action anticipation with the VNMCE loss (please refer to the paper for more details):

| Set    | Top-1 Accuracy (%) V / N / A | Top-5 Accuracy (%) V / N / A | Precision (%) V / N / A | Recall (%) V / N / A |
|--------|------------------------------|------------------------------|-------------------------|----------------------|
| Seen   | 27.92 / 16.09 / 10.76        | 73.59 / 39.32 / 25.28        | 23.43 / 17.53 / 6.05    | 14.79 / 11.65 / 5.11 |
| Unseen | 21.27 / 9.90 / 5.57          | 63.33 / 25.50 / 15.71        | 10.02 / 6.88 / 1.99     | 7.68 / 6.61 / 2.39   |

(V = verb, N = noun, A = action)

Action Recognition

These results are related to the TSN model trained for the task of egocentric action recognition with the VNMCE loss (please refer to the paper for more details):

| Set    | Top-1 Accuracy (%) V / N / A | Top-5 Accuracy (%) V / N / A | Precision (%) V / N / A | Recall (%) V / N / A  |
|--------|------------------------------|------------------------------|-------------------------|-----------------------|
| Seen   | 54.22 / 38.85 / 29.00        | 85.22 / 61.80 / 49.62        | 53.87 / 38.18 / 18.22   | 35.88 / 32.27 / 16.56 |
| Unseen | 40.90 / 23.46 / 16.39        | 72.11 / 43.05 / 31.34        | 26.62 / 16.83 / 7.10    | 15.56 / 17.70 / 10.17 |

(V = verb, N = noun, A = action)

Publication

Please cite the following publication if you find this code useful:

@inproceedings{furnari2018Leveraging,
  author = {A. Furnari and S. Battiato and G. M. Farinella},
  title = {Leveraging Uncertainty to Rethink Loss Functions and Evaluation Measures for Egocentric Action Anticipation},
  booktitle = {International Workshop on Egocentric Perception, Interaction and Computing (EPIC) in conjunction with ECCV},
  year = {2018},
}

Related Works

You can find related works at the following page: http://iplab.dmi.unict.it/fpv/.
