shadowwkl/LSTM-for-Multiple-Instance-Learning
In Defense of LSTMs for Addressing Multiple Instance Learning Problems

Kaili Wang, Jose Oramas, Tinne Tuytelaars

Oral paper @ ACCV 2020.

You can find the paper here: In Defense of LSTMs for Addressing Multiple Instance Learning Problems

Abstract

LSTMs have a proven track record in analyzing sequential data. But what about unordered instance bags, as found under a Multiple Instance Learning (MIL) setting? While not often used for this, we show LSTMs excel under this setting too. In addition, we show that LSTMs are capable of indirectly capturing instance-level information using only bag-level annotations. Thus, they can be used to learn instance-level models in a weakly supervised manner. Our empirical evaluation on both simplified (MNIST) and realistic (Lookbook and Histopathology) datasets shows that LSTMs are competitive with or even surpass state-of-the-art methods especially designed for handling specific MIL problems. Moreover, we show that their performance on instance-level prediction is close to that of fully supervised methods.

Implementation

To be organized and uploaded.

The code is based on PyTorch 0.4.1 and Python 2. The implementation considers Attention-based Deep Multiple Instance Learning.
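For readers unfamiliar with the attention-based MIL pooling referenced above (Ilse et al.'s attention mechanism, which this repository builds on), here is a rough, framework-free sketch of the core computation: each instance embedding is scored as w · tanh(V h), the scores are softmax-normalized into attention weights, and the bag representation is the weighted sum. The parameter names `V` and `w` follow that formulation; the concrete values in the usage example below are illustrative, not from the released code.

```python
import math

def attention_pool(embeddings, V, w):
    """Attention-based MIL pooling over a bag of instance embeddings.

    embeddings: list of instance vectors (each of length d)
    V: hidden_dim x d weight matrix, w: vector of length hidden_dim
    Returns the pooled bag vector and the per-instance attention weights.
    """
    # Score each instance: s_k = w . tanh(V h_k)
    scores = []
    for h in embeddings:
        hidden = [math.tanh(sum(V[i][j] * h[j] for j in range(len(h))))
                  for i in range(len(V))]
        scores.append(sum(w[i] * hidden[i] for i in range(len(w))))
    # Softmax over instances (numerically stabilized)
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    alphas = [e / total for e in exps]
    # Bag representation: attention-weighted sum of instance embeddings
    bag = [sum(a * h[j] for a, h in zip(alphas, embeddings))
           for j in range(len(embeddings[0]))]
    return bag, alphas
```

Unlike an LSTM, which consumes the instances in some order, this pooling is permutation-invariant by construction, which is why it is a natural point of comparison for the paper's order-based approach.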

Usage

We provide the training and testing code for the difficult Outlier Detection experiment. We use 10,000 sets to train the model and 2,000 sets to test. The set cardinality is 6 with a standard deviation of 1.
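Since the data pipeline is not yet uploaded, the following is only a sketch of how outlier-detection bags with this cardinality distribution could be generated: set sizes are drawn from a normal distribution with mean 6 and standard deviation 1, and a positive bag contains one injected outlier instance while only the bag-level label is kept. The integer-coded instances and the function name `make_bag` are hypothetical stand-ins, not the authors' data format.

```python
import random

def make_bag(outlier_prob=0.5, mean_card=6, std_card=1, seed=None):
    """Generate one synthetic bag for an outlier-detection MIL task.

    Normal instances are coded 0-8; with probability `outlier_prob` one
    instance is replaced by an outlier (coded 9), making the bag positive.
    Bag cardinality is sampled from N(mean_card, std_card), clipped to >= 2.
    Only the bag-level label is returned; the outlier's index is not.
    """
    rng = random.Random(seed)
    size = max(2, round(rng.gauss(mean_card, std_card)))
    instances = [rng.randint(0, 8) for _ in range(size)]  # normal instances
    label = 0
    if rng.random() < outlier_prob:
        instances[rng.randrange(size)] = 9  # inject a single outlier
        label = 1  # bag-level supervision only
    return instances, label
```

Sampling 10,000 such bags for training and 2,000 for testing reproduces the split sizes described above.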
