This repository provides the official implementation of the paper:
Divide and Conquer: Object Co-occurrence Helps Mitigate Simplicity Bias in OOD Detection (CVPR 2026)
OCO is an object-centric OOD detection method that leverages Object CO-occurrence patterns and a divide-and-conquer scoring strategy to mitigate simplicity bias and improve both challenging and full-spectrum OOD detection. Below is the pipeline:
Our implementation is based on the OpenOOD framework. For more detailed environment configuration and instructions, please refer to https://github.com/Jingkang50/OpenOOD.
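The divide-and-conquer scoring idea described above can be illustrated with a toy sketch. This is not the paper's code: the function names, cosine-similarity scoring, and mean aggregation are all placeholders standing in for OCO's actual per-object scoring and co-occurrence-aware aggregation.

```python
# Toy illustration only (NOT the paper's implementation):
# an image is decomposed into object regions ("slots"), each slot is
# scored against ID object prototypes, and the per-object scores are
# aggregated so no single simple cue dominates the decision.

def _cosine(a, b):
    """Cosine similarity between two feature vectors."""
    num = sum(x * y for x, y in zip(a, b))
    den = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return num / den if den else 0.0

def slot_scores(slot_features, prototypes):
    """Score each slot by its best match against ID prototypes."""
    return [max(_cosine(f, p) for p in prototypes) for f in slot_features]

def toy_oco_score(slot_features, prototypes):
    """Aggregate per-object scores; a simple mean here, standing in
    for the co-occurrence-aware aggregation used in the paper."""
    scores = slot_scores(slot_features, prototypes)
    return sum(scores) / len(scores)
```

With prototypes `[[1, 0], [0, 1]]`, slots that match a prototype exactly score 1.0, while mismatched slots pull the aggregate score down, flagging the input as more OOD.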
```bash
# Clone the repository
git clone https://github.com/Michael-McQueen/OCO.git
cd OCO

# Create and activate a new Conda environment
conda create -n oco python=3.10
conda activate oco

# Install required dependencies
pip install git+https://github.com/Jingkang50/OpenOOD
pip install libmr
pip install timm
```

Download the required benchmark datasets using the provided script:
```bash
bash scripts/download/download.sh
```

Note: Due to copyright restrictions, the ImageNet-1K training images are not downloaded automatically. Please obtain them from the official ImageNet website and place them in the appropriate `./data` directory structure.
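A quick sanity check like the following can confirm the manually placed ImageNet-1K files are where the loaders expect them. The subdirectory names below are an assumption based on OpenOOD's usual layout, not something this repository specifies; verify them against the download script and adjust as needed.

```python
from pathlib import Path

def check_imagenet_layout(data_root="./data"):
    """Report which of the assumed OpenOOD-style dataset paths exist.

    The subdirectory names here are assumptions -- check them against
    the actual download script before relying on this helper."""
    root = Path(data_root)
    candidates = [
        root,
        root / "images_largescale" / "imagenet_1k" / "train",
        root / "images_largescale" / "imagenet_1k" / "val",
    ]
    return {str(p): p.exists() for p in candidates}
```

Running it before training surfaces missing directories early instead of failing mid-pipeline.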
Running the OCO pipeline involves a two-step process: pretraining the Slot Attention module, followed by the integrated 3-stage OCO training and evaluation pipeline.
Before training the main OCONet, you should obtain a pre-trained Slot Attention checkpoint.
- Clone and follow the instructions in the DINOSAUR repository to pre-train the model on the default COCO 2017 dataset.
- Configuration settings: resize images to 224x224 and set `slot_num` to 6. Leave the other hyperparameters at their defaults.
- Save the resulting checkpoint.
Once your pretrained checkpoint is in place, you can execute the full OCO methodology on ImageNet-1K using the provided bash script.
```bash
bash scripts/ood/oco/oco.sh
```

The `oco.sh` script automates the full 3-stage methodology:

- Stage A: train OCONet on ID data.
- Stage B: build $F_{train}$.
- Stage C: run OOD/FS-OOD evaluation.
This codebase is built upon OpenOOD. We thank the authors for their excellent work.
If you find this work useful, please cite our paper.
