ObjectRelator: Enabling Cross-View Object Relation Understanding Across Ego-Centric and Exo-Centric Perspectives (ICCV 2025 Highlight)
Paper 🌟 Project Page 🚀
- [08/2025] Data, models, code, and training/testing scripts are released. 🔧
- [07/2025] Project website is released. 📖
- [06/2025] Our paper is accepted by ICCV 2025 as a Highlight paper. 🎉
- [06/2025] We were awarded 2nd place in the Correspondences track of the 2025 EgoVis Ego-Exo4D Challenge. Technical report 🏅
- 🔥**Ego-Exo Object Correspondence Task**: We conduct an early exploration of this challenging task, analyzing its unique difficulties, constructing several baselines, and proposing a new method.
- 🔥**ObjectRelator Framework**: We introduce ObjectRelator, a cross-view object segmentation method combining MCFuse and XObjAlign. MCFuse introduces the text modality to this task for the first time, improving localization with multimodal cues for the same object(s), while XObjAlign boosts robustness to cross-view appearance variations via an object-level consistency constraint.
- 🔥**New Testbed & SOTA Results**: Alongside Ego-Exo4D, we present HANDAL-X as an additional benchmark. Our proposed ObjectRelator achieves state-of-the-art (SOTA) results on both datasets.
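To make the two components above concrete, here is a minimal, illustrative NumPy sketch of the two ideas: MCFuse as a fusion of visual and text cues for the same object, and XObjAlign as a consistency loss between ego- and exo-view features of that object. All function names, signatures, and the simple convex-combination fusion are assumptions for illustration only; the actual learned modules are defined in the released code.

```python
import numpy as np

def mcfuse(visual_feat, text_feat, alpha=0.5):
    # MCFuse (sketch): combine visual and text cues for the same object.
    # Here fusion is a fixed convex combination; the real module is learned.
    return alpha * visual_feat + (1 - alpha) * text_feat

def xobj_align_loss(ego_feat, exo_feat):
    # XObjAlign (sketch): object-level consistency constraint, expressed
    # here as cosine distance between ego and exo features of one object.
    cos = ego_feat @ exo_feat / (
        np.linalg.norm(ego_feat) * np.linalg.norm(exo_feat)
    )
    return 1.0 - cos

# Toy features: one visual cue and one text cue for the same object.
v = np.array([1.0, 0.0, 0.0])
t = np.array([0.0, 1.0, 0.0])
fused = mcfuse(v, t)                  # multimodal representation
loss = xobj_align_loss(fused, fused)  # identical ego/exo features -> 0 loss
```

The sketch only conveys the shape of the objective: fused multimodal features per object, pulled together across the ego and exo views during training.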
More video demos can be found at http://yuqianfu.com/ObjectRelator/.
See Installation instructions.
See Prepare Datasets for ObjectRelator.
See Quick Start With ObjectRelator.
Results on the val set (the main results from our ICCV 2025 paper).
Results on the test set, following the same setting as the EgoExo4D Correspondence Challenge and our technical report.
If you find this work useful for your research, please cite it with the following BibTeX entries.
@inproceedings{fu2024objectrelator,
  title={ObjectRelator: Enabling Cross-View Object Relation Understanding in Ego-Centric and Exo-Centric Videos},
  author={Fu, Yuqian and Wang, Runze and Fu, Yanwei and Paudel, Danda Pani and Huang, Xuanjing and Van Gool, Luc},
  booktitle={ICCV},
  year={2025}
}
@article{fu2025cross,
  title={Cross-View Multi-Modal Segmentation @ Ego-Exo4D Challenges 2025},
  author={Fu, Yuqian and Wang, Runze and Fu, Yanwei and Paudel, Danda Pani and Van Gool, Luc},
  journal={arXiv preprint arXiv:2506.05856},
  year={2025}
}
Thanks to the awesome works PSALM, LLaVA, and Ego-Exo4D; our code is built upon them.

