New synthetic dataset for egocentric 3D human pose estimation
Egocentric Upper Limb Segmentation in unconstrained real-life environments using Deep Neural Network.
Deep learning models that fuse IMU-based motion capture with first-person video to improve the prediction of future knee and ankle joint kinematics in complex real-world environments.
An Ego-Jenga game built with the Unity3D engine; requires an Oculus VR headset to play. Uses egocentric vision to approximate human vision in the virtual world for more natural interaction in VR.
A collection of related papers and datasets for research.
Official PyTorch implementation of our work "Egocentric zone-aware action recognition across environments"
Python implementation of the LTMU-H and TbyD-H trackers proposed in https://arxiv.org/abs/2209.13502
[ICRA 2024] Official Implementation of EgoPAT3Dv2: Predicting 3D Action Target from 2D Egocentric Vision for Human-Robot Interaction
Official repository of the "Attention-Propagation Network for Egocentric Heatmap to 3D Pose Lifting" (CVPR 2024 Highlight)
[ECCV 2024] EgoPoseFormer: A Simple Baseline for Stereo Egocentric 3D Human Pose Estimation
A curated collection of state-of-the-art work on egocentric Human Activity Recognition (HAR) and action anticipation with deep learning.
Official repository of the "Ego3DPose: Capturing 3D Cues from Binocular Egocentric Views" (SIGGRAPH Asia 2023)
Official repository of ECCV 2024 paper - "HAT: History-Augmented Anchor Transformer for Online Temporal Action Localization"
A dataset of egocentric vision, eye-tracking, and full-body kinematics from human locomotion in out-of-the-lab environments, with example code demonstrating several use cases.
[ECCV 2024] Official code release for "Multimodal Cross-Domain Few-Shot Learning for Egocentric Action Recognition"
Official code repository to download the TREK-150 benchmark dataset and run experiments on it.
Official implementation of "A Backpack Full of Skills: Egocentric Video Understanding with Diverse Task Perspectives", accepted at CVPR 2024.
The official PyTorch implementation of the IEEE/CVF Computer Vision and Pattern Recognition (CVPR) '24 paper PREGO: online mistake detection in PRocedural EGOcentric videos.
✌️ Hand detection and tracking from FPV: benchmarks and challenges on a rehabilitation exercises dataset
The champion solution for the Ego4D Natural Language Queries Challenge at CVPR 2023