Repositories under the egocentric-vision topic:
Official code and data for the EgoBody dataset (ECCV 2022)
PyTorch code for EgoHMR (ICCV 2023): Probabilistic Human Mesh Recovery in 3D Scenes from Egocentric Views
Official implementation of Balanced Spherical Grid for Egocentric View Synthesis (CVPR 2023)
[ICCV 2023] Multiple humans in 3D captured by dynamic and static cameras in 4K.
Codebase for "Multimodal Distillation for Egocentric Action Recognition" (ICCV 2023)
EventEgo3D: 3D Human Motion Capture from Egocentric Event Streams [CVPR'24]
Code for the paper "HOI-Ref: Hand-Object Interaction Referral in Egocentric Vision"
The champion solution for Ego4D Natural Language Queries Challenge in CVPR 2023
Hand detection and tracking from first-person video (FPV): benchmarks and challenges on a rehabilitation exercises dataset
The official PyTorch implementation of the CVPR 2024 paper "PREGO: Online Mistake Detection in PRocedural EGOcentric Videos"
(ECCV 2024) Official repository of paper "EgoExo-Fitness: Towards Egocentric and Exocentric Full-Body Action Understanding"
Official code repository to download the TREK-150 benchmark dataset and run experiments on it.
Official implementation of "A Backpack Full of Skills: Egocentric Video Understanding with Diverse Task Perspectives", accepted at CVPR 2024.
A dataset of egocentric vision, eye-tracking and full body kinematics from human locomotion in out-of-the-lab environments. Also, different use cases of the dataset along with example code.
A curated collection of state-of-the-art work on egocentric Human Activity Recognition (HAR) and action anticipation using deep learning
[ECCV 2024] Official code release for "Multimodal Cross-Domain Few-Shot Learning for Egocentric Action Recognition"
Official repository of the "Ego3DPose: Capturing 3D Cues from Binocular Egocentric Views" (SIGGRAPH Asia 2023)
Official repository of ECCV 2024 paper - "HAT: History-Augmented Anchor Transformer for Online Temporal Action Localization"
Official repository of the "Attention-Propagation Network for Egocentric Heatmap to 3D Pose Lifting" (CVPR 2024 Highlight)
Python implementation of the LTMU-H and TbyD-H trackers proposed in https://arxiv.org/abs/2209.13502
Deep learning models that fuse IMU-based motion capture and first-person video to improve prediction of future knee and ankle joint kinematics in complex real-world environments.
An Ego-Jenga game built with the Unity3D engine; an Oculus VR headset is required to play. Uses egocentric vision to simulate human vision in the virtual world for more natural VR interaction.
A collection of related papers and datasets for egocentric-vision research
Egocentric upper-limb segmentation in unconstrained real-life environments using deep neural networks.
A new synthetic dataset for egocentric 3D human pose estimation