INTRODUCING

HOT3D Dataset

A new benchmark dataset for vision-based
understanding of 3D hand-object interactions

WHAT IS IT?

A new benchmark dataset to better understand how humans use their hands

We use our hands to communicate with others, interact with objects, and handle tools. Yet reliable understanding of how people use their hands to manipulate objects remains a key challenge for computer vision research.

The HOT3D dataset and benchmark will unlock new opportunities in this research area, such as transferring manual skills from experts to less experienced users or robots, helping an AI assistant understand a user's actions, or enabling new input capabilities for AR/VR users, for example turning any physical surface into a virtual keyboard or any pencil into a multi-functional magic wand.

DOWNLOAD THE DATASET BELOW
3 different clips from the HOT3D dataset, with RGB and greyscale images.
WHAT IS IN THE DATASET?

Over one million multi-view frames of hand-object interactions

Dataset Content

  • Synchronized multi-view egocentric videos from Project Aria glasses and Quest 3 VR headset
  • High-quality 3D pose annotations of hands and objects
  • 3D object models with PBR materials
  • 2D bounding boxes
  • Gaze signal
  • 3D scene point cloud from SLAM

Sequence Metrics

  • Over 800 minutes of egocentric recordings
  • 33 diverse hand-held objects
View of 3D object models contained within the HOT3D Dataset

High-fidelity 3D object models

To enable research on model-based object pose estimation, we provide high-fidelity 3D models of 33 diverse objects. Each model is captured with high-resolution geometry and PBR materials, using an in-house 3D scanner.

3 images showing the RGB and greyscale images from Project Aria glasses

Multi-view image streams from the first-person perspective

The HOT3D dataset includes synchronized multi-view image streams from Project Aria glasses and the Quest 3 headset. This enables benchmarking of methods that leverage multi-view and/or temporal information.

RGB view from HOT3D Dataset showing an Aria wearer interacting with tabletop objects, with ground-truth overlaid

Accurate ground-truth 3D poses of hands and objects

Small optical markers were attached to hands and objects and tracked with a professional motion-capture system. The resulting ground truth enables training and evaluating methods for joint hand and object tracking.
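As an illustration of how such ground-truth poses are typically used for evaluation, the sketch below computes the classic ADD pose error (the mean distance between model points transformed by the ground-truth and estimated poses). This is a simplification: the official HOT3D/BOP evaluation defines its own metrics, and the poses and points here are toy values.

```python
import math

def apply_pose(R, t, p):
    """Apply a rotation matrix R (3x3, row-major nested lists) and a
    translation t (3-vector) to a 3D point p."""
    return tuple(sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3))

def add_error(model_points, pose_gt, pose_est):
    """ADD error: mean distance between model points transformed by the
    ground-truth pose and by the estimated pose."""
    R_gt, t_gt = pose_gt
    R_est, t_est = pose_est
    total = 0.0
    for p in model_points:
        total += math.dist(apply_pose(R_gt, t_gt, p),
                           apply_pose(R_est, t_est, p))
    return total / len(model_points)

# Toy example: identity ground truth vs. an estimate offset by 5 mm in x.
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
points = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (0.0, 0.1, 0.0)]
err = add_error(points, (I, (0, 0, 0)), (I, (0.005, 0, 0)))
print(round(err, 4))  # 0.005
```

A pure translation offset shifts every model point by the same amount, so the ADD error equals the offset length; rotation errors instead penalize points far from the rotation axis more strongly.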

3D point cloud showing the position of Aria glasses, hand, object, and eye gaze vector

Precise eye tracking, indicating wearer gaze

Data from Project Aria glasses also includes an eye-gaze signal, which may be useful for predicting the wearer's intent, or for developing efficient tracking methods that focus on hands and objects within the wearer's sight.
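For instance, one simple way to decide whether an object lies within the wearer's sight is to threshold the angle between the gaze ray and the direction from the eye to the object's center. A minimal sketch, assuming all vectors are expressed in a shared world frame (names and values are illustrative):

```python
import math

def angle_to_gaze(gaze_origin, gaze_dir, object_center):
    """Angle (radians) between the gaze ray and the eye-to-object
    direction; a small angle means the object is near the line of sight."""
    to_obj = [object_center[i] - gaze_origin[i] for i in range(3)]
    dot = sum(gaze_dir[i] * to_obj[i] for i in range(3))
    norm = math.hypot(*gaze_dir) * math.hypot(*to_obj)
    return math.acos(max(-1.0, min(1.0, dot / norm)))

# Toy values: an object straight ahead vs. one 45 degrees off to the side.
origin, forward = (0.0, 0.0, 0.0), (0.0, 0.0, 1.0)
print(angle_to_gaze(origin, forward, (0.0, 0.0, 2.0)))  # 0.0
off_side = math.degrees(angle_to_gaze(origin, forward, (1.0, 0.0, 1.0)))
print(round(off_side, 1))  # 45.0
```

Clamping the cosine before `acos` guards against floating-point values marginally outside [-1, 1] for near-parallel vectors.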

HOT3D DATASET TOOLS

Comprehensive tools to load and visualize data easily

We provide Python tools that let researchers work with egocentric 3D hand- and object-tracking data on multi-view image streams.


An API and code samples make it easy to access and visualize the image streams and the high-quality ground-truth 3D poses and shapes of hands and objects.
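Under the hood, overlaying ground-truth 3D poses on images comes down to transforming model points into a camera frame and projecting them with the camera intrinsics. The self-contained sketch below uses a simple pinhole model for illustration only; the actual tools on GitHub handle the real camera models (Aria's cameras are fisheye, which a pinhole model only approximates), and all names here are hypothetical.

```python
def quat_to_matrix(qw, qx, qy, qz):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix
    (row-major nested lists)."""
    return [
        [1 - 2*(qy*qy + qz*qz), 2*(qx*qy - qz*qw),     2*(qx*qz + qy*qw)],
        [2*(qx*qy + qz*qw),     1 - 2*(qx*qx + qz*qz), 2*(qy*qz - qx*qw)],
        [2*(qx*qz - qy*qw),     2*(qy*qz + qx*qw),     1 - 2*(qx*qx + qy*qy)],
    ]

def project_point(p_world, R_cam, t_cam, fx, fy, cx, cy):
    """Transform a 3D world point into camera coordinates, then project it
    with pinhole intrinsics (focal lengths fx, fy; principal point cx, cy)."""
    p_cam = [sum(R_cam[i][j] * p_world[j] for j in range(3)) + t_cam[i]
             for i in range(3)]
    return (fx * p_cam[0] / p_cam[2] + cx, fy * p_cam[1] / p_cam[2] + cy)

# Toy values: a camera at the origin looking down +z, and a point 1 m ahead.
R = quat_to_matrix(1.0, 0.0, 0.0, 0.0)  # identity rotation
print(project_point([0.0, 0.0, 1.0], R, [0.0, 0.0, 0.0], 600, 600, 320, 240))
# -> (320.0, 240.0), i.e. the principal point
```

Pose annotations are commonly stored as a quaternion plus a translation, which is why the sketch converts the quaternion to a matrix before projecting.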

VIEW HOT3D TOOLS ON GITHUB
A screenshot of the visualization tools included with the HOT3D dataset.
BOP BENCHMARK FOR 6D OBJECT POSE ESTIMATION

Unlocking new challenges to accelerate research

In 2024, the HOT3D dataset became an official dataset of the BOP challenge, helping researchers demonstrate their methods on a range of tasks, including model-based and model-free object detection and pose estimation.

LEARN MORE ABOUT HOT3D DATASET BENCHMARK CHALLENGES
An RGB image from the BOP challenge

Read the accompanying HOT3D Research Paper

More information about the HOT3D Dataset can be found in our paper.

READ THE HOT3D PAPER
A screenshot from the HOT3D research paper.

Access HOT3D Dataset and accompanying Tools

If you are a researcher in AI or ML, you can access the HOT3D Dataset and accompanying tools here.

By submitting your email and accessing the HOT3D Dataset, you agree to abide by the dataset license agreement and to receive emails in relation to the dataset.

Subscribe to Project Aria Updates

Stay in the loop with the latest news from Project Aria.

By providing your email, you agree to receive marketing-related electronic communications from Meta, including news, events, updates, and promotional emails related to Project Aria. You may withdraw your consent and unsubscribe at any time, for example by clicking the unsubscribe link included in our emails. For more information about how Meta handles your data, please read our Data Policy.