Datasets
Curated datasets for training and developing robotics applications.
BOP Challenge Benchmark
Benchmark for 6D object pose estimation that aggregates multiple datasets (e.g., LM-O, T-LESS, YCB-V) under a unified evaluation protocol.
ARC X Robotics Benchmark
Amazon's object-picking benchmark for robotic manipulation, with a standardized evaluation suite.
ACT Model Zoo
Pretrained Action Chunking Transformer models for manipulation. Includes model checkpoints and evaluation code.
ROS Industrial Repository
Robot models, controllers, and configurations for dozens of industrial robot arms. Essential for simulation and deployment.
LM-O (Linemod-Occluded)
6DoF object pose dataset with 8 LINEMOD objects captured under heavy occlusion. A standard benchmark for occlusion-robust pose estimation.
T-LESS
6DoF object pose estimation dataset with 30 texture-less, industry-relevant objects. Used for manipulation and pose estimation benchmarks.
LIBERO Language-Prompt Datasets
Household manipulation tasks paired with language prompts and demonstration episodes for language-conditioned skill learning.
DexMV / Dex-Hand
Dexterous hand manipulation demonstrations. Includes complex in-hand manipulation tasks.
RoboTurk / R2D2
Crowdsourced robot teleoperation dataset. Large-scale human demonstrations collected via web interface.
BC-Z
Large-scale multi-task robotic manipulation dataset with 24,000 demonstration episodes across 100+ tasks.
Meta-World
Popular simulation benchmark for multi-task and meta-RL research: 50 simulated tabletop manipulation tasks of varying difficulty.
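A minimal sketch of constructing a single Meta-World task via the `metaworld` package's ML1 benchmark. The task name and the exact reset/step return signatures depend on the installed version (gym vs. gymnasium), so treat those details as assumptions.

```python
import random
import metaworld

# ML1 exposes one task family; "pick-place-v2" is one of the 50 task names.
ml1 = metaworld.ML1("pick-place-v2")
env = ml1.train_classes["pick-place-v2"]()    # instantiate the environment class
env.set_task(random.choice(ml1.train_tasks))  # sample a goal/object configuration

obs = env.reset()                             # newer versions return (obs, info)
result = env.step(env.action_space.sample())  # 4- vs 5-tuple depends on gym/gymnasium version
```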
Franka Kitchen Dataset
Canonical benchmark for multi-step manipulation with a simulated Franka Panda in a kitchen scene, including task demonstrations.
BEHAVIOR-1K
Comprehensive simulation suite with 1,000 household tasks. Includes both simulation and real-world components.
SayCan / Commands Dataset
Google's language-conditioned manipulation dataset that preceded RT-1, pairing natural language commands with robot demonstrations.
Diffusion Policy Model Zoo
Dozens of pretrained diffusion-based policies for tabletop manipulation tasks. Ready-to-use skill models.
OpenVLA
Open-source vision-language-action (VLA) model for robot control. The pretrained checkpoint can be fine-tuned for specific tasks and embodiments.
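A minimal inference sketch, assuming the checkpoint published as `openvla/openvla-7b` on the Hugging Face Hub and the prompt format and `predict_action` helper described on its model card; verify both against the current release before use. The image path is a placeholder.

```python
import torch
from PIL import Image
from transformers import AutoModelForVision2Seq, AutoProcessor

MODEL_ID = "openvla/openvla-7b"  # checkpoint name per the public release
processor = AutoProcessor.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForVision2Seq.from_pretrained(
    MODEL_ID, torch_dtype=torch.bfloat16, trust_remote_code=True
).to("cuda")

image = Image.open("frame.png")  # hypothetical camera frame from the robot
prompt = "In: What action should the robot take to pick up the cup?\nOut:"  # prompt template per the model card

inputs = processor(prompt, image).to("cuda", dtype=torch.bfloat16)
# predict_action is provided by the model's remote code; unnorm_key selects the
# dataset statistics used to un-normalize the predicted 7-DoF action.
action = model.predict_action(**inputs, unnorm_key="bridge_orig", do_sample=False)
```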
Isaac Sim Assets Library
Robot models, articulation trees, sensors, and full simulation scenes for NVIDIA Isaac Sim. Essential for simulation workflows.
KITTI
Classic robot perception dataset for outdoor autonomous driving. Includes stereo images, LiDAR, GPS/IMU, and object labels.
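KITTI Velodyne scans are stored as flat float32 binaries of (x, y, z, reflectance) points, so a scan can be read without any KITTI-specific library; the file path below is a placeholder.

```python
import numpy as np

# Each point is four float32 values: x, y, z, reflectance (path is a placeholder).
scan = np.fromfile("kitti/velodyne/000000.bin", dtype=np.float32).reshape(-1, 4)
xyz, reflectance = scan[:, :3], scan[:, 3]
print(xyz.shape, reflectance.min(), reflectance.max())
```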
Replica / Habitat Scene Datasets
Photorealistic indoor environments from Facebook AI Research. Used for robot navigation, policy learning, and embodied AI.
Matterport3D
High-resolution indoor 3D scans of 90 building-scale scenes. Essential for navigation, mapping, and scene understanding.
ScanNet
Large-scale indoor scene dataset with 3D reconstructions, semantic segmentation, and camera trajectories. Essential for navigation and scene understanding.
HOPE (Household Object Pose Estimation)
6DoF object pose estimation dataset with 28 household objects. Includes RGB-D images and ground truth poses.
Google Scanned Objects
High-quality 3D scans of real household objects from Google Research, distributed as photorealistic textured meshes.
YCB Object & Model Set
Standard object models used widely across manipulation research. Includes 77 household objects with 3D meshes, textures, and physical properties.
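A minimal sketch of inspecting one YCB mesh with `trimesh`; the directory layout shown follows the common per-object download structure (a `textured.obj` per object) but should be checked against your local copy.

```python
import trimesh

# Path assumes the usual per-object layout of a YCB download; adjust to your copy.
mesh = trimesh.load("ycb/003_cracker_box/google_16k/textured.obj", force="mesh")
print(mesh.vertices.shape)  # vertex count
print(mesh.bounds)          # axis-aligned bounding box, useful for grasp planning
```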
ManiSkill
Large-scale simulated manipulation benchmark with demonstration trajectories, 3D scenes, and pretrained models. Spans ManiSkill 1, 2, and 3.
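A minimal environment-construction sketch, assuming ManiSkill 3's Gymnasium registration; environment IDs, observation modes, and controller names differ across ManiSkill versions, so treat the strings below as version-dependent.

```python
import gymnasium as gym
import mani_skill.envs  # noqa: F401  (importing registers the ManiSkill 3 environments)

# Task id, obs_mode, and control_mode follow ManiSkill 3 naming; other versions differ.
env = gym.make("PickCube-v1", obs_mode="rgbd", control_mode="pd_joint_delta_pos")
obs, info = env.reset(seed=0)
obs, reward, terminated, truncated, info = env.step(env.action_space.sample())
env.close()
```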
RLBench
100+ hand-designed multi-step manipulation tasks with 3D scenes and expert demonstrations for sim-to-real learning.
WidowX Demonstrations
Collection of manipulation demonstrations recorded with the WidowX robot arm.
MimicGen
Data-generation system that synthesizes large manipulation datasets from a small set of human demonstrations, enabling scalable data collection.
Action Chunking with Transformers (ACT)
Manipulation demonstration dataset for training transformer policies with action chunking.
DROID
Large-scale real-robot manipulation dataset with 76,000 demonstration trajectories collected across hundreds of scenes and a wide range of tasks.
RT-1-X
Pre-trained models and datasets for cross-embodiment robot learning, built on Open X-Embodiment.
Language-Table
Large-scale language-conditioned manipulation dataset with 600k+ language-annotated demonstrations.
Bridge Data V2
Large-scale manipulation dataset with 60,000+ demonstrations across diverse tasks and environments.
RoboSet
Real-world multi-task dataset collected in kitchen scenes with kinesthetic and teleop demonstrations (28,500 trajectories).
Open X-Embodiment
Unified open dataset of 1M+ real robot trajectories across 22 robot embodiments for cross-robot learning.
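The trajectories are released as RLDS/TensorFlow Datasets shards, so one component dataset can be read with `tfds.builder_from_directory`; the bucket path below follows the project's published examples and should be verified against the official dataset index.

```python
import tensorflow_datasets as tfds

# Point TFDS at one component dataset of the mixture (verify path and version).
builder = tfds.builder_from_directory("gs://gresearch/robotics/bridge/0.1.0")
ds = builder.as_dataset(split="train[:5]")

for episode in ds:                 # each element is one RLDS episode
    for step in episode["steps"]:  # nested dataset of per-timestep dicts
        obs, action = step["observation"], step["action"]
```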
RoboNet
Large-scale multi-robot learning dataset with over 15 million frames from 7 robot platforms.