Resources
A curated collection of research papers, tools, datasets, and references for exploring embodied artificial intelligence and physical intelligence systems.
Research Papers
Hi Robot: Open-Ended Instruction Following with Hierarchical Vision-Language-Action Models (February 2025)
FAST: Efficient Action Tokenization for Vision-Language-Action Models (January 2025)
OpenVLA: An Open-Source Vision-Language-Action Model (September 2024)
Aligning Cyber Space with Physical World: A Comprehensive Survey on Embodied AI (July 2024)
Learning dexterity from human hand motion in internet videos (April 2024)
Train Offline, Test Online: A Real Robot Learning Benchmark (July 2024)
π0: A Vision-Language-Action Flow Model for General Robot Control (December 2023)
Open X-Embodiment: Robotic Learning Datasets and RT-X Models (October 2023)
Catalyzing next-generation Artificial Intelligence through NeuroAI (March 2023)
A Survey of Embodied AI: From Simulators to Research Tasks (January 2022)
Tools, Data, and Software
Open X-Embodiment Dataset Overview
Open X-Embodiment: Robotic Learning Datasets and RT-X Models
Meta: Large-scale datasets
Genesis: A Generative and Universal Physics Engine for Robotics and Beyond
MuJoCo: Advanced physics simulation
MeTRAbs Absolute 3D Human Pose Estimator
NVIDIA Isaac Lab: a unified and modular framework for robot learning
NVIDIA Isaac GR00T N1: Foundation model for generalized humanoid robot reasoning and skills
NVIDIA Isaac GR00T N1: Open Physical AI Dataset
DexForce: Extracting Force-informed Actions for Dexterous Manipulation
OpenVLA: An Open-Source Vision-Language-Action Model