We present Habitat, a new platform for the development of embodied artificial intelligence (AI). Training robots in the real world is slow, dangerous, expensive, and not easily reproducible. We aim to support a complementary approach – training embodied AI agents (virtual robots) in photorealistic 3D simulation and transferring the learned skills to reality. The ‘software stack’ for training embodied agents involves datasets providing 3D assets, simulators that render these assets and simulate agents, and tasks that define goals and evaluation metrics, thus enabling controlled and reproducible assessment of scientific progress. We aim to standardize this entire stack by contributing specific instantiations at each level: unified support for scanned and modeled 3D scene datasets, a new simulation engine (Habitat-Sim), and a modular API (Habitat-Lab). The Habitat architecture and implementation combine modularity and high performance. For example, when rendering a realistic scanned scene from the Matterport3D dataset, Habitat-Sim achieves several thousand frames per second (fps) running single-threaded and can reach over 10,000 fps multi-process on a single GPU. We also describe the Habitat Challenge, an autonomous navigation challenge that aims to benchmark and accelerate progress in embodied AI.
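The layered stack described above (scene datasets, a simulator, and a task API) bottoms out in an episodic reset/step loop between an agent and its environment. The toy sketch below illustrates that interface for a point-goal navigation task in 2D; every name in it (`ToyPointNavEnv`, `Observation`, `greedy_agent`) is a hypothetical stand-in for illustration, not Habitat-Lab's actual API, and no rendering or 3D assets are involved.

```python
# Illustrative sketch of an episodic agent-environment interface, the kind of
# loop a modular embodied-AI API exposes. Hypothetical names; not Habitat-Lab.
import math
import random
from dataclasses import dataclass

@dataclass
class Observation:
    position: tuple      # agent (x, y) in the scene
    goal_vector: tuple   # vector from agent to the navigation goal

class ToyPointNavEnv:
    """A minimal 2D stand-in for a point-goal navigation task."""
    SUCCESS_RADIUS = 0.2   # episode succeeds if the agent stops this close
    STEP_SIZE = 0.25       # distance covered by one move action

    def __init__(self, seed=0):
        self.rng = random.Random(seed)

    def reset(self):
        self.agent = (0.0, 0.0)
        self.goal = (self.rng.uniform(-2, 2), self.rng.uniform(-2, 2))
        self.done = False
        return self._observe()

    def step(self, action):
        # action: a unit direction (dx, dy), or "STOP" to end the episode
        if action == "STOP":
            self.done = True
        else:
            dx, dy = action
            self.agent = (self.agent[0] + self.STEP_SIZE * dx,
                          self.agent[1] + self.STEP_SIZE * dy)
        return self._observe()

    def _observe(self):
        gx = self.goal[0] - self.agent[0]
        gy = self.goal[1] - self.agent[1]
        return Observation(self.agent, (gx, gy))

    def success(self):
        gx, gy = self._observe().goal_vector
        return math.hypot(gx, gy) < self.SUCCESS_RADius if False else math.hypot(gx, gy) < self.SUCCESS_RADIUS

def greedy_agent(obs):
    # Walk straight toward the goal; stop once within the success radius.
    gx, gy = obs.goal_vector
    dist = math.hypot(gx, gy)
    if dist < ToyPointNavEnv.SUCCESS_RADIUS:
        return "STOP"
    return (gx / dist, gy / dist)

env = ToyPointNavEnv(seed=1)
obs = env.reset()
for _ in range(100):
    obs = env.step(greedy_agent(obs))
    if env.done:
        break
print("success:", env.success())
```

The task object owns the goal and the success criterion while the agent sees only observations, mirroring the separation between tasks and simulators that the stack standardizes.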