Seamlessly integrate state-of-the-art transformer models into robotics stacks
A Production Tool for Embodied AI
This repo is the official implementation of "MineDreamer: Learning to Follow Instructions via Chain-of-Imagination for Simulated-World Control"
Official Repo of LangSuitE
[GenRL] Multimodal foundation world models ground language and video prompts in embodied domains by turning them into sequences of latent world-model states. These latent sequences can then be decoded with the model's decoder to visualize the expected behavior before the agent is trained to execute it.
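A minimal sketch of the data flow the GenRL description outlines: a prompt embedding is mapped into a sequence of latent world-model states, which are then decoded for visualization. Everything here (`ToyWorldModel`, the linear maps, the dimensions) is a hypothetical stand-in, not the GenRL codebase's actual API.

```python
import numpy as np

class ToyWorldModel:
    """Illustrative stand-in: linear maps play the roles of the grounding
    module, latent dynamics, and decoder described above."""
    def __init__(self, prompt_dim=8, latent_dim=4, obs_dim=6, horizon=5, seed=0):
        rng = np.random.default_rng(seed)
        self.to_latent = rng.normal(size=(prompt_dim, latent_dim))
        self.dynamics = rng.normal(size=(latent_dim, latent_dim)) * 0.5
        self.decoder = rng.normal(size=(latent_dim, obs_dim))
        self.horizon = horizon

    def ground(self, prompt_embedding):
        """Turn a prompt embedding into a sequence of latent states."""
        z = prompt_embedding @ self.to_latent
        states = [z]
        for _ in range(self.horizon - 1):
            z = np.tanh(z @ self.dynamics)   # roll the latent dynamics forward
            states.append(z)
        return np.stack(states)

    def decode(self, latent_states):
        """Decode latent states into observations for visualization."""
        return latent_states @ self.decoder

wm = ToyWorldModel()
prompt = np.ones(8)              # stands in for a language/video prompt embedding
latents = wm.ground(prompt)      # shape (horizon, latent_dim) = (5, 4)
frames = wm.decode(latents)      # shape (horizon, obs_dim) = (5, 6)
print(latents.shape, frames.shape)
```

The decoded `frames` correspond to the "expected behavior" one could inspect before training an agent to follow the latent sequence.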
Official implementation of the EMNLP 2023 paper "R2H: Building Multimodal Navigation Helpers that Respond to Help Requests"
Official PyTorch Implementation of Genesis: Embodiment Co-Design via Efficient Message and Reward Delivery
An open source framework for research in Embodied-AI from AI2.
Python code implementing LLM4Teach, a policy distillation approach for teaching reinforcement learning agents with a Large Language Model
Official implementation of the NAACL 2024 paper "Navigation as Attackers Wish? Towards Building Robust Embodied Agents under Federated Learning"
Democratization of RT-2, from "RT-2: New model translates vision and language into action"
Code for ORAR Agent for Vision and Language Navigation on Touchdown and map2seq
Official Implementation of NeurIPS'23 Paper "Cross-Episodic Curriculum for Transformer Agents"
[arXiv 2023] Embodied Task Planning with Large Language Models
Tracking an embodied AI agent to estimate movement from observations