Official implementation of MindForge, presented at NeurIPS 2025. [Paper](https://arxiv.org/abs/2411.12977)
Embodied agents powered by large language models (LLMs), such as Voyager, promise open-ended competence in worlds like Minecraft. However, when powered by open-weight LLMs, they still falter on elementary tasks, even after domain-specific fine-tuning. We propose MindForge, a generative-agent framework for cultural lifelong learning through explicit perspective taking. We introduce three key innovations: (1) a structured theory of mind representation linking percepts, beliefs, desires, and actions; (2) natural inter-agent communication; and (3) a multi-component memory system. Following the cultural learning framework, we test MindForge in both instructive and collaborative settings within Minecraft. In an instructive setting with GPT-4, MindForge agents powered by open-weight LLMs significantly outperform their Voyager counterparts on basic tasks, yielding 3× more tech-tree milestones and collecting 2.3× more unique items than the Voyager baseline. Furthermore, in fully collaborative settings, we find that the performance of two underachieving agents improves with more communication rounds, echoing the Condorcet Jury Theorem. MindForge agents demonstrate sophisticated behaviors, including expert-novice knowledge transfer, collaborative problem solving, and adaptation to out-of-distribution tasks through accumulated cultural experiences.
MindForge follows a setup similar to Voyager's and requires Python ≥ 3.10 and Node.js ≥ 16.13.0. Follow the instructions below to install MindForge.
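Before starting, you can verify that your local toolchain meets these requirements (a generic sanity check, not part of the official setup):

python --version   # should report 3.10 or newer
node --version     # should report v16.13.0 or newer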
git clone https://github.com/tapri-lab/mindforge
cd mindforge
pip install -e .
In addition to the Python dependencies, you need to install the following Node.js packages:
cd mindforge/env/mineflayer
npm install -g npx
npm install
cd mineflayer-collectblock
npx tsc
cd ..
npm install
MindForge depends on the Minecraft game. You need to install Minecraft and set up a Minecraft instance.
Please refer to the detailed instructions in the Voyager repository.
You need to install Fabric mods to support all the features in Voyager. Remember to use the correct Fabric version of all the mods.
Follow the instructions in Fabric Mods Install to install the mods.
mindforge/
├── mindforge/ # Main package source code
│ ├── agents/ # Agent logic (LLM, memory, skills)
│ ├── env/ # Minecraft environment & bridge
│ ├── configs/ # Configuration files (YAML)
│ ├── control_primitives/ # Control primitives
│ └── prompts/ # LLM prompts
├── plots/ # Analysis notebooks
├── scripts/ # Entry point scripts
├── setup.py # Package installation setup
└── README.md # Project documentation
The repository provides scripts for running MindForge agents in both isolation (single agent) and collaboration (multi-agent).
Both scripts require you to specify the models used by the agents, together with the API keys for the corresponding model providers.
OPENAI_KEY="ADD_OPENAI_KEY"
TOGETHER_API_KEY="ADD_TOGETHER_KEY"
MISTRAL_API_KEY="ADD_MISTRAL_KEY"
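If your setup reads these keys from the environment (an assumption; they may instead belong in a .env file or the YAML configs), you can export them in your shell before launching the scripts:

export OPENAI_KEY="ADD_OPENAI_KEY"          # placeholder values as above; variable names taken from this README
export TOGETHER_API_KEY="ADD_TOGETHER_KEY"
export MISTRAL_API_KEY="ADD_MISTRAL_KEY"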
For the multi-agent settings, you need to start the communication server first:
node mindforge/env/mineflayer/communication.js

The following script runs a single MindForge agent in a lifelong learning setting.
python mindforge/scripts/mindforge_single_agent.py \
    --config_path mindforge/configs/individual.yaml \
    --learning  # Optional: Enable learning mode

The following script runs two MindForge agents in a collaborative lifelong learning setting.
python mindforge/scripts/mindforge_multi_agent.py \
    --config_path mindforge/configs/mindforge.yaml \
    --learning  # Optional: Enable learning mode

The following script runs two MindForge agents in an instructive lifelong learning setting.
# --learning and --instructive are optional; they enable learning mode and instructive mode, respectively.
python mindforge/scripts/mindforge_multi_agent.py \
    --config_path mindforge/configs/mindforge.yaml \
    --learning \
    --instructive

The plots/ directory contains the Jupyter notebooks used to generate the figures in the paper.
@inproceedings{mindforge2025,
  title={MindForge: Empowering Embodied Agents with Theory of Mind for Lifelong Cultural Learning},
  author={Lică, Mircea and Colle, Baptiste and Shirekar, Ojas and Raman, Chirag},
  booktitle={Advances in Neural Information Processing Systems (NeurIPS)},
  year={2025},
  url={https://arxiv.org/abs/2411.12977}
}

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.

