A HopperHacks 2026 project integrating LLM agents with Minecraft, using the Odyssey framework as its upstream agent core.
```
minecraft-agents/
├── Odyssey/        # Upstream Odyssey framework (LLM agent core)
├── docker/         # (planned) Docker setup for parallel agents
├── custom_skills/  # (planned) Custom skill extensions
├── ui/             # (planned) Monitoring dashboard
├── server-setup/   # Server infrastructure (venv, data, mods, scripts)
└── README.md
```
`docker/` - Enables spinning up multiple isolated Minecraft servers with corresponding agent containers in parallel, each with independent resources.
`custom_skills/` - Houses extensions to Odyssey's skill library, keeping new primitive and compositional skills separate from upstream code.
`ui/` - Provides monitoring of agent state, task progression, and logging across multiple concurrent agents.
- Python >= 3.9
- Node.js >= 16.13.0
- Running Minecraft server instance
- Running LLaMA-3 backend
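The version floors above can be sanity-checked before installing anything. A minimal sketch; the `meets_minimum` helper is ours, not part of Odyssey:

```python
import re
import sys

def meets_minimum(version: str, minimum: tuple) -> bool:
    """True if a dotted version string (e.g. "v16.13.0") is >= the minimum tuple."""
    parts = tuple(int(p) for p in re.findall(r"\d+", version)[:3])
    # Pad with zeros so "16.13" compares cleanly against (16, 13, 0).
    parts += (0,) * max(0, len(minimum) - len(parts))
    return parts >= minimum

# The interpreter running this script, checked against the Python >= 3.9 floor.
print(meets_minimum(".".join(map(str, sys.version_info[:3])), (3, 9)))

# `node --version` prints e.g. "v16.13.0"; the leading "v" is ignored by the regex.
print(meets_minimum("v16.13.0", (16, 13, 0)))  # True
```

Feeding it the output of `node --version` (via `subprocess.run`) covers the Node.js floor the same way.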
- Clone the repository and navigate into the directory.
- Create a Python virtual environment:

  ```bash
  python -m venv server-setup/venv
  source server-setup/venv/bin/activate
  ```

- Install Python packages:

  ```bash
  cd Odyssey/Odyssey
  pip install -e .
  pip install -r requirements.txt
  ```

- Install Node dependencies:

  ```bash
  npm install -g yarn
  cd Odyssey/Odyssey/odyssey/env/mineflayer
  yarn install
  cd mineflayer-collectblock
  npx tsc
  cd ../node_modules/mineflayer-collectblock
  npx tsc
  ```

- Configure settings: copy the config template and populate it.

  ```bash
  cp Odyssey/Odyssey/conf/config.json.keep.this Odyssey/Odyssey/conf/config.json
  ```

  Populate it with your LLM API credentials, server addresses, and embedding model path. Note that `config.json` is gitignored because it contains secrets; never commit it.

- Download the embedding model (requires git-lfs):

  ```bash
  git lfs install
  git clone https://huggingface.co/sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2.git server-setup/embeddings
  ```
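A common failure mode in the last step is cloning without git-lfs installed, which leaves tiny pointer stubs in place of the actual model weights. A quick detection sketch; the helper names and scan logic are ours, not part of Odyssey (the pointer prefix itself is the Git LFS spec's standard first line):

```python
from pathlib import Path

# Every Git LFS pointer stub begins with this spec line.
LFS_POINTER_PREFIX = b"version https://git-lfs.github.com/spec/v1"

def is_lfs_pointer(path: Path) -> bool:
    """True if the file is an unfetched LFS pointer rather than real content."""
    try:
        with path.open("rb") as f:
            return f.read(len(LFS_POINTER_PREFIX)) == LFS_POINTER_PREFIX
    except OSError:
        return False

def unfetched_files(model_dir: str) -> list:
    """List files under model_dir that are still LFS pointer stubs."""
    return [p for p in Path(model_dir).rglob("*")
            if p.is_file() and is_lfs_pointer(p)]
```

If `unfetched_files("server-setup/embeddings")` returns anything, run `git lfs install` and re-clone the model.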
Terminal 1 — Start the Minecraft server (via docker-compose in `server-setup/`):

```bash
cd server-setup
docker compose up
```

Terminal 2 — Start the LLM backend (once the server is running):

```bash
~/MINECRAFTEXPERIMENTS/AgenticMinecraft/server-setup/start_llm_backend.sh
```

Terminal 3 — Start the Odyssey agent (once the backend is running):

```bash
~/MINECRAFTEXPERIMENTS/AgenticMinecraft/server-setup/start_odyssey.sh
```

Key `config.json` fields:

- `api_key`: LLM authentication
- `server_host` / `server_port`: LLM backend address
- `MC_SERVER_HOST` / `MC_SERVER_PORT`: Minecraft server connection
- `SENTENT_EMBEDDING_DIR`: local path to the embedding model
- `NODE_SERVER_PORT`: Node.js service port
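Since a malformed `config.json` typically only surfaces when the agent starts, a small pre-flight check can save a restart loop. A sketch using the field names listed above; the helper itself is ours, and your template's key casing may differ, so adjust `REQUIRED_KEYS` to match your file:

```python
import json

# Field names from the reference above; adjust to match your template.
REQUIRED_KEYS = [
    "api_key",
    "server_host", "server_port",
    "MC_SERVER_HOST", "MC_SERVER_PORT",
    "SENTENT_EMBEDDING_DIR",
    "NODE_SERVER_PORT",
]

def missing_keys(path: str) -> list:
    """Return required fields that are absent from (or empty in) a config file."""
    with open(path) as f:
        cfg = json.load(f)
    return [k for k in REQUIRED_KEYS if not cfg.get(k)]
```

An empty return means every listed field is present and non-empty; anything else names the fields still to fill in.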
MIT License