⚠️ Sunsetting Notice: OpenMemory is being sunset. For local self-hosted memory with a dashboard, please use the Mem0 self-hosted server instead. Get started with `cd server && make bootstrap`. See the self-hosted docs for configuration details.
OpenMemory is your personal memory layer for LLMs - private, portable, and open-source. Your memories live locally, giving you complete control over your data. Build AI applications with personalized memories while keeping your data secure.
- Docker
- OpenAI API Key
You can quickly run OpenMemory with the following command:

```bash
curl -sL https://raw.githubusercontent.com/mem0ai/mem0/main/openmemory/run.sh | bash
```

You should set `OPENAI_API_KEY` as a global environment variable:

```bash
export OPENAI_API_KEY=your_api_key
```

You can also pass `OPENAI_API_KEY` as a parameter to the script:

```bash
curl -sL https://raw.githubusercontent.com/mem0ai/mem0/main/openmemory/run.sh | OPENAI_API_KEY=your_api_key bash
```

- Docker and Docker Compose
- Python 3.9+ (for backend development)
- Node.js (for frontend development)
- OpenAI API Key (required for LLM interactions; run `cp api/.env.example api/.env`, then set `OPENAI_API_KEY` to your own key)
Before running the project, you need to configure environment variables for both the API and the UI.
You can do this in one of the following ways:

1. **Manually**: Create a `.env` file in each of the following directories:
   - `/api/.env`
   - `/ui/.env`

2. **Using the `.env.example` files**: Copy and rename the example files:

   ```bash
   cp api/.env.example api/.env
   cp ui/.env.example ui/.env
   ```

3. **Using the Makefile (if supported)**:

   ```bash
   make env
   ```
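Whichever method you use, you can sanity-check that a required variable actually made it into the file. A minimal shell sketch (it creates a throwaway file for demonstration, deliberately leaving `USER` unset; in practice point `ENV_FILE` at `api/.env`):

```bash
# Check that required variables are present in an env file.
# Demo only: a temp file stands in for api/.env, and USER is
# intentionally missing so the "missing" branch is shown.
ENV_FILE="$(mktemp)"
printf 'OPENAI_API_KEY=sk-xxx\n' > "$ENV_FILE"

for var in OPENAI_API_KEY USER; do
  if grep -q "^${var}=" "$ENV_FILE"; then
    echo "ok: $var"
  else
    echo "missing: $var"
  fi
done

rm -f "$ENV_FILE"
```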
```bash
OPENAI_API_KEY=sk-xxx
USER=<user-id> # The user ID you want to associate the memories with
```

By default, OpenMemory uses OpenAI (`gpt-4o-mini`) for both the LLM and the embedder. You can configure a different provider using these environment variables in `/api/.env`:
| Variable | Description | Default |
|---|---|---|
| `LLM_PROVIDER` | LLM provider (`openai`, `ollama`, `anthropic`, `groq`, `together`, `deepseek`, etc.) | `openai` |
| `LLM_MODEL` | Model name for the LLM provider | `gpt-4o-mini` (OpenAI) / `llama3.1:latest` (Ollama) |
| `LLM_API_KEY` | API key for the LLM provider | `OPENAI_API_KEY` env var |
| `LLM_BASE_URL` | Custom base URL for the LLM API | Provider default |
| `OLLAMA_BASE_URL` | Ollama-specific base URL (takes precedence over `LLM_BASE_URL` for Ollama) | `http://localhost:11434` |
| `EMBEDDER_PROVIDER` | Embedder provider (defaults to `ollama` when the LLM is Ollama, otherwise `openai`) | `openai` |
| `EMBEDDER_MODEL` | Model name for the embedder | `text-embedding-3-small` (OpenAI) / `nomic-embed-text` (Ollama) |
| `EMBEDDER_API_KEY` | API key for the embedder provider | `OPENAI_API_KEY` env var |
| `EMBEDDER_BASE_URL` | Custom base URL for the embedder API | Provider default |
**Example: Using Ollama (fully local)**

```bash
LLM_PROVIDER=ollama
LLM_MODEL=llama3.1:latest
EMBEDDER_PROVIDER=ollama
EMBEDDER_MODEL=nomic-embed-text
OLLAMA_BASE_URL=http://localhost:11434
```

**Example: Using Anthropic**

```bash
LLM_PROVIDER=anthropic
LLM_MODEL=claude-sonnet-4-20250514
LLM_API_KEY=sk-ant-xxx
```

For the UI, set the following in `/ui/.env`:

```bash
NEXT_PUBLIC_API_URL=http://localhost:8765
NEXT_PUBLIC_USER_ID=<user-id> # Same as the user ID set in the API environment variables
```

You can run the project using the following two commands:
```bash
make build # builds the MCP server and UI
make up    # runs the OpenMemory MCP server and UI
```

After running these commands, you will have:

- OpenMemory MCP server running at: http://localhost:8765 (API documentation available at http://localhost:8765/docs)
- OpenMemory UI running at: http://localhost:3000

If the UI does not start properly on http://localhost:3000, try running it manually:

```bash
cd ui
pnpm install
pnpm dev
```

Use the following one-step command to configure OpenMemory Local MCP for a client. The general command format is as follows:
```bash
npx @openmemory/install local http://localhost:8765/mcp/<client-name>/sse/<user-id> --client <client-name>
```

Replace `<client-name>` with the desired client name and `<user-id>` with the value specified in your environment variables.
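As a concrete illustration, with a hypothetical client name `claude` and user id `john`, the substituted command looks like this. The sketch only prints the command rather than executing it:

```bash
# Fill the template with example values; "claude" and "john" are
# hypothetical placeholders, not defaults.
CLIENT_NAME="claude"
USER_ID="john"
echo "npx @openmemory/install local http://localhost:8765/mcp/${CLIENT_NAME}/sse/${USER_ID} --client ${CLIENT_NAME}"
```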
- `api/` - Backend APIs + MCP server
- `ui/` - Frontend React application
We are a team of developers passionate about the future of AI and open-source software. With years of experience in both fields, we believe in the power of community-driven development and are excited to build tools that make AI more accessible and personalized.
We welcome all forms of contributions:
- Bug reports and feature requests
- Documentation improvements
- Code contributions
- Testing and feedback
- Community support
How to contribute:
1. Fork the repository
2. Create your feature branch (`git checkout -b openmemory/feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin openmemory/feature/amazing-feature`)
5. Open a Pull Request
Join us in building the future of AI memory management! Your contributions help make OpenMemory better for everyone.
