Installation¶
Detailed installation instructions for Lerim, including prerequisites, Python setup, Docker configuration, and troubleshooting.
Prerequisites¶
Before you begin, make sure you have:
- Python 3.10 or higher
- Docker installed (get Docker) — recommended for the always-on service
- An LLM API key — you only need a key for the provider(s) in your [roles.*] config (e.g. OPENCODE_API_KEY for OpenCode Go defaults, or MiniMax / Z.AI / OpenRouter / OpenAI / Anthropic as configured)
Docker is optional
If you don't have Docker, you can run Lerim directly using lerim serve instead of lerim up. See Running without Docker below.
Install Lerim¶
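The install command itself isn't shown on this page; a sketch, assuming Lerim is distributed on PyPI under the package name lerim (an assumption — check the project's release notes for the real channel):

```shell
# Assumed: Lerim ships on PyPI as "lerim" (hypothetical package name)
pip install lerim
```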
Verify the installation:
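A version check is the usual way to verify; the exact flag is an assumption, since this page doesn't show it:

```shell
# Assumed flag: most CLIs expose --version; run `lerim --help` if this differs
lerim --version
```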
Set up API keys¶
Lerim needs an LLM provider for extraction and querying. Set at least one:
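For example, exporting the OpenCode Go key named in the prerequisites (the key value below is an illustrative placeholder):

```shell
# Replace the placeholder with your real key before running lerim
export OPENCODE_API_KEY="sk-example-key"
# Confirm the variable is visible in the current shell
echo "$OPENCODE_API_KEY"
```

Add the export to your shell profile (e.g. ~/.zshrc) so it persists across sessions.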
Note
You only need API keys for the providers you configure. Match keys to [roles.*] (see shipped src/lerim/config/default.toml). See model roles.
First-time setup¶
Run the interactive setup wizard:
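Per the Fresh start section below, lerim init is the command that initializes (or reinitializes) the config:

```shell
lerim init
```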
This will:
- Detect your installed coding agents (Claude Code, Codex, Cursor, OpenCode)
- Ask which agents you want to connect
- Write the config to ~/.lerim/config.toml
- Check for Docker availability
Then register your projects:
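The registration subcommand isn't shown on this page; a hypothetical sketch, assuming a lerim add subcommand that registers the current directory (run lerim --help for the real command):

```shell
# Hypothetical subcommand name: "add" is an assumption, not documented here
cd ~/code/my-project
lerim add .
```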
Start Lerim¶
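The start command, referenced throughout Troubleshooting below, is lerim up:

```shell
lerim up
```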
This starts a Docker container with the daemon + JSON API on http://localhost:8765 (web UI: Lerim Cloud).
Running without Docker¶
If you prefer not to use Docker, run Lerim directly:
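As noted in the prerequisites, the non-Docker entry point is lerim serve:

```shell
lerim serve
```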
Then use lerim ask, lerim sync, lerim status, etc. as usual — they connect to the running server.
Local models (Ollama)¶
To use local models instead of cloud APIs:
- Install Ollama: ollama.com
- Pull a model: ollama pull qwen3.5:9b-q8_0
- Make sure Ollama is running: ollama serve (or the macOS background service)
- Configure Lerim roles to use Ollama:
# ~/.lerim/config.toml
[roles.lead]
provider = "ollama"
model = "qwen3.5:9b-q8_0"
[roles.extract]
provider = "ollama"
model = "qwen3.5:9b-q8_0"
Lerim automatically loads models into RAM before each sync/maintain cycle and unloads them immediately after, so the model only uses memory during active processing. No API keys required.
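You can confirm the Ollama server is reachable before starting a sync; /api/tags is Ollama's standard model-listing endpoint on its default port 11434:

```shell
# Lists locally pulled models; fails fast if the server isn't running
curl http://localhost:11434/api/tags
```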
If running Lerim in Docker with Ollama on the host:
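Inside a container, localhost refers to the container itself, so the Ollama provider must point at the host instead. A sketch, assuming the role config accepts a base_url key (the key name is an assumption; host.docker.internal resolves to the host machine on Docker Desktop):

```toml
# ~/.lerim/config.toml — "base_url" is an assumed key name; check the
# shipped default.toml for the real one
[roles.lead]
provider = "ollama"
model = "qwen3.5:9b-q8_0"
base_url = "http://host.docker.internal:11434"
```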
Troubleshooting¶
Docker not found¶
If lerim up reports Docker is not found:
# Check Docker installation
docker --version
# On macOS, make sure Docker Desktop is running
open -a Docker
API key errors¶
If sync or ask commands fail with authentication errors, confirm the env var for
your configured provider (e.g. echo $OPENCODE_API_KEY for OpenCode Go) and
re-export it, or switch [roles.*] to a provider you have keys for.
Port already in use¶
If port 8765 is occupied:
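Either free the port or move Lerim to another one (a port-override flag, if any, isn't documented here). To find the process holding the port on macOS/Linux:

```shell
# Show which process is listening on 8765
lsof -i :8765
```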
Fresh start¶
If you need to reset everything:
# Reinitialize config (preserves memories)
lerim init
# Or wipe all data and start over
lerim memory reset --scope both --yes
lerim down
lerim up
Warning
lerim memory reset permanently deletes all memories, workspace data, and session indexes. This cannot be undone.
Next steps¶
- Quickstart: complete the 5-minute quickstart guide
- Configuration: customize model providers, tracing, and more