# Getting Started
Get Lerim running in under 2 minutes.
## Prerequisites
- Python 3.10+
- Deno (required by the DSPy extraction pipeline)
- An LLM API key (OpenRouter, OpenAI, Anthropic, or Z.ai)
## Install
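The install command itself is missing from this page. Assuming Lerim is published on PyPI under the package name `lerim` (an assumption, not confirmed by this page), a standard install would look like:

```shell
# Hypothetical: assumes a PyPI package named "lerim"
pip install lerim
```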
For development installs:
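The development-install command is also missing. A typical editable install from a source checkout might be (the `dev` extra is an assumption):

```shell
# Hypothetical sketch: run from a cloned checkout of the Lerim repo;
# the "[dev]" extra is an assumption about how optional deps are named
pip install -e ".[dev]"
```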
## Set up API keys
Lerim needs an LLM provider for extraction and chat. Set at least one:
```bash
export OPENROUTER_API_KEY="sk-or-..."  # default provider
# or
export OPENAI_API_KEY="sk-..."
# or
export ZAI_API_KEY="..."
```
## Connect your agent platforms
Auto-detect and connect all supported platforms:
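The command was lost in extraction; a plausible shape, with the subcommand name and flag as assumptions:

```shell
# Hypothetical subcommand; check `lerim --help` for the real invocation
lerim connect --all
```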
Or connect specific platforms:
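Again as a sketch, assuming platforms are named as positional arguments (the identifiers below are guesses based on the platform list later on this page):

```shell
# Hypothetical: platform identifiers are assumptions
lerim connect claude-code cursor
```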
Check what's connected:
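The status command is also missing; a hedged guess at its shape:

```shell
# Hypothetical status subcommand
lerim status
```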
## Start the learning loop
Run the daemon for continuous sync + maintain:
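The daemon invocation was dropped from this page; assuming a `daemon` subcommand exists:

```shell
# Hypothetical daemon invocation (runs continuous sync + maintain)
lerim daemon
```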
Or run one-shot commands:
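Based on the "sync + maintain" wording above, the one-shot equivalents are presumably separate subcommands; the exact names are assumptions:

```shell
# Hypothetical one-shot equivalents of the daemon's sync + maintain cycle
lerim sync
lerim maintain
```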
## Query your memories
```bash
lerim chat "What auth pattern do we use?"
lerim memory search "database migration"
lerim memory list
```
## Teach your agent about Lerim
Install the Lerim skill so your coding agent knows how to query past context:
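The skill-install command is missing here; a sketch, with the subcommand name as an assumption:

```shell
# Hypothetical: "skill install" subcommand name is an assumption
lerim skill install
```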
This works with Claude Code, Codex, Cursor, Copilot, Cline, Windsurf, OpenCode, and other agents that support skills.
At the start of a session, tell your agent:
```text
Check lerim for any relevant memories about [topic you're working on].
```
Your agent will run `lerim chat` or `lerim memory search` to pull in past decisions and learnings.
## Next steps
- CLI Reference — full command documentation
- Configuration — TOML config, model roles, tracing
- Connecting Agents — supported platforms and custom paths
- Memory Model — how memories are stored and structured
- Dashboard — local web UI for browsing sessions and memories