Google's terminal AI with a memory it can finally keep
Because every new Gemini session shouldn't feel like a first date
The Problem
Gemini CLI is fast. It's smart. It understands your code in ways that make you feel slightly inadequate.
And then you close your terminal.
The next session, Gemini greets you like you've never met. Your project structure? Unknown. Your conventions? A mystery. That three-hour debugging session where you finally tracked down the race condition? Ancient history that nobody recorded.
Google gave Gemini a 1M token context window. Impressive. But context windows don't survive session restarts. Your AI has the attention span of a goldfish with a very, very large short-term memory.
How Stompy Helps
Stompy gives Gemini CLI persistent memory that survives every session restart — via three integration paths that all hit the same memory store.
MCP: Native tool integration via Gemini's settings.json. Gemini calls lock_context and recall_context as naturally as it reads your files.
CLI: Shell commands — stompy lock, stompy recall, stompy search. Works from any terminal, any script, any CI/CD pipeline.
REST API: HTTP endpoints at /api/v1/agent/memory/*. For custom scripts, webhooks, or anything that speaks HTTP.
Three paths, one brain. Context locked via MCP is recallable via CLI or API and vice versa. Switch tools, switch sessions, switch machines — your project memory follows.
Integration Walkthrough
Path 1: Connect via MCP (native integration)
Add Stompy to Gemini CLI's MCP configuration for automatic tool access.
```jsonc
// ~/.gemini/settings.json
{
  "mcpServers": {
    "stompy": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://mcp.stompy.ai/sse"],
      "env": {
        "AUTHORIZATION": "Bearer YOUR_STOMPY_TOKEN"
      }
    }
  }
}
```
Gemini uses Stompy tools automatically
Once configured, Gemini calls lock_context and recall_context as native MCP tools.
```bash
# Session 1: Gemini learns your architecture
gemini "explain our authentication flow"
# Gemini saves: lock_context(topic="auth_flow", content="JWT with RS256...")

# Session 47: Gemini remembers
gemini "add rate limiting to the auth endpoint"
# Gemini recalls auth_flow automatically — no re-explaining needed
```
Path 2: Use Stompy CLI directly
Shell commands that work from any terminal or script. Same memory store as MCP.
```bash
# Lock a memory from your terminal
stompy lock --topic "api_conventions" \
  --content "REST endpoints use /api/v1/ prefix, snake_case, 201 for creates"

# Recall in any session
stompy recall --topic "api_conventions"

# Semantic search across all memories
stompy search "how do we handle authentication"
```
Path 3: Use the REST API
HTTP endpoints for custom integrations, CI/CD, or any tool that speaks HTTP.
```bash
# Store a memory via REST API
curl -X POST https://api.stompy.ai/api/v1/agent/memory/store \
  -H "Authorization: Bearer $STOMPY_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"topic": "deploy_config", "content": "Production uses 3 replicas..."}'

# Recall a memory
curl -X POST https://api.stompy.ai/api/v1/agent/memory/recall \
  -H "Authorization: Bearer $STOMPY_TOKEN" \
  -d '{"query": "deployment configuration"}'
```
Cross-agent workflow: Gemini + Claude
Memories saved by one AI are instantly available to another. Same project, same brain.
```bash
# Morning: Gemini designs the API schema
gemini "design the user settings API"
# → lock_context(topic="settings_api", content="GET/PUT /settings...")

# Afternoon: Claude implements it — with full context
claude "implement the settings API we designed this morning"
# → recall_context(topic="settings_api") — gets Gemini's design
```
What You Get
- Three integration paths: MCP, CLI, and REST API — all hitting the same memory store
- Cross-agent memory: context saved by Gemini is recalled by Claude, Codex, or any tool
- Semantic search finds relevant context even when you don't remember the exact topic
- Session handovers survive terminal restarts, machine switches, and tool changes
- Project isolation keeps separate codebases completely independent
Ready to give Gemini CLI a memory?
Join the waitlist and be the first to know when Stompy is ready. Your Gemini CLI projects will never forget again.