mememory module v0.5.0
Published: Apr 9, 2026 License: MIT


MEMEMORY

Persistent semantic memory for AI agents

Store, search, and deliver knowledge across sessions. MCP server with PostgreSQL + pgvector. All data stays local.


Documentation · Quick Start · MCP Tools · Releases


What it does

  • Stores memories with scope, type, weight, tags, and TTL
  • Searches by semantic similarity with hierarchical scope inheritance
  • Delivers accumulated rules and knowledge to agents at session start
  • Detects contradictions when new memories conflict with existing ones
  • Evolves beliefs through supersede/weight mechanisms without losing history

Requirements

  • PostgreSQL >= 14 with the pgvector extension
  • (Optional) Docker, if you want the bundled quick-start stack

Install

# CLI (mememory)
go install github.com/scott-walker/mememory/cmd/mememory@latest

# MCP server binary — only needed for BYO Postgres mode;
# bundled Docker mode runs it inside the container
go install github.com/scott-walker/mememory/cmd/mememory-server@latest

Or grab pre-built binaries from Releases.

Quick Start (bundled Docker stack)

mememory setup

mememory setup extracts the bundled Docker Compose file, resolves an OS-standard data directory, and brings up the Docker stack (Postgres with pgvector + Ollama + Admin UI). No source checkout required.

Quick Start (BYO Postgres)

If you already have a PostgreSQL >= 14 server with pgvector, point DATABASE_URL at it. There is no fallback — the server fails fast if DATABASE_URL is unset.

export DATABASE_URL=postgres://user:pass@your-host:5432/mememory?sslmode=disable
mememory-server

The server runs CREATE EXTENSION IF NOT EXISTS vector at startup. If your DB user lacks CREATE privilege, ask your DBA to install pgvector beforehand.
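If the extension has to be installed manually, a minimal sketch using psql (this assumes the pgvector package is already installed on the database host and that you connect as a role with sufficient privileges):

```shell
# Install the extension as a privileged role, then verify it is present.
psql "$DATABASE_URL" -c 'CREATE EXTENSION IF NOT EXISTS vector;'
psql "$DATABASE_URL" -c "SELECT extversion FROM pg_extension WHERE extname = 'vector';"
```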

Add to the Claude Code config (~/.claude/.mcp.json, under the mcpServers key):

{
  "mememory": {
    "type": "stdio",
    "command": "docker",
    "args": ["exec", "-i", "mememory", "server"],
    "env": {}
  }
}
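If you run BYO Postgres mode instead, a sketch of the corresponding entry (assuming mememory-server is on your PATH; only the Docker-based entry above is the documented one):

```json
{
  "mememory": {
    "type": "stdio",
    "command": "mememory-server",
    "args": [],
    "env": {
      "DATABASE_URL": "postgres://user:pass@your-host:5432/mememory?sslmode=disable"
    }
  }
}
```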

Admin UI at http://localhost:4200.

Architecture

Agent ──stdio──▶ server (Go, MCP)
                      │
              ┌───────┴───────┐
              ▼               ▼
         PostgreSQL       Ollama / OpenAI
        (pgvector)       (embeddings)

Bring your own Postgres, or run mememory setup for the bundled Docker quick-start.

Where your data lives

If DATA_DIR is not set, the mememory CLI auto-resolves it to an OS-standard path:

Platform Default location
Linux ~/.local/share/mememory (or $XDG_DATA_HOME/mememory)
macOS ~/Library/Application Support/mememory
Windows %LOCALAPPDATA%\mememory

Inside that directory you get postgres/ (database files) and ollama/ (embedding model). Override with DATA_DIR=/custom/path if you want.
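On Linux, the resolution order can be sketched in shell (an approximation of the CLI's logic, not the implementation itself):

```shell
# DATA_DIR wins; otherwise XDG_DATA_HOME; otherwise ~/.local/share
data_dir="${DATA_DIR:-${XDG_DATA_HOME:-$HOME/.local/share}/mememory}"
echo "$data_dir"
```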

Backup

Two options (note they cover different things: the data-dir copy also captures the Ollama model, the dump captures only the database):

# Stop the stack and copy the data dir
mememory uninstall
cp -a "$DATA_DIR" "$DATA_DIR.backup-$(date +%F)"
mememory setup

# Or take a logical dump against any Postgres
pg_dump "$DATABASE_URL" > mememory-$(date +%F).sql
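A restore sketch mirroring the two options above (filenames are placeholders; for the file-level restore, stop the stack first):

```shell
# Restore a file-level backup (stack must be stopped)
cp -a "$DATA_DIR.backup-YYYY-MM-DD" "$DATA_DIR"

# Or load a logical dump into a fresh, empty database
psql "$DATABASE_URL" < mememory-YYYY-MM-DD.sql
```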

MCP Tools

Tool Description
remember Store a memory with scope, type, tags, optional TTL
recall Semantic search with hierarchical scope inheritance
forget Delete by ID
update Update content, re-embed
list List with filters
stats Count breakdown by scope/project/type
help Usage documentation

Key Concepts

Scopes — global (everywhere) and project (one project). Recall searches hierarchically: project scope sees global + its own project.

Types — fact, rule, decision, feedback, context, bootstrap. Only bootstrap memories load automatically at session start; all other types are loaded on demand via recall.

Scoring — similarity × scope_weight × memory_weight × temporal_decay. Recent, specific, high-weight memories rank higher. Project weight = 1.0, global weight = 0.8.
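A worked example with illustrative numbers (not values taken from the implementation): a project-scoped match with similarity 0.82, memory weight 1.0, and temporal decay 0.9 scores 0.82 × 1.0 × 1.0 × 0.9 = 0.738; the same match in global scope scores 0.82 × 0.8 × 1.0 × 0.9 ≈ 0.590.

```shell
# similarity * scope_weight * memory_weight * temporal_decay
awk 'BEGIN { printf "%.3f\n", 0.82 * 1.0 * 1.0 * 0.9 }'   # project scope -> 0.738
awk 'BEGIN { printf "%.3f\n", 0.82 * 0.8 * 1.0 * 0.9 }'   # global scope  -> 0.590
```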

Contradiction detection — warns when a new memory is >75% similar to existing ones. Does not block storage.

Session bootstrap — bootstrap-type memories are delivered to the agent at session start via the SessionStart hook or the mememory://bootstrap MCP resource. The bootstrap payload is bounded by a 30K-token budget (~15% of a 200K-token context window) and ends with a ## Bootstrap Stats block reporting project, source, memory counts, and budget usage. Project name is auto-resolved via a .mememory file at the project root (see docs/config/mememory-file.md), with fallbacks to git basename and cwd.
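The last two fallbacks of that project-name resolution can be approximated in shell (a sketch only; a .mememory file takes precedence, and its format is documented in docs/config/mememory-file.md):

```shell
# Fallbacks when no .mememory file is found:
# basename of the git repository root, else basename of the cwd
project=$(basename "$(git rev-parse --show-toplevel 2>/dev/null || pwd)")
echo "$project"
```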

Embedding Providers

Provider Model Dimension Setup
Ollama (default) nomic-embed-text 768 Included in Docker stack
OpenAI text-embedding-3-small 1536 Set EMBEDDING_PROVIDER=openai + API key
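Switching providers is an environment-only change. A sketch (OPENAI_API_KEY is an assumed variable name; check the docs for the exact one). Note that the dimensions differ (768 vs 1536), so vectors produced under one provider are not comparable with the other's:

```shell
export EMBEDDING_PROVIDER=openai
export OPENAI_API_KEY=sk-...   # assumed variable name
```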

Documentation

License

MIT

Directories

Path Synopsis
cmd
mememory command
mememory-admin command
mememory-server command
internal
api
bootstrap
Package bootstrap renders persisted memories into the SessionStart payload that an agent receives at the very beginning of a session.
mcp
projectconfig
Package projectconfig parses the .mememory file that lives at a project root.
