Documentation ¶
Overview ¶
Package simple provides a basic memory generator implementation.
WARNING: This implementation is NOT production-ready for the following reasons:
- Fixed importance scoring (0.5) without dynamic evaluation
- No forgetting/decay mechanism
- No memory consolidation or deduplication
- No security protection against memory poisoning
- Retrieval via vector similarity only (no hybrid graph/keyword retrieval)
For production use, consider:
- Mem0 (https://mem0.ai) - Hybrid datastore with graph/vector/KV
- Letta (https://letta.com) - Hierarchical memory with sleep-time compute
- Custom implementation based on your domain requirements
This package is suitable for:
- Development and testing environments
- Prototyping memory-aware features
- Learning about memory engineering concepts
Index ¶
Constants ¶
const DefaultEmbeddingModel = "BAAI/bge-m3"
DefaultEmbeddingModel is the default embedding model for episodic memories.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Config ¶
type Config struct {
// Enabled controls whether memory generation is active.
Enabled bool
// SummaryMaxTokens is the max tokens for LLM summary generation.
SummaryMaxTokens int
// MaxConcurrency limits concurrent memory generation tasks.
MaxConcurrency int
// Timeout is the maximum time for memory generation.
Timeout time.Duration
}
Config holds configuration for the memory generator.
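As a rough illustration, a caller might populate Config as in the sketch below. The values are arbitrary examples chosen for this sketch, not defaults defined by the package.

// Illustrative values only; assumes "time" is imported.
cfg := &Config{
	Enabled:          true,             // turn memory generation on
	SummaryMaxTokens: 512,              // example cap on LLM summary length
	MaxConcurrency:   4,                // example limit on concurrent generation tasks
	Timeout:          30 * time.Second, // example bound on each generation run
}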
type EmbeddingService ¶
EmbeddingService defines the interface for embedding generation.
type Generator ¶
type Generator struct {
// contains filtered or unexported fields
}
Generator generates episodic memories from completed interactions. See package documentation for limitations and production recommendations.
func NewGenerator ¶
func NewGenerator(
	store MemoryStore,
	llm LLMService,
	embedder EmbeddingService,
	config *Config,
) *Generator
NewGenerator creates a new memory generator.
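A minimal wiring sketch follows; myStore, myLLM, and myEmbedder are hypothetical caller-supplied values satisfying the MemoryStore, LLMService, and EmbeddingService interfaces documented below, and the Config values are illustrative.

// myStore, myLLM, and myEmbedder are hypothetical implementations of
// MemoryStore, LLMService, and EmbeddingService supplied by the caller.
// Assumes "time" is imported.
gen := NewGenerator(myStore, myLLM, myEmbedder, &Config{
	Enabled: true,
	Timeout: 30 * time.Second,
})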
func (*Generator) GenerateAsync ¶
func (g *Generator) GenerateAsync(ctx context.Context, req memory.MemoryRequest)
GenerateAsync starts asynchronous memory generation.
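A sketch of triggering asynchronous generation for a completed interaction, assuming gen was created with NewGenerator. The fields of memory.MemoryRequest are defined by the memory package and are left unpopulated here.

// Assumes "context" and this module's memory package are imported.
ctx := context.Background()
req := memory.MemoryRequest{
	// Populate with the completed interaction to summarize and store;
	// see the memory package for the available fields.
}
gen.GenerateAsync(ctx, req) // starts generation asynchronously; no return value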
func (*Generator) GenerateSync ¶
GenerateSync generates memory synchronously (for testing).
type LLMService ¶
type LLMService interface {
Chat(ctx context.Context, messages []llm.Message) (string, *llm.LLMCallStats, error)
}
LLMService defines the interface for LLM-based summary generation.
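Any type with a matching Chat method satisfies LLMService. For example, a hypothetical stub like the one below (not part of this package) could stand in for a real LLM client in tests:

// staticLLM is a hypothetical stub satisfying LLMService; it returns a canned
// summary and no call statistics. Assumes "context" and this module's llm
// package are imported.
type staticLLM struct{}

func (staticLLM) Chat(ctx context.Context, messages []llm.Message) (string, *llm.LLMCallStats, error) {
	return "canned summary of the interaction", nil, nil
}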
type MemoryStore ¶
type MemoryStore interface {
CreateEpisodicMemory(ctx context.Context, create *store.EpisodicMemory) (*store.EpisodicMemory, error)
UpsertEpisodicMemoryEmbedding(ctx context.Context, embedding *store.EpisodicMemoryEmbedding) (*store.EpisodicMemoryEmbedding, error)
}
MemoryStore defines the interface for memory persistence.
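Similarly, a hypothetical no-op stub (not part of this package) can satisfy MemoryStore in tests by echoing its inputs without persisting anything:

// noopStore is a hypothetical MemoryStore stub that performs no persistence.
// Assumes "context" and this module's store package are imported.
type noopStore struct{}

func (noopStore) CreateEpisodicMemory(ctx context.Context, create *store.EpisodicMemory) (*store.EpisodicMemory, error) {
	return create, nil // echo back without persisting
}

func (noopStore) UpsertEpisodicMemoryEmbedding(ctx context.Context, embedding *store.EpisodicMemoryEmbedding) (*store.EpisodicMemoryEmbedding, error) {
	return embedding, nil // echo back without persisting
}

A stub like this pairs naturally with GenerateSync in unit tests.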