Documentation
Overview
Package main demonstrates two ways to use the corpus embedded library.
LLM configuration is read from environment variables via the genai provider/env package.
Usage:
go run ./example/embedded/main.go [--advanced] <storage-dir>
Examples:
    # Simple (auto-composition)
    LLM_CHAT_COMPLETION_PROVIDER=openai \
    LLM_CHAT_COMPLETION_OPENAI_BASE_URL=http://localhost:11434/v1 \
    LLM_EMBEDDINGS_PROVIDER=openai \
    LLM_EMBEDDINGS_OPENAI_BASE_URL=http://localhost:11434/v1 \
    go run ./example/embedded/main.go /tmp/corpus-demo

    # Advanced (explicit pipeline), loading env from a .env file
    go run ./example/embedded/main.go --advanced /tmp/corpus-demo