Documentation ¶
Index ¶
- type Anthropic
- type BoltDB
- func (b BoltDB) AddChat(_ context.Context, chat models.Chat) (string, error)
- func (b BoltDB) AddMessage(_ context.Context, chatID string, message models.Message) (string, error)
- func (b BoltDB) Chats(context.Context) ([]models.Chat, error)
- func (b BoltDB) Messages(_ context.Context, chatID string) ([]models.Message, error)
- func (b BoltDB) UpdateChat(_ context.Context, chat models.Chat) error
- func (b BoltDB) UpdateMessage(_ context.Context, chatID string, message models.Message) error
- type LLMParameters
- type Ollama
- type OpenAI
- type OpenRouter
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Anthropic ¶
type Anthropic struct {
// contains filtered or unexported fields
}
Anthropic provides an interface to the Anthropic API for large language model interactions. It implements the LLM interface and handles streaming chat completions using Claude models.
func NewAnthropic ¶
func NewAnthropic(apiKey, model, systemPrompt string, maxTokens int, params LLMParameters) Anthropic
NewAnthropic creates a new Anthropic instance with the specified API key, model name, system prompt, maximum token limit, and optional LLM parameters. It initializes an HTTP client for API communication and returns a configured Anthropic instance ready for chat interactions.
func (Anthropic) Chat ¶
func (a Anthropic) Chat(ctx context.Context, messages []models.Message, tools []mcp.Tool) iter.Seq2[models.Content, error]
Chat streams responses from the Anthropic API for a given sequence of messages. It processes system messages separately and returns an iterator that yields response chunks and potential errors. The context can be used to cancel ongoing requests. Refer to models.Message for message structure details.
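A minimal usage sketch in the style of a godoc example. The models.Message field names (Role, Content) and the model identifier are assumptions not confirmed by this page, and imports are omitted as on a rendered example:

func ExampleAnthropic_Chat() {
	a := NewAnthropic(os.Getenv("ANTHROPIC_API_KEY"),
		"claude-3-5-sonnet-latest", "You are a helpful assistant.",
		1024, LLMParameters{})

	// Assumed shape of models.Message; see that package for details.
	msgs := []models.Message{{Role: "user", Content: "Hello!"}}

	// Chat returns an iter.Seq2, so the stream can be consumed with range.
	for content, err := range a.Chat(context.Background(), msgs, nil) {
		if err != nil {
			log.Fatal(err)
		}
		fmt.Print(content) // each chunk arrives as it is generated
	}
}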
func (Anthropic) GenerateTitle ¶
GenerateTitle generates a title for a given message using the Anthropic API. It sends a single message to the Anthropic API and returns the first response content as the title. The context can be used to cancel ongoing requests.
type BoltDB ¶
type BoltDB struct {
// contains filtered or unexported fields
}
BoltDB implements the Store interface using a BoltDB backend for persistent storage of chats and messages. It provides atomic operations for managing chat histories and their associated messages through a key-value storage model.
func NewBoltDB ¶
NewBoltDB creates a new BoltDB instance with the specified file path. It initializes the database with required buckets and returns an error if the database cannot be opened or initialized. The database file is created with 0600 permissions if it doesn't exist.
func (BoltDB) AddChat ¶
func (b BoltDB) AddChat(_ context.Context, chat models.Chat) (string, error)
AddChat stores a new chat record in the database and creates an associated message bucket. It generates a unique ID for the chat by combining a sequence number with the chat's original ID, and returns the new ID or an error if the operation fails.
func (BoltDB) AddMessage ¶
func (b BoltDB) AddMessage(_ context.Context, chatID string, message models.Message) (string, error)
AddMessage stores a new message in the specified chat's message bucket. It generates a unique ID for the message by combining a sequence number with the message's original ID, and returns the new ID or an error if the operation fails.
func (BoltDB) Chats ¶
func (b BoltDB) Chats(context.Context) ([]models.Chat, error)
Chats retrieves all stored chat records from the database in reverse chronological order. It returns a slice of Chat models or an error if the database operation fails.
func (BoltDB) Messages ¶
func (b BoltDB) Messages(_ context.Context, chatID string) ([]models.Message, error)
Messages retrieves all messages associated with the specified chat ID. It returns the messages in their stored order or an error if the database operation fails.
func (BoltDB) UpdateChat ¶
func (b BoltDB) UpdateChat(_ context.Context, chat models.Chat) error
UpdateChat modifies an existing chat record in the database. If the chat doesn't exist, the operation is silently ignored. Returns an error if the marshaling or database operation fails.
func (BoltDB) UpdateMessage ¶
func (b BoltDB) UpdateMessage(_ context.Context, chatID string, message models.Message) error
UpdateMessage modifies an existing message in the specified chat's message bucket. If the message doesn't exist, the operation is silently ignored. Returns an error if the marshaling or database operation fails.
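A sketch of a typical round trip. The constructor's exact signature is not shown on this page, so NewBoltDB(path string) (BoltDB, error) and the models.Chat and models.Message field names are assumptions:

func ExampleBoltDB() {
	db, err := NewBoltDB("chats.db") // assumed signature
	if err != nil {
		log.Fatal(err)
	}

	ctx := context.Background()

	// Store a chat, then attach a message to it.
	chatID, err := db.AddChat(ctx, models.Chat{Title: "First chat"})
	if err != nil {
		log.Fatal(err)
	}
	if _, err := db.AddMessage(ctx, chatID, models.Message{
		Role:    "user",
		Content: "Hello!",
	}); err != nil {
		log.Fatal(err)
	}

	// Chats returns records in reverse chronological order.
	chats, err := db.Chats(ctx)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(len(chats))
}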
type LLMParameters ¶
type LLMParameters struct {
	Temperature       *float32       `yaml:"temperature"`
	TopP              *float32       `yaml:"topP"`
	TopK              *int           `yaml:"topK"`
	FrequencyPenalty  *float32       `yaml:"frequencyPenalty"`
	PresencePenalty   *float32       `yaml:"presencePenalty"`
	RepetitionPenalty *float32       `yaml:"repetitionPenalty"`
	MinP              *float32       `yaml:"minP"`
	TopA              *float32       `yaml:"topA"`
	Seed              *int           `yaml:"seed"`
	MaxTokens         *int           `yaml:"maxTokens"`
	LogitBias         map[string]int `yaml:"logitBias"`
	Logprobs          *bool          `yaml:"logprobs"`
	TopLogprobs       *int           `yaml:"topLogprobs"`
	Stop              []string       `yaml:"stop"`
	IncludeReasoning  *bool          `yaml:"includeReasoning"`
}
LLMParameters contains the optional configuration parameters for LLM services.
Not all parameters are supported by all LLM providers. The parameters are documented in the corresponding LLM provider's documentation.
These parameters are taken from the OpenRouter documentation: https://openrouter.ai/docs/api-reference/parameters. For more information about what each parameter does, refer to that page.
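Because the scalar fields are pointers, unset parameters can be distinguished from explicit zero values. A construction sketch using a small generic helper (the helper is hypothetical, not part of this package):

// ptr is a hypothetical helper for taking the address of a literal.
func ptr[T any](v T) *T { return &v }

func ExampleLLMParameters() {
	params := LLMParameters{
		Temperature: ptr(float32(0.7)),
		TopP:        ptr(float32(0.9)),
		MaxTokens:   ptr(1024),
		Stop:        []string{"\n\n"},
	}
	_ = params // pass to NewAnthropic, NewOllama, etc.
}

The yaml struct tags mean the same values can equally be loaded from a configuration file.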
type Ollama ¶
type Ollama struct {
// contains filtered or unexported fields
}
Ollama provides an implementation of the LLM interface for interacting with Ollama's language models. It manages connections to an Ollama server instance and handles streaming chat completions.
func NewOllama ¶
func NewOllama(host, model, systemPrompt string, params LLMParameters, logger *slog.Logger) Ollama
NewOllama creates a new Ollama instance with the specified host URL, model name, system prompt, optional LLM parameters, and logger. The host parameter should be a valid URL pointing to an Ollama server. If the provided host URL is invalid, the function panics.
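Since NewOllama panics on a malformed host rather than returning an error, a caller may prefer to validate the URL up front. A defensive sketch (the default Ollama port is 11434; the model name is illustrative):

func ExampleNewOllama() {
	host := "http://localhost:11434"
	// Validate before construction to avoid the panic path.
	if _, err := url.ParseRequestURI(host); err != nil {
		log.Fatalf("invalid Ollama host %q: %v", host, err)
	}
	o := NewOllama(host, "llama3.1", "You are a helpful assistant.",
		LLMParameters{}, slog.Default())
	_ = o
}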
func (Ollama) Chat ¶
func (o Ollama) Chat(ctx context.Context, messages []models.Message, tools []mcp.Tool) iter.Seq2[models.Content, error]
Chat implements the LLM interface by streaming responses from the Ollama model. It accepts a context for cancellation and a slice of messages representing the conversation history. The function returns an iterator that yields response chunks and potential errors. The response is streamed incrementally, allowing for real-time processing of model outputs.
func (Ollama) GenerateTitle ¶
GenerateTitle generates a title for a given message using the Ollama API. It sends a single message to the Ollama API and returns the first response content as the title. The context can be used to cancel ongoing requests.
type OpenAI ¶
type OpenAI struct {
// contains filtered or unexported fields
}
OpenAI provides an implementation of the LLM interface for interacting with OpenAI's language models.
func NewOpenAI ¶
func NewOpenAI(apiKey, model, systemPrompt, endpoint string, params LLMParameters, logger *slog.Logger) OpenAI
NewOpenAI creates a new OpenAI instance with the specified API key, model name, system prompt, endpoint URL, optional LLM parameters, and logger.
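Because the endpoint is a parameter, the same client can target any OpenAI-compatible server. A construction sketch (the endpoint value, including whether it carries a /v1 suffix, is an assumption, as is the model name):

func ExampleNewOpenAI() {
	o := NewOpenAI(os.Getenv("OPENAI_API_KEY"), "gpt-4o",
		"You are a helpful assistant.",
		"https://api.openai.com/v1", // or any compatible server
		LLMParameters{}, slog.Default())
	_ = o
}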
type OpenRouter ¶
type OpenRouter struct {
// contains filtered or unexported fields
}
OpenRouter provides an implementation of the LLM interface for interacting with OpenRouter's language models.
func NewOpenRouter ¶
func NewOpenRouter(apiKey, model, systemPrompt string, params LLMParameters, logger *slog.Logger) OpenRouter
NewOpenRouter creates a new OpenRouter instance with the specified API key, model name, system prompt, optional LLM parameters, and logger.
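A construction sketch that reuses the hypothetical ptr helper from the LLMParameters example (the model identifier is illustrative):

func ExampleNewOpenRouter() {
	or := NewOpenRouter(os.Getenv("OPENROUTER_API_KEY"),
		"anthropic/claude-3.5-sonnet", "You are a helpful assistant.",
		LLMParameters{
			Temperature:      ptr(float32(0.7)),
			IncludeReasoning: ptr(true),
		},
		slog.Default())
	_ = or
}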
func (OpenRouter) Chat ¶
func (o OpenRouter) Chat(ctx context.Context, messages []models.Message, tools []mcp.Tool) iter.Seq2[models.Content, error]
Chat streams responses from the OpenRouter API for a given sequence of messages. It processes system messages separately and returns an iterator that yields response chunks and potential errors. The context can be used to cancel ongoing requests. Refer to models.Message for message structure details.
func (OpenRouter) GenerateTitle ¶
GenerateTitle generates a title for a given message using the OpenRouter API. It sends a single message to the OpenRouter API and returns the first response content as the title. The context can be used to cancel ongoing requests.