Documentation ¶
Overview ¶
Package llm provides a unified interface for Large Language Model providers. It abstracts different LLM providers (Anthropic Claude, OpenAI GPT) behind a common Thread interface for consistent interaction patterns.
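A minimal end-to-end sketch of the package's core flow (the import paths below are placeholders for the module's real paths, and the Thread's methods are omitted since they are defined on the llmtypes.Thread interface):

```go
package main

import (
	"fmt"
	"log"

	llm "kodelet/pkg/llm"            // placeholder import path
	llmtypes "kodelet/pkg/types/llm" // placeholder import path
)

func main() {
	// Load provider, model, and credentials from Viper configuration.
	config, err := llm.GetConfigFromViper()
	if err != nil {
		log.Fatal(err)
	}

	// NewThread returns a provider-specific implementation (Anthropic,
	// OpenAI, ...) behind the common llmtypes.Thread interface.
	var thread llmtypes.Thread
	if thread, err = llm.NewThread(config); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("created a thread of concrete type %T\n", thread)
}
```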
Index ¶
- Constants
- func ExtractConversationEntries(provider string, rawMessages []byte, metadata map[string]any, ...) ([]conversations.StreamableMessage, error)
- func ExtractMessages(provider string, rawMessages []byte, metadata map[string]any, ...) ([]llmtypes.Message, error)
- func GetConfigFromViper() (llmtypes.Config, error)
- func GetConfigFromViperWithCmd(cmd *cobra.Command) (llmtypes.Config, error)
- func GetConfigFromViperWithProfile(profileName string) (llmtypes.Config, error)
- func GetConfigFromViperWithProfileAndCmd(profileName string, cmd *cobra.Command) (llmtypes.Config, error)
- func GetConfigFromViperWithoutProfile() (llmtypes.Config, error)
- func NewConversationStreamer(ctx context.Context) (streamer *conversations.ConversationStreamer, closer func() error, err error)
- func NewThread(config llmtypes.Config) (llmtypes.Thread, error)
- func RenderConversationMarkdown(provider string, rawMessages []byte, metadata map[string]any, ...) (string, error)
- func RenderConversationMarkdownWithOptions(provider string, rawMessages []byte, metadata map[string]any, ...) (string, error)
- func SendMessageAndGetText(ctx context.Context, state tooltypes.State, query string, ...) string
- func SendMessageAndGetTextWithUsage(ctx context.Context, state tooltypes.State, query string, ...) (string, llmtypes.Usage)
- type ConversationMarkdownOptions
Constants ¶
const (
	DefaultMaxToolResultCharacters = conversations.DefaultMaxToolResultCharacters
	DefaultMaxToolResultBytes      = conversations.DefaultMaxToolResultBytes
)
Variables ¶
This section is empty.
Functions ¶
func ExtractConversationEntries ¶
func ExtractConversationEntries(provider string, rawMessages []byte, metadata map[string]any, toolResults map[string]tooltypes.StructuredToolResult) ([]conversations.StreamableMessage, error)
ExtractConversationEntries parses the raw messages from a conversation record into structured conversation entries.
func ExtractMessages ¶
func ExtractMessages(provider string, rawMessages []byte, metadata map[string]any, toolResults map[string]tooltypes.StructuredToolResult) ([]llmtypes.Message, error)
ExtractMessages parses the raw messages from a conversation record into llmtypes.Message values.
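A sketch of the call shape (the variables holding the persisted conversation data are illustrative names, not fields of any documented type):

```go
// rawMessages is the provider-specific JSON persisted with the conversation;
// its schema differs between providers, which is why the provider name is
// passed alongside it.
msgs, err := llm.ExtractMessages(provider, rawMessages, metadata, toolResults)
if err != nil {
	return err
}
fmt.Printf("parsed %d messages\n", len(msgs))
```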
func GetConfigFromViper ¶
func GetConfigFromViper() (llmtypes.Config, error)
GetConfigFromViper loads the LLM configuration from Viper, applies the active profile if set, and resolves any model aliases.
func GetConfigFromViperWithCmd ¶
func GetConfigFromViperWithCmd(cmd *cobra.Command) (llmtypes.Config, error)
GetConfigFromViperWithCmd loads the LLM configuration from Viper with command context. When a cobra.Command is provided, CLI flags that were explicitly changed take priority over profile settings.
func GetConfigFromViperWithProfile ¶
func GetConfigFromViperWithProfile(profileName string) (llmtypes.Config, error)
GetConfigFromViperWithProfile loads configuration from Viper while applying the provided profile name instead of the globally active viper profile. This is useful for request-scoped profile selection (for example, in the web UI) without mutating shared process-wide viper state.
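For example, a hypothetical web handler could resolve a per-request profile (handler name and query parameter are illustrative):

```go
func handleChat(w http.ResponseWriter, r *http.Request) {
	// Resolve the profile for this request only; shared viper state is
	// left untouched, so concurrent requests can use different profiles.
	config, err := llm.GetConfigFromViperWithProfile(r.URL.Query().Get("profile"))
	if err != nil {
		http.Error(w, err.Error(), http.StatusBadRequest)
		return
	}
	_ = config // construct a request-scoped thread from config here
}
```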
func GetConfigFromViperWithProfileAndCmd ¶
func GetConfigFromViperWithProfileAndCmd(profileName string, cmd *cobra.Command) (llmtypes.Config, error)
GetConfigFromViperWithProfileAndCmd loads configuration from Viper while applying the provided profile name and then re-applying any explicitly changed Cobra flags on top.
func GetConfigFromViperWithoutProfile ¶
func GetConfigFromViperWithoutProfile() (llmtypes.Config, error)
GetConfigFromViperWithoutProfile loads configuration from Viper without applying any active profile from shared process state.
func NewConversationStreamer ¶
func NewConversationStreamer(ctx context.Context) (streamer *conversations.ConversationStreamer, closer func() error, err error)
NewConversationStreamer creates a fully configured conversation streamer with all provider message parsers pre-registered.
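The returned closer should be invoked once streaming is finished; a sketch:

```go
streamer, closer, err := llm.NewConversationStreamer(ctx)
if err != nil {
	return err
}
defer func() {
	// Release the streamer's resources; log rather than mask the
	// close error, since the deferred call cannot return it.
	if cerr := closer(); cerr != nil {
		log.Printf("closing conversation streamer: %v", cerr)
	}
}()
_ = streamer // provider message parsers are already registered
```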
func NewThread ¶
func NewThread(config llmtypes.Config) (llmtypes.Thread, error)
NewThread creates a new Thread for the provider selected by the given configuration.
func RenderConversationMarkdown ¶
func RenderConversationMarkdown( provider string, rawMessages []byte, metadata map[string]any, toolResults map[string]tooltypes.StructuredToolResult, ) (string, error)
RenderConversationMarkdown converts a stored conversation into markdown using the same formatting logic as `kodelet conversation show --format markdown`.
func RenderConversationMarkdownWithOptions ¶
func RenderConversationMarkdownWithOptions( provider string, rawMessages []byte, metadata map[string]any, toolResults map[string]tooltypes.StructuredToolResult, opts ConversationMarkdownOptions, ) (string, error)
RenderConversationMarkdownWithOptions converts a stored conversation into markdown with optional output-shaping controls.
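Because ConversationMarkdownOptions is an alias for conversations.MarkdownOptions, its fields are documented on that type; the zero value gives default rendering. A sketch:

```go
// Zero-value options render with defaults; see conversations.MarkdownOptions
// for the available output-shaping fields.
md, err := llm.RenderConversationMarkdownWithOptions(
	provider, rawMessages, metadata, toolResults,
	llm.ConversationMarkdownOptions{},
)
if err != nil {
	return err
}
fmt.Println(md)
```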
func SendMessageAndGetText ¶
func SendMessageAndGetText(ctx context.Context, state tooltypes.State, query string, config llmtypes.Config, silent bool, opt llmtypes.MessageOpt) string
SendMessageAndGetText is a convenience function for one-shot queries that returns the response as a string.
func SendMessageAndGetTextWithUsage ¶
func SendMessageAndGetTextWithUsage(ctx context.Context, state tooltypes.State, query string, config llmtypes.Config, silent bool, opt llmtypes.MessageOpt) (string, llmtypes.Usage)
SendMessageAndGetTextWithUsage is a convenience function for one-shot queries that returns the response as a string along with usage information.
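A one-shot sketch (the meaning of the silent flag and the zero-value MessageOpt are assumptions; consult llmtypes for the available options):

```go
text, usage := llm.SendMessageAndGetTextWithUsage(
	ctx,
	state, // tooltypes.State made available to tool calls
	"Summarize this repository.",
	config,
	true,                  // silent: suppress intermediate output (assumed semantics)
	llmtypes.MessageOpt{}, // zero value: default per-message options
)
fmt.Println(text)
_ = usage // token accounting for the exchange
```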
Types ¶
type ConversationMarkdownOptions ¶
type ConversationMarkdownOptions = conversations.MarkdownOptions
ConversationMarkdownOptions controls markdown rendering for stored conversations.
Directories ¶

| Path | Synopsis |
|---|---|
| anthropic | Package anthropic provides a client implementation for interacting with Anthropic's Claude AI models. |
| base | Package base provides shared functionality for LLM thread implementations. |
| google | Package google provides a client implementation for interacting with Google's GenAI models. |
| openai | Package openai provides OpenAI API client implementations. |
| preset/codex | Package codex provides preset configurations for Codex CLI models. |
| preset/openai | Package openai provides preset configurations for OpenAI models. |
| preset/xai | Package xai provides preset configurations for xAI Grok models. |
| responses | Package responses implements storage types for the OpenAI Responses API. |
| prompts | Package prompts provides common prompt templates for LLM interactions, including context compacting prompts and other reusable prompt text used throughout kodelet's LLM communication. |