Documentation
Index
Constants
This section is empty.
Variables
This section is empty.
Functions
This section is empty.
Types
type LocalAIClient
type LocalAIClient struct {
// contains filtered or unexported fields
}
LocalAIClient is an LLM client for LocalAI-compatible APIs. It uses the same request format as OpenAI but parses an additional "reasoning" field in the response JSON (in choices[].message) and maps it to LLMReply.ReasoningContent.
func NewLocalAILLM
func NewLocalAILLM(model, apiKey, baseURL string) *LocalAIClient
NewLocalAILLM creates a new LocalAI client with the same constructor signature as NewOpenAILLM. baseURL is the API base (e.g. "http://localhost:8080/v1").
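As a minimal sketch of constructing a client (the import path `github.com/mudler/cogito/llm` is an assumption here; adjust it to wherever this package lives in your module graph):

```go
package main

import (
	"fmt"

	llmclient "github.com/mudler/cogito/llm" // assumed import path
)

func main() {
	// NewLocalAILLM takes (model, apiKey, baseURL); baseURL is the
	// OpenAI-compatible API base of a running LocalAI instance.
	client := llmclient.NewLocalAILLM(
		"gpt-4", // model name as configured in LocalAI
		"",      // LocalAI typically needs no API key; pass one if yours does
		"http://localhost:8080/v1",
	)
	fmt.Printf("%T\n", client)
}
```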
func (*LocalAIClient) Ask
Ask prompts the LLM with the provided messages and returns a Fragment containing the response. It uses CreateChatCompletion internally, so LocalAI's reasoning field is preserved in the reply. The Fragment's Status.LastUsage is updated with the token usage.
func (*LocalAIClient) CreateChatCompletion
func (llm *LocalAIClient) CreateChatCompletion(ctx context.Context, request openai.ChatCompletionRequest) (cogito.LLMReply, cogito.LLMUsage, error)
CreateChatCompletion sends the chat completion request and parses the response, including LocalAI's optional "reasoning" field, into LLMReply.ReasoningContent.
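A sketch of calling CreateChatCompletion directly, using the documented signature. The `openai` package is assumed to be `github.com/sashabaranov/go-openai`, and the client import path is likewise an assumption:

```go
package main

import (
	"context"
	"fmt"
	"log"

	llmclient "github.com/mudler/cogito/llm"   // assumed import path
	openai "github.com/sashabaranov/go-openai" // assumed openai package
)

func main() {
	client := llmclient.NewLocalAILLM("gpt-4", "", "http://localhost:8080/v1")

	req := openai.ChatCompletionRequest{
		Messages: []openai.ChatCompletionMessage{
			{Role: openai.ChatMessageRoleUser, Content: "Why is the sky blue?"},
		},
	}

	reply, usage, err := client.CreateChatCompletion(context.Background(), req)
	if err != nil {
		log.Fatal(err)
	}
	// ReasoningContent carries LocalAI's optional "reasoning" field from
	// choices[].message; it is empty for models that do not emit reasoning.
	fmt.Println("reasoning:", reply.ReasoningContent)
	fmt.Printf("usage: %+v\n", usage)
}
```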
func (*LocalAIClient) SetGrammar
func (llm *LocalAIClient) SetGrammar(grammar string)
SetGrammar sets a GBNF grammar string that constrains the model's output. When set, the grammar is included in the request body sent to LocalAI.
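For illustration, a client constrained to answer only "yes" or "no" might look like this; the grammar below uses llama.cpp's GBNF syntax and is an example of ours, not from the package docs, and the import path is an assumption:

```go
package main

import (
	"fmt"

	llmclient "github.com/mudler/cogito/llm" // assumed import path
)

// yesNoGrammar is an illustrative GBNF grammar that restricts decoding to
// the literal strings "yes" or "no".
const yesNoGrammar = `root ::= "yes" | "no"`

func main() {
	client := llmclient.NewLocalAILLM("gpt-4", "", "http://localhost:8080/v1")
	client.SetGrammar(yesNoGrammar)
	// Subsequent Ask/CreateChatCompletion calls now include the grammar in
	// the request body, so LocalAI constrains the model's output to match it.
	fmt.Println("grammar set")
}
```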
type OpenAIClient
type OpenAIClient struct {
// contains filtered or unexported fields
}
func NewOpenAILLM
func NewOpenAILLM(model, apiKey, baseURL string) *OpenAIClient
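A construction sketch, assuming the same import path as above; the base URL here points at OpenAI's hosted API rather than a LocalAI instance:

```go
package main

import (
	"fmt"
	"os"

	llmclient "github.com/mudler/cogito/llm" // assumed import path
)

func main() {
	// Same (model, apiKey, baseURL) shape as NewLocalAILLM.
	client := llmclient.NewOpenAILLM(
		"gpt-4o",
		os.Getenv("OPENAI_API_KEY"),
		"https://api.openai.com/v1",
	)
	fmt.Printf("%T\n", client)
}
```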
func (*OpenAIClient) Ask
Ask prompts the LLM with the provided messages and returns a Fragment containing the response. The Fragment.GetMessages() method automatically handles force-text-reply when tool calls are present in the conversation history. The Fragment's Status.LastUsage is updated with the token usage.
func (*OpenAIClient) CreateChatCompletion
func (llm *OpenAIClient) CreateChatCompletion(ctx context.Context, request openai.ChatCompletionRequest) (cogito.LLMReply, cogito.LLMUsage, error)