llm

package
v1.6.0 Latest
Warning

This package is not in the latest version of its module.

Published: Feb 19, 2026 License: MIT Imports: 11 Imported by: 0

Documentation

Overview

Package llm provides interfaces and implementations for LLM completion clients, including a local Ollama client and an OpenAI client.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func NormalizeJSONArraysToStrings added in v1.4.1

func NormalizeJSONArraysToStrings(jsonBytes []byte) ([]byte, bool, error)

NormalizeJSONArraysToStrings walks a JSON structure and converts arrays of strings to comma-joined strings. This handles cases where the LLM returns arrays where strings are expected (e.g., {"object": ["a", "b"]} becomes {"object": "a, b"}).

Note: Top-level arrays are preserved. Only arrays within object fields are normalized.

Returns:

  • normalized JSON bytes
  • bool indicating whether any normalization occurred
  • error if JSON parsing fails

Types

type LLMClient

type LLMClient interface {
	// Complete sends a prompt to the LLM and returns the raw completion text
	Complete(ctx context.Context, prompt string) (string, error)

	// CompleteWithSchema sends a prompt and unmarshals the response into the provided schema
	// The schema parameter should be a pointer to the target struct
	CompleteWithSchema(ctx context.Context, prompt string, schema any) error
}

LLMClient defines the interface for interacting with large language models

type OllamaClient added in v1.1.1

type OllamaClient struct {
	// contains filtered or unexported fields
}

OllamaClient implements LLMClient using local Ollama API

func NewOllamaClient added in v1.1.1

func NewOllamaClient(baseURL, model string) *OllamaClient

NewOllamaClient creates a new Ollama LLM client. baseURL is typically "http://localhost:11434"; model is the LLM model name, e.g. "mistral".
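The constructor's two parameters correspond to Ollama's HTTP API. As a sketch only, assuming the client talks to the standard public `/api/generate` endpoint (this package's internals are unexported, so the exact endpoint and payload it uses are an assumption), a request would be built roughly like this:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// generateRequest mirrors the payload of Ollama's public /api/generate
// endpoint; whether OllamaClient uses this exact shape is an assumption.
type generateRequest struct {
	Model  string `json:"model"`
	Prompt string `json:"prompt"`
	Stream bool   `json:"stream"`
}

// newGenerateRequest builds (but does not send) a completion request.
func newGenerateRequest(baseURL, model, prompt string) (*http.Request, error) {
	body, err := json.Marshal(generateRequest{Model: model, Prompt: prompt})
	if err != nil {
		return nil, err
	}
	req, err := http.NewRequest(http.MethodPost, baseURL+"/api/generate", bytes.NewReader(body))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Content-Type", "application/json")
	return req, nil
}

func main() {
	req, err := newGenerateRequest("http://localhost:11434", "mistral", "Say hello")
	if err != nil {
		panic(err)
	}
	fmt.Println(req.Method, req.URL.String()) // POST http://localhost:11434/api/generate
}
```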

func (*OllamaClient) Complete added in v1.1.1

func (c *OllamaClient) Complete(ctx context.Context, prompt string) (string, error)

Complete sends a prompt to the LLM and returns the raw completion text

func (*OllamaClient) CompleteWithSchema added in v1.1.1

func (c *OllamaClient) CompleteWithSchema(ctx context.Context, prompt string, schema any) error

CompleteWithSchema sends a prompt and unmarshals the response into the provided schema

type OpenAILLM

type OpenAILLM struct {
	APIKey  string
	Model   string
	BaseURL string
	// contains filtered or unexported fields
}

OpenAILLM implements LLMClient for OpenAI's Chat Completions API

func NewOpenAILLM

func NewOpenAILLM(apiKey string) *OpenAILLM

NewOpenAILLM creates a new OpenAI LLM client

func (*OpenAILLM) Complete

func (o *OpenAILLM) Complete(ctx context.Context, prompt string) (string, error)

Complete sends a prompt to the OpenAI Chat Completions API and returns the response

func (*OpenAILLM) CompleteWithSchema

func (o *OpenAILLM) CompleteWithSchema(ctx context.Context, prompt string, schema any) error

CompleteWithSchema sends a prompt and unmarshals the JSON response into the provided schema
