chat

package
v1.10.5 Latest
Warning

This package is not in the latest version of its module.

Published: Apr 2, 2026 License: MIT Imports: 27 Imported by: 0

README

Chat

Multi-provider AI client for building intelligent features in GTB applications.

Supports:

  • OpenAI
  • OpenAI-compatible endpoints (e.g. Ollama, Groq)
  • Claude (Anthropic)
  • Claude Local (via CLI binary)
  • Gemini (Google)

For detailed documentation, usage examples, and configuration guides, see the Chat Component Documentation.

Documentation

Overview

Package chat provides a unified multi-provider AI chat client supporting Claude, OpenAI, Gemini, Claude Local (via CLI binary), and OpenAI-compatible endpoints.

The ChatClient interface exposes four core methods: Add (append a message), Chat (multi-turn conversation with tool use via a ReAct loop), Ask (structured output with JSON schema validation), and SetTools (register callable tools).

Tools declare their parameters with JSON Schema definitions, and the ReAct loop automatically dispatches tool calls and feeds results back to the model until it produces a final text response. Per-provider token limits and the maximum number of agent steps are configurable via Config.

New providers can be registered at runtime via RegisterProvider. Structured output helpers such as GenerateSchema simplify schema generation for Ask calls.
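A minimal usage sketch tying these pieces together. The nil props argument and the prompt are illustrative only; a real application would pass its own *props.Props value, and New may require a non-nil one:

```go
package main

import (
	"context"
	"fmt"

	"github.com/phpboyscout/go-tool-base/pkg/chat"
)

func main() {
	ctx := context.Background()

	// Construct a client for the Claude provider. The second argument is
	// the host application's *props.Props; nil is shown for brevity.
	client, err := chat.New(ctx, nil, chat.Config{
		Provider:     chat.ProviderClaude,
		SystemPrompt: "You are a terse assistant.",
	})
	if err != nil {
		return
	}

	// Chat runs the ReAct loop and returns the final text response.
	reply, err := client.Chat(ctx, "Summarise this package in one sentence.")
	if err != nil {
		return
	}
	fmt.Println(reply)
}
```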


Constants

const (
	// ConfigKeyAIProvider is the config key for the AI provider.
	ConfigKeyAIProvider = "ai.provider"
	// ConfigKeyOpenAIKey is the config key for the OpenAI API key.
	ConfigKeyOpenAIKey = "openai.api.key"
	// ConfigKeyClaudeKey is the config key for the Claude/Anthropic API key.
	ConfigKeyClaudeKey = "anthropic.api.key"
	// ConfigKeyGeminiKey is the config key for the Gemini API key.
	ConfigKeyGeminiKey = "gemini.api.key"
)
const (
	// EnvAIProvider is the environment variable for overriding the AI provider.
	EnvAIProvider = "AI_PROVIDER"
	// EnvOpenAIKey is the environment variable for overriding the OpenAI API key.
	EnvOpenAIKey = "OPENAI_API_KEY"
	// EnvClaudeKey is the environment variable for overriding the Claude API key.
	EnvClaudeKey = "ANTHROPIC_API_KEY"
	// EnvGeminiKey is the environment variable for overriding the Gemini API key.
	EnvGeminiKey = "GEMINI_API_KEY"
)
const (
	// DefaultModelGemini is the default model for the Gemini provider.
	DefaultModelGemini = "gemini-3-flash-preview"

	// DefaultModelClaude is the default model for the Claude provider.
	DefaultModelClaude = "claude-sonnet-4-5"

	// DefaultModelOpenAI is the default model for the OpenAI provider.
	DefaultModelOpenAI = openai.ChatModelGPT5

	// DefaultMaxSteps is the default maximum number of ReAct loop iterations.
	DefaultMaxSteps = 20

	// DefaultMaxTokensOpenAI is the default maximum tokens per response for OpenAI.
	DefaultMaxTokensOpenAI = 4096

	// DefaultMaxTokensClaude is the default maximum tokens per response for Claude.
	DefaultMaxTokensClaude = 8192

	// DefaultMaxTokensGemini is the default maximum tokens per response for Gemini.
	DefaultMaxTokensGemini = 8192
)
const SnapshotVersion = 1

SnapshotVersion is the current version of the snapshot format. Increment when the format changes in a way that requires migration.

Variables

var (
	// allow mocking in tests.
	ExportExecCommand  = exec.CommandContext
	ExportExecLookPath = exec.LookPath
)
var (
	// allow mocking in tests.
	ExportGenaiNewClient = genai.NewClient
)

Functions

func GenerateSchema

func GenerateSchema[T any]() any

GenerateSchema creates a JSON schema for a given type T. OpenAI's structured outputs feature uses a subset of JSON schema. The reflector is configured with flags to ensure the generated schema complies with this specific subset.
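For example, a sketch of wiring GenerateSchema into a Config for structured Ask calls. The ErrorAnalysis type is hypothetical:

```go
package main

import (
	"github.com/phpboyscout/go-tool-base/pkg/chat"
)

// ErrorAnalysis is a hypothetical target type for a structured response.
type ErrorAnalysis struct {
	Summary  string `json:"summary"`
	Severity string `json:"severity"`
}

func main() {
	// Generate a schema compatible with the provider's structured-output
	// subset and attach it to the client configuration.
	cfg := chat.Config{
		Provider:       chat.ProviderOpenAI,
		ResponseSchema: chat.GenerateSchema[ErrorAnalysis](),
		SchemaName:     "error_analysis",
	}
	_ = cfg
}
```

A client built from this Config will enforce the schema on Ask responses, which can then be unmarshalled into an ErrorAnalysis value.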

func RegisterProvider

func RegisterProvider(name Provider, factory ProviderFactory)

RegisterProvider registers a factory function for a provider name. Call this from an init() function in your provider file or external package.
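A sketch of registering a custom provider. The myClient type is a stand-in, and the props import path is an assumption for illustration:

```go
package main

import (
	"context"

	"github.com/phpboyscout/go-tool-base/pkg/chat"
	"github.com/phpboyscout/go-tool-base/pkg/props" // assumed import path
)

// myClient is a hypothetical ChatClient implementation.
type myClient struct{}

func (m *myClient) Add(ctx context.Context, prompt string) error   { return nil }
func (m *myClient) Ask(ctx context.Context, q string, t any) error { return nil }
func (m *myClient) SetTools(tools []chat.Tool) error               { return nil }
func (m *myClient) Chat(ctx context.Context, prompt string) (string, error) {
	return "", nil
}

func init() {
	// Register the factory under a new provider name so chat.New can
	// construct it when Config.Provider is "my-provider".
	chat.RegisterProvider(chat.Provider("my-provider"),
		func(ctx context.Context, p *props.Props, cfg chat.Config) (chat.ChatClient, error) {
			return &myClient{}, nil
		})
}

func main() {}
```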

Types

type ChatClient

type ChatClient interface {
	// Add appends a user message to the conversation history without
	// triggering a completion. The message persists for subsequent
	// Chat() or Ask() calls.
	Add(ctx context.Context, prompt string) error
	// Ask sends a question and unmarshals the structured response into
	// target. If Config.ResponseSchema was set during construction, the
	// provider enforces that schema. If no schema is set, the provider
	// returns the raw text content unmarshalled into target (which must
	// be a *string or implement json.Unmarshaler).
	Ask(ctx context.Context, question string, target any) error
	// SetTools configures the tools available to the AI. This replaces
	// (not appends to) any previously set tools.
	SetTools(tools []Tool) error
	// Chat sends a message and returns the response content. If tools
	// are configured, the provider handles tool calls internally via a
	// ReAct loop bounded by Config.MaxSteps (default 20).
	Chat(ctx context.Context, prompt string) (string, error)
}

ChatClient defines the interface for interacting with a chat service.

Implementations are NOT safe for concurrent use by multiple goroutines. Each goroutine should use its own ChatClient instance.

Message history from Add() calls persists across Chat() and Ask() calls within the same client instance. To start a fresh conversation, create a new client via chat.New().

func New

func New(ctx context.Context, p *props.Props, cfg Config) (ChatClient, error)

New creates a ChatClient for the configured provider.

type Claude

type Claude struct {
	// contains filtered or unexported fields
}

Claude implements the ChatClient interface using Anthropic's official Go SDK.

func (*Claude) Add

func (c *Claude) Add(_ context.Context, prompt string) error

Add appends a new user message to the chat session.

func (*Claude) Ask

func (c *Claude) Ask(ctx context.Context, question string, target any) error

Ask sends a question to the Claude chat client and expects a structured response.

func (*Claude) Chat

func (c *Claude) Chat(ctx context.Context, prompt string) (string, error)

Chat sends a message and returns the response content.

func (*Claude) Restore added in v1.9.2

func (c *Claude) Restore(snapshot *Snapshot) error

Restore replaces the current conversation state with a previously saved snapshot.

func (*Claude) Save added in v1.9.2

func (c *Claude) Save() (*Snapshot, error)

Save captures the current Claude conversation state as a snapshot.

func (*Claude) SetTools

func (c *Claude) SetTools(tools []Tool) error

SetTools configures the tools available to the AI.

func (*Claude) StreamChat added in v1.8.0

func (c *Claude) StreamChat(ctx context.Context, prompt string, callback StreamCallback) (string, error)

StreamChat implements StreamingChatClient.

type ClaudeLocal

type ClaudeLocal struct {
	// contains filtered or unexported fields
}

ClaudeLocal implements the ChatClient interface using a locally installed claude CLI binary. This provider is useful in environments where direct API access to api.anthropic.com is blocked but the pre-authenticated claude binary is permitted.

func (*ClaudeLocal) Add

func (c *ClaudeLocal) Add(_ context.Context, prompt string) error

Add buffers a user message to be prepended to the next Chat or Ask call.

func (*ClaudeLocal) Ask

func (c *ClaudeLocal) Ask(ctx context.Context, question string, target any) error

Ask sends a question to the local claude binary and unmarshals the structured response into the target using --json-schema for schema-enforced output.

func (*ClaudeLocal) Chat

func (c *ClaudeLocal) Chat(ctx context.Context, prompt string) (string, error)

Chat sends a message to the local claude binary and returns the text response.

func (*ClaudeLocal) SetTools

func (c *ClaudeLocal) SetTools(_ []Tool) error

SetTools is not supported in Phase 1 of ProviderClaudeLocal. Tool integration via MCP server is planned for a future release.

type Config

type Config struct {
	// Provider is the AI service provider to use.
	Provider Provider
	// Model is the specific model to use (e.g., "gpt-4o", "claude-3-5-sonnet").
	Model string
	// Token is the API key or token for the service.
	Token string
	// BaseURL overrides the API endpoint. Required when using ProviderOpenAICompatible.
	// Example: "http://localhost:11434/v1" for Ollama, "https://api.groq.com/openai/v1" for Groq.
	BaseURL string
	// SystemPrompt is the initial system prompt to set the context for the AI.
	SystemPrompt string
	// ResponseSchema is the JSON schema used to force a structured output from the AI.
	ResponseSchema any
	// SchemaName is the name of the response schema (e.g., "error_analysis").
	SchemaName string
	// SchemaDescription is a description of the response schema.
	SchemaDescription string
	// MaxSteps limits the number of ReAct loop iterations in Chat().
	// Zero means use the default (DefaultMaxSteps = 20).
	MaxSteps int
	// MaxTokens sets the maximum tokens per response.
	// Zero means use the provider default (OpenAI: 4096, Claude: 8192, Gemini: 8192).
	MaxTokens int
	// ParallelTools enables concurrent execution of multiple tool calls
	// within a single ReAct step. Disabled by default.
	ParallelTools bool
	// MaxParallelTools limits the number of tools executing concurrently.
	// Zero means use the default (5). Only effective when ParallelTools is true.
	MaxParallelTools int
}

Config holds configuration for a chat client.
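For instance, a Config pointing the OpenAI-compatible provider at a local Ollama server (the BaseURL follows the example in the field comment above; the model name is illustrative):

```go
package main

import (
	"github.com/phpboyscout/go-tool-base/pkg/chat"
)

func main() {
	cfg := chat.Config{
		Provider: chat.ProviderOpenAICompatible,
		BaseURL:  "http://localhost:11434/v1", // local Ollama endpoint
		Model:    "llama3.1",                  // hypothetical local model
		MaxSteps: 10,                          // cap the ReAct loop below the default 20
	}
	_ = cfg
}
```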

type ConversationStore added in v1.9.2

type ConversationStore interface {
	// Save writes a snapshot to the store.
	Save(ctx context.Context, snapshot *Snapshot) error
	// Load retrieves a snapshot by ID.
	Load(ctx context.Context, id string) (*Snapshot, error)
	// List returns summaries of all stored snapshots.
	List(ctx context.Context) ([]SnapshotSummary, error)
	// Delete removes a snapshot by ID.
	Delete(ctx context.Context, id string) error
}

ConversationStore persists and retrieves conversation snapshots.

func NewFileStore added in v1.9.2

func NewFileStore(fs afero.Fs, dir string, opts ...FileStoreOption) (ConversationStore, error)

NewFileStore creates a ConversationStore that persists snapshots as JSON files. Files are stored in dir with 0600 permissions. The directory is created with 0700 permissions if it doesn't exist.

Example
package main

import (
	"github.com/spf13/afero"

	"github.com/phpboyscout/go-tool-base/pkg/chat"
)

func main() {
	// Create a FileStore for persisting chat conversation snapshots.
	store, err := chat.NewFileStore(afero.NewMemMapFs(), "/conversations")
	if err != nil {
		return
	}

	// Save, Load, List, Delete snapshots
	_ = store
}
Example (WithEncryption)
package main

import (
	"github.com/spf13/afero"

	"github.com/phpboyscout/go-tool-base/pkg/chat"
)

func main() {
	// Encrypt stored snapshots with AES-256-GCM (key must be 32 bytes).
	key := make([]byte, 32) // In real usage, use a secure key source

	store, err := chat.NewFileStore(afero.NewMemMapFs(), "/conversations",
		chat.WithEncryption(key),
	)
	if err != nil {
		return
	}

	_ = store
}

type FileStoreOption added in v1.9.2

type FileStoreOption func(*fileStoreConfig)

FileStoreOption configures a FileStore.

func WithEncryption added in v1.9.2

func WithEncryption(key []byte) FileStoreOption

WithEncryption enables AES-256-GCM encryption for stored snapshots. The key must be exactly 32 bytes.

type Gemini

type Gemini struct {
	// contains filtered or unexported fields
}

Gemini implements the ChatClient interface using Google's Generative AI SDK.

func (*Gemini) Add

func (g *Gemini) Add(_ context.Context, prompt string) error

Add appends a user message to the conversation history.

func (*Gemini) Ask

func (g *Gemini) Ask(ctx context.Context, question string, target any) error

Ask sends a question to the Gemini chat client and expects a structured response.

func (*Gemini) Chat

func (g *Gemini) Chat(ctx context.Context, prompt string) (string, error)

Chat sends a message and returns the response content, handling tool calls internally.

func (*Gemini) Restore added in v1.9.2

func (g *Gemini) Restore(snapshot *Snapshot) error

Restore replaces the current conversation state with a previously saved snapshot.

func (*Gemini) Save added in v1.9.2

func (g *Gemini) Save() (*Snapshot, error)

Save captures the current Gemini conversation state as a snapshot.

func (*Gemini) SetTools

func (g *Gemini) SetTools(tools []Tool) error

SetTools configures the tools available to the AI.

func (*Gemini) StreamChat added in v1.8.0

func (g *Gemini) StreamChat(ctx context.Context, prompt string, callback StreamCallback) (string, error)

StreamChat implements StreamingChatClient.

type OpenAI

type OpenAI struct {
	// contains filtered or unexported fields
}

OpenAI implements the ChatClient interface for interacting with OpenAI's API and any OpenAI-compatible API endpoint.

func (*OpenAI) Add

func (a *OpenAI) Add(_ context.Context, prompt string) error

Add appends a new user message to the chat session.

func (*OpenAI) Ask

func (a *OpenAI) Ask(ctx context.Context, question string, target any) error

Ask sends a question to the OpenAI chat client and expects a structured response which is unmarshalled into the target interface.

func (*OpenAI) Chat

func (a *OpenAI) Chat(ctx context.Context, prompt string) (string, error)

Chat sends a message and returns the response content. It handles tool calls internally.

func (*OpenAI) Restore added in v1.9.2

func (a *OpenAI) Restore(snapshot *Snapshot) error

Restore replaces the current conversation state with a previously saved snapshot.

func (*OpenAI) Save added in v1.9.2

func (a *OpenAI) Save() (*Snapshot, error)

Save captures the current OpenAI conversation state as a snapshot.

func (*OpenAI) SetTools

func (a *OpenAI) SetTools(tools []Tool) error

SetTools configures the tools available to the AI.

func (*OpenAI) StreamChat added in v1.8.0

func (a *OpenAI) StreamChat(ctx context.Context, prompt string, callback StreamCallback) (string, error)

StreamChat implements StreamingChatClient.

type PersistentChatClient added in v1.9.2

type PersistentChatClient interface {
	ChatClient
	// Save captures the current conversation state as an immutable snapshot.
	// The snapshot includes provider-specific messages as opaque JSON, tool
	// metadata (without handlers), and configuration. Tokens are never saved.
	Save() (*Snapshot, error)
	// Restore replaces the current conversation state with a previously saved
	// snapshot. The snapshot's Provider must match the client's provider.
	// After restore, tools must be re-registered via SetTools with live handlers.
	Restore(snapshot *Snapshot) error
}

PersistentChatClient extends ChatClient with the ability to save and restore conversation state. Discover via type assertion (same pattern as StreamingChatClient):

if pc, ok := client.(chat.PersistentChatClient); ok {
    snapshot, err := pc.Save()
}

ClaudeLocal does not implement this interface — it delegates to an external subprocess and has no internal message state to persist.

Example
package main

import (
	"context"
)

func main() {
	// Discover persistence support via type assertion:
	//
	//   client, _ := chat.New(ctx, props, cfg)
	//   if pc, ok := client.(chat.PersistentChatClient); ok {
	//       snapshot, _ := pc.Save()
	//       // ... store snapshot ...
	//       pc.Restore(snapshot)
	//   }
	//
	// ClaudeLocal does not implement PersistentChatClient.

	_ = context.Background()
}

type Provider

type Provider string

Provider defines the AI service provider.

const (
	// ProviderOpenAI uses OpenAI's API.
	ProviderOpenAI Provider = "openai"
	// ProviderOpenAICompatible uses any OpenAI-compatible API endpoint (e.g. Ollama, Groq).
	ProviderOpenAICompatible Provider = "openai-compatible"
	// ProviderClaude uses Anthropic's Claude API.
	ProviderClaude Provider = "claude"
	// ProviderClaudeLocal uses a locally installed claude CLI binary.
	ProviderClaudeLocal Provider = "claude-local"
	// ProviderGemini uses Google's Gemini API.
	ProviderGemini Provider = "gemini"
)

type ProviderFactory

type ProviderFactory func(ctx context.Context, p *props.Props, cfg Config) (ChatClient, error)

ProviderFactory creates a ChatClient for a named provider. Register implementations via RegisterProvider in an init() function to allow external packages to add providers without modifying this file.

type Snapshot added in v1.9.2

type Snapshot struct {
	// ID uniquely identifies this snapshot.
	ID string `json:"id"`
	// Provider identifies which chat provider created this snapshot.
	Provider Provider `json:"provider"`
	// Model is the AI model used in the conversation.
	Model string `json:"model"`
	// SystemPrompt is the system instruction active at snapshot time.
	SystemPrompt string `json:"system_prompt,omitempty"`
	// Messages contains provider-specific message history as opaque JSON.
	// The format varies by provider — do not parse or modify directly.
	Messages json.RawMessage `json:"messages"`
	// Tools captures tool metadata (name, description, parameters) without
	// handlers. After restoring, call SetTools to re-register live handlers.
	Tools []ToolSnapshot `json:"tools,omitempty"`
	// Metadata holds arbitrary key-value pairs for consumer use.
	Metadata map[string]string `json:"metadata,omitempty"`
	// CreatedAt is when this snapshot was taken.
	CreatedAt time.Time `json:"created_at"`
	// Version is the snapshot format version for forward compatibility.
	Version int `json:"version"`
}

Snapshot is an immutable point-in-time capture of a conversation.

func NewSnapshot added in v1.9.2

func NewSnapshot(provider Provider, model, systemPrompt string, messages json.RawMessage, tools map[string]Tool, metadata map[string]string) *Snapshot

NewSnapshot creates a Snapshot with a new UUID and the current timestamp.

Example
package main

import (
	"encoding/json"

	"github.com/phpboyscout/go-tool-base/pkg/chat"
)

func main() {
	snap := chat.NewSnapshot(
		chat.ProviderClaude,
		"claude-3-5-sonnet",
		"You are a helpful assistant.",
		json.RawMessage(`[{"role":"user","content":"hello"}]`),
		nil,
		map[string]string{"session": "demo"},
	)

	_ = snap.ID        // UUID
	_ = snap.CreatedAt // timestamp
}

type SnapshotSummary added in v1.9.2

type SnapshotSummary struct {
	ID           string    `json:"id"`
	Provider     Provider  `json:"provider"`
	Model        string    `json:"model"`
	CreatedAt    time.Time `json:"created_at"`
	MessageCount int       `json:"message_count"`
}

SnapshotSummary is a lightweight view of a snapshot for listing without loading the full message history.

type StreamCallback added in v1.8.0

type StreamCallback func(event StreamEvent) error

StreamCallback receives streaming events. Return a non-nil error to cancel the stream.
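A sketch of a callback that prints text fragments as they arrive, notes completed tool calls, and cancels the stream on error:

```go
package main

import (
	"fmt"

	"github.com/phpboyscout/go-tool-base/pkg/chat"
)

func main() {
	var cb chat.StreamCallback = func(event chat.StreamEvent) error {
		switch event.Type {
		case chat.EventTextDelta:
			fmt.Print(event.Delta) // partial text fragment
		case chat.EventToolCallEnd:
			fmt.Printf("\n[tool %s finished]\n", event.ToolCall.Name)
		case chat.EventError:
			return event.Error // returning non-nil cancels the stream
		}
		return nil
	}
	_ = cb // pass cb to StreamChat on a StreamingChatClient
}
```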

type StreamEvent added in v1.8.0

type StreamEvent struct {
	// Type indicates the kind of event.
	Type StreamEventType

	// Delta contains the text fragment for EventTextDelta events.
	Delta string

	// ToolCall contains tool call information for EventToolCallStart/EventToolCallEnd events.
	ToolCall *StreamToolCall

	// Error contains error information for EventError events.
	Error error
}

StreamEvent represents a single event in a streaming response.

type StreamEventType added in v1.8.0

type StreamEventType int

StreamEventType identifies the kind of stream event.

const (
	// EventTextDelta is a partial text response fragment.
	EventTextDelta StreamEventType = iota
	// EventToolCallStart indicates a tool call has begun execution.
	EventToolCallStart
	// EventToolCallEnd indicates a tool call has completed execution.
	EventToolCallEnd
	// EventComplete indicates the stream has finished successfully.
	EventComplete
	// EventError indicates an error occurred during streaming.
	EventError
)

type StreamToolCall added in v1.8.0

type StreamToolCall struct {
	// ID is the provider-assigned identifier for the tool call.
	ID string
	// Name is the tool name.
	Name string
	// Arguments is the complete JSON argument payload (only populated on EventToolCallEnd).
	Arguments string
	// Result is the tool execution result (only populated on EventToolCallEnd).
	Result string
}

StreamToolCall contains information about a tool call within a stream.

type StreamingChatClient added in v1.8.0

type StreamingChatClient interface {
	ChatClient

	// StreamChat sends a message and streams the response via callback.
	// The callback is invoked for each event in the stream. If the callback
	// returns a non-nil error, the stream is cancelled and that error is returned.
	// The return value is the complete assembled response text (concatenation of
	// all EventTextDelta fragments) or an error if streaming failed.
	// Tool calls are handled internally via the same ReAct loop as Chat(). If
	// Config.ParallelTools is enabled, multiple tool calls are executed concurrently.
	StreamChat(ctx context.Context, prompt string, callback StreamCallback) (string, error)
}

StreamingChatClient extends ChatClient with streaming support. Implementations that support streaming implement this interface in addition to ChatClient. Discover support via type assertion:

if streamer, ok := client.(chat.StreamingChatClient); ok {
    result, err := streamer.StreamChat(ctx, "prompt", callback)
}

type Tool

type Tool struct {
	Name        string                                                       `json:"name"`
	Description string                                                       `json:"description"`
	Parameters  *jsonschema.Schema                                           `json:"parameters"`
	Handler     func(ctx context.Context, args json.RawMessage) (any, error) `json:"-"`
}

Tool represents a function that the AI can call.

type ToolCall added in v1.8.0

type ToolCall struct {
	Name  string
	Input json.RawMessage
}

ToolCall represents a single tool invocation request.

type ToolResult added in v1.8.0

type ToolResult struct {
	Name   string
	Result string
}

ToolResult holds the result of a single tool execution.

type ToolSnapshot added in v1.9.2

type ToolSnapshot struct {
	Name        string             `json:"name"`
	Description string             `json:"description"`
	Parameters  *jsonschema.Schema `json:"parameters,omitempty"`
}

ToolSnapshot captures tool metadata without the handler function.
