chat

package
v1.4.2 Latest
Published: Mar 23, 2026 License: MIT Imports: 18 Imported by: 0

README

Chat

Multi-provider AI client for building intelligent features in GTB applications.

Supports:

  • OpenAI
  • Claude (Anthropic)
  • Gemini (Google)

For detailed documentation, usage examples, and configuration guides, see the Chat Component Documentation.

Documentation

Overview

Package chat provides a unified multi-provider AI chat client supporting Claude, OpenAI, Gemini, Claude Local (via CLI binary), and OpenAI-compatible endpoints.

The ChatClient interface exposes four core methods: Add (append a message), Chat (multi-turn conversation with tool use via a ReAct loop), Ask (structured output with JSON schema validation), and SetTools (register callable tools).

Tool calling follows a JSON Schema parameter definition, and the ReAct loop automatically dispatches tool calls and feeds results back until the model produces a final text response. Per-provider token limits and maximum agent steps are configurable via Config.

New providers can be registered at runtime via RegisterProvider. Structured output helpers such as GenerateSchema simplify schema generation for Ask calls.
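As a sketch of the flow described above, constructing a client and running a single Chat turn might look like the following. The import path, the nil *props.Props argument, and the hard-coded token are placeholders for illustration; in practice the token comes from config or the ANTHROPIC_API_KEY environment variable.

```go
package main

import (
	"context"
	"fmt"
	"log"

	"example.com/gtb/chat" // illustrative import path; use your module's actual path
)

func main() {
	ctx := context.Background()

	// New selects the provider from cfg; nil props is shown only for brevity.
	client, err := chat.New(ctx, nil, chat.Config{
		Provider: chat.ProviderClaude,
		Token:    "sk-ant-...", // placeholder; normally sourced from config/env
	})
	if err != nil {
		log.Fatal(err)
	}

	reply, err := client.Chat(ctx, "Summarize the last deployment log.")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(reply)
}
```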

Index

Constants

const (
	// ConfigKeyAIProvider is the config key for the AI provider.
	ConfigKeyAIProvider = "ai.provider"
	// ConfigKeyOpenAIKey is the config key for the OpenAI API key.
	ConfigKeyOpenAIKey = "openai.api.key"
	// ConfigKeyClaudeKey is the config key for the Claude/Anthropic API key.
	ConfigKeyClaudeKey = "anthropic.api.key"
	// ConfigKeyGeminiKey is the config key for the Gemini API key.
	ConfigKeyGeminiKey = "gemini.api.key"
)
const (
	// EnvAIProvider is the environment variable for overriding the AI provider.
	EnvAIProvider = "AI_PROVIDER"
	// EnvOpenAIKey is the environment variable for overriding the OpenAI API key.
	EnvOpenAIKey = "OPENAI_API_KEY"
	// EnvClaudeKey is the environment variable for overriding the Claude API key.
	EnvClaudeKey = "ANTHROPIC_API_KEY"
	// EnvGeminiKey is the environment variable for overriding the Gemini API key.
	EnvGeminiKey = "GEMINI_API_KEY"
)
const (
	// DefaultModelGemini is the default model for the Gemini provider.
	DefaultModelGemini = "gemini-3-flash-preview"

	// DefaultModelClaude is the default model for the Claude provider.
	DefaultModelClaude = "claude-sonnet-4-5"

	// DefaultModelOpenAI is the default model for the OpenAI provider.
	DefaultModelOpenAI = openai.ChatModelGPT5

	// DefaultMaxSteps is the default maximum number of ReAct loop iterations.
	DefaultMaxSteps = 20

	// DefaultMaxTokensOpenAI is the default maximum tokens per response for OpenAI.
	DefaultMaxTokensOpenAI = 4096

	// DefaultMaxTokensClaude is the default maximum tokens per response for Claude.
	DefaultMaxTokensClaude = 8192

	// DefaultMaxTokensGemini is the default maximum tokens per response for Gemini.
	DefaultMaxTokensGemini = 8192
)

Variables

This section is empty.

Functions

func GenerateSchema

func GenerateSchema[T any]() any

GenerateSchema creates a JSON schema for a given type T. OpenAI's structured outputs feature uses a subset of JSON schema. The reflector is configured with flags to ensure the generated schema complies with this specific subset.
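A typical pairing of GenerateSchema with Ask might look like this sketch. The ErrorAnalysis type and its fields are hypothetical; any struct with JSON tags works as the type parameter.

```go
// ErrorAnalysis is a hypothetical target type for structured output.
type ErrorAnalysis struct {
	Summary  string   `json:"summary"`
	Severity string   `json:"severity"`
	Causes   []string `json:"causes"`
}

func analyze(ctx context.Context, p *props.Props, trace string) (ErrorAnalysis, error) {
	// The generated schema is passed via Config.ResponseSchema at
	// construction time, then Ask unmarshals into the same type.
	client, err := chat.New(ctx, p, chat.Config{
		Provider:          chat.ProviderOpenAI,
		ResponseSchema:    chat.GenerateSchema[ErrorAnalysis](),
		SchemaName:        "error_analysis",
		SchemaDescription: "Structured analysis of an application error",
	})
	if err != nil {
		return ErrorAnalysis{}, err
	}

	var result ErrorAnalysis
	err = client.Ask(ctx, "Analyze this stack trace: "+trace, &result)
	return result, err
}
```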

func RegisterProvider

func RegisterProvider(name Provider, factory ProviderFactory)

RegisterProvider registers a factory function for a provider name. Call this from an init() function in your provider file or external package.
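A registration sketch for a hypothetical external provider (the Mistral type, the "mistral" name, and the import paths are all illustrative; the factory signature follows ProviderFactory as documented below):

```go
package myprovider

import (
	"context"

	"example.com/gtb/chat"  // illustrative import path
	"example.com/gtb/props" // illustrative import path
)

// Mistral is a hypothetical type that implements chat.ChatClient.
type Mistral struct {
	// ... provider state ...
}

// ... ChatClient method implementations omitted for brevity ...

func init() {
	// Registers the factory so chat.New can construct this provider
	// when Config.Provider == "mistral".
	chat.RegisterProvider("mistral", func(ctx context.Context, p *props.Props, cfg chat.Config) (chat.ChatClient, error) {
		return &Mistral{}, nil
	})
}
```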

Types

type ChatClient

type ChatClient interface {
	// Add appends a user message to the conversation history without
	// triggering a completion. The message persists for subsequent
	// Chat() or Ask() calls.
	Add(ctx context.Context, prompt string) error
	// Ask sends a question and unmarshals the structured response into
	// target. If Config.ResponseSchema was set during construction, the
	// provider enforces that schema. If no schema is set, the provider
	// returns the raw text content unmarshalled into target (which must
	// be a *string or implement json.Unmarshaler).
	Ask(ctx context.Context, question string, target any) error
	// SetTools configures the tools available to the AI. This replaces
	// (not appends to) any previously set tools.
	SetTools(tools []Tool) error
	// Chat sends a message and returns the response content. If tools
	// are configured, the provider handles tool calls internally via a
	// ReAct loop bounded by Config.MaxSteps (default 20).
	Chat(ctx context.Context, prompt string) (string, error)
}

ChatClient defines the interface for interacting with a chat service.

Implementations are NOT safe for concurrent use by multiple goroutines. Each goroutine should use its own ChatClient instance.

Message history from Add() calls persists across Chat() and Ask() calls within the same client instance. To start a fresh conversation, create a new client via chat.New().

func New

func New(ctx context.Context, p *props.Props, cfg Config) (ChatClient, error)

New creates a ChatClient for the configured provider.

type Claude

type Claude struct {
	// contains filtered or unexported fields
}

Claude implements the ChatClient interface using Anthropic's official Go SDK.

func (*Claude) Add

func (c *Claude) Add(_ context.Context, prompt string) error

Add appends a new user message to the chat session.

func (*Claude) Ask

func (c *Claude) Ask(ctx context.Context, question string, target any) error

Ask sends a question to the Claude chat client and expects a structured response.

func (*Claude) Chat

func (c *Claude) Chat(ctx context.Context, prompt string) (string, error)

Chat sends a message and returns the response content.

func (*Claude) SetTools

func (c *Claude) SetTools(tools []Tool) error

SetTools configures the tools available to the AI.

type ClaudeLocal

type ClaudeLocal struct {
	// contains filtered or unexported fields
}

ClaudeLocal implements the ChatClient interface using a locally installed claude CLI binary. This provider is useful in environments where direct API access to api.anthropic.com is blocked but the pre-authenticated claude binary is permitted.

func (*ClaudeLocal) Add

func (c *ClaudeLocal) Add(_ context.Context, prompt string) error

Add buffers a user message to be prepended to the next Chat or Ask call.

func (*ClaudeLocal) Ask

func (c *ClaudeLocal) Ask(ctx context.Context, question string, target any) error

Ask sends a question to the local claude binary and unmarshals the structured response into the target using --json-schema for schema-enforced output.

func (*ClaudeLocal) Chat

func (c *ClaudeLocal) Chat(ctx context.Context, prompt string) (string, error)

Chat sends a message to the local claude binary and returns the text response.

func (*ClaudeLocal) SetTools

func (c *ClaudeLocal) SetTools(_ []Tool) error

SetTools is not supported in Phase 1 of ProviderClaudeLocal. Tool integration via MCP server is planned for a future release.

type Config

type Config struct {
	// Provider is the AI service provider to use.
	Provider Provider
	// Model is the specific model to use (e.g., "gpt-4o", "claude-3-5-sonnet").
	Model string
	// Token is the API key or token for the service.
	Token string
	// BaseURL overrides the API endpoint. Required when using ProviderOpenAICompatible.
	// Example: "http://localhost:11434/v1" for Ollama, "https://api.groq.com/openai/v1" for Groq.
	BaseURL string
	// SystemPrompt is the initial system prompt to set the context for the AI.
	SystemPrompt string
	// ResponseSchema is the JSON schema used to force a structured output from the AI.
	ResponseSchema any
	// SchemaName is the name of the response schema (e.g., "error_analysis").
	SchemaName string
	// SchemaDescription is a description of the response schema.
	SchemaDescription string
	// MaxSteps limits the number of ReAct loop iterations in Chat().
	// Zero means use the default (DefaultMaxSteps = 20).
	MaxSteps int
	// MaxTokens sets the maximum tokens per response.
	// Zero means use the provider default (OpenAI: 4096, Claude: 8192, Gemini: 8192).
	MaxTokens int
}

Config holds configuration for a chat client.
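For example, pointing the OpenAI-compatible provider at a local Ollama server might look like this fragment. The model name, prompt, and limit values are examples, not defaults of this package:

```go
cfg := chat.Config{
	Provider:     chat.ProviderOpenAICompatible,
	BaseURL:      "http://localhost:11434/v1", // Ollama's OpenAI-compatible endpoint
	Model:        "llama3.1",                  // example model name
	SystemPrompt: "You are a concise assistant.",
	MaxSteps:     10,   // cap the ReAct loop below the default of 20
	MaxTokens:    2048, // override the provider default
}
```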

type Gemini

type Gemini struct {
	// contains filtered or unexported fields
}

Gemini implements the ChatClient interface using Google's Generative AI SDK.

func (*Gemini) Add

func (g *Gemini) Add(_ context.Context, prompt string) error

Add appends a user message to the conversation history.

func (*Gemini) Ask

func (g *Gemini) Ask(ctx context.Context, question string, target any) error

Ask sends a question to the Gemini chat client and expects a structured response.

func (*Gemini) Chat

func (g *Gemini) Chat(ctx context.Context, prompt string) (string, error)

Chat sends a message and returns the response content, handling tool calls internally.

func (*Gemini) SetTools

func (g *Gemini) SetTools(tools []Tool) error

SetTools configures the tools available to the AI.

type OpenAI

type OpenAI struct {
	// contains filtered or unexported fields
}

OpenAI implements the ChatClient interface for interacting with OpenAI's API and any OpenAI-compatible API endpoint.

func (*OpenAI) Add

func (a *OpenAI) Add(_ context.Context, prompt string) error

Add appends a new user message to the chat session.

func (*OpenAI) Ask

func (a *OpenAI) Ask(ctx context.Context, question string, target any) error

Ask sends a question to the OpenAI chat client and expects a structured response which is unmarshalled into the target interface.

func (*OpenAI) Chat

func (a *OpenAI) Chat(ctx context.Context, prompt string) (string, error)

Chat sends a message and returns the response content. It handles tool calls internally.

func (*OpenAI) SetTools

func (a *OpenAI) SetTools(tools []Tool) error

SetTools configures the tools available to the AI.

type Provider

type Provider string

Provider defines the AI service provider.

const (
	// ProviderOpenAI uses OpenAI's API.
	ProviderOpenAI Provider = "openai"
	// ProviderOpenAICompatible uses any OpenAI-compatible API endpoint (e.g. Ollama, Groq).
	ProviderOpenAICompatible Provider = "openai-compatible"
	// ProviderClaude uses Anthropic's Claude API.
	ProviderClaude Provider = "claude"
	// ProviderClaudeLocal uses a locally installed claude CLI binary.
	ProviderClaudeLocal Provider = "claude-local"
	// ProviderGemini uses Google's Gemini API.
	ProviderGemini Provider = "gemini"
)

type ProviderFactory

type ProviderFactory func(ctx context.Context, p *props.Props, cfg Config) (ChatClient, error)

ProviderFactory creates a ChatClient for a named provider. Register implementations via RegisterProvider in an init() function to allow external packages to add providers without modifying this file.

type Tool

type Tool struct {
	Name        string                                                       `json:"name"`
	Description string                                                       `json:"description"`
	Parameters  *jsonschema.Schema                                           `json:"parameters"`
	Handler     func(ctx context.Context, args json.RawMessage) (any, error) `json:"-"`
}

Tool represents a function that the AI can call.
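A sketch of defining and registering a single tool. The tool name, parameter struct, and handler body are hypothetical; the Parameters schema is assumed to be built with the same jsonschema package used by the Tool type (e.g. via its Reflect helper):

```go
// Reflect a parameter struct into the schema the model sees.
params := jsonschema.Reflect(struct {
	City string `json:"city"`
}{})

weatherTool := chat.Tool{
	Name:        "get_weather",
	Description: "Return the current weather for a city.",
	Parameters:  params,
	Handler: func(ctx context.Context, args json.RawMessage) (any, error) {
		var in struct {
			City string `json:"city"`
		}
		if err := json.Unmarshal(args, &in); err != nil {
			return nil, err
		}
		// Hypothetical result; a real handler would call a weather API.
		return map[string]string{"city": in.City, "conditions": "sunny"}, nil
	},
}

// SetTools replaces (not appends to) any previously registered tools.
if err := client.SetTools([]chat.Tool{weatherTool}); err != nil {
	log.Fatal(err)
}
```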
