Documentation
Index
Constants
This section is empty.
Variables
This section is empty.
Functions
func SetLogLevel

func SetLogLevel(level slog.Level)
SetLogLevel sets the log level for the entire go-agent library. This affects all LLM providers (OpenAI, Claude, Gemini) and is global to the process.
Log Level Guide:
- slog.LevelError: Only errors (minimal logging; a sensible production default for quiet systems)
- slog.LevelWarn: Warnings and errors (the default; surfaces issues without overwhelming output)
- slog.LevelInfo: Informational messages (high-level operations such as "using OpenAI client")
- slog.LevelDebug: Verbose debugging (every stream event, tool call, and token update)
Example - Enable debug logging for development:
import "log/slog"

llm.SetLogLevel(slog.LevelDebug)
Example - Quiet production logging:
llm.SetLogLevel(slog.LevelError)
This can also be controlled via the GO_AGENT_DEBUG environment variable:
GO_AGENT_DEBUG=0  # Error level
GO_AGENT_DEBUG=1  # Warn level (default)
GO_AGENT_DEBUG=2  # Info level
GO_AGENT_DEBUG=3  # Debug level
Note: This is a global setting. You cannot set different log levels for different providers or clients. All logging in the process shares the same level.
Types
type Config
type Config struct {
Model string
Provider string
APIKey string
BaseURL string // Optional base URL override for the API endpoint
Headers map[string]string // Optional custom HTTP headers
Temperature float64
MaxTokens int
SystemPrompt string
// LogLevel sets the library-wide log level (affects all providers).
// Values: -1=don't change (default), 0=Error, 1=Warn, 2=Info, 3=Debug
// Note: This is a global setting that affects all LLM providers in the process.
LogLevel int
}
Config holds the LLM client configuration.
type ModelProvider
type ModelProvider int
ModelProvider represents the different LLM providers
const (
	ProviderOpenAI ModelProvider = iota
	ProviderClaude
	ProviderGemini
	ProviderOllama
	ProviderUnknown
)
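Since Config.Provider is a string while ModelProvider is an int enum, a lookup between the two is often useful. A minimal sketch (the type and constants are redeclared so the example compiles standalone, and parseProvider is a hypothetical helper, not part of the package API):

```go
package main

import (
	"fmt"
	"strings"
)

// ModelProvider mirrors the enum documented above.
type ModelProvider int

const (
	ProviderOpenAI ModelProvider = iota
	ProviderClaude
	ProviderGemini
	ProviderOllama
	ProviderUnknown
)

// parseProvider maps a provider string (as used in Config.Provider)
// to a ModelProvider value, case-insensitively.
func parseProvider(s string) ModelProvider {
	switch strings.ToLower(s) {
	case "openai":
		return ProviderOpenAI
	case "claude":
		return ProviderClaude
	case "gemini":
		return ProviderGemini
	case "ollama":
		return ProviderOllama
	default:
		return ProviderUnknown
	}
}

func main() {
	fmt.Println(parseProvider("Claude") == ProviderClaude) // true
}
```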