Documentation ¶
Overview ¶
Package openai provides an OpenAI provider implementation for the LLM client interface.
The openai package implements the llm.ProviderClient interface using the official openai-go SDK. It supports chat completions, streaming responses, and tool calling capabilities with OpenAI's API.
Usage ¶
Create a new OpenAI provider with configuration options:
import (
	"context"
	"io"
	"log/slog"

	"codeberg.org/MadsRC/aigent/pkg/llm"
	"codeberg.org/MadsRC/aigent/pkg/llm/providers/openai"
)

// Create a new provider with an API key
provider := openai.New(
	openai.WithAPIKey("your-api-key"),
	openai.WithLogger(slog.Default()),
)

// Create a chat request
request := llm.ChatRequest{
	ModelID: "gpt-4o",
	Messages: []llm.Message{
		{
			Role:    llm.RoleUser,
			Content: "Hello, how are you?",
		},
	},
}

// Get a response
response, err := provider.Chat(context.Background(), request)
if err != nil {
	// Handle error
}

// Or use streaming for real-time responses
stream, err := provider.ChatStream(context.Background(), request)
if err != nil {
	// Handle error
}
defer stream.Close()

for {
	chunk, err := stream.Next()
	if err == io.EOF {
		break
	}
	if err != nil {
		// Handle error
	}
	// Process chunk...
}
Configuration Options ¶
The provider supports several configuration options:
- WithAPIKey: Sets the OpenAI API key
- WithBaseURL: Sets a custom base URL for API requests
- WithLogger: Sets a custom logger for the provider
- WithTools: Sets available tools for the provider to reference
- WithOrganization: Sets the organization ID for the provider
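The options can be combined in a single call to New. A minimal sketch of a fuller configuration; the environment variable names are illustrative choices, not defined by this package:

import (
	"log/slog"
	"os"

	"codeberg.org/MadsRC/aigent/pkg/llm/providers/openai"
)

// Combine several options; any of them may be omitted.
provider := openai.New(
	openai.WithAPIKey(os.Getenv("OPENAI_API_KEY")),      // illustrative env var name
	openai.WithOrganization(os.Getenv("OPENAI_ORG_ID")), // illustrative env var name
	openai.WithLogger(slog.Default()),
)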
Tool Calling Support ¶
The provider fully supports OpenAI's tool calling capabilities:
// Define tools
tools := []llm.Tool{
	{
		Function: llm.FunctionDefinition{
			Name:        "get_weather",
			Description: "Get the current weather for a location",
			Parameters: map[string]interface{}{
				"type": "object",
				"properties": map[string]interface{}{
					"location": map[string]interface{}{
						"type":        "string",
						"description": "The city and state, e.g. San Francisco, CA",
					},
				},
				"required": []string{"location"},
			},
		},
	},
}

// Create provider with tools
provider := openai.New(
	openai.WithAPIKey("your-api-key"),
	openai.WithTools(tools),
)
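When the model chooses to call a tool, the chat response carries the function name and its JSON-encoded arguments. The sketch below, which reuses the request from the Usage section, shows one way to dispatch on such a call; the ToolCalls, Name, and Arguments field names are assumptions for illustration (this package does not document the response shape), and encoding/json is required:

// Dispatch on a tool call in the response. The ToolCalls, Name, and
// Arguments fields are hypothetical names used for illustration; consult
// the llm package for the actual response type.
response, err := provider.Chat(context.Background(), request)
if err != nil {
	// Handle error
}
for _, call := range response.ToolCalls { // hypothetical field
	if call.Name != "get_weather" { // hypothetical field
		continue
	}
	var args struct {
		Location string `json:"location"`
	}
	if err := json.Unmarshal([]byte(call.Arguments), &args); err != nil { // hypothetical field
		// Handle malformed arguments
		continue
	}
	// Fetch the weather for args.Location and return the result to the
	// model in a follow-up request.
}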
Model Capabilities ¶
The provider automatically detects capabilities of different OpenAI models:
- Streaming support
- JSON mode support
- Function/tool calling support
- Vision capabilities (for models like gpt-4-vision-preview, gpt-4o)
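Capability information can guide how a request is built. A hedged sketch using GetModel; the SupportsStreaming and SupportsTools field names on llm.Model are illustrative assumptions, not documented here:

// Inspect a model's detected capabilities before sending a request.
// SupportsStreaming and SupportsTools are hypothetical field names.
model, err := provider.GetModel(context.Background(), "gpt-4o")
if err != nil {
	// Handle error
}
if model.SupportsStreaming { // hypothetical field
	// Prefer ChatStream for incremental output.
}
if model.SupportsTools { // hypothetical field
	// Attach tools to the request.
}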
Streaming Implementation ¶
The package includes a robust streaming implementation that handles:
- Incremental content delivery
- Tool call assembly across multiple chunks
- JSON validation for tool arguments
- Token usage tracking
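In practice, the streamed chunks can simply be concatenated to recover the full reply. A sketch, reusing the request from the Usage section; the chunk's Content field is an assumed name, and the io, strings, and fmt packages are needed:

// Collect streamed text into a single string.
var sb strings.Builder
stream, err := provider.ChatStream(context.Background(), request)
if err != nil {
	// Handle error
}
defer stream.Close()
for {
	chunk, err := stream.Next()
	if err == io.EOF {
		break
	}
	if err != nil {
		// Handle error
		break
	}
	sb.WriteString(chunk.Content) // hypothetical field holding the text delta
}
fmt.Println(sb.String())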
Index ¶
- type Provider
- func New(opts ...ProviderOption) *Provider
- func (p *Provider) Chat(ctx context.Context, req llm.ChatRequest) (*llm.ChatResponse, error)
- func (p *Provider) ChatStream(ctx context.Context, req llm.ChatRequest) (llm.ChatResponseStream, error)
- func (p *Provider) GetModel(ctx context.Context, modelID string) (*llm.Model, error)
- func (p *Provider) ID() string
- func (p *Provider) ListModels(ctx context.Context) ([]llm.Model, error)
- func (p *Provider) Name() string
- type ProviderOption
- func WithAPIKey(apiKey string) ProviderOption
- func WithBaseURL(baseURL string) ProviderOption
- func WithLogger(logger *slog.Logger) ProviderOption
- func WithOrganization(organization string) ProviderOption
- func WithTools(tools []llm.Tool) ProviderOption
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Provider ¶
type Provider struct {
// contains filtered or unexported fields
}
Provider implements the llm.ProviderClient interface for OpenAI.
func New ¶
func New(opts ...ProviderOption) *Provider
New creates a new OpenAI provider with the given options.
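A minimal construction sketch; reading the key from an environment variable is an illustrative choice, not behavior of this package:

// Construct a provider with just an API key.
provider := openai.New(
	openai.WithAPIKey(os.Getenv("OPENAI_API_KEY")), // illustrative env var name
)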
func (*Provider) Chat ¶
func (p *Provider) Chat(ctx context.Context, req llm.ChatRequest) (*llm.ChatResponse, error)
Chat sends a chat completion request to the provider.
func (*Provider) ChatStream ¶
func (p *Provider) ChatStream(ctx context.Context, req llm.ChatRequest) (llm.ChatResponseStream, error)
ChatStream sends a streaming chat completion request.
func (*Provider) ListModels ¶
func (p *Provider) ListModels(ctx context.Context) ([]llm.Model, error)
ListModels returns all available models from this provider.
type ProviderOption ¶
type ProviderOption func(*Provider)
ProviderOption configures the OpenAI provider.
func WithAPIKey ¶
func WithAPIKey(apiKey string) ProviderOption
WithAPIKey sets the API key for the provider.
func WithBaseURL ¶
func WithBaseURL(baseURL string) ProviderOption
WithBaseURL sets the base URL for the provider.
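This is typically used to point the provider at an OpenAI-compatible endpoint, such as a proxy or gateway. A sketch with a placeholder URL:

// Route requests through an OpenAI-compatible endpoint.
provider := openai.New(
	openai.WithAPIKey("your-api-key"),
	openai.WithBaseURL("http://localhost:8080/v1"), // placeholder URL, replace with your endpoint
)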
func WithLogger ¶
func WithLogger(logger *slog.Logger) ProviderOption
WithLogger sets the logger for the provider.
func WithOrganization ¶
func WithOrganization(organization string) ProviderOption
WithOrganization sets the organization ID for the provider.
func WithTools ¶
func WithTools(tools []llm.Tool) ProviderOption
WithTools sets the available tools for the provider to reference.