Documentation ¶
Overview ¶
Package openai implements LLMProvider for OpenAI-compatible APIs. Importing this package (even with a blank import) registers the "openai" provider in the global registry.
The provider reads OPENAI_API_KEY from the environment. Any OpenAI-compatible endpoint can be targeted by setting OPENAI_BASE_URL (e.g. a local Ollama instance or Azure OpenAI).
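The blank-import registration described above relies on the standard Go pattern of a package `init` function writing into a global registry. A minimal self-contained sketch of that pattern (the `LLMProvider` interface is real, but this registry, the `Register` helper, and `openaiProvider` are simplified stand-ins, not this package's actual internals):

```go
package main

import "fmt"

// LLMProvider is a stand-in for the providers.LLMProvider interface.
type LLMProvider interface {
	Name() string
}

// registry maps provider names to implementations, mirroring the
// global registry that the openai package registers into.
var registry = map[string]LLMProvider{}

// Register adds a provider under a name; called from a package init.
func Register(name string, p LLMProvider) { registry[name] = p }

type openaiProvider struct{}

func (openaiProvider) Name() string { return "openai" }

// init runs when the package is imported, so a blank import alone
// (import _ "path/to/openai") is enough to make the provider visible.
func init() { Register("openai", openaiProvider{}) }

func main() {
	p, ok := registry["openai"]
	fmt.Println(ok, p.Name())
}
```

In the real package the `init` side effect is the whole point of the blank import: consumers never reference the package's identifiers, they only need its registration to have run before looking up "openai" in the registry.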
Index ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Provider ¶
type Provider struct{}
Provider implements providers.LLMProvider using the OpenAI Chat Completions API.
func (*Provider) RunTask ¶
func (p *Provider) RunTask(
	ctx context.Context,
	cfg *config.Config,
	task queue.Task,
	tools []mcp.Tool,
	callTool func(context.Context, string, json.RawMessage) (string, error),
	chunkFn func(string),
) (string, queue.TokenUsage, error)
RunTask executes a task through the OpenAI agentic tool-use loop: it repeatedly calls the API, executing requested tool calls via callTool, until the model responds without further tool requests. Token usage is accumulated across all API calls and returned alongside the result. If chunkFn is non-nil, the final text turn is streamed to it token by token.
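The loop RunTask implements can be sketched in miniature. Everything below is a simplified stand-in (the `turn`/`toolCall` types, `runLoop`, and the flattened string history are illustrative, not the package's real signatures), but the control flow matches the description: call the model, run any requested tools, feed results back, and stop at the first turn with no tool calls, summing token usage along the way.

```go
package main

import "fmt"

// toolCall and turn are simplified stand-ins for the OpenAI chat types.
type toolCall struct{ name, args string }

type turn struct {
	text      string
	toolCalls []toolCall
	tokens    int
}

// runLoop keeps calling the model until it stops requesting tool calls,
// appending each tool result to the history and accumulating tokens.
func runLoop(model func(history []string) turn,
	callTool func(name, args string) string) (string, int) {

	var history []string
	total := 0
	for {
		t := model(history)
		total += t.tokens
		if len(t.toolCalls) == 0 {
			// Final text turn: no more tool calls, so we are done.
			return t.text, total
		}
		for _, tc := range t.toolCalls {
			history = append(history, callTool(tc.name, tc.args))
		}
	}
}

func main() {
	// Scripted model: one tool-calling turn, then a final text turn.
	calls := 0
	model := func(history []string) turn {
		calls++
		if calls == 1 {
			return turn{toolCalls: []toolCall{{"search", `{"q":"go"}`}}, tokens: 10}
		}
		return turn{text: "done", tokens: 5}
	}
	text, tokens := runLoop(model, func(name, args string) string {
		return "result of " + name
	})
	fmt.Println(text, tokens)
}
```

The streaming path (chunkFn) is omitted here; in the real method the final turn would additionally be delivered incrementally through that callback rather than only as the returned string.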