Documentation ¶
Overview ¶
Package openai provides an OpenAI API client implementing the ai.Provider interface. It supports both Chat Completions API and Responses API (for codex models).
Index ¶
- func ClassifyChatHTTPErrorFor(provider string, statusCode int, body []byte) *ai.AIError
- func MapChatFinishReason(r string) string
- func ParseChatStepResponse(body []byte, requestedModel string) (*ai.Response, *ai.AIError)
- type APIType
- type ChatStepChoice
- type ChatStepErrorEnvelope
- type ChatStepMessage
- type ChatStepRequest
- type ChatStepRespMessage
- type ChatStepResponse
- type ChatStepToolCall
- type ChatStepToolCallFunction
- type ChatStepToolDef
- type ChatStepToolDefFunction
- type ChatStepUsage
- type Client
- type ClientOption
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func ClassifyChatHTTPErrorFor ¶ added in v0.15.2
ClassifyChatHTTPErrorFor classifies an OpenAI-format error envelope. Both OpenAI and OpenRouter speak this shape — the provider argument distinguishes which name is reported in AIError.Message when the body is empty.
func MapChatFinishReason ¶ added in v0.15.2
MapChatFinishReason converts the OpenAI Chat Completions finish_reason vocabulary into the normalized ai.Response.FinishReason values.
Mapping:
"stop" → "stop" "tool_calls" → "tool_calls" "function_call" → "tool_calls" (legacy single-function alias) "length" → "length" "content_filter" → "error" anything else → "error"
func ParseChatStepResponse ¶ added in v0.15.2
ParseChatStepResponse decodes a successful Chat Completions response body into an ai.Response. requestedModel is used as a fallback when the response model field is empty (rare but defensive).
Types ¶
type ChatStepChoice ¶ added in v0.15.2
type ChatStepChoice struct {
	Index        int                 `json:"index"`
	Message      ChatStepRespMessage `json:"message"`
	FinishReason string              `json:"finish_reason"`
}
ChatStepChoice is one completion choice. Message.Content is a json.RawMessage so we can accept JSON null or a string transparently.
type ChatStepErrorEnvelope ¶ added in v0.15.2
type ChatStepErrorEnvelope struct {
	Error struct {
		Message string `json:"message"`
		Type    string `json:"type"`
		Code    string `json:"code"`
	} `json:"error"`
}
ChatStepErrorEnvelope is the OpenAI/OpenRouter error response shape.
type ChatStepMessage ¶ added in v0.15.2
type ChatStepMessage struct {
	Role       string             `json:"role"`
	Content    json.RawMessage    `json:"content"`
	ToolCallID string             `json:"tool_call_id,omitempty"`
	ToolCalls  []ChatStepToolCall `json:"tool_calls,omitempty"`
}
ChatStepMessage is one entry in the messages array. Content is a json.RawMessage so callers can emit a JSON string OR JSON null (for assistant messages with tool_calls and empty text).
type ChatStepRequest ¶ added in v0.15.2
type ChatStepRequest struct {
	Model               string            `json:"model"`
	Messages            []ChatStepMessage `json:"messages"`
	Tools               []ChatStepToolDef `json:"tools,omitempty"`
	MaxTokens           int               `json:"max_tokens,omitempty"`
	MaxCompletionTokens int               `json:"max_completion_tokens,omitempty"`
	Temperature         float64           `json:"temperature,omitempty"`
}
ChatStepRequest is the on-the-wire request body for a Chat Completions tool-use call. Exported so OpenRouter can extend it with its provider-routing field by composition (see openrouter package).
MaxTokens vs MaxCompletionTokens: OpenAI's GPT-5+ and o-series reasoning models (o1, o3) reject max_tokens with HTTP 400 "Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead." BuildChatStepRequest routes the value to whichever field the model accepts (see usesMaxCompletionTokens). Only one of the two will be set on a given request — the omitempty tags keep the wire payload clean.
func BuildChatStepRequest ¶ added in v0.15.2
func BuildChatStepRequest(req *ai.Request) (*ChatStepRequest, *ai.AIError)
BuildChatStepRequest converts an ai.Request into the on-the-wire ChatStepRequest body. Returns *ai.AIError on translation failure (e.g. malformed JSON in a tool's Parameters schema).
The translation rules are documented on Client.Step.
type ChatStepRespMessage ¶ added in v0.15.2
type ChatStepRespMessage struct {
	Role      string             `json:"role"`
	Content   json.RawMessage    `json:"content"`
	ToolCalls []ChatStepToolCall `json:"tool_calls,omitempty"`
}
ChatStepRespMessage is the assistant message inside a choice.
type ChatStepResponse ¶ added in v0.15.2
type ChatStepResponse struct {
	ID      string           `json:"id"`
	Object  string           `json:"object"`
	Model   string           `json:"model"`
	Choices []ChatStepChoice `json:"choices"`
	Usage   ChatStepUsage    `json:"usage"`
}
ChatStepResponse is the parsed response body for a Chat Completions call.
type ChatStepToolCall ¶ added in v0.15.2
type ChatStepToolCall struct {
	ID       string                   `json:"id"`
	Type     string                   `json:"type"`
	Function ChatStepToolCallFunction `json:"function"`
}
ChatStepToolCall is one tool invocation in an assistant message. Note that Function.Arguments is a JSON STRING (not an object) — OpenAI requires this.
type ChatStepToolCallFunction ¶ added in v0.15.2
type ChatStepToolCallFunction struct {
	Name      string          `json:"name"`
	Arguments json.RawMessage `json:"arguments"`
}
ChatStepToolCallFunction carries the tool name and the arguments STRING. json.RawMessage is used because the wire value MUST be a JSON string literal containing JSON — re-encoding as Go string would double-escape.
type ChatStepToolDef ¶ added in v0.15.2
type ChatStepToolDef struct {
	Type     string                  `json:"type"`
	Function ChatStepToolDefFunction `json:"function"`
}
ChatStepToolDef is one entry in the tools array advertised to the model. Parameters is a decoded JSON Schema OBJECT (unlike Arguments which is a string).
type ChatStepToolDefFunction ¶ added in v0.15.2
type ChatStepToolDefFunction struct {
	Name        string          `json:"name"`
	Description string          `json:"description,omitempty"`
	Parameters  json.RawMessage `json:"parameters"`
}
ChatStepToolDefFunction carries the tool's name, description, and the JSON-Schema parameters as a decoded object on the wire.
type ChatStepUsage ¶ added in v0.15.2
type ChatStepUsage struct {
	PromptTokens     int `json:"prompt_tokens"`
	CompletionTokens int `json:"completion_tokens"`
	TotalTokens      int `json:"total_tokens"`
}
ChatStepUsage is the token-usage block.
type Client ¶
type Client struct {
	// contains filtered or unexported fields
}
Client implements ai.Provider for OpenAI's APIs. It supports both Chat Completions and Responses APIs.
func NewClient ¶
func NewClient(apiKey string, opts ...ClientOption) *Client
NewClient creates a new OpenAI client.
func (*Client) Generate ¶
Generate implements ai.Provider. It routes to either Chat Completions or Responses API based on the model.
func (*Client) NewHandler ¶
NewHandler creates an ai.Handler wrapping this client.
func (*Client) Step ¶ added in v0.15.2
Step is the multi-turn / tool-aware completion entry point introduced by M-AI-TOOL-LOOP (v0.17.0). It translates req.Messages + req.Tools into OpenAI's Chat Completions tool-use shape and parses tool_calls out of the response into resp.ToolCalls.
Wire shape contract:
- SystemPrompt is prepended as a {"role":"system"} message ONLY when req.Messages does not already contain a system-role entry. If req.Messages has a system message, req.Messages wins and SystemPrompt is dropped.
- User messages with no ToolCallID emit {"role":"user","content":<text>}.
- User messages with a ToolCallID, and Role="tool" messages, emit {"role":"tool","tool_call_id":<id>,"content":<text>}.
- Assistant messages with no ToolCalls emit {"role":"assistant","content":<text>}.
- Assistant messages with non-empty ToolCalls emit {"role":"assistant","content":null,"tool_calls":[...]} when Content is empty, or {"role":"assistant","content":<text>,"tool_calls":[...]} when Content is non-empty. tool_calls[].function.arguments is passed through verbatim — OpenAI emits it as a JSON STRING and round-trips it the same.
Errors are returned as *ai.AIError exclusively. Non-2xx responses are classified via ai.ClassifyHTTPError; transport / context errors via ai.ClassifyError. The inner error.message field of an OpenAI error envelope, when present, is hoisted into the AIError.Message for clarity.
type ClientOption ¶
type ClientOption func(*Client)
ClientOption configures a Client.
func WithAPIType ¶
func WithAPIType(apiType APIType) ClientOption
WithAPIType forces a specific API type (chat or responses).
func WithBaseURL ¶
func WithBaseURL(url string) ClientOption
WithBaseURL sets a custom base URL (useful for testing).
func WithHTTPClient ¶
func WithHTTPClient(client *http.Client) ClientOption
WithHTTPClient sets a custom HTTP client.