openaiexecutor

package
v0.7.0 Latest
Published: May 4, 2026 License: Apache-2.0 Imports: 18 Imported by: 0

Documentation

Overview

Package openaiexecutor provides a multi-turn conversation executor for OpenAI-compatible chat completion APIs, including Vertex AI's partner model endpoint.

The executor manages the full conversation lifecycle: sending prompts, processing tool calls, recording metrics, and extracting structured results. It mirrors the claudeexecutor and googleexecutor patterns, enabling the metaagent to route to any model available via the OpenAI chat completions API.

Basic Usage

client := openai.NewClient(
	option.WithBaseURL(vertexBaseURL),
	option.WithHTTPClient(authedClient),
	option.WithAPIKey("placeholder"),
)

prompt := promptbuilder.MustNewPrompt("Analyze: {{input}}")

exec, err := openaiexecutor.New[MyRequest, MyResponse](client, prompt,
	openaiexecutor.WithModel[MyRequest, MyResponse]("deepseek-ai/deepseek-v3.2-maas"),
	openaiexecutor.WithMaxTokens[MyRequest, MyResponse](32768),
	openaiexecutor.WithTemperature[MyRequest, MyResponse](0.2),
)

Options

Thinking Models

Models that return reasoning_content in their responses (e.g. kimi-k2-thinking-maas) are supported. The executor captures reasoning content into the agent trace automatically.
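As a minimal sketch of what "capturing reasoning content" can look like, the snippet below unmarshals a message that carries a reasoning_content field alongside the visible content. The chatMessage type and captureReasoning function are illustrative stand-ins, not part of this package's API; only the reasoning_content field name comes from the documentation above.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// chatMessage models the subset of an OpenAI-compatible chat completion
// message relevant to thinking models: the visible answer plus the
// reasoning_content field returned by models such as kimi-k2-thinking-maas.
type chatMessage struct {
	Content          string `json:"content"`
	ReasoningContent string `json:"reasoning_content"`
}

// captureReasoning splits a raw message into the visible answer and the
// reasoning trace, mirroring how an executor might record reasoning
// content separately from the final content.
func captureReasoning(raw []byte) (answer, trace string, err error) {
	var msg chatMessage
	if err := json.Unmarshal(raw, &msg); err != nil {
		return "", "", err
	}
	return msg.Content, msg.ReasoningContent, nil
}

func main() {
	raw := []byte(`{"content":"42","reasoning_content":"6 * 7 = 42"}`)
	answer, trace, err := captureReasoning(raw)
	if err != nil {
		panic(err)
	}
	fmt.Printf("answer=%q trace=%q\n", answer, trace)
}
```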

Submit Result Redirect

When a submit_result tool is configured but the model responds with text instead of calling the tool, the executor sends a redirect message asking the model to call submit_result. Unlike the claudeexecutor, the openaiexecutor does not force the redirect with a named tool_choice — some models (e.g. reasoning models) reject named tool_choice constraints with an HTTP 400 error.
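The redirect decision above can be sketched as a predicate: redirect only when a submit_result tool is configured, the model produced text, and no submit_result call was made. The toolCall type and the redirect wording here are hypothetical; the executor's actual message text is not part of the public API.

```go
package main

import "fmt"

// toolCall is a minimal stand-in for a chat completion tool call.
type toolCall struct{ Name string }

// redirectMessage is an illustrative redirect prompt, not the executor's
// actual wording.
const redirectMessage = "Please call the submit_result tool with your final answer."

// needsRedirect sketches the redirect decision: a submit_result tool is
// configured, the model produced text, and submit_result was not called.
func needsRedirect(submitConfigured bool, text string, calls []toolCall) bool {
	if !submitConfigured || text == "" {
		return false
	}
	for _, c := range calls {
		if c.Name == "submit_result" {
			return false
		}
	}
	return true
}

func main() {
	fmt.Println(needsRedirect(true, "here is my answer", nil))             // true: text, no tool call
	fmt.Println(needsRedirect(true, "", []toolCall{{Name: "submit_result"}})) // false: tool was called
	if needsRedirect(true, "some text", nil) {
		fmt.Println("would send:", redirectMessage)
	}
}
```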


Constants

const DefaultMaxTurns = 200

DefaultMaxTurns is the default maximum number of conversation turns before aborting.
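The abort behavior can be sketched as a bounded loop: each iteration is one conversation turn, and if the limit elapses before the conversation finishes, the loop stops with an error. The runTurns helper and error text are illustrative; only the 200-turn default comes from the constant above.

```go
package main

import (
	"errors"
	"fmt"
)

const defaultMaxTurns = 200 // mirrors openaiexecutor.DefaultMaxTurns

var errMaxTurns = errors.New("conversation exceeded maximum turns")

// runTurns sketches the abort behavior: step is called once per turn and
// reports whether the conversation is finished; if maxTurns elapses first,
// the loop aborts with an error.
func runTurns(maxTurns int, step func(turn int) (done bool)) (int, error) {
	for turn := 1; turn <= maxTurns; turn++ {
		if step(turn) {
			return turn, nil
		}
	}
	return maxTurns, errMaxTurns
}

func main() {
	// Finishes on turn 3, well under the default limit.
	turns, err := runTurns(defaultMaxTurns, func(turn int) bool { return turn == 3 })
	fmt.Println(turns, err) // 3 <nil>

	// Never finishes: aborts once maxTurns is reached.
	_, err = runTurns(5, func(int) bool { return false })
	fmt.Println(err)
}
```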

Variables

This section is empty.

Functions

This section is empty.

Types

type Interface

type Interface[Request promptbuilder.Bindable, Response any] interface {
	// Execute runs the agent conversation with the given request and tools.
	Execute(ctx context.Context, request Request, tools map[string]openaistool.Metadata[Response]) (Response, error)
}

Interface is the public interface for OpenAI-compatible agent execution.

func New

func New[Request promptbuilder.Bindable, Response any](
	client openai.Client,
	prompt *promptbuilder.Prompt,
	opts ...Option[Request, Response],
) (Interface[Request, Response], error)

New creates a new OpenAI-compatible executor.

Example
prompt := promptbuilder.MustNewPrompt("Summarize: {{input}}")

client := openai.NewClient(
	option.WithAPIKey("placeholder"),
)

exec, err := openaiexecutor.New[myRequest, myResponse](client, prompt,
	openaiexecutor.WithModel[myRequest, myResponse]("google/gemini-2.5-flash"),
	openaiexecutor.WithMaxTokens[myRequest, myResponse](8192),
	openaiexecutor.WithTemperature[myRequest, myResponse](0.1),
)
if err != nil {
	log.Fatal(err)
}

fmt.Printf("executor created: %v\n", exec != nil)
Output:
executor created: true

type Option

type Option[Request promptbuilder.Bindable, Response any] func(*executor[Request, Response]) error

Option is a functional option for configuring the executor.
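For readers unfamiliar with the functional-option shape above, here is a self-contained sketch of the pattern with the generics stripped away. The executor fields, option names, and validation rule are hypothetical; the real package's options are the With* functions documented below.

```go
package main

import "fmt"

// executor is a stand-in for the package's unexported executor type.
type executor struct {
	model     string
	maxTokens int64
}

// option mirrors the shape of openaiexecutor.Option: a function that
// mutates the executor and may reject an invalid value.
type option func(*executor) error

func withModel(model string) option {
	return func(e *executor) error {
		e.model = model
		return nil
	}
}

func withMaxTokens(tokens int64) option {
	return func(e *executor) error {
		if tokens <= 0 {
			return fmt.Errorf("max tokens must be positive, got %d", tokens)
		}
		e.maxTokens = tokens
		return nil
	}
}

// newExecutor applies each option in order, returning the first error.
func newExecutor(opts ...option) (*executor, error) {
	e := &executor{model: "default-model", maxTokens: 4096}
	for _, opt := range opts {
		if err := opt(e); err != nil {
			return nil, err
		}
	}
	return e, nil
}

func main() {
	e, err := newExecutor(withModel("deepseek-ai/deepseek-v3.2-maas"), withMaxTokens(32768))
	if err != nil {
		panic(err)
	}
	fmt.Println(e.model, e.maxTokens)
}
```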

func WithMaxTokens

func WithMaxTokens[Request promptbuilder.Bindable, Response any](tokens int64) Option[Request, Response]

WithMaxTokens sets the maximum completion tokens.

func WithMaxTurns

func WithMaxTurns[Request promptbuilder.Bindable, Response any](turns int) Option[Request, Response]

WithMaxTurns sets the maximum number of conversation turns before aborting.

Example
prompt := promptbuilder.MustNewPrompt("Process: {{input}}")

client := openai.NewClient(
	option.WithAPIKey("placeholder"),
)

exec, err := openaiexecutor.New[myRequest, myResponse](client, prompt,
	openaiexecutor.WithMaxTurns[myRequest, myResponse](50),
)
if err != nil {
	log.Fatal(err)
}

fmt.Printf("executor created: %v\n", exec != nil)
Output:
executor created: true

func WithModel

func WithModel[Request promptbuilder.Bindable, Response any](model string) Option[Request, Response]

WithModel sets the model name.

Example
prompt := promptbuilder.MustNewPrompt("Analyze: {{input}}")

client := openai.NewClient(
	option.WithAPIKey("placeholder"),
)

exec, err := openaiexecutor.New[myRequest, myResponse](client, prompt,
	openaiexecutor.WithModel[myRequest, myResponse]("deepseek-ai/deepseek-v3.2-maas"),
)
if err != nil {
	log.Fatal(err)
}

fmt.Printf("executor created: %v\n", exec != nil)
Output:
executor created: true

func WithResourceLabels

func WithResourceLabels[Request promptbuilder.Bindable, Response any](labels map[string]string) Option[Request, Response]

WithResourceLabels sets labels for observability attribution. Automatically includes default labels from environment variables:

  • service_name: from K_SERVICE (defaults to "unknown")
  • product: from CHAINGUARD_PRODUCT (defaults to "unknown")
  • team: from CHAINGUARD_TEAM (defaults to "unknown")

func WithRetryConfig

func WithRetryConfig[Request promptbuilder.Bindable, Response any](cfg retry.RetryConfig) Option[Request, Response]

WithRetryConfig sets the retry configuration for transient API errors.

func WithSubmitResultProvider

func WithSubmitResultProvider[Request promptbuilder.Bindable, Response any](provider SubmitResultProvider[Response]) Option[Request, Response]

WithSubmitResultProvider registers the submit_result tool.

func WithSystemInstructions

func WithSystemInstructions[Request promptbuilder.Bindable, Response any](prompt *promptbuilder.Prompt) Option[Request, Response]

WithSystemInstructions sets the system prompt.

func WithTemperature

func WithTemperature[Request promptbuilder.Bindable, Response any](temp float64) Option[Request, Response]

WithTemperature sets the sampling temperature (0.0–2.0).

Example
prompt := promptbuilder.MustNewPrompt("Summarize: {{input}}")

client := openai.NewClient(
	option.WithAPIKey("placeholder"),
)

exec, err := openaiexecutor.New[myRequest, myResponse](client, prompt,
	openaiexecutor.WithTemperature[myRequest, myResponse](0.5),
)
if err != nil {
	log.Fatal(err)
}

fmt.Printf("executor created: %v\n", exec != nil)
Output:
executor created: true

type SubmitResultProvider

type SubmitResultProvider[Response any] func() (openaistool.Metadata[Response], error)

SubmitResultProvider constructs tool metadata for submit_result.
