Documentation ¶
Index ¶
- type Agent
- type AgentOption
- type BasicAgent
- func (agent *BasicAgent) DetectToolCalls(messages []openai.ChatCompletionMessageParamUnion, ...) (string, []string, string, error)
- func (agent *BasicAgent) DetectToolCallsStream(messages []openai.ChatCompletionMessageParamUnion, ...) (string, []string, string, error)
- func (agent *BasicAgent) GenerateEmbeddingVector(content string) ([]float64, error)
- func (agent *BasicAgent) GetMessages() []openai.ChatCompletionMessageParamUnion
- func (agent *BasicAgent) GetModel() shared.ChatModel
- func (agent *BasicAgent) GetName() string
- func (agent *BasicAgent) GetResponseFormat() openai.ChatCompletionNewParamsResponseFormatUnion
- func (agent *BasicAgent) Run(Messages []openai.ChatCompletionMessageParamUnion) (string, error)
- func (agent *BasicAgent) RunStream(Messages []openai.ChatCompletionMessageParamUnion, ...) (string, error)
- func (agent *BasicAgent) RunStreamWithReasoning(Messages []openai.ChatCompletionMessageParamUnion, ...) (string, string, error)
- func (agent *BasicAgent) RunWithReasoning(Messages []openai.ChatCompletionMessageParamUnion) (string, string, error)
- func (agent *BasicAgent) SetMessages(messages []openai.ChatCompletionMessageParamUnion)
- func (agent *BasicAgent) SetModel(model shared.ChatModel)
- func (agent *BasicAgent) SetName(name string)
- func (agent *BasicAgent) SetResponseFormat(format openai.ChatCompletionNewParamsResponseFormatUnion)
- type ExitStreamCompletionError
- type ExitToolCallsLoopError
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Agent ¶
type Agent interface {
Run(Messages []openai.ChatCompletionMessageParamUnion) (string, error)
RunStream(Messages []openai.ChatCompletionMessageParamUnion, callBack func(content string) error) (string, error)
RunWithReasoning(Messages []openai.ChatCompletionMessageParamUnion) (string, string, error)
RunStreamWithReasoning(Messages []openai.ChatCompletionMessageParamUnion, contentCallback func(content string) error, reasoningCallback func(reasoning string) error) (string, string, error)
DetectToolCalls(messages []openai.ChatCompletionMessageParamUnion, toolCallBack func(functionName string, arguments string) (string, error)) (string, []string, string, error)
DetectToolCallsStream(messages []openai.ChatCompletionMessageParamUnion, toolCallback func(functionName string, arguments string) (string, error), streamCallback func(content string) error) (string, []string, string, error)
GenerateEmbeddingVector(content string) ([]float64, error)
GetMessages() []openai.ChatCompletionMessageParamUnion
SetMessages(messages []openai.ChatCompletionMessageParamUnion)
GetResponseFormat() openai.ChatCompletionNewParamsResponseFormatUnion
SetResponseFormat(format openai.ChatCompletionNewParamsResponseFormatUnion)
GetName() string
SetName(name string)
GetModel() shared.ChatModel
SetModel(model shared.ChatModel)
}
Agent is the interface for AI agents that can interact with OpenAI models and tools
func NewAgent ¶
func NewAgent(ctx context.Context, name string, options ...AgentOption) (Agent, error)
NewAgent creates a new Agent instance with the specified configuration. It uses the functional options pattern to configure the agent's client, parameters, and other settings.
Parameters:
- ctx: Context for managing the agent's lifecycle and cancellation
- name: Human-readable name for the agent (used for identification/logging)
- options: Variable number of AgentOption functions to configure the agent
Returns:
- Agent: Configured agent instance ready for use
- error: Always nil in current implementation, reserved for future validation
Example usage:
agent, err := NewAgent(ctx, "ChatBot",
    WithClient(openaiClient),
    WithParams(completionParams),
)
type AgentOption ¶
type AgentOption func(*BasicAgent)
AgentOption is a functional option for configuring BasicAgent instances
func WithClient ¶
func WithClient(client openai.Client) AgentOption
WithClient is a functional option that sets the OpenAI client for an agent. The client handles the connection to the OpenAI API or compatible endpoints.
Parameters:
- client: OpenAI client instance configured with API key, base URL, and other connection settings
Returns:
- AgentOption: A function that applies the client to an Agent during construction
Example usage:
client := openai.NewClient(option.WithAPIKey("your-api-key"))
agent, err := NewAgent(ctx, "MyAgent", WithClient(client))
func WithEmbeddingParams ¶
func WithEmbeddingParams(embeddingParams openai.EmbeddingNewParams) AgentOption
WithEmbeddingParams sets the embedding model parameters for the agent's vector generation
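Example usage (illustrative sketch; the embedding model name is an assumption):
agent, err := NewAgent(ctx, "Embedder",
    WithClient(client),
    WithEmbeddingParams(openai.EmbeddingNewParams{
        Model: "text-embedding-3-small", // assumed model name
    }),
)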
func WithParams ¶
func WithParams(params openai.ChatCompletionNewParams) AgentOption
WithParams is a functional option that sets the chat completion parameters for an agent. This includes model settings, temperature, tools, messages, and other completion options.
Parameters:
- params: OpenAI chat completion parameters including model, temperature, tools, messages, etc.
Returns:
- AgentOption: A function that applies the parameters to an Agent during construction
Example usage:
agent, err := NewAgent(ctx, "MyAgent", WithParams(openai.ChatCompletionNewParams{
    Model:       "gpt-4",
    Temperature: openai.Opt(0.7),
    Tools:       myTools,
}))
type BasicAgent ¶ added in v0.0.2
type BasicAgent struct {
Client openai.Client
Params openai.ChatCompletionNewParams
EmbeddingParams openai.EmbeddingNewParams
Name string
Avatar string
Color string // used for UI display
// contains filtered or unexported fields
}
BasicAgent represents a basic implementation of Agent with OpenAI client configuration and UI properties
func (*BasicAgent) DetectToolCalls ¶ added in v0.0.2
func (agent *BasicAgent) DetectToolCalls(messages []openai.ChatCompletionMessageParamUnion, toolCallBack func(functionName string, arguments string) (string, error)) (string, []string, string, error)
DetectToolCalls processes a conversation with tool calls support. It handles the complete tool calling workflow: detecting tool calls, executing them via callback, and managing the conversation history until completion.
Parameters:
- messages: Initial conversation messages to start with
- toolCallBack: Function to execute when tools are called. Takes functionName and arguments (JSON string), returns the result as a JSON string
Returns:
- finishReason: The reason the conversation ended ("stop" for normal completion, other values for errors)
- results: Slice of all tool execution results (JSON strings)
- lastAssistantMessage: The final message from the assistant when conversation ends normally
- error: Any error that occurred during processing
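Example usage (a minimal sketch; the "add" tool and its JSON result are hypothetical, and the agent is assumed to already declare the tool in its Params):
messages := []openai.ChatCompletionMessageParamUnion{
    openai.UserMessage("What is 21 + 21?"),
}
finishReason, results, lastMessage, err := agent.DetectToolCalls(messages,
    func(functionName string, arguments string) (string, error) {
        // arguments is a JSON string produced by the model
        switch functionName {
        case "add": // hypothetical tool declared in agent.Params.Tools
            return `{"result": 42}`, nil
        default:
            return `{"error": "unknown tool"}`, nil
        }
    },
)
if err != nil {
    log.Fatal(err)
}
fmt.Println(finishReason, results, lastMessage)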
func (*BasicAgent) DetectToolCallsStream ¶ added in v0.0.2
func (agent *BasicAgent) DetectToolCallsStream(messages []openai.ChatCompletionMessageParamUnion, toolCallback func(functionName string, arguments string) (string, error), streamCallback func(content string) error) (string, []string, string, error)
DetectToolCallsStream processes a conversation with tool calls support using streaming. It handles the complete tool calling workflow with real-time streaming of assistant responses, detecting tool calls, executing them via callback, and managing the conversation history until completion.
Parameters:
- messages: Initial conversation messages to start with
- toolCallback: Function to execute when tools are called. Takes functionName and arguments (JSON string), returns the result as a JSON string
- streamCallback: Function called for each streaming content chunk; returning a non-nil error stops the stream
Returns:
- finishReason: The reason the conversation ended ("stop" for normal completion, other values for errors)
- results: Slice of all tool execution results (JSON strings)
- lastAssistantMessage: The final message from the assistant when conversation ends normally
- error: Any error that occurred during processing
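Example usage (a minimal sketch; the tool dispatch is hypothetical and assistant chunks are simply printed as they arrive):
finishReason, results, lastMessage, err := agent.DetectToolCallsStream(messages,
    func(functionName string, arguments string) (string, error) {
        // hypothetical tool dispatch; arguments is a JSON string
        return `{"result": "ok"}`, nil
    },
    func(content string) error {
        fmt.Print(content) // stream each assistant chunk to stdout
        return nil
    },
)
if err != nil {
    log.Fatal(err)
}
fmt.Println(finishReason, results, lastMessage)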
func (*BasicAgent) GenerateEmbeddingVector ¶ added in v0.0.2
func (agent *BasicAgent) GenerateEmbeddingVector(content string) ([]float64, error)
GenerateEmbeddingVector creates a vector embedding for the given text content using the agent's embedding model
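Example usage (sketch; assumes the agent was created with WithEmbeddingParams):
vector, err := agent.GenerateEmbeddingVector("Hello, world!")
if err != nil {
    log.Fatal(err)
}
fmt.Println("vector dimensions:", len(vector))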
func (*BasicAgent) GetMessages ¶ added in v0.0.2
func (agent *BasicAgent) GetMessages() []openai.ChatCompletionMessageParamUnion
GetMessages returns the messages from the agent's parameters
func (*BasicAgent) GetModel ¶ added in v0.0.5
func (agent *BasicAgent) GetModel() shared.ChatModel
GetModel returns the model from the agent's parameters
func (*BasicAgent) GetName ¶ added in v0.0.4
func (agent *BasicAgent) GetName() string
GetName returns the name of the agent
func (*BasicAgent) GetResponseFormat ¶ added in v0.0.3
func (agent *BasicAgent) GetResponseFormat() openai.ChatCompletionNewParamsResponseFormatUnion
GetResponseFormat returns the response format from the agent's parameters
func (*BasicAgent) Run ¶ added in v0.0.2
func (agent *BasicAgent) Run(Messages []openai.ChatCompletionMessageParamUnion) (string, error)
Run executes a chat completion with the provided messages. It sends the messages to the model and returns the first choice's content.
Parameters:
- Messages: The conversation messages to send to the model
Returns:
- string: The content of the first choice from the model's response
- error: Any error that occurred during the completion request or if no choices are returned
This method temporarily sets the agent's Messages parameter and makes a synchronous completion request. It returns an error if the completion fails or if the response contains no choices.
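Example usage (illustrative sketch):
answer, err := agent.Run([]openai.ChatCompletionMessageParamUnion{
    openai.SystemMessage("You are a helpful assistant."),
    openai.UserMessage("Say hello in one sentence."),
})
if err != nil {
    log.Fatal(err)
}
fmt.Println(answer)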
func (*BasicAgent) RunStream ¶ added in v0.0.2
func (agent *BasicAgent) RunStream(Messages []openai.ChatCompletionMessageParamUnion, callBack func(content string) error) (string, error)
RunStream executes a streaming chat completion with the given messages. It streams the response content in real-time by calling the provided callback for each chunk. The complete response is also accumulated and returned at the end.
Parameters:
- Messages: The conversation messages to send to the model
- callBack: Function called for each streaming chunk. Takes content string, returns error to stop streaming
Returns:
- string: The complete accumulated response content from all chunks
- error: Any error that occurred during streaming or from the callback
The streaming stops early if:
- The callback returns a non-nil error
- A stream error occurs
- Stream closing fails
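Example usage (illustrative sketch; chunks are printed as they arrive):
full, err := agent.RunStream(
    []openai.ChatCompletionMessageParamUnion{
        openai.UserMessage("Tell me a short story."),
    },
    func(content string) error {
        fmt.Print(content) // display each chunk immediately
        return nil         // return a non-nil error to stop streaming early
    },
)
if err != nil {
    log.Fatal(err)
}
fmt.Println("\ntotal characters:", len(full))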
func (*BasicAgent) RunStreamWithReasoning ¶ added in v0.0.2
func (agent *BasicAgent) RunStreamWithReasoning(Messages []openai.ChatCompletionMessageParamUnion, contentCallback func(content string) error, reasoningCallback func(reasoning string) error) (string, string, error)
RunStreamWithReasoning executes a streaming chat completion with the given messages. It streams the response content and reasoning in real-time by calling the provided callbacks for each chunk. The complete response content and reasoning are also accumulated and returned at the end.
Parameters:
- Messages: The conversation messages to send to the model
- contentCallback: Function called for each content streaming chunk. Takes content string, returns error to stop streaming
- reasoningCallback: Function called for each reasoning streaming chunk. Takes reasoning string, returns error to stop streaming
Returns:
- string: The complete accumulated response content from all chunks
- string: The complete accumulated reasoning content from all chunks
- error: Any error that occurred during streaming or from the callbacks
The streaming stops early if:
- Either callback returns a non-nil error
- A stream error occurs
- Stream closing fails
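Example usage (illustrative sketch; whether reasoning chunks are emitted depends on the model and provider):
content, reasoning, err := agent.RunStreamWithReasoning(
    []openai.ChatCompletionMessageParamUnion{
        openai.UserMessage("What is 12 * 13? Think step by step."),
    },
    func(chunk string) error {
        fmt.Print(chunk) // answer chunks
        return nil
    },
    func(chunk string) error {
        fmt.Print(chunk) // reasoning chunks
        return nil
    },
)
if err != nil {
    log.Fatal(err)
}
fmt.Println("\nanswer length:", len(content), "reasoning length:", len(reasoning))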
func (*BasicAgent) RunWithReasoning ¶ added in v0.0.2
func (agent *BasicAgent) RunWithReasoning(Messages []openai.ChatCompletionMessageParamUnion) (string, string, error)
RunWithReasoning executes a chat completion with the provided messages. It sends the messages to the model and returns the first choice's content and reasoning.
Parameters:
- Messages: The conversation messages to send to the model
Returns:
- string: The content of the first choice from the model's response
- string: The reasoning content from the model's response
- error: Any error that occurred during the completion request or if no choices are returned
This method temporarily sets the agent's Messages parameter and makes a synchronous completion request. It returns an error if the completion fails or if the response contains no choices.
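Example usage (illustrative sketch; reasoning availability depends on the model):
answer, reasoning, err := agent.RunWithReasoning([]openai.ChatCompletionMessageParamUnion{
    openai.UserMessage("Why is the sky blue? Answer briefly."),
})
if err != nil {
    log.Fatal(err)
}
fmt.Println("reasoning:", reasoning)
fmt.Println("answer:", answer)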
func (*BasicAgent) SetMessages ¶ added in v0.0.2
func (agent *BasicAgent) SetMessages(messages []openai.ChatCompletionMessageParamUnion)
SetMessages sets the messages in the agent's parameters
func (*BasicAgent) SetModel ¶ added in v0.0.5
func (agent *BasicAgent) SetModel(model shared.ChatModel)
SetModel sets the model in the agent's parameters
func (*BasicAgent) SetName ¶ added in v0.0.4
func (agent *BasicAgent) SetName(name string)
SetName sets the name of the agent
func (*BasicAgent) SetResponseFormat ¶ added in v0.0.3
func (agent *BasicAgent) SetResponseFormat(format openai.ChatCompletionNewParamsResponseFormatUnion)
SetResponseFormat sets the response format in the agent's parameters
type ExitStreamCompletionError ¶
type ExitStreamCompletionError struct {
Message string
}
ExitStreamCompletionError signals early termination of streaming completions
func (*ExitStreamCompletionError) Error ¶
func (e *ExitStreamCompletionError) Error() string
Error implements the error interface for ExitStreamCompletionError
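Example usage (a sketch under an assumption: since RunStream surfaces the callback's error, returning this error from a streaming callback is one way to stop the stream deliberately and recognize that afterwards; the stop condition shown is hypothetical):
_, err := agent.RunStream(messages, func(content string) error {
    if strings.Contains(content, "STOP") { // hypothetical stop condition
        return &ExitStreamCompletionError{Message: "stop requested by callback"}
    }
    fmt.Print(content)
    return nil
})
var exitErr *ExitStreamCompletionError
if errors.As(err, &exitErr) {
    fmt.Println("stream ended early:", exitErr.Message)
}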
type ExitToolCallsLoopError ¶
type ExitToolCallsLoopError struct {
Message string
}
ExitToolCallsLoopError signals early termination of tool call processing loops
func (*ExitToolCallsLoopError) Error ¶
func (e *ExitToolCallsLoopError) Error() string
Error implements the error interface for ExitToolCallsLoopError
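Example usage (a sketch under an assumption: returning this error from the tool callback is one way to break out of the tool-calls loop and recognize that afterwards; the "shutdown" tool name is hypothetical):
_, _, _, err := agent.DetectToolCalls(messages,
    func(functionName string, arguments string) (string, error) {
        if functionName == "shutdown" { // hypothetical tool name
            return "", &ExitToolCallsLoopError{Message: "shutdown requested"}
        }
        return `{"result": "ok"}`, nil
    },
)
var exitErr *ExitToolCallsLoopError
if errors.As(err, &exitErr) {
    fmt.Println("tool loop ended early:", exitErr.Message)
}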