Documentation ¶
Index ¶
- Variables
- func Function[A any](f func(args A) (string, error)) internal.FunctionBody
- func NewTool[A any](name, description string, fn internal.FunctionBody) internal.Tool
- func WithDefaultModel(model string) llmOption
- func WithDefaultProvider(provider Llm) llmOption
- func WithHttpClient(client *http.Client) llmOption
- func WithProvider(name string, provider Llm) llmOption
- type AsyncResponse
- type Candidater
- type FinishReason
- type History
- type InnerResponse
- type Llm
- type Llmberjack
- type Message
- type MessageRole
- type MessageType
- type Request
- func (r Request[T]) CreateThread() Request[T]
- func (r Request[T]) Do(ctx context.Context, llm *Llmberjack) (*Response[T], error)
- func (r Request[T]) FromCandidate(c Candidater, idx int) Request[T]
- func (r Request[T]) InThread(threadId *ThreadId) Request[T]
- func (r Request[T]) OverrideResponseSchema(schema jsonschema.Schema) Request[T]
- func (r Request[T]) ProviderRequestOptions(provider Llm) internal.ProviderRequestOptions
- func (r Request[T]) SkipSaveInput() Request[T]
- func (r Request[T]) SkipSaveOutput() Request[T]
- func (r Request[T]) ToRequest() innerRequest
- func (r Request[T]) WithInstruction(parts ...string) Request[T]
- func (r Request[T]) WithInstructionFiles(files ...string) Request[T]
- func (r Request[T]) WithInstructionReader(parts ...io.Reader) Request[T]
- func (r Request[T]) WithJson(role MessageRole, data any) Request[T]
- func (r Request[T]) WithMaxCandidates(candidates int) Request[T]
- func (r Request[T]) WithMaxTokens(tokens int) Request[T]
- func (r Request[T]) WithModel(model string) Request[T]
- func (r Request[T]) WithModelFunc(fn func(provider Llm, providerName *string) string) Request[T]
- func (r Request[T]) WithProvider(name string) Request[T]
- func (r Request[T]) WithProviderOptions(opts internal.ProviderRequestOptions) Request[T]
- func (r Request[T]) WithSchemaDescription(name, description string) Request[T]
- func (r Request[T]) WithSerializable(role MessageRole, ser Serializer, input any) Request[T]
- func (r Request[T]) WithTemperature(temp float64) Request[T]
- func (r Request[T]) WithText(role MessageRole, parts ...string) Request[T]
- func (r Request[T]) WithTextReader(role MessageRole, parts ...io.Reader) Request[T]
- func (r Request[T]) WithThinking(thinking bool) Request[T]
- func (r Request[T]) WithToolExecution(tools ...internal.Tool) Request[T]
- func (r Request[T]) WithTools(tools ...internal.Tool) Request[T]
- func (r Request[T]) WithTopP(topp float64) Request[T]
- type Requester
- type Response
- type ResponseCandidate
- type ResponseGrounding
- type ResponseGroundingSource
- type ResponseToolCall
- type Serializer
- type ThreadId
Constants ¶
This section is empty.
Variables ¶
var (
	// Serializers is a global object that holds singletons of
	// library-provided serializers.
	Serializers = struct {
		Json jsonSerializer
		Csv  csvSerializer
	}{
		Json: jsonSerializer{},
		Csv:  csvSerializer{},
	}
)
Functions ¶
func Function ¶
func Function[A any](f func(args A) (string, error)) internal.FunctionBody
Function is a wrapper for the code executed in a tool.
It is generic in A, which is a type containing the tool arguments. It follows the same idioms as a response schema.
func NewTool ¶
NewTool creates a new tool.
It is generic in the type of the tool arguments, and takes the tool name and description.
The function body should be wrapped in `Function`.
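Taken together, `Function` and `NewTool` might be combined as in this sketch, where `WeatherParams` is a hypothetical argument struct introduced for illustration:

```go
// WeatherParams is a hypothetical argument struct; the JSONSchema sent to
// the provider is generated from it, following the response-schema idioms.
type WeatherParams struct {
	Location string `json:"location" jsonschema:"description=City to query"`
}

tool := llmberjack.NewTool[WeatherParams](
	"get_weather",
	"Get weather at location",
	llmberjack.Function(func(args WeatherParams) (string, error) {
		// The returned string is sent back to the provider as the tool output.
		return "Sunny in " + args.Location, nil
	}),
)
```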
func WithDefaultModel ¶
func WithDefaultModel(model string) llmOption
WithDefaultModel sets the model to use if not specified in a particular request. It is the caller's responsibility to ensure the requested model is available on the configured provider.
func WithDefaultProvider ¶
func WithDefaultProvider(provider Llm) llmOption
WithDefaultProvider sets what LLM provider to use for communication.
func WithHttpClient ¶
WithHttpClient sets a custom HTTP client to be used.
If a provider does not support overriding the HTTP client, this will be ignored.
func WithProvider ¶
WithProvider registers a provider.
The first provider to be registered becomes the default, unless a default was already set or is defined later with `SetDefaultProvider`.
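Registering several providers might look like this sketch, where `openaiProv` and `geminiProv` are assumed to be values implementing the `Llm` interface:

```go
// A sketch, assuming openaiProv and geminiProv satisfy the Llm interface.
llm, err := llmberjack.New(
	llmberjack.WithProvider("openai", openaiProv), // first registered becomes the default
	llmberjack.WithProvider("gemini", geminiProv),
	llmberjack.WithDefaultModel("gpt-4"),
)
```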
Types ¶
type AsyncResponse ¶
func All ¶
func All[T any](ctx context.Context, llm *Llmberjack, reqs ...Request[T]) []AsyncResponse[T]
type Candidater ¶
type Candidater interface {
NumCandidates() int
Candidate(int) (*ResponseCandidate, error)
Thread() *ThreadId
}
Candidater represents a type that can have several candidates.
type FinishReason ¶
type FinishReason string
const (
	FinishReasonStop          FinishReason = "stop"
	FinishReasonMaxTokens     FinishReason = "max_tokens"
	FinishReasonContentFilter FinishReason = "content_filter"
)
type History ¶
type History[T any] struct { // contains filtered or unexported fields }
History manages the conversation context by storing a sequence of messages. It is generic in type `T`, where `T` represents the specific message format required by a particular LLM provider (e.g., OpenAI's Message or AIStudio's ChatMessage). This allows the adapter to maintain conversation state across multiple requests.
func (*History[T]) Clear ¶
Clear empties the entire conversation history, effectively starting a new conversation. This also clears any system instructions that were part of the history.
type InnerResponse ¶
type InnerResponse struct {
Id string
Model string
Candidates []ResponseCandidate
Created time.Time
}
InnerResponse is a response from a provider.
type Llm ¶
type Llm interface {
// Init initializes the LLM provider with the given adapter configuration.
// It is called once when the provider is added to the adapter.
Init(llm internal.Adapter) error
// ResetContext clears the conversation history for the specific LLM provider.
// This allows starting a new conversation without re-initializing the provider.
ResetThread(*ThreadId)
// CopyThread copies all history from the provided thread into a new, discrete one.
CopyThread(*ThreadId) *ThreadId
// CloseThread deletes a thread and associated resources.
CloseThread(*ThreadId)
// ChatCompletion sends a chat completion request to the LLM provider.
// It takes a context, the adapter's internal configuration, and a Requester
// to retrieve the request.
ChatCompletion(context.Context, internal.Adapter, Requester) (*InnerResponse, error)
// RequestOptionsType returns the reflect.Type of the provider-specific
// request options struct. This is used for type checking and reflection
// when processing custom request options.
RequestOptionsType() reflect.Type
}
Llm defines the interface that all LLM providers must implement. It provides a contract for initializing, managing context, and performing chat completions with different language models.
type Llmberjack ¶
type Llmberjack struct {
// contains filtered or unexported fields
}
Llmberjack is the main entrypoint for interacting with different LLM providers. It provides a unified interface to send requests and receive responses.
func New ¶
func New(opts ...llmOption) (*Llmberjack, error)
New creates a new Llmberjack with the given options. It initializes the specified LLM provider and returns a configured adapter.
Example usage:
adapter, err := llmberjack.New(
llmberjack.WithDefaultProvider(provider),
llmberjack.WithDefaultModel("gpt-4"),
llmberjack.WithApiKey("..."),
)
func (Llmberjack) DefaultModel ¶
func (llm Llmberjack) DefaultModel() string
func (*Llmberjack) GetProvider ¶
func (llm *Llmberjack) GetProvider(requestProvider *string) (Llm, error)
GetProvider retrieves an LLM provider based on the given provider name. It accepts the provider requested in a specific request, which will override the default provider. If the provider argument is nil, it will return the configured default provider.
func (Llmberjack) HttpClient ¶
func (llm Llmberjack) HttpClient() *http.Client
type Message ¶
type Message struct {
	// Type is the binary representation of the message
	Type MessageType
	// Role represents "who" (or "what") composed a message. Note that not
	// all providers will support every role, but they must still account
	// for them.
	Role MessageRole
	// Parts are subdivisions of a specific message.
	Parts []io.Reader
	// Tool is an instruction to call a tool function. This only makes
	// sense in response messages.
	Tool *ResponseToolCall
}
Message is an abstraction over a "prompt".
type MessageRole ¶
type MessageRole int
const (
	RoleSystem MessageRole = iota
	RoleUser
	RoleAi
	RoleTool
)
type Request ¶
type Request[T any] struct { // contains filtered or unexported fields }
Request represents a request to be sent to a provider, in the context of the current conversation.
It contains an `innerRequest` built by the caller, but also optionally tracks which candidate it responds to, in order to link tool responses to their corresponding tool calls.
It is generic in T which it will use to unmarshal the response into a typed struct.
func NewRequest ¶
NewRequest creates a builder to craft a request to send to an LLM provider.
It provides a series of methods to chain-call in order to add context, prompts and configuration.
It is generic in T, which will be used to generate a JSONSchema to be used as a response schema in the request. See [this](https://github.com/invopop/jsonschema) for more information about how to write the structs.
Example usage:
resp, err := llmberjack.NewRequest[Output]().
	WithText(llmberjack.RoleUser, "How are you today?").
	Do(ctx, llm)
func NewUntypedRequest ¶
NewUntypedRequest is a helper method to create a `Request` which will be a raw string, without unmarshalling the response into a struct.
func (Request[T]) CreateThread ¶
func (Request[T]) Do ¶
Do executes a built request on the configured provider.
It will return a response generic over the type configured on the Request, or an error.
func (Request[T]) FromCandidate ¶
func (r Request[T]) FromCandidate(c Candidater, idx int) Request[T]
FromCandidate selects a candidate/choice from a previous response as the base for this Request.
Selecting a candidate has two effects:
- Adding the candidate to the history (if the request was in a thread)
- Using that response's tool calls as the basis for tool responses, if applicable.
Example usage:
resp, err := llmberjack.NewRequest[Output]().
	FromCandidate(previousResp, 0).
	WithText(llmberjack.RoleUser, "How are you today?").
	Do(ctx, llm)
func (Request[T]) OverrideResponseSchema ¶
func (r Request[T]) OverrideResponseSchema(schema jsonschema.Schema) Request[T]
func (Request[T]) ProviderRequestOptions ¶
func (r Request[T]) ProviderRequestOptions(provider Llm) internal.ProviderRequestOptions
func (Request[T]) SkipSaveInput ¶
func (Request[T]) SkipSaveOutput ¶
func (Request[T]) WithInstruction ¶
WithInstruction adds a system prompt to the request.
Note that if the adapter is configured to save history, this need only be added on the first request sent to the provider.
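A minimal sketch of a system prompt on an untyped request (the prompt text is illustrative):

```go
// Each string becomes a discrete part of the system instruction.
resp, err := llmberjack.NewUntypedRequest().
	WithInstruction("You are a terse assistant.", "Answer in one sentence.").
	WithText(llmberjack.RoleUser, "What is Go?").
	Do(ctx, llm)
```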
func (Request[T]) WithInstructionFiles ¶
func (Request[T]) WithInstructionReader ¶
WithInstructionReader adds a system prompt read from an io.Reader.
func (Request[T]) WithMaxCandidates ¶
WithMaxCandidates limits how many candidate responses the provider may return.
Most providers default to 1 for this value.
func (Request[T]) WithMaxTokens ¶
WithMaxTokens limits how many tokens a provider can emit for its completion.
func (Request[T]) WithModel ¶
WithModel overrides the model used for this specific request.
If not provided, the default model set on the provider is used, then the one set on the adapter.
func (Request[T]) WithModelFunc ¶
WithModelFunc executes a callback to determine the model to use.
It is notably useful when multiple providers are configured, to select the model depending on which provider was actually chosen to execute the request. The callback is passed the actual instance of the selected provider, as well as its registered name, if applicable.
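A sketch of such a callback; the provider and model names are assumptions for illustration:

```go
// Pick a model based on which registered provider ends up handling the request.
req := llmberjack.NewUntypedRequest().
	WithModelFunc(func(provider llmberjack.Llm, providerName *string) string {
		if providerName != nil && *providerName == "gemini" {
			return "gemini-1.5-pro"
		}
		return "gpt-4"
	}).
	WithText(llmberjack.RoleUser, "Hello!")
```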
func (Request[T]) WithProvider ¶
func (Request[T]) WithProviderOptions ¶
func (r Request[T]) WithProviderOptions(opts internal.ProviderRequestOptions) Request[T]
WithProviderOptions sets provider-specific options.
Some options are not supported by all providers, so providers usually define a type representing options specific to them. This function allows setting those. One set of options can be defined per provider type.
func (Request[T]) WithSchemaDescription ¶
func (Request[T]) WithSerializable ¶
func (r Request[T]) WithSerializable(role MessageRole, ser Serializer, input any) Request[T]
func (Request[T]) WithTemperature ¶
WithTemperature sets a custom temperature value to be used.
Default value depends on the model.
func (Request[T]) WithText ¶
func (r Request[T]) WithText(role MessageRole, parts ...string) Request[T]
WithText adds a text message to the Request.
Each provided `string` will be added as a discrete `part` in the message. The message will be declared as text content.
func (Request[T]) WithTextReader ¶
func (r Request[T]) WithTextReader(role MessageRole, parts ...io.Reader) Request[T]
WithTextReader adds a message to the Request read from an io.Reader.
func (Request[T]) WithThinking ¶
func (Request[T]) WithToolExecution ¶
WithToolExecution executes the requested tools and adds their output to the Request.
It will also take care of adding the matching tool definitions to the Request, so there is no need to also call `WithTools`.
Note that this requires that a candidate from the previous response was selected by calling `FromCandidate()` beforehand, to determine which functions the provider asked to be called.
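The tool round-trip might then look like this sketch, where `resp` is a previous response whose candidate requested tool calls and `weatherTool` is a tool built with `NewTool`:

```go
// Select the candidate carrying the tool calls, execute the matching
// tools, and send their output back in a follow-up request.
followUp, err := llmberjack.NewRequest[Output]().
	FromCandidate(resp, 0).         // candidate that requested the tool calls
	WithToolExecution(weatherTool). // runs the tools and attaches their output
	Do(ctx, llm)
```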
func (Request[T]) WithTools ¶
WithTools adds tool definitions to the request.
Tools are represented as a type-safe function taking its configuration as input, and return a string and an error. The JSONSchema sent to the provider will be generated from the input type.
Example usage:
resp, err := llmberjack.NewRequest[Output]().
	WithText(llmberjack.RoleUser, "How are you today?").
	WithTools(llmberjack.NewTool[WeatherParams]("get_weather", "Get weather at location",
		llmberjack.Function(func(args WeatherParams) (string, error) {
			return "Good weather!", nil
		}))).
	Do(ctx, llm)
type Requester ¶
type Requester interface {
// ToRequest unwraps the actual request.
ToRequest() innerRequest
// ProviderRequestOptions extracts the provider-specific configuration
// options for a given provider. This is called from each provider to
// retrieve its specific configuration in a type-safe manner.
ProviderRequestOptions(provider Llm) internal.ProviderRequestOptions
}
Requester represents something that can be turned into a request.
Used internally to abstract over request types across packages.
type Response ¶
type Response[T any] struct {
	InnerResponse
	ThreadId *ThreadId
}
Response[T] is a wrapper around a provider response.
It wraps the provider response so that it can be generic without the provider's response needing to be, and provides typed methods to unmarshal the response, if necessary.
func (Response[T]) Get ¶
Get will return the deserialized output for a candidate.
It will parse the response and deserialize it to the requested type, or return an error if it cannot.
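A sketch of retrieving the typed output; Get's exact signature is not shown on this page, so the candidate-index parameter here is an assumption:

```go
resp, err := llmberjack.NewRequest[Output]().
	WithText(llmberjack.RoleUser, "Summarise this.").
	Do(ctx, llm)
if err != nil {
	// handle error
}
out, err := resp.Get(0) // deserialize candidate 0 into an Output value
```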
func (Response[T]) NumCandidates ¶
type ResponseCandidate ¶
type ResponseCandidate struct {
Text string
FinishReason FinishReason
ToolCalls []ResponseToolCall
Grounding *ResponseGrounding
Thoughts string
// SelectCandidate is a callback that is called when a candidate is
// "selected" (when the conversation will continue from it).
SelectCandidate func()
}
ResponseCandidate represents a candidate response from a provider.
type ResponseGrounding ¶
type ResponseGrounding struct {
Searches []string
Sources []ResponseGroundingSource
Snippets []string
}
type ResponseGroundingSource ¶
type ResponseToolCall ¶
ResponseToolCall is a request from a provider to execute a tool.
type ThreadId ¶
type ThreadId struct {
// contains filtered or unexported fields
}
ThreadId uniquely represents a conversation with an LLM.
It is used to mark and identify a specific conversation and accumulate its history. ThreadIds are inherent identifiers (their memory address is the identifier), so their values cannot be copied; only pointers should be passed around.