Documentation
Index
Constants
This section is empty.
Variables
This section is empty.
Functions
This section is empty.
Types
type AIResponse
type CallOption
type CallOption func(*CallOptions)
CallOption is a function that configures a CallOptions value.
func WithLang
func WithLang(lang string) CallOption
WithLang is an option for LLM.Call.
func WithMaxTokens
func WithMaxTokens(maxTokens int) CallOption
WithMaxTokens is an option for LLM.Call.
func WithOptions
func WithOptions(options CallOptions) CallOption
WithOptions is an option for LLM.Call.
func WithStopWords
func WithStopWords(stopWords []string) CallOption
WithStopWords is an option for LLM.Call.
func WithTemperature
func WithTemperature(temperature float64) CallOption
WithTemperature is an option for LLM.Call.
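The With* constructors above follow Go's functional options pattern: each returns a closure that mutates a CallOptions value. A minimal sketch of how they compose, with the documented types inlined because the package's import path is not shown here:

```go
package main

import "fmt"

// Inlined mirror of the documented CallOptions struct.
type CallOptions struct {
	Model       string
	MaxTokens   int
	Temperature float64
	StopWords   []string
	Lang        string
}

// CallOption configures a CallOptions value, as documented.
type CallOption func(*CallOptions)

// WithMaxTokens and WithTemperature mirror the documented signatures;
// their bodies are assumptions about a typical implementation.
func WithMaxTokens(maxTokens int) CallOption {
	return func(o *CallOptions) { o.MaxTokens = maxTokens }
}

func WithTemperature(temperature float64) CallOption {
	return func(o *CallOptions) { o.Temperature = temperature }
}

func main() {
	// A callee would apply each option in order to a zero-valued struct.
	opts := CallOptions{}
	for _, opt := range []CallOption{WithMaxTokens(256), WithTemperature(0.2)} {
		opt(&opts)
	}
	fmt.Println(opts.MaxTokens, opts.Temperature) // 256 0.2
}
```

Because options are plain functions applied in order, a later option can override an earlier one, which is why LLM.Call accepts them as a variadic parameter.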
type CallOptions
type CallOptions struct {
	// Model is the model to use.
	Model string `json:"models"`
	// MaxTokens is the maximum number of tokens to generate.
	MaxTokens int `json:"maxTokens"`
	// Temperature is the temperature for sampling, between 0 and 1.
	Temperature float64 `json:"temperature"`
	// StopWords is a list of words to stop on.
	StopWords []string `json:"stopWords"`
	// Lang is the programming language type.
	Lang string `json:"lang"`
}
CallOptions is a set of options for LLM.Call.
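WithOptions takes a whole CallOptions struct rather than a single field. The documentation does not state its semantics, so the sketch below assumes it replaces the entire option set, after which later single-field options still override it; WithModel is a hypothetical helper (the package lists no such constructor) used only to show the ordering:

```go
package main

import "fmt"

// Inlined mirror of the documented CallOptions struct.
type CallOptions struct {
	Model       string
	MaxTokens   int
	Temperature float64
	StopWords   []string
	Lang        string
}

type CallOption func(*CallOptions)

// WithOptions mirrors the documented signature; replacing the whole
// struct is an assumption about its behavior.
func WithOptions(options CallOptions) CallOption {
	return func(o *CallOptions) { *o = options }
}

// WithModel is a hypothetical helper, not part of the documented API.
func WithModel(model string) CallOption {
	return func(o *CallOptions) { o.Model = model }
}

func main() {
	base := CallOptions{Model: "base-model", MaxTokens: 512}
	opts := CallOptions{}
	// Apply a baseline config first, then override one field.
	for _, opt := range []CallOption{WithOptions(base), WithModel("override")} {
		opt(&opts)
	}
	fmt.Println(opts.Model, opts.MaxTokens) // override 512
}
```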
type LLM
type LLM interface {
	Call(ctx context.Context, prompt string, options ...CallOption) (string, error)
	Generate(ctx context.Context, prompts []string, options ...CallOption) (*AIResponse, error)
}
LLM is a Large Language Model.