Documentation
Index
- func GenerateFromSinglePrompt(ctx context.Context, llm Model, prompt string, options ...CallOption) (string, error)
- func TextParts(role schema.ChatMessageType, parts ...string) schema.MessageContent
- type CallOption
- func WithMaxTokens(maxTokens int) CallOption
- func WithModel(model string) CallOption
- func WithSeed(seed int) CallOption
- func WithStopWords(stopWords []string) CallOption
- func WithStreamingFunc(streamingFunc func(ctx context.Context, chunk []byte) error) CallOption
- func WithTemperature(temperature float64) CallOption
- func WithTopK(topK int) CallOption
- func WithTopP(topP float64) CallOption
- type CallOptions
- type Model
- type Tokenizer
Constants
This section is empty.
Variables
This section is empty.
Functions
func GenerateFromSinglePrompt
func GenerateFromSinglePrompt(ctx context.Context, llm Model, prompt string, options ...CallOption) (string, error)
GenerateFromSinglePrompt is a convenience helper for calling a Model with a single string prompt and returning a single string completion.
func TextParts
func TextParts(role schema.ChatMessageType, parts ...string) schema.MessageContent
TextParts builds a schema.MessageContent with the given role and one text part per input string.
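The helper's behavior can be sketched with toy stand-ins for the schema types; the real schema.MessageContent and its part types differ in detail, so the names below are illustrative only:

```go
package main

import "fmt"

// Toy stand-ins for the schema package types, defined here only so the
// sketch is self-contained.
type TextContent struct{ Text string }

type MessageContent struct {
	Role  string
	Parts []TextContent
}

// textParts mirrors what TextParts does: it builds one message with the
// given role and one text part per input string.
func textParts(role string, parts ...string) MessageContent {
	mc := MessageContent{Role: role}
	for _, p := range parts {
		mc.Parts = append(mc.Parts, TextContent{Text: p})
	}
	return mc
}

func main() {
	msg := textParts("human", "You are a helpful assistant.", "Translate 'hello'.")
	fmt.Println(msg.Role, len(msg.Parts))
}
```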
Types
type CallOption
type CallOption func(*CallOptions)
CallOption is a function that configures a CallOptions.
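CallOption follows Go's functional-options pattern: each With* constructor returns a closure that mutates a CallOptions value. A minimal self-contained sketch, using trimmed-down copies of the types rather than the package source (the applyOptions helper and the 0.7 default are invented for illustration):

```go
package main

import "fmt"

// Trimmed-down copies of the package types, for illustration only.
type CallOptions struct {
	Model       string
	Temperature float64
	MaxTokens   int
}

type CallOption func(*CallOptions)

func WithModel(model string) CallOption {
	return func(o *CallOptions) { o.Model = model }
}

func WithTemperature(t float64) CallOption {
	return func(o *CallOptions) { o.Temperature = t }
}

// applyOptions shows how a backend typically resolves the variadic options:
// start from defaults, then apply each option in order.
func applyOptions(opts ...CallOption) *CallOptions {
	o := &CallOptions{Temperature: 0.7} // hypothetical default
	for _, opt := range opts {
		opt(o)
	}
	return o
}

func main() {
	o := applyOptions(WithModel("gpt-4"), WithTemperature(0.2))
	fmt.Println(o.Model, o.Temperature)
}
```

Because options are applied in order, a later option overrides an earlier one for the same field.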
func WithMaxTokens added in v0.15.0
func WithMaxTokens(maxTokens int) CallOption
WithMaxTokens specifies the maximum number of tokens to generate.
func WithModel added in v0.15.0
func WithModel(model string) CallOption
WithModel specifies the name of the model to use for the call.
func WithSeed added in v0.15.0
func WithSeed(seed int) CallOption
WithSeed specifies a random seed, for reproducible sampling where the backend supports it.
func WithStopWords added in v0.15.0
func WithStopWords(stopWords []string) CallOption
WithStopWords specifies strings that end generation when the model produces them.
func WithStreamingFunc
func WithStreamingFunc(streamingFunc func(ctx context.Context, chunk []byte) error) CallOption
WithStreamingFunc specifies a callback invoked with each chunk of the response as it is streamed.
func WithTemperature added in v0.15.0
func WithTemperature(temperature float64) CallOption
WithTemperature specifies the sampling temperature; higher values yield more varied output.
func WithTopK added in v0.15.0
func WithTopK(topK int) CallOption
WithTopK limits sampling to the topK most likely next tokens.
func WithTopP added in v0.15.0
func WithTopP(topP float64) CallOption
WithTopP restricts sampling to the smallest set of tokens whose cumulative probability exceeds topP (nucleus sampling).
type CallOptions
type CallOptions struct {
	Model         string                                        `json:"model"`
	Temperature   float64                                       `json:"temperature"`
	MaxTokens     int                                           `json:"max_tokens"`
	StopWords     []string                                      `json:"stop_words"`
	TopP          float64                                       `json:"top_p"`
	TopK          int                                           `json:"top_k"`
	Seed          int                                           `json:"seed"`
	Metadata      map[string]any                                `json:"metadata,omitempty"`
	StreamingFunc func(ctx context.Context, chunk []byte) error `json:"-"`
}
CallOptions holds the resolved settings for a model call; each CallOption mutates one of its fields.
type Model
type Model interface {
	// GenerateContent generates a model response from a sequence of messages.
	GenerateContent(ctx context.Context, messages []schema.MessageContent, options ...CallOption) (*schema.ContentResponse, error)
	// Call generates a completion for a single string prompt.
	Call(ctx context.Context, prompt string, options ...CallOption) (string, error)
}
Model is the interface implemented by language model backends.