llms

package
v0.10.3
Warning

This package is not in the latest version of its module.

Published: Jul 24, 2025 License: MIT Imports: 3 Imported by: 1

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func GenerateFromSinglePrompt

func GenerateFromSinglePrompt(ctx context.Context, llm Model, prompt string, options ...CallOption) (string, error)

GenerateFromSinglePrompt is a convenience function for calling an LLM with a single string prompt, expecting a single string response.

func TextParts

func TextParts(role schema.ChatMessageType, parts ...string) schema.MessageContent

TextParts is a helper that builds a schema.MessageContent with the given role from one or more text parts.
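
A sketch of what TextParts does, using hypothetical local stand-ins for schema.ChatMessageType and schema.MessageContent (the real schema package defines richer versions of these types):

```go
package main

import "fmt"

// Stand-ins for the schema types (hypothetical shapes, for illustration only).
type ChatMessageType string

type TextContent struct{ Text string }

type MessageContent struct {
	Role  ChatMessageType
	Parts []TextContent
}

// TextParts bundles one or more text strings into a single
// MessageContent under the given role.
func TextParts(role ChatMessageType, parts ...string) MessageContent {
	mc := MessageContent{Role: role}
	for _, p := range parts {
		mc.Parts = append(mc.Parts, TextContent{Text: p})
	}
	return mc
}

func main() {
	msg := TextParts("human", "Hello,", "world")
	fmt.Println(msg.Role, len(msg.Parts)) // human 2
}
```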

Types

type CallOption

type CallOption func(*CallOptions)

func WithStreamingFunc

func WithStreamingFunc(streamingFunc func(ctx context.Context, chunk []byte) error) CallOption

WithStreamingFunc specifies the streaming function to use.

type CallOptions

type CallOptions struct {
	Model         string                                        `json:"model"`
	Temperature   float64                                       `json:"temperature"`
	Metadata      map[string]any                                `json:"metadata,omitempty"`
	StreamingFunc func(ctx context.Context, chunk []byte) error `json:"-"`
}
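
A sketch of how a provider might fold CallOption values into this struct. The withModel and withTemperature constructors and the default values are hypothetical, shown only to illustrate the usual option-resolution loop:

```go
package main

import "fmt"

// Local stand-ins for the documented types (for illustration only).
type CallOptions struct {
	Model       string
	Temperature float64
	Metadata    map[string]any
}

type CallOption func(*CallOptions)

// withModel and withTemperature are hypothetical option constructors.
func withModel(m string) CallOption        { return func(o *CallOptions) { o.Model = m } }
func withTemperature(t float64) CallOption { return func(o *CallOptions) { o.Temperature = t } }

// resolveOptions applies each option over assumed defaults.
func resolveOptions(options ...CallOption) CallOptions {
	opts := CallOptions{Model: "default-model", Temperature: 0.7} // assumed defaults
	for _, opt := range options {
		opt(&opts)
	}
	return opts
}

func main() {
	opts := resolveOptions(withModel("my-model"), withTemperature(0.2))
	fmt.Println(opts.Model, opts.Temperature) // my-model 0.2
}
```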

type Model

type Model interface {
	GenerateContent(ctx context.Context, messages []schema.MessageContent, options ...CallOption) (*schema.ContentResponse, error)
	Call(ctx context.Context, prompt string, options ...CallOption) (string, error)
}

type Tokenizer

type Tokenizer interface {
	CountTokens(ctx context.Context, text string) (int, error)
}

Directories

This section is empty.
