Documentation ¶
Overview ¶
Package ollama provides an Ollama LLM implementation.
This implementation is strictly aligned with ADK-Go's model architecture:
- Uses Ollama's Chat API (/api/chat)
- Unified GenerateContent method with stream boolean
- Returns iter.Seq2[*Response, error]
- Uses StreamingAggregator for streaming with Partial flag
- Proper handling of tool calls
- Support for thinking models via `think` parameter
Index ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Client ¶
type Client struct {
// contains filtered or unexported fields
}
Client is an Ollama LLM implementation. Implements model.LLM interface aligned with ADK-Go.
func (*Client) GenerateContent ¶
func (c *Client) GenerateContent(ctx context.Context, req *model.Request, stream bool) iter.Seq2[*model.Response, error]
GenerateContent produces responses for the given request. This is the ADK-Go aligned interface.
When stream=false:
- Yields exactly one Response with complete content, Partial=false
When stream=true:
- Yields multiple partial Responses (Partial=true) for real-time UI updates
- Finally yields aggregated Response (Partial=false) for session persistence
type Config ¶
type Config struct {
// BaseURL is the Ollama server URL (default: http://localhost:11434)
BaseURL string
// Model is the model name (e.g., "llama3.2", "mistral", "codellama")
Model string
// Temperature controls randomness (0-2)
Temperature *float64
// TopP for nucleus sampling
TopP *float64
// TopK for top-k sampling
TopK *int
// NumPredict limits the number of tokens to predict
NumPredict *int
// NumCtx sets the context window size
NumCtx *int
// Seed for reproducible outputs
Seed *int
// KeepAlive controls how long the model stays loaded (default: "5m")
KeepAlive string
// Timeout for HTTP requests
Timeout time.Duration
// MaxRetries for HTTP requests with retry/backoff
MaxRetries int
// EnableThinking enables thinking for supported models
EnableThinking bool
// MaxToolOutputLength limits the length of tool outputs.
MaxToolOutputLength int
}
Config configures the Ollama client.
type Option ¶
type Option func(*Config)
Option configures the Ollama client.
func WithTemperature ¶
func WithTemperature(temperature float64) Option
WithTemperature sets the temperature (0-2).
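Since `Option` is `func(*Config)`, options like `WithTemperature` follow the standard Go functional-options pattern: each returns a closure that mutates the config. The sketch below shows a plausible implementation under that assumption, with `Config` trimmed to a few of the fields listed above; it is not the package's actual source.

```go
package main

import "fmt"

// Config mirrors a subset of the fields documented above.
type Config struct {
	BaseURL     string
	Model       string
	Temperature *float64
}

// Option configures the Ollama client.
type Option func(*Config)

// WithTemperature returns an Option that sets Temperature; a pointer
// field lets the client distinguish "unset" from an explicit 0.
func WithTemperature(temperature float64) Option {
	return func(c *Config) { c.Temperature = &temperature }
}

func main() {
	// Start from defaults, then apply each option in order, as a
	// client constructor typically would.
	cfg := &Config{BaseURL: "http://localhost:11434", Model: "llama3.2"}
	for _, opt := range []Option{WithTemperature(0.7)} {
		opt(cfg)
	}
	fmt.Printf("model=%s temp=%.1f\n", cfg.Model, *cfg.Temperature)
}
```

The pointer fields in `Config` (`Temperature`, `TopP`, `TopK`, etc.) follow the same rationale: nil means "let Ollama use its default" rather than forcing a zero value into the request.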