Documentation
Index
Constants
This section is empty.
Variables
This section is empty.
Functions
This section is empty.
Types
type Client
type Client struct {
// contains filtered or unexported fields
}
Client is an Ollama LLM client.
Ollama allows you to run large language models locally on your machine. This client connects to a local Ollama server (default: localhost:11434) and provides the same interface as cloud-based providers.
No API key is required. The client uses a longer default timeout (60s) since local models may take more time to generate responses.
func DefaultOllamaClient added in v0.1.5
DefaultOllamaClient creates a default Ollama client. It uses the qwen3.5:0.8b model and is suited to local intelligent chunking and graph extraction.