Documentation
Index
- func BuildSQLPrompt(tables []TableInfo, now time.Time, columnHints string, extraContext string) string
- func BuildSummaryPrompt(question, sql, resultsTable string, now time.Time, extraContext string) string
- func BuildSystemPrompt(tables []TableInfo, dataSummary string, now time.Time, extraContext string) string
- func ExtractSQL(raw string) string
- func FormatResultsTable(columns []string, rows [][]string) string
- func FormatSQL(sql string, maxWidth int) string
- type Client
- func (c *Client) BaseURL() string
- func (c *Client) ChatComplete(ctx context.Context, messages []Message) (string, error)
- func (c *Client) ChatStream(ctx context.Context, messages []Message) (<-chan StreamChunk, error)
- func (c *Client) ListModels(ctx context.Context) ([]string, error)
- func (c *Client) Model() string
- func (c *Client) Ping(ctx context.Context) error
- func (c *Client) PullModel(ctx context.Context, model string) (*PullScanner, error)
- func (c *Client) SetModel(model string)
- func (c *Client) Timeout() time.Duration
- type ColumnInfo
- type Message
- type PullChunk
- type PullScanner
- type StreamChunk
- type TableInfo
Constants
This section is empty.
Variables
This section is empty.
Functions
func BuildSQLPrompt
func BuildSQLPrompt(
	tables []TableInfo,
	now time.Time,
	columnHints string,
	extraContext string,
) string
BuildSQLPrompt creates a system prompt that instructs the LLM to translate a natural-language question into a single SELECT statement. The prompt includes the current date, the full schema as DDL, and few-shot examples. If extraContext is non-empty, it's appended at the end.
func BuildSummaryPrompt
func BuildSummaryPrompt(
	question, sql, resultsTable string,
	now time.Time,
	extraContext string,
) string
BuildSummaryPrompt creates a system prompt for the second stage: turning SQL results into a concise natural-language answer. If extraContext is non-empty, it's appended at the end.
func BuildSystemPrompt
func BuildSystemPrompt(
	tables []TableInfo,
	dataSummary string,
	now time.Time,
	extraContext string,
) string
BuildSystemPrompt assembles the old single-stage system prompt, used as a fallback when the two-stage pipeline fails. If extraContext is non-empty, it's appended at the end.
func ExtractSQL
func ExtractSQL(raw string) string
ExtractSQL pulls the SQL statement from the LLM's response, handling both bare SQL and fenced code blocks. Returns the trimmed SQL string.
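The fenced-block handling can be illustrated with a simplified local sketch (`extractSQL` below is not the package's implementation): strip a ```` ```sql ```` fence if one is present, otherwise return the trimmed text as-is.

```go
package main

import (
	"fmt"
	"strings"
)

// extractSQL is an illustrative sketch of pulling SQL out of an LLM
// reply that may wrap it in a fenced code block.
func extractSQL(raw string) string {
	s := strings.TrimSpace(raw)
	if start := strings.Index(s, "```"); start != -1 {
		s = s[start+3:]
		// Drop an optional language tag such as "sql" on the fence line.
		if nl := strings.IndexByte(s, '\n'); nl != -1 {
			s = s[nl+1:]
		}
		if end := strings.Index(s, "```"); end != -1 {
			s = s[:end]
		}
	}
	return strings.TrimSpace(s)
}

func main() {
	fmt.Println(extractSQL("Here you go:\n```sql\nSELECT 1;\n```"))
}
```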
func FormatResultsTable
func FormatResultsTable(columns []string, rows [][]string) string
FormatResultsTable renders query results as a pipe-delimited text table, compact enough for an LLM context window.
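A minimal pipe-delimited rendering might look like the sketch below; `formatResultsTable` here assumes no column padding is applied, which may differ from the package's actual output.

```go
package main

import (
	"fmt"
	"strings"
)

// formatResultsTable sketches a compact pipe-delimited table: one
// header line, then one line per row, with no width padding.
func formatResultsTable(columns []string, rows [][]string) string {
	var b strings.Builder
	b.WriteString(strings.Join(columns, " | "))
	b.WriteByte('\n')
	for _, row := range rows {
		b.WriteString(strings.Join(row, " | "))
		b.WriteByte('\n')
	}
	return b.String()
}

func main() {
	fmt.Print(formatResultsTable(
		[]string{"name", "total"},
		[][]string{{"alice", "42"}, {"bob", "7"}},
	))
}
```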
func FormatSQL
func FormatSQL(sql string, maxWidth int) string
FormatSQL pretty-prints a SQL SELECT statement for human reading. It tokenizes the input, uppercases keywords, aligns clauses, indents continuation lines, and wraps SELECT column lists one-per-line when there are multiple columns. maxWidth controls soft line wrapping; pass 0 to disable wrapping.
Types
type Client
type Client struct {
// contains filtered or unexported fields
}
Client talks to an OpenAI-compatible API endpoint (Ollama, llama.cpp, LM Studio, etc.) for local LLM inference.
func NewClient
NewClient creates an LLM client targeting the given OpenAI-compatible endpoint and model. The timeout controls quick operations like ping and model listing.
func (*Client) ChatComplete
func (c *Client) ChatComplete(ctx context.Context, messages []Message) (string, error)
ChatComplete sends a non-streaming chat completion request and returns the full response content. Used for structured output like SQL generation where the caller needs the complete result before proceeding.
func (*Client) ChatStream
func (c *Client) ChatStream(ctx context.Context, messages []Message) (<-chan StreamChunk, error)
ChatStream sends a chat completion request and returns a channel that emits streamed response chunks. The channel closes when the response completes or the context is cancelled. Callers must drain the channel.
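The required consumption pattern (read until the channel closes) can be sketched with a locally simulated channel, since the interesting part is the drain loop, not the request. `StreamChunk`'s fields are not documented on this page, so the `Content` field below is an assumption.

```go
package main

import (
	"fmt"
	"strings"
)

// StreamChunk stands in for the package's type; Content is an assumed
// field name for illustration.
type StreamChunk struct {
	Content string
}

// drain shows the mandatory pattern for ChatStream consumers: range
// over the channel until it closes, accumulating streamed content.
func drain(ch <-chan StreamChunk) string {
	var b strings.Builder
	for chunk := range ch {
		b.WriteString(chunk.Content)
	}
	return b.String()
}

func main() {
	// Simulate the channel a ChatStream call would return.
	ch := make(chan StreamChunk, 3)
	for _, s := range []string{"Hel", "lo", "!"} {
		ch <- StreamChunk{Content: s}
	}
	close(ch)
	fmt.Println(drain(ch))
}
```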
func (*Client) ListModels
func (c *Client) ListModels(ctx context.Context) ([]string, error)
ListModels fetches the available model IDs from the inference server.
func (*Client) Ping
func (c *Client) Ping(ctx context.Context) error
Ping checks whether the API is reachable and the configured model is available. Returns a user-friendly error if not.
func (*Client) PullModel
func (c *Client) PullModel(ctx context.Context, model string) (*PullScanner, error)
PullModel initiates a model pull via the Ollama native API. The base URL is assumed to be the OpenAI-compatible endpoint (e.g. "http://localhost:11434/v1"); this method strips "/v1" to reach the Ollama native API at "/api/pull".
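The URL derivation described above can be sketched as follows; `nativePullURL` is a hypothetical helper, not the package's internal code.

```go
package main

import (
	"fmt"
	"strings"
)

// nativePullURL derives the Ollama native pull endpoint from an
// OpenAI-compatible base URL, per the doc comment: strip a trailing
// "/v1" and append "/api/pull".
func nativePullURL(baseURL string) string {
	base := strings.TrimSuffix(strings.TrimRight(baseURL, "/"), "/v1")
	return base + "/api/pull"
}

func main() {
	fmt.Println(nativePullURL("http://localhost:11434/v1"))
}
```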
type ColumnInfo
ColumnInfo describes a single column in a table.
type PullChunk
type PullChunk struct {
	Status    string `json:"status"`
	Digest    string `json:"digest"`
	Total     int64  `json:"total"`
	Completed int64  `json:"completed"`
	Error     string `json:"error"` // Ollama streams errors in this field
}
PullChunk is a single progress update from the Ollama pull API.
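A consumer typically turns each chunk into a progress line; the sketch below shows one way to do that, guarding both the streamed Error field and chunks that carry a status but no byte counts (assumed here to arrive with Total of zero).

```go
package main

import "fmt"

// PullChunk mirrors the documented struct (JSON tags omitted for brevity).
type PullChunk struct {
	Status    string
	Digest    string
	Total     int64
	Completed int64
	Error     string
}

// progress renders a chunk for display, surfacing streamed errors and
// skipping the percentage when no byte total is present.
func progress(c PullChunk) (string, error) {
	if c.Error != "" {
		return "", fmt.Errorf("pull failed: %s", c.Error)
	}
	if c.Total == 0 {
		return c.Status, nil
	}
	pct := float64(c.Completed) / float64(c.Total) * 100
	return fmt.Sprintf("%s: %.1f%%", c.Status, pct), nil
}

func main() {
	line, _ := progress(PullChunk{Status: "pulling layer", Total: 200, Completed: 50})
	fmt.Println(line)
}
```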
type PullScanner
type PullScanner struct {
// contains filtered or unexported fields
}
PullScanner wraps the streaming response from the Ollama pull API.
func (*PullScanner) Next
func (ps *PullScanner) Next() (*PullChunk, error)
Next returns the next progress chunk, or nil at EOF.
type StreamChunk
StreamChunk is a single piece of a streaming chat response.
type TableInfo
type TableInfo struct {
	Name    string
	Columns []ColumnInfo
}
TableInfo describes a database table for context injection.