chains

package
v0.26.0
Published: Feb 25, 2026 License: MIT Imports: 10 Imported by: 0

Documentation

Overview

Package chains provides composable chains for LLM workflows. Chains combine prompts, LLMs, retrievers, and other components into reusable pipelines for tasks like RAG, validation, and map-reduce.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type LLMChain added in v0.15.0

type LLMChain[T any] struct {
	// LLM is the language model to use for generation.
	LLM llms.Model
	// Prompt is the template to render before calling the LLM.
	Prompt prompts.PromptTemplate
	// Parser converts the LLM output to type T.
	Parser schema.OutputParser[T]
	// CallOptions are options passed to the LLM call.
	CallOptions []llms.CallOption
}

LLMChain combines a prompt template, an LLM, and an output parser into a single callable unit. It renders the prompt, calls the LLM, and parses the output into a typed result.

func NewLLMChain added in v0.15.0

func NewLLMChain[T any](llm llms.Model, prompt prompts.PromptTemplate, opts ...LLMChainOption[T]) (*LLMChain[T], error)

NewLLMChain creates a new LLMChain. If no parser is provided via options, a StringParser is used (only valid when T is string). Returns an error if llm or prompt is nil.

func (*LLMChain[T]) Call added in v0.15.0

func (c *LLMChain[T]) Call(ctx context.Context, vars map[string]string) (T, error)

Call renders the prompt with the provided variables, calls the LLM, and parses the response. Returns an error if no parser is configured and T is not string.

func (*LLMChain[T]) GetPrompt added in v0.15.0

func (c *LLMChain[T]) GetPrompt(vars map[string]string) string

GetPrompt renders the prompt without calling the LLM. Useful for debugging.

type LLMChainOption added in v0.15.0

type LLMChainOption[T any] func(*LLMChain[T])

LLMChainOption configures an LLMChain.

func WithLLMCallOptions added in v0.15.0

func WithLLMCallOptions[T any](opts ...llms.CallOption) LLMChainOption[T]

WithLLMCallOptions sets LLM call options (temperature, max tokens, etc.).

func WithOutputParser added in v0.15.0

func WithOutputParser[T any](parser schema.OutputParser[T]) LLMChainOption[T]

WithOutputParser sets a custom output parser for the chain.

type MapReduceChain added in v0.15.0

type MapReduceChain[In, Mid, Out any] struct {
	// MapFunc processes each input item.
	MapFunc func(ctx context.Context, input In) (Mid, error)
	// ReduceFunc combines all map results into the final output.
	ReduceFunc func(ctx context.Context, results []Mid) (Out, error)
	// MaxConcurrency limits concurrent map operations. 0 means unlimited.
	MaxConcurrency int
	// Timeout is the per-task timeout. 0 means no timeout.
	Timeout time.Duration
	// QuorumFraction is the fraction of tasks that must succeed (e.g., 0.66).
	QuorumFraction float64
}

MapReduceChain runs a map function concurrently over inputs, then reduces the collected results into a final output. Supports concurrency limits, per-task timeouts, and quorum-based early return.

func NewMapReduceChain added in v0.15.0

func NewMapReduceChain[In, Mid, Out any](
	mapFn func(ctx context.Context, input In) (Mid, error),
	reduceFn func(ctx context.Context, results []Mid) (Out, error),
	opts ...MapReduceOption[In, Mid, Out],
) *MapReduceChain[In, Mid, Out]

NewMapReduceChain creates a new MapReduceChain with the given map and reduce functions.

func (*MapReduceChain[In, Mid, Out]) Call added in v0.15.0

func (c *MapReduceChain[In, Mid, Out]) Call(ctx context.Context, inputs []In) (Out, error)

Call runs the map function over all inputs concurrently (with bounded concurrency), waits for quorum, then calls ReduceFunc with the results.

type MapReduceOption added in v0.15.0

type MapReduceOption[In, Mid, Out any] func(*MapReduceChain[In, Mid, Out])

MapReduceOption configures a MapReduceChain.

func WithMapTimeout added in v0.15.0

func WithMapTimeout[In, Mid, Out any](d time.Duration) MapReduceOption[In, Mid, Out]

WithMapTimeout sets a timeout for each individual map task.

func WithMaxConcurrency added in v0.15.0

func WithMaxConcurrency[In, Mid, Out any](n int) MapReduceOption[In, Mid, Out]

WithMaxConcurrency limits the number of map tasks running in parallel.

func WithQuorum added in v0.15.0

func WithQuorum[In, Mid, Out any](fraction float64) MapReduceOption[In, Mid, Out]

WithQuorum sets the fraction of map tasks that must succeed before the chain proceeds to the reduce step. For example, 0.66 means at least 2 out of 3 tasks must succeed.

type RetrievalQA

type RetrievalQA struct {
	// Retriever fetches relevant documents for the query.
	Retriever schema.Retriever
	// LLM generates the final answer.
	LLM llms.Model
	// PromptBuilder formats the query and documents into a prompt.
	// If nil, uses DefaultRAGPrompt.
	PromptBuilder func(query string, docs []schema.Document) (string, error)
}

RetrievalQA is a simple RAG chain that retrieves documents and generates an answer. It retrieves relevant documents using the Retriever, builds a prompt with the documents as context, and uses the LLM to generate an answer.

func NewRetrievalQA

func NewRetrievalQA(retriever schema.Retriever, llm llms.Model, opts ...RetrievalQAOption) (RetrievalQA, error)

NewRetrievalQA creates a new RetrievalQA chain. Returns an error if retriever or llm is nil.

func (RetrievalQA) Call

func (c RetrievalQA) Call(ctx context.Context, query string) (string, error)

Call retrieves relevant documents and generates an answer for the query. If no documents are retrieved, it calls the LLM directly with the query.

type RetrievalQAOption added in v0.15.0

type RetrievalQAOption func(*RetrievalQA)

RetrievalQAOption configures a RetrievalQA chain.

func WithPromptBuilder added in v0.15.0

func WithPromptBuilder(pb func(query string, docs []schema.Document) (string, error)) RetrievalQAOption

WithPromptBuilder sets a custom prompt builder function. The builder receives the query and retrieved documents and returns the formatted prompt string.

type ValidatingRetrievalQA added in v0.2.0

type ValidatingRetrievalQA struct {
	// Retriever fetches relevant documents for the query.
	Retriever schema.Retriever
	// GeneratorLLM generates the final answer.
	GeneratorLLM llms.Model
	// ValidatorLLM validates context relevance.
	ValidatorLLM llms.Model
	// contains filtered or unexported fields
}

ValidatingRetrievalQA validates the relevance of retrieved context before generation. It uses a separate validator LLM to check if the retrieved documents are relevant to the query before using them for generation.

func NewValidatingRetrievalQA added in v0.2.0

func NewValidatingRetrievalQA(retriever schema.Retriever, generator llms.Model, opts ...ValidatingRetrievalQAOption) (ValidatingRetrievalQA, error)

NewValidatingRetrievalQA creates a new ValidatingRetrievalQA chain. Returns an error if retriever, generator, or validator is nil.

func (*ValidatingRetrievalQA) Call added in v0.2.0

func (c *ValidatingRetrievalQA) Call(ctx context.Context, query string) (string, error)

Call retrieves documents, validates context relevance, and generates an answer. If the retrieved context is not relevant, it generates an answer without context.

type ValidatingRetrievalQAOption added in v0.2.0

type ValidatingRetrievalQAOption func(*ValidatingRetrievalQA)

ValidatingRetrievalQAOption configures a ValidatingRetrievalQA chain.

func WithLogger added in v0.2.0

func WithLogger(logger *slog.Logger) ValidatingRetrievalQAOption

WithLogger sets the logger for the chain.

func WithValidator added in v0.2.0

func WithValidator(llm llms.Model) ValidatingRetrievalQAOption

WithValidator sets the LLM used for context validation.
