langfuse

package
v0.1.8-rc.16
Warning: This package is not in the latest version of its module.

Published: Mar 19, 2026 License: Apache-2.0 Imports: 19 Imported by: 0

Documentation

Overview

Package langfuse provides observability and tracing for LLM interactions via the Langfuse service.

It solves the problem of capturing traces, spans, and generations for Genie's agent runs so that teams can debug, monitor, and analyze LLM usage in one place. When configured (LANGFUSE_* credentials), the package exports traces to Langfuse and can optionally sync prompt definitions. Without it, debugging agent behavior would rely solely on local logs and audit files.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func GetPrompt

func GetPrompt(ctx context.Context, name, defaultPrompt string) string

GetPrompt delegates to the global defaultClient's GetPrompt. Returns defaultPrompt when no client is configured. Exists to provide a convenient package-level API without requiring callers to manage a Client instance.

Types

type AgentUsageStats added in v0.1.7

type AgentUsageStats struct {
	// AgentName is the trace name (i.e. the agent identifier) in Langfuse.
	AgentName string `json:"agent_name"`
	// TotalCost is the total cost in USD for all observations belonging to
	// this agent's traces within the queried time window.
	TotalCost float64 `json:"total_cost"`
	// InputTokens is the sum of input tokens consumed across all observations.
	InputTokens float64 `json:"input_tokens"`
	// OutputTokens is the sum of output tokens produced across all observations.
	OutputTokens float64 `json:"output_tokens"`
	// TotalTokens is the sum of all tokens consumed across all observations.
	TotalTokens float64 `json:"total_tokens"`
	// Count is the total number of observations for this agent.
	Count float64 `json:"count"`
}

AgentUsageStats holds the aggregated token usage and cost statistics for a single agent within a given time window. Without this type, callers would have to parse raw Langfuse API responses and perform their own aggregation, which is error-prone and duplicated across consumers.

func GetAgentStats added in v0.1.7

func GetAgentStats(ctx context.Context, req GetAgentStatsRequest) ([]AgentUsageStats, error)

GetAgentStats delegates to the global defaultClient's GetAgentStats. Returns nil, nil when no client is configured (callers should handle accordingly). Exists to allow callers to get per-agent usage stats without managing their own Client instance.

type AnalyzeTracesRequest

type AnalyzeTracesRequest struct {
	// UserID filters traces to a specific user. Optional.
	UserID string

	// SessionID filters traces to a specific session. Optional.
	SessionID string

	// AgentName filters traces by agent (trace name). Optional.
	AgentName string

	// Duration is the lookback window from now. Required when using Analyze().
	Duration time.Duration

	// Tags filters traces to those matching these tags. Optional.
	Tags []string

	// Limit caps the number of traces fetched. Defaults to 100.
	Limit int
}

AnalyzeTracesRequest holds the parameters for analyzing traces. All filter fields are optional — omit them to widen the query.
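A sketch of how the Limit default might be applied, using a local copy of the struct (the normalize helper is illustrative, not part of the package):

```go
package main

import (
	"fmt"
	"time"
)

// AnalyzeTracesRequest mirrors the struct documented above (sketch).
type AnalyzeTracesRequest struct {
	UserID    string
	SessionID string
	AgentName string
	Duration  time.Duration
	Tags      []string
	Limit     int
}

// normalize applies the documented default: Limit falls back to 100 when unset.
func normalize(req AnalyzeTracesRequest) AnalyzeTracesRequest {
	if req.Limit == 0 {
		req.Limit = 100
	}
	return req
}

func main() {
	req := normalize(AnalyzeTracesRequest{
		AgentName: "research-agent", // hypothetical agent name
		Duration:  24 * time.Hour,   // required lookback window for Analyze()
	})
	fmt.Println(req.Limit)
}
```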

type Client

type Client interface {
	// GetPrompt returns the prompt template by name, or the default if not found/disabled.
	GetPrompt(ctx context.Context, name, defaultPrompt string) string

	// GetAgentStats returns aggregated token usage and cost statistics per
	// agent (trace name) for the duration specified in the request. Without
	// this method, consumers would need to call the Langfuse metrics API
	// directly, duplicating auth and parsing logic.
	GetAgentStats(ctx context.Context, req GetAgentStatsRequest) ([]AgentUsageStats, error)
}

Client defines the interface for interacting with the Langfuse API, including prompt management and usage metrics retrieval.

type Config

type Config struct {
	PublicKey     string `json:"public_key" toml:"public_key,omitempty" yaml:"public_key,omitempty"`
	SecretKey     string `json:"secret_key" toml:"secret_key,omitempty" yaml:"secret_key,omitempty"`
	Host          string `json:"host" toml:"host,omitempty" yaml:"host,omitempty"`
	EnablePrompts bool   `json:"enable_prompts" toml:"enable_prompts,omitempty" yaml:"enable_prompts,omitempty"`
}

Config holds the configuration for the Langfuse integration, which provides observability and tracing for LLM interactions.

func DefaultConfig

func DefaultConfig(ctx context.Context, sp security.SecretProvider) Config

DefaultConfig builds the default Langfuse configuration by resolving credentials through the given SecretProvider. Without a SecretProvider, callers can pass security.NewEnvProvider() to preserve the legacy os.Getenv behavior.

func (Config) Init

func (c Config) Init(ctx context.Context)

func (Config) LangfuseHost

func (c Config) LangfuseHost() string

LangfuseHost returns the full URL (with scheme) for the Langfuse HTTP API.

func (Config) NewClient

func (c Config) NewClient() Client

func (Config) NewTraceAnalyzer

func (c Config) NewTraceAnalyzer(opts ...Option) *TraceAnalyzer

NewTraceAnalyzer creates a TraceAnalyzer that queries the Langfuse API. Returns nil if the config is missing credentials.

func (Config) NewTraceClient

func (c Config) NewTraceClient() TraceClient

type GetAgentStatsRequest added in v0.1.7

type GetAgentStatsRequest struct {
	// Duration is the lookback period from now (e.g. 24h, 7*24h).
	Duration time.Duration
	// AgentName optionally restricts the query to a single agent.
	// When empty, stats for all agents are returned.
	AgentName string
}

GetAgentStatsRequest encapsulates the parameters for querying agent-level usage statistics from Langfuse. The Duration field specifies the lookback window from the current time. An optional AgentName can filter results to a single agent.

type GetObservationsRequest

type GetObservationsRequest struct {
	TraceID string

	// Page is the 1-based page number. Defaults to 1 when zero.
	Page int

	// Limit caps the number of observations per page.
	// Defaults to 100 when zero. Langfuse enforces a server-side max of 100.
	Limit int
}

GetObservationsRequest holds the input parameters for GetObservations.

type GetTraceRequest

type GetTraceRequest struct {
	TraceID string
}

GetTraceRequest holds the input parameters for GetTrace.

type ObsUsage

type ObsUsage struct {
	Input  int     `json:"input"`
	Output int     `json:"output"`
	Total  int     `json:"total"`
	Unit   string  `json:"unit,omitempty"`
	Cost   float64 `json:"totalCost,omitempty"`
}

ObsUsage holds token usage for a generation observation.

type Observation

type Observation struct {
	ID                  string         `json:"id"`
	TraceID             string         `json:"traceId"`
	ParentObservationID *string        `json:"parentObservationId"`
	Type                string         `json:"type"` // "GENERATION", "SPAN", "EVENT", "TOOL", "AGENT", "CHAIN"
	Name                string         `json:"name"`
	StartTime           time.Time      `json:"startTime"`
	EndTime             *time.Time     `json:"endTime,omitempty"`
	CompletionStartTime *time.Time     `json:"completionStartTime,omitempty"`
	Model               string         `json:"model,omitempty"`
	ModelParameters     map[string]any `json:"modelParameters,omitempty"`
	Input               any            `json:"input,omitempty"`
	Output              any            `json:"output,omitempty"`
	Metadata            any            `json:"metadata,omitempty"`
	Level               string         `json:"level,omitempty"`
	StatusMessage       string         `json:"statusMessage,omitempty"`
	Version             string         `json:"version,omitempty"`
	Environment         string         `json:"environment,omitempty"`
	Usage               *ObsUsage      `json:"usage,omitempty"`
	PromptName          *string        `json:"promptName,omitempty"`
	PromptVersion       *int           `json:"promptVersion,omitempty"`

	CalculatedTotalCost  float64 `json:"calculatedTotalCost,omitempty"`
	CalculatedInputCost  float64 `json:"calculatedInputCost,omitempty"`
	CalculatedOutputCost float64 `json:"calculatedOutputCost,omitempty"`

	TimeToFirstToken *float64       `json:"timeToFirstToken,omitempty"`
	Latency          float64        `json:"latency,omitempty"`
	UsageDetails     map[string]any `json:"usageDetails,omitempty"`
	CostDetails      map[string]any `json:"costDetails,omitempty"`
}

Observation represents a single observation (span, generation, or event) within a trace. Observations form a tree via ParentObservationID. Fields map 1-to-1 with the Langfuse GET /api/public/observations response.

func (Observation) TotalCost

func (o Observation) TotalCost() float64

TotalCost returns the best available cost for this observation. Langfuse populates calculatedTotalCost at the observation level; the usage.totalCost field is often zero. This helper prefers the observation-level value and falls back to usage.
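The fallback described above can be sketched with local stand-in types (only the cost-related fields are mirrored; the zero-check order is an assumption consistent with the doc comment):

```go
package main

import "fmt"

// Local mirrors of the documented types (sketch).
type ObsUsage struct {
	Total int
	Cost  float64 // usage.totalCost; often zero in practice
}

type Observation struct {
	CalculatedTotalCost float64
	Usage               *ObsUsage
}

// TotalCost prefers the observation-level calculatedTotalCost and falls back
// to usage.totalCost only when the former is zero.
func (o Observation) TotalCost() float64 {
	if o.CalculatedTotalCost != 0 {
		return o.CalculatedTotalCost
	}
	if o.Usage != nil {
		return o.Usage.Cost
	}
	return 0
}

func main() {
	withCalc := Observation{CalculatedTotalCost: 0.0042, Usage: &ObsUsage{Cost: 0.001}}
	fallback := Observation{Usage: &ObsUsage{Cost: 0.001}}
	fmt.Println(withCalc.TotalCost(), fallback.TotalCost())
}
```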

type Option

type Option func(*TraceAnalyzer)

func WithHTTPClient

func WithHTTPClient(httpClient *http.Client) Option

func WithObservationConcurrency

func WithObservationConcurrency(concurrency int) Option

type SubAgentDetail

type SubAgentDetail struct {
	Name      string `json:"name"`
	Input     string `json:"input,omitempty"`
	Output    string `json:"output,omitempty"`
	LLMCalls  int    `json:"llm_calls"`
	ToolCalls int    `json:"tool_calls"`
}

SubAgentDetail captures a sub-agent execution.

type ToolCallDetail

type ToolCallDetail struct {
	Name   string `json:"name"`
	Input  string `json:"input,omitempty"`
	Output string `json:"output,omitempty"`
	// ParentName is the name of the parent span/agent that made this call.
	ParentName string `json:"parent_name,omitempty"`
}

ToolCallDetail captures a single tool invocation.

type Trace

type Trace struct {
	ID        string    `json:"id"`
	Timestamp time.Time `json:"timestamp"`
	Name      string    `json:"name"`
	UserID    string    `json:"userId"`
	SessionID *string   `json:"sessionId"`
	Input     any       `json:"input"`
	Output    any       `json:"output"`
	Tags      []string  `json:"tags"`
	Version   string    `json:"version,omitempty"`
}

Trace represents a top-level trace entry from the Langfuse API. Each trace corresponds to one user request.

type TraceAnalysisResult

type TraceAnalysisResult struct {
	// TracesAnalyzed is the total number of traces processed.
	TracesAnalyzed int `json:"traces_analyzed"`

	// TraceDetails contains the per-trace execution breakdown.
	TraceDetails TraceDetails `json:"trace_details"`

	// Aggregated totals across all traces.
	TotalToolCalls      int `json:"total_tool_calls"`
	TotalLLMCalls       int `json:"total_llm_calls"`
	TotalSubAgents      int `json:"total_sub_agents"`
	TotalVectorStoreOps int `json:"total_vector_store_ops"`
	TotalInputTokens    int `json:"total_input_tokens"`
	TotalOutputTokens   int `json:"total_output_tokens"`
}

TraceAnalysisResult holds the aggregated analysis across all traces.
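A sketch of how per-trace breakdowns might roll up into these totals, with local stand-in types flattened to the summed fields (the aggregate helper is illustrative, not the package's implementation):

```go
package main

import "fmt"

// Local mirrors of the documented types (sketch; counts flattened to ints).
type TraceDetail struct {
	ToolCalls    int // len(ToolCalls) in the real type
	LLMCalls     int
	InputTokens  int
	OutputTokens int
}

type TraceAnalysisResult struct {
	TracesAnalyzed    int
	TotalToolCalls    int
	TotalLLMCalls     int
	TotalInputTokens  int
	TotalOutputTokens int
}

// aggregate sums each per-trace breakdown into the cross-trace totals.
func aggregate(details []TraceDetail) TraceAnalysisResult {
	r := TraceAnalysisResult{TracesAnalyzed: len(details)}
	for _, d := range details {
		r.TotalToolCalls += d.ToolCalls
		r.TotalLLMCalls += d.LLMCalls
		r.TotalInputTokens += d.InputTokens
		r.TotalOutputTokens += d.OutputTokens
	}
	return r
}

func main() {
	res := aggregate([]TraceDetail{
		{ToolCalls: 3, LLMCalls: 2, InputTokens: 1200, OutputTokens: 400},
		{ToolCalls: 1, LLMCalls: 1, InputTokens: 300, OutputTokens: 150},
	})
	fmt.Println(res.TracesAnalyzed, res.TotalToolCalls, res.TotalInputTokens)
}
```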

func (TraceAnalysisResult) FormatReport

func (r TraceAnalysisResult) FormatReport() string

FormatReport generates a human-readable report of the trace analysis.

type TraceAnalyzer

type TraceAnalyzer struct {
	// contains filtered or unexported fields
}

TraceAnalyzer analyzes Langfuse traces to produce execution breakdowns. It can fetch traces from the Langfuse API or analyze local exports.

func (*TraceAnalyzer) Analyze

Analyze fetches traces from the Langfuse API, then fetches observations for each trace in parallel, and produces an execution breakdown.

type TraceClient

type TraceClient interface {
	GetTrace(ctx context.Context, req GetTraceRequest) (*Trace, error)
	GetObservations(ctx context.Context, req GetObservationsRequest) ([]Observation, error)
}

TraceClient defines the interface for fetching Langfuse traces and observations.
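Observations returned by GetObservations form a tree via ParentObservationID (see the Observation type above). A sketch of indexing them for traversal, using local stand-in types:

```go
package main

import "fmt"

// Local mirror of the fields needed to build the observation tree (sketch).
type Observation struct {
	ID                  string
	ParentObservationID *string
	Name                string
}

// children indexes observations by parent ID; the empty key collects the
// roots (observations with a nil parent).
func children(obs []Observation) map[string][]Observation {
	byParent := make(map[string][]Observation)
	for _, o := range obs {
		key := ""
		if o.ParentObservationID != nil {
			key = *o.ParentObservationID
		}
		byParent[key] = append(byParent[key], o)
	}
	return byParent
}

func main() {
	root := "obs-1" // hypothetical observation IDs
	obs := []Observation{
		{ID: "obs-1", Name: "agent"},
		{ID: "obs-2", ParentObservationID: &root, Name: "tool-call"},
	}
	tree := children(obs)
	fmt.Println(len(tree[""]), len(tree["obs-1"]))
}
```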

type TraceDetail

type TraceDetail struct {
	// TraceID is the Langfuse trace ID.
	TraceID string `json:"trace_id"`

	// AgentName is the top-level agent (trace name).
	AgentName string `json:"agent_name"`

	// UserID is who made the request.
	UserID string `json:"user_id"`

	// SessionID is the conversation session.
	SessionID string `json:"session_id,omitempty"`

	// Timestamp is when the request was made.
	Timestamp time.Time `json:"timestamp"`

	// Input is the user's original request.
	Input string `json:"input"`

	// Output is the agent's final response (nil if no output).
	Output *string `json:"output"`

	// ToolCalls lists every top-level tool invocation in this trace.
	// Tool calls made within sub-agents (i.e., spans that are descendants of a sub-agent)
	// are not included here.
	ToolCalls []ToolCallDetail `json:"tool_calls"`

	// SubAgents lists every sub-agent that was created.
	SubAgents []SubAgentDetail `json:"sub_agents"`

	// LLMCalls is the total number of LLM generation calls.
	LLMCalls int `json:"llm_calls"`

	// VectorStoreOps is the number of vector store add operations.
	VectorStoreOps int `json:"vector_store_ops"`

	// InputTokens is the sum of input tokens across all generations.
	InputTokens int `json:"input_tokens"`

	// OutputTokens is the sum of output tokens across all generations.
	OutputTokens int `json:"output_tokens"`

	// TotalCost is the estimated total cost (USD).
	TotalCost float64 `json:"total_cost"`

	// Duration is the total trace duration (first observation to last).
	Duration time.Duration `json:"duration"`
}

TraceDetail is the execution breakdown of a single trace (one user request).

type TraceDetails

type TraceDetails []TraceDetail

func (TraceDetails) SortByTimestamp

func (t TraceDetails) SortByTimestamp()

SortByTimestamp sorts the trace details in place by ascending timestamp.

Directories

Path Synopsis
Code generated by counterfeiter.
