omniobserve

package module
v0.5.1
Published: Jan 19, 2026 License: MIT Imports: 3 Imported by: 0

README

OmniObserve


A unified Go library for LLM and ML observability. OmniObserve provides a vendor-agnostic abstraction layer that enables you to instrument your AI applications once and seamlessly switch between different observability backends without code changes.

Features

  • 🔗 Unified Interface: Single API for tracing, evaluation, prompts, and datasets across all providers
  • 🔄 Provider Agnostic: Switch between Opik, Langfuse, and Phoenix without changing your code
  • 🔍 Full Tracing: Trace LLM calls with spans, token usage, and cost tracking
  • 📊 Evaluation Support: Run metrics and add feedback scores to traces
  • 📦 Dataset Management: Create and manage evaluation datasets
  • 📝 Prompt Versioning: Store and version prompt templates (provider-dependent)
  • 🔀 Context Propagation: Automatic trace/span context propagation via context.Context
  • ⚙️ Functional Options: Clean, extensible configuration using the options pattern
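The functional options mentioned above follow Go's common options pattern, where each With* constructor returns a closure that mutates a config struct. A minimal stdlib-only sketch of the mechanism (the type and field names here are illustrative, not the library's actual internals):

```go
package main

import "fmt"

// Config holds the settings assembled by the options below.
type Config struct {
	APIKey      string
	ProjectName string
}

// Option mutates a Config; each With* constructor captures one setting.
type Option func(*Config)

func WithAPIKey(key string) Option       { return func(c *Config) { c.APIKey = key } }
func WithProjectName(name string) Option { return func(c *Config) { c.ProjectName = name } }

// New applies options over defaults, so callers pass only what they need.
func New(opts ...Option) *Config {
	cfg := &Config{ProjectName: "default"}
	for _, opt := range opts {
		opt(cfg)
	}
	return cfg
}

func main() {
	cfg := New(WithAPIKey("secret"), WithProjectName("my-project"))
	fmt.Println(cfg.ProjectName) // my-project
}
```

The pattern keeps the constructor signature stable while letting new options be added without breaking callers.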

Installation

go get github.com/agentplexus/omniobserve

Quick Start

package main

import (
    "context"
    "log"

    "github.com/agentplexus/omniobserve/llmops"
    _ "github.com/agentplexus/go-opik/llmops"  // Register Opik provider
)

func main() {
    // Open a provider
    provider, err := llmops.Open("opik",
        llmops.WithAPIKey("your-api-key"),
        llmops.WithProjectName("my-project"),
    )
    if err != nil {
        log.Fatal(err)
    }
    defer provider.Close()

    ctx := context.Background()

    // Start a trace
    ctx, trace, err := provider.StartTrace(ctx, "chat-workflow",
        llmops.WithTraceInput(map[string]any{"query": "Hello, world!"}),
    )
    if err != nil {
        log.Fatal(err)
    }
    defer trace.End()

    // Start a span for the LLM call
    ctx, span, err := provider.StartSpan(ctx, "gpt-4-completion",
        llmops.WithSpanType(llmops.SpanTypeLLM),
        llmops.WithModel("gpt-4"),
        llmops.WithProvider("openai"),
    )
    if err != nil {
        log.Fatal(err)
    }

    // Record the LLM interaction
    span.SetInput(map[string]any{
        "messages": []map[string]string{
            {"role": "user", "content": "Hello!"},
        },
    })

    // ... call your LLM here ...

    span.SetOutput(map[string]any{
        "response": "Hello! How can I help you today?",
    })
    span.SetUsage(llmops.TokenUsage{
        PromptTokens:     10,
        CompletionTokens: 8,
        TotalTokens:      18,
    })

    span.End()
    trace.SetOutput(map[string]any{"response": "Hello! How can I help you today?"})
}

Supported Providers

Provider   Package                       Description
Opik       go-opik/llmops                Comet Opik - Open-source, full-featured
Langfuse   omniobserve/llmops/langfuse   Cloud & self-hosted, batch ingestion
Phoenix    go-phoenix/llmops             Arize Phoenix - OpenTelemetry-based

Provider Capabilities

Feature              Opik     Langfuse   Phoenix
Tracing              ✅       ✅         ✅
Evaluation           ✅       ✅         ✅
Prompts              ✅       Partial    ❌
Datasets             ✅       ✅         Partial
Experiments          ✅       ✅         Partial
Streaming            ✅       ✅         Planned
Distributed Tracing  ✅       ❌         ✅
Cost Tracking        ✅       ✅         ❌
OpenTelemetry        ❌       ❌         ✅

Architecture

omniobserve/
├── omniobserve.go       # Main package with re-exports
├── llmops/              # LLM observability interfaces
│   ├── llmops.go        # Core interfaces (Provider, Tracer, Evaluator, etc.)
│   ├── trace.go         # Trace and Span interfaces
│   ├── types.go         # Data types (EvalInput, Dataset, Prompt, etc.)
│   ├── options.go       # Functional options
│   ├── provider.go      # Provider registration system
│   ├── errors.go        # Error definitions
│   ├── metrics/         # Evaluation metrics (hallucination, relevance, etc.)
│   └── langfuse/        # Langfuse provider adapter
├── integrations/        # Integrations with LLM libraries
│   └── omnillm/         # OmniLLM observability hook (separate module)
├── examples/            # Usage examples
│   └── evaluation/      # Metrics evaluation example
├── mlops/               # ML operations interfaces (experiments, model registry)
└── sdk/                 # Provider-specific SDKs
    └── langfuse/        # Langfuse Go SDK

# Provider adapters in standalone SDKs:
# github.com/agentplexus/go-opik/llmops      # Opik provider
# github.com/agentplexus/go-phoenix/llmops   # Phoenix provider
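The blank imports in the examples (e.g. `_ "github.com/agentplexus/go-opik/llmops"`) suggest a registration scheme in the style of database/sql's driver registry: each provider package registers a factory in its init(), and Open looks it up by name. A minimal sketch of that mechanism, with hypothetical types standing in for the real ones:

```go
package main

import (
	"fmt"
	"sort"
	"sync"
)

// Provider is a stand-in for the llmops.Provider interface.
type Provider interface{ Name() string }

var (
	mu       sync.RWMutex
	registry = map[string]func() Provider{}
)

// Register is typically called from a provider package's init().
func Register(name string, factory func() Provider) {
	mu.Lock()
	defer mu.Unlock()
	registry[name] = factory
}

// Open instantiates a registered provider by name.
func Open(name string) (Provider, error) {
	mu.RLock()
	factory, ok := registry[name]
	mu.RUnlock()
	if !ok {
		return nil, fmt.Errorf("llmops: unknown provider %q", name)
	}
	return factory(), nil
}

// Providers returns the registered provider names, sorted.
func Providers() []string {
	mu.RLock()
	defer mu.RUnlock()
	names := make([]string, 0, len(registry))
	for n := range registry {
		names = append(names, n)
	}
	sort.Strings(names)
	return names
}

type fakeProvider struct{ name string }

func (p fakeProvider) Name() string { return p.name }

func main() {
	Register("opik", func() Provider { return fakeProvider{"opik"} })
	p, err := Open("opik")
	if err != nil {
		panic(err)
	}
	fmt.Println(p.Name()) // opik
}
```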

Core Interfaces

Provider

The main interface that all observability backends implement:

type Provider interface {
    Tracer            // Trace/span operations
    Evaluator         // Evaluation and feedback
    PromptManager     // Prompt template management
    DatasetManager    // Test dataset management
    ProjectManager    // Project/workspace management
    AnnotationManager // Span/trace annotations
    io.Closer

    Name() string
}
Trace and Span
type Trace interface {
    ID() string
    Name() string
    StartSpan(ctx context.Context, name string, opts ...SpanOption) (context.Context, Span, error)
    SetInput(input any)
    SetOutput(output any)
    SetMetadata(metadata map[string]any)
    AddTag(key, value string)
    AddFeedbackScore(ctx context.Context, name string, score float64, opts ...FeedbackOption) error
    End(opts ...EndOption)
}

type Span interface {
    ID() string
    TraceID() string
    Name() string
    Type() SpanType
    StartSpan(ctx context.Context, name string, opts ...SpanOption) (context.Context, Span, error)
    SetInput(input any)
    SetOutput(output any)
    SetModel(model string)
    SetProvider(provider string)
    SetUsage(usage TokenUsage)
    End(opts ...EndOption)
}
Span Types
const (
    SpanTypeGeneral   SpanType = "general"
    SpanTypeLLM       SpanType = "llm"
    SpanTypeTool      SpanType = "tool"
    SpanTypeRetrieval SpanType = "retrieval"
    SpanTypeAgent     SpanType = "agent"
    SpanTypeChain     SpanType = "chain"
    SpanTypeGuardrail SpanType = "guardrail"
)

Usage Examples

Using Different Providers
// Opik
import _ "github.com/agentplexus/go-opik/llmops"
provider, _ := llmops.Open("opik", llmops.WithAPIKey("..."))

// Langfuse
import _ "github.com/agentplexus/omniobserve/llmops/langfuse"
provider, _ := llmops.Open("langfuse",
    llmops.WithAPIKey("sk-lf-..."),
    llmops.WithEndpoint("https://cloud.langfuse.com"),
)

// Phoenix
import _ "github.com/agentplexus/go-phoenix/llmops"
provider, _ := llmops.Open("phoenix",
    llmops.WithEndpoint("http://localhost:6006"),
)
Nested Spans
ctx, trace, _ := provider.StartTrace(ctx, "rag-pipeline")
defer trace.End()

// Retrieval span
ctx, retrievalSpan, _ := provider.StartSpan(ctx, "vector-search",
    llmops.WithSpanType(llmops.SpanTypeRetrieval),
)
// ... perform retrieval ...
retrievalSpan.SetOutput(documents)
retrievalSpan.End()

// LLM span
ctx, llmSpan, _ := provider.StartSpan(ctx, "generate-response",
    llmops.WithSpanType(llmops.SpanTypeLLM),
    llmops.WithModel("gpt-4"),
)
// ... call LLM ...
llmSpan.SetUsage(llmops.TokenUsage{
    PromptTokens:     150,
    CompletionTokens: 50,
    TotalTokens:      200,
})
llmSpan.End()
Adding Feedback Scores
// Add a score to a span
span.AddFeedbackScore(ctx, "relevance", 0.95,
    llmops.WithFeedbackReason("Response directly addressed the query"),
    llmops.WithFeedbackCategory("quality"),
)

// Add a score to a trace
trace.AddFeedbackScore(ctx, "user_satisfaction", 0.8)
Working with Datasets
// Create a dataset
dataset, _ := provider.CreateDataset(ctx, "test-cases",
    llmops.WithDatasetDescription("Test cases for RAG evaluation"),
)

// Add items
provider.AddDatasetItems(ctx, "test-cases", []llmops.DatasetItem{
    {
        Input:    map[string]any{"query": "What is Go?"},
        Expected: map[string]any{"answer": "Go is a programming language..."},
    },
})
Working with Prompts (Opik)
// Create a versioned prompt
prompt, _ := provider.CreatePrompt(ctx, "chat-template",
    `You are a helpful assistant. User: {{.query}}`,
    llmops.WithPromptDescription("Main chat template"),
)

// Get a prompt
prompt, _ = provider.GetPrompt(ctx, "chat-template")

// Render with variables
rendered := prompt.Render(map[string]any{"query": "Hello!"})
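The {{.query}} placeholder above matches Go's text/template syntax, so rendering can be sketched with the standard library. This is an illustration of the behavior, assuming text/template semantics; the library's actual Render implementation may differ:

```go
package main

import (
	"bytes"
	"fmt"
	"text/template"
)

// renderPrompt fills a Go text/template with the given variables.
func renderPrompt(tmpl string, vars map[string]any) (string, error) {
	t, err := template.New("prompt").Parse(tmpl)
	if err != nil {
		return "", err
	}
	var buf bytes.Buffer
	if err := t.Execute(&buf, vars); err != nil {
		return "", err
	}
	return buf.String(), nil
}

func main() {
	out, err := renderPrompt(`You are a helpful assistant. User: {{.query}}`,
		map[string]any{"query": "Hello!"})
	if err != nil {
		panic(err)
	}
	fmt.Println(out) // You are a helpful assistant. User: Hello!
}
```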

Configuration Options

Client Options
llmops.WithAPIKey("...")           // API key for authentication
llmops.WithEndpoint("...")         // Custom endpoint URL
llmops.WithWorkspace("...")        // Workspace/organization name
llmops.WithProjectName("...")      // Default project name
llmops.WithHTTPClient(client)      // Custom HTTP client
llmops.WithTimeout(30 * time.Second)
llmops.WithDisabled(true)          // Disable tracing (no-op mode)
llmops.WithDebug(true)             // Enable debug logging
Trace Options
llmops.WithTraceProject("...")
llmops.WithTraceInput(input)
llmops.WithTraceOutput(output)
llmops.WithTraceMetadata(map[string]any{...})
llmops.WithTraceTags(map[string]string{...})
llmops.WithThreadID("...")
Span Options
llmops.WithSpanType(llmops.SpanTypeLLM)
llmops.WithSpanInput(input)
llmops.WithSpanOutput(output)
llmops.WithSpanMetadata(map[string]any{...})
llmops.WithModel("gpt-4")
llmops.WithProvider("openai")
llmops.WithTokenUsage(usage)
llmops.WithParentSpan(parentSpan)

Error Handling

The library provides typed errors for common conditions:

if errors.Is(err, llmops.ErrMissingAPIKey) {
    // Handle missing API key
}

if llmops.IsNotFound(err) {
    // Handle not found
}

if llmops.IsRateLimited(err) {
    // Handle rate limiting
}
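Sentinel errors and predicate helpers like these are conventionally built on errors.Is and %w wrapping. A sketch of one way ErrMissingAPIKey and IsNotFound could be wired up (the exact definitions in the library may differ):

```go
package main

import (
	"errors"
	"fmt"
)

// Sentinel errors that callers match with errors.Is.
var (
	ErrMissingAPIKey = errors.New("llmops: missing API key")
	ErrNotFound      = errors.New("llmops: not found")
)

// IsNotFound reports whether err wraps ErrNotFound.
func IsNotFound(err error) bool { return errors.Is(err, ErrNotFound) }

func lookup(name string) error {
	// Wrap the sentinel so callers keep the context but can still match it.
	return fmt.Errorf("dataset %q: %w", name, ErrNotFound)
}

func main() {
	err := lookup("test-cases")
	if IsNotFound(err) {
		fmt.Println("not found:", err)
	}
}
```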

Direct SDK Access

For provider-specific features, you can use the underlying SDKs directly:

import "github.com/agentplexus/omniobserve/sdk/langfuse"  // Langfuse SDK
import "github.com/agentplexus/go-opik"                   // Opik SDK
import "github.com/agentplexus/go-phoenix"                // Phoenix SDK

OmniLLM Integration

OmniObserve provides an integration with OmniLLM, a multi-LLM abstraction layer. This allows you to automatically instrument all LLM calls made through OmniLLM with any OmniObserve provider.

go get github.com/agentplexus/omniobserve/integrations/omnillm
package main

import (
    "github.com/agentplexus/omnillm"
    omnillmhook "github.com/agentplexus/omniobserve/integrations/omnillm"
    "github.com/agentplexus/omniobserve/llmops"
    _ "github.com/agentplexus/go-opik/llmops"
)

func main() {
    // Initialize an OmniObserve provider
    provider, _ := llmops.Open("opik",
        llmops.WithAPIKey("your-api-key"),
        llmops.WithProjectName("my-project"),
    )
    defer provider.Close()

    // Create the observability hook
    hook := omnillmhook.NewHook(provider)

    // Attach to your OmniLLM client
    client := omnillm.NewClient(
        omnillm.WithObservabilityHook(hook),
    )

    // All LLM calls through this client are now automatically traced
}

The hook automatically captures:

  • Model and provider information
  • Input messages and output responses
  • Token usage (prompt, completion, total)
  • Streaming responses
  • Errors

The hook also automatically creates traces when none exists in context, ensuring all LLM calls are properly traced.

Requirements

  • Go 1.24.5 or later

Contributing

Contributions are welcome! Please feel free to submit issues and pull requests.

License

See LICENSE for details.

Documentation

Overview

Package omniobserve provides a unified interface for LLM, ML, and general service observability platforms.

This library abstracts common functionality across providers like:

  • Comet Opik
  • Arize Phoenix
  • Langfuse
  • New Relic
  • Datadog
  • OpenTelemetry (OTLP)

Quick Start - LLM Observability

Import the provider you want to use:

import (
	"github.com/agentplexus/omniobserve/llmops"
	_ "github.com/agentplexus/go-opik/llmops"             // Register Opik
	// or
	_ "github.com/agentplexus/omniobserve/llmops/langfuse" // Register Langfuse
	// or
	_ "github.com/agentplexus/go-phoenix/llmops"           // Register Phoenix
)

Then open a provider:

provider, err := llmops.Open("opik", llmops.WithAPIKey("..."))
if err != nil {
	log.Fatal(err)
}
defer provider.Close()

Start tracing:

ctx, trace, _ := provider.StartTrace(ctx, "my-workflow")
defer trace.End()

ctx, span, _ := provider.StartSpan(ctx, "llm-call",
	llmops.WithSpanType(llmops.SpanTypeLLM),
	llmops.WithModel("gpt-4"),
)
defer span.End()

Quick Start - General Service Observability

Import the provider you want to use:

import (
	"github.com/agentplexus/omniobserve/observops"
	_ "github.com/agentplexus/omniobserve/observops/otlp"     // OTLP exporter
	// or
	_ "github.com/agentplexus/omniobserve/observops/newrelic" // New Relic
	// or
	_ "github.com/agentplexus/omniobserve/observops/datadog"  // Datadog
)

Then open a provider:

provider, err := observops.Open("otlp",
	observops.WithEndpoint("localhost:4317"),
	observops.WithServiceName("my-service"),
)
if err != nil {
	log.Fatal(err)
}
defer provider.Shutdown(context.Background())

Use metrics, traces, and logs:

// Metrics
counter, _ := provider.Meter().Counter("requests_total")
counter.Add(ctx, 1, observops.WithAttributes(observops.Attribute("method", "GET")))

// Traces
ctx, span := provider.Tracer().Start(ctx, "ProcessRequest")
defer span.End()

// Logs
provider.Logger().Info(ctx, "Request processed", observops.LogAttr("user_id", "123"))
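The Counter API shape above (increment by a delta, qualified by attributes) can be illustrated with a tiny in-memory meter. This is stdlib-only and keys attributes by a flattened string for simplicity; it is not the observops implementation, which delegates to the configured backend:

```go
package main

import (
	"fmt"
	"sync"
)

// counter accumulates values per attribute set, keyed by a flattened string.
type counter struct {
	mu     sync.Mutex
	counts map[string]float64
}

func newCounter() *counter { return &counter{counts: map[string]float64{}} }

// Add increments the counter for the given attribute key (e.g. "method=GET").
func (c *counter) Add(v float64, attrs string) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.counts[attrs] += v
}

// Value returns the accumulated total for an attribute key.
func (c *counter) Value(attrs string) float64 {
	c.mu.Lock()
	defer c.mu.Unlock()
	return c.counts[attrs]
}

func main() {
	reqs := newCounter()
	reqs.Add(1, "method=GET")
	reqs.Add(1, "method=GET")
	fmt.Println(reqs.Value("method=GET")) // 2
}
```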

Architecture

The library is organized into three main packages:

  • llmops: LLM observability (traces, spans, evaluations, prompts)
  • mlops: ML operations (experiments, model registry, artifacts)
  • observops: General service observability (metrics, traces, logs via OpenTelemetry)

Each package defines interfaces that providers implement. Provider-specific implementations are in subpackages (e.g., llmops/opik, observops/newrelic).

LLM Observability Providers

  • opik: Comet Opik (open-source, self-hosted)
  • langfuse: Langfuse (open-source, cloud & self-hosted)
  • phoenix: Arize Phoenix (open-source, uses OpenTelemetry)

General Observability Providers

  • otlp: OpenTelemetry Protocol (vendor-agnostic)
  • newrelic: New Relic
  • datadog: Datadog

Features

LLM Observability:

  • Trace/span creation and context propagation
  • Input/output capture
  • Token usage and cost tracking
  • Feedback scores and evaluations
  • Dataset management
  • Prompt versioning (provider-dependent)

General Observability:

  • Metrics (counters, gauges, histograms)
  • Distributed tracing with spans
  • Structured logging with trace correlation
  • Vendor-agnostic via OpenTelemetry

SDK Access

For provider-specific features, you can use the underlying SDKs directly:

import "github.com/agentplexus/go-opik"    // Opik SDK
import "github.com/agentplexus/go-phoenix" // Phoenix SDK
import "github.com/agentplexus/omniobserve/sdk/langfuse"

Index

Constants

const (
	SpanTypeGeneral   = llmops.SpanTypeGeneral
	SpanTypeLLM       = llmops.SpanTypeLLM
	SpanTypeTool      = llmops.SpanTypeTool
	SpanTypeRetrieval = llmops.SpanTypeRetrieval
	SpanTypeAgent     = llmops.SpanTypeAgent
	SpanTypeChain     = llmops.SpanTypeChain
	SpanTypeGuardrail = llmops.SpanTypeGuardrail
)

Span type constants.

const Version = "0.1.0"

Version is the library version.

Variables

var (
	// Client options
	WithAPIKey      = llmops.WithAPIKey
	WithEndpoint    = llmops.WithEndpoint
	WithWorkspace   = llmops.WithWorkspace
	WithProjectName = llmops.WithProjectName
	WithHTTPClient  = llmops.WithHTTPClient
	WithTimeout     = llmops.WithTimeout
	WithDisabled    = llmops.WithDisabled
	WithDebug       = llmops.WithDebug

	// Trace options
	WithTraceProject  = llmops.WithTraceProject
	WithTraceInput    = llmops.WithTraceInput
	WithTraceOutput   = llmops.WithTraceOutput
	WithTraceMetadata = llmops.WithTraceMetadata
	WithTraceTags     = llmops.WithTraceTags
	WithThreadID      = llmops.WithThreadID

	// Span options
	WithSpanType     = llmops.WithSpanType
	WithSpanInput    = llmops.WithSpanInput
	WithSpanOutput   = llmops.WithSpanOutput
	WithSpanMetadata = llmops.WithSpanMetadata
	WithSpanTags     = llmops.WithSpanTags
	WithModel        = llmops.WithModel
	WithProvider     = llmops.WithProvider
	WithTokenUsage   = llmops.WithTokenUsage

	// End options
	WithEndOutput   = llmops.WithEndOutput
	WithEndMetadata = llmops.WithEndMetadata
	WithEndError    = llmops.WithEndError
)

Re-export option functions for convenience.

var (
	// Observops client options
	ObsWithServiceName    = observops.WithServiceName
	ObsWithServiceVersion = observops.WithServiceVersion
	ObsWithEndpoint       = observops.WithEndpoint
	ObsWithAPIKey         = observops.WithAPIKey
	ObsWithInsecure       = observops.WithInsecure
	ObsWithHeaders        = observops.WithHeaders
	ObsWithResource       = observops.WithResource
	ObsWithBatchTimeout   = observops.WithBatchTimeout
	ObsWithBatchSize      = observops.WithBatchSize
	ObsWithDisabled       = observops.WithDisabled
	ObsWithDebug          = observops.WithDebug

	// Metric options
	ObsWithDescription = observops.WithDescription
	ObsWithUnit        = observops.WithUnit

	// Record options
	ObsWithAttributes = observops.WithAttributes

	// Span options
	ObsWithSpanKind       = observops.WithSpanKind
	ObsWithSpanAttributes = observops.WithSpanAttributes
	ObsWithSpanLinks      = observops.WithSpanLinks

	// Attribute helper
	ObsAttribute = observops.Attribute
	ObsLogAttr   = observops.LogAttr
)

Re-export observops option functions for convenience.

Functions

func ObservopsProviders

func ObservopsProviders() []string

ObservopsProviders returns the names of registered general observability providers.

func OpenLLMOps

func OpenLLMOps(name string, opts ...llmops.ClientOption) (llmops.Provider, error)

OpenLLMOps opens an LLM observability provider. This is a convenience function that wraps llmops.Open.

func OpenObservops

func OpenObservops(name string, opts ...observops.ClientOption) (observops.Provider, error)

OpenObservops opens a general observability provider. This is a convenience function that wraps observops.Open.

func Providers

func Providers() []string

Providers returns the names of registered LLM observability providers.

Types

type Counter

type Counter = observops.Counter

Counter is an alias for observops.Counter.

type EvalInput

type EvalInput = llmops.EvalInput

EvalInput is an alias for llmops.EvalInput.

type EvalResult

type EvalResult = llmops.EvalResult

EvalResult is an alias for llmops.EvalResult.

type Experiment

type Experiment = mlops.Experiment

Experiment is an alias for mlops.Experiment.

type Gauge

type Gauge = observops.Gauge

Gauge is an alias for observops.Gauge.

type Histogram

type Histogram = observops.Histogram

Histogram is an alias for observops.Histogram.

type Logger

type Logger = observops.Logger

Logger is an alias for observops.Logger.

type MLProvider

type MLProvider = mlops.Provider

MLProvider is an alias for mlops.Provider.

type Meter

type Meter = observops.Meter

Meter is an alias for observops.Meter.

type Model

type Model = mlops.Model

Model is an alias for mlops.Model.

type ObservopsProvider

type ObservopsProvider = observops.Provider

ObservopsProvider is an alias for observops.Provider.

type ObservopsSpan

type ObservopsSpan = observops.Span

ObservopsSpan is an alias for observops.Span.

type Provider

type Provider = llmops.Provider

Provider is an alias for llmops.Provider.

type Run

type Run = mlops.Run

Run is an alias for mlops.Run.

type Span

type Span = llmops.Span

Span is an alias for llmops.Span.

type SpanType

type SpanType = llmops.SpanType

SpanType is an alias for llmops.SpanType.

type TokenUsage

type TokenUsage = llmops.TokenUsage

TokenUsage is an alias for llmops.TokenUsage.

type Trace

type Trace = llmops.Trace

Trace is an alias for llmops.Trace.

type Tracer

type Tracer = observops.Tracer

Tracer is an alias for observops.Tracer.

Directories

Path             Synopsis
agentops         Package agentops provides observability for multi-agent systems.
  ent
  middleware     Package middleware provides reusable instrumentation helpers for multi-agent systems.
  postgres       Package postgres provides a PostgreSQL backend for agentops using Ent.
examples
  evaluation     (command) Example: evaluation
integrations
  observability  Package observability provides integration helpers for using observops with multi-agent systems and general Go applications.
  omnillm        Package omnillm provides an ObservabilityHook implementation for OmniLLM that integrates with OmniObserve's llmops providers (Opik, Langfuse, Phoenix).
llmops           Package llmops provides a unified interface for LLM observability platforms.
  langfuse       Package langfuse provides a Langfuse adapter for the llmops abstraction.
  metrics        Package metrics provides evaluation metrics for LLM observability.
mlops            Package mlops provides interfaces for ML operations platforms.
observops        Package observops provides a unified interface for general service observability platforms.
  datadog        Package datadog provides a Datadog observability provider for observops.
  newrelic       Package newrelic provides a New Relic observability provider for observops.
  otlp           Package otlp provides an OpenTelemetry Protocol (OTLP) exporter for observops.
sdk
  langfuse       Package langfuse provides a Go SDK for Langfuse, an open-source LLM observability platform.
semconv
  agent          Package agent provides OpenTelemetry semantic conventions for Agentic AI.
