Documentation ¶
Index ¶
- func GenerateRequestID() string
- func GenerateTraceID() string
- func NewLogger(level slog.Level) *slog.Logger
- func ParseNonStreamingResponse(body []byte) (model, msgID string, u anthropic.Usage, err error)
- func ParseStreamEvent(data []byte) (eventType, model, msgID string, u anthropic.Usage, hasUsage bool)
- type Config
- type QueryOpts
- type SandboxInfo
- type Server
- type Store
- func (s *Store) Close() error
- func (s *Store) GetOrCreateTrace(traceID, sandboxID, workspaceID, source string) (*Trace, error)
- func (s *Store) GetTraceDetail(traceID string) (*Trace, []TokenUsage, error)
- func (s *Store) QueryTraces(opts QueryOpts) ([]TraceWithStats, int64, error)
- func (s *Store) QueryUsage(opts QueryOpts) ([]UsageSummary, error)
- func (s *Store) RecordUsage(u TokenUsage) error
- func (s *Store) UpdateTraceActivity(traceID string) error
- type TokenUsage
- type Trace
- type TraceWithStats
- type UsageSummary
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func GenerateRequestID ¶
func GenerateRequestID() string
GenerateRequestID creates a new request ID with the "ar-" prefix.
func GenerateTraceID ¶
func GenerateTraceID() string
GenerateTraceID creates a new trace ID with the "at-" prefix.
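The package does not document the exact ID format beyond the prefixes, but a prefix-plus-random-suffix scheme is the usual shape. This sketch assumes a random hex suffix; the real generators may differ in length and encoding:

```go
package main

import (
	"crypto/rand"
	"encoding/hex"
	"fmt"
)

// newID sketches a prefix-plus-random-suffix scheme consistent with the
// documented "ar-"/"at-" prefixes. The 12-byte suffix and hex encoding
// are assumptions; the package does not document its exact format.
func newID(prefix string) string {
	b := make([]byte, 12)
	if _, err := rand.Read(b); err != nil {
		panic(err) // crypto/rand does not fail on supported platforms
	}
	return prefix + hex.EncodeToString(b)
}

func main() {
	fmt.Println(newID("ar-")) // request ID, e.g. ar-3f9c…
	fmt.Println(newID("at-")) // trace ID, e.g. at-a01b…
}
```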
func ParseNonStreamingResponse ¶
func ParseNonStreamingResponse(body []byte) (model, msgID string, u anthropic.Usage, err error)
ParseNonStreamingResponse parses a complete JSON response from the Anthropic messages API.
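The documented signature returns an anthropic.Usage from the official SDK; a minimal sketch of the same extraction with a local stand-in type, assuming the public /v1/messages response shape (`id`, `model`, and a `usage` object):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// usage stands in for anthropic.Usage, mirroring the token fields of the
// public Anthropic messages response.
type usage struct {
	InputTokens              int64 `json:"input_tokens"`
	OutputTokens             int64 `json:"output_tokens"`
	CacheCreationInputTokens int64 `json:"cache_creation_input_tokens"`
	CacheReadInputTokens     int64 `json:"cache_read_input_tokens"`
}

// parseNonStreamingResponse sketches what the documented function extracts
// from a complete response body: model, message ID, and token usage.
func parseNonStreamingResponse(body []byte) (model, msgID string, u usage, err error) {
	var resp struct {
		ID    string `json:"id"`
		Model string `json:"model"`
		Usage usage  `json:"usage"`
	}
	if err = json.Unmarshal(body, &resp); err != nil {
		return "", "", usage{}, err
	}
	return resp.Model, resp.ID, resp.Usage, nil
}

func main() {
	body := []byte(`{"id":"msg_01","model":"claude-sonnet","usage":{"input_tokens":10,"output_tokens":25}}`)
	model, msgID, u, err := parseNonStreamingResponse(body)
	fmt.Println(model, msgID, u.InputTokens, u.OutputTokens, err)
}
```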
Types ¶
type Config ¶
type Config struct {
ListenAddr string // HTTP listen address, e.g. ":8081"
DatabaseURL string // proxy's own PostgreSQL connection URL
AgentserverURL string // agentserver internal API URL for token validation
AnthropicBaseURL string // upstream Anthropic API URL
AnthropicAPIKey string // real Anthropic API key
AnthropicAuthToken string // alternative: Bearer token auth
TraceHeader string // custom trace header name
}
Config holds all configuration for the LLM proxy.
func LoadConfigFromEnv ¶
func LoadConfigFromEnv() Config
LoadConfigFromEnv reads configuration from environment variables.
type SandboxInfo ¶
type SandboxInfo struct {
ID string `json:"sandbox_id"`
WorkspaceID string `json:"workspace_id"`
Status string `json:"status"`
}
SandboxInfo is returned by the agentserver token validation API.
type Server ¶
type Server struct {
// contains filtered or unexported fields
}
Server is the LLM proxy HTTP server.
func (*Server) ExtractTraceID ¶
ExtractTraceID extracts a trace ID from the request. Priority: custom header → Claude Code metadata → auto-generate. Returns (traceID, source).
func (*Server) ValidateProxyToken ¶
ValidateProxyToken calls the agentserver internal API to validate a sandbox proxy token. An invalid token yields a nil result with a nil error; a non-nil error indicates a transport or server failure, not a rejected token.
type Store ¶
type Store struct {
// contains filtered or unexported fields
}
Store wraps a *sql.DB for the LLM proxy's own database.
func (*Store) GetOrCreateTrace ¶
func (s *Store) GetOrCreateTrace(traceID, sandboxID, workspaceID, source string) (*Trace, error)
GetOrCreateTrace returns an existing trace or creates a new one. Uses INSERT ... ON CONFLICT to avoid TOCTOU races under concurrent requests.
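The race-free semantics can be illustrated in memory with sync.Map.LoadOrStore: concurrent callers may race, but exactly one insert wins and every caller observes the same row. The real Store gets the same guarantee from PostgreSQL via a single INSERT ... ON CONFLICT statement rather than a SELECT-then-INSERT; the in-memory type below is illustrative only:

```go
package main

import (
	"fmt"
	"sync"
)

// memTrace is an in-memory stand-in for the package's Trace row.
type memTrace struct {
	ID, SandboxID, WorkspaceID, Source string
}

var traces sync.Map // traceID → *memTrace

// getOrCreate mirrors the documented get-or-create semantics: the first
// writer's row wins, and later callers for the same ID see that row.
func getOrCreate(id, sandboxID, workspaceID, source string) *memTrace {
	fresh := &memTrace{ID: id, SandboxID: sandboxID, WorkspaceID: workspaceID, Source: source}
	actual, _ := traces.LoadOrStore(id, fresh)
	return actual.(*memTrace)
}

func main() {
	a := getOrCreate("at-1", "sb-1", "ws-1", "header")
	b := getOrCreate("at-1", "sb-2", "ws-2", "metadata") // same ID: sees a's row
	fmt.Println(a == b, b.Source)                        // true header
}
```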
func (*Store) GetTraceDetail ¶
func (s *Store) GetTraceDetail(traceID string) (*Trace, []TokenUsage, error)
GetTraceDetail returns a trace and all its usage records.
func (*Store) QueryTraces ¶
func (s *Store) QueryTraces(opts QueryOpts) ([]TraceWithStats, int64, error)
QueryTraces returns traces with aggregated statistics and total count.
func (*Store) QueryUsage ¶
func (s *Store) QueryUsage(opts QueryOpts) ([]UsageSummary, error)
QueryUsage returns aggregated usage grouped by provider and model.
func (*Store) RecordUsage ¶
func (s *Store) RecordUsage(u TokenUsage) error
RecordUsage inserts a single API request usage record.
func (*Store) UpdateTraceActivity ¶
func (s *Store) UpdateTraceActivity(traceID string) error
UpdateTraceActivity updates the updated_at timestamp on a trace.
type TokenUsage ¶
type TokenUsage struct {
ID string `json:"id"`
TraceID string `json:"traceId,omitempty"`
SandboxID string `json:"sandboxId"`
WorkspaceID string `json:"workspaceId"`
Provider string `json:"provider"`
Model string `json:"model"`
MessageID string `json:"messageId,omitempty"`
InputTokens int64 `json:"inputTokens"`
OutputTokens int64 `json:"outputTokens"`
CacheCreationInputTokens int64 `json:"cacheCreationInputTokens"`
CacheReadInputTokens int64 `json:"cacheReadInputTokens"`
Streaming bool `json:"streaming"`
Duration int64 `json:"duration"`
TTFT int64 `json:"ttft"`
CreatedAt time.Time `json:"createdAt"`
}
TokenUsage records a single LLM API request's token usage.
type Trace ¶
type Trace struct {
ID string `json:"id"`
SandboxID string `json:"sandboxId"`
WorkspaceID string `json:"workspaceId"`
Source string `json:"source"`
CreatedAt time.Time `json:"createdAt"`
UpdatedAt time.Time `json:"updatedAt"`
}
Trace represents a logical session/trace spanning multiple API requests.
type TraceWithStats ¶
type TraceWithStats struct {
Trace
RequestCount int64 `json:"requestCount"`
TotalInputTokens int64 `json:"totalInputTokens"`
TotalOutputTokens int64 `json:"totalOutputTokens"`
}
TraceWithStats is a trace with aggregated request statistics.
type UsageSummary ¶
type UsageSummary struct {
Provider string `json:"provider"`
Model string `json:"model"`
InputTokens int64 `json:"inputTokens"`
OutputTokens int64 `json:"outputTokens"`
CacheCreationInputTokens int64 `json:"cacheCreationInputTokens"`
CacheReadInputTokens int64 `json:"cacheReadInputTokens"`
RequestCount int64 `json:"requestCount"`
}
UsageSummary is an aggregated usage row grouped by provider+model.
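The grouping that QueryUsage performs in SQL can be sketched in Go as a map keyed by (provider, model): one UsageSummary per distinct pair, with token counters summed and requests counted. The usageRow input type here is illustrative, not part of the package:

```go
package main

import "fmt"

// UsageSummary reproduced from the documented struct.
type UsageSummary struct {
	Provider                 string `json:"provider"`
	Model                    string `json:"model"`
	InputTokens              int64  `json:"inputTokens"`
	OutputTokens             int64  `json:"outputTokens"`
	CacheCreationInputTokens int64  `json:"cacheCreationInputTokens"`
	CacheReadInputTokens     int64  `json:"cacheReadInputTokens"`
	RequestCount             int64  `json:"requestCount"`
}

// usageRow is a minimal illustrative input record.
type usageRow struct {
	provider, model string
	in, out         int64
}

// summarize sketches the provider+model GROUP BY: accumulate tokens and
// request counts per distinct (provider, model) pair.
func summarize(rows []usageRow) map[string]*UsageSummary {
	out := map[string]*UsageSummary{}
	for _, r := range rows {
		key := r.provider + "/" + r.model
		s, ok := out[key]
		if !ok {
			s = &UsageSummary{Provider: r.provider, Model: r.model}
			out[key] = s
		}
		s.InputTokens += r.in
		s.OutputTokens += r.out
		s.RequestCount++
	}
	return out
}

func main() {
	sums := summarize([]usageRow{
		{"anthropic", "claude-sonnet", 10, 25},
		{"anthropic", "claude-sonnet", 7, 3},
		{"anthropic", "claude-haiku", 5, 5},
	})
	for _, s := range sums {
		fmt.Printf("%s/%s: %d requests\n", s.Provider, s.Model, s.RequestCount)
	}
}
```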