Documentation ¶
Index ¶
- func AddMCPServers(data *ConfigData, input *MCPServers) error
- func PopulateAnthropicEnvConfig(data *ConfigData) error
- func PopulateOTELLogEnvConfig(data *ConfigData) error
- func PopulateOpenAIEnvConfig(data *ConfigData) error
- func WriteConfig(data *ConfigData) (string, error)
- type AnthropicConfig
- type Backend
- type ConfigData
- type MCPBackendRef
- type MCPServer
- type MCPServers
- type OpenAIConfig
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func AddMCPServers ¶
func AddMCPServers(data *ConfigData, input *MCPServers) error
AddMCPServers adds MCP server configurations to the ConfigData. It parses the MCPServers input and populates Backends and MCPBackendRefs fields. If the input is nil or empty, this function does nothing (safe to call).
Headers are parsed intelligently:
- "Authorization: Bearer <token>" → extracted as APIKey (preserving ${VAR} envsubst syntax)
- Other headers → stored in Headers map for headerMutation
Bearer tokens and other header values preserve envsubst syntax (e.g., "${VAR}") for runtime substitution by cmd/extproc.
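The header-splitting rule described above can be sketched as follows. This is a hypothetical helper for illustration, not the package's actual implementation; only the documented behavior (Bearer token extraction, envsubst passthrough) is assumed.

```go
package main

import (
	"fmt"
	"strings"
)

// splitMCPHeaders sketches how AddMCPServers is documented to treat headers:
// an "Authorization: Bearer <token>" header yields the APIKey, while all other
// headers go to the Headers map. ${VAR} envsubst placeholders are passed
// through untouched for runtime substitution.
func splitMCPHeaders(headers map[string]string) (apiKey string, rest map[string]string) {
	rest = map[string]string{}
	for name, value := range headers {
		if strings.EqualFold(name, "Authorization") && strings.HasPrefix(value, "Bearer ") {
			apiKey = strings.TrimPrefix(value, "Bearer ")
			continue
		}
		rest[name] = value
	}
	return apiKey, rest
}

func main() {
	key, rest := splitMCPHeaders(map[string]string{
		"Authorization": "Bearer ${GITHUB_TOKEN}", // envsubst syntax preserved
		"X-Custom":      "demo",
	})
	fmt.Println(key)  // ${GITHUB_TOKEN}
	fmt.Println(rest) // map[X-Custom:demo]
}
```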
func PopulateAnthropicEnvConfig ¶
func PopulateAnthropicEnvConfig(data *ConfigData) error
PopulateAnthropicEnvConfig populates ConfigData with Anthropic backend configuration from standard Anthropic SDK environment variables.
This returns an error if ANTHROPIC_API_KEY is not set.
func PopulateOTELLogEnvConfig ¶ added in v0.6.0
func PopulateOTELLogEnvConfig(data *ConfigData) error
PopulateOTELLogEnvConfig configures OTLP access logging based on OTEL env vars. Only gRPC transport is supported, following OTEL precedence rules:
- OTEL_EXPORTER_OTLP_LOGS_ENDPOINT > OTEL_EXPORTER_OTLP_ENDPOINT
- OTEL_EXPORTER_OTLP_LOGS_HEADERS > OTEL_EXPORTER_OTLP_HEADERS
- OTEL_SERVICE_NAME overrides service.name in OTEL_RESOURCE_ATTRIBUTES
Behavior:
- OTEL_LOGS_EXPORTER=none disables access logs (no accessLog section).
- OTEL_LOGS_EXPORTER=console uses the file sink (/dev/stdout).
- OTEL_LOGS_EXPORTER=otlp uses OTLP sink when gRPC + endpoint are present.
- OTEL_LOGS_EXPORTER unset defaults to OTLP if an endpoint is configured, otherwise console.
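The exporter selection above can be sketched as a small decision function. This is a hypothetical helper, not the package's code; `endpointSet` stands in for whether OTEL_EXPORTER_OTLP_LOGS_ENDPOINT or OTEL_EXPORTER_OTLP_ENDPOINT is configured, and the fallback for unrecognized values is an assumption.

```go
package main

import "fmt"

// resolveLogsExporter mirrors the documented OTEL_LOGS_EXPORTER behavior:
// "none" disables access logs, "console" uses the file sink, "otlp" uses the
// OTLP sink, and unset defaults to OTLP only when an endpoint is configured.
func resolveLogsExporter(exporter string, endpointSet bool) string {
	switch exporter {
	case "none":
		return "disabled" // no accessLog section
	case "console":
		return "file:/dev/stdout"
	case "otlp":
		return "otlp"
	case "": // unset: OTLP if an endpoint is configured, otherwise console
		if endpointSet {
			return "otlp"
		}
		return "file:/dev/stdout"
	}
	return "file:/dev/stdout" // fallback for unrecognized values (assumption)
}

func main() {
	fmt.Println(resolveLogsExporter("", true))     // otlp
	fmt.Println(resolveLogsExporter("", false))    // file:/dev/stdout
	fmt.Println(resolveLogsExporter("none", true)) // disabled
}
```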
func PopulateOpenAIEnvConfig ¶
func PopulateOpenAIEnvConfig(data *ConfigData) error
PopulateOpenAIEnvConfig populates ConfigData with OpenAI backend configuration from standard OpenAI SDK environment variables.
This returns an error if neither OPENAI_API_KEY nor AZURE_OPENAI_API_KEY is set. When both are set, AZURE_OPENAI_API_KEY takes precedence.
For Azure OpenAI, requires AZURE_OPENAI_ENDPOINT and OPENAI_API_VERSION.
See https://github.com/openai/openai-python/blob/main/src/openai/_client.py
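The precedence and validation rules above can be sketched as follows. This is a hypothetical helper for illustration, not the package's implementation; the parameters stand in for the corresponding environment variables.

```go
package main

import (
	"errors"
	"fmt"
)

// pickOpenAISchema mirrors the documented selection: AZURE_OPENAI_API_KEY
// takes precedence over OPENAI_API_KEY, Azure additionally requires
// AZURE_OPENAI_ENDPOINT and OPENAI_API_VERSION, and having neither key
// is an error.
func pickOpenAISchema(openaiKey, azureKey, azureEndpoint, apiVersion string) (string, error) {
	switch {
	case azureKey != "": // Azure wins when both keys are set
		if azureEndpoint == "" || apiVersion == "" {
			return "", errors.New("Azure OpenAI requires AZURE_OPENAI_ENDPOINT and OPENAI_API_VERSION")
		}
		return "AzureOpenAI", nil
	case openaiKey != "":
		return "OpenAI", nil
	default:
		return "", errors.New("neither OPENAI_API_KEY nor AZURE_OPENAI_API_KEY is set")
	}
}

func main() {
	schema, _ := pickOpenAISchema("sk-x", "az-y", "https://example.openai.azure.com", "2024-06-01")
	fmt.Println(schema) // AzureOpenAI
}
```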
func WriteConfig ¶
func WriteConfig(data *ConfigData) (string, error)
WriteConfig generates the AI Gateway configuration.
Types ¶
type AnthropicConfig ¶
type AnthropicConfig struct {
	BackendName string // References a Backend.Name (typically "anthropic")
	SchemaName  string // Schema name: "Anthropic"
	Version     string // API version (Anthropic path prefix)
}
AnthropicConfig holds Anthropic-specific configuration for generating AIServiceBackend resources. This is nil when no Anthropic configuration is present.
type Backend ¶
type Backend struct {
	Name        string // Backend resource name (e.g., "openai", "github")
	Hostname    string // Hostname for Backend endpoint (FQDN only)
	IP          string // IP address for Backend endpoint (IPv4/IPv6 literal)
	Port        int    // Port number
	NeedsTLS    bool   // Whether TLS is required for this backend.
	IsTelemetry bool   // Whether this is a telemetry backend (e.g., OTLP collector).
}
Backend represents a network backend endpoint (OpenAI or MCP server). Backends are rendered as Kubernetes Backend resources with optional TLS policy.
type ConfigData ¶
type ConfigData struct {
	Backends       []Backend        // All backend endpoints (e.g. OpenAI, Anthropic, MCP, and OTEL)
	OpenAI         *OpenAIConfig    // OpenAI-specific configuration (nil when not present)
	Anthropic      *AnthropicConfig // Anthropic-specific configuration (nil when not present)
	MCPBackendRefs []MCPBackendRef  // MCP routing configuration (nil/empty for OpenAI-only or Anthropic-only mode)
	Debug          bool             // Enable debug logging for Envoy (includes component-level logging for ext_proc, http, connection)
	EnvoyVersion   string           // Explicitly configure the version of Envoy to use.
	OTELLog        *otelLogConfig   // OpenTelemetry access log configuration (nil => file sink).
}
ConfigData holds all template data for generating the AI Gateway configuration. It supports OpenAI-only, Anthropic-only, MCP-only, or combined configurations.
type MCPBackendRef ¶
type MCPBackendRef struct {
	BackendName  string            // References a Backend.Name
	Path         string            // MCP endpoint path
	IncludeTools []string          // Only the specified tools will be available
	APIKey       string            // Optional API key extracted from Authorization: Bearer header
	Headers      map[string]string // Optional arbitrary headers for headerMutation (excluding Authorization)
}
MCPBackendRef references a backend with MCP-specific routing configuration. Used to generate MCPRoute backendRefs with path, tool filtering, and authentication.
type MCPServer ¶
type MCPServer struct {
	// Type specifies the MCP server transport protocol.
	// Common values: "http", "streamable-http", "sse"
	Type string `json:"type"`

	// URL is the full endpoint URL for the MCP server.
	// For HTTP servers: "https://api.example.com/mcp"
	// For local servers: "http://localhost:3000/mcp"
	URL string `json:"url"`

	// Headers contains optional HTTP headers sent to the MCP server.
	// Commonly used for authentication: {"Authorization": "Bearer token"}
	Headers map[string]string `json:"headers,omitempty"`

	// IncludeTools specifies which tools will be available from the server.
	IncludeTools []string `json:"includeTools,omitempty"`

	// Command is the executable to run.
	Command string `json:"command,omitempty"`

	// Args are the command-line arguments.
	Args []string `json:"args,omitempty"`
}
MCPServer represents the configuration for a Model Context Protocol (MCP) server. This follows the canonical JSON format used by MCP clients like Claude Desktop, Cursor, VS Code, and other MCP ecosystem tools.
The includeTools field follows the convention established by Gemini CLI: https://google-gemini.github.io/gemini-cli/docs/tools/mcp-server.html#optional
Example canonical configuration:
{
  "mcpServers": {
    "github": {
      "type": "http",
      "url": "https://api.githubcopilot.com/mcp/",
      "headers": {"Authorization": "Bearer ghp_xxxxxxxxxxxx"},
      "includeTools": ["search_repositories", "get_file_contents", "list_issues"]
    }
  }
}
type MCPServers ¶
MCPServers is the structure of the MCP servers configuration file. This matches the format used by MCP client configuration files.
type OpenAIConfig ¶
type OpenAIConfig struct {
	BackendName    string // References a Backend.Name (typically "openai")
	SchemaName     string // Schema name: "OpenAI" or "AzureOpenAI"
	Version        string // API version (OpenAI path prefix or Azure query param version)
	OrganizationID string // Optional OpenAI-Organization header value
	ProjectID      string // Optional OpenAI-Project header value
}
OpenAIConfig holds OpenAI-specific configuration for generating AIServiceBackend resources. This is nil when no OpenAI configuration is present (MCP-only mode).