llm

package
v0.46.0
Published: May 6, 2026 License: Apache-2.0 Imports: 18 Imported by: 0

Documentation


Constants

const (
	// DefaultTimeoutSeconds is the default timeout for LLM API calls.
	DefaultTimeoutSeconds = 30

	// DefaultMaxTokens is the default maximum tokens for LLM responses.
	DefaultMaxTokens = 1000
)

Variables

This section is empty.

Functions

func BuildPrompt added in v0.46.0

func BuildPrompt(request *AnalysisRequest) (string, error)

BuildPrompt combines the base prompt with context data. This function is shared across all LLM providers to ensure consistent prompt formatting.
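The exact formatting BuildPrompt produces is internal to the package, but the documented contract (base prompt plus context data, identical across providers) can be sketched with local stand-in types. Here `analysisRequest` and `buildPrompt` are hypothetical mirrors of the package's `AnalysisRequest` and `BuildPrompt`, not the real implementation:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// analysisRequest mirrors the relevant fields of the package's
// AnalysisRequest for this sketch.
type analysisRequest struct {
	Prompt  string
	Context map[string]interface{}
}

// buildPrompt is a hypothetical stand-in for llm.BuildPrompt: it appends
// the context data, serialized as JSON, to the base prompt so every
// provider receives the same formatting.
func buildPrompt(req *analysisRequest) (string, error) {
	if req.Prompt == "" {
		return "", fmt.Errorf("prompt must not be empty")
	}
	ctx, err := json.Marshal(req.Context)
	if err != nil {
		return "", err
	}
	return req.Prompt + "\n\nContext:\n" + string(ctx), nil
}

func main() {
	out, err := buildPrompt(&analysisRequest{
		Prompt:  "Analyze this failed PipelineRun.",
		Context: map[string]interface{}{"pipelinerun": "build-123", "status": "Failed"},
	})
	if err != nil {
		panic(err)
	}
	fmt.Println(out)
}
```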

func ExecuteAnalysis added in v0.46.0

func ExecuteAnalysis(
	ctx context.Context,
	run *params.Run,
	kinteract kubeinteraction.Interface,
	logger *zap.SugaredLogger,
	repo *v1alpha1.Repository,
	pr *tektonv1.PipelineRun,
	event *info.Event,
	prov provider.Interface,
) error

ExecuteAnalysis performs the complete LLM analysis workflow. This is the single entry point called by the reconciler.

func RegisterProvider added in v0.46.0

func RegisterProvider(name AIProvider, fn NewClientFunc)

RegisterProvider registers a provider constructor. Called by provider init() functions.
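The registration pattern described above can be illustrated with a minimal self-contained sketch. The registry map, `stubClient`, and the simplified `Client`/`ProviderConfig` types below are local stand-ins, not the package's real implementation; the point is the shape of the `init()`-time hook:

```go
package main

import "fmt"

type AIProvider string

type Client interface{ GetProviderName() string }

type ProviderConfig struct{ APIKey, Model string }

type NewClientFunc func(cfg *ProviderConfig) (Client, error)

// registry maps provider names to their constructors.
var registry = map[AIProvider]NewClientFunc{}

// RegisterProvider records a constructor under the provider name,
// mirroring the package's registration hook.
func RegisterProvider(name AIProvider, fn NewClientFunc) {
	registry[name] = fn
}

// stubClient is a toy provider used only for this sketch.
type stubClient struct{ name string }

func (s *stubClient) GetProviderName() string { return s.name }

func init() {
	// A provider package (e.g. providers/openai) would call this
	// from its own init() so importing it is enough to register it.
	RegisterProvider("openai", func(cfg *ProviderConfig) (Client, error) {
		return &stubClient{name: "openai"}, nil
	})
}

func main() {
	c, err := registry["openai"](&ProviderConfig{APIKey: "sk-test"})
	if err != nil {
		panic(err)
	}
	fmt.Println(c.GetProviderName()) // openai
}
```

Because registration happens in `init()`, a blank import of a provider package is sufficient to make it available through the registry.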

func ValidateClientConfig added in v0.46.0

func ValidateClientConfig(provider AIProvider, secretRef *v1alpha1.Secret, apiURL string, timeoutSeconds, maxTokens int) error

ValidateClientConfig validates the client configuration before creating a client.

func ValidateURL added in v0.46.0

func ValidateURL(urlStr string) error

ValidateURL validates that the URL is properly formatted with http or https scheme.
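The documented check (well-formed URL, http or https scheme) can be approximated with the standard library's net/url. The `validateURL` function below is a sketch of that behavior, not the package's actual code; the host check is an assumption added for completeness:

```go
package main

import (
	"fmt"
	"net/url"
)

// validateURL sketches the documented contract: the string must parse as
// a URL and use an http or https scheme. The non-empty-host check is an
// extra assumption of this sketch.
func validateURL(s string) error {
	u, err := url.Parse(s)
	if err != nil {
		return err
	}
	if u.Scheme != "http" && u.Scheme != "https" {
		return fmt.Errorf("unsupported scheme %q: only http and https are allowed", u.Scheme)
	}
	if u.Host == "" {
		return fmt.Errorf("missing host in URL %q", s)
	}
	return nil
}

func main() {
	fmt.Println(validateURL("https://api.example.com/v1")) // <nil>
	fmt.Println(validateURL("ftp://example.com") != nil)   // true
}
```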

Types

type AIProvider added in v0.46.0

type AIProvider string

AIProvider represents a supported LLM provider.

const (
	ProviderOpenAI AIProvider = "openai"
	ProviderGemini AIProvider = "gemini"
)

func SupportedProviders added in v0.46.0

func SupportedProviders() []AIProvider

SupportedProviders returns the names of all registered providers.

type AnalysisError added in v0.46.0

type AnalysisError struct {
	Provider  string `json:"provider"`
	Type      string `json:"type"`
	Message   string `json:"message"`
	Retryable bool   `json:"retryable"`
}

AnalysisError represents an error from LLM analysis.

func (*AnalysisError) Error added in v0.46.0

func (e *AnalysisError) Error() string

type AnalysisRequest added in v0.46.0

type AnalysisRequest struct {
	Prompt         string                 `json:"prompt"`
	Context        map[string]interface{} `json:"context"`
	MaxTokens      int                    `json:"max_tokens"`
	TimeoutSeconds int                    `json:"timeout_seconds"`
}

AnalysisRequest represents a request to analyze CI/CD pipeline data.

type AnalysisResponse added in v0.46.0

type AnalysisResponse struct {
	Content    string        `json:"content"`
	TokensUsed int           `json:"tokens_used"`
	Provider   string        `json:"provider"`
	Timestamp  time.Time     `json:"timestamp"`
	Duration   time.Duration `json:"duration"`
}

AnalysisResponse represents the response from an LLM analysis.

type AnalysisResult

type AnalysisResult struct {
	Role     string
	Response *AnalysisResponse
	Error    error
}

AnalysisResult represents the result of an LLM analysis.

type Client added in v0.46.0

type Client interface {
	Analyze(ctx context.Context, request *AnalysisRequest) (*AnalysisResponse, error)
	GetProviderName() string
	ValidateConfig() error
}

Client defines the interface for LLM providers.

func NewClient added in v0.46.0

func NewClient(ctx context.Context, provider AIProvider, secretRef *v1alpha1.Secret,
	namespace string, kinteract kubeinteraction.Interface,
	apiURL, model string, timeoutSeconds, maxTokens int,
) (Client, error)

NewClient creates an LLM client by validating configuration, fetching the API token from a K8s secret, and delegating to the registered provider constructor.

type NewClientFunc added in v0.46.0

type NewClientFunc func(cfg *ProviderConfig) (Client, error)

NewClientFunc is the constructor signature every provider must implement.

type ProviderConfig added in v0.46.0

type ProviderConfig struct {
	APIKey         string
	BaseURL        string
	Model          string
	TimeoutSeconds int
	MaxTokens      int
}

ProviderConfig is the unified configuration passed to every provider constructor.

Directories

Path Synopsis
providers
gemini
Package gemini is the Client implementation for Google Gemini LLM integration.
openai
Package openai is the Client implementation for OpenAI LLM integration.
