Documentation ¶
Index ¶
- func CreateAWSCommands() *cobra.Command
- func GetLLMAnalysisPrompt(question string) string
- type AIProfile
- type Client
- func (c *Client) ExecCLI(ctx context.Context, args []string) (string, error)
- func (c *Client) ExecuteOperation(ctx context.Context, toolName string, input map[string]interface{}) (string, error)
- func (c *Client) ExecuteOperations(ctx context.Context, operations []LLMOperation) (string, error)
- func (c *Client) ExecuteOperationsConcurrently(ctx context.Context, operations []LLMOperation, aiProfile string) (string, error)
- func (c *Client) ExecuteOperationsWithAWSProfile(ctx context.Context, operations []LLMOperation, awsProfile, region string) (string, error)
- func (c *Client) GetAIProfiles() map[string]AIProfile
- func (c *Client) GetRecentAlarms(ctx context.Context) (string, error)
- func (c *Client) GetRelevantContext(ctx context.Context, question string) (string, error)
- type LLMAnalysis
- type LLMOperation
- type LLMOperationResult
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func CreateAWSCommands ¶
func CreateAWSCommands() *cobra.Command
CreateAWSCommands creates the AWS command tree for static commands.
func GetLLMAnalysisPrompt ¶
func GetLLMAnalysisPrompt(question string) string
GetLLMAnalysisPrompt returns the prompt used by the LLM to determine which AWS operations are needed.
Types ¶
type AIProfile ¶
type AIProfile struct {
Provider string `mapstructure:"provider"`
AWSProfile string `mapstructure:"aws_profile"`
Model string `mapstructure:"model"`
Region string `mapstructure:"region"`
APIKeyEnv string `mapstructure:"api_key_env"`
}
AIProfile represents an AI provider configuration.
func GetAIProfile ¶
GetAIProfile returns the AI configuration for the given provider name.
type Client ¶
type Client struct {
// contains filtered or unexported fields
}
func NewClientWithProfile ¶
func (*Client) ExecuteOperation ¶
func (c *Client) ExecuteOperation(ctx context.Context, toolName string, input map[string]interface{}) (string, error)
ExecuteOperation exposes the default single-operation execution helper.
func (*Client) ExecuteOperations ¶
func (c *Client) ExecuteOperations(ctx context.Context, operations []LLMOperation) (string, error)
ExecuteOperations exposes the default batch execution helper.
func (*Client) ExecuteOperationsConcurrently ¶
func (c *Client) ExecuteOperationsConcurrently(ctx context.Context, operations []LLMOperation, aiProfile string) (string, error)
ExecuteOperationsConcurrently executes multiple AWS operations concurrently for LLM processing.
func (*Client) ExecuteOperationsWithAWSProfile ¶
func (c *Client) ExecuteOperationsWithAWSProfile(ctx context.Context, operations []LLMOperation, awsProfile, region string) (string, error)
ExecuteOperationsWithAWSProfile executes multiple AWS operations concurrently using a direct AWS profile.
func (*Client) GetAIProfiles ¶
func (c *Client) GetAIProfiles() map[string]AIProfile
GetAIProfiles returns all AI profiles from the configuration.
func (*Client) GetRecentAlarms ¶
func (c *Client) GetRecentAlarms(ctx context.Context) (string, error)
GetRecentAlarms retrieves recent CloudWatch alarm information.
type LLMAnalysis ¶
type LLMAnalysis struct {
Operations []LLMOperation `json:"operations"`
Analysis string `json:"analysis"`
}
LLMAnalysis represents the LLM's analysis of which AWS operations are needed.
type LLMOperation ¶
type LLMOperation struct {
Operation string `json:"operation"`
Reason string `json:"reason"`
Parameters map[string]interface{} `json:"parameters"`
}
LLMOperation represents an AWS operation requested by the LLM.