Documentation
Overview
Package copilot provides a GitHub Copilot chat model implementation using the Copilot SDK.
Index
- type ChatModel
- func (m *ChatModel) Batch(ctx context.Context, inputs [][]core.Message, opts ...core.Option) ([]*core.AIMessage, error)
- func (m *ChatModel) BindTools(toolDefs ...llms.ToolDefinition) llms.ChatModel
- func (m *ChatModel) Close() error
- func (m *ChatModel) Generate(ctx context.Context, messages []core.Message, opts ...core.Option) (*llms.ChatResult, error)
- func (m *ChatModel) GetName() string
- func (m *ChatModel) Invoke(ctx context.Context, input []core.Message, opts ...core.Option) (*core.AIMessage, error)
- func (m *ChatModel) Stream(ctx context.Context, input []core.Message, opts ...core.Option) (*core.StreamIterator[*core.AIMessage], error)
- func (m *ChatModel) WithStructuredOutput(schema map[string]any) llms.ChatModel
- type OptionFunc
- func WithCLIPath(path string) OptionFunc
- func WithGithubToken(token string) OptionFunc
- func WithLogLevel(level string) OptionFunc
- func WithMaxConcurrency(n int) OptionFunc
- func WithModelName(model string) OptionFunc
- func WithPermissionHandler(handler copilot.PermissionHandlerFunc) OptionFunc
- func WithReasoningEffort(effort string) OptionFunc
- func WithTools(t ...tools.Tool) OptionFunc
- type Options
Constants
This section is empty.
Variables
This section is empty.
Functions
This section is empty.
Types
type ChatModel
type ChatModel struct {
// contains filtered or unexported fields
}
ChatModel is the GitHub Copilot chat model implementation backed by the Copilot SDK.
func New
func New(ctx context.Context, optFns ...OptionFunc) (*ChatModel, error)
New creates a new GitHub Copilot ChatModel. The returned model must be closed with Close() when no longer needed.
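A minimal usage sketch (the import alias copilotchat is a placeholder for this package, and the example assumes the Copilot CLI is installed and authenticated):

```go
ctx := context.Background()

// Options are applied over DefaultOptions; GithubToken falls back to
// the GITHUB_TOKEN environment variable when not set explicitly.
model, err := copilotchat.New(ctx,
	copilotchat.WithModelName("gpt-5-mini"),
	copilotchat.WithLogLevel("error"),
)
if err != nil {
	log.Fatal(err)
}
// New requires the model to be closed when no longer needed.
defer model.Close()
```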
func (*ChatModel) Batch
func (m *ChatModel) Batch(ctx context.Context, inputs [][]core.Message, opts ...core.Option) ([]*core.AIMessage, error)
Batch performs multiple chat completions in parallel. Concurrency is controlled by core.WithMaxConcurrency at call time and falls back to the model's MaxConcurrency option when unset.
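A sketch of a Batch call with a per-call concurrency override; core.WithMaxConcurrency is named above, but the message constructor and the Content field are assumptions about the core package:

```go
inputs := [][]core.Message{
	{core.NewHumanMessage("Summarize the context package.")},
	{core.NewHumanMessage("Summarize the sync package.")},
	{core.NewHumanMessage("Summarize the errgroup package.")},
}

// The per-call option wins; without it, Batch falls back to the
// model's MaxConcurrency option (default 5).
replies, err := model.Batch(ctx, inputs, core.WithMaxConcurrency(2))
if err != nil {
	log.Fatal(err)
}
for _, reply := range replies {
	fmt.Println(reply.Content) // Content field name is an assumption
}
```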
func (*ChatModel) BindTools
func (m *ChatModel) BindTools(toolDefs ...llms.ToolDefinition) llms.ChatModel
BindTools returns a copy of the model with the given tool definitions bound; the receiver is not modified.
func (*ChatModel) Generate
func (m *ChatModel) Generate(ctx context.Context, messages []core.Message, opts ...core.Option) (*llms.ChatResult, error)
Generate performs a chat completion with full result details.
func (*ChatModel) Invoke
func (m *ChatModel) Invoke(ctx context.Context, input []core.Message, opts ...core.Option) (*core.AIMessage, error)
Invoke sends messages to the Copilot API and returns the AI response.
type OptionFunc
type OptionFunc func(*Options)
OptionFunc configures Copilot-specific options.
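OptionFunc follows Go's functional-options pattern. The following self-contained sketch re-implements the pattern with a pared-down local Options struct to show how a constructor like New applies option functions over defaults (the names mirror this package but are local re-declarations, and the default model name is an assumption; the MaxConcurrency default of 5 is documented below):

```go
package main

import "fmt"

// Options is a pared-down local stand-in for this package's Options.
type Options struct {
	Model          string
	MaxConcurrency int
}

// OptionFunc configures Options, as in this package.
type OptionFunc func(*Options)

// WithModelName mirrors the package option of the same name.
func WithModelName(model string) OptionFunc {
	return func(o *Options) { o.Model = model }
}

// WithMaxConcurrency mirrors the package option of the same name.
func WithMaxConcurrency(n int) OptionFunc {
	return func(o *Options) { o.MaxConcurrency = n }
}

// defaultOptions stands in for DefaultOptions; the default model
// name here is an assumption for illustration.
func defaultOptions() *Options {
	return &Options{Model: "gpt-5-mini", MaxConcurrency: 5}
}

// applyOptions shows how a constructor consumes variadic optFns:
// start from defaults, then apply each option in order.
func applyOptions(optFns ...OptionFunc) *Options {
	opts := defaultOptions()
	for _, fn := range optFns {
		fn(opts)
	}
	return opts
}

func main() {
	opts := applyOptions(WithModelName("claude-sonnet-4.5"))
	fmt.Println(opts.Model, opts.MaxConcurrency)
}
```

Because each OptionFunc only mutates the fields it cares about, later options override earlier ones and unset fields keep their defaults.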
func WithCLIPath
func WithCLIPath(path string) OptionFunc
WithCLIPath sets the path to the Copilot CLI executable.
func WithGithubToken
func WithGithubToken(token string) OptionFunc
WithGithubToken sets the GitHub token for authentication.
func WithLogLevel
func WithLogLevel(level string) OptionFunc
WithLogLevel sets the log level for the CLI server.
func WithMaxConcurrency
func WithMaxConcurrency(n int) OptionFunc
WithMaxConcurrency sets the maximum number of parallel sessions in Batch.
func WithPermissionHandler
func WithPermissionHandler(handler copilot.PermissionHandlerFunc) OptionFunc
WithPermissionHandler sets a custom permission handler for approving operations. If not set, all permission requests are denied by default. The handler is called for file writes, shell commands, URL fetches, and other operations.
func WithReasoningEffort
func WithReasoningEffort(effort string) OptionFunc
WithReasoningEffort sets the reasoning effort level for models that support it. Valid values: "low", "medium", "high", "xhigh". When set, the model emits reasoning_delta events alongside message_delta events during streaming, and the ThinkingFunc configured in the Jarvis Config is called with each reasoning token.
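Configuring reasoning effort at construction time might look like this sketch (the import alias copilotchat is a placeholder for this package):

```go
model, err := copilotchat.New(ctx,
	copilotchat.WithModelName("gpt-5"),
	// One of "low", "medium", "high", "xhigh"; leave unset to keep
	// the model's default reasoning behavior.
	copilotchat.WithReasoningEffort("medium"),
)
```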
func WithTools
func WithTools(t ...tools.Tool) OptionFunc
WithTools sets langchain Tool implementations that get bridged to SDK tool handlers. The SDK manages the full tool-calling loop internally, so Invoke returns the final response after all tool calls are resolved. The Copilot session only exposes explicitly configured tools.
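A hedged sketch of wiring a tool and a permission handler together; searchTool, approveHandler, and msgs are placeholders, since the shapes of tools.Tool and copilot.PermissionHandlerFunc are not shown in this documentation:

```go
model, err := copilotchat.New(ctx,
	// Only explicitly configured tools are exposed to the session;
	// built-in Copilot CLI tools stay disabled.
	copilotchat.WithTools(searchTool),
	// Without a handler, all permission requests are denied.
	copilotchat.WithPermissionHandler(approveHandler),
)
if err != nil {
	log.Fatal(err)
}
defer model.Close()

// The SDK runs the whole tool-calling loop internally; Invoke
// returns only the final assistant message, after all tool calls
// have been resolved.
reply, err := model.Invoke(ctx, msgs)
```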
type Options
type Options struct {
// GithubToken is the GitHub token for authentication. Falls back to GITHUB_TOKEN env var.
GithubToken string
// Model is the model ID (e.g., "gpt-5-mini", "gpt-5", "claude-sonnet-4.5").
Model string
// CLIPath is the path to the Copilot CLI executable. Defaults to "copilot".
CLIPath string
// LogLevel for the CLI server (e.g., "error", "info", "debug").
LogLevel string
// MaxConcurrency controls the maximum number of parallel sessions in Batch.
// Defaults to 5.
MaxConcurrency int
// Temperature controls randomness (0.0 to 2.0).
Temperature *float64
// MaxTokens limits the response length.
MaxTokens *int
// TopP controls nucleus sampling.
TopP *float64
// Stop sequences.
Stop []string
// Tools are langchain Tool implementations that get bridged to SDK tool handlers.
// Only these tools, plus any llms.ToolDefinition values bound via BindTools,
// are exposed to the Copilot session. Built-in Copilot CLI tools are disabled
// by default.
Tools []tools.Tool
// ReasoningEffort controls the effort the model spends on reasoning.
// Valid values: "low", "medium", "high", "xhigh".
// Only applies to models that support reasoning (e.g. gpt-5-mini, gpt-5).
// Empty string (default) disables the field and uses the model default.
ReasoningEffort string
// OnPermissionRequest is a handler for permission requests from the server.
// If nil, all permission requests are denied by default.
// Provide a handler to approve operations (file writes, shell commands, URL fetches, etc.).
OnPermissionRequest copilot.PermissionHandlerFunc
}
Options holds configuration for the GitHub Copilot chat model.
func DefaultOptions
func DefaultOptions() *Options
DefaultOptions returns sensible defaults for the GitHub Copilot provider.