copilot

package
v0.1.2
Published: Apr 12, 2026 License: MIT Imports: 11 Imported by: 0

Documentation

Overview

Package copilot provides a GitHub Copilot chat model implementation using the Copilot SDK.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type ChatModel

type ChatModel struct {
	// contains filtered or unexported fields
}

ChatModel is the GitHub Copilot chat model implementation backed by the Copilot SDK.

func New

func New(ctx context.Context, optFns ...OptionFunc) (*ChatModel, error)

New creates a new GitHub Copilot ChatModel. The returned model must be closed with Close() when no longer needed.
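A minimal usage sketch. The import paths and the message constructor below are assumptions, since this page does not show the module path or the core package's API; substitute the real ones:

```go
package main

import (
	"context"
	"fmt"
	"log"

	// Import paths are assumptions; replace with the real module path.
	"example.com/jarvis/core"
	"example.com/jarvis/llms/copilot"
)

func main() {
	ctx := context.Background()

	// New starts the Copilot CLI server; the model must be closed
	// with Close() to stop the server and release resources.
	model, err := copilot.New(ctx,
		copilot.WithModelName("gpt-5-mini"),
		// WithGithubToken is optional; GithubToken falls back to the
		// GITHUB_TOKEN environment variable when unset.
	)
	if err != nil {
		log.Fatal(err)
	}
	defer model.Close()

	// Invoke sends messages and returns the AI response.
	msg, err := model.Invoke(ctx, []core.Message{
		core.NewHumanMessage("Hello!"), // hypothetical constructor
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(msg)
}
```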

func (*ChatModel) Batch

func (m *ChatModel) Batch(ctx context.Context, inputs [][]core.Message, opts ...core.Option) ([]*core.AIMessage, error)

Batch performs multiple chat completions in parallel. Concurrency is controlled by core.WithMaxConcurrency at call time and falls back to the model's MaxConcurrency option when unset.
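A sketch of a batched call, assuming a `model` created with New and the same hypothetical message constructor as above:

```go
// Each inner slice is an independent conversation; results are returned
// in the same order as inputs.
inputs := [][]core.Message{
	{core.NewHumanMessage("Summarize document A")}, // hypothetical constructor
	{core.NewHumanMessage("Summarize document B")},
}

// core.WithMaxConcurrency at call time takes precedence; when omitted,
// the model-level WithMaxConcurrency option (default 5) applies.
results, err := model.Batch(ctx, inputs, core.WithMaxConcurrency(2))
if err != nil {
	log.Fatal(err)
}
for _, r := range results {
	fmt.Println(r)
}
```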

func (*ChatModel) BindTools

func (m *ChatModel) BindTools(toolDefs ...llms.ToolDefinition) llms.ChatModel

BindTools returns a copy of the model with tools bound.

func (*ChatModel) Close

func (m *ChatModel) Close() error

Close stops the Copilot CLI server and releases resources.

func (*ChatModel) Generate

func (m *ChatModel) Generate(ctx context.Context, messages []core.Message, opts ...core.Option) (*llms.ChatResult, error)

Generate performs a chat completion with full result details.

func (*ChatModel) GetName

func (m *ChatModel) GetName() string

GetName returns the name of this model.

func (*ChatModel) Invoke

func (m *ChatModel) Invoke(ctx context.Context, input []core.Message, opts ...core.Option) (*core.AIMessage, error)

Invoke sends messages to the Copilot API and returns the AI response.

func (*ChatModel) Stream

func (m *ChatModel) Stream(ctx context.Context, input []core.Message, opts ...core.Option) (*core.StreamIterator[*core.AIMessage], error)

Stream sends messages and streams the response token by token.
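A streaming sketch. The iteration API is an assumption: `core.StreamIterator`'s actual methods are not shown on this page.

```go
it, err := model.Stream(ctx, []core.Message{
	core.NewHumanMessage("Tell me a story"), // hypothetical constructor
})
if err != nil {
	log.Fatal(err)
}

// Assumed iterator shape: Next returns the next chunk, and an error
// (e.g. io.EOF or similar) once the stream is exhausted.
for {
	chunk, err := it.Next()
	if err != nil {
		break
	}
	fmt.Print(chunk.Content) // hypothetical field
}
```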

func (*ChatModel) WithStructuredOutput

func (m *ChatModel) WithStructuredOutput(schema map[string]any) llms.ChatModel

WithStructuredOutput returns a copy of the model configured for structured output.
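A sketch using a JSON-Schema-style map; the exact schema shape the model accepts is an assumption:

```go
schema := map[string]any{
	"type": "object",
	"properties": map[string]any{
		"name": map[string]any{"type": "string"},
		"age":  map[string]any{"type": "integer"},
	},
	"required": []string{"name", "age"},
}

// Returns a copy configured for structured output; the original model
// is left unchanged and can still be used for free-form responses.
structured := model.WithStructuredOutput(schema)
_ = structured
```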

type OptionFunc

type OptionFunc func(*Options)

OptionFunc configures Copilot-specific options.

func WithCLIPath

func WithCLIPath(path string) OptionFunc

WithCLIPath sets the path to the Copilot CLI executable.

func WithGithubToken

func WithGithubToken(token string) OptionFunc

WithGithubToken sets the GitHub token for authentication.

func WithLogLevel

func WithLogLevel(level string) OptionFunc

WithLogLevel sets the log level for the CLI server.

func WithMaxConcurrency

func WithMaxConcurrency(n int) OptionFunc

WithMaxConcurrency sets the maximum number of parallel sessions in Batch.

func WithModelName

func WithModelName(model string) OptionFunc

WithModelName sets the model name.

func WithPermissionHandler

func WithPermissionHandler(handler copilot.PermissionHandlerFunc) OptionFunc

WithPermissionHandler sets a custom permission handler for approving operations. If not set, all permission requests are denied by default. The handler is called for file writes, shell commands, URL fetches, and other operations.
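The handler signature below is illustrative only; `copilot.PermissionHandlerFunc`'s real parameter and result types are defined by the Copilot SDK and are not shown on this page:

```go
// PermissionRequest, Kind, Allow, and Deny are hypothetical stand-ins
// for the SDK's real types and values.
model, err := copilot.New(ctx,
	copilot.WithPermissionHandler(func(req copilot.PermissionRequest) copilot.PermissionDecision {
		// Approve only harmless operations; everything else stays denied,
		// matching the deny-by-default behavior when no handler is set.
		if req.Kind == "url_fetch" { // hypothetical field
			return copilot.Allow
		}
		return copilot.Deny
	}),
)
```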

func WithReasoningEffort

func WithReasoningEffort(effort string) OptionFunc

WithReasoningEffort sets the reasoning effort level for models that support it. Valid values: "low", "medium", "high", "xhigh". When set, the model will emit reasoning_delta events alongside message_delta events during streaming, and the ThinkingFunc in the Jarvis Config will be called with each token.
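A short sketch enabling reasoning on a supporting model:

```go
model, err := copilot.New(ctx,
	copilot.WithModelName("gpt-5"),
	copilot.WithReasoningEffort("high"), // "low", "medium", "high", or "xhigh"
)
```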

func WithTools

func WithTools(t ...tools.Tool) OptionFunc

WithTools sets langchain Tool implementations that get bridged to SDK tool handlers. The SDK manages the full tool-calling loop internally, so Invoke returns the final response after all tool calls are resolved. The Copilot session only exposes explicitly configured tools.
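A sketch of tool-backed invocation; `calcTool` and `searchTool` are assumed to implement the `tools.Tool` interface, whose definition is not shown here:

```go
model, err := copilot.New(ctx,
	copilot.WithTools(calcTool, searchTool), // only these tools are exposed
)
if err != nil {
	log.Fatal(err)
}
defer model.Close()

// The SDK runs the tool-calling loop internally; Invoke returns only
// after all tool calls have been resolved.
resp, err := model.Invoke(ctx, msgs)
```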

type Options

type Options struct {
	// GithubToken is the GitHub token for authentication. Falls back to GITHUB_TOKEN env var.
	GithubToken string

	// Model is the model ID (e.g., "gpt-5-mini", "gpt-5", "claude-sonnet-4.5").
	Model string

	// CLIPath is the path to the Copilot CLI executable. Defaults to "copilot".
	CLIPath string

	// LogLevel for the CLI server (e.g., "error", "info", "debug").
	LogLevel string

	// MaxConcurrency controls the maximum number of parallel sessions in Batch.
	// Defaults to 5.
	MaxConcurrency int

	// Temperature controls randomness (0.0 to 2.0).
	Temperature *float64

	// MaxTokens limits the response length.
	MaxTokens *int

	// TopP controls nucleus sampling.
	TopP *float64

	// Stop sequences.
	Stop []string

	// Tools are langchain Tool implementations that get bridged to SDK tool handlers.
	// Only these tools, plus any llms.ToolDefinition values bound via BindTools,
	// are exposed to the Copilot session. Built-in Copilot CLI tools are disabled
	// by default.
	Tools []tools.Tool

	// ReasoningEffort controls the effort the model spends on reasoning.
	// Valid values: "low", "medium", "high", "xhigh".
	// Only applies to models that support reasoning (e.g. gpt-5-mini, gpt-5).
	// Empty string (default) disables the field and uses the model default.
	ReasoningEffort string

	// OnPermissionRequest is a handler for permission requests from the server.
	// If nil, all permission requests are denied by default.
	// Provide a handler to approve operations (file writes, shell commands, URL fetches, etc.).
	OnPermissionRequest copilot.PermissionHandlerFunc
}

Options holds configuration for the GitHub Copilot chat model.

func DefaultOptions

func DefaultOptions() *Options

DefaultOptions returns sensible defaults for the GitHub Copilot provider.
