llm

package
v1.0.0
Published: Mar 21, 2026 License: MIT Imports: 1 Imported by: 0

README

pkg/llm

This package provides LLM (Large Language Model) client implementations and a provider registry. Falcon supports multiple LLM backends through a common interface and a self-registering provider system — adding a new provider requires no changes to the setup wizard or client factory.

Package Overview

pkg/llm/
├── client.go                # LLMClient interface + Message/StreamCallback types
├── provider.go              # Provider interface + SetupField types
├── registry.go              # Global provider registry (Register, Get, All)
├── register_providers.go    # Blank imports that activate all built-in providers
├── ollama/
│   ├── ollama.go            # Ollama HTTP client (local and cloud)
│   └── ollama_provider.go   # OllamaProvider — registry metadata + BuildClient + init() registration
├── gemini/
│   ├── gemini.go            # Google Gemini client (official SDK)
│   └── gemini_provider.go   # GeminiProvider — registry metadata + BuildClient + init() registration
└── openrouter/
    ├── openrouter.go        # OpenRouter HTTP client (OpenAI-compatible gateway)
    └── openrouter_provider.go # OpenRouterProvider — registry metadata + BuildClient + init() registration

LLMClient Interface

All provider clients implement this interface (client.go):

type LLMClient interface {
    Chat(messages []Message) (string, error)
    ChatStream(messages []Message, callback StreamCallback) (string, error)
    CheckConnection() error
    GetModel() string
}

type Message struct {
    Role    string // "system", "user", or "assistant"
    Content string
}

type StreamCallback func(chunk string)
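A common Go idiom is a compile-time assertion that a client type satisfies the interface, so a signature drift fails the build rather than surfacing at runtime. A standalone sketch, with the pkg/llm types inlined so it compiles on its own (fakeClient is illustrative, not part of the package):

```go
package main

import "fmt"

// Local copies of the pkg/llm types so this sketch compiles standalone.
type Message struct {
	Role    string
	Content string
}

type StreamCallback func(chunk string)

type LLMClient interface {
	Chat(messages []Message) (string, error)
	ChatStream(messages []Message, callback StreamCallback) (string, error)
	CheckConnection() error
	GetModel() string
}

// fakeClient is illustrative; the real clients live in ollama/, gemini/, openrouter/.
type fakeClient struct{ model string }

func (c *fakeClient) Chat(_ []Message) (string, error) { return "", nil }
func (c *fakeClient) ChatStream(_ []Message, cb StreamCallback) (string, error) {
	cb("")
	return "", nil
}
func (c *fakeClient) CheckConnection() error { return nil }
func (c *fakeClient) GetModel() string       { return c.model }

// Compile-time assertion: the build fails if fakeClient drifts from the interface.
var _ LLMClient = (*fakeClient)(nil)

func main() {
	fmt.Println((&fakeClient{model: "llama3"}).GetModel())
}
```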

Provider Interface

Every provider registration implements this interface (provider.go). It describes both how to show setup UI and how to build the client at runtime.

type Provider interface {
    ID() string                // stable config key, e.g. "openrouter"
    DisplayName() string       // shown in setup wizard
    DefaultModel() string      // used when user leaves model field blank
    SetupFields() []SetupField // fields to collect during first-run setup
    BuildClient(values map[string]string, model string) (LLMClient, error)
}

SetupField

Describes a single configuration field rendered by the setup wizard:

type SetupField struct {
    Key         string        // config key, e.g. "api_key"
    Type        FieldType     // FieldInput (default) or FieldSelect
    Title       string        // label shown in wizard
    Description string
    Placeholder string
    Secret      bool          // echo as password
    Default     string        // applied when user leaves blank
    Options     []FieldOption // only for FieldSelect
    EnvFallback string        // env var checked at runtime when viper value is empty
}

Supported Providers

Ollama (ollama/ollama.go + ollama/ollama_provider.go)

Local and cloud Ollama instances. Uses Ollama's /api/chat endpoint with newline-delimited JSON streaming.

client := ollama.NewOllamaClient("http://localhost:11434", "llama3", "")

Config format:

provider: ollama
default_model: llama3
provider_config:
  mode: local          # "local" or "cloud"
  url: http://localhost:11434
  api_key: ""          # required for cloud mode

Setup fields: mode (select), url, api_key
Env fallback: OLLAMA_API_KEY


Google Gemini (gemini/gemini.go + gemini/gemini_provider.go)

Uses the official google.golang.org/genai SDK. Handles Gemini's role mapping ("assistant" → "model") and system instruction extraction automatically.

client, err := gemini.NewGeminiClient("your-api-key", "gemini-2.5-flash-lite")

Config format:

provider: gemini
default_model: gemini-2.5-flash-lite
provider_config:
  api_key: your-api-key

Setup fields: api_key
Env fallback: GEMINI_API_KEY


OpenRouter (openrouter/openrouter.go + openrouter/openrouter_provider.go)

OpenAI-compatible gateway giving access to hundreds of models (Claude, GPT-4, Llama, Gemini, etc.) through a single API endpoint at https://openrouter.ai/api/v1. Uses SSE streaming.

client := openrouter.NewOpenRouterClient("your-api-key", "google/gemini-2.5-flash-lite")

Config format:

provider: openrouter
default_model: google/gemini-2.5-flash-lite
provider_config:
  api_key: your-api-key

Setup fields: api_key
Env fallback: OPENROUTER_API_KEY
Browse models: https://openrouter.ai/models


Registry

The registry (registry.go) is a simple ordered map. Each provider self-registers via an init() function in its own subpackage (e.g., ollama/ollama_provider.go):

// In each provider's _provider.go file:
func init() {
    llm.Register(&OllamaProvider{})
}

To activate providers, blank imports trigger each provider's init(). These are centralized in register_providers.go:

// pkg/llm/register_providers.go:
import (
    _ "github.com/blackcoderx/falcon/pkg/llm/ollama"
    _ "github.com/blackcoderx/falcon/pkg/llm/gemini"
    _ "github.com/blackcoderx/falcon/pkg/llm/openrouter"
)

API:

llm.All()          // []Provider in registration order — used to build wizard UI
llm.Get("gemini")  // (Provider, bool) — used by the client factory at runtime

The setup wizard and client factory consume the registry — they never need to change when a provider is added.
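The ordered-map registry pattern itself is small. A self-contained sketch of how Register, Get, and All could work (types simplified to a bare Provider interface; this is not the actual registry.go):

```go
package main

import "fmt"

// Provider is reduced to just ID() for this sketch.
type Provider interface{ ID() string }

var (
	order     []string            // preserves registration order for All()
	providers = map[string]Provider{}
)

// Register panics on a duplicate ID, matching the documented behavior.
func Register(p Provider) {
	if _, dup := providers[p.ID()]; dup {
		panic("llm: duplicate provider " + p.ID())
	}
	providers[p.ID()] = p
	order = append(order, p.ID())
}

func Get(id string) (Provider, bool) {
	p, ok := providers[id]
	return p, ok
}

func All() []Provider {
	out := make([]Provider, 0, len(order))
	for _, id := range order {
		out = append(out, providers[id])
	}
	return out
}

// stub stands in for a provider; real providers self-register via init().
type stub string

func (s stub) ID() string { return string(s) }

func main() {
	Register(stub("ollama"))
	Register(stub("gemini"))
	p, ok := Get("gemini")
	fmt.Println(ok, p.ID(), len(All()))
}
```

Because All() iterates the recorded order rather than the map, the wizard lists providers in the order their blank imports ran.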

Usage

Basic Chat
import (
    "github.com/blackcoderx/falcon/pkg/llm"
    "github.com/blackcoderx/falcon/pkg/llm/ollama"
)

client := ollama.NewOllamaClient("http://localhost:11434", "llama3", "")

messages := []llm.Message{
    {Role: "system", Content: "You are a helpful assistant."},
    {Role: "user", Content: "Hello!"},
}

response, err := client.Chat(messages)
Streaming Chat
response, err := client.ChatStream(messages, func(chunk string) {
    fmt.Print(chunk)
})
Connection Check
if err := client.CheckConnection(); err != nil {
    fmt.Printf("LLM not available: %v\n", err)
}

Adding a New Provider

Three steps, no changes to any other package:

Step 1: Create a subpackage and implement LLMClient
// pkg/llm/myprovider/myprovider.go
package myprovider

import "github.com/blackcoderx/falcon/pkg/llm"

type MyProviderClient struct {
    apiKey string
    model  string
}

func NewMyProviderClient(apiKey, model string) *MyProviderClient { ... }

func (c *MyProviderClient) Chat(messages []llm.Message) (string, error)                              { ... }
func (c *MyProviderClient) ChatStream(messages []llm.Message, cb llm.StreamCallback) (string, error) { ... }
func (c *MyProviderClient) CheckConnection() error                                                    { ... }
func (c *MyProviderClient) GetModel() string                                                          { return c.model }
Step 2: Implement Provider with self-registration
// pkg/llm/myprovider/myprovider_provider.go
package myprovider

import "github.com/blackcoderx/falcon/pkg/llm"

func init() {
    llm.Register(&MyProvider{})
}

type MyProvider struct{}

func (p *MyProvider) ID() string           { return "myprovider" }
func (p *MyProvider) DisplayName() string  { return "My Provider" }
func (p *MyProvider) DefaultModel() string { return "my-default-model" }

func (p *MyProvider) SetupFields() []llm.SetupField {
    return []llm.SetupField{
        {
            Key:         "api_key",
            Title:       "API Key",
            Secret:      true,
            EnvFallback: "MYPROVIDER_API_KEY",
        },
    }
}

func (p *MyProvider) BuildClient(values map[string]string, model string) (llm.LLMClient, error) {
    if model == "" {
        model = p.DefaultModel()
    }
    return NewMyProviderClient(values["api_key"], model), nil
}
Step 3: Add a blank import
// In pkg/llm/register_providers.go — add one line:
import (
    _ "github.com/blackcoderx/falcon/pkg/llm/myprovider"
)

The provider now appears in the setup wizard (falcon config), gets its own entry in ~/.falcon/config.yaml, and is instantiated at runtime. Zero changes to pkg/core/init.go or pkg/core/globalconfig.go.

Testing

Create a mock client for unit tests:

type MockLLMClient struct {
    Response string
    Err      error
}

func (m *MockLLMClient) Chat(_ []llm.Message) (string, error)                              { return m.Response, m.Err }
func (m *MockLLMClient) ChatStream(_ []llm.Message, cb llm.StreamCallback) (string, error) { cb(m.Response); return m.Response, m.Err }
func (m *MockLLMClient) CheckConnection() error                                             { return m.Err }
func (m *MockLLMClient) GetModel() string                                                   { return "mock" }
go test ./pkg/llm/...
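The point of the mock is that any code depending only on LLMClient can be tested without a live backend. A standalone sketch, with the pkg/llm types inlined and a hypothetical Summarize consumer (not part of the package):

```go
package main

import "fmt"

// Minimal local copies of the pkg/llm types so this sketch runs standalone.
type Message struct{ Role, Content string }
type StreamCallback func(chunk string)

type LLMClient interface {
	Chat(messages []Message) (string, error)
	ChatStream(messages []Message, callback StreamCallback) (string, error)
	CheckConnection() error
	GetModel() string
}

type MockLLMClient struct {
	Response string
	Err      error
}

func (m *MockLLMClient) Chat(_ []Message) (string, error) { return m.Response, m.Err }
func (m *MockLLMClient) ChatStream(_ []Message, cb StreamCallback) (string, error) {
	cb(m.Response)
	return m.Response, m.Err
}
func (m *MockLLMClient) CheckConnection() error { return m.Err }
func (m *MockLLMClient) GetModel() string       { return "mock" }

// Summarize is a hypothetical consumer that depends only on the interface,
// so it can be exercised with the mock instead of a real provider.
func Summarize(c LLMClient, text string) (string, error) {
	return c.Chat([]Message{
		{Role: "system", Content: "Summarize the user's text in one sentence."},
		{Role: "user", Content: text},
	})
}

func main() {
	got, err := Summarize(&MockLLMClient{Response: "a short summary"}, "long text...")
	fmt.Println(got, err == nil)
}
```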

Documentation

Overview

Package llm provides client implementations for Large Language Models. It defines a common interface (LLMClient) that all providers must implement, enabling easy switching between LLM backends such as Ollama, Gemini, and OpenRouter.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func Register

func Register(p Provider)

Register adds a provider to the global registry. Call this from an init() function in the provider's subpackage; the blank imports in register_providers.go trigger those init functions. Panics if a provider with the same ID is registered twice.

Types

type FieldOption

type FieldOption struct {
	Label string
	Value string
}

FieldOption is a label/value pair used by FieldSelect fields.

type FieldType

type FieldType string

FieldType indicates what kind of UI element to render for a setup field.

const (
	FieldInput  FieldType = "input"
	FieldSelect FieldType = "select"
)

type LLMClient

type LLMClient interface {
	// Chat sends a non-streaming chat request and returns the complete response.
	Chat(messages []Message) (string, error)

	// ChatStream sends a streaming chat request and calls callback for each chunk.
	// Returns the complete response when streaming finishes.
	ChatStream(messages []Message, callback StreamCallback) (string, error)

	// CheckConnection verifies that the LLM service is accessible.
	CheckConnection() error

	// GetModel returns the name of the model being used.
	GetModel() string
}

LLMClient defines the interface that all LLM providers must implement. This allows the agent to work with any LLM backend without tight coupling.

type Message

type Message struct {
	Role    string `json:"role"` // "system", "user", or "assistant"
	Content string `json:"content"`
}

Message represents a chat message.

type Provider

type Provider interface {
	// ID returns the stable identifier stored in config.yaml (e.g. "openrouter").
	ID() string

	// DisplayName returns the human-readable name shown in the setup wizard.
	DisplayName() string

	// DefaultModel returns the model name used when the user leaves the field blank.
	DefaultModel() string

	// SetupFields describes the fields to collect during first-run setup.
	// The wizard renders them in order using huh.
	SetupFields() []SetupField

	// BuildClient constructs a ready-to-use LLMClient from the collected values
	// (either from the setup wizard or read from viper at runtime).
	// values keys match SetupField.Key; model may be empty (use DefaultModel).
	BuildClient(values map[string]string, model string) (LLMClient, error)
}

Provider is implemented by every LLM backend. Adding a new provider means implementing this interface and registering it in register_providers.go — no other files need to change.

func All

func All() []Provider

All returns all registered providers in registration order.

func Get

func Get(id string) (Provider, bool)

Get returns the provider with the given ID, or (nil, false) if not found.

type SetupField

type SetupField struct {
	Key         string
	Type        FieldType // defaults to FieldInput if zero
	Title       string
	Description string
	Placeholder string
	Secret      bool          // echo as password
	Default     string        // applied when the user leaves the field blank
	Options     []FieldOption // only used when Type == FieldSelect
	EnvFallback string        // environment variable to check when viper value is empty
}

SetupField describes a single piece of configuration a provider needs from the user. The setup wizard renders these generically using huh so providers never import UI code.

type StreamCallback

type StreamCallback func(chunk string)

StreamCallback is called for each chunk of a streaming response.

Directories

Path Synopsis
Package ollama provides the Ollama LLM client implementation.
