config

package
v1.3.1 Latest
Published: Jan 16, 2026 License: AGPL-3.0 Imports: 8 Imported by: 0

Documentation

Overview

Package config provides configuration management for the legible application.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Config

type Config struct {
	// OutputDir is the directory where downloaded and processed files will be saved
	OutputDir string

	// Labels filters documents by reMarkable labels (empty means sync all documents)
	Labels []string

	// OCREnabled determines whether OCR processing should be performed
	OCREnabled bool

	// OCRLanguages specifies the languages to use for OCR (e.g., "eng", "eng+fra")
	OCRLanguages string

	// SyncInterval is the duration between sync operations in daemon mode (0 = run once)
	SyncInterval time.Duration

	// StateFile is the path to the sync state persistence file
	StateFile string

	// LogLevel controls logging verbosity (debug, info, warn, error)
	LogLevel string

	// RemarkableToken is the authentication token for the reMarkable API
	RemarkableToken string

	// DaemonMode enables continuous sync operation
	DaemonMode bool

	// LLM configuration for OCR processing
	LLM LLMConfig
}

Config holds all configuration settings for the legible application. Configuration precedence: CLI flags > Environment variables > Config file > Defaults

func Load

func Load(configFile string) (*Config, error)

Load reads configuration from multiple sources and returns a Config instance. Sources are checked in this order: CLI flags > env vars > config file > defaults.

func (*Config) String

func (c *Config) String() string

String returns a string representation of the configuration (with sensitive data redacted).

func (*Config) Validate

func (c *Config) Validate() error

Validate checks that the configuration is valid and internally consistent.

type LLMConfig added in v1.1.0

type LLMConfig struct {
	// Provider is the LLM provider to use (ollama, openai, anthropic, google)
	Provider string

	// Model is the specific model to use for OCR
	Model string

	// Endpoint is the API endpoint (primarily for Ollama)
	Endpoint string

	// APIKey is the API key for cloud providers (typically from env vars or keychain)
	// This will be populated from:
	// 1. macOS Keychain (if UseKeychain is true)
	// 2. Environment variables:
	//    - OPENAI_API_KEY for OpenAI
	//    - ANTHROPIC_API_KEY for Anthropic
	//    - GOOGLE_API_KEY or GOOGLE_APPLICATION_CREDENTIALS for Google
	APIKey string

	// MaxRetries is the maximum number of retry attempts for API calls
	MaxRetries int

	// Temperature controls randomness (0.0 = deterministic, recommended for OCR)
	Temperature float64

	// UseKeychain enables macOS Keychain lookup for API keys (macOS only)
	UseKeychain bool

	// KeychainServicePrefix is the prefix for keychain service names
	// Service names will be: {prefix}-{provider} (e.g., "legible-openai")
	KeychainServicePrefix string
}

LLMConfig holds configuration for LLM-based OCR providers.
