lmstudio

package
v0.10.6 Latest
Warning

This package is not in the latest version of its module.

Published: May 5, 2026 License: MIT Imports: 13 Imported by: 0

Documentation

Index

Constants

View Source
const DefaultBaseURL = "http://localhost:1234/v1"

Variables

View Source
var ProtocolCapabilities = openai.ProtocolCapabilities{
	Tools:            true,
	Stream:           true,
	StructuredOutput: true,
}

ProtocolCapabilities reflects what an LM Studio server exposes on the OpenAI-compatible surface. Although LM Studio can host models with native reasoning controls on `/api/v1/chat`, the OpenAI-compatible surface is not a verified reasoning-control surface for routing or benchmark use.

Evidence collected 2026-04-23 against a Bragi host running LM Studio and serving `qwen/qwen3.6-35b-a3b` (arch=qwen35moe, Q4_K_M) shows that LM Studio accepts multiple reasoning-related request shapes on `/v1/chat/completions` but does not reliably honor them in the model template. Treat this provider as tool-capable and streaming-capable, but not as supporting request-level reasoning control on the OpenAI-compatible wire.

Functions

func LookupModelLimits

func LookupModelLimits(ctx context.Context, baseURL, model string) limits.ModelLimits

func New

func New(cfg Config) *openai.Provider

Types

type Config

type Config struct {
	BaseURL      string
	APIKey       string
	Model        string
	ModelPattern string
	KnownModels  map[string]string
	Headers      map[string]string
	Reasoning    reasoning.Reasoning
}
