omlx

package
v0.10.13 Latest
Published: May 7, 2026 License: MIT Imports: 15 Imported by: 0

Documentation

Index

Constants

View Source
const DefaultBaseURL = "http://localhost:1235/v1"

Variables

View Source
var ProtocolCapabilities = openai.ProtocolCapabilities{
	Tools:            true,
	Stream:           true,
	StructuredOutput: true,
	Thinking:         true,
	ThinkingFormat:   openai.ThinkingWireFormatQwen,

	StrictThinkingModelMatch: true,
}

Functions

func LookupModelLimits

func LookupModelLimits(ctx context.Context, baseURL, model string) limits.ModelLimits

func New

func New(cfg Config) *openai.Provider

Types

type Config

type Config struct {
	BaseURL      string
	APIKey       string
	Model        string
	ModelPattern string
	KnownModels  map[string]string
	Headers      map[string]string
	Reasoning    reasoning.Reasoning
}

type UtilizationProbe added in v0.10.9

type UtilizationProbe struct {
	// contains filtered or unexported fields
}

UtilizationProbe queries the oMLX server-root observability endpoints and normalizes their responses into the shared endpoint-utilization shape.

func NewUtilizationProbe added in v0.10.9

func NewUtilizationProbe(baseURL string, client *http.Client) *UtilizationProbe

NewUtilizationProbe creates a probe for an OpenAI-compatible oMLX base URL.

func (*UtilizationProbe) Probe added in v0.10.9

Probe fetches /api/status from the server root and returns a normalized utilization sample. On failure it returns a stale or unknown utilization sample rather than surfacing the endpoint's unavailability as an error.
