tingly-box module v0.260507.0-rc5
Published: May 5, 2026 · License: MPL-2.0

README
Tingly Box Web UI Demo

Announcement: This release has a recorded fault. Please update to the latest version to resolve known issues. Thank you for your continued support.

Tingly Box

Quick Start · Features · Integration · Documentation · Issues


Tingly Box serves agents, coordinates AI models, optimizes context, and routes requests for maximum efficiency — with built-in remote control and secure, customizable integrations.


Key Features

  • Unified API Gateway – One mixin endpoint to rule them all — seamlessly bridge OpenAI, Anthropic, Google Gemini, and more with automatic protocol translation
  • Agent Integration – One-click config for Claude Code, OpenCode, Codex, Xcode, and more — transparent proxying for SDKs and CLI tools
  • Agent Profiles – Run agents like Claude Code with individual profiles — each profile can use a different model and agent configuration
  • Remote Control via IM Bots – Control AI agents remotely through Telegram, DingTalk, Feishu, Lark, Weixin, WeCom, Slack, and Discord
  • Multi-Tenant API Tokens – Isolate data per user with dedicated API tokens — each user gets their own usage tracking, provider access, and configuration
  • Smart Routing Engine – Intelligently route requests across models and tokens based on cost, speed, or custom policies — far beyond simple load balancing
  • Flexible Authentication – Support for both API keys and OAuth providers (Claude.ai, Codex, etc.) — use your existing quotas anywhere
  • Visual Control Plane – Intuitive web UI to manage providers, routes, aliases, models, and remote bots at a glance — no config files needed
  • Client-Side Usage Analytics – Track token consumption, latency, cost estimates, and model selection per request — directly from your client
  • Blazing Fast Performance – Typically adds < 1 ms of overhead — flexibility without a latency tax
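
The smart-routing idea above can be sketched in a few lines. This is an illustrative model only, not Tingly Box's actual engine: the provider names, cost and latency figures, and the `pick` function are all hypothetical.

```python
# Illustrative sketch of policy-based routing (NOT the real Tingly Box engine).
# Each candidate carries made-up cost/latency figures; a named policy picks
# the candidate that minimizes the relevant metric.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str            # hypothetical provider/model name
    usd_per_mtok: float  # cost per million tokens (made-up figure)
    p50_latency_ms: int  # typical latency (made-up figure)

CANDIDATES = [
    Candidate("openai/gpt-4o", 5.00, 350),
    Candidate("anthropic/claude-sonnet", 3.00, 900),
    Candidate("google/gemini-flash", 0.30, 600),
]

def pick(policy: str) -> Candidate:
    """Return the best candidate under a named policy."""
    if policy == "cost":
        return min(CANDIDATES, key=lambda c: c.usd_per_mtok)
    if policy == "speed":
        return min(CANDIDATES, key=lambda c: c.p50_latency_ms)
    raise ValueError(f"unknown policy: {policy}")

print(pick("cost").name)   # cheapest candidate
print(pick("speed").name)  # lowest-latency candidate
```

A real engine would also weigh quotas, token health, and custom policies per the feature description, but the shape of the decision is the same.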

Quick Start

Install

From npm (recommended)

# Install and run (with no args: auto-restarts, migrates, and opens the web UI)
# Distributed as a Go binary release; npx wraps the CLI for convenience
npx tingly-box@latest

# or pass -y to skip the install prompt
npx -y tingly-box@latest

# if you hit network trouble, try the bundle package with the binary built in
npx -y tingly-box-bundle@latest

If anything goes wrong, check the tingly-box output or open an issue for help.

From Docker (GitHub Container Registry)

mkdir tingly-data
docker run -d \
  --name tingly-box \
  -p 12580:12580 \
  -v $(pwd)/tingly-data:/home/tingly/.tingly-box \
  ghcr.io/tingly-dev/tingly-box
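
The same container can be run with Docker Compose. This is an equivalent sketch of the `docker run` command above (image, port, and volume taken from it; the service name and restart policy are conventional additions, not from the project docs):

```yaml
services:
  tingly-box:
    image: ghcr.io/tingly-dev/tingly-box
    ports:
      - "12580:12580"
    volumes:
      - ./tingly-data:/home/tingly/.tingly-box
    restart: unless-stopped
```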

Integration Guide

Agent Integration - Claude Code / OpenCode / Codex / Xcode / VSCode / OpenClaw
  • Claude Code (support 1-click config)
  • OpenCode (support 1-click config)
  • Xcode (requires manual config)
  • …

Any compatible application is ready to use.

A detailed configuration guide is provided in the application.

Agent Integration Demo

Remote Control Agents via IM Bots - TG / DingTalk / Feishu / Lark / Weixin / WeCom

Tingly Box now supports remote control through popular IM platforms. Interact with your AI agents remotely without direct server access.

Supported Platforms

  • ✅ Telegram
  • ✅ DingTalk
  • ✅ Feishu
  • ✅ Lark
  • ✅ Weixin
  • ✅ WeCom
  • Slack
  • Discord

Quick Setup

  1. Open the Web UI (e.g. http://localhost:12580)
  2. Navigate to Remote section
  3. Configure your preferred IM platform bot
  4. Start interacting with your agents remotely

Use Cases

  • Execute tasks and queries from your phone or any device
  • Team collaboration with shared agent access
  • Monitor and control agents while away from your workstation

Remote Control Demo

OpenAI SDK
from openai import OpenAI

client = OpenAI(
    api_key="your-tingly-model-token",
    base_url="http://localhost:12580/tingly/openai/v1"
)

response = client.chat.completions.create(
    model="tingly-gpt",
    messages=[{"role": "user", "content": "Hello!"}]
)
print(response)
Anthropic SDK
from anthropic import Anthropic

client = Anthropic(
    api_key="your-tingly-model-token",
    base_url="http://localhost:12580/tingly/anthropic"
)

response = client.messages.create(
    model="tingly",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hello!"}
    ]
)
print(response)

Tingly Box proxies requests transparently for SDKs and CLI tools.
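
Under the hood, the SDK calls above are ordinary HTTP requests. A minimal sketch of the request the OpenAI-compatible route receives, using only the endpoint, model name, and token placeholder from the examples above (the request is built but not sent here):

```python
# Build (but don't send) the HTTP request behind client.chat.completions.create.
import json
import urllib.request

payload = {
    "model": "tingly-gpt",
    "messages": [{"role": "user", "content": "Hello!"}],
}

req = urllib.request.Request(
    "http://localhost:12580/tingly/openai/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": "Bearer your-tingly-model-token",
        "Content-Type": "application/json",
    },
    method="POST",
)

print(req.method, req.full_url)
# With the server running, send it via: urllib.request.urlopen(req)
```

Because the gateway speaks the standard wire formats, any HTTP client can hit these routes directly — the SDKs are just a convenience.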

Using OAuth Providers

You can also add OAuth providers (like Claude Code) and use your existing quota in any OpenAI-compatible tool:

# 1. Add Claude Code via OAuth in Web UI (http://localhost:12580)
# 2. Configure your tool with Tingly Box endpoint

Requests route through your OAuth-authorized provider, using your existing Claude Code quota instead of requiring a separate API key.

This works with any tool that supports OpenAI-compatible endpoints: Cherry Studio, VS Code extensions, or custom AI agents.

OAuth Provider Demo

Web Management UI

Launch the web management interface:

npx tingly-box@latest

Then open http://localhost:12580 in your browser.

Dashboard

Documentation

User Manual – Installation, configuration, and operational guide

Guardrails – Policy-based safety checks, built-in protections, and protected credential masking

MCP Web Tools – Local stdio MCP server for web_search / web_fetch

Contributing

By contributing to this repository, you agree that your contributions may be included in this project under the MPL-2.0 and may also be used by Tingly Inc. under separate commercial licensing terms.

See CONTRIBUTING.md and NOTICE for details.


We welcome contributions!
Please follow the steps below to build from source.

Requires: Go 1.25+, Node.js 20+, pnpm, task

# Install dependencies
# - Go: https://go.dev/doc/install
# - Node.js: https://nodejs.org/
# - pnpm: `npm install -g pnpm`
# - task: https://taskfile.dev/installation/, or `go install github.com/go-task/task/v3/cmd/task@latest`
# - or, without task: copy and run the shell commands from the taskfile directly

git submodule update --init --recursive

# Build with frontend
task build

# Build GUI binary via wails3
task wails:build

Support

Telegram: https://t.me/+V1sqeajw1pYwMzU1
WeChat: tingly-box

Early Contributors

Special badges are minted to recognize the contributions of the following contributors:



License

This project is available under the MPL-2.0, with commercial licensing offered separately by Tingly Inc.

For commercial licensing inquiries, contact biz@tingly.dev.

Directories

Path Synopsis
ask
ai module
cli
harness command
tingly-box command
imbot module
command/tui
Package tui provides interactive terminal prompts for the tingly-box CLI.
feature
Package feature provides experimental feature flag definitions and parsing.
obs
Package obs provides a unified observability layer for Go applications, integrating structured logging, metrics, and distributed tracing via OpenTelemetry.
protocol
Package protocol provides backward compatibility aliases to the public protocol package.
protocol_validate
Package protocoltest provides a framework for end-to-end validation of the model gateway's protocol transformation layer.
remote_control/bot
Package command provides built-in command definitions for the remote control bot.
server
Package server is being refactored and migrated step by step; some API names are not yet unified and will be updated in the future.
server/module/imbot
Package imbotsettings provides handlers for ImBot settings management.
server/module/notify
Package notify is the HTTP front end for scenario plugin events.
server_validate
Package server_validate provides a mock HTTP provider server that speaks OpenAI, Anthropic, and Google response formats for testing purposes.
smart_compact
Package smart_compact provides conversation compression strategies and transformers for Anthropic requests.
typ
virtualmodel
Package virtualmodel defines the protocol-agnostic primitives for virtual models.
virtualmodel/anthropic
Package anthropic provides Anthropic-protocol virtual models.
virtualmodel/benchmark
Package benchmark provides a load-testing client and an in-process server factory for the virtualmodel HTTP service.
virtualmodel/benchmark/examples/client command
Stand-alone benchmark client driver: starts an in-process LocalServer (vmodel-backed) and drives it with the BenchmarkClient against both the OpenAI Chat and Anthropic Messages routes, printing a metrics summary.
virtualmodel/benchmark/examples/server command
Stand-alone benchmark mock server: starts a local HTTP server backed by the production virtualmodel registries (with their default mock models pre-registered) so external benchmark drivers can hit a realistic vmodel surface over loopback.
virtualmodel/openai
Package openai provides OpenAI-protocol virtual models.
virtualmodel/virtualserver
Package virtualserver provides the HTTP handler for virtual model endpoints.
pkg
fs
jsonstore
Package jsonstore provides a generic JSON file-based key-value storage.
notify
Package notify provides a unified notification system for sending messages to various channels like webhooks, Slack, Discord, email, and system notifications.
notify/examples command
Example demonstrating the notification system usage.
notify/provider/discord
Package discord provides a Discord notification provider.
notify/provider/email
Package email provides an SMTP email notification provider.
notify/provider/slack
Package slack provides a Slack notification provider.
notify/provider/system
Package system provides a desktop system notification provider using beeep.
notify/provider/webhook
Package webhook provides an HTTP webhook notification provider.
obs
otel
Package otel provides OpenTelemetry-based observability for LLM token usage.
remote
binding
Package binding describes which channel a scenario should use for a given event.
channel
Package channel defines the human-facing side of the remote middle layer: a Channel is a surface (IM bot, web UI, CLI, …) that can deliver an interaction.Notification or interaction.Interaction to a human and (for interactive ones) collect a Reply.
channel/autochannel
Package autochannel provides a non-IM Channel implementation for headless / CI / programmatic setups.
channel/imchannel
Package imchannel adapts an IM bot (one of the platforms in github.com/tingly-dev/tingly-box/imbot) into the internal/remote/channel.Channel contract.
interaction
Package interaction defines the domain-neutral request / reply / result types that flow between scenarios (back-end content providers) and channels (human-facing surfaces) inside the remote middle layer.
scenario
Package scenario defines the back-end plugin contract of the remote middle layer.
scenario/builtin/claudecode
Package claudecode implements the Claude Code hook scenario plugin.
