alice

package module
v0.6.107 Latest
Warning: This package is not in the latest version of its module.
Published: May 5, 2026 License: MIT Imports: 2 Imported by: 0

README

Alice

AI Local Interactive Cross-device Engine

Same AI session, anywhere. Terminal ↔ Feishu. No cloud lock-in. Works with OpenCode (DeepSeek V4), Codex, Claude, Gemini, Kimi.


Chinese README

  • Access your agent from anywhere. Terminal at your desk. Feishu on your phone. Same session, same context — just /session resume.
  • Pick your AI. OpenCode / DeepSeek V4, Codex, Claude, Gemini, Kimi. Mix and match per scene.
  • Zero cloud dependency. The agent CLI runs on your machine. No API keys, no vendor lock-in.
  • Goal mode × DeepSeek = low cost. Fire off dozens of tasks for pennies. Get notified on your phone when done.

A Feishu long-connection connector for CLI-based LLM agents — OpenCode (DeepSeek V4), Codex, Claude, Gemini, Kimi.

Runs as a local multi-bot runtime: receives Feishu messages over WebSocket, routes them into chat or work scenes, calls the configured LLM CLI, and sends replies, files, and images back. Zero cloud dependency — everything runs on your machine.
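The routing step described above (chat scene vs. work scene) can be sketched as a small classifier. This is a hypothetical illustration, not Alice's actual API: the `Scene` type, the `classify` function, and the `#work` prefix handling are assumptions based on the README's `@Alice #work …` example.

```go
package main

import (
	"fmt"
	"strings"
)

// Scene identifies how an incoming Feishu message is handled.
type Scene int

const (
	SceneChat Scene = iota // conversational reply
	SceneWork              // long-running task thread
)

// classify is a hypothetical sketch of the routing step: a message
// tagged "#work" becomes a work-scene task; everything else stays in
// the chat scene. The real router's names and rules may differ.
func classify(text string) (Scene, string) {
	trimmed := strings.TrimSpace(text)
	if rest, ok := strings.CutPrefix(trimmed, "#work"); ok {
		return SceneWork, strings.TrimSpace(rest)
	}
	return SceneChat, trimmed
}

func main() {
	scene, task := classify("#work deploy the staging environment")
	fmt.Println(scene == SceneWork, task)
}
```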

Documentation

Full documentation is at alice-space.github.io/alice.

  • Tutorials: Get Alice running in 5 minutes
  • How-To Guides: Task-focused recipes
  • Configuration Reference: Every config key documented
  • Architecture: Code-level architecture

Chinese documentation »

Quick Start

npm install -g @alice_space/alice
alice setup
# edit ~/.alice/config.yaml
alice --feishu-websocket

Then in Feishu: @Alice #work deploy the staging environment — Alice creates a task thread, runs your LLM backend, and streams progress back. Use /session anytime to resume the task from your terminal.

Development

make check   # fmt, vet, test, race
make build
make run

Contribution guide: CONTRIBUTING.md

License

MIT

Documentation

Index

Constants

This section is empty.

Variables

var ConfigExampleYAML = mustReadFile(embeddedFiles, "config.example.yaml")
var OpenCodePluginJS = mustReadFile(embeddedFiles, "opencode-plugin/delegate.js")
var PromptFS = mustSub(embeddedFiles, "prompts")
var SkillsFS = mustSub(embeddedFiles, "skills")
var SoulExampleMarkdown = mustReadFile(embeddedFiles, "prompts/SOUL.md.example")
var SystemdUnitTmpl = mustReadFile(embeddedFiles, "opencode-plugin/alice.service.tmpl")

Functions

This section is empty.

Types

This section is empty.

Directories

Path Synopsis
cmd
connector command
internal
llm
Package llm provides a unified interface for running LLM agent CLIs (claude, codex, gemini, kimi, etc.) as subprocess backends.
llm/providers/claude
Package claude drives the claude CLI as a subprocess and parses its stream-json output into a plain text reply.
llm/providers/codex
Package codex drives the codex CLI as a subprocess and parses its JSON-lines output into a plain text reply with optional file-change events.
llm/providers/gemini
Package gemini drives the gemini CLI as a subprocess and parses its JSON output into a plain text reply.
llm/providers/kimi
Package kimi drives the kimi CLI as a subprocess and parses its stream-json output into a plain text reply.
llm/providers/opencode
Package opencode drives the opencode CLI as a subprocess and parses its JSON-lines output into a plain text reply, session ID, and token usage.
