wingman-agent

command module
v0.6.3
Published: May 3, 2026 · License: MIT · Imports: 21 · Imported by: 0


Wingman Agent

A powerful AI-powered coding assistant that runs directly in your terminal. Wingman helps you with coding tasks by reading files, executing commands, editing code, and writing new files – all through natural conversation.


✨ Features

  • Interactive TUI – Rich terminal interface with markdown rendering and syntax highlighting
  • File Operations – Read, write, edit, and search files in your codebase
  • Shell Integration – Execute shell commands with user approval
  • LSP Integration – Code intelligence via auto-detected language servers (definitions, references, diagnostics, call hierarchy, and more)
  • MCP Support – Extend functionality with Model Context Protocol servers
  • Multi-Model Support – Works with any OpenResponses API compatible endpoint with auto-selection
  • Rewind & Diff – Checkpoint-based undo with visual diff viewer
  • Skills – Define custom workflows using Agent Skills format
  • Image Support – Paste images from clipboard for vision-capable models
  • File Context – Add files to context with @ or drag-and-drop file paths
  • Theme Detection – Automatic light/dark theme based on terminal settings
  • Session Management – Conversations are saved automatically and can be resumed

📦 Installation

Homebrew (macOS / Linux)
brew install adrianliechti/tap/wingman
Scoop (Windows)
scoop bucket add adrianliechti https://github.com/adrianliechti/scoop-bucket
scoop install wingman
From Source
go install github.com/adrianliechti/wingman-agent@latest
Build Locally
git clone https://github.com/adrianliechti/wingman-agent.git
cd wingman-agent
go build -o wingman .

🚀 Quick Start

  1. Set up your API key:
# For any OpenAI-compatible API endpoint
export OPENAI_API_KEY="your-api-key"

# Optional: custom endpoint (defaults to OpenAI)
export OPENAI_BASE_URL="https://your-api-endpoint/v1"
  2. Run Wingman in your project directory:
wingman
  3. Start chatting! Ask Wingman to help with coding tasks:
> Show me all TODO comments in this project
> Refactor the config package to use dependency injection
> Write tests for the agent module
  4. Resume a previous session:
wingman --resume              # resume the most recent session
wingman --resume <session-id> # resume a specific session

βš™οΈ Configuration

Environment Variables
Variable | Description
OPENAI_API_KEY | OpenAI API key (required)
OPENAI_BASE_URL | Custom OpenAI-compatible API endpoint
OPENAI_MODEL | Model to use (auto-selected if not specified)

Alternative: Wingman Server

Variable | Description
WINGMAN_URL | Wingman server URL (takes priority over OpenAI vars)
WINGMAN_TOKEN | Wingman authentication token
WINGMAN_MODEL | Model to use

Note: The fetch (URL fetching) and search_online (web search) tools require WINGMAN_URL to be set, as they delegate to the Wingman server's extract and search APIs.
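
As a minimal sketch, the server-backed configuration could live in your shell profile. The URL, token, and model name below are placeholders, not real values:

```shell
# Placeholder values -- substitute your own Wingman server deployment.
export WINGMAN_URL="https://wingman.example.com"
export WINGMAN_TOKEN="your-token"
# Optional: pin a model; otherwise one is auto-selected.
export WINGMAN_MODEL="your-model"
```

Because WINGMAN_URL takes priority over the OpenAI variables, setting it is enough to switch backends without unsetting OPENAI_API_KEY.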

Project Configuration

Create an AGENTS.md (or CLAUDE.md) file in your project root to provide context-specific instructions. Wingman walks up from your working directory and reads all matching files it finds, so you can layer project and workspace-level guidelines:

# Project Guidelines

- Use Go 1.25+ features
- Follow standard Go project layout
- Write tests for all new functionality
MCP Integration

Add an mcp.json file to integrate with MCP servers:

{
  "mcpServers": {
    "my-server": {
      "command": "npx",
      "args": ["-y", "@my-org/my-mcp-server"]
    }
  }
}

Remote (HTTP/SSE) servers are also supported via the url and optional headers fields.
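
For illustration, a remote entry might look like the following sketch; the server name, endpoint URL, and header value are hypothetical:

```json
{
  "mcpServers": {
    "remote-server": {
      "url": "https://mcp.example.com/sse",
      "headers": {
        "Authorization": "Bearer your-token"
      }
    }
  }
}
```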

πŸ› οΈ Built-in Tools

Wingman comes with powerful built-in tools:

Tool | Description
read | Read file contents with optional line range
write | Create or overwrite files
edit | Make surgical edits to existing files
ls | List directory contents
find | Find files using glob patterns
grep | Search file contents using regex patterns
shell | Execute shell commands
fetch | Fetch and extract content from a URL (requires WINGMAN_URL)
search_online | Search the web for up-to-date information (requires WINGMAN_URL)
agent | Launch a sub-agent to handle independent tasks in a separate context
lsp | Code intelligence (definitions, references, diagnostics, symbols, call hierarchy)
LSP Support

Wingman automatically detects and connects to language servers based on project files. No configuration needed – if you have a language server installed, Wingman will use it.

Language | Server | Detected By
Go | gopls | go.mod, go.work
TypeScript/JS | typescript-language-server, vtsls | tsconfig.json, package.json
Deno | deno lsp | deno.json, deno.jsonc
Python | basedpyright, pyright, pylsp, jedi-language-server | pyproject.toml, requirements.txt
Rust | rust-analyzer | Cargo.toml
C/C++ | clangd, ccls | compile_commands.json, CMakeLists.txt
Java | jdtls | pom.xml, build.gradle
C# | omnisharp, csharp-ls | *.csproj, *.sln
F# | fsautocomplete | *.fsproj, *.sln
Ruby | ruby-lsp, solargraph | Gemfile
PHP | intelephense, phpactor | composer.json
Swift | sourcekit-lsp | Package.swift
Kotlin | kotlin-language-server | build.gradle.kts
Scala | metals | build.sbt
Dart | dart language-server | pubspec.yaml
Zig | zls | build.zig
Lua | lua-language-server | .luarc.json
Elixir | elixir-ls, lexical | mix.exs
Haskell | haskell-language-server | stack.yaml, *.cabal
OCaml | ocamllsp | dune-project
Clojure | clojure-lsp | deps.edn, project.clj
Gleam | gleam lsp | gleam.toml
Nix | nixd | flake.nix, default.nix
Vue | vue-language-server | package.json
Svelte | svelteserver | package.json
Astro | astro-ls | package.json
Bash | bash-language-server | .bashrc, *.sh
Terraform | terraform-ls | main.tf, .terraform
YAML | yaml-language-server | .yamllint, docker-compose.yml
Docker | docker-langserver | Dockerfile
Prisma | prisma language-server | schema.prisma
Typst | tinymist | typst.toml
LaTeX | texlab | .latexmkrc

The LSP tool provides these operations:

  • diagnostics / workspaceDiagnostics – Compiler errors and warnings
  • definition / implementation – Navigate to symbol definitions or interface implementations
  • references – Find all usages of a symbol
  • hover – Type information and documentation
  • documentSymbol / workspaceSymbol – List or search symbols
  • incomingCalls / outgoingCalls – Explore call graphs

🎨 Modes

  • Agent Mode – Full autonomous operation with tool execution
  • Plan Mode – Planning and analysis without project source edits

Toggle between modes using Tab or the explicit /plan and /agent commands.

⌨️ Keyboard Shortcuts

Shortcut | Action
Enter | Send message
Tab | Toggle Agent/Plan mode (or autocomplete slash commands)
Shift+Tab | Cycle through available models
@ | Open fuzzy file picker to add file context
Ctrl+V / Cmd+V | Paste image or text from clipboard
Ctrl+E | Toggle tool output expansion
Ctrl+T | Toggle mouse capture (enables native text selection)
Ctrl+Y | Copy last assistant response to clipboard
Ctrl+L | Clear chat history
Escape | Cancel stream, close modal, or clear input
Ctrl+C | Copy selected text, close modal, cancel stream, or exit

πŸ“ Commands

Command | Description
/help | Show available commands and skills
/model | Select AI model from available options
/effort | Set reasoning effort (auto, low, medium, high)
/plan | Enter planning mode
/agent | Return to execution mode
/problems | Show LSP diagnostics for the workspace
/diff | Show changes from session baseline (requires git)
/rewind | Restore to a previous checkpoint (requires git)
/copy | Copy last assistant response to clipboard
/paste | Paste from clipboard
/resume | Resume the most recent saved session
/clear | Clear chat history
/quit | Exit application

Skill slash commands (e.g. /commit, /code-review) also appear here – see Skills below.

🔧 Skills

Skills are reusable, invocable workflows defined in SKILL.md files. Wingman discovers skills from these locations (later directories take priority):

Personal skills (user-wide, across all projects):

  • ~/.agents/skills/<name>/SKILL.md
  • ~/.wingman/skills/<name>/SKILL.md
  • ~/.claude/skills/<name>/SKILL.md
  • ~/.config/opencode/skills/<name>/SKILL.md

Project skills (scoped to the current repo):

  • .agents/skills/<name>/SKILL.md
  • .wingman/skills/<name>/SKILL.md
  • .claude/skills/<name>/SKILL.md
  • .opencode/skills/<name>/SKILL.md

Project skills override personal skills with the same name, allowing per-project customization.

Bundled Skills

Wingman ships with built-in skills that are available immediately via slash commands and are materialized to ~/.wingman/skills/ on first use so you can customize them:

Skill | Description
/init | Scan the project and generate an AGENTS.md with conventions and build commands
/commit | Stage and commit changes with a well-crafted commit message
/code-review | Review code changes for correctness, style, and security
/security-review | Deep security audit using parallel sub-agents
/simplify | Review changed code for reuse, quality, and efficiency, then fix issues
Custom Skill Example
---
name: run-tests
description: Run the project test suite with coverage
---

# Testing Skill

Run tests with: `go test -cover ./...`

Place this file at .wingman/skills/run-tests/SKILL.md and invoke it with /run-tests.

Skills support argument placeholders (${ARGUMENTS}, ${1}, named args) for parameterized workflows.
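
As an illustrative sketch of a parameterized skill (the skill name, description, and body are invented for this example), ${1} receives the first argument passed to the slash command:

```markdown
---
name: release-notes
description: Draft release notes for a given version
---

# Release Notes Skill

Draft release notes for version ${1} by summarizing `git log` since the previous tag.
```

Placed at .wingman/skills/release-notes/SKILL.md, this would be invoked as /release-notes v1.2.0, with ${1} expanding to v1.2.0.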

🖥️ Server Mode

Wingman includes a web-based UI server – useful for IDE integrations or browser-based access:

wingman server [--port 4242]

This starts an HTTP server at http://localhost:4242 with a React UI featuring a chat panel, file browser, diff viewer, checkpoint browser, diagnostics panel, and session management. The server uses WebSockets for real-time streaming.

🔀 Proxy Mode

When WINGMAN_URL is set, Wingman can act as a local API proxy with a TUI dashboard for inspecting requests:

wingman proxy [--port 4242]

This starts a local OpenAI-compatible proxy server that forwards requests to your Wingman server, showing real-time request/response details in a terminal UI.
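
Since the proxy is OpenAI-compatible, other tools can likely be pointed at it via the standard environment variables. This is a sketch under the assumption that the proxy serves the conventional /v1 path; the README does not spell out the exact route:

```shell
# Hypothetical usage: run `wingman proxy --port 4242` in another terminal,
# then direct any OpenAI-compatible client at the local proxy.
export OPENAI_BASE_URL="http://localhost:4242/v1"   # assumed /v1 path
export OPENAI_API_KEY="unused-locally"              # assumed: proxy authenticates upstream itself
```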

🧩 CLI Wrappers

When WINGMAN_URL is set, Wingman can launch other coding agents pre-configured to use your Wingman server as their backend:

wingman codex [args...]    # Launch OpenAI Codex CLI
wingman claude [args...]   # Launch Claude Code
wingman gemini [args...]   # Launch Gemini CLI
wingman opencode [args...] # Launch OpenCode

Each wrapper automatically configures the target CLI tool with the correct endpoint and authentication.

🤖 Claw Mode

Wingman includes an experimental multi-agent orchestration mode:

wingman claw

Claw manages a pool of named agents with persistent memory, scheduled tasks, and a TUI interface. Each agent has its own sandboxed workspace and can spawn sub-agents. Agents persist their sessions across restarts and support proactive check-in schedules.

Documentation

There is no documentation for this package.

Directories

Path | Synopsis
pkg |
lsp |
lsp/jsonrpc2 | Package jsonrpc2 is a minimal implementation of the JSON RPC 2 spec.
mcp |
text | Package text provides small string utilities shared across the codebase.
tui |
run |
