manque-ai

command module v1.0.0-beta.9
Published: Jan 7, 2026 License: MIT

AI Code Reviewer

Your Intelligent AI Pair Programmer for GitHub Pull Requests. Instant summaries, deep code analysis, and actionable security insights.


✨ Features

  • 🚀 Dual Mode Operation: Run as a GitHub Action or a local CLI tool.
  • 🤖 Multi-Provider LLM Support: First-class support for OpenAI, Anthropic Claude, Google Gemini, and OpenRouter.
  • 🧠 Intelligent Analysis: Generates executive summaries, walkthroughs, and line-by-line review comments.
  • 🔒 Security First: Dedicated analysis for hardcoded secrets and potential vulnerabilities.
  • 💻 Local Pre-PR Checks: Review your code locally before you even push.
  • 🎨 Custom Styling: Enforce your team's unique style guide and best practices.

💻 Local Development (Pre-PR Check)

Review your changes locally without pushing to GitHub. This is perfect for catching issues early!

1. Installation

Quick Install (Recommended)

./install.sh

Manual Install

go install github.com/igcodinap/manque-ai@latest
# or build from source
git clone https://github.com/igcodinap/manque-ai
cd manque-ai && go build -o manque-ai .

2. Updating

Easily update to the latest version:

manque-ai update

3. Setup (One-time)

You can set your LLM credentials as environment variables or in a .env file in the project root. Note: GH_TOKEN is OPTIONAL for local runs!

Option A: Using a .env file

Copy the example file and fill in your keys:

cp .env.example .env
# Edit .env and add your LLM_API_KEY
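For reference, a minimal .env for the OpenAI provider might look like the following. Values are placeholders only; the variable names match the Configuration Options table further down:

```shell
# .env — placeholder values, do not commit real keys
LLM_PROVIDER=openai
LLM_API_KEY=sk-...
LLM_MODEL=gpt-4o
```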

Option B: Exporting variables (or adding them to .env)

OpenAI

export LLM_PROVIDER=openai
export LLM_API_KEY=sk-...

Anthropic

export LLM_PROVIDER=anthropic
export LLM_API_KEY=sk-ant-...

Local Ollama (no API key required; just point at your local instance)

export LLM_PROVIDER=openai
export LLM_BASE_URL=http://localhost:11434/v1
export LLM_API_KEY=ollama
export LLM_MODEL=llama3 # Make sure to `ollama pull llama3` first!

OpenRouter

export LLM_PROVIDER=openrouter
export LLM_API_KEY=sk-or-...
export LLM_MODEL=anthropic/claude-3.5-sonnet

4. Run Review

# Review changes in your current branch vs main
manque-ai local

# Compare specific branches
manque-ai local --base develop --head feature-login

# Debug mode (see exact API calls and diff sizes)
manque-ai local --debug

🚀 GitHub Action Usage

Integrate directly into your CI/CD pipeline to review every Pull Request automatically.

name: AI Code Review
on:
  pull_request:
    types: [opened, synchronize]

jobs:
  review:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      pull-requests: write
    steps:
      - name: Manque AI
        uses: docker://ghcr.io/igcodinap/manque-ai:latest
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          LLM_API_KEY: ${{ secrets.OPENAI_API_KEY }}
          LLM_PROVIDER: "openai"
          LLM_MODEL: "gpt-4o"

Configuration Options
| Variable | Description | Required (Action) | Required (Local) | Default |
|---|---|---|---|---|
| GH_TOKEN | GitHub API Token | ✅ | ❌ | - |
| LLM_API_KEY | LLM Provider Key | ✅ | ✅ | - |
| LLM_PROVIDER | openai, anthropic, google, openrouter | ❌ | ❌ | openai |
| LLM_MODEL | Specific model ID | ❌ | ❌ | gpt-4o |
| STYLE_GUIDE_RULES | Custom instructions for the AI | ❌ | ❌ | - |
| UPDATE_PR_TITLE | Auto-update PR title | ❌ | N/A | true |
| UPDATE_PR_BODY | Auto-update PR description | ❌ | N/A | true |
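STYLE_GUIDE_RULES accepts free-form instructions that steer the review. A sketch of how it might be used in a local run; the rule text here is purely illustrative, not part of the project:

```shell
# Illustrative only: inject your team's conventions into the review
export STYLE_GUIDE_RULES="Prefer table-driven tests; flag any fmt.Println left in library code."
manque-ai local
```

In the GitHub Action, the same variable can be set under the step's env: block alongside LLM_API_KEY.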

πŸ› οΈ Advanced CLI Usage

The CLI can also be used to review remote PRs or check GitHub Actions context.

# Review a specific remote PR
manque-ai --repo owner/repo --pr 123

# Review by URL
manque-ai --url https://github.com/owner/repo/pull/123

🧠 Architecture

The project is built with modularity in mind, separating the "brain" from the interface.

├── cmd/               # CLI Commands
│   ├── root.go        # GitHub Action / Remote Review
│   └── local.go       # Local Pre-PR Review
├── pkg/
│   ├── review/        # Core Review Engine (Shared Logic)
│   ├── ai/            # LLM Client Adapters
│   ├── diff/          # Git Diff Parser
│   └── github/        # GitHub API Client
└── internal/          # Config & Logging

🤝 Contributing

We love contributions! Please fork the repository and submit a Pull Request.

📄 License

MIT Licensed.
