# mcp-instructions
**Preset: Awesome Copilot**

Pre-configured to load instructions from github/awesome-copilot, a curated collection of Copilot agents, instructions, and skills.
An MCP server that dynamically serves GitHub Copilot custom instructions from local directories and GitHub repositories.
It discovers `.github/copilot-instructions.md` and `.github/instructions/**/*.instructions.md` files from configured sources and exposes them via the Model Context Protocol.
## Features
- **On-demand loading**: local directories are read live; no eager loading
- **GitHub repo caching**: remote repos are cached locally and synced periodically in the background
- **LLM optimization**: optionally merge and deduplicate instructions from multiple sources via an OpenAI-compatible endpoint
- **Dual transport**: supports both stdio (for Copilot CLI / local tools) and Streamable HTTP (for remote deployments)
- **MCP primitives**: instructions are exposed as Resources, Prompts, and Tools
## Installation

```shell
# go install (requires Go 1.24+)
go install github.com/Arkestone/mcp/servers/mcp-instructions/cmd/mcp-instructions@latest

# Pinned version
go install github.com/Arkestone/mcp/servers/mcp-instructions/cmd/mcp-instructions@v0.0.1

# Docker
docker pull ghcr.io/arkestone/mcp-instructions:latest
```

Pre-built binaries: https://github.com/Arkestone/mcp/releases/latest
## Getting Started

```shell
# Serve instructions from a local directory (stdio)
mcp-instructions -dirs /path/to/repo

# Serve from a GitHub repo with HTTP transport
export GITHUB_TOKEN=ghp_...
mcp-instructions -repos github/awesome-copilot -transport http -addr :8080

# Build from source
make build-instructions   # -> ./bin/mcp-instructions
```
## Configuration
Configuration is loaded in layers (each overrides the previous):
- YAML file: `config.yaml` in the working directory, or a path given with `-config path/to/config.yaml`
- Environment variables
- CLI flags

See `config.example.yaml` for all options.
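A minimal `config.yaml` sketch, inferred from the flag and environment-variable names below. The exact key names are assumptions; `config.example.yaml` in the repository is authoritative.

```yaml
# Illustrative only -- key names are guesses based on the CLI flags.
dirs:
  - /path/to/repo
repos:
  - github/awesome-copilot
transport: http
addr: ":8080"
cache_dir: ~/.cache/mcp-instructions
sync_interval: 5m
llm:
  enabled: false
  model: gpt-4o-mini
```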
### Environment Variables

| Variable | Description |
|---|---|
| `INSTRUCTIONS_CONFIG` | Path to YAML config file |
| `INSTRUCTIONS_DIRS` | Comma-separated local directories |
| `INSTRUCTIONS_REPOS` | Comma-separated GitHub repos (`owner/repo` or `owner/repo@ref`) |
| `INSTRUCTIONS_TRANSPORT` | `stdio` (default) or `http` |
| `INSTRUCTIONS_ADDR` | HTTP listen address (default `:8080`) |
| `INSTRUCTIONS_CACHE_DIR` | Local cache directory (default `~/.cache/mcp-instructions`) |
| `INSTRUCTIONS_SYNC_INTERVAL` | Sync interval for remote repos (default `5m`) |
| `GITHUB_TOKEN` | GitHub API token for private repos |
| `LLM_ENDPOINT` | OpenAI-compatible API endpoint |
| `LLM_MODEL` | Model name (e.g. `gpt-4o-mini`) |
| `LLM_API_KEY` | LLM API key |
| `LLM_ENABLED` | `true` to enable LLM optimization by default |
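Environment-variable configuration is convenient for containers and service managers. A sketch of an HTTP deployment driven entirely by environment (values are examples):

```shell
export INSTRUCTIONS_REPOS=github/awesome-copilot
export INSTRUCTIONS_TRANSPORT=http
export INSTRUCTIONS_ADDR=:8080
export INSTRUCTIONS_SYNC_INTERVAL=10m
mcp-instructions
```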
### CLI Flags

```
-config         Path to YAML config file
-dirs           Comma-separated local directories
-repos          Comma-separated GitHub repos (owner/repo[@ref])
-transport      Transport: stdio (default) or http
-addr           HTTP listen address (default :8080)
-cache-dir      Local cache directory
-sync-interval  Sync interval (e.g. 5m, 1h)
-llm-endpoint   OpenAI-compatible endpoint URL
-llm-model      LLM model name
```
## MCP API

### Resources

- `instructions://{source}/{name}`: content of an individual instruction file from the named source. `{source}` is the directory basename or `owner/repo`; `{name}` is the file path relative to `.github/`.
- `instructions://optimized`: all instructions from every configured source, merged and deduplicated (via LLM if configured, otherwise concatenated).
- `instructions://index`: plain-text index listing all available instruction URIs and their sources.
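For example, with a local checkout named `my-repo` and the `github/awesome-copilot` repo configured, the index would list URIs along these lines (the file names are illustrative):

```
instructions://my-repo/copilot-instructions.md
instructions://my-repo/instructions/go.instructions.md
instructions://github/awesome-copilot/instructions/python.instructions.md
```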
### Tools

- `refresh-instructions`: force an immediate re-sync of all configured sources. Local directories are re-scanned; GitHub repo caches are fetched from the API right away, without waiting for the next background sync interval. No input required.
- `list-instructions`: return metadata for every discovered instruction file: URI, source, name, and path. No input required.
- `optimize-instructions`: return consolidated instruction content, optionally passed through an LLM for merging and deduplication.
  - Input: `optimize` (boolean, optional): override the global `llm.enabled` setting for this request.
### Prompts

- `get-instructions`: inject instruction content into the conversation as a prompt message, optionally filtered to a single source and/or optimized.
  - Parameters:
    - `source` (string, optional): restrict output to one source (directory basename or `owner/repo`).
    - `optimize` (string `"true"`/`"false"`, optional): override the global LLM optimization setting for this request.
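On the wire, a client retrieves this prompt with a standard MCP `prompts/get` request (prompt arguments are strings in MCP, which is why `optimize` is `"true"`/`"false"` here; the `source` value is illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "prompts/get",
  "params": {
    "name": "get-instructions",
    "arguments": { "source": "github/awesome-copilot", "optimize": "true" }
  }
}
```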
## Instruction File Format

The server discovers two kinds of instruction files from each configured source:

| Path | Scope |
|---|---|
| `.github/copilot-instructions.md` | Repository-wide instructions applied to every conversation |
| `.github/instructions/*.instructions.md` | Path-specific instructions (front-matter `applyTo` glob controls scope) |
For details on authoring instruction files, see Customizing Copilot with custom instructions.
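A hypothetical `.github/instructions/go.instructions.md` showing the front-matter scoping (file name and rule text are invented for illustration):

```markdown
---
applyTo: "**/*.go"
---

Use table-driven tests, and wrap returned errors with %w so callers
can match them with errors.Is.
```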
## Docker

```shell
# From the repo root
make docker-instructions

# stdio mode (mount the instructions directory into the container)
docker run -i -v /path/to/repo:/data ghcr.io/arkestone/mcp-instructions:latest -dirs /data

# HTTP mode
docker run -p 8080:8080 \
  -e INSTRUCTIONS_REPOS=github/awesome-copilot \
  -e GITHUB_TOKEN=ghp_... \
  -e INSTRUCTIONS_TRANSPORT=http \
  ghcr.io/arkestone/mcp-instructions:latest
```
## MCP Client Configuration

### VS Code / GitHub Copilot

`.vscode/mcp.json`:

```json
{
  "servers": {
    "instructions": {
      "command": "mcp-instructions",
      "args": ["-dirs", "${workspaceFolder}"]
    }
  }
}
```
#### Method 1: User Configuration (Recommended)

Open the Command Palette (`Ctrl+Shift+P`) and run **MCP: Open User Configuration** to open your user `mcp.json` file, then add the server configuration.

#### Method 2: Workspace Configuration

Add the configuration to `.vscode/mcp.json` in your workspace to share it with your team.
See the VS Code MCP documentation for more details.
### Claude Desktop

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`

```json
{
  "mcpServers": {
    "instructions": {
      "command": "mcp-instructions",
      "args": ["-dirs", "/path/to/repo"]
    }
  }
}
```
### Cursor

`.cursor/mcp.json` (project) or `~/.cursor/mcp.json` (global):

```json
{
  "mcpServers": {
    "instructions": {
      "command": "mcp-instructions",
      "args": ["-dirs", "."]
    }
  }
}
```
### Windsurf

`~/.codeium/windsurf/mcp_config.json`:

```json
{
  "mcpServers": {
    "instructions": {
      "command": "mcp-instructions",
      "args": ["-dirs", "/path/to/repo"]
    }
  }
}
```
### Claude Code

`.mcp.json` (project) or `~/.mcp.json` (global):

```json
{
  "mcpServers": {
    "instructions": {
      "command": "mcp-instructions",
      "args": ["-dirs", "."]
    }
  }
}
```
### Remote (HTTP)

For shared team deployments, start the server with `-transport http` and connect clients over HTTP:

```json
{
  "mcpServers": {
    "instructions": {
      "type": "http",
      "url": "http://localhost:8080/mcp"
    }
  }
}
```
## Directories

| Path | Synopsis |
|---|---|
| `cmd/mcp-instructions` | The `mcp-instructions` command |
| `internal/loader` | Package loader provides on-demand access to Copilot custom instruction files from local directories and GitHub repositories |