mattermost-plugin-ai

v1.7.2
Published: Dec 12, 2025 License: Apache-2.0


Mattermost Agents Plugin

The Mattermost Agents Plugin integrates AI capabilities directly into your Mattermost workspace. Run any local LLM on your own infrastructure or connect to a cloud provider: you control your data and deployment. The plugin supports both self-hosted and vendor-hosted LLMs.

Key Features

  • Multiple AI Assistants: Configure different agents with specialized personalities and capabilities
  • Thread & Channel Summarization: Get concise summaries of long discussions with a single click
  • Action Item Extraction: Automatically identify and extract action items from threads
  • Meeting Transcription: Transcribe and summarize meeting recordings
  • Semantic Search: Find relevant content across your Mattermost instance using natural language
  • Smart Reactions: Let AI suggest contextually appropriate emoji reactions
  • Direct Conversations: Chat directly with AI assistants in dedicated channels
  • Flexible LLM Support: Use local models (Ollama, vLLM, etc.), cloud providers (OpenAI, Anthropic, Azure), or any OpenAI-compatible API

Documentation

Comprehensive documentation is available in the /docs directory.

Installation

  1. Download the latest release from the releases page. You can also download the experimental latest master build.
  2. Upload and enable the plugin through the Mattermost System Console
  3. Configure your desired LLM provider settings
System Requirements
  • Mattermost Server versions:
    • v10.0 or later recommended
    • v9.11+ (ESR)
  • PostgreSQL database with pgvector extension for semantic search capabilities
  • Network access to your chosen LLM provider
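
Semantic search requires the pgvector extension to be available in the Mattermost database. A minimal sketch of enabling it, assuming the database is named mattermost (adjust the database name and connection details for your deployment):

```shell
# Enable the pgvector extension on the Mattermost database.
# The database name "mattermost" is an assumption; use your actual database.
psql -d mattermost -c "CREATE EXTENSION IF NOT EXISTS vector;"
```

The extension must be installed on the database server (for example via the postgresql-XX-pgvector package) before CREATE EXTENSION will succeed.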

Quick Start

After installation, complete these steps to get started:

  1. Navigate to System Console > Plugins > Agents
  2. Create an agent and configure it with your LLM provider credentials
  3. Set permissions for who can access the agent
  4. Open the Agents panel from any channel using the AI icon in the right sidebar
  5. Start interacting with your AI assistant

For detailed configuration instructions, see the Admin Guide.

Integration

Bridge Client

The plugin provides a Go client library for other Mattermost plugins and the Mattermost server to interact with the AI plugin's LLM Bridge API. This allows you to easily add AI capabilities to your own plugins or server-side features.

See the Bridge Client README for installation and usage instructions.

Development

Prerequisites
  • Go 1.24+
  • Node.js 20.11+
  • Access to an LLM provider (OpenAI, Anthropic, etc.)
Local Setup
  1. Set up your Mattermost development environment by following the Mattermost developer setup guide. If you are developing against a remote Mattermost server, you can skip this step.

  2. Set up your Mattermost plugin development environment by following the Plugin Developer setup guide.

  3. Clone the repository:

git clone https://github.com/mattermost/mattermost-plugin-ai.git
cd mattermost-plugin-ai
  4. Optional: if you are deploying to a remote server, set these environment variables:
MM_SERVICESETTINGS_SITEURL=http://localhost:8065
MM_ADMIN_USERNAME=<YOUR_USERNAME>
MM_ADMIN_PASSWORD=<YOUR_PASSWORD>
  5. Run make deploy to build and deploy the plugin:
make deploy
Other make commands
  • Run make help for a list of all make commands
  • Run make check-style to verify code style
  • Run make test to run the test suite
  • Run make e2e to run the e2e tests
  • Run make evals to run prompt evaluations interactively (with TUI)
  • Run make evals-ci to run prompt evaluations in CI mode (non-interactive)
Benchmark Tests

The streaming code has benchmark tests to measure performance and detect regressions:

# Run all streaming benchmarks
go test -bench=. -benchmem ./llm/... ./streaming/...

# Run specific benchmark
go test -bench=BenchmarkStreamToPost -benchmem ./streaming/...

# Run with CPU profiling
go test -bench=BenchmarkReadAll -cpuprofile=cpu.prof ./llm/...

Benchmarks cover:

  • ReadAll() stream consumption with varying response sizes
  • TokenUsageLoggingWrapper interception overhead
  • StreamToPost() full processing pipeline with WebSocket events
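
As a rough illustration of the pattern these benchmarks follow, here is a self-contained sketch using testing.Benchmark. The readAll function is a simplified stand-in for stream consumption, not the plugin's actual ReadAll implementation:

```go
package main

import (
	"fmt"
	"strings"
	"testing"
)

// readAll simulates consuming a token stream into a single string,
// the kind of hot path the ReadAll benchmarks measure (illustrative stand-in).
func readAll(tokens []string) string {
	var sb strings.Builder
	for _, t := range tokens {
		sb.WriteString(t)
	}
	return sb.String()
}

func main() {
	tokens := []string{"Hello", ", ", "world"}

	// testing.Benchmark lets you run a benchmark function outside `go test`.
	res := testing.Benchmark(func(b *testing.B) {
		for i := 0; i < b.N; i++ {
			_ = readAll(tokens)
		}
	})
	fmt.Println(readAll(tokens)) // Hello, world
	fmt.Println(res.N > 0)       // true: the benchmark body ran at least once
}
```

In the real suite the same loop body would call the streaming code under test, and -benchmem reports per-iteration allocations alongside ns/op.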
Multi-Provider Evaluation Support

The evaluation system supports testing with multiple LLM providers: OpenAI, Anthropic, and Azure OpenAI. By default, evaluations run against all providers, but you can target specific ones:

# Run with all providers (default)
make evals

# Run with only Anthropic
LLM_PROVIDER=anthropic make evals

# Run with OpenAI and Azure
LLM_PROVIDER=openai,azure make evals

# Use a specific model
ANTHROPIC_MODEL=claude-3-opus-20240229 make evals

See cmd/evalviewer/README.md for complete documentation on environment variables and configuration options.

License

This repository is licensed under Apache 2.0, except for the server/enterprise directory, which is licensed under the Mattermost Source Available License. See that license to learn more.

Directories

Path Synopsis
Asage experimental LLM provider.
build
manifest command
pluginctl command
main handles deployment of the plugin to a development server using the Client4 API.
llm
Package llm provides a unified abstraction layer for Large Language Model interactions within the Mattermost AI plugin.
Package mcp implements Dynamic Client Registration Protocol (RFC 7591).
cmd command
public
bridgeclient
Package bridgeclient provides a client library for Mattermost plugins and the server to interact with the AI plugin's LLM Bridge API to make requests to Agents to LLM providers.
