langchain-go

v0.1.2
Published: Apr 12, 2026 License: MIT

LangChain for Go -- build production-grade AI agents as single, high-performance binaries.

langchain-go brings the battle-tested LangChain framework natively to Go, giving you agents, chains, tools, memory, vector stores, and LLM integrations without leaving the Go ecosystem.

Quickstart

go get github.com/LucaLanziani/langchain-go
Simple chain (prompt + model + parser)
package main

import (
    "context"
    "fmt"
    "log"

    "github.com/LucaLanziani/langchain-go/outputparsers"
    "github.com/LucaLanziani/langchain-go/prompts"
    "github.com/LucaLanziani/langchain-go/providers/openai"
    "github.com/LucaLanziani/langchain-go/runnable"
)

func main() {
    ctx := context.Background()

    prompt := prompts.NewChatPromptTemplate(
        prompts.System("You are a helpful assistant that tells jokes."),
        prompts.Human("Tell me a short joke about {topic}"),
    )
    model  := openai.New()
    parser := outputparsers.NewStringOutputParser()

    // Compose: prompt -> model -> parser
    chain := runnable.Pipe3(prompt, model, parser)

    result, err := chain.Invoke(ctx, map[string]any{"topic": "golang"})
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println(result)
}
Agent with tool use
package main

import (
    "context"
    "fmt"
    "log"

    "github.com/LucaLanziani/langchain-go/agents"
    "github.com/LucaLanziani/langchain-go/prompts"
    "github.com/LucaLanziani/langchain-go/providers/openai"
    "github.com/LucaLanziani/langchain-go/tools"
)

func main() {
    ctx := context.Background()

    calc := tools.NewTool("calculator", "Evaluate math expressions.",
        func(_ context.Context, input string) (string, error) {
            return "42", nil // replace with real eval
        },
    )

    prompt := prompts.NewChatPromptTemplate(
        prompts.System("You are a helpful assistant. Use tools when needed."),
        prompts.Placeholder("agent_scratchpad"),
        prompts.Human("{input}"),
    )

    agent   := agents.NewToolCallingAgent(openai.New(), []tools.Tool{calc}, prompt)
    exec    := agents.NewAgentExecutor(agent, []tools.Tool{calc})

    result, err := exec.Invoke(ctx, map[string]any{"input": "What is 6 * 7?"})
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println(result["output"])
}

Architecture

langchain-go follows Go-idiomatic design principles:

| Python LangChain | langchain-go | Notes |
| --- | --- | --- |
| `invoke` / `ainvoke` | `Invoke` | Single method; use goroutines for concurrency |
| `stream` / `astream` | `Stream` | Returns `*StreamIterator[T]` |
| `batch` / `abatch` | `Batch` | Parallel by default, with `MaxConcurrency` control |
| LCEL pipe operator | `runnable.Pipe2`, `Pipe3`, `Pipe4` | Type-safe composition |
| `RunnableParallel` | `runnable.NewParallel` | Fan-out / fan-in |
| `RunnableLambda` | `runnable.NewLambda` | Wrap any Go function |
| `RunnableBranch` | `runnable.NewBranch` | Conditional routing |
| `**kwargs` | Functional options (`...core.Option`) | e.g. `WithTemperature(0.7)` |
| `async` / `await` | `context.Context` | Cancellation and timeouts |
Core interface

Every component implements Runnable[I, O]:

type Runnable[I, O any] interface {
    Invoke(ctx context.Context, input I, opts ...Option) (O, error)
    Stream(ctx context.Context, input I, opts ...Option) (*StreamIterator[O], error)
    Batch(ctx context.Context, inputs []I, opts ...Option) ([]O, error)
    GetName() string
}

Packages

| Package | Description |
| --- | --- |
| `core` | Core types: messages, documents, `Runnable` interface, config, callbacks |
| `prompts` | Prompt templates (`PromptTemplate`, `ChatPromptTemplate`, `MessagesPlaceholder`) |
| `outputparsers` | Output parsers (`StringOutputParser`, `JSONOutputParser`) |
| `runnable` | Composition primitives (`Sequence`, `Parallel`, `Lambda`, `Passthrough`, `Branch`) |
| `llms` | Chat model interface and option types |
| `providers/openai` | OpenAI chat models and embeddings |
| `providers/anthropic` | Anthropic/Claude chat models |
| `tools` | Tool interface and typed tool factory |
| `agents` | Agent implementations (`ToolCalling`, `ReAct`) and `AgentExecutor` |
| `chains` | Common chain patterns (`LLMChain`, `StuffDocuments`, `RetrievalQA`) |
| `memory` | Conversation memory (`Buffer`, `Window`) |
| `embeddings` | `Embedder` interface |
| `vectorstores` | Vector store interface + in-memory implementation |
| `retrievers` | Retriever interface wrapping vector stores |
| `textsplitters` | Text splitting utilities |
| `callbacks` | Callback handlers (`Stdout`, `LangSmith`) |

Providers

OpenAI
import "github.com/LucaLanziani/langchain-go/providers/openai"

model := openai.New(
    openai.WithAPIKey("sk-..."),       // or OPENAI_API_KEY env var
    openai.WithModelName("gpt-4o"),
)
Anthropic
import "github.com/LucaLanziani/langchain-go/providers/anthropic"

model := anthropic.New(
    anthropic.WithAPIKey("sk-..."),    // or ANTHROPIC_API_KEY env var
    anthropic.WithModelName("claude-sonnet-4-20250514"),
    anthropic.WithMaxTokens(4096),
)

Streaming

stream, err := model.Stream(ctx, messages)
if err != nil {
    log.Fatal(err)
}
for {
    chunk, ok, err := stream.Next()
    if err != nil {
        log.Fatal(err)
    }
    if !ok {
        break
    }
    fmt.Print(chunk.Content)
}
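The `Next() (chunk, ok, err)` pull style above is the classic channel-backed iterator pattern: a producer goroutine pushes chunks into a channel and closes it when done. A minimal sketch of how such an iterator can be built (an illustration of the pattern, not the library's `StreamIterator` implementation):

```go
package main

import "fmt"

// StreamIterator delivers values pushed into a channel; the stream
// ends when the channel is closed, matching Next's (value, ok, err)
// contract: ok reports whether a value was received.
type StreamIterator[T any] struct {
	ch  <-chan T
	err error // terminal error, surfaced after the channel closes
}

func (s *StreamIterator[T]) Next() (T, bool, error) {
	v, ok := <-s.ch
	if !ok {
		var zero T
		return zero, false, s.err
	}
	return v, true, nil
}

func main() {
	ch := make(chan string)
	go func() {
		defer close(ch)
		for _, chunk := range []string{"Hello", ", ", "world"} {
			ch <- chunk
		}
	}()
	it := &StreamIterator[string]{ch: ch}
	for {
		chunk, ok, err := it.Next()
		if err != nil || !ok {
			break
		}
		fmt.Print(chunk)
	}
	fmt.Println() // prints "Hello, world"
}
```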

Tools

Create tools from Go functions with automatic JSON Schema generation:

type SearchArgs struct {
    Query string `json:"query" description:"The search query"`
    Limit int    `json:"limit,omitempty" description:"Max results"`
}

search := tools.NewTypedTool("search", "Search the web", SearchArgs{},
    func(ctx context.Context, args SearchArgs) (string, error) {
        return fmt.Sprintf("Results for: %s", args.Query), nil
    },
)
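The automatic schema generation can be pictured as reflection over the struct's `json` and `description` tags. A simplified, self-contained sketch of that idea (handles only string and int fields, and treats `omitempty` as "optional"; the library's real generator is assumed to be more complete):

```go
package main

import (
	"encoding/json"
	"fmt"
	"reflect"
	"strings"
)

type SearchArgs struct {
	Query string `json:"query" description:"The search query"`
	Limit int    `json:"limit,omitempty" description:"Max results"`
}

// schemaFor builds a minimal JSON Schema object from a struct's
// json and description tags via reflection.
func schemaFor(v any) map[string]any {
	t := reflect.TypeOf(v)
	props := map[string]any{}
	var required []string
	for i := 0; i < t.NumField(); i++ {
		f := t.Field(i)
		name, opts, _ := strings.Cut(f.Tag.Get("json"), ",")
		typ := map[reflect.Kind]string{
			reflect.String: "string",
			reflect.Int:    "integer",
		}[f.Type.Kind()]
		props[name] = map[string]any{
			"type":        typ,
			"description": f.Tag.Get("description"),
		}
		// Fields without omitempty are treated as required.
		if !strings.Contains(opts, "omitempty") {
			required = append(required, name)
		}
	}
	return map[string]any{
		"type":       "object",
		"properties": props,
		"required":   required,
	}
}

func main() {
	b, _ := json.MarshalIndent(schemaFor(SearchArgs{}), "", "  ")
	fmt.Println(string(b))
}
```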

Memory

mem := memory.NewConversationBufferMemory()
mem.SaveContext(ctx,
    map[string]any{"input": "Hello"},
    map[string]any{"output": "Hi there!"},
)
vars, _ := mem.LoadMemoryVariables(ctx, nil)
fmt.Println(vars["history"]) // "Human: Hello\nAI: Hi there!"

Callbacks and Observability

// Stdout debugging
result, err := chain.Invoke(ctx, input,
    core.WithCallbacks(callbacks.NewStdoutHandler()),
)

// LangSmith tracing
result, err := chain.Invoke(ctx, input,
    core.WithCallbacks(callbacks.NewLangSmithHandler("my-project")),
)

Examples

See the examples/ directory:

  • simple_chain: simple chain using prompt + model + output parser
  • agent_tool_use: agent with tool use
  • streaming: streaming chat completion with OpenAI
  • rag_pipeline: RAG (Retrieval Augmented Generation) pipeline
  • copilot_chat: GitHub Copilot chat using the Copilot SDK provider

Requirements

  • Go 1.22 or later
  • API keys for the providers you want to use (OpenAI, Anthropic)

License

MIT

Directories

Path Synopsis
Package agents provides agent implementations that use language models to determine which actions to take.
Package callbacks provides callback handler implementations for observability, tracing, and debugging of LangChain executions.
Package chains provides common chain patterns for composing LangChain components.
Package core provides the fundamental types and interfaces for langchain-go.
Package embeddings provides the interface for text embedding models.
examples
agent_tool_use command
Example: Agent with tool use — demonstrates the "20 lines of Go" goal.
copilot_chat command
Example: GitHub Copilot chat — demonstrates using the Copilot SDK provider.
rag_pipeline command
Example: RAG (Retrieval Augmented Generation) pipeline.
simple_chain command
Example: Simple chain using prompt + model + output parser.
streaming command
Example: Streaming chat completion with OpenAI.
Package llms provides the interfaces for language model integrations.
Package memory provides memory implementations for maintaining conversation state across interactions.
Package outputparsers provides parsers for extracting structured data from language model outputs.
Package prompts provides prompt template types for formatting inputs into prompts for language models.
Package provider implements a unified interface for creating and managing multiple LLM providers with intelligent routing, fallback strategies, and metrics tracking.
providers
anthropic
Package anthropic provides an Anthropic/Claude chat model implementation.
github-copilot
Package copilot provides a GitHub Copilot chat model implementation using the Copilot SDK.
ollama
Package ollama provides a chat model and embeddings implementation using the Ollama HTTP API.
openai
Package openai provides an OpenAI chat model implementation.
Package retrievers provides the retriever interface and implementations for document retrieval.
Package runnable provides composition primitives for building chains from Runnable components (the Go equivalent of LCEL).
Package textsplitters provides utilities for splitting text into chunks.
Package tools provides the tool interface and utilities for creating tools that can be used by LangChain agents.
Package vectorstores provides interfaces and implementations for vector stores.
inmemory
Package inmemory provides an in-memory vector store implementation.
