llm

command
v0.6.0 Latest
Warning: This package is not in the latest version of its module.
Published: Dec 21, 2025 License: MIT Imports: 8 Imported by: 0

Documentation

Overview

Package main demonstrates LLM integration with Romancy workflows.

This example shows:

  • Ad-hoc LLM calls with llm.Call()
  • Structured output with llm.CallParse()
  • Multi-turn conversations with llm.CallMessages()
  • Reusable LLM definitions with llm.DefineDurableCall()
  • App-level LLM defaults with llm.SetAppDefaults()

The key benefit is that all LLM calls are automatically cached as activities, so workflow replay will return cached results without re-invoking the LLM API.
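The caching behavior described above can be sketched with a simple keyed cache: on first execution the (stand-in) LLM call runs and its result is stored; on replay the stored result is returned and the call is never re-invoked. This is an illustration of the idea only, with assumed names (durableCall, callLLM); Romancy's actual mechanism records results as activities in workflow history.

```go
package main

import "fmt"

// calls counts how many times the stand-in LLM API is invoked.
var calls int

// callLLM stands in for a real LLM API call.
func callLLM(prompt string) string {
	calls++
	return "echo: " + prompt
}

// durableCall caches results by key, so replaying the same step
// returns the cached result without re-invoking callLLM.
func durableCall(cache map[string]string, key, prompt string) string {
	if out, ok := cache[key]; ok {
		return out // replay path: cache hit, no API call
	}
	out := callLLM(prompt)
	cache[key] = out
	return out
}

func main() {
	cache := map[string]string{}
	first := durableCall(cache, "step-1", "summarize")
	second := durableCall(cache, "step-1", "summarize") // replay: cache hit
	fmt.Println(first == second, calls)                 // true 1
}
```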
