Overview
Package main demonstrates LLM integration with Romancy workflows.
This example shows:
- Ad-hoc LLM calls with llm.Call()
- Structured output with llm.CallParse()
- Multi-turn conversations with llm.CallMessages()
- Reusable LLM definitions with llm.DefineDurableCall()
- App-level LLM defaults with llm.SetAppDefaults()
The key benefit is that all LLM calls are automatically cached as activities, so workflow replay will return cached results without re-invoking the LLM API.