agentcli

package
v0.6.0
Published: Mar 31, 2026 License: MIT Imports: 6 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func DailyPrompt

func DailyPrompt(observations []string, date, guidelines string, factPaths ...string) string

DailyPrompt builds a prompt for distilling observations into a daily memory summary. If guidelines is non-empty, it is prepended as team-specific distillation preferences. Optional factPaths are relative file paths to fact JSONL files (discussion, github, etc.) that the AI coworker should read and incorporate into the summary.

func DiscussionFactsPrompt

func DiscussionFactsPrompt(title, summary, transcript, guidelines, annotations string) string

DiscussionFactsPrompt builds a prompt for extracting structured facts from a discussion. The LLM extracts decisions, learnings, open questions, action items, and key context as JSONL output (one JSON object per line). When annotations is non-empty, server-extracted annotations are included as ground truth.

func MonthlyPrompt

func MonthlyPrompt(weeklySummaries []string, month, guidelines string) string

MonthlyPrompt builds a prompt for synthesizing weekly summaries into a monthly memory.

func WeeklyPrompt

func WeeklyPrompt(dailySummaries []string, weekID, guidelines string) string

WeeklyPrompt builds a prompt for synthesizing daily summaries into a weekly memory.

func WriteGuidelines added in v0.6.0

func WriteGuidelines(sb *strings.Builder, guidelines, stage string)

WriteGuidelines wraps team guidelines in a <team-guidelines> tag and tells the LLM which pipeline stage it's in. The stage identifier lets the guidelines reference optional per-stage override files via progressive disclosure (e.g., "if extracting discussions, read EXTRACT-discussions.md"). The LLM MAY use the Read tool to access files in memory/guidance/.

Types

type Backend

type Backend interface {
	// Name returns the backend identifier (e.g., "claude").
	Name() string
	// Available checks if the CLI exists in PATH.
	Available() bool
	// Run sends a prompt and returns the text output.
	Run(ctx context.Context, prompt string) (string, error)
}

Backend represents an AI agent CLI that can process prompts.

func Detect

func Detect() (Backend, error)

Detect returns the first available backend, or an error if no supported agent CLI is found.

type Claude

type Claude struct {
	// Timeout for the claude process. Zero means no timeout (uses ctx).
	Timeout time.Duration
	// WorkDir sets the working directory for the claude process.
	// When set, relative file paths in prompts resolve from this directory.
	WorkDir string
}

Claude implements Backend using the claude CLI in pipe mode.

func (*Claude) Available

func (c *Claude) Available() bool

func (*Claude) Name

func (c *Claude) Name() string

func (*Claude) Run

func (c *Claude) Run(ctx context.Context, prompt string) (string, error)
