Overview
A very, very basic chat client for `docker agent serve chat`.
PR #2510 (`feat: add docker agent serve chat command`) exposes any docker-agent agent through an OpenAI-compatible HTTP server. The whole point of that feature is that any tool already speaking OpenAI's /v1/chat/completions protocol can drive a docker-agent agent with no custom integration. This example demonstrates exactly that: it uses the official github.com/openai/openai-go SDK, simply pointed at the local chat server, to run an interactive REPL against an agent.
Prerequisites:
# Start an agent in chat mode (in another terminal):
./bin/docker-agent serve chat ./examples/42.yaml
# It listens on http://127.0.0.1:8083 by default.
Then run this client:
go run ./examples/chat

# or, to pin a specific agent in a multi-agent team:
go run ./examples/chat -model root

# or, to point at a different server:
go run ./examples/chat -base http://127.0.0.1:9090/v1
Type a message and press <Enter>. Type "exit" (or send EOF with ^D) to quit.