Documentation ¶
Overview ¶
Package gemini implements LLMProvider for Google Gemini models. Importing this package (even with a blank import) registers the "gemini" provider in the global registry.
The provider reads GEMINI_API_KEY from the environment and uses the Google AI Gemini API (not Vertex AI). To target Vertex AI instead, set both the GEMINI_PROJECT and GEMINI_LOCATION environment variables rather than GEMINI_API_KEY.
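The blank-import registration works through a package-level init function that adds the provider to a shared registry. The sketch below illustrates the pattern with a self-contained, hypothetical registry; the real registry's package path, types, and function names are assumptions, not this package's actual API.

```go
package main

import "fmt"

// registry maps provider names to constructors. In the real code this
// would live in the providers package; names here are illustrative.
var registry = map[string]func() string{}

// Register adds a provider constructor under a name. Provider packages
// call this from init, so a blank import is enough to register them.
func Register(name string, ctor func() string) {
	registry[name] = ctor
}

// init runs when the package is imported, even via `import _ "...gemini"`.
func init() {
	Register("gemini", func() string { return "gemini provider" })
}

func main() {
	ctor, ok := registry["gemini"]
	fmt.Println(ok, ctor()) // the provider is available without any explicit call
}
```

A caller that only needs the side effect would write `import _ "path/to/gemini"` and then look the provider up by name.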
Index ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Provider ¶
type Provider struct{}
Provider implements providers.LLMProvider using the Google Gemini API.
func (*Provider) RunTask ¶
func (p *Provider) RunTask(
	ctx context.Context,
	cfg *config.Config,
	task queue.Task,
	tools []mcp.Tool,
	callTool func(context.Context, string, json.RawMessage) (string, error),
	chunkFn func(string),
) (string, queue.TokenUsage, error)
RunTask executes a task through the Gemini agentic tool-use loop: it keeps calling the API until the model stops requesting function calls. Token usage is accumulated across all API calls and returned alongside the result. If chunkFn is non-nil, the final text turn is streamed to it token by token.
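The control flow described above can be sketched as follows. This is a simplified, self-contained model of the loop, not the package's implementation: the response struct, field names, and the way turns are fed in are all assumptions made for illustration.

```go
package main

import "fmt"

// TokenUsage stands in for queue.TokenUsage (field names assumed).
type TokenUsage struct{ Input, Output int }

// apiResponse models one Gemini API turn: either a function-call request
// or a final text answer, each carrying its own token usage.
type apiResponse struct {
	text     string
	funcCall string // non-empty when the model requests a tool
	usage    TokenUsage
}

// runLoop sketches RunTask's shape: keep consuming turns until the model
// stops requesting function calls, accumulating token usage throughout.
// When chunkFn is non-nil, the final text turn is streamed piece by piece.
func runLoop(turns []apiResponse, callTool func(name string) string, chunkFn func(string)) (string, TokenUsage) {
	var total TokenUsage
	var final string
	for _, resp := range turns {
		total.Input += resp.usage.Input
		total.Output += resp.usage.Output
		if resp.funcCall != "" {
			// In the real loop the tool result is appended to the
			// conversation before the next API call.
			_ = callTool(resp.funcCall)
			continue
		}
		final = resp.text
		if chunkFn != nil {
			for _, r := range final {
				chunkFn(string(r)) // stream the final turn
			}
		}
		break
	}
	return final, total
}

func main() {
	turns := []apiResponse{
		{funcCall: "search", usage: TokenUsage{Input: 10, Output: 5}},
		{text: "done", usage: TokenUsage{Input: 20, Output: 8}},
	}
	out, usage := runLoop(turns, func(string) string { return "ok" }, nil)
	fmt.Println(out, usage.Input, usage.Output) // done 30 13
}
```

The key property the sketch preserves is that usage from every call, including tool-request turns, counts toward the returned total.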