google

package
v0.1.0
Published: Feb 15, 2026 License: Apache-2.0 Imports: 16 Imported by: 0

Documentation

Overview

Package google implements an API client for the Google Gemini REST API.

See: https://ai.google.dev/gemini-api/docs

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func GenerateRequest

func GenerateRequest(model string, session *schema.Conversation, opts ...opt.Opt) (any, error)

GenerateRequest builds a generate request from options without sending it. Useful for testing and debugging.
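
For illustration, a minimal sketch of building a request and dumping it for inspection. The snippet assumes this package plus the standard encoding/json, fmt, and log imports; the model name and option values are examples, and passing a nil session for a stateless request is an assumption.

// Build the request without sending it, then dump it for debugging.
req, err := google.GenerateRequest(
	"gemini-2.5-flash", // example model name
	nil,                // assumption: nil session for a stateless request
	google.WithSystemPrompt("You are a terse assistant."),
	google.WithTemperature(0.2),
	google.WithMaxTokens(256),
)
if err != nil {
	log.Fatal(err)
}
body, _ := json.MarshalIndent(req, "", "  ")
fmt.Println(string(body)) // the JSON that would be sent to the API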

func WithFrequencyPenalty

func WithFrequencyPenalty(value float64) opt.Opt

WithFrequencyPenalty sets the frequency penalty (-2.0 to 2.0). Positive values penalise tokens proportionally to how often they have appeared so far, reducing repetition.

See: https://ai.google.dev/gemini-api/docs/text-generation#configuration-parameters

func WithJSONOutput

func WithJSONOutput(schema *jsonschema.Schema) opt.Opt

WithJSONOutput constrains the model to produce JSON conforming to the given schema. Sets responseMimeType to "application/json" and responseJsonSchema on the request.

See: https://ai.google.dev/gemini-api/docs/json-mode
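
As a sketch, options for schema-constrained output might be bundled like this. Which package provides *jsonschema.Schema depends on this module's imports, and jsonOpts is a hypothetical helper, not part of this package.

// jsonOpts returns options that force JSON output conforming to the given
// schema. The schema must be built with whichever jsonschema package this
// module imports (construction not shown here).
func jsonOpts(personSchema *jsonschema.Schema) []opt.Opt {
	return []opt.Opt{
		google.WithJSONOutput(personSchema), // responseMimeType: application/json
		google.WithSystemPrompt("Reply only with JSON that matches the schema."),
	}
}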

func WithMaxTokens

func WithMaxTokens(value uint) opt.Opt

WithMaxTokens sets the maximum number of tokens to generate (minimum 1).

See: https://ai.google.dev/gemini-api/docs/text-generation#configuration-parameters

func WithOutputDimensionality

func WithOutputDimensionality(d uint) opt.Opt

WithOutputDimensionality sets the output dimensionality for the embedding. The default is 3072. Recommended values are 768, 1536, or 3072. Smaller dimensions save storage space with minimal quality loss.

See: https://ai.google.dev/gemini-api/docs/embeddings#controlling-embedding-size

func WithPresencePenalty

func WithPresencePenalty(value float64) opt.Opt

WithPresencePenalty sets the presence penalty (-2.0 to 2.0). Positive values penalise tokens that have already appeared, encouraging the model to talk about new topics.

See: https://ai.google.dev/gemini-api/docs/text-generation#configuration-parameters

func WithSeed

func WithSeed(value int) opt.Opt

WithSeed sets the seed for deterministic generation. Using the same seed with the same inputs should produce repeatable results.

See: https://ai.google.dev/gemini-api/docs/text-generation#configuration-parameters

func WithStopSequences

func WithStopSequences(values ...string) opt.Opt

WithStopSequences sets custom stop sequences for the request. Generation stops when any of the specified sequences is encountered.

See: https://ai.google.dev/gemini-api/docs/text-generation#configuration-parameters

func WithSystemPrompt

func WithSystemPrompt(value string) opt.Opt

WithSystemPrompt sets the system instruction for the request.

See: https://ai.google.dev/gemini-api/docs/system-instructions

func WithTaskType

func WithTaskType(taskType string) opt.Opt

WithTaskType sets the task type for the embedding request. Supported values: SEMANTIC_SIMILARITY, CLASSIFICATION, CLUSTERING, RETRIEVAL_DOCUMENT, RETRIEVAL_QUERY, CODE_RETRIEVAL_QUERY, QUESTION_ANSWERING, FACT_VERIFICATION.

See: https://ai.google.dev/gemini-api/docs/embeddings#specify-task-type

func WithTemperature

func WithTemperature(value float64) opt.Opt

WithTemperature sets the temperature for the request (0.0 to 2.0). Higher values produce more random output; lower values produce more deterministic output.

See: https://ai.google.dev/gemini-api/docs/text-generation#configuration-parameters

func WithThinking

func WithThinking() opt.Opt

WithThinking enables the model's extended thinking/reasoning capability. When enabled, the model may include internal reasoning in its response. Only supported by models that have thinking capabilities (e.g. gemini-2.5-flash).

See: https://ai.google.dev/gemini-api/docs/thinking

func WithThinkingBudget

func WithThinkingBudget(budgetTokens uint) opt.Opt

WithThinkingBudget enables extended thinking with a specific token budget.

See: https://ai.google.dev/gemini-api/docs/thinking
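
A sketch of wiring these two options together. thinkingOpts is a hypothetical helper; treating a zero budget as "use the model default" is this sketch's convention, not the API's.

// thinkingOpts enables extended reasoning, optionally capping the number of
// tokens the model may spend thinking.
func thinkingOpts(budget uint) []opt.Opt {
	if budget == 0 {
		// Convention of this sketch only: zero means "let the model choose".
		return []opt.Opt{google.WithThinking()}
	}
	return []opt.Opt{google.WithThinkingBudget(budget)}
}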

func WithTitle

func WithTitle(title string) opt.Opt

WithTitle sets the title for the embedding request. Only applicable when TaskType is RETRIEVAL_DOCUMENT.

See: https://ai.google.dev/gemini-api/docs/embeddings#specify-task-type

func WithTopK

func WithTopK(value uint) opt.Opt

WithTopK sets the top-K sampling parameter (minimum 1). Limits token selection to the K most probable tokens.

See: https://ai.google.dev/gemini-api/docs/text-generation#configuration-parameters

func WithTopP

func WithTopP(value float64) opt.Opt

WithTopP sets the nucleus sampling parameter (0.0 to 1.0). Tokens are selected from the smallest set whose cumulative probability exceeds top_p.

See: https://ai.google.dev/gemini-api/docs/text-generation#configuration-parameters
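
The sampling and length options above compose freely. A sketch of a typical bundle follows; all values are illustrative, not recommendations, and generationOpts is a hypothetical helper.

// generationOpts returns a typical text-generation configuration. Pass the
// result to GenerateRequest or to the Client methods documented below.
func generationOpts() []opt.Opt {
	return []opt.Opt{
		google.WithSystemPrompt("Answer in one short paragraph."),
		google.WithTemperature(0.7),
		google.WithTopP(0.95),
		google.WithTopK(40),
		google.WithMaxTokens(512),
		google.WithStopSequences("END"),
		google.WithSeed(42),              // repeatable output for identical inputs
		google.WithFrequencyPenalty(0.3), // discourage verbatim repetition
	}
}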

Types

type Client

type Client struct {
	*client.Client
	*modelcache.ModelCache
}

func New

func New(apiKey string, opts ...client.ClientOpt) (*Client, error)

New creates a new Google Gemini API client with the given API key.
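
A minimal sketch of constructing a client. The import path is a placeholder for this module's real path, and reading the key from the GEMINI_API_KEY environment variable is only a convention.

package main

import (
	"fmt"
	"log"
	"os"

	// Placeholder path; substitute this module's actual google package path.
	"example.com/llm/pkg/google"
)

func main() {
	// Additional client.ClientOpt values may be passed after the API key.
	c, err := google.New(os.Getenv("GEMINI_API_KEY"))
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("provider:", c.Name())
}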

func (*Client) BatchEmbedding

func (c *Client) BatchEmbedding(ctx context.Context, model schema.Model, texts []string, opts ...opt.Opt) ([][]float64, error)

BatchEmbedding generates embedding vectors for multiple texts using the specified model.
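
A sketch of indexing several documents at once, assuming a client c created as in the New example, the standard context, fmt, and log imports, and an illustrative embedding model name.

ctx := context.Background()

// Fetch the embedding model first; the name is illustrative.
model, err := c.GetModel(ctx, "gemini-embedding-001")
if err != nil {
	log.Fatal(err)
}

docs := []string{
	"The first document to index.",
	"The second document to index.",
}
vectors, err := c.BatchEmbedding(ctx, *model, docs,
	google.WithTaskType("RETRIEVAL_DOCUMENT"),
	google.WithTitle("Knowledge base"), // titles apply only to RETRIEVAL_DOCUMENT
)
if err != nil {
	log.Fatal(err)
}
fmt.Println(len(vectors), "vectors,", len(vectors[0]), "dimensions each")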

func (*Client) Embedding

func (c *Client) Embedding(ctx context.Context, model schema.Model, text string, opts ...opt.Opt) ([]float64, error)

Embedding generates an embedding vector for a single text using the specified model.
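
A sketch of embedding a single query with smaller output vectors, reusing c, ctx, and model from the BatchEmbedding example above.

vector, err := c.Embedding(ctx, *model, "how do I rotate an API key?",
	google.WithTaskType("RETRIEVAL_QUERY"), // queries rather than documents
	google.WithOutputDimensionality(768),   // smaller vectors, less storage
)
if err != nil {
	log.Fatal(err)
}
fmt.Println("dimensions:", len(vector)) // 768 instead of the default 3072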

func (*Client) GetModel

func (c *Client) GetModel(ctx context.Context, name string, opts ...opt.Opt) (*schema.Model, error)

GetModel returns a specific model by name.

func (*Client) ListModels

func (c *Client) ListModels(ctx context.Context, opts ...opt.Opt) ([]schema.Model, error)

ListModels returns all available models from the Gemini API.
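
A sketch of model discovery, assuming a client c created as in the New example; the model name passed to GetModel is illustrative.

ctx := context.Background()

models, err := c.ListModels(ctx)
if err != nil {
	log.Fatal(err)
}
fmt.Println("available models:", len(models))

model, err := c.GetModel(ctx, "gemini-2.5-flash")
if err != nil {
	log.Fatal(err)
}
fmt.Printf("%+v\n", *model)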

func (*Client) Name

func (*Client) Name() string

Name returns the provider name.

func (*Client) WithSession

func (c *Client) WithSession(ctx context.Context, model schema.Model, session *schema.Conversation, message *schema.Message, opts ...opt.Opt) (*schema.Message, *schema.Usage, error)

WithSession sends a message within a session and returns the response (stateful).
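
A sketch of a stateful exchange, reusing c, ctx, and model from the GetModel example above. The schema package's constructors for Conversation and Message are not documented here, so newConversation and newUserMessage below are hypothetical stand-ins.

// Hypothetical helpers standing in for the schema package's constructors.
session := newConversation()                                   // *schema.Conversation
question := newUserMessage("And how large is its population?") // *schema.Message

reply, usage, err := c.WithSession(ctx, *model, session, question,
	google.WithMaxTokens(256),
)
if err != nil {
	log.Fatal(err)
}
fmt.Printf("reply: %+v\nusage: %+v\n", reply, usage)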

func (*Client) WithoutSession

func (c *Client) WithoutSession(ctx context.Context, model schema.Model, message *schema.Message, opts ...opt.Opt) (*schema.Message, *schema.Usage, error)

WithoutSession sends a single message and returns the response (stateless).
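
A sketch of a one-shot request, again reusing c, ctx, and model from the examples above, with a hypothetical newUserMessage helper in place of the schema package's real message constructor.

msg := newUserMessage("What is the capital of France?") // hypothetical helper

reply, usage, err := c.WithoutSession(ctx, *model, msg,
	google.WithSystemPrompt("Answer with a single word."),
	google.WithMaxTokens(16),
)
if err != nil {
	log.Fatal(err)
}
fmt.Printf("reply: %+v\nusage: %+v\n", reply, usage)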
