Documentation ¶
Overview ¶
Package googleai implements a langchaingo provider for Google AI LLMs, including caching support for Google AI models. See https://ai.google.dev/ for more details.
Index ¶
- Constants
- Variables
- func IsGemini3Model(modelName string) bool
- func MapError(err error) error
- func WithCachedContent(name string) llms.CallOption
- type CachingHelper
- func (ch *CachingHelper) CreateCachedContent(ctx context.Context, modelName string, messages []llms.MessageContent, ...) (*genai.CachedContent, error)
- func (ch *CachingHelper) DeleteCachedContent(ctx context.Context, name string) error
- func (ch *CachingHelper) GetCachedContent(ctx context.Context, name string) (*genai.CachedContent, error)
- func (ch *CachingHelper) ListCachedContents(ctx context.Context) *genai.CachedContentIterator
- type Gemini3RestClient
- type GoogleAI
- func (g *GoogleAI) Call(ctx context.Context, prompt string, options ...llms.CallOption) (string, error)
- func (g *GoogleAI) Close() error
- func (g *GoogleAI) CreateEmbedding(ctx context.Context, texts []string) ([][]float32, error)
- func (g *GoogleAI) GenerateContent(ctx context.Context, messages []llms.MessageContent, ...) (*llms.ContentResponse, error)
- func (g *GoogleAI) SupportsReasoning() bool
- type HarmBlockThreshold
- type Option
- func WithAPIKey(apiKey string) Option
- func WithCloudLocation(l string) Option
- func WithCloudProject(p string) Option
- func WithCredentialsFile(credentialsFile string) Option
- func WithCredentialsJSON(credentialsJSON []byte) Option
- func WithDefaultCandidateCount(defaultCandidateCount int) Option
- func WithDefaultEmbeddingModel(defaultEmbeddingModel string) Option
- func WithDefaultMaxTokens(maxTokens int) Option
- func WithDefaultModel(defaultModel string) Option
- func WithDefaultTemperature(defaultTemperature float64) Option
- func WithDefaultTopK(defaultTopK int) Option
- func WithDefaultTopP(defaultTopP float64) Option
- func WithGRPCConn(conn *grpc.ClientConn) Option
- func WithHTTPClient(httpClient *http.Client) Option
- func WithHarmThreshold(ht HarmBlockThreshold) Option
- func WithRest() Option
- type Options
Constants ¶
const (
	CITATIONS = "citations"
	SAFETY    = "safety"

	RoleSystem = "system"
	RoleModel  = "model"
	RoleUser   = "user"
	RoleTool   = "tool"

	ResponseMIMETypeJson = "application/json"
)
Variables ¶
Functions ¶
func IsGemini3Model ¶
func IsGemini3Model(modelName string) bool
IsGemini3Model reports whether the model name indicates a Gemini 3 model, which requires thought signatures for function calling.
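A minimal sketch of gating on this check before choosing a transport (the model name is illustrative, not a confirmed model ID):

	if IsGemini3Model("gemini-3-flash") { // illustrative model name
		// Route the request through Gemini3RestClient so thought
		// signatures survive function-calling round trips.
	}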
func MapError ¶
func MapError(err error) error
MapError maps an error returned by the Google AI API to a standardized error value.

func WithCachedContent ¶
func WithCachedContent(name string) llms.CallOption
WithCachedContent enables the use of pre-created cached content. The cached content must be created separately using CachingHelper.CreateCachedContent. This is different from Anthropic's inline cache control.
Types ¶
type CachingHelper ¶
type CachingHelper struct {
// contains filtered or unexported fields
}
CachingHelper provides utilities for working with Google AI's cached content feature. Unlike Anthropic which supports inline cache control, Google AI requires pre-creating cached content through the API.
func NewCachingHelper ¶
func NewCachingHelper(ctx context.Context, opts ...Option) (*CachingHelper, error)
NewCachingHelper creates a helper for managing cached content.
func (*CachingHelper) CreateCachedContent ¶
func (ch *CachingHelper) CreateCachedContent(
	ctx context.Context,
	modelName string,
	messages []llms.MessageContent,
	ttl time.Duration,
) (*genai.CachedContent, error)
CreateCachedContent creates cached content that can be reused across multiple requests. This is useful for caching large system prompts, context documents, or frequently used instructions.
Example usage:

	helper, _ := NewCachingHelper(ctx, WithAPIKey(apiKey))
	cached, _ := helper.CreateCachedContent(ctx, "gemini-2.0-flash", []llms.MessageContent{
		{
			Role: llms.ChatMessageTypeSystem,
			Parts: []llms.ContentPart{
				llms.TextPart("You are an expert assistant with deep knowledge..."),
			},
		},
	}, 1*time.Hour)

	// Use the cached content in subsequent requests.
	model, _ := New(ctx, WithAPIKey(apiKey))
	resp, _ := model.GenerateContent(ctx, messages, WithCachedContent(cached.Name))
func (*CachingHelper) DeleteCachedContent ¶
func (ch *CachingHelper) DeleteCachedContent(ctx context.Context, name string) error
DeleteCachedContent removes cached content.
func (*CachingHelper) GetCachedContent ¶
func (ch *CachingHelper) GetCachedContent(ctx context.Context, name string) (*genai.CachedContent, error)
GetCachedContent retrieves existing cached content by name.
func (*CachingHelper) ListCachedContents ¶
func (ch *CachingHelper) ListCachedContents(ctx context.Context) *genai.CachedContentIterator
ListCachedContents returns an iterator for all cached content.
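A hedged sketch of draining the iterator, assuming it follows the usual google.golang.org/api/iterator convention of Next returning iterator.Done when exhausted (imports: errors, fmt, log, google.golang.org/api/iterator):

	it := helper.ListCachedContents(ctx)
	for {
		cc, err := it.Next()
		if errors.Is(err, iterator.Done) {
			break
		}
		if err != nil {
			log.Fatal(err)
		}
		fmt.Println(cc.Name) // names here can be passed to GetCachedContent or DeleteCachedContent
	}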
type Gemini3RestClient ¶
type Gemini3RestClient struct {
// contains filtered or unexported fields
}
Gemini3RestClient provides REST API access for Gemini 3 models that require thought signatures for function calling.
func NewGemini3RestClient ¶
func NewGemini3RestClient(apiKey string) *Gemini3RestClient
NewGemini3RestClient creates a new REST client for Gemini 3 API.
func (*Gemini3RestClient) GenerateContent ¶
func (c *Gemini3RestClient) GenerateContent(
	ctx context.Context,
	model string,
	messages []llms.MessageContent,
	opts *llms.CallOptions,
) (*llms.ContentResponse, error)
GenerateContent calls the Gemini 3 REST API with thought signature support.
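A minimal sketch pairing this client with IsGemini3Model (the model name is illustrative; error handling shortened):

	model := "gemini-3-flash" // illustrative model name
	if IsGemini3Model(model) {
		client := NewGemini3RestClient(apiKey)
		resp, err := client.GenerateContent(ctx, model, messages, &llms.CallOptions{})
		if err != nil {
			log.Fatal(err)
		}
		fmt.Println(resp.Choices[0].Content)
	}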
type GoogleAI ¶
type GoogleAI struct {
CallbacksHandler callbacks.Handler
// contains filtered or unexported fields
}
GoogleAI is a client for the Google AI API.
func (*GoogleAI) Call ¶
func (g *GoogleAI) Call(ctx context.Context, prompt string, options ...llms.CallOption) (string, error)
Call implements the llms.Model interface.
func (*GoogleAI) Close ¶
func (g *GoogleAI) Close() error
Close closes the underlying genai client. It should be called when the GoogleAI instance is no longer needed, to prevent memory leaks from the underlying gRPC connections.
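The usual lifecycle pairs construction with a deferred Close; a minimal sketch:

	llm, err := New(ctx, WithAPIKey(apiKey))
	if err != nil {
		log.Fatal(err)
	}
	defer llm.Close()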
func (*GoogleAI) CreateEmbedding ¶
func (g *GoogleAI) CreateEmbedding(ctx context.Context, texts []string) ([][]float32, error)
CreateEmbedding creates embeddings from texts.
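A short sketch; the embedding model can be pinned with WithDefaultEmbeddingModel (the model name below is illustrative):

	llm, err := New(ctx, WithAPIKey(apiKey), WithDefaultEmbeddingModel("text-embedding-004"))
	if err != nil {
		log.Fatal(err)
	}
	defer llm.Close()

	vecs, err := llm.CreateEmbedding(ctx, []string{"hello", "world"})
	if err != nil {
		log.Fatal(err)
	}
	// vecs[0] and vecs[1] are the []float32 vectors for the two inputs.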
func (*GoogleAI) GenerateContent ¶
func (g *GoogleAI) GenerateContent(
	ctx context.Context,
	messages []llms.MessageContent,
	options ...llms.CallOption,
) (*llms.ContentResponse, error)
GenerateContent implements the llms.Model interface.
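A minimal sketch using the llms helpers (llms.TextParts builds a single text-part message):

	resp, err := llm.GenerateContent(ctx, []llms.MessageContent{
		llms.TextParts(llms.ChatMessageTypeHuman, "Name three Go proverbs."),
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(resp.Choices[0].Content)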
func (*GoogleAI) SupportsReasoning ¶
func (g *GoogleAI) SupportsReasoning() bool
SupportsReasoning implements the ReasoningModel interface. It returns true if the configured model supports reasoning/thinking tokens.
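A hedged sketch of feature-detecting this behind the generic llms.Model interface, using an anonymous interface so no extra type name is assumed:

	var m llms.Model = llm
	if r, ok := m.(interface{ SupportsReasoning() bool }); ok && r.SupportsReasoning() {
		// Safe to request reasoning/thinking output from this model.
	}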
type HarmBlockThreshold ¶
type HarmBlockThreshold int32
const (
	// HarmBlockUnspecified means threshold is unspecified.
	HarmBlockUnspecified HarmBlockThreshold = 0
	// HarmBlockLowAndAbove means content with NEGLIGIBLE will be allowed.
	HarmBlockLowAndAbove HarmBlockThreshold = 1
	// HarmBlockMediumAndAbove means content with NEGLIGIBLE and LOW will be allowed.
	HarmBlockMediumAndAbove HarmBlockThreshold = 2
	// HarmBlockOnlyHigh means content with NEGLIGIBLE, LOW, and MEDIUM will be allowed.
	HarmBlockOnlyHigh HarmBlockThreshold = 3
	// HarmBlockNone means all content will be allowed.
	HarmBlockNone HarmBlockThreshold = 4
)
type Option ¶
type Option func(*Options)
func WithAPIKey ¶
func WithAPIKey(apiKey string) Option
WithAPIKey passes the API key (token) to the client. This is useful for googleai clients.
func WithCloudLocation ¶
func WithCloudLocation(l string) Option
WithCloudLocation passes the GCP cloud location (region) name to the client. This is useful for vertex clients.
func WithCloudProject ¶
func WithCloudProject(p string) Option
WithCloudProject passes the GCP cloud project name to the client. This is useful for vertex clients.
func WithCredentialsFile ¶
func WithCredentialsFile(credentialsFile string) Option
WithCredentialsFile appends a ClientOption that authenticates API calls with the given service account or refresh token JSON credentials file.
func WithCredentialsJSON ¶
func WithCredentialsJSON(credentialsJSON []byte) Option
WithCredentialsJSON appends a ClientOption that authenticates API calls with the given service account or refresh token JSON credentials.
func WithDefaultCandidateCount ¶
func WithDefaultCandidateCount(defaultCandidateCount int) Option
WithDefaultCandidateCount sets the default candidate count for the model.
func WithDefaultEmbeddingModel ¶
func WithDefaultEmbeddingModel(defaultEmbeddingModel string) Option
WithDefaultEmbeddingModel passes a default embedding model name to the client. This model name is used if not explicitly provided in specific client invocations.
func WithDefaultMaxTokens ¶
func WithDefaultMaxTokens(maxTokens int) Option
WithDefaultMaxTokens sets the default maximum token count for the model.
func WithDefaultModel ¶
func WithDefaultModel(defaultModel string) Option
WithDefaultModel passes a default content model name to the client. This model name is used if not explicitly provided in specific client invocations.
func WithDefaultTemperature ¶
func WithDefaultTemperature(defaultTemperature float64) Option
WithDefaultTemperature sets the default temperature for the model.
func WithDefaultTopK ¶
func WithDefaultTopK(defaultTopK int) Option
WithDefaultTopK sets the default TopK for the model.
func WithDefaultTopP ¶
func WithDefaultTopP(defaultTopP float64) Option
WithDefaultTopP sets the default TopP for the model.
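The With* defaults above compose at construction time; a minimal sketch (the values and model name are illustrative):

	llm, err := New(ctx,
		WithAPIKey(apiKey),
		WithDefaultModel("gemini-2.0-flash"),
		WithDefaultTemperature(0.2),
		WithDefaultTopP(0.95),
		WithDefaultMaxTokens(1024),
	)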
func WithGRPCConn ¶
func WithGRPCConn(conn *grpc.ClientConn) Option
WithGRPCConn appends a ClientOption that uses the provided gRPC client connection to make requests. This is useful for testing embeddings in vertex clients.
func WithHTTPClient ¶
func WithHTTPClient(httpClient *http.Client) Option
WithHTTPClient appends a ClientOption that uses the provided HTTP client to make requests. This is useful for vertex clients.
func WithHarmThreshold ¶
func WithHarmThreshold(ht HarmBlockThreshold) Option
WithHarmThreshold sets the safety/harm setting for the model, potentially limiting any harmful content it may generate.
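For example, to block only high-probability harmful content (a sketch):

	llm, err := New(ctx,
		WithAPIKey(apiKey),
		WithHarmThreshold(HarmBlockOnlyHigh),
	)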
type Options ¶
type Options struct {
CloudProject string
CloudLocation string
DefaultModel string
DefaultEmbeddingModel string
DefaultCandidateCount int
DefaultMaxTokens int
DefaultTemperature float64
DefaultTopK int
DefaultTopP float64
HarmThreshold HarmBlockThreshold
ClientOptions []option.ClientOption
// APIKey stores the API key separately for REST client usage
// (needed for Gemini 3 thought signature support)
APIKey string
}
Options is a set of options for GoogleAI and Vertex clients.
func DefaultOptions ¶
func DefaultOptions() Options
func (*Options) EnsureAuthPresent ¶
func (o *Options) EnsureAuthPresent()
EnsureAuthPresent attempts to ensure that the client has authentication information. If it does not, it will attempt to use the GOOGLE_API_KEY environment variable.
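A sketch of the option-struct flow; New performs the equivalent internally, so calling these directly is rarely needed:

	opts := DefaultOptions()
	opts.EnsureAuthPresent() // tries the GOOGLE_API_KEY environment variable if no auth was configured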
Source Files ¶

Directories ¶
| Path | Synopsis |
|---|---|
| internal | |
| internal/cmd (command) | Code generator for vertex.go from googleai.go nolint |
| palm | Package palm implements a langchaingo provider for Google Vertex AI legacy PaLM models. |
| vertex | Package vertex implements a langchaingo provider for Google Vertex AI LLMs, including the new Gemini models. |