Documentation ¶
Overview ¶
Package googleai implements a langchaingo provider for Google AI LLMs, including caching support for Google AI models. See https://ai.google.dev/ for more details.
Index ¶
- Constants
- Variables
- func MapError(err error) error
- func WithCachedContent(name string) llms.CallOption
- type CachingHelper
- func (ch *CachingHelper) CreateCachedContent(ctx context.Context, modelName string, messages []llms.MessageContent, ...) (*genai.CachedContent, error)
- func (ch *CachingHelper) DeleteCachedContent(ctx context.Context, name string) error
- func (ch *CachingHelper) GetCachedContent(ctx context.Context, name string) (*genai.CachedContent, error)
- func (ch *CachingHelper) ListCachedContents(ctx context.Context) *genai.CachedContentIterator
- type GoogleAI
- func (g *GoogleAI) Call(ctx context.Context, prompt string, options ...llms.CallOption) (string, error)
- func (g *GoogleAI) Close() error
- func (g *GoogleAI) CreateEmbedding(ctx context.Context, texts []string) ([][]float32, error)
- func (g *GoogleAI) GenerateContent(ctx context.Context, messages []llms.MessageContent, ...) (*llms.ContentResponse, error)
- func (g *GoogleAI) SupportsReasoning() bool
- type HarmBlockThreshold
- type Option
- func WithAPIKey(apiKey string) Option
- func WithCloudLocation(l string) Option
- func WithCloudProject(p string) Option
- func WithCredentialsFile(credentialsFile string) Option
- func WithCredentialsJSON(credentialsJSON []byte) Option
- func WithDefaultCandidateCount(defaultCandidateCount int) Option
- func WithDefaultEmbeddingModel(defaultEmbeddingModel string) Option
- func WithDefaultMaxTokens(maxTokens int) Option
- func WithDefaultModel(defaultModel string) Option
- func WithDefaultTemperature(defaultTemperature float64) Option
- func WithDefaultTopK(defaultTopK int) Option
- func WithDefaultTopP(defaultTopP float64) Option
- func WithGRPCConn(conn *grpc.ClientConn) Option
- func WithHTTPClient(httpClient *http.Client) Option
- func WithHarmThreshold(ht HarmBlockThreshold) Option
- func WithRest() Option
- type Options
Constants ¶
const (
	CITATIONS = "citations"
	SAFETY    = "safety"

	RoleSystem = "system"
	RoleModel  = "model"
	RoleUser   = "user"
	RoleTool   = "tool"

	ResponseMIMETypeJson = "application/json"
)
Variables ¶
Functions ¶
func MapError ¶ added in v0.1.14
MapError maps Google AI-specific errors to standardized error codes.
func WithCachedContent ¶ added in v0.1.14
func WithCachedContent(name string) llms.CallOption
WithCachedContent enables the use of pre-created cached content. The cached content must be created separately using Client.CreateCachedContent. This is different from Anthropic's inline cache control.
Types ¶
type CachingHelper ¶ added in v0.1.14
type CachingHelper struct {
// contains filtered or unexported fields
}
CachingHelper provides utilities for working with Google AI's cached content feature. Unlike Anthropic which supports inline cache control, Google AI requires pre-creating cached content through the API.
func NewCachingHelper ¶ added in v0.1.14
func NewCachingHelper(ctx context.Context, opts ...Option) (*CachingHelper, error)
NewCachingHelper creates a helper for managing cached content.
func (*CachingHelper) CreateCachedContent ¶ added in v0.1.14
func (ch *CachingHelper) CreateCachedContent(
	ctx context.Context,
	modelName string,
	messages []llms.MessageContent,
	ttl time.Duration,
) (*genai.CachedContent, error)
CreateCachedContent creates cached content that can be reused across multiple requests. This is useful for caching large system prompts, context documents, or frequently used instructions.
Example usage:
helper, _ := NewCachingHelper(ctx, WithAPIKey(apiKey))
cached, _ := helper.CreateCachedContent(ctx, "gemini-2.0-flash", []llms.MessageContent{
{
Role: llms.ChatMessageTypeSystem,
Parts: []llms.ContentPart{
llms.TextPart("You are an expert assistant with deep knowledge..."),
},
},
}, 1*time.Hour)
// Use the cached content in requests
model, _ := New(ctx, WithAPIKey(apiKey))
resp, _ := model.GenerateContent(ctx, messages, WithCachedContent(cached.Name))
func (*CachingHelper) DeleteCachedContent ¶ added in v0.1.14
func (ch *CachingHelper) DeleteCachedContent(ctx context.Context, name string) error
DeleteCachedContent removes cached content.
func (*CachingHelper) GetCachedContent ¶ added in v0.1.14
func (ch *CachingHelper) GetCachedContent(ctx context.Context, name string) (*genai.CachedContent, error)
GetCachedContent retrieves existing cached content by name.
func (*CachingHelper) ListCachedContents ¶ added in v0.1.14
func (ch *CachingHelper) ListCachedContents(ctx context.Context) *genai.CachedContentIterator
ListCachedContents returns an iterator for all cached content.
type GoogleAI ¶
type GoogleAI struct {
CallbacksHandler callbacks.Handler
// contains filtered or unexported fields
}
GoogleAI is a type that represents a Google AI API client.
func (*GoogleAI) Call ¶
func (g *GoogleAI) Call(ctx context.Context, prompt string, options ...llms.CallOption) (string, error)
Call implements the llms.Model interface.
func (*GoogleAI) Close ¶ added in v0.1.14
func (g *GoogleAI) Close() error
Close closes the underlying genai client. This should be called when the GoogleAI instance is no longer needed, to prevent memory leaks from the underlying gRPC connections.
func (*GoogleAI) CreateEmbedding ¶
func (g *GoogleAI) CreateEmbedding(ctx context.Context, texts []string) ([][]float32, error)
CreateEmbedding creates embeddings from texts.
func (*GoogleAI) GenerateContent ¶
func (g *GoogleAI) GenerateContent(
	ctx context.Context,
	messages []llms.MessageContent,
	options ...llms.CallOption,
) (*llms.ContentResponse, error)
GenerateContent implements the llms.Model interface.
func (*GoogleAI) SupportsReasoning ¶ added in v0.1.14
func (g *GoogleAI) SupportsReasoning() bool
SupportsReasoning implements the ReasoningModel interface. It returns true if the current model supports reasoning/thinking tokens.
type HarmBlockThreshold ¶ added in v0.1.9
type HarmBlockThreshold int32
const (
	// HarmBlockUnspecified means threshold is unspecified.
	HarmBlockUnspecified HarmBlockThreshold = 0
	// HarmBlockLowAndAbove means content with NEGLIGIBLE will be allowed.
	HarmBlockLowAndAbove HarmBlockThreshold = 1
	// HarmBlockMediumAndAbove means content with NEGLIGIBLE and LOW will be allowed.
	HarmBlockMediumAndAbove HarmBlockThreshold = 2
	// HarmBlockOnlyHigh means content with NEGLIGIBLE, LOW, and MEDIUM will be allowed.
	HarmBlockOnlyHigh HarmBlockThreshold = 3
	// HarmBlockNone means all content will be allowed.
	HarmBlockNone HarmBlockThreshold = 4
)
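The relationship between thresholds and allowed content can be restated as data. This sketch simply encodes the constants' doc comments; the `allowedSeverities` helper is illustrative and not part of the package, and the expansion of "all content" under HarmBlockNone into explicit levels is an assumption.

```go
package main

import "fmt"

type HarmBlockThreshold int32

const (
	HarmBlockUnspecified    HarmBlockThreshold = 0
	HarmBlockLowAndAbove    HarmBlockThreshold = 1
	HarmBlockMediumAndAbove HarmBlockThreshold = 2
	HarmBlockOnlyHigh       HarmBlockThreshold = 3
	HarmBlockNone           HarmBlockThreshold = 4
)

// allowedSeverities returns, for each threshold, the harm-probability
// levels whose content is still allowed, per the constants' comments.
func allowedSeverities(t HarmBlockThreshold) []string {
	switch t {
	case HarmBlockLowAndAbove:
		return []string{"NEGLIGIBLE"}
	case HarmBlockMediumAndAbove:
		return []string{"NEGLIGIBLE", "LOW"}
	case HarmBlockOnlyHigh:
		return []string{"NEGLIGIBLE", "LOW", "MEDIUM"}
	case HarmBlockNone:
		return []string{"NEGLIGIBLE", "LOW", "MEDIUM", "HIGH"}
	default:
		return nil // unspecified: behavior is left to the service
	}
}

func main() {
	fmt.Println(allowedSeverities(HarmBlockOnlyHigh))
}
```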
type Option ¶
type Option func(*Options)
func WithAPIKey ¶
WithAPIKey passes the API KEY (token) to the client. This is useful for googleai clients.
func WithCloudLocation ¶ added in v0.1.9
WithCloudLocation passes the GCP cloud location (region) name to the client. This is useful for vertex clients.
func WithCloudProject ¶ added in v0.1.9
WithCloudProject passes the GCP cloud project name to the client. This is useful for vertex clients.
func WithCredentialsFile ¶ added in v0.1.11
WithCredentialsFile appends a ClientOption that authenticates API calls with the given service account or refresh token JSON credentials file.
func WithCredentialsJSON ¶ added in v0.1.11
WithCredentialsJSON appends a ClientOption that authenticates API calls with the given service account or refresh token JSON credentials.
func WithDefaultCandidateCount ¶ added in v0.1.10
WithDefaultCandidateCount sets the candidate count for the model.
func WithDefaultEmbeddingModel ¶
WithDefaultEmbeddingModel passes a default embedding model name to the client. This model name is used if one is not explicitly provided in specific client invocations.
func WithDefaultMaxTokens ¶ added in v0.1.10
WithDefaultMaxTokens sets the maximum token count for the model.
func WithDefaultModel ¶
WithDefaultModel passes a default content model name to the client. This model name is used if not explicitly provided in specific client invocations.
func WithDefaultTemperature ¶ added in v0.1.10
WithDefaultTemperature sets the default temperature for the model.
func WithDefaultTopK ¶ added in v0.1.10
WithDefaultTopK sets the TopK for the model.
func WithDefaultTopP ¶ added in v0.1.10
WithDefaultTopP sets the TopP for the model.
func WithGRPCConn ¶ added in v0.1.14
func WithGRPCConn(conn *grpc.ClientConn) Option
WithGRPCConn appends a ClientOption that uses the provided gRPC client connection to make requests. This is useful for testing embeddings in vertex clients.
func WithHTTPClient ¶ added in v0.1.11
WithHTTPClient appends a ClientOption that uses the provided HTTP client to make requests. This is useful for vertex clients.
func WithHarmThreshold ¶ added in v0.1.9
func WithHarmThreshold(ht HarmBlockThreshold) Option
WithHarmThreshold sets the safety/harm setting for the model, potentially limiting any harmful content it may generate.
type Options ¶ added in v0.1.9
type Options struct {
CloudProject string
CloudLocation string
DefaultModel string
DefaultEmbeddingModel string
DefaultCandidateCount int
DefaultMaxTokens int
DefaultTemperature float64
DefaultTopK int
DefaultTopP float64
HarmThreshold HarmBlockThreshold
ClientOptions []option.ClientOption
}
Options is a set of options for GoogleAI and Vertex clients.
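The Option functions above follow Go's functional-options pattern: each With... call returns a closure that mutates an Options value. The sketch below illustrates the pattern with two fields from the real struct; the `apply` helper and the default values are illustrative assumptions (the package's own defaults come from DefaultOptions).

```go
package main

import "fmt"

// Options mirrors two fields of the package's Options struct.
type Options struct {
	DefaultModel       string
	DefaultTemperature float64
}

type Option func(*Options)

func WithDefaultModel(m string) Option {
	return func(o *Options) { o.DefaultModel = m }
}

func WithDefaultTemperature(t float64) Option {
	return func(o *Options) { o.DefaultTemperature = t }
}

// apply starts from assumed defaults and applies each Option in order,
// so later options override earlier ones and the defaults.
func apply(opts ...Option) Options {
	o := Options{DefaultModel: "gemini-2.0-flash", DefaultTemperature: 0.7}
	for _, opt := range opts {
		opt(&o)
	}
	return o
}

func main() {
	o := apply(WithDefaultTemperature(0.2))
	fmt.Println(o.DefaultModel, o.DefaultTemperature) // gemini-2.0-flash 0.2
}
```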
func DefaultOptions ¶ added in v0.1.9
func DefaultOptions() Options
func (*Options) EnsureAuthPresent ¶ added in v0.1.12
func (o *Options) EnsureAuthPresent()
EnsureAuthPresent attempts to ensure that the client has authentication information. If it does not, it will attempt to use the GOOGLE_API_KEY environment variable.
Directories ¶
| Path | Synopsis |
|---|---|
| internal | |
| internal/cmd | Code generator for vertex.go from googleai.go |
| palm | Package palm implements a langchaingo provider for Google Vertex AI legacy PaLM models. |
| vertex | Package vertex implements a langchaingo provider for Google Vertex AI LLMs, including the new Gemini models. |