token

package
v0.260412.1600-preview
Warning

This package is not in the latest version of its module.

Published: Apr 12, 2026 License: MPL-2.0 Imports: 7 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func CountBetaTokensViaTiktoken

func CountBetaTokensViaTiktoken(req *anthropic.BetaMessageCountTokensParams) (int, error)

CountBetaTokensViaTiktoken approximates the token count for OpenAI-style providers using tiktoken.

func CountTokensViaTiktoken

func CountTokensViaTiktoken(req *anthropic.MessageCountTokensParams) (int, error)

CountTokensViaTiktoken approximates the token count for OpenAI-style providers using tiktoken.

func EstimateAnthropicBetaMessagesTokens added in v0.260414.2000

func EstimateAnthropicBetaMessagesTokens(messages []anthropic.BetaMessageParam) int64

EstimateAnthropicBetaMessagesTokens estimates tokens for Anthropic beta messages.

func EstimateAnthropicTokens added in v0.260414.2000

func EstimateAnthropicTokens(messages []anthropic.MessageParam) int64

EstimateAnthropicTokens estimates tokens for Anthropic messages.

func EstimateBetaAnthropicTokens added in v0.260414.2000

func EstimateBetaAnthropicTokens(messages []anthropic.BetaMessageParam) int64

EstimateBetaAnthropicTokens estimates tokens for Anthropic beta messages.

func EstimateInputTokens

func EstimateInputTokens(req *openai.ChatCompletionNewParams) (int, error)

EstimateInputTokens estimates input tokens from an OpenAI request using tiktoken.

func EstimateMessageTokens added in v0.260414.2000

func EstimateMessageTokens(msg openai.ChatCompletionMessageParamUnion) int64

EstimateMessageTokens estimates tokens for a single OpenAI message.

func EstimateMessagesTokens added in v0.260414.2000

func EstimateMessagesTokens(messages []openai.ChatCompletionMessageParamUnion) int64

EstimateMessagesTokens estimates tokens for a slice of OpenAI messages.

func EstimateOutputTokens

func EstimateOutputTokens(content string) int

EstimateOutputTokens estimates output tokens from accumulated content.

func EstimateTokensString added in v0.260414.2000

func EstimateTokensString(s string) int64

EstimateTokensString estimates the token count of the string s.
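This page does not show the estimator's internals; a common fallback when no tokenizer is loaded is the rough heuristic of about four characters per token. The sketch below is that assumed heuristic, not necessarily what this package does:

```go
package main

import "fmt"

// estimateTokensString is a hypothetical sketch of the common
// ~4-characters-per-token heuristic, a typical fallback when no real
// tokenizer is available. The package may use tiktoken instead.
func estimateTokensString(s string) int64 {
	if len(s) == 0 {
		return 0
	}
	// Round up so short non-empty strings count as at least one token.
	return int64((len(s) + 3) / 4)
}

func main() {
	fmt.Println(estimateTokensString("Hello, world!")) // 13 chars -> 4
	fmt.Println(estimateTokensString(""))              // 0
}
```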

func SplitIntoChunks added in v0.260414.2000

func SplitIntoChunks(content string) []string

SplitIntoChunks splits content into word-based chunks for streaming.

func StringPtr added in v0.260414.2000

func StringPtr(s string) *string

StringPtr returns a pointer to s.

Types

type StreamTokenCounter

type StreamTokenCounter struct {
	// contains filtered or unexported fields
}

StreamTokenCounter maintains token count state for streaming responses. It uses an incremental counting strategy, tokenizing each delta immediately.

Usage example:

counter, _ := NewStreamTokenCounter()
counter.SetInputTokens(estimateRequestTokens(req)) // Pre-count input
for chunk := range stream {
    counter.ConsumeOpenAIChunk(chunk)
}
input, output := counter.GetCounts()
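The incremental strategy can be sketched with the tokenizer abstracted away. The struct, the whitespace tokenizer, and the ConsumeDelta method below are simplified stand-ins for the real tiktoken-backed counter, showing only the counting state machine:

```go
package main

import (
	"fmt"
	"strings"
)

// streamTokenCounter is a simplified sketch of incremental counting:
// each streamed delta is tokenized immediately and added to the running
// output total, while the input total is pre-set from the request.
type streamTokenCounter struct {
	inputTokens  int
	outputTokens int
	countText    func(string) int // stand-in for a tiktoken encoder
}

func newStreamTokenCounter() *streamTokenCounter {
	return &streamTokenCounter{
		countText: func(s string) int { return len(strings.Fields(s)) },
	}
}

func (c *streamTokenCounter) SetInputTokens(n int) { c.inputTokens = n }

// ConsumeDelta tokenizes one streamed delta and updates the output count.
func (c *streamTokenCounter) ConsumeDelta(delta string) {
	c.outputTokens += c.countText(delta)
}

func (c *streamTokenCounter) GetCounts() (int, int) {
	return c.inputTokens, c.outputTokens
}

func main() {
	counter := newStreamTokenCounter()
	counter.SetInputTokens(12) // pre-counted from the request
	for _, delta := range []string{"Hello there,", " how are", " you?"} {
		counter.ConsumeDelta(delta)
	}
	in, out := counter.GetCounts()
	fmt.Println(in, out) // 12 5
}
```

Tokenizing each delta as it arrives trades a little CPU per chunk for an always-current count, which is what lets GetCounts be called mid-stream.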

func NewStreamTokenCounter

func NewStreamTokenCounter() (*StreamTokenCounter, error)

NewStreamTokenCounter creates a new streaming token counter.

func NewStreamTokenCounterWithEncoding

func NewStreamTokenCounterWithEncoding(encoding string) (*StreamTokenCounter, error)

NewStreamTokenCounterWithEncoding creates a new streaming token counter with a specific encoding.

func (*StreamTokenCounter) AddInputTokens

func (c *StreamTokenCounter) AddInputTokens(tokens int)

AddInputTokens adds to the input token count.

func (*StreamTokenCounter) AddToOutputTokens

func (c *StreamTokenCounter) AddToOutputTokens(tokens int)

AddToOutputTokens adds to the output token count.

func (*StreamTokenCounter) ConsumeOpenAIChunk

func (c *StreamTokenCounter) ConsumeOpenAIChunk(chunk *openai.ChatCompletionChunk) (inputTokens, outputTokens int, err error)

ConsumeOpenAIChunk processes an OpenAI streaming chunk and updates token counts. Returns the current (inputTokens, outputTokens) after processing this chunk.

The function handles:

  • Content deltas (text responses)
  • Refusal deltas (refusal messages)
  • Tool call deltas (function names and arguments)
  • Usage information (typically in the final chunk when stream_options.include_usage is set)
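The handling above can be sketched with a simplified chunk type. The chunk struct below keeps only the fields the logic needs, a whitespace tokenizer stands in for tiktoken, and letting exact usage from the final chunk override the running estimate is an assumed (though common) treatment, not confirmed by this page:

```go
package main

import (
	"fmt"
	"strings"
)

// chunk is a simplified stand-in for openai.ChatCompletionChunk,
// reduced to the fields this sketch uses.
type chunk struct {
	ContentDelta string
	Usage        *struct{ Prompt, Completion int }
}

// consumeChunk sketches the per-chunk logic: tokenize the content delta
// incrementally, then, if the chunk carries exact usage (typically the
// final chunk when stream_options.include_usage is set), let it replace
// the running estimate.
func consumeChunk(in, out int, ch chunk) (int, int) {
	out += len(strings.Fields(ch.ContentDelta))
	if ch.Usage != nil {
		in, out = ch.Usage.Prompt, ch.Usage.Completion
	}
	return in, out
}

func main() {
	in, out := 0, 0
	stream := []chunk{
		{ContentDelta: "Hi"},
		{ContentDelta: " there"},
		{Usage: &struct{ Prompt, Completion int }{Prompt: 10, Completion: 2}},
	}
	for _, ch := range stream {
		in, out = consumeChunk(in, out, ch)
	}
	fmt.Println(in, out) // 10 2
}
```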

func (*StreamTokenCounter) CountText

func (c *StreamTokenCounter) CountText(text string) int

CountText counts tokens for any text string.

func (*StreamTokenCounter) EstimateInputTokens

func (c *StreamTokenCounter) EstimateInputTokens(req *openai.ChatCompletionNewParams) (int, error)

EstimateInputTokens estimates input tokens from a request using the counter's encoder. This is useful to pre-set the input token count before streaming.

func (*StreamTokenCounter) GetCounts

func (c *StreamTokenCounter) GetCounts() (inputTokens, outputTokens int)

GetCounts returns the current token counts (inputTokens, outputTokens).

func (*StreamTokenCounter) InputTokens

func (c *StreamTokenCounter) InputTokens() int

InputTokens returns the current input token count.

func (*StreamTokenCounter) OutputTokens

func (c *StreamTokenCounter) OutputTokens() int

OutputTokens returns the current output token count.

func (*StreamTokenCounter) Reset

func (c *StreamTokenCounter) Reset()

Reset resets the counter to zero.

func (*StreamTokenCounter) SetInputTokens

func (c *StreamTokenCounter) SetInputTokens(tokens int)

SetInputTokens sets the input token count. Use this when you have the exact count from the request.

func (*StreamTokenCounter) SetOutputTokens

func (c *StreamTokenCounter) SetOutputTokens(tokens int)

SetOutputTokens sets the output token count.

func (*StreamTokenCounter) TotalTokens

func (c *StreamTokenCounter) TotalTokens() int

TotalTokens returns the sum of input and output tokens.
