Documentation
Overview
Package tokencounter provides token estimation for MCP tool definitions.
Index
Constants
This section is empty.
Variables
This section is empty.
Functions
This section is empty.
Types
type Counter
Counter estimates the number of tokens a tool definition would consume when sent to an LLM. Implementations may use character-based heuristics or real tokenizers.
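This page does not show the interface body, but the CountTokens method documented below implies its shape. A minimal sketch, using a local `Tool` struct as a stand-in for `mcp.Tool` so the snippet is self-contained (the stand-in and the `fixedCounter` implementation are illustrative, not part of the package):

```go
package main

import "fmt"

// Tool is a hypothetical stand-in for mcp.Tool, used only so this
// sketch compiles without the mcp dependency.
type Tool struct {
	Name        string
	Description string
}

// Counter estimates the tokens a tool definition would consume,
// mirroring the interface implied by the documented CountTokens method.
type Counter interface {
	CountTokens(tool Tool) int
}

// fixedCounter is a trivial illustrative Counter that always returns
// the same estimate, showing how an implementation satisfies the interface.
type fixedCounter struct{ n int }

func (c fixedCounter) CountTokens(tool Tool) int { return c.n }

func main() {
	var c Counter = fixedCounter{n: 42}
	fmt.Println(c.CountTokens(Tool{Name: "search"})) // prints 42
}
```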
func NewJSONByteCounter
func NewJSONByteCounter() Counter
NewJSONByteCounter returns a JSONByteDivisionCounter with a divisor of 4, which is a reasonable approximation for most LLM tokenizers.
type JSONByteDivisionCounter
type JSONByteDivisionCounter struct {
	Divisor int
}
JSONByteDivisionCounter estimates token count by serialising the full mcp.Tool to JSON and dividing the byte length by a configurable divisor.
func (JSONByteDivisionCounter) CountTokens
func (c JSONByteDivisionCounter) CountTokens(tool mcp.Tool) int
CountTokens returns len(json(tool)) / divisor. Returns 0 if the divisor is zero or serialisation fails.
type TokenMetrics
type TokenMetrics struct {
	// BaselineTokens is the estimated tokens if all tools were sent.
	BaselineTokens int `json:"baseline_tokens"`
	// ReturnedTokens is the actual tokens for the returned tools.
	ReturnedTokens int `json:"returned_tokens"`
	// SavingsPercent is the percentage of tokens saved.
	SavingsPercent float64 `json:"savings_percent"`
}
TokenMetrics reports how many tokens the returned subset of tools consumes compared with sending every tool, and the resulting percentage savings.
func ComputeTokenMetrics
func ComputeTokenMetrics(baselineTokens int, tokenCounts map[string]int, matchedToolNames []string) TokenMetrics
ComputeTokenMetrics calculates token savings by comparing the precomputed baseline (all tools) against only the matched tool names.
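The computation described above can be sketched as follows. This is a plausible implementation under stated assumptions, not the package's actual code: the zero-baseline guard and the treatment of unmatched names (contributing 0) are assumptions, and the lowercase `computeTokenMetrics` name marks the sketch as local:

```go
package main

import "fmt"

// TokenMetrics matches the struct documented above.
type TokenMetrics struct {
	BaselineTokens int     `json:"baseline_tokens"`
	ReturnedTokens int     `json:"returned_tokens"`
	SavingsPercent float64 `json:"savings_percent"`
}

// computeTokenMetrics sums the per-tool token counts for the matched
// names and derives the savings relative to the precomputed baseline.
func computeTokenMetrics(baselineTokens int, tokenCounts map[string]int, matchedToolNames []string) TokenMetrics {
	returned := 0
	for _, name := range matchedToolNames {
		returned += tokenCounts[name] // names absent from the map contribute 0
	}
	m := TokenMetrics{BaselineTokens: baselineTokens, ReturnedTokens: returned}
	if baselineTokens > 0 { // assumed guard against division by zero
		m.SavingsPercent = 100 * float64(baselineTokens-returned) / float64(baselineTokens)
	}
	return m
}

func main() {
	counts := map[string]int{"search": 120, "fetch": 80, "delete": 200}
	m := computeTokenMetrics(400, counts, []string{"search", "fetch"})
	fmt.Printf("%+v\n", m) // 200 of 400 tokens returned → 50% savings
}
```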