fake

package
v0.36.3 Latest
Warning

This package is not in the latest version of its module.

Published: Mar 21, 2026 License: MIT Imports: 3 Imported by: 0

Documentation

Overview

Package fake provides mock tokenizers for testing context packing.

Use the Tokenizer type to count tokens by splitting on whitespace, or to return a fixed count. It is useful for unit tests that require predictable token behavior.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Tokenizer

type Tokenizer struct {
	// TokensPerWord is the number of tokens per word (default 1).
	TokensPerWord int
	// FixedCount, if set, returns this count instead of counting.
	FixedCount int
	// Err, if set, is returned by CountTokens.
	Err error
}

Tokenizer is a fake tokenizer for testing. It counts tokens by splitting on whitespace.

func NewTokenizer

func NewTokenizer() *Tokenizer

NewTokenizer creates a new fake tokenizer with TokensPerWord set to its default of 1.

func (*Tokenizer) CountTokens

func (f *Tokenizer) CountTokens(_ context.Context, text string) (int, error)

CountTokens returns the token count for the given text. If Err is set, it is returned; if FixedCount is set, that count is returned; otherwise the count is the number of whitespace-separated words times TokensPerWord.
