Documentation ¶
Overview ¶
Package fake provides mock tokenizers for testing context packing.
Use the Tokenizer type to count tokens by splitting on whitespace, or to return a fixed count. It is useful for unit tests that require predictable token behavior.
Index ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Tokenizer ¶
type Tokenizer struct {
// TokensPerWord is the number of tokens counted per whitespace-separated word (default 1).
TokensPerWord int
// FixedCount, if nonzero, is returned by CountTokens instead of counting words.
FixedCount int
// Err, if non-nil, is returned by CountTokens.
Err error
}
Tokenizer is a fake tokenizer for testing. It counts tokens by splitting on whitespace.
func NewTokenizer ¶
func NewTokenizer() *Tokenizer
NewTokenizer creates a new fake tokenizer with sensible defaults (1 token per word).
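The field semantics above can be sketched as follows. This is a hypothetical re-implementation for illustration, not the package's actual source: the CountTokens signature and the precedence of Err over FixedCount over whitespace counting are assumptions inferred from the field comments.

```go
package main

import (
	"fmt"
	"strings"
)

// Tokenizer mirrors the fields documented above.
type Tokenizer struct {
	TokensPerWord int   // tokens counted per whitespace-separated word
	FixedCount    int   // if nonzero, returned instead of counting (assumed semantics)
	Err           error // if non-nil, returned by CountTokens
}

// NewTokenizer returns a tokenizer with the documented default of 1 token per word.
func NewTokenizer() *Tokenizer {
	return &Tokenizer{TokensPerWord: 1}
}

// CountTokens sketches the assumed behavior: Err takes precedence,
// then FixedCount, then whitespace splitting scaled by TokensPerWord.
func (t *Tokenizer) CountTokens(text string) (int, error) {
	if t.Err != nil {
		return 0, t.Err
	}
	if t.FixedCount != 0 {
		return t.FixedCount, nil
	}
	return len(strings.Fields(text)) * t.TokensPerWord, nil
}

func main() {
	tok := NewTokenizer()
	n, _ := tok.CountTokens("pack this context window")
	fmt.Println(n) // 4 words at 1 token per word

	tok.FixedCount = 128
	n, _ = tok.CountTokens("anything at all")
	fmt.Println(n) // fixed count overrides word counting
}
```

In a test, the zero-configuration path gives exact, predictable counts, while setting FixedCount or Err exercises budget-overflow and error-handling branches in the code under test.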