Documentation
Overview
Package lexer implements the lexical analyzer for the Monke programming language.
The lexer is responsible for breaking down the source code into tokens, which are the smallest units of meaning in the language. It reads the input character by character and produces a stream of tokens that can be processed by the parser.
Key features:
- Tokenization of all language elements (keywords, identifiers, literals, operators, etc.)
- Handling of whitespace and comments
- Error detection for illegal characters
- Support for various token types defined in the token package
- Optimized for performance with minimal allocations
The main entry points are the New function, which creates a Lexer instance, and the NextToken method, which returns the next token from the input.
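The New/NextToken loop described above can be sketched as follows. This is a minimal, self-contained illustration, not the package's actual implementation: the token type names (IDENT, ASSIGN, INT, EOF, ILLEGAL) and the exact tokenization rules are assumptions standing in for whatever the real token package defines.

```go
package main

import "fmt"

// TokenType and Token stand in for the real token package's
// definitions; the names here are assumptions for illustration.
type TokenType string

const (
	IDENT   TokenType = "IDENT"
	ASSIGN  TokenType = "ASSIGN"
	INT     TokenType = "INT"
	EOF     TokenType = "EOF"
	ILLEGAL TokenType = "ILLEGAL" // unrecognized character
)

type Token struct {
	Type    TokenType
	Literal string
}

// Lexer reads the input one character at a time.
type Lexer struct {
	input string
	pos   int // index of the next unread character
}

// New creates a Lexer for the given source string.
func New(input string) *Lexer {
	return &Lexer{input: input}
}

// NextToken skips whitespace, then scans and returns the next token.
func (l *Lexer) NextToken() Token {
	for l.pos < len(l.input) && (l.input[l.pos] == ' ' || l.input[l.pos] == '\t' || l.input[l.pos] == '\n') {
		l.pos++
	}
	if l.pos >= len(l.input) {
		return Token{Type: EOF}
	}
	ch := l.input[l.pos]
	switch {
	case ch == '=':
		l.pos++
		return Token{Type: ASSIGN, Literal: "="}
	case '0' <= ch && ch <= '9':
		start := l.pos
		for l.pos < len(l.input) && '0' <= l.input[l.pos] && l.input[l.pos] <= '9' {
			l.pos++
		}
		return Token{Type: INT, Literal: l.input[start:l.pos]}
	case isLetter(ch):
		start := l.pos
		for l.pos < len(l.input) && isLetter(l.input[l.pos]) {
			l.pos++
		}
		return Token{Type: IDENT, Literal: l.input[start:l.pos]}
	default:
		// Error detection for illegal characters.
		l.pos++
		return Token{Type: ILLEGAL, Literal: string(ch)}
	}
}

func isLetter(ch byte) bool {
	return 'a' <= ch && ch <= 'z' || 'A' <= ch && ch <= 'Z' || ch == '_'
}

func main() {
	// Consume the token stream until EOF, as a parser would.
	l := New("x = 42")
	for tok := l.NextToken(); tok.Type != EOF; tok = l.NextToken() {
		fmt.Printf("%s %q\n", tok.Type, tok.Literal)
	}
}
```

Running this prints one token per line: an IDENT for "x", an ASSIGN for "=", and an INT for "42". A real lexer would additionally distinguish keywords from identifiers and handle multi-character operators, but the shape of the loop is the same.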
Index
Constants
This section is empty.
Variables
This section is empty.
Functions
This section is empty.
Types