Documentation ¶
Index ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func AssertEqual ¶
AssertEqual asserts that the actual Token is a scanTok and that it equals the expected Token, which must also be a scanTok.
func AssertSliceEqual ¶
AssertSliceEqual asserts that the actual Token slice contains only scanTok instances and that each equals the corresponding one from the expected Token slice.
Types ¶
type Kind ¶
type Kind int
Kind represents the type of a token.
const (
	TK_UNDEFINED Kind = iota
	TK_NEWLINE  // '\n' or '\r\n'
	TK_SPACE    // All whitespace characters except newlines
	TK_SHEBANG  // Always the first line in a file
	TK_ID       // Identifier
	TK_ASSIGN   // <-, :=
	TK_BOOL     // true, false
	TK_NUMBER   // 123.456
	TK_STRING   // `...`
	TK_ADD      // +
	TK_SUBTRACT // -
	TK_MULTIPLY // *
	TK_DIVIDE   // /
	TK_MODULO   // %
	TK_VOID     // _
	TK_DELIM    // ,
)
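The enumeration above can be exercised with a small standalone sketch. The `Kind` type and constants are reproduced here so the snippet compiles on its own, and `kindOf` is a hypothetical helper (not part of this package) that maps a lexeme to a Kind following the comments on each constant:

```go
package main

import "fmt"

// Kind mirrors the documented enumeration; the real package defines these.
type Kind int

const (
	TK_UNDEFINED Kind = iota
	TK_NEWLINE  // '\n' or '\r\n'
	TK_SPACE    // All whitespace characters except newlines
	TK_SHEBANG  // Always the first line in a file
	TK_ID       // Identifier
	TK_ASSIGN   // <-, :=
	TK_BOOL     // true, false
	TK_NUMBER   // 123.456
	TK_STRING   // `...`
	TK_ADD      // +
	TK_SUBTRACT // -
	TK_MULTIPLY // *
	TK_DIVIDE   // /
	TK_MODULO   // %
	TK_VOID     // _
	TK_DELIM    // ,
)

// kindOf is a hypothetical classifier, not an API of this package:
// it maps a lexeme to the Kind suggested by the constant comments.
func kindOf(lexeme string) Kind {
	switch lexeme {
	case "\n", "\r\n":
		return TK_NEWLINE
	case "<-", ":=":
		return TK_ASSIGN
	case "true", "false":
		return TK_BOOL
	case "+":
		return TK_ADD
	case "-":
		return TK_SUBTRACT
	case "*":
		return TK_MULTIPLY
	case "/":
		return TK_DIVIDE
	case "%":
		return TK_MODULO
	case "_":
		return TK_VOID
	case ",":
		return TK_DELIM
	default:
		return TK_ID
	}
}

func main() {
	fmt.Println(kindOf(":=") == TK_ASSIGN) // true
	fmt.Println(kindOf("x") == TK_ID)      // true
}
```

Because the constants use iota, their numeric values follow declaration order (TK_UNDEFINED is 0, TK_NEWLINE is 1, and so on).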
type Token ¶
type Token interface {
// Text returns the textual representation of the token. For scanned tokens
// this is always the exact source text of the token; other token types may
// use it as they please.
Text() string
// Line returns the line index of the token within its scroll.
Line() int
// Start returns the column index of the first rune within the line.
Start() int
// End returns the column index after the last rune within the line.
End() int
// Kind returns the type of token.
Kind() Kind
// String returns a string representation of the token.
String() string
}
Token represents a token produced by lexical analysis, e.g. an identifier, operator, punctuation, etc.
func UniqueDummy ¶
UniqueDummy creates a new dummy token initialised to the specified token kind and unique text.
func UpdateText ¶
UpdateText updates the text within a token.