Package lexer

v0.2.2

Warning: This package is not in the latest version of its module.
Published: Apr 28, 2026 License: MIT Imports: 3 Imported by: 0

Documentation

Overview

Package lexer provides lexical analysis for Cypher query strings. It tokenizes input strings into a stream of tokens that can be consumed by the parser.

The lexer supports:

  • Identifiers and keywords
  • Numbers (integers and floats)
  • Strings (single and double quoted)
  • Operators (+, -, *, /, %, ^, =, !=, <, <=, >, >=)
  • Delimiters (parentheses, braces, brackets, commas, etc.)
  • Comments (single-line // and multi-line /* */)
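The comment forms listed above (`//` to end of line, `/* */` possibly spanning lines) can be illustrated with a self-contained sketch. This is not the package's internal implementation, just the conventional scanning approach, and it deliberately ignores the extra state a real lexer keeps (e.g. comment markers inside string literals):

```go
package main

import (
	"fmt"
	"strings"
)

// stripComments removes // line comments and /* */ block comments,
// mirroring the two comment forms the lexer skips. It does not handle
// comment markers inside string literals; a real lexer tracks that state.
func stripComments(src string) string {
	var b strings.Builder
	for i := 0; i < len(src); {
		switch {
		case strings.HasPrefix(src[i:], "//"):
			// Skip to end of line; the newline itself is kept.
			if j := strings.IndexByte(src[i:], '\n'); j >= 0 {
				i += j
			} else {
				i = len(src)
			}
		case strings.HasPrefix(src[i:], "/*"):
			// Skip to the closing */, which may be on a later line.
			if j := strings.Index(src[i+2:], "*/"); j >= 0 {
				i += 2 + j + 2
			} else {
				i = len(src) // unterminated: a real lexer reports an error here
			}
		default:
			b.WriteByte(src[i])
			i++
		}
	}
	return b.String()
}

func main() {
	fmt.Println(stripComments("MATCH (n) // find nodes\nRETURN n /* all */"))
}
```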

Basic Usage:

l := lexer.New("MATCH (n:Person) RETURN n.name")
tokens, err := l.Tokenize()
if err != nil {
    log.Fatal(err)
}
for _, tok := range tokens {
    fmt.Printf("%v: %s\n", tok.Type, tok.Value)
}

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func IsKeyword

func IsKeyword(word string) bool

IsKeyword returns true if the given string is a Cypher keyword.

Parameters:

  • word: The string to check

Returns true if the word is a keyword.

Example:

if lexer.IsKeyword("MATCH") {
    // Handle keyword
}
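A map-backed lookup is the usual way such a predicate is implemented. The sketch below is an assumption about the approach, with a deliberately abbreviated keyword set, not the package's actual table:

```go
package main

import (
	"fmt"
	"strings"
)

// keywords holds an illustrative subset; the real package recognizes
// the full Cypher keyword set.
var keywords = map[string]bool{
	"MATCH":  true,
	"RETURN": true,
	"WHERE":  true,
	"CREATE": true,
}

// isKeyword reports whether word is a keyword. Cypher keywords are
// case-insensitive, so the lookup normalizes to upper case first.
func isKeyword(word string) bool {
	return keywords[strings.ToUpper(word)]
}

func main() {
	fmt.Println(isKeyword("match"), isKeyword("name"))
}
```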

Types

type Lexer

type Lexer struct {
	// contains filtered or unexported fields
}

Lexer tokenizes Cypher query strings into a stream of tokens.

func New

func New(input string) *Lexer

New creates a new Lexer for the given input string.

Parameters:

  • input: The Cypher query string to tokenize

Returns a new Lexer instance.

Example:

l := lexer.New("MATCH (n:Person) RETURN n.name")
tokens, err := l.Tokenize()

func (*Lexer) Tokenize

func (l *Lexer) Tokenize() ([]Token, error)

Tokenize processes the entire input and returns all tokens. It returns an error if any lexical errors are encountered.

Returns the slice of tokens and any error encountered.

Example:

l := lexer.New("MATCH (n:Person) RETURN n.name")
tokens, err := l.Tokenize()
if err != nil {
    log.Fatal(err)
}

type LexerError

type LexerError struct {
	Line    int    // Line number where the error occurred
	Column  int    // Column number where the error occurred
	Message string // Error message
}

LexerError represents a lexical error with position information.

func (*LexerError) Error

func (e *LexerError) Error() string

Error returns the error message.

type LexerErrors

type LexerErrors struct {
	Errors []error // Slice of errors
}

LexerErrors aggregates multiple lexical errors.

func (*LexerErrors) Error

func (e *LexerErrors) Error() string

Error returns a string containing all error messages.
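The exact message formats of LexerError and LexerErrors are not documented. The sketch below shows one conventional way position-aware errors like these compose, with the aggregate joining its children's messages; the `line:column: message` layout is a hypothetical choice for illustration, not necessarily what the package produces:

```go
package main

import (
	"fmt"
	"strings"
)

// lexError mirrors the shape of LexerError: a message plus position.
type lexError struct {
	Line, Column int
	Message      string
}

// Error formats the position and message in the common line:column style.
func (e *lexError) Error() string {
	return fmt.Sprintf("%d:%d: %s", e.Line, e.Column, e.Message)
}

// lexErrors mirrors LexerErrors: it aggregates several errors into one.
type lexErrors struct {
	Errors []error
}

// Error joins every contained error's message into a single string.
func (e *lexErrors) Error() string {
	msgs := make([]string, len(e.Errors))
	for i, err := range e.Errors {
		msgs[i] = err.Error()
	}
	return strings.Join(msgs, "; ")
}

func main() {
	errs := &lexErrors{Errors: []error{
		&lexError{Line: 1, Column: 8, Message: "unterminated string"},
		&lexError{Line: 2, Column: 3, Message: "unexpected character '@'"},
	}}
	fmt.Println(errs.Error())
}
```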

type Token

type Token struct {
	Type     TokenType // The type of the token
	Value    string    // The string value of the token
	Line     int       // Line number where the token appears
	Column   int       // Column number where the token appears
	Position int       // Byte position in the input
}

Token represents a lexical token with its type, value, and position.

func (Token) IsEOF

func (t Token) IsEOF() bool

IsEOF returns true if the token is an EOF token.

func (Token) IsError

func (t Token) IsError() bool

IsError returns true if the token is an error token.

func (Token) IsKeyword

func (t Token) IsKeyword(name string) bool

IsKeyword returns true if the token matches the given keyword.

func (Token) String

func (t Token) String() string

String returns the string value of the token.

type TokenType

type TokenType int

TokenType represents the type of a lexical token.

const (
	TokenEOF TokenType = iota
	TokenError
	TokenInteger
	TokenFloat
	TokenString
	TokenBoolean
	TokenNull
	TokenIdentifier
	TokenPlus
	TokenStar
	TokenSlash
	TokenPercent
	TokenCaret
	TokenEq
	TokenNeq
	TokenLt
	TokenLe
	TokenGt
	TokenGe
	TokenAnd
	TokenOr
	TokenNot
	TokenXor
	TokenLParen
	TokenRParen
	TokenLBrace
	TokenRBrace
	TokenLBracket
	TokenRBracket
	TokenColon
	TokenComma
	TokenDot
	TokenSemicolon
	TokenPipe
	TokenArrowRight
	TokenArrowLeft
	TokenDash
	TokenPlusEq
	TokenRange
	TokenDollar
)

Token type constants.
