Documentation
Overview
Package parser provides a recursive descent SQL parser that converts tokens into an Abstract Syntax Tree (AST). It supports comprehensive SQL features including SELECT, INSERT, UPDATE, DELETE, DDL operations, Common Table Expressions (CTEs), set operations (UNION, EXCEPT, INTERSECT), and window functions.
Phase 2 Features (v1.2.0+):
- Common Table Expressions (WITH clause) with recursive support
- Set operations: UNION, UNION ALL, EXCEPT, INTERSECT
- Multiple CTE definitions in single query
- CTE column specifications
- Left-associative set operation parsing
- Integration of CTEs with set operations
Phase 2.5 Features (v1.3.0+):
- Window functions with OVER clause support
- PARTITION BY and ORDER BY in window specifications
- Window frame clauses (ROWS/RANGE with bounds)
- Ranking functions: ROW_NUMBER(), RANK(), DENSE_RANK(), NTILE()
- Analytic functions: LAG(), LEAD(), FIRST_VALUE(), LAST_VALUE()
- Function call parsing with parentheses and arguments
- Integration with existing SELECT statement parsing
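For illustration, a single query can combine these features (a CTE, a window function, and a set operation) and stay within the supported grammar; the table and column names below are invented:

Example:

	WITH ranked AS (
		SELECT dept, name,
		       ROW_NUMBER() OVER (PARTITION BY dept ORDER BY salary DESC) AS rn
		FROM employees
	)
	SELECT dept, name FROM ranked WHERE rn = 1
	UNION ALL
	SELECT dept, name FROM interns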
Index
Constants
const MaxRecursionDepth = 100
MaxRecursionDepth defines the maximum allowed recursion depth for parsing operations. This prevents stack overflow from deeply nested expressions, CTEs, or other recursive structures.
Variables
This section is empty.
Functions
func ConvertTokensForParser added in v1.4.0
func ConvertTokensForParser(tokens []models.TokenWithSpan) ([]token.Token, error)
ConvertTokensForParser is a convenience function that creates a converter and converts tokens in a single call. It maintains backward compatibility with existing CLI code.
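A minimal usage sketch (tokens here stands for tokenizer output of type []models.TokenWithSpan):

Example:

	parserTokens, err := parser.ConvertTokensForParser(tokens)
	if err != nil {
		// Handle conversion error
	}
	// parserTokens is a []token.Token ready for parsing.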
Types
type ConversionResult added in v1.4.0
type ConversionResult struct {
	Tokens          []token.Token
	PositionMapping []TokenPosition // Maps parser token index to original position
}
ConversionResult contains the converted tokens and any position mappings.
func ConvertTokensWithPositions added in v1.4.0
func ConvertTokensWithPositions(tokens []models.TokenWithSpan) (*ConversionResult, error)
ConvertTokensWithPositions provides both tokens and position mapping for enhanced error reporting.
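A sketch of how the result can feed enhanced error messages (the parse step that would consume result.Tokens is omitted):

Example:

	result, err := parser.ConvertTokensWithPositions(tokens)
	if err != nil {
		// Handle conversion error
	}
	// result.Tokens go to the parser; if parsing later fails at token i,
	// result.PositionMapping[i] recovers the original source span:
	pos := result.PositionMapping[0]
	_ = pos.Start // models.Location of the first token in the source SQL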
type Parser
type Parser struct {
	// contains filtered or unexported fields
}
Parser represents a SQL parser.
func (*Parser) ParseContext added in v1.5.0
ParseContext parses the tokens into an AST with context support for cancellation. It checks the context at strategic points (every statement and expression) to enable fast cancellation. Returns context.Canceled or context.DeadlineExceeded when the context is cancelled.
This method is useful for:
- Long-running parsing operations that need to be cancellable
- Implementing timeouts for parsing
- Graceful shutdown scenarios
Example:

	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()
	astNode, err := parser.ParseContext(ctx, tokens)
	if errors.Is(err, context.DeadlineExceeded) {
		// Handle timeout
	}
type TokenConverter added in v1.4.0
type TokenConverter struct {
	// contains filtered or unexported fields
}
TokenConverter provides centralized, optimized token conversion from tokenizer output (models.TokenWithSpan) to parser input (token.Token).
func NewTokenConverter added in v1.4.0
func NewTokenConverter() *TokenConverter
NewTokenConverter creates an optimized token converter.
func (*TokenConverter) Convert added in v1.4.0
func (tc *TokenConverter) Convert(tokens []models.TokenWithSpan) (*ConversionResult, error)
Convert converts tokenizer tokens to parser tokens with position tracking.
type TokenPosition added in v1.4.0
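A sketch of the intended reuse pattern, assuming a single converter may be reused across queries (batches is a hypothetical slice of tokenizer outputs):

Example:

	tc := parser.NewTokenConverter()
	for _, toks := range batches {
		result, err := tc.Convert(toks)
		if err != nil {
			// Handle conversion error
			continue
		}
		_ = result.Tokens // feed to the parser
	}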
type TokenPosition struct {
	OriginalIndex int                   // Index in original token slice
	Start         models.Location       // Original start position
	End           models.Location       // Original end position
	SourceToken   *models.TokenWithSpan // Reference to original token for error reporting
}
TokenPosition maps a parser token back to its original source position.
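A sketch of position-aware error reporting built on TokenPosition (errIndex is a hypothetical parser token index, and this assumes models.Location exposes Line and Column fields):

Example:

	pos := result.PositionMapping[errIndex]
	fmt.Printf("parse error at line %d, column %d\n",
		pos.Start.Line, pos.Start.Column)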