fexpr

package module
v0.5.0
Published: Mar 27, 2025 License: BSD-3-Clause Imports: 5 Imported by: 54

README

fexpr

fexpr is a filter query language parser that generates an easy-to-work-with AST structure, so that you can safely create SQL, Elasticsearch, etc. queries from user input.

Or in other words, it transforms the string "id > 1" into the struct [{&& {{identifier id} > {number 1}}}].

It supports parentheses and various conditional expression operators (see Grammar).

Example usage

go get github.com/ganigeorgiev/fexpr

package main

import (
    "fmt"

    "github.com/ganigeorgiev/fexpr"
)

func main() {
    // [{&& {{identifier id} = {number 123}}} {&& {{identifier status} = {text active}}}]
    result, err := fexpr.Parse("id=123 && status='active'")
    if err != nil {
        panic(err)
    }
    fmt.Println(result)
}

Note that each parsed expression statement contains a join/union operator (&& or ||) so that the result can be consumed in small chunks without having to rely on the group/nesting context.
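
For example, here is a minimal sketch of walking the parsed result chunk by chunk. The flatten helper is illustrative only and not part of the package:

package main

import (
    "fmt"

    "github.com/ganigeorgiev/fexpr"
)

// flatten prints every expression together with its join operator,
// recursing into nested groups (parenthesized sub-expressions).
func flatten(groups []fexpr.ExprGroup, indent string) {
    for _, g := range groups {
        switch item := g.Item.(type) {
        case fexpr.Expr:
            fmt.Printf("%s%s %s %s %s\n", indent, g.Join, item.Left.Literal, item.Op, item.Right.Literal)
        case []fexpr.ExprGroup:
            fmt.Printf("%s%s (\n", indent, g.Join)
            flatten(item, indent+"    ")
            fmt.Printf("%s)\n", indent)
        }
    }
}

func main() {
    result, err := fexpr.Parse("id=123 && (status='active' || status='pending')")
    if err != nil {
        panic(err)
    }

    flatten(result, "")
}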

See the package documentation for more details and examples.

Grammar

fexpr grammar resembles the SQL WHERE expression syntax. It recognizes several token types (identifiers, numbers, quoted text, expression operators, whitespaces, etc.).

You can find all supported tokens in scanner.go.

Operators
  • = Equal operator (eg. a=b)
  • != NOT Equal operator (eg. a!=b)
  • > Greater than operator (eg. a>b)
  • >= Greater than or equal operator (eg. a>=b)
  • < Less than operator (eg. a<b)
  • <= Less than or equal operator (eg. a<=b)
  • ~ Like/Contains operator (eg. a~b)
  • !~ NOT Like/Contains operator (eg. a!~b)
  • ?= Array/Any equal operator (eg. a?=b)
  • ?!= Array/Any NOT Equal operator (eg. a?!=b)
  • ?> Array/Any Greater than operator (eg. a?>b)
  • ?>= Array/Any Greater than or equal operator (eg. a?>=b)
  • ?< Array/Any Less than operator (eg. a?<b)
  • ?<= Array/Any Less than or equal operator (eg. a?<=b)
  • ?~ Array/Any Like/Contains operator (eg. a?~b)
  • ?!~ Array/Any NOT Like/Contains operator (eg. a?!~b)
  • && AND join operator (eg. a=b && c=d)
  • || OR join operator (eg. a=b || c=d)
  • () Parenthesis (eg. (a=1 && b=2) || (a=3 && b=4))
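
Any of the operators above can be combined in a single filter string. Here is a minimal sketch of parsing one of them (the printed representation is only indicative):

package main

import (
    "fmt"

    "github.com/ganigeorgiev/fexpr"
)

func main() {
    // the "?~" any-like operator and the "||" join are parsed like any other operator
    result, err := fexpr.Parse("tags?~'go' || total>=100")
    if err != nil {
        panic(err)
    }

    fmt.Println(result)
}
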
Numbers

Number tokens are any integer or decimal numbers.

Example: 123, 10.50, -14.

Quoted text

Text tokens are any literals that are wrapped by ' or " quotes.

Example: 'Lorem ipsum dolor 123!', "escaped \"word\"", "mixed 'quotes' are fine".

Identifiers

Identifier tokens are literals that start with a letter, _, @ or # and can be followed by any number of letters, digits, . (usually used as a separator) or : (usually used as a modifier) characters.

Example: id, a.b.c, field123, @request.method, author.name:length.
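
Here is a short hedged sketch that combines identifiers (including a :length style modifier), a number and quoted text in one parsed filter:

package main

import (
    "fmt"

    "github.com/ganigeorgiev/fexpr"
)

func main() {
    // "author.name:length" is scanned as a single identifier token with a modifier
    result, err := fexpr.Parse("author.name:length >= 10 && title ~ 'Lorem'")
    if err != nil {
        panic(err)
    }

    fmt.Println(result)
}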

Functions

Function tokens are similar to the identifiers but in addition accept a list of arguments enclosed in parentheses (). The function arguments must be separated by a comma (a single trailing comma is also allowed) and each argument can be an identifier, quoted text, number or another nested function (up to 2 levels of nesting).

Example: test(), test(a.b, 123, "abc"), @a.b.c:test(true), a(b(c(1, 2))).
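
Here is a hedged sketch of tokenizing a function call with the scanner. How the arguments are exposed (for example through the token's Meta field) is not spelled out here, so the example only prints each token's type and literal:

package main

import (
    "fmt"

    "github.com/ganigeorgiev/fexpr"
)

func main() {
    s := fexpr.NewScanner([]byte(`test(a.b, 123, "abc")`))

    for {
        t, err := s.Scan()
        if t.Type == fexpr.TokenEOF || err != nil {
            break
        }

        // the whole call is expected to come back as a single "function" token
        fmt.Println(t.Type, t.Literal)
    }
}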

Comments

Comment tokens are single-line text literals starting with //. Similar to whitespaces, comments are ignored by fexpr.Parse().

Example: // test.
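
For example, a trailing comment simply disappears from the parsed result (the scanner itself would still report it as a comment token). A minimal sketch:

package main

import (
    "fmt"

    "github.com/ganigeorgiev/fexpr"
)

func main() {
    // should print the same AST as parsing just "id = 123"
    result, err := fexpr.Parse("id = 123 // only numeric ids")
    if err != nil {
        panic(err)
    }

    fmt.Println(result)
}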

Using only the scanner

The tokenizer (aka. fexpr.Scanner) can be used without the parser's state machine so that you can write your own custom token processing:

s := fexpr.NewScanner([]byte("id > 123"))

// scan single token at a time until EOF or error is reached
for {
    t, err := s.Scan()
    if t.Type == fexpr.TokenEOF || err != nil {
        break
    }

    fmt.Println(t)
}

// Output:
// {<nil> identifier id}
// {<nil> whitespace  }
// {<nil> sign >}
// {<nil> whitespace  }
// {<nil> number 123}

Documentation

Constants

const (
	StepJoin
)

parser's state machine steps

Variables

var ErrEmpty = errors.New("empty filter expression")
var ErrIncomplete = errors.New("invalid or incomplete filter expression")
var ErrInvalidComment = errors.New("invalid comment")

Functions

This section is empty.

Types

type Expr

type Expr struct {
	Left  Token
	Op    SignOp
	Right Token
}

Expr represents an individual tokenized expression consisting of left operand, operator and a right operand.

func (Expr) IsZero added in v0.4.0

func (e Expr) IsZero() bool

IsZero checks if the current Expr has zero-valued props.

type ExprGroup

type ExprGroup struct {
	Item interface{}
	Join JoinOp
}

ExprGroup represents a wrapped expression and its join type.

The group's Item could be either an `Expr` instance or `[]ExprGroup` slice (for nested expressions).

func Parse

func Parse(text string) ([]ExprGroup, error)

Parse parses the provided text and returns its processed AST in the form of `ExprGroup` slice(s).

Comments and whitespaces are ignored.

Example
package main

import (
	"fmt"

	"github.com/ganigeorgiev/fexpr"
)

func main() {
	result, _ := fexpr.Parse("id > 123")

	fmt.Println(result)

}
Output:

[{{{<nil> identifier id} > {<nil> number 123}} &&}]

type JoinOp

type JoinOp string

JoinOp represents a join type operator.

const (
	JoinAnd JoinOp = "&&"
	JoinOr  JoinOp = "||"
)

supported join type operators

type Scanner

type Scanner struct {
	// contains filtered or unexported fields
}

Scanner represents a filter and lexical scanner.

func NewScanner

func NewScanner(data []byte) *Scanner

NewScanner creates and returns a new scanner instance loaded with the specified data.

func (*Scanner) Scan

func (s *Scanner) Scan() (Token, error)

Scan reads and returns the next available token value from the scanner's buffer.

Example
package main

import (
	"fmt"

	"github.com/ganigeorgiev/fexpr"
)

func main() {
	s := fexpr.NewScanner([]byte("id > 123"))

	for {
		t, err := s.Scan()
		if t.Type == fexpr.TokenEOF || err != nil {
			break
		}

		fmt.Println(t)
	}

}
Output:

{<nil> identifier id}
{<nil> whitespace  }
{<nil> sign >}
{<nil> whitespace  }
{<nil> number 123}

type SignOp

type SignOp string

SignOp represents an expression sign operator.

const (
	SignEq    SignOp = "="
	SignNeq   SignOp = "!="
	SignLike  SignOp = "~"
	SignNlike SignOp = "!~"
	SignLt    SignOp = "<"
	SignLte   SignOp = "<="
	SignGt    SignOp = ">"
	SignGte   SignOp = ">="

	// array/any operators
	SignAnyEq    SignOp = "?="
	SignAnyNeq   SignOp = "?!="
	SignAnyLike  SignOp = "?~"
	SignAnyNlike SignOp = "?!~"
	SignAnyLt    SignOp = "?<"
	SignAnyLte   SignOp = "?<="
	SignAnyGt    SignOp = "?>"
	SignAnyGte   SignOp = "?>="
)

supported expression sign operators

type Token

type Token struct {
	Meta    interface{}
	Type    TokenType
	Literal string
}

Token represents a single scanned literal (one or more combined runes).

type TokenType

type TokenType string

TokenType represents a Token type.

const (
	TokenUnexpected TokenType = "unexpected"
	TokenEOF        TokenType = "eof"
	TokenWS         TokenType = "whitespace"
	TokenJoin       TokenType = "join"
	TokenSign       TokenType = "sign"
	TokenIdentifier TokenType = "identifier" // variable, column name, placeholder, etc.
	TokenFunction   TokenType = "function"   // function
	TokenNumber     TokenType = "number"
	TokenText       TokenType = "text"  // ' or " quoted string
	TokenGroup      TokenType = "group" // grouped/nested tokens
	TokenComment    TokenType = "comment"
)

token type constants
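
When consuming the scanner directly, these constants are what you would typically switch on. A minimal sketch (the grouping of the cases below is only a suggestion):

package main

import (
	"fmt"

	"github.com/ganigeorgiev/fexpr"
)

func main() {
	s := fexpr.NewScanner([]byte("id > 123 // inline comment"))

	for {
		t, err := s.Scan()
		if t.Type == fexpr.TokenEOF || err != nil {
			break
		}

		switch t.Type {
		case fexpr.TokenWS, fexpr.TokenComment:
			// insignificant for most consumers
		case fexpr.TokenIdentifier, fexpr.TokenNumber, fexpr.TokenText:
			fmt.Println("operand:", t.Literal)
		case fexpr.TokenSign, fexpr.TokenJoin:
			fmt.Println("operator:", t.Literal)
		default:
			fmt.Println(t.Type, t.Literal)
		}
	}
}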
