Documentation

Overview
Package astparser is used to turn raw GraphQL documents into an AST.
Index
- func ParseGraphqlDocumentBytes(input []byte) (ast.Document, operationreport.Report)
- func ParseGraphqlDocumentString(input string) (ast.Document, operationreport.Report)
- type ErrDepthLimitExceeded
- type ErrFieldsLimitExceeded
- type ErrUnexpectedIdentKey
- type ErrUnexpectedToken
- type Parser
- func (p *Parser) Parse(document *ast.Document, report *operationreport.Report)
- func (p *Parser) ParseType() (ref int)
- func (p *Parser) ParseValue() (value ast.Value)
- func (p *Parser) ParseWithLimits(limits TokenizerLimits, document *ast.Document, report *operationreport.Report) (TokenizerStats, error)
- func (p *Parser) PrepareImport(document *ast.Document, report *operationreport.Report)
- type Tokenizer
- type TokenizerLimits
- type TokenizerStats
Constants
This section is empty.
Variables
This section is empty.
Functions
func ParseGraphqlDocumentBytes
func ParseGraphqlDocumentBytes(input []byte) (ast.Document, operationreport.Report)
ParseGraphqlDocumentBytes takes a raw GraphQL document as a byte slice and parses it into an AST. This function creates a new parser and a new AST for every call, so you shouldn't use it in a hot path. Instead, create parser and AST objects once and re-use them.
func ParseGraphqlDocumentString
func ParseGraphqlDocumentString(input string) (ast.Document, operationreport.Report)
ParseGraphqlDocumentString takes a raw GraphQL document as a string and parses it into an AST. This function creates a new parser and a new AST for every call, so you shouldn't use it in a hot path. Instead, create parser and AST objects once and re-use them.
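For illustration, a minimal sketch of calling the string variant. The import path and the Report.HasErrors/Report.Error/Document.RootNodes accessors are assumptions about the surrounding module; they are not documented on this page.

package main

import (
	"fmt"

	// Assumed import path; adjust to wherever this package lives in your module.
	"github.com/wundergraph/graphql-go-tools/v2/pkg/astparser"
)

func main() {
	doc, report := astparser.ParseGraphqlDocumentString(`query { hero { name } }`)
	if report.HasErrors() { // assumption: Report exposes HasErrors and Error
		fmt.Println("parse failed:", report.Error())
		return
	}
	fmt.Println(len(doc.RootNodes), "root node(s) parsed") // assumption: Document.RootNodes
}

ParseGraphqlDocumentBytes behaves identically for a []byte input.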
Types
type ErrDepthLimitExceeded
type ErrDepthLimitExceeded struct {
// contains filtered or unexported fields
}
ErrDepthLimitExceeded is returned when the parser encounters a nesting depth that exceeds the configured limit during tokenization. This error helps prevent stack overflows and DoS attacks from maliciously deep GraphQL documents.
func (ErrDepthLimitExceeded) Error
func (e ErrDepthLimitExceeded) Error() string
type ErrFieldsLimitExceeded
type ErrFieldsLimitExceeded struct {
// contains filtered or unexported fields
}
ErrFieldsLimitExceeded is returned when the parser encounters a number of fields that exceeds the configured limit during tokenization. This error helps prevent DoS attacks from maliciously large GraphQL documents.
func (ErrFieldsLimitExceeded) Error
func (e ErrFieldsLimitExceeded) Error() string
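Since both limit errors are exported value types (their Error methods have value receivers, per the signatures above), a caller can distinguish them with errors.As. A sketch, assuming the ast and operationreport packages live beside this one:

import (
	"errors"
	"log"

	"github.com/wundergraph/graphql-go-tools/v2/pkg/ast"             // assumed path
	"github.com/wundergraph/graphql-go-tools/v2/pkg/astparser"       // assumed path
	"github.com/wundergraph/graphql-go-tools/v2/pkg/operationreport" // assumed path
)

func parseWithGuards(limits astparser.TokenizerLimits, doc *ast.Document, report *operationreport.Report) {
	p := astparser.NewParser()
	_, err := p.ParseWithLimits(limits, doc, report)

	var depthErr astparser.ErrDepthLimitExceeded
	var fieldsErr astparser.ErrFieldsLimitExceeded
	switch {
	case errors.As(err, &depthErr):
		log.Println("rejected: document nested too deeply")
	case errors.As(err, &fieldsErr):
		log.Println("rejected: document declares too many fields")
	case err != nil:
		log.Println("tokenization failed:", err)
	}
}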
type ErrUnexpectedIdentKey
type ErrUnexpectedIdentKey struct {
// contains filtered or unexported fields
}
ErrUnexpectedIdentKey is a custom error object used to properly render an unexpected ident key error.
func (ErrUnexpectedIdentKey) Error
func (e ErrUnexpectedIdentKey) Error() string
type ErrUnexpectedToken
type ErrUnexpectedToken struct {
// contains filtered or unexported fields
}
ErrUnexpectedToken is a custom error object containing all necessary information to properly render an unexpected token error.
func (ErrUnexpectedToken) Error
func (e ErrUnexpectedToken) Error() string
type Parser
type Parser struct {
// contains filtered or unexported fields
}
Parser takes a raw input and turns it into an AST. Use NewParser() to create a parser. Don't create new parsers in the hot path; re-use them.
func NewParser
func NewParser() *Parser
NewParser returns a new parser with all values properly initialized.
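A sketch of the re-use pattern recommended above. ast.NewDocument, Document.Reset, and Input.ResetInputString are assumptions drawn from the companion ast package; verify them against that package's documentation.

func parseMany(rawDocuments []string) {
	// Allocate once, outside the hot path.
	parser := astparser.NewParser()
	doc := ast.NewDocument() // assumption: constructor provided by the ast package

	for _, raw := range rawDocuments {
		doc.Reset()                     // assumption: clears the previous AST
		doc.Input.ResetInputString(raw) // assumption: loads the next raw document
		var report operationreport.Report
		parser.Parse(doc, &report)
		if report.HasErrors() {
			log.Println("parse failed:", report.Error())
		}
	}
}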
func (*Parser) Parse
func (p *Parser) Parse(document *ast.Document, report *operationreport.Report)
Parse parses all input in a Document.Input into the Document.
func (*Parser) ParseType
func (p *Parser) ParseType() (ref int)
func (*Parser) ParseValue
func (p *Parser) ParseValue() (value ast.Value)
func (*Parser) ParseWithLimits
func (p *Parser) ParseWithLimits(limits TokenizerLimits, document *ast.Document, report *operationreport.Report) (TokenizerStats, error)
ParseWithLimits parses all input in a Document.Input into the Document, enforcing limits on the number of fields and the nesting depth.
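A hedged sketch of a call. TokenizerLimits' exported fields are not listed on this page, so their configuration is left as a comment rather than guessed (imports: fmt, plus the packages shown earlier).

func parseLimited(doc *ast.Document, limits astparser.TokenizerLimits) (astparser.TokenizerStats, error) {
	// Populate limits before calling; its field names are not documented here.
	var report operationreport.Report
	p := astparser.NewParser()

	stats, err := p.ParseWithLimits(limits, doc, &report)
	if err != nil {
		// err may be an ErrDepthLimitExceeded or ErrFieldsLimitExceeded value.
		return stats, err
	}
	if report.HasErrors() {
		return stats, fmt.Errorf("parse: %s", report.Error())
	}
	return stats, nil
}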
func (*Parser) PrepareImport
func (p *Parser) PrepareImport(document *ast.Document, report *operationreport.Report)
PrepareImport prepares the Parser for importing new Nodes into an AST without directly parsing the content.
type Tokenizer
type Tokenizer struct {
// contains filtered or unexported fields
}
Tokenizer takes a raw input and turns it into a set of tokens.
func (*Tokenizer) Peek
Peek returns the token next to currentToken if hasNextToken is true; otherwise it returns keyword.EOF.
func (*Tokenizer) Read
Read increments the currentToken index and returns that token if hasNextToken is true; otherwise it returns keyword.EOF.
func (*Tokenizer) TokenizeWithLimits
func (t *Tokenizer) TokenizeWithLimits(limits TokenizerLimits, input *ast.Input) (TokenizerStats, error)