Documentation
Overview
Package obfuscate implements quantizing and obfuscating of tags and resources for a set of spans matching a certain criteria.
Index
Constants
const (
	LexError = TokenKind(57346) + iota
	ID
	Limit
	Null
	String
	DoubleQuotedString
	Number
	BooleanLiteral
	ValueArg
	ListArg
	Comment
	Variable
	Savepoint
	PreparedStatement
	EscapeSequence
	NullSafeEqual
	LE
	GE
	NE
	As
	From
	Update
	Insert
	Into
	Join
	// FilteredGroupable specifies that the given token has been discarded by one of the
	// token filters and that it is groupable together with consecutive FilteredGroupable
	// tokens.
	FilteredGroupable
	// Filtered specifies that the token is a comma and was discarded by one
	// of the filters.
	Filtered
	// FilteredBracketedIdentifier specifies that we are currently discarding
	// a bracketed identifier (MSSQL).
	// See issue https://github.com/DataDog/datadog-trace-agent/issues/475.
	FilteredBracketedIdentifier
)
List of available tokens; this list has been reduced because we don't need a full-fledged tokenizer to implement a Lexer.
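The constant block above relies on Go's iota: LexError is anchored at 57346 and every following identifier receives the next consecutive value. A minimal standalone sketch (using a local TokenKind type for illustration, not the package's) shows how the enumeration works:

```go
package main

import "fmt"

// TokenKind mirrors the package's token type, for illustration only.
type TokenKind int

// The same iota pattern as the package: LexError is 57346, and each
// subsequent constant receives the next consecutive value.
const (
	LexError = TokenKind(57346) + iota
	ID
	Limit
	Null
)

func main() {
	fmt.Println(LexError, ID, Limit, Null) // 57346 57347 57348 57349
}
```

Starting above unicode.MaxRune keeps the token values out of the range of ordinary characters, so a TokenKind can also carry a plain rune without ambiguity.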
const EOFChar = unicode.MaxRune + 1
EOFChar is used to signal that no more characters were found by the scanner. It is an invalid rune value that cannot be found in any valid string.
Variables
This section is empty.
Functions
This section is empty.
Types
type Config
type Config struct {
// ES holds the obfuscation configuration for ElasticSearch bodies.
ES JSONSettings
// Mongo holds the obfuscation configuration for MongoDB queries.
Mongo JSONSettings
// RemoveQueryString specifies whether query strings should be removed from HTTP URLs.
RemoveQueryString bool
// RemovePathDigits specifies whether digits in HTTP path segments should be removed.
RemovePathDigits bool
// RemoveStackTraces specifies whether stack traces should be removed. More specifically,
// the "error.stack" tag values will be cleared from spans.
RemoveStackTraces bool
// Redis enables obfuscation of the "redis.raw_command" tag for spans of type "redis".
Redis bool
// Memcached enables obfuscation of the "memcached.command" tag for spans of type "memcached".
Memcached bool
// contains filtered or unexported fields
}
Config specifies the obfuscator configuration.
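A Config is typically populated before constructing an Obfuscator. The literal below is a sketch using only the exported fields documented above; the values are illustrative choices, not defaults:

```go
cfg := &obfuscate.Config{
	// Obfuscate ElasticSearch bodies, but keep selected key values intact.
	ES: obfuscate.JSONSettings{
		Enabled:    true,
		KeepValues: []string{"user_id"},
	},
	// Obfuscate MongoDB queries as well.
	Mongo: obfuscate.JSONSettings{Enabled: true},
	// Scrub query strings and digits from HTTP URLs and paths.
	RemoveQueryString: true,
	RemovePathDigits:  true,
	// Clear "error.stack" tag values from spans.
	RemoveStackTraces: true,
	Redis:             true,
	Memcached:         true,
}
```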
type JSONSettings
type JSONSettings struct {
// Enabled will specify whether obfuscation should be enabled.
Enabled bool
// KeepValues specifies a set of keys for which their values will not be obfuscated.
KeepValues []string
}
JSONSettings specifies the behaviour of the JSON obfuscator.
type Obfuscator
type Obfuscator struct {
// contains filtered or unexported fields
}
Obfuscator quantizes and obfuscates spans. The obfuscator is not safe for concurrent use.
func NewObfuscator
func NewObfuscator(cfg *Config) *Obfuscator
NewObfuscator creates a new Obfuscator.
func (*Obfuscator) Obfuscate
func (o *Obfuscator) Obfuscate(span *pb.Span)
Obfuscate may obfuscate the span's properties based on its type and on the Obfuscator's configuration.
func (*Obfuscator) SQLLiteralEscapes
func (o *Obfuscator) SQLLiteralEscapes() bool
SQLLiteralEscapes reports whether escape characters should be treated literally by the SQL obfuscator.
func (*Obfuscator) SetSQLLiteralEscapes
func (o *Obfuscator) SetSQLLiteralEscapes(ok bool)
SetSQLLiteralEscapes sets whether or not escape characters should be treated literally by the SQL obfuscator.
type SQLTokenizer
type SQLTokenizer struct {
// contains filtered or unexported fields
}
SQLTokenizer is the struct used to generate SQL tokens for the parser.
func NewSQLTokenizer
func NewSQLTokenizer(sql string, literalEscapes bool) *SQLTokenizer
NewSQLTokenizer creates a new SQLTokenizer for the given SQL string. The literalEscapes argument specifies whether escape characters should be treated literally or as escape sequences.
func (*SQLTokenizer) Err
func (tkn *SQLTokenizer) Err() error
Err returns the last error that the tokenizer encountered, or nil.
func (*SQLTokenizer) Reset
func (tkn *SQLTokenizer) Reset(in string)
Reset resets the underlying buffer and scanning positions.
func (*SQLTokenizer) Scan
func (tkn *SQLTokenizer) Scan() (TokenKind, []byte)
Scan scans the tokenizer for the next token and returns the token type and the token buffer.
func (*SQLTokenizer) SeenEscape
func (tkn *SQLTokenizer) SeenEscape() bool
SeenEscape reports whether this tokenizer has seen an escape character within a scanned string.
type SyntaxError
type SyntaxError struct {
Offset int64 // error occurred after reading Offset bytes
// contains filtered or unexported fields
}
A SyntaxError is a description of a JSON syntax error.
func (*SyntaxError) Error
func (e *SyntaxError) Error() string