Documentation ¶
Overview ¶
Package json provides a high-performance, thread-safe JSON processing library with 100% encoding/json compatibility and advanced path operations.
The package uses an internal package for implementation details:
- internal: Private implementation including path parsing, navigation, extraction, caching, array utilities, security helpers, and encoding utilities
Most users can simply import the root package:
import "github.com/cybergodev/json"
Basic Usage ¶
Simple operations (100% compatible with encoding/json):
data, err := json.Marshal(value)
err = json.Unmarshal(data, &target)
Advanced path operations:
value, err := json.Get(`{"user":{"name":"John"}}`, "user.name")
result, err := json.Set(`{"user":{}}`, "user.age", 30)
Type-safe operations:
name, err := json.GetString(jsonStr, "user.name")
age, err := json.GetAsInt(jsonStr, "user.age")
Advanced processor for complex operations:
processor, err := json.New() // Use default config
if err != nil {
// handle error
}
defer processor.Close()
value, err := processor.Get(jsonStr, "complex.path[0].field")
Configuration ¶
Use DefaultConfig and optional parameters for custom configuration:
cfg := json.DefaultConfig()
cfg.EnableCache = true
processor, err := json.New(cfg)
if err != nil {
// handle error
}
defer processor.Close()
Key Features ¶
- 100% encoding/json compatibility - drop-in replacement
- High-performance path operations with smart caching
- Thread-safe concurrent operations
- Type-safe generic operations with Go 1.18+ generics
- Memory-efficient resource pooling
- Production-ready error handling and validation
Package Structure ¶
The package is organized with all public API in the root package:
- Core types: Processor, Config, ProcessorOptions, EncodeConfig
- Error types: JsonsError, various error constructors
- Encoding types: Number
Implementation details are in the internal/ package:
- Path parsing and navigation utilities
- Extraction and segment handling
- Cache and array utilities
- Security and encoding helpers
Core Types Organization ¶
Core types are organized in the following files:
- types.go: All type definitions (Config, ProcessorOptions, Stats, etc.)
- processor.go: Processor struct and all methods
- ops.go: Internal operation implementations
- path.go: Path parsing and navigation
- encoding.go: JSON encoding/decoding
- api.go: Package-level API functions
- file.go: File operations
- iterator.go: Iteration utilities
- recursive.go: Recursive processing
Package json provides extension interfaces for customizing JSON processing behavior. These interfaces enable users to inject custom validation, encoding, and middleware logic.
Index ¶
- Constants
- Variables
- func As[T any](r AccessResult) (T, error)
- func ClampIndex(index, length int) int
- func ClearCache()
- func ClearDangerousPatterns()
- func ClearKeyInternCache()
- func Compact(dst *bytes.Buffer, src []byte) error
- func CompactBuffer(dst *bytes.Buffer, src []byte, cfg ...Config) error
- func CompactString(jsonStr string, cfg ...Config) (string, error)
- func CompareJSON(json1, json2 string) (bool, error)
- func ConvertToBool(value any) (bool, bool)
- func ConvertToFloat64(value any) (float64, bool)
- func ConvertToInt(value any) (int, bool)
- func ConvertToInt64(value any) (int64, bool)
- func ConvertToString(value any) string
- func ConvertToUint64(value any) (uint64, bool)
- func CreateEmptyContainer(containerType string) any
- func DeepCopy(data any) (any, error)
- func Delete(jsonStr, path string, cfg ...Config) (string, error)
- func Encode(value any, cfg ...Config) (string, error)
- func EncodeBatch(pairs map[string]any, cfg ...Config) (string, error)
- func EncodeFields(value any, fields []string, cfg ...Config) (string, error)
- func EncodeStream(values any, cfg ...Config) (string, error)
- func EncodeWithConfig(value any, cfg ...Config) (string, error)
- func EscapeJSONPointer(s string) string
- func Foreach(jsonStr string, fn func(key any, item *IterableValue))
- func ForeachNested(jsonStr string, fn func(key any, item *IterableValue))
- func ForeachReturn(jsonStr string, fn func(key any, item *IterableValue)) (string, error)
- func ForeachWithPath(jsonStr, path string, fn func(key any, item *IterableValue)) error
- func ForeachWithPathAndControl(jsonStr, path string, fn func(key any, value any) IteratorControl) error
- func FormatNumber(value any) string
- func Get(jsonStr, path string, cfg ...Config) (any, error)
- func GetArray(jsonStr, path string, cfg ...Config) ([]any, error)
- func GetBool(jsonStr, path string, cfg ...Config) (bool, error)
- func GetBoolOr(jsonStr, path string, defaultValue bool, cfg ...Config) bool
- func GetContainerSize(data any) int
- func GetErrorSuggestion(err error) string
- func GetFloat(jsonStr, path string, cfg ...Config) (float64, error)
- func GetFloatOr(jsonStr, path string, defaultValue float64, cfg ...Config) float64
- func GetInt(jsonStr, path string, cfg ...Config) (int, error)
- func GetIntOr(jsonStr, path string, defaultValue int, cfg ...Config) int
- func GetKeyInternCacheSize() int
- func GetMultiple(jsonStr string, paths []string, cfg ...Config) (map[string]any, error)
- func GetObject(jsonStr, path string, cfg ...Config) (map[string]any, error)
- func GetResultBuffer() *[]byte
- func GetString(jsonStr, path string, cfg ...Config) (string, error)
- func GetStringOr(jsonStr, path string, defaultValue string, cfg ...Config) string
- func GetTyped[T any](jsonStr, path string, cfg ...Config) (T, error)
- func GetTypedOr[T any](jsonStr, path string, defaultValue T, cfg ...Config) T
- func HTMLEscape(dst *bytes.Buffer, src []byte)
- func HTMLEscapeBuffer(dst *bytes.Buffer, src []byte, cfg ...Config)
- func Indent(dst *bytes.Buffer, src []byte, prefix, indent string) error
- func IndentBuffer(dst *bytes.Buffer, src []byte, prefix, indent string, cfg ...Config) error
- func InternKey(key string) string
- func IsContainer(data any) bool
- func IsDeletedMarker(v any) bool
- func IsEmptyOrZero(v any) bool
- func IsInteger(s string) bool
- func IsNumeric(s string) bool
- func IsRetryable(err error) bool
- func IsSecurityRelated(err error) bool
- func IsUserError(err error) bool
- func IsValidJSON(jsonStr string) bool
- func IsValidPath(path string) bool
- func LoadFromFile(filePath string, cfg ...Config) (string, error)
- func Marshal(v any) ([]byte, error)
- func MarshalIndent(v any, prefix, indent string) ([]byte, error)
- func MarshalToFile(filePath string, data any, cfg ...Config) error
- func MergeJSON(json1, json2 string, mode ...MergeMode) (string, error)
- func MergeJSONMany(mode MergeMode, jsons ...string) (string, error)
- func NewProcessorUtils() *processorUtils
- func Parse(jsonStr string, cfg ...Config) (any, error)
- func ParseBool(s string) (bool, error)
- func ParseFloat(s string) (float64, error)
- func ParseInt(s string) (int, error)
- func ParseJSONL(data []byte) ([]any, error)
- func ParseJSONLInto[T any](data []byte) ([]T, error)
- func Prettify(jsonStr string, cfg ...Config) (string, error)
- func Print(data any)
- func PrintE(data any) error
- func PrintPretty(data any)
- func PrintPrettyE(data any) error
- func PutResultBuffer(buf *[]byte)
- func RegisterDangerousPattern(pattern DangerousPattern)
- func SafeConvertToInt64(value any) (int64, error)
- func SafeConvertToUint64(value any) (uint64, error)
- func SanitizeKey(key string) string
- func SaveToFile(filePath string, data any, cfg ...Config) error
- func SaveToWriter(writer io.Writer, data any, cfg ...Config) error
- func Set(jsonStr, path string, value any, cfg ...Config) (string, error)
- func SetGlobalProcessor(processor *Processor)
- func SetMultiple(jsonStr string, updates map[string]any, cfg ...Config) (string, error)
- func ShutdownGlobalProcessor()
- func StreamArrayCount(reader io.Reader) (int, error)
- func StreamArrayFilter(reader io.Reader, predicate func(any) bool) ([]any, error)
- func StreamArrayFirst(reader io.Reader, predicate func(any) bool) (any, bool, error)
- func StreamArrayForEach(reader io.Reader, fn func(int, any) error) error
- func StreamArrayMap(reader io.Reader, transform func(any) any) ([]any, error)
- func StreamArrayReduce(reader io.Reader, initial any, reducer func(any, any) any) (any, error)
- func StreamArraySkip(reader io.Reader, n int) ([]any, error)
- func StreamArrayTake(reader io.Reader, n int) ([]any, error)
- func StreamLinesInto[T any](reader io.Reader, fn func(lineNum int, data T) error) ([]T, error)
- func StreamLinesIntoWithConfig[T any](reader io.Reader, config JSONLConfig, fn func(lineNum int, data T) error) ([]T, error)
- func ToJSONL(data []any) ([]byte, error)
- func ToJSONLString(data []any) (string, error)
- func UnescapeJSONPointer(s string) string
- func Unmarshal(data []byte, v any) error
- func UnmarshalFromFile(path string, v any, cfg ...Config) error
- func UnregisterDangerousPattern(pattern string)
- func Valid(data []byte) bool
- func ValidString(jsonStr string) bool
- func ValidWithOptions(jsonStr string, cfg ...Config) (bool, error)
- func ValidatePath(path string) error
- func WarmupPathCache(commonPaths []string)
- func WarmupPathCacheWithProcessor(processor *Processor, commonPaths []string)
- func WrapError(err error, op, message string) error
- func WrapPathError(err error, op, path, message string) error
- type AccessResult
- func (r AccessResult) AsBool() (bool, error)
- func (r AccessResult) AsFloat64() (float64, error)
- func (r AccessResult) AsInt() (int, error)
- func (r AccessResult) AsString() (string, error)
- func (r AccessResult) AsStringConverted() (string, error)
- func (r AccessResult) Ok() bool
- func (r AccessResult) Unwrap() any
- func (r AccessResult) UnwrapOr(defaultValue any) any
- type BatchIterator
- type BatchOperation
- type BatchResult
- type BulkProcessor
- type CachedPathParser
- type CheckResult
- type ChunkedReader
- type ChunkedWriter
- type Config
- func (c *Config) AddDangerousPattern(pattern DangerousPattern)
- func (c *Config) AddHook(hook Hook)
- func (c *Config) AddValidator(validator Validator)
- func (c *Config) Clone() *Config
- func (c *Config) GetCacheTTL() time.Duration
- func (c *Config) GetFloatPrecision() int
- func (c *Config) GetIndent() string
- func (c *Config) GetMaxCacheSize() int
- func (c *Config) GetMaxConcurrency() int
- func (c *Config) GetMaxDepth() int
- func (c *Config) GetMaxJSONSize() int64
- func (c *Config) GetMaxNestingDepth() int
- func (c *Config) GetMaxPathDepth() int
- func (c *Config) GetPrefix() string
- func (c *Config) GetSecurityLimits() map[string]any
- func (c *Config) IsCacheEnabled() bool
- func (c *Config) IsCommentsAllowed() bool
- func (c *Config) IsDisallowUnknownEnabled() bool
- func (c *Config) IsHTMLEscapeEnabled() bool
- func (c *Config) IsHealthCheckEnabled() bool
- func (c *Config) IsMetricsEnabled() bool
- func (c *Config) IsPrettyEnabled() bool
- func (c *Config) IsSortKeysEnabled() bool
- func (c *Config) IsStrictMode() bool
- func (c *Config) IsTruncateFloatEnabled() bool
- func (c *Config) ShouldCleanupNulls() bool
- func (c *Config) ShouldCompactArrays() bool
- func (c *Config) ShouldCreatePaths() bool
- func (c *Config) ShouldEscapeNewlines() bool
- func (c *Config) ShouldEscapeSlash() bool
- func (c *Config) ShouldEscapeTabs() bool
- func (c *Config) ShouldEscapeUnicode() bool
- func (c *Config) ShouldIncludeNulls() bool
- func (c *Config) ShouldPreserveNumbers() bool
- func (c *Config) ShouldValidateFilePath() bool
- func (c *Config) ShouldValidateInput() bool
- func (c *Config) ShouldValidateUTF8() bool
- func (c *Config) Validate() error
- func (c *Config) ValidateWithWarnings() []ConfigWarning
- type ConfigWarning
- type CustomEncoder
- type DangerousPattern
- type Decoder
- type Delim
- type Encoder
- type EncoderConfig
- type HealthStatus
- type Hook
- type HookChain
- type HookContext
- type HookFunc
- type InvalidUnmarshalError
- type IterableValue
- func (iv *IterableValue) Exists(key string) bool
- func (iv *IterableValue) ForeachNested(path string, fn func(key any, item *IterableValue))
- func (iv *IterableValue) Get(path string) any
- func (iv *IterableValue) GetArray(key string) []any
- func (iv *IterableValue) GetBool(key string) bool
- func (iv *IterableValue) GetBoolWithDefault(key string, defaultValue bool) bool
- func (iv *IterableValue) GetData() any
- func (iv *IterableValue) GetFloat64(key string) float64
- func (iv *IterableValue) GetFloat64WithDefault(key string, defaultValue float64) float64
- func (iv *IterableValue) GetInt(key string) int
- func (iv *IterableValue) GetIntWithDefault(key string, defaultValue int) int
- func (iv *IterableValue) GetObject(key string) map[string]any
- func (iv *IterableValue) GetString(key string) string
- func (iv *IterableValue) GetStringWithDefault(key string, defaultValue string) string
- func (iv *IterableValue) GetWithDefault(key string, defaultValue any) any
- func (iv *IterableValue) IsEmpty(key string) bool
- func (iv *IterableValue) IsEmptyData() bool
- func (iv *IterableValue) IsNull(key string) bool
- func (iv *IterableValue) IsNullData() bool
- func (iv *IterableValue) Release()
- type Iterator
- type IteratorControl
- type JSONLConfig
- type JSONLProcessor
- func (j *JSONLProcessor) Err() error
- func (j *JSONLProcessor) GetStats() JSONLStats
- func (j *JSONLProcessor) Release()
- func (j *JSONLProcessor) Stop()
- func (j *JSONLProcessor) StreamLines(fn func(lineNum int, data any) bool) error
- func (j *JSONLProcessor) StreamLinesParallel(fn func(lineNum int, data any) error, workers int) (err error)
- type JSONLStats
- type JSONLWriter
- type JsonsError
- type LargeFileConfig
- type LargeFileProcessor
- type LazyParser
- func (lp *LazyParser) Error() error
- func (lp *LazyParser) Get(path string) (any, error)
- func (lp *LazyParser) GetObject() (map[string]any, error)
- func (lp *LazyParser) GetValue() (any, error)
- func (lp *LazyParser) IsArray() bool
- func (lp *LazyParser) IsObject() bool
- func (lp *LazyParser) IsParsed() bool
- func (lp *LazyParser) Parse() (any, error)
- func (lp *LazyParser) Parsed() any
- func (lp *LazyParser) Raw() []byte
- type Marshaler
- type MarshalerError
- type MergeMode
- type NDJSONProcessor
- type Number
- type ParallelIterator
- func (it *ParallelIterator) Close()
- func (it *ParallelIterator) Filter(predicate func(int, any) bool) []any
- func (it *ParallelIterator) ForEach(fn func(int, any) error) error
- func (it *ParallelIterator) ForEachBatch(batchSize int, fn func(int, []any) error) error
- func (it *ParallelIterator) ForEachBatchWithContext(ctx context.Context, batchSize int, fn func(int, []any) error) error
- func (it *ParallelIterator) ForEachWithContext(ctx context.Context, fn func(int, any) error) error
- func (it *ParallelIterator) Map(transform func(int, any) (any, error)) ([]any, error)
- type ParsedJSON
- type PathParser
- type PathSecurityChecker
- type PathSegment
- func NewAppendSegment() PathSegment
- func NewArrayIndexSegment(index int) PathSegment
- func NewArraySliceSegment(start, end, step int, hasStart, hasEnd, hasStep bool) PathSegment
- func NewExtractSegment(key string, flat bool) PathSegment
- func NewPropertySegment(key string) PathSegment
- func NewRecursiveSegment() PathSegment
- func NewWildcardSegment() PathSegment
- type PathSegmentFlags
- type PathSegmentType
- type PathValidator
- type PatternChecker
- type PatternLevel
- type PatternRegistry
- type Processor
- func (p *Processor) AddHook(hook Hook)
- func (p *Processor) BatchDeleteOptimized(jsonStr string, paths []string) (string, error)
- func (p *Processor) BatchSetOptimized(jsonStr string, updates map[string]any) (string, error)
- func (p *Processor) ClearCache()
- func (p *Processor) Close() error
- func (p *Processor) Compact(jsonStr string, opts ...Config) (string, error)
- func (p *Processor) CompactBuffer(dst *bytes.Buffer, src []byte, opts ...Config) error
- func (p *Processor) CompactBytes(dst *bytes.Buffer, src []byte) error
- func (p *Processor) CompactString(jsonStr string, opts ...Config) (string, error)
- func (p *Processor) CompilePath(path string) (*internal.CompiledPath, error)
- func (p *Processor) Delete(jsonStr, path string, opts ...Config) (string, error)
- func (p *Processor) DeleteClean(jsonStr, path string, opts ...Config) (string, error)
- func (p *Processor) Encode(value any, config ...Config) (string, error)
- func (p *Processor) EncodeBatch(pairs map[string]any, cfg ...Config) (string, error)
- func (p *Processor) EncodeFields(value any, fields []string, cfg ...Config) (string, error)
- func (p *Processor) EncodePretty(value any, config ...Config) (string, error)
- func (p *Processor) EncodeStream(values any, cfg ...Config) (string, error)
- func (p *Processor) EncodeWithConfig(value any, cfg ...Config) (string, error)
- func (p *Processor) FastDelete(jsonStr, path string) (string, error)
- func (p *Processor) FastGetMultiple(jsonStr string, paths []string) (map[string]any, error)
- func (p *Processor) FastSet(jsonStr, path string, value any) (string, error)
- func (p *Processor) Foreach(jsonStr string, fn func(key any, item *IterableValue))
- func (p *Processor) ForeachNested(jsonStr string, fn func(key any, item *IterableValue))
- func (p *Processor) ForeachReturn(jsonStr string, fn func(key any, item *IterableValue)) (string, error)
- func (p *Processor) ForeachWithPath(jsonStr, path string, fn func(key any, item *IterableValue)) error
- func (p *Processor) ForeachWithPathAndControl(jsonStr, path string, fn func(key any, value any) IteratorControl) error
- func (p *Processor) ForeachWithPathAndIterator(jsonStr, path string, ...) error
- func (p *Processor) FormatCompact(jsonStr string, opts ...Config) (string, error)
- func (p *Processor) Get(jsonStr, path string, opts ...Config) (any, error)
- func (p *Processor) GetArray(jsonStr, path string, opts ...Config) ([]any, error)
- func (p *Processor) GetBool(jsonStr, path string, opts ...Config) (bool, error)
- func (p *Processor) GetBoolOr(jsonStr, path string, defaultValue bool, opts ...Config) bool
- func (p *Processor) GetCompiled(jsonStr string, cp *internal.CompiledPath) (any, error)
- func (p *Processor) GetCompiledExists(data any, cp *internal.CompiledPath) bool
- func (p *Processor) GetCompiledFromParsed(data any, cp *internal.CompiledPath) (any, error)
- func (p *Processor) GetConfig() Config
- func (p *Processor) GetFloat(jsonStr, path string, opts ...Config) (float64, error)
- func (p *Processor) GetFloatOr(jsonStr, path string, defaultValue float64, opts ...Config) float64
- func (p *Processor) GetFromParsed(parsed *ParsedJSON, path string, opts ...Config) (any, error)
- func (p *Processor) GetFromParsedData(data any, path string) (any, error)
- func (p *Processor) GetHealthStatus() HealthStatus
- func (p *Processor) GetInt(jsonStr, path string, opts ...Config) (int, error)
- func (p *Processor) GetIntOr(jsonStr, path string, defaultValue int, opts ...Config) int
- func (p *Processor) GetMultiple(jsonStr string, paths []string, opts ...Config) (map[string]any, error)
- func (p *Processor) GetObject(jsonStr, path string, opts ...Config) (map[string]any, error)
- func (p *Processor) GetStats() Stats
- func (p *Processor) GetString(jsonStr, path string, opts ...Config) (string, error)
- func (p *Processor) GetStringOr(jsonStr, path string, defaultValue string, opts ...Config) string
- func (p *Processor) HTMLEscapeBuffer(dst *bytes.Buffer, src []byte, opts ...Config)
- func (p *Processor) IndentBuffer(dst *bytes.Buffer, src []byte, prefix, indent string, opts ...Config) error
- func (p *Processor) IndentBytes(dst *bytes.Buffer, src []byte, prefix, indent string) error
- func (p *Processor) IsClosed() bool
- func (p *Processor) LoadFromFile(filePath string, opts ...Config) (string, error)
- func (p *Processor) LoadFromFileAsData(filePath string, opts ...Config) (any, error)
- func (p *Processor) LoadFromReader(reader io.Reader, opts ...Config) (string, error)
- func (p *Processor) LoadFromReaderAsData(reader io.Reader, opts ...Config) (any, error)
- func (p *Processor) Marshal(value any, opts ...Config) ([]byte, error)
- func (p *Processor) MarshalIndent(value any, prefix, indent string, cfg ...Config) ([]byte, error)
- func (p *Processor) MarshalToFile(path string, data any, cfg ...Config) error
- func (p *Processor) Parse(jsonStr string, target any, opts ...Config) error
- func (p *Processor) ParseAny(jsonStr string, opts ...Config) (any, error)
- func (p *Processor) PreParse(jsonStr string, opts ...Config) (*ParsedJSON, error)
- func (p *Processor) Prettify(jsonStr string, opts ...Config) (string, error)
- func (p *Processor) ProcessBatch(operations []BatchOperation, opts ...Config) ([]BatchResult, error)
- func (p *Processor) SafeGet(jsonStr, path string) AccessResult
- func (p *Processor) SaveToFile(filePath string, data any, cfg ...Config) error
- func (p *Processor) SaveToWriter(writer io.Writer, data any, cfg ...Config) error
- func (p *Processor) Set(jsonStr, path string, value any, opts ...Config) (string, error)
- func (p *Processor) SetCreate(jsonStr, path string, value any, opts ...Config) (string, error)
- func (p *Processor) SetFromParsed(parsed *ParsedJSON, path string, value any, opts ...Config) (*ParsedJSON, error)
- func (p *Processor) SetLogger(logger *slog.Logger)
- func (p *Processor) SetMultiple(jsonStr string, updates map[string]any, opts ...Config) (string, error)
- func (p *Processor) SetMultipleCreate(jsonStr string, updates map[string]any, opts ...Config) (string, error)
- func (p *Processor) ToJsonString(value any, cfg ...Config) (string, error)
- func (p *Processor) ToJsonStringPretty(value any, cfg ...Config) (string, error)
- func (p *Processor) ToJsonStringStandard(value any, cfg ...Config) (string, error)
- func (p *Processor) Unmarshal(data []byte, v any, opts ...Config) error
- func (p *Processor) UnmarshalFromFile(path string, v any, opts ...Config) error
- func (p *Processor) Valid(jsonStr string, opts ...Config) (bool, error)
- func (p *Processor) ValidBytes(data []byte) bool
- func (p *Processor) ValidateSchema(jsonStr string, schema *Schema, opts ...Config) ([]ValidationError, error)
- func (p *Processor) WarmupCache(jsonStr string, paths []string, opts ...Config) (*WarmupResult, error)
- type Result
- type SamplingReader
- type Schema
- type SchemaConfig
- type SecurityValidator
- type Stats
- type StreamIterator
- type StreamIteratorConfig
- type StreamObjectIterator
- type StreamingProcessor
- func (sp *StreamingProcessor) Close() error
- func (sp *StreamingProcessor) GetStats() StreamingStats
- func (sp *StreamingProcessor) StreamArray(fn func(index int, item any) bool) error
- func (sp *StreamingProcessor) StreamArrayChunked(chunkSize int, fn func([]any) error) error
- func (sp *StreamingProcessor) StreamObject(fn func(key string, value any) bool) error
- func (sp *StreamingProcessor) StreamObjectChunked(chunkSize int, fn func(map[string]any) error) error
- type StreamingStats
- type StructureValidator
- type SyntaxError
- type TextMarshaler
- type TextUnmarshaler
- type Token
- type TypeEncoder
- type UnmarshalTypeError
- type Unmarshaler
- type UnsupportedTypeError
- type UnsupportedValueError
- type ValidationChain
- type ValidationError
- type Validator
- type WarmupResult
Constants ¶
const (
	// Operation Limits - Secure defaults with reasonable headroom
	DefaultMaxJSONSize     = 100 * 1024 * 1024 // 100MB
	DefaultMaxNestingDepth = 200
	DefaultMaxPathDepth    = 50
	DefaultMaxConcurrency  = 50

	// Internal operation limits
	DefaultMaxSecuritySize   = 10 * 1024 * 1024
	DefaultMaxObjectKeys     = 100000
	DefaultMaxArrayElements  = 100000
	DefaultMaxBatchSize      = 2000
	DefaultParallelThreshold = 10

	// LargeStringHashThreshold is the byte threshold for using sampling-based hash.
	// Re-exported from internal package for public API access.
	LargeStringHashThreshold = internal.LargeStringHashThreshold

	// Path Validation - Secure but flexible
	// MaxPathLength is the maximum allowed path length for security.
	// Re-exported from internal package for public API access.
	MaxPathLength = internal.MaxPathLength

	// Cache TTL
	DefaultCacheTTL = 5 * time.Minute

	// Cache key constants
	// MaxCacheKeyLength is the maximum allowed cache key length.
	// Re-exported from internal package for public API access.
	MaxCacheKeyLength = internal.MaxCacheKeyLength
)
Configuration constants with optimized defaults for production workloads.
const (
	// MergeUnion performs union merge - combines all keys/elements (default)
	// For objects: all keys from both, conflicts resolved by override value
	// For arrays: all elements from both with deduplication
	MergeUnion = internal.MergeUnion

	// MergeIntersection performs intersection merge - only common keys/elements
	// For objects: only keys present in both, values from override
	// For arrays: only elements present in both arrays
	MergeIntersection = internal.MergeIntersection

	// MergeDifference performs difference merge - keys/elements only in base
	// For objects: keys in base but not in override
	// For arrays: elements in base but not in override
	MergeDifference = internal.MergeDifference
)
Merge mode constants - re-exported from internal package for public API
const InvalidArrayIndex = internal.ArrayIndexInvalid
InvalidArrayIndex is a sentinel value indicating an invalid or out-of-bounds array index. Returned by array parsing functions when the index cannot be determined (e.g., invalid format, overflow, or empty string).
index := processor.ParseArrayIndex(str)
if index == InvalidArrayIndex {
// Handle invalid index
}
Variables ¶
var (
	ErrInvalidJSON     = errors.New("invalid JSON format")
	ErrPathNotFound    = errors.New("path not found")
	ErrTypeMismatch    = errors.New("type mismatch")
	ErrOperationFailed = errors.New("operation failed")
	ErrInvalidPath     = errors.New("invalid path format")
	ErrProcessorClosed = errors.New("processor is closed")
	ErrInternalError   = errors.New("internal error")

	// Limit-related errors.
	ErrSizeLimit        = errors.New("size limit exceeded")
	ErrDepthLimit       = errors.New("depth limit exceeded")
	ErrConcurrencyLimit = errors.New("concurrency limit exceeded")

	// Security and validation errors.
	ErrSecurityViolation = errors.New("security violation detected")
	ErrUnsupportedPath   = errors.New("unsupported path operation")

	// Resource and performance errors.
	ErrCacheFull         = errors.New("cache is full")
	ErrCacheDisabled     = errors.New("cache is disabled")
	ErrOperationTimeout  = errors.New("operation timeout")
	ErrResourceExhausted = errors.New("system resources exhausted")
)
Primary errors for common cases.
Functions ¶
func As ¶ added in v1.3.0
func As[T any](r AccessResult) (T, error)
As safely converts the result to type T. Returns error if the type doesn't match.
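The error-returning conversion that `As` describes typically wraps Go's comma-ok type assertion in a generic function. The sketch below (`asType` is a hypothetical helper, not this package's implementation) shows the pattern:

```go
package main

import "fmt"

// asType converts an untyped value to T using the comma-ok type assertion,
// returning an error on mismatch instead of panicking.
func asType[T any](v any) (T, error) {
	t, ok := v.(T)
	if !ok {
		var zero T
		return zero, fmt.Errorf("value %v is not a %T", v, zero)
	}
	return t, nil
}

func main() {
	var v any = "hello"
	s, err := asType[string](v)
	fmt.Println(s, err) // hello <nil>
	_, err = asType[int](v)
	fmt.Println(err != nil) // true
}
```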
func ClampIndex ¶
ClampIndex clamps an index to valid bounds for an array
func ClearCache ¶ added in v1.1.0
func ClearCache()
ClearCache clears the processor's internal cache.
func ClearDangerousPatterns ¶ added in v1.3.0
func ClearDangerousPatterns()
ClearDangerousPatterns removes all custom patterns from the global registry. Use with caution - this does not affect built-in patterns.
func ClearKeyInternCache ¶ added in v1.2.0
func ClearKeyInternCache()
ClearKeyInternCache clears the global key interning cache.
func Compact ¶
Compact appends to dst the JSON-encoded src with insignificant space characters elided. This function is 100% compatible with encoding/json.Compact.
func CompactBuffer ¶ added in v1.1.0
CompactBuffer is an alias for Compact for buffer operations
func CompactString ¶ added in v1.2.2
CompactString removes whitespace from JSON string. This is the recommended function for compacting JSON strings.
func CompareJSON ¶ added in v1.3.0
CompareJSON compares two JSON strings for equality by parsing and normalizing them. This function handles numeric precision differences and key ordering.
Example:
equal, err := json.CompareJSON(`{"a":1}`, `{"a":1.0}`)
// equal == true
func ConvertToBool ¶ added in v1.0.4
ConvertToBool converts any value to bool. String conversion supports both standard formats and user-friendly formats:
Standard: "1", "t", "T", "TRUE", "true", "True", "0", "f", "F", "FALSE", "false", "False"
Extended: "yes", "on" -> true; "no", "off", "" -> false
Delegates to the internal core function to reduce code duplication.
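The documented string handling can be modeled with the standard library: strconv.ParseBool already accepts the standard formats, with the extended forms layered on top. This mirrors the description above, not the package's actual implementation:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// toBool sketches the conversion: standard strconv.ParseBool formats first,
// then the extended forms "yes"/"on" -> true and "no"/"off"/"" -> false.
func toBool(s string) (value, ok bool) {
	if b, err := strconv.ParseBool(s); err == nil {
		return b, true
	}
	switch strings.ToLower(s) {
	case "yes", "on":
		return true, true
	case "no", "off", "":
		return false, true
	}
	return false, false
}

func main() {
	fmt.Println(toBool("TRUE"))  // true true
	fmt.Println(toBool("yes"))   // true true
	fmt.Println(toBool(""))      // false true
	fmt.Println(toBool("maybe")) // false false
}
```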
func ConvertToFloat64 ¶ added in v1.0.4
ConvertToFloat64 converts any value to float64. Delegates to internal core functions to reduce code duplication. MAINTENANCE: Keep type switch cases in sync with ConvertToInt, ConvertToInt64, ConvertToUint64.
func ConvertToInt ¶ added in v1.0.4
ConvertToInt converts any value to int with comprehensive type support. Delegates to internal core function to reduce code duplication. MAINTENANCE: Keep type switch cases in sync with ConvertToInt64, ConvertToUint64, ConvertToFloat64.
func ConvertToInt64 ¶ added in v1.0.4
ConvertToInt64 converts any value to int64. Delegates to internal core function to reduce code duplication. MAINTENANCE: Keep type switch cases in sync with ConvertToInt, ConvertToUint64, ConvertToFloat64.
func ConvertToString ¶ added in v1.0.4
ConvertToString converts any value to string (for backward compatibility)
func ConvertToUint64 ¶ added in v1.0.4
ConvertToUint64 converts any value to uint64. Delegates to internal core function to reduce code duplication. MAINTENANCE: Keep type switch cases in sync with ConvertToInt, ConvertToInt64, ConvertToFloat64.
func CreateEmptyContainer ¶
CreateEmptyContainer creates an empty container of the specified type
func DeepCopy ¶
DeepCopy creates a deep copy of JSON-compatible data. Uses direct recursive copying for better performance (avoids marshal/unmarshal overhead). SECURITY: a depth limit prevents stack overflow.
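The described approach, recursive copying of JSON-compatible values with a depth guard, can be sketched as follows. This is illustrative only; the package's DeepCopy may differ in detail:

```go
package main

import (
	"errors"
	"fmt"
)

// deepCopy recursively copies maps and slices; scalars (string, float64,
// bool, nil) are immutable in Go's JSON representation and returned as-is.
// The depth parameter bounds recursion to guard against stack overflow.
func deepCopy(v any, depth int) (any, error) {
	if depth <= 0 {
		return nil, errors.New("max depth exceeded")
	}
	switch t := v.(type) {
	case map[string]any:
		out := make(map[string]any, len(t))
		for k, elem := range t {
			c, err := deepCopy(elem, depth-1)
			if err != nil {
				return nil, err
			}
			out[k] = c
		}
		return out, nil
	case []any:
		out := make([]any, len(t))
		for i, elem := range t {
			c, err := deepCopy(elem, depth-1)
			if err != nil {
				return nil, err
			}
			out[i] = c
		}
		return out, nil
	default:
		return v, nil
	}
}

func main() {
	src := map[string]any{"user": map[string]any{"name": "John"}}
	dst, err := deepCopy(src, 100)
	if err != nil {
		panic(err)
	}
	// Mutating the copy leaves the original untouched.
	dst.(map[string]any)["user"].(map[string]any)["name"] = "Jane"
	fmt.Println(src["user"].(map[string]any)["name"]) // John
}
```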
func Encode ¶
Encode converts any Go value to JSON string. For configuration options, use EncodeWithConfig.
func EncodeBatch ¶ added in v1.1.0
EncodeBatch encodes multiple key-value pairs as a JSON object. This is the unified API that replaces EncodeBatchWithOpts.
Example:
result, err := json.EncodeBatch(map[string]any{"name": "Alice", "age": 30})
func EncodeFields ¶ added in v1.1.0
EncodeFields encodes specific fields from a struct or map. This is the unified API that replaces EncodeFieldsWithOpts.
Example:
result, err := json.EncodeFields(user, []string{"name", "email"})
func EncodeStream ¶ added in v1.1.0
EncodeStream encodes multiple values as a JSON array. This is the unified API that replaces EncodeStreamWithOpts.
Example:
result, err := json.EncodeStream([]any{1, 2, 3}, json.PrettyConfig())
func EncodeWithConfig ¶ added in v1.3.0
EncodeWithConfig converts any Go value to JSON string using the unified Config. This is the recommended way to encode JSON with configuration.
Example:
// Default configuration
result, err := json.EncodeWithConfig(data)

// Pretty output
result, err := json.EncodeWithConfig(data, json.PrettyConfig())

// Security-focused output
result, err := json.EncodeWithConfig(data, json.SecurityConfig())

// Custom configuration
cfg := json.DefaultConfig()
cfg.Pretty = true
cfg.SortKeys = true
result, err := json.EncodeWithConfig(data, cfg)
func EscapeJSONPointer ¶
EscapeJSONPointer escapes special characters for JSON Pointer
func Foreach ¶
func Foreach(jsonStr string, fn func(key any, item *IterableValue))
Foreach iterates over JSON arrays or objects with a simplified signature (for test compatibility)
func ForeachNested ¶
func ForeachNested(jsonStr string, fn func(key any, item *IterableValue))
ForeachNested iterates over nested JSON structures
func ForeachReturn ¶
func ForeachReturn(jsonStr string, fn func(key any, item *IterableValue)) (string, error)
ForeachReturn is a variant that returns an error (for compatibility with test expectations)
func ForeachWithPath ¶
func ForeachWithPath(jsonStr, path string, fn func(key any, item *IterableValue)) error
ForeachWithPath iterates over JSON arrays or objects with a simplified signature (for test compatibility)
func ForeachWithPathAndControl ¶ added in v1.0.8
func ForeachWithPathAndControl(jsonStr, path string, fn func(key any, value any) IteratorControl) error
ForeachWithPathAndControl iterates over JSON arrays or objects and applies a function. This is the 3-parameter version used by most code.
func FormatNumber ¶
FormatNumber formats a number value as a string
func Get ¶
Get retrieves a value from JSON at the specified path. Returns the value as any and requires type assertion.
func GetBoolOr ¶ added in v1.3.0
GetBoolOr retrieves a bool value from JSON at the specified path with a default fallback. Returns defaultValue if: path not found, value is null, or type conversion fails.
func GetContainerSize ¶
GetContainerSize returns the size of a container
func GetErrorSuggestion ¶ added in v1.0.10
GetErrorSuggestion provides suggestions for common errors
func GetFloatOr ¶ added in v1.3.0
GetFloatOr retrieves a float64 value from JSON at the specified path with a default fallback. Returns defaultValue if: path not found, value is null, or type conversion fails.
func GetIntOr ¶ added in v1.3.0
GetIntOr retrieves an int value from JSON at the specified path with a default fallback. Returns defaultValue if: path not found, value is null, or type conversion fails.
func GetKeyInternCacheSize ¶ added in v1.3.0
func GetKeyInternCacheSize() int
GetKeyInternCacheSize returns the number of interned keys in the cache.
func GetMultiple ¶
GetMultiple retrieves multiple values from JSON at the specified paths
func GetResultBuffer ¶ added in v1.2.0
func GetResultBuffer() *[]byte
GetResultBuffer gets a buffer for result marshaling
func GetStringOr ¶ added in v1.3.0
GetStringOr retrieves a string value from JSON at the specified path with a default fallback. Returns defaultValue if: path not found, value is null, or type conversion fails.
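The fallback semantics of the Get*Or family can be illustrated with a standard-library sketch. The helper below is a hypothetical emulation using only encoding/json, not this package's implementation; it mirrors the documented behavior of GetStringOr for a dot-separated path: the default is returned when the path is missing, the value is null, or the value is not a string.

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// getStringOr emulates the documented GetStringOr fallback behavior:
// it returns defaultValue when the path is missing, the value is null,
// or the value is not a string.
func getStringOr(jsonStr, path, defaultValue string) string {
	var root any
	if err := json.Unmarshal([]byte(jsonStr), &root); err != nil {
		return defaultValue
	}
	cur := root
	for _, key := range strings.Split(path, ".") {
		obj, ok := cur.(map[string]any)
		if !ok {
			return defaultValue
		}
		cur, ok = obj[key]
		if !ok {
			return defaultValue
		}
	}
	s, ok := cur.(string)
	if !ok {
		return defaultValue // covers null and non-string values
	}
	return s
}

func main() {
	data := `{"user":{"name":"Alice","age":30}}`
	fmt.Println(getStringOr(data, "user.name", "unknown"))  // Alice
	fmt.Println(getStringOr(data, "user.email", "unknown")) // unknown
}
```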
func GetTyped ¶
GetTyped retrieves a typed value from JSON at the specified path. This is the generic typed getter - use this for custom types.
Example:
type User struct { Name string }
user, err := json.GetTyped[User](data, "user")
func GetTypedOr ¶ added in v1.3.0
GetTypedOr retrieves a typed value from JSON at the specified path with a default fallback. Returns defaultValue if: path not found, value is null, or type conversion fails. This is the recommended generic function for getting values with defaults.
Example:
name := json.GetTypedOr[string](data, "user.name", "unknown")
age := json.GetTypedOr[int](data, "user.age", 0)
func HTMLEscape ¶
HTMLEscape appends to dst the JSON-encoded src with <, >, &, U+2028, and U+2029 characters escaped. This function is 100% compatible with encoding/json.HTMLEscape.
func HTMLEscapeBuffer ¶ added in v1.1.0
HTMLEscapeBuffer is an alias for HTMLEscape for buffer operations
func Indent ¶
Indent appends to dst an indented form of the JSON-encoded src. This function is 100% compatible with encoding/json.Indent.
func IndentBuffer ¶ added in v1.1.0
IndentBuffer appends to dst an indented form of the JSON-encoded src.
func InternKey ¶ added in v1.2.0
InternKey interns a string key for memory efficiency. Returns an interned version of the key that can be reused across operations.
Example:
key := json.InternKey("user_id") // Returns interned string
func IsContainer ¶
IsContainer checks if the data is a container type (map or slice)
func IsDeletedMarker ¶ added in v1.3.0
IsDeletedMarker checks if a value is the deleted marker sentinel. This is the recommended way to check for deleted markers instead of direct comparison.
func IsEmptyOrZero ¶ added in v1.3.0
IsEmptyOrZero checks if a value is empty or its zero value. Supports all standard numeric types, bool, string, slices, maps, and json.Number. For slices and maps, returns true if nil or empty (len == 0).
Example:
if json.IsEmptyOrZero(value) {
// Handle empty or zero value
}
func IsRetryable ¶ added in v1.0.10
IsRetryable determines if an error is retryable
func IsSecurityRelated ¶ added in v1.0.10
IsSecurityRelated determines if an error is security-related
func IsUserError ¶ added in v1.0.10
IsUserError determines if an error is caused by user input
func IsValidJSON ¶ added in v1.0.10
IsValidJSON quickly checks if a string is valid JSON
func IsValidPath ¶
IsValidPath checks if a path expression is valid
func LoadFromFile ¶
LoadFromFile loads JSON data from a file with optional configuration. It uses the default processor and supports Config options such as security validation.
func Marshal ¶
Marshal returns the JSON encoding of v. This function is 100% compatible with encoding/json.Marshal. For configuration options, use EncodeWithConfig or Processor.Marshal with cfg parameter.
func MarshalIndent ¶
MarshalIndent is like Marshal but applies indentation to format the output. This function is 100% compatible with encoding/json.MarshalIndent. For configuration options, use EncodeWithConfig or Processor.MarshalIndent with cfg parameter.
func MarshalToFile ¶ added in v1.0.6
MarshalToFile marshals data to JSON and writes to a file. This is the unified API that replaces MarshalToFileWithOpts.
Example:
err := json.MarshalToFile("data.json", myStruct, json.PrettyConfig())
func MergeJSON ¶ added in v1.3.0
MergeJSON merges two JSON objects using deep merge strategy. For nested objects, it recursively merges keys according to the specified mode. For primitive values and arrays, the value from json2 takes precedence.
Supported modes (optional, defaults to MergeUnion):
- MergeUnion: combines all keys from both objects (default)
- MergeIntersection: only keys present in both objects
- MergeDifference: keys in json1 but not in json2
Example:
// Union merge (default)
result, err := json.MergeJSON(a, b)

// Intersection merge
result, err := json.MergeJSON(a, b, json.MergeIntersection)

// Difference merge
result, err := json.MergeJSON(a, b, json.MergeDifference)
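The default union mode described above corresponds to a recursive deep merge. The sketch below emulates that strategy with the standard library only; it is an illustration of union-merge semantics, not this package's implementation:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// deepMerge performs a union merge: keys from both maps are kept,
// nested objects are merged recursively, and for conflicting
// primitives or arrays the second value wins.
func deepMerge(a, b map[string]any) map[string]any {
	out := make(map[string]any, len(a)+len(b))
	for k, v := range a {
		out[k] = v
	}
	for k, v := range b {
		if bv, ok := v.(map[string]any); ok {
			if av, ok := out[k].(map[string]any); ok {
				out[k] = deepMerge(av, bv)
				continue
			}
		}
		out[k] = v
	}
	return out
}

func main() {
	var a, b map[string]any
	json.Unmarshal([]byte(`{"user":{"name":"Alice"},"v":1}`), &a)
	json.Unmarshal([]byte(`{"user":{"age":30},"v":2}`), &b)
	merged, _ := json.Marshal(deepMerge(a, b))
	fmt.Println(string(merged)) // {"user":{"age":30,"name":"Alice"},"v":2}
}
```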
func MergeJSONMany ¶ added in v1.3.0
MergeJSONMany merges multiple JSON objects with specified merge mode. Returns error if less than 2 JSON strings are provided.
Example:
result, err := json.MergeJSONMany(json.MergeUnion, config1, config2, config3)
func NewProcessorUtils ¶
func NewProcessorUtils() *processorUtils
NewProcessorUtils creates a new processor utils instance
func Parse ¶ added in v1.3.0
Parse parses a JSON string and returns the root value. This is the recommended entry point for parsing JSON strings.
Layer Architecture:
- Package-level (this function): Convenience wrapper that uses cached processors
- Processor-level: Use Processor.ParseAny() for the same behavior, or Processor.Parse() for unmarshaling into a target
Example:
// Simple parsing (uses the default processor)
data, err := json.Parse(jsonStr)

// With configuration (uses a config-cached processor)
cfg := json.SecurityConfig()
data, err := json.Parse(jsonStr, cfg)
Performance Tips:
- For repeated operations on the same JSON, use Processor.PreParse() to parse once
- For batch operations, use Processor.ProcessBatch()
Note: Get(jsonStr, "$") is equivalent but slightly less efficient due to path parsing overhead.
func ParseFloat ¶
ParseFloat parses a string to float64 with error handling
func ParseJSONL ¶ added in v1.2.0
ParseJSONL parses JSONL data from a byte slice
func ParseJSONLInto ¶ added in v1.2.0
ParseJSONLInto parses JSONL data into typed values
func Prettify ¶ added in v1.3.0
Prettify formats JSON string with pretty indentation. This is the recommended function for formatting JSON strings.
Example:
pretty, err := json.Prettify(`{"name":"Alice","age":30}`)
// Output:
// {
// "name": "Alice",
// "age": 30
// }
func Print ¶ added in v1.1.0
func Print(data any)
Print prints any Go value as JSON to stdout in compact format. Note: Writes errors to stderr. Use PrintE for error handling.
func PrintE ¶ added in v1.2.0
PrintE prints any Go value as JSON to stdout in compact format. Returns an error instead of writing to stderr, allowing callers to handle errors.
func PrintPretty ¶ added in v1.1.0
func PrintPretty(data any)
PrintPretty prints any Go value as formatted JSON to stdout. Note: Writes errors to stderr. Use PrintPrettyE for error handling.
func PrintPrettyE ¶ added in v1.2.0
PrintPrettyE prints any Go value as formatted JSON to stdout. Returns an error instead of writing to stderr, allowing callers to handle errors.
func PutResultBuffer ¶ added in v1.2.0
func PutResultBuffer(buf *[]byte)
PutResultBuffer returns a buffer to the pool
func RegisterDangerousPattern ¶ added in v1.3.0
func RegisterDangerousPattern(pattern DangerousPattern)
RegisterDangerousPattern adds a pattern to the global registry. Patterns registered here are checked in addition to default patterns.
Example:
json.RegisterDangerousPattern(json.DangerousPattern{
Pattern: "malicious_keyword",
Name: "Custom dangerous pattern",
Level: json.PatternLevelCritical,
})
func SafeConvertToInt64 ¶
SafeConvertToInt64 safely converts any value to int64 with error handling
func SafeConvertToUint64 ¶
SafeConvertToUint64 safely converts any value to uint64 with error handling
func SanitizeKey ¶
SanitizeKey sanitizes a key for safe use in maps
func SaveToFile ¶
SaveToFile saves JSON data to a file with optional configuration. This is the unified API that replaces SaveToFileWithOpts.
Example:
// Simple save
err := json.SaveToFile("data.json", data)
// With pretty printing
cfg := json.PrettyConfig()
err := json.SaveToFile("data.json", data, cfg)
func SaveToWriter ¶ added in v1.3.0
SaveToWriter writes JSON data to an io.Writer. This is the unified API that replaces SaveToWriterWithOpts.
Example:
var buf bytes.Buffer
err := json.SaveToWriter(&buf, data, json.PrettyConfig())
func Set ¶
Set sets a value in JSON at the specified path.
Returns:
- On success: modified JSON string and nil error
- On failure: original unmodified JSON string and error information
func SetGlobalProcessor ¶
func SetGlobalProcessor(processor *Processor)
SetGlobalProcessor sets a custom global processor (thread-safe)
func SetMultiple ¶
SetMultiple sets multiple values using a map of path-value pairs
func ShutdownGlobalProcessor ¶
func ShutdownGlobalProcessor()
ShutdownGlobalProcessor shuts down the global processor
func StreamArrayCount ¶ added in v1.2.0
StreamArrayCount counts elements without storing them. Memory-efficient when only the array length is needed.
func StreamArrayFilter ¶ added in v1.2.0
StreamArrayFilter filters array elements during streaming. Only elements that pass the predicate are kept.
func StreamArrayFirst ¶ added in v1.2.0
StreamArrayFirst returns the first element that matches a predicate. Stops processing as soon as a match is found.
func StreamArrayForEach ¶ added in v1.2.0
StreamArrayForEach processes each element without collecting results. Useful for side effects such as writing to a database or file.
func StreamArrayMap ¶ added in v1.2.0
StreamArrayMap transforms array elements during streaming. Each element is transformed using the provided function.
func StreamArrayReduce ¶ added in v1.2.0
StreamArrayReduce reduces array elements to a single value during streaming. The reducer function receives the accumulated value and the current element.
func StreamArraySkip ¶ added in v1.2.0
StreamArraySkip skips the first n elements and returns the rest. Useful for pagination.
func StreamArrayTake ¶ added in v1.2.0
StreamArrayTake returns the first n elements from a streaming array. Useful for pagination or sampling.
func StreamLinesInto ¶ added in v1.2.0
StreamLinesInto processes JSONL data into a slice of typed values. Uses generics for type-safe processing.
func StreamLinesIntoWithConfig ¶ added in v1.2.0
func StreamLinesIntoWithConfig[T any](reader io.Reader, config JSONLConfig, fn func(lineNum int, data T) error) ([]T, error)
StreamLinesIntoWithConfig processes JSONL data into a slice of typed values with config
func ToJSONLString ¶ added in v1.2.0
ToJSONLString converts a slice of values to JSONL format string
func UnescapeJSONPointer ¶
UnescapeJSONPointer unescapes JSON Pointer special characters
func Unmarshal ¶
Unmarshal parses the JSON-encoded data and stores the result in v. This function is 100% compatible with encoding/json.Unmarshal. For configuration options, use Processor.Unmarshal with cfg parameter.
func UnmarshalFromFile ¶ added in v1.0.6
UnmarshalFromFile reads JSON from a file and unmarshals it into v. This is a convenience function that combines file reading and unmarshaling. Uses the default processor for security validation and decoding.
Parameters:
- path: file path to read JSON from
- v: pointer to the target variable where JSON will be unmarshaled
- cfg: optional Config for security validation and processing
Returns error if file reading fails or JSON cannot be unmarshaled.
func UnregisterDangerousPattern ¶ added in v1.3.0
func UnregisterDangerousPattern(pattern string)
UnregisterDangerousPattern removes a pattern from the global registry.
func Valid ¶
Valid reports whether data is valid JSON. This function is 100% compatible with encoding/json.Valid.
func ValidString ¶ added in v1.2.2
ValidString reports whether the JSON string is valid. This is a convenience wrapper for Valid that accepts a string directly.
func ValidWithOptions ¶ added in v1.2.2
ValidWithOptions reports whether the JSON string is valid with optional configuration. Returns both the validation result and any error that occurred during validation.
func ValidatePath ¶
ValidatePath validates a path expression and returns detailed error information
func WarmupPathCache ¶ added in v1.2.0
func WarmupPathCache(commonPaths []string)
WarmupPathCache pre-populates the path cache with common paths. Delegates to internal.GlobalPathIntern for storage.
func WarmupPathCacheWithProcessor ¶ added in v1.2.0
WarmupPathCacheWithProcessor pre-populates the path cache using a specific processor. Delegates to internal.GlobalPathIntern for storage.
func WrapPathError ¶ added in v1.0.6
WrapPathError wraps an error with path context
Types ¶
type AccessResult ¶ added in v1.3.0
type AccessResult struct {
Value any // The result value (exported for backward compatibility)
Exists bool // Whether the path exists
Type string // Runtime type info (for debugging)
}
AccessResult represents the result of a dynamic access operation. It extends Result[any] with type conversion methods for safe type handling.
Example:
result := processor.SafeGet(data, "user.age")
age, err := result.AsInt()
name, err := result.AsString()
func NewAccessResult ¶ added in v1.3.0
func NewAccessResult(value any, exists bool) AccessResult
NewAccessResult creates a new AccessResult.
func (AccessResult) AsBool ¶ added in v1.3.0
func (r AccessResult) AsBool() (bool, error)
AsBool safely converts the result to bool. Unlike ConvertToBool, this method is stricter and only accepts bool and string types. Use ConvertToBool directly if you need more permissive conversion (e.g., int to bool).
func (AccessResult) AsFloat64 ¶ added in v1.3.0
func (r AccessResult) AsFloat64() (float64, error)
AsFloat64 safely converts the result to float64 with precision checks. Unlike ConvertToFloat64, this method is stricter and does NOT convert bool to float64. Use ConvertToFloat64 directly if you need more permissive conversion.
func (AccessResult) AsInt ¶ added in v1.3.0
func (r AccessResult) AsInt() (int, error)
AsInt safely converts the result to int with overflow and precision checks. Unlike ConvertToInt, this method is stricter and does NOT convert bool to int. Use ConvertToInt directly if you need more permissive conversion.
func (AccessResult) AsString ¶ added in v1.3.0
func (r AccessResult) AsString() (string, error)
AsString safely converts the result to string. Returns ErrTypeMismatch if the value is not a string type. Use AsStringConverted() for explicit type conversion with formatting.
func (AccessResult) AsStringConverted ¶ added in v1.3.0
func (r AccessResult) AsStringConverted() (string, error)
AsStringConverted converts the result to string using fmt.Sprintf formatting. Use this when you explicitly want string representation of any type. For strict type checking, use AsString() instead.
func (AccessResult) Ok ¶ added in v1.3.0
func (r AccessResult) Ok() bool
Ok returns true if the value exists.
func (AccessResult) Unwrap ¶ added in v1.3.0
func (r AccessResult) Unwrap() any
Unwrap returns the value or nil if it doesn't exist.
func (AccessResult) UnwrapOr ¶ added in v1.3.0
func (r AccessResult) UnwrapOr(defaultValue any) any
UnwrapOr returns the value or the provided default if it doesn't exist.
type BatchIterator ¶ added in v1.2.0
type BatchIterator struct {
// contains filtered or unexported fields
}
BatchIterator processes arrays in batches for efficient bulk operations
func NewBatchIterator ¶ added in v1.2.0
func NewBatchIterator(data []any, batchSize int) *BatchIterator
NewBatchIterator creates a new batch iterator
func (*BatchIterator) CurrentIndex ¶ added in v1.2.0
func (it *BatchIterator) CurrentIndex() int
CurrentIndex returns the current position in the array
func (*BatchIterator) HasNext ¶ added in v1.2.0
func (it *BatchIterator) HasNext() bool
HasNext returns true if there are more batches to process
func (*BatchIterator) NextBatch ¶ added in v1.2.0
func (it *BatchIterator) NextBatch() []any
NextBatch returns the next batch of elements. Returns nil when no more batches are available.
func (*BatchIterator) Remaining ¶ added in v1.2.0
func (it *BatchIterator) Remaining() int
Remaining returns the number of remaining elements
func (*BatchIterator) Reset ¶ added in v1.2.0
func (it *BatchIterator) Reset()
Reset resets the iterator to the beginning
func (*BatchIterator) TotalBatches ¶ added in v1.2.0
func (it *BatchIterator) TotalBatches() int
TotalBatches returns the total number of batches
type BatchOperation ¶
type BatchOperation struct {
Type string `json:"type"`
JSONStr string `json:"json_str"`
Path string `json:"path"`
Value any `json:"value"`
ID string `json:"id"`
}
BatchOperation represents a single operation in a batch
type BatchResult ¶
type BatchResult struct {
ID string `json:"id"`
Result any `json:"result"`
Error error `json:"error"`
}
BatchResult represents the result of a batch operation
func ProcessBatch ¶ added in v1.1.0
func ProcessBatch(operations []BatchOperation, cfg ...Config) ([]BatchResult, error)
ProcessBatch processes multiple JSON operations in a single batch. This is more efficient than processing each operation individually.
type BulkProcessor ¶ added in v1.2.0
type BulkProcessor struct {
// contains filtered or unexported fields
}
BulkProcessor handles multiple operations efficiently
func NewBulkProcessor ¶ added in v1.2.0
func NewBulkProcessor(processor *Processor, batchSize int) *BulkProcessor
NewBulkProcessor creates a bulk processor
type CachedPathParser ¶ added in v1.3.0
type CachedPathParser interface {
PathParser
// ParsePathCached parses with caching for repeated paths.
ParsePathCached(path string) ([]PathSegment, error)
// ClearPathCache clears the path segment cache.
ClearPathCache()
}
CachedPathParser provides path parsing with caching.
type CheckResult ¶
CheckResult represents the result of a single health check
type ChunkedReader ¶ added in v1.2.0
type ChunkedReader struct {
// contains filtered or unexported fields
}
ChunkedReader reads JSON in chunks for memory efficiency
func NewChunkedReader ¶ added in v1.2.0
func NewChunkedReader(reader io.Reader, chunkSize int) *ChunkedReader
NewChunkedReader creates a new chunked reader
func (*ChunkedReader) ReadArray ¶ added in v1.2.0
func (cr *ChunkedReader) ReadArray(fn func(item any) bool) error
ReadArray reads array elements one at a time
func (*ChunkedReader) ReadObject ¶ added in v1.2.0
func (cr *ChunkedReader) ReadObject(fn func(key string, value any) bool) error
ReadObject reads object key-value pairs one at a time
type ChunkedWriter ¶ added in v1.2.0
type ChunkedWriter struct {
// contains filtered or unexported fields
}
ChunkedWriter writes JSON in chunks for memory efficiency
func NewChunkedWriter ¶ added in v1.2.0
func NewChunkedWriter(writer io.Writer, chunkSize int, isArray bool) *ChunkedWriter
NewChunkedWriter creates a new chunked writer
func (*ChunkedWriter) Count ¶ added in v1.2.0
func (cw *ChunkedWriter) Count() int
Count returns the number of items written
func (*ChunkedWriter) Flush ¶ added in v1.2.0
func (cw *ChunkedWriter) Flush(final bool) error
Flush writes the buffer to the underlying writer
func (*ChunkedWriter) WriteItem ¶ added in v1.2.0
func (cw *ChunkedWriter) WriteItem(item any) error
WriteItem writes a single item to the chunk
func (*ChunkedWriter) WriteKeyValue ¶ added in v1.2.0
func (cw *ChunkedWriter) WriteKeyValue(key string, value any) error
WriteKeyValue writes a key-value pair to the chunk
type Config ¶
type Config struct {
// ===== Cache Settings =====
MaxCacheSize int `json:"max_cache_size"`
CacheTTL time.Duration `json:"cache_ttl"`
EnableCache bool `json:"enable_cache"`
CacheResults bool `json:"cache_results"` // Per-operation caching
// ===== Size Limits =====
MaxJSONSize int64 `json:"max_json_size"`
MaxPathDepth int `json:"max_path_depth"`
MaxBatchSize int `json:"max_batch_size"`
// ===== Security Limits =====
MaxNestingDepthSecurity int `json:"max_nesting_depth"`
MaxSecurityValidationSize int64 `json:"max_security_validation_size"`
MaxObjectKeys int `json:"max_object_keys"`
MaxArrayElements int `json:"max_array_elements"`
// FullSecurityScan enables full (non-sampling) security validation for all JSON input.
//
// When false (default): Large JSON (>4KB) uses optimized sampling with:
// - 16KB beginning section scan
// - 8KB end section scan
// - 15-30 distributed middle samples with 512-byte overlap
// - Critical patterns (__proto__, constructor, prototype) always fully scanned
// - Suspicious character density triggers automatic full scan
//
// When true: All JSON is fully scanned regardless of size.
//
// SECURITY RECOMMENDATION: Enable FullSecurityScan when:
// - Processing untrusted input from external sources
// - Handling sensitive data (authentication, financial, personal)
// - Building public-facing APIs or web services
// - Compliance requirements mandate full content inspection
//
// PERFORMANCE NOTE: Full scanning adds ~10-30% overhead for JSON >100KB.
// For trusted internal services with large JSON payloads, sampling mode is acceptable.
FullSecurityScan bool `json:"full_security_scan"`
// ===== Concurrency =====
MaxConcurrency int `json:"max_concurrency"`
ParallelThreshold int `json:"parallel_threshold"`
// ===== Processing Options =====
EnableValidation bool `json:"enable_validation"`
StrictMode bool `json:"strict_mode"`
CreatePaths bool `json:"create_paths"`
CleanupNulls bool `json:"cleanup_nulls"`
CompactArrays bool `json:"compact_arrays"`
ContinueOnError bool `json:"continue_on_error"` // Continue on batch errors
// ===== Input/Output Options =====
AllowComments bool `json:"allow_comments"`
PreserveNumbers bool `json:"preserve_numbers"`
ValidateInput bool `json:"validate_input"`
ValidateFilePath bool `json:"validate_file_path"`
SkipValidation bool `json:"skip_validation"` // Skip validation for trusted input
// ===== Encoding Options =====
Pretty bool `json:"pretty"`
Indent string `json:"indent"`
Prefix string `json:"prefix"`
EscapeHTML bool `json:"escape_html"`
SortKeys bool `json:"sort_keys"`
ValidateUTF8 bool `json:"validate_utf8"`
MaxDepth int `json:"max_depth"`
DisallowUnknown bool `json:"disallow_unknown"`
FloatPrecision int `json:"float_precision"`
FloatTruncate bool `json:"float_truncate"`
DisableEscaping bool `json:"disable_escaping"`
EscapeUnicode bool `json:"escape_unicode"`
EscapeSlash bool `json:"escape_slash"`
EscapeNewlines bool `json:"escape_newlines"`
EscapeTabs bool `json:"escape_tabs"`
IncludeNulls bool `json:"include_nulls"`
CustomEscapes map[rune]string `json:"custom_escapes,omitempty"`
// ===== Observability =====
EnableMetrics bool `json:"enable_metrics"`
EnableHealthCheck bool `json:"enable_health_check"`
// ===== Context =====
Context context.Context `json:"-"` // Operation context
// CustomEncoder replaces the default encoder entirely.
// If set, Encode operations use this encoder instead of the built-in one.
CustomEncoder CustomEncoder
// CustomTypeEncoders provides encoding for specific types.
// Keys are reflect.Type values; values implement TypeEncoder.
CustomTypeEncoders map[reflect.Type]TypeEncoder
// CustomValidators run before operations.
// All validators must pass for the operation to proceed.
CustomValidators []Validator
// AdditionalDangerousPatterns adds security patterns beyond defaults.
// These are checked in addition to built-in patterns unless
// DisableDefaultPatterns is true.
AdditionalDangerousPatterns []DangerousPattern
// DisableDefaultPatterns disables built-in security patterns.
// Set to true to use only AdditionalDangerousPatterns.
DisableDefaultPatterns bool
// Hooks provide before/after interception for operations.
Hooks []Hook
// CustomPathParser replaces the default path parser.
// If set, path parsing uses this parser instead of the built-in one.
CustomPathParser PathParser
}
Config holds all configuration for the JSON processor. Start with DefaultConfig() and modify as needed.
Example:
cfg := json.DefaultConfig()
cfg.CreatePaths = true
cfg.Pretty = true
result, err := json.Set(data, "path", value, cfg)
func DefaultConfig ¶
func DefaultConfig() Config
DefaultConfig returns the default configuration. Creates a new instance each time to allow modifications without affecting other callers.
func PrettyConfig ¶ added in v1.3.0
func PrettyConfig() Config
PrettyConfig returns a Config for pretty-printed JSON output. This is the unified version that returns Config instead of EncodeConfig.
Example:
result, err := json.Encode(data, json.PrettyConfig())
func SecurityConfig ¶ added in v1.3.0
func SecurityConfig() Config
SecurityConfig returns a configuration with enhanced security settings for processing untrusted input from external sources.
This is the recommended configuration for:
- Public APIs and web services
- User-submitted data
- External webhooks
- Authentication endpoints
- Financial data processing
Key characteristics:
- Full security scan enabled for all input
- Strict mode enabled for predictable parsing
- Conservative limits for untrusted payloads
- Caching enabled for repeated operations
This function unifies HighSecurityConfig and WebAPIConfig into a single entry point.
func (*Config) AddDangerousPattern ¶ added in v1.3.0
func (c *Config) AddDangerousPattern(pattern DangerousPattern)
AddDangerousPattern adds a security pattern to the configuration.
func (*Config) AddHook ¶ added in v1.3.0
AddHook adds an operation hook to the configuration. Hooks are executed in order for Before and in reverse order for After.
func (*Config) AddValidator ¶ added in v1.3.0
AddValidator adds a custom validator to the configuration. Validators are executed in order; all must pass for operations to proceed.
func (*Config) Clone ¶ added in v1.0.6
Clone creates a copy of the configuration. Performs a deep copy of reference types (maps, slices). Returns a pointer to avoid unnecessary copying of the large Config struct.
NOTE: Interface fields (CustomEncoder, CustomPathParser, Context) are shallow-copied as they typically contain stateless or singleton implementations. CustomTypeEncoders, CustomValidators, AdditionalDangerousPatterns, and Hooks are deep-copied as they may be modified independently.
func (*Config) GetCacheTTL ¶
func (*Config) GetFloatPrecision ¶ added in v1.3.0
func (*Config) GetMaxCacheSize ¶
func (*Config) GetMaxConcurrency ¶
func (*Config) GetMaxDepth ¶ added in v1.3.0
func (*Config) GetMaxJSONSize ¶
Convenience accessor methods for testing and interface-based usage. Rationale: These methods enable mock-based testing and potential future interface abstraction. In application code, direct field access is preferred:
cfg.MaxJSONSize instead of cfg.GetMaxJSONSize()
cfg.StrictMode instead of cfg.IsStrictMode()
func (*Config) GetMaxNestingDepth ¶
func (*Config) GetMaxPathDepth ¶
func (*Config) GetSecurityLimits ¶
GetSecurityLimits returns a summary of current security limits
func (*Config) IsCacheEnabled ¶
Required by CacheConfig interface (internal/cache.go) - do not remove.
func (*Config) IsCommentsAllowed ¶ added in v1.0.10
func (*Config) IsDisallowUnknownEnabled ¶ added in v1.3.0
func (*Config) IsHTMLEscapeEnabled ¶ added in v1.3.0
Required by EncoderConfig interface (interfaces.go) for custom encoders. These methods provide read-only access to encoding configuration.
func (*Config) IsHealthCheckEnabled ¶
func (*Config) IsMetricsEnabled ¶
func (*Config) IsPrettyEnabled ¶ added in v1.3.0
func (*Config) IsSortKeysEnabled ¶ added in v1.3.0
func (*Config) IsStrictMode ¶
func (*Config) IsTruncateFloatEnabled ¶ added in v1.3.0
func (*Config) ShouldCleanupNulls ¶
func (*Config) ShouldCompactArrays ¶
func (*Config) ShouldCreatePaths ¶
func (*Config) ShouldEscapeNewlines ¶ added in v1.3.0
func (*Config) ShouldEscapeSlash ¶ added in v1.3.0
func (*Config) ShouldEscapeTabs ¶ added in v1.3.0
func (*Config) ShouldEscapeUnicode ¶ added in v1.3.0
func (*Config) ShouldIncludeNulls ¶ added in v1.3.0
func (*Config) ShouldPreserveNumbers ¶ added in v1.0.10
func (*Config) ShouldValidateFilePath ¶
func (*Config) ShouldValidateInput ¶
func (*Config) ShouldValidateUTF8 ¶ added in v1.3.0
func (*Config) Validate ¶ added in v1.0.6
Validate validates the configuration and applies corrections. This is the single source of truth for config validation. Delegates to ValidateWithWarnings to avoid code duplication.
func (*Config) ValidateWithWarnings ¶ added in v1.3.0
func (c *Config) ValidateWithWarnings() []ConfigWarning
ValidateWithWarnings validates the configuration and returns warnings for any modifications made. This is useful for debugging configuration issues or informing users about automatic adjustments.
Example:
cfg := json.DefaultConfig()
cfg.MaxJSONSize = -1 // Invalid value
warnings := cfg.ValidateWithWarnings()
for _, w := range warnings {
fmt.Printf("%s: %s\n", w.Field, w.Reason)
}
type ConfigWarning ¶ added in v1.3.0
type ConfigWarning struct {
Field string // The field that was modified
OldValue any // The original value (may be nil for invalid values)
NewValue any // The corrected value
Reason string // Why the modification was made
}
ConfigWarning represents a configuration modification made during validation.
type CustomEncoder ¶
type CustomEncoder interface {
// Encode converts a Go value to JSON string.
Encode(value any) (string, error)
}
CustomEncoder provides custom JSON encoding capability. Implement this interface to replace the default encoder entirely.
Example:
type UpperCaseEncoder struct{}
func (e *UpperCaseEncoder) Encode(value any) (string, error) {
// Custom encoding logic
}
cfg := json.DefaultConfig()
cfg.CustomEncoder = &UpperCaseEncoder{}
type DangerousPattern ¶ added in v1.3.0
type DangerousPattern struct {
// Pattern is the substring to detect in input.
Pattern string
// Name is a human-readable description of the security risk.
Name string
// Level determines how the pattern is handled.
Level PatternLevel
}
DangerousPattern represents a security risk pattern to detect.
func GetCriticalPatterns ¶ added in v1.3.0
func GetCriticalPatterns() []DangerousPattern
GetCriticalPatterns returns patterns that are always fully scanned.
func GetDefaultPatterns ¶ added in v1.3.0
func GetDefaultPatterns() []DangerousPattern
GetDefaultPatterns returns the built-in dangerous patterns as DangerousPattern values. All default patterns are considered Critical level.
func ListDangerousPatterns ¶ added in v1.3.0
func ListDangerousPatterns() []DangerousPattern
ListDangerousPatterns returns all registered custom patterns.
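A minimal sketch of inspecting both built-in and custom-registered patterns, assuming only the accessors documented above (the import path is taken from the package overview; the output format is illustrative):

```go
package main

import (
	"fmt"

	"github.com/cybergodev/json"
)

func main() {
	// Built-in patterns (all Critical level).
	for _, p := range json.GetDefaultPatterns() {
		fmt.Printf("%q: %s [%s]\n", p.Pattern, p.Name, p.Level)
	}
	// Custom patterns registered by the application, if any.
	for _, p := range json.ListDangerousPatterns() {
		fmt.Printf("custom %q: %s\n", p.Pattern, p.Name)
	}
}
```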
type Decoder ¶
type Decoder struct {
// contains filtered or unexported fields
}
Decoder reads and decodes JSON values from an input stream. This type is fully compatible with encoding/json.Decoder.
func NewDecoder ¶
NewDecoder returns a new decoder that reads from r. This function is fully compatible with encoding/json.NewDecoder.
func (*Decoder) Decode ¶
Decode reads the next JSON-encoded value from its input and stores it in v.
func (*Decoder) DisallowUnknownFields ¶
func (dec *Decoder) DisallowUnknownFields()
func (*Decoder) InputOffset ¶
type Encoder ¶
type Encoder struct {
// contains filtered or unexported fields
}
Encoder writes JSON values to an output stream. This type is fully compatible with encoding/json.Encoder.
func NewEncoder ¶
NewEncoder returns a new encoder that writes to w. This function is fully compatible with encoding/json.NewEncoder.
func NewEncoderWithOpts ¶ added in v1.3.0
NewEncoderWithOpts returns a new encoder that writes to w using the given Config. This is the unified API for creating encoders with a consistent Config pattern.
Example:
cfg := json.DefaultConfig()
cfg.Pretty = true
encoder := json.NewEncoderWithOpts(writer, &cfg)
err := encoder.Encode(data)
func (*Encoder) Encode ¶
Encode writes the JSON encoding of v to the stream, followed by a newline character.
See the documentation for Marshal for details about the conversion of Go values to JSON.
func (*Encoder) SetEscapeHTML ¶
SetEscapeHTML specifies whether problematic HTML characters should be escaped inside JSON quoted strings. The default behavior is to escape &, <, and > to \u0026, \u003c, and \u003e to avoid certain safety problems that can arise when embedding JSON in HTML.
In non-HTML settings where the escaping interferes with the readability of the output, SetEscapeHTML(false) disables this behavior.
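A short sketch of disabling HTML escaping, mirroring standard encoding/json usage (the writer and payload here are illustrative, not from the package's own examples):

```go
package main

import (
	"os"

	"github.com/cybergodev/json"
)

func main() {
	enc := json.NewEncoder(os.Stdout)
	enc.SetEscapeHTML(false) // keep &, <, > literal outside HTML contexts
	if err := enc.Encode(map[string]string{"tag": "<b>bold</b>"}); err != nil {
		// handle error
	}
}
```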
type EncoderConfig ¶ added in v1.3.0
type EncoderConfig interface {
// HTML escaping
IsHTMLEscapeEnabled() bool
// Pretty printing
IsPrettyEnabled() bool
GetIndent() string
GetPrefix() string
// Key handling
IsSortKeysEnabled() bool
// Float handling
GetFloatPrecision() int
IsTruncateFloatEnabled() bool
// Depth control
GetMaxDepth() int
// Null handling
ShouldIncludeNulls() bool
// UTF-8 validation
ShouldValidateUTF8() bool
// Unknown field handling
IsDisallowUnknownEnabled() bool
// Unicode escaping
ShouldEscapeUnicode() bool
// Slash escaping
ShouldEscapeSlash() bool
// Newline escaping
ShouldEscapeNewlines() bool
// Tab escaping
ShouldEscapeTabs() bool
}
EncoderConfig provides configuration access for custom encoders. Implemented by Config struct.
type HealthStatus ¶
type HealthStatus struct {
Timestamp time.Time `json:"timestamp"`
Healthy bool `json:"healthy"`
Checks map[string]CheckResult `json:"checks"`
}
HealthStatus represents the health status of the processor
func GetHealthStatus ¶ added in v1.1.0
func GetHealthStatus() HealthStatus
GetHealthStatus returns the health status of the default processor.
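A hedged sketch of a health probe using the documented HealthStatus fields; CheckResult's fields are not listed in this section, so the example prints them generically with %+v:

```go
package main

import (
	"fmt"

	"github.com/cybergodev/json"
)

func main() {
	status := json.GetHealthStatus()
	if !status.Healthy {
		for name, check := range status.Checks {
			// CheckResult's concrete fields are not documented here;
			// %+v prints whatever the struct contains.
			fmt.Printf("failed check %q: %+v\n", name, check)
		}
	}
}
```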
type Hook ¶ added in v1.3.0
type Hook interface {
// Before is called before an operation.
// Return error to abort the operation.
Before(ctx HookContext) error
// After is called after an operation completes.
// Modify result or error as needed.
After(ctx HookContext, result any, err error) (any, error)
}
Hook intercepts operations before/after execution. Implement this interface to add cross-cutting concerns like logging, metrics, tracing, or request transformation.
Example:
type LoggingHook struct{ logger *slog.Logger }
func (h *LoggingHook) Before(ctx json.HookContext) error {
h.logger.Info("operation starting", "op", ctx.Operation, "path", ctx.Path)
return nil
}
func (h *LoggingHook) After(ctx json.HookContext, result any, err error) (any, error) {
h.logger.Info("operation completed", "op", ctx.Operation, "error", err)
return result, err
}
func ErrorHook ¶ added in v1.3.0
func ErrorHook(handler func(ctx HookContext, err error) error) Hook
ErrorHook creates a hook that intercepts errors. The handler can transform errors or log them.
Example:
p.AddHook(json.ErrorHook(func(ctx json.HookContext, err error) error {
sentry.CaptureException(err)
return err // return original or transformed error
}))
func LoggingHook ¶ added in v1.3.0
LoggingHook creates a hook that logs all operations. The logger must implement Info(msg string, args ...any).
Example:
p.AddHook(json.LoggingHook(slog.Default()))
func TimingHook ¶ added in v1.3.0
TimingHook creates a hook that records operation duration. The recorder must implement Record(op string, duration time.Duration).
Example:
p.AddHook(json.TimingHook(myMetricsRecorder))
func ValidationHook ¶ added in v1.3.0
ValidationHook creates a hook that validates input before operations. Return error from the validator to abort the operation.
Example:
p.AddHook(json.ValidationHook(func(jsonStr, path string) error {
if len(jsonStr) > 1_000_000 {
return errors.New("JSON too large")
}
return nil
}))
type HookChain ¶ added in v1.3.0
type HookChain []Hook
HookChain manages multiple hooks for sequential execution.
func (HookChain) ExecuteAfter ¶ added in v1.3.0
ExecuteAfter runs all After hooks in reverse order.
func (HookChain) ExecuteBefore ¶ added in v1.3.0
func (hc HookChain) ExecuteBefore(ctx HookContext) error
ExecuteBefore runs all Before hooks in order, stopping at first error.
type HookContext ¶ added in v1.3.0
type HookContext struct {
// Operation is the type of operation being performed.
// Values: "get", "set", "delete", "marshal", "unmarshal"
Operation string
// JSONStr is the input JSON string (may be empty for marshal).
JSONStr string
// Path is the target path (may be empty for marshal/unmarshal).
Path string
// Value is the value for set operations.
Value any
// Config is the active configuration.
Config *Config
// StartTime is when the operation started (set before After is called).
StartTime time.Time
}
HookContext provides context for operation hooks.
type HookFunc ¶ added in v1.3.0
type HookFunc struct {
BeforeFn func(ctx HookContext) error
AfterFn func(ctx HookContext, result any, err error) (any, error)
}
HookFunc is an adapter to use ordinary functions as Hooks. Useful for simple hooks that don't need both Before and After.
Example:
// Only need After
p.AddHook(&json.HookFunc{
AfterFn: func(ctx json.HookContext, result any, err error) (any, error) {
log.Printf("%s completed in %v", ctx.Operation, time.Since(ctx.StartTime))
return result, err
},
})
func (*HookFunc) After ¶ added in v1.3.0
After calls the AfterFn if set, otherwise returns the original result.
func (*HookFunc) Before ¶ added in v1.3.0
func (h *HookFunc) Before(ctx HookContext) error
Before calls the BeforeFn if set, otherwise returns nil.
type InvalidUnmarshalError ¶
InvalidUnmarshalError describes an invalid argument passed to Unmarshal. (The argument to Unmarshal must be a non-nil pointer.)
func (*InvalidUnmarshalError) Error ¶
func (e *InvalidUnmarshalError) Error() string
type IterableValue ¶
type IterableValue struct {
// contains filtered or unexported fields
}
IterableValue wraps a value to provide convenient access methods. It deliberately does not hold processor or iterator references, which avoids resource leaks.
func NewIterableValue ¶
func NewIterableValue(data any) *IterableValue
NewIterableValue creates an IterableValue from data
func (*IterableValue) Exists ¶
func (iv *IterableValue) Exists(key string) bool
Exists checks whether a key or path exists in the object. Supports path navigation with dot notation and array indices.
func (*IterableValue) ForeachNested ¶
func (iv *IterableValue) ForeachNested(path string, fn func(key any, item *IterableValue))
ForeachNested iterates over nested JSON structures with a path
func (*IterableValue) Get ¶
func (iv *IterableValue) Get(path string) any
Get returns a value by path (supports dot notation and array indices)
func (*IterableValue) GetArray ¶
func (iv *IterableValue) GetArray(key string) []any
GetArray returns an array value by key or path. Supports path navigation with dot notation and array indices.
func (*IterableValue) GetBool ¶
func (iv *IterableValue) GetBool(key string) bool
GetBool returns a bool value by key or path. Supports path navigation with dot notation and array indices.
func (*IterableValue) GetBoolWithDefault ¶
func (iv *IterableValue) GetBoolWithDefault(key string, defaultValue bool) bool
GetBoolWithDefault returns a bool value by key or path with a default fallback. Supports path navigation with dot notation and array indices.
func (*IterableValue) GetData ¶ added in v1.2.0
func (iv *IterableValue) GetData() any
GetData returns the underlying data
func (*IterableValue) GetFloat64 ¶
func (iv *IterableValue) GetFloat64(key string) float64
GetFloat64 returns a float64 value by key or path. Supports path navigation with dot notation and array indices.
func (*IterableValue) GetFloat64WithDefault ¶
func (iv *IterableValue) GetFloat64WithDefault(key string, defaultValue float64) float64
GetFloat64WithDefault returns a float64 value by key or path with a default fallback. Supports path navigation with dot notation and array indices.
func (*IterableValue) GetInt ¶
func (iv *IterableValue) GetInt(key string) int
GetInt returns an int value by key or path. Supports path navigation with dot notation and array indices (e.g., "user.age" or "users[0].id").
func (*IterableValue) GetIntWithDefault ¶
func (iv *IterableValue) GetIntWithDefault(key string, defaultValue int) int
GetIntWithDefault returns an int value by key or path with a default fallback. Supports path navigation with dot notation and array indices.
func (*IterableValue) GetObject ¶
func (iv *IterableValue) GetObject(key string) map[string]any
GetObject returns an object value by key or path. Supports path navigation with dot notation and array indices.
func (*IterableValue) GetString ¶
func (iv *IterableValue) GetString(key string) string
GetString returns a string value by key or path. Supports path navigation with dot notation and array indices (e.g., "user.address.city" or "users[0].name").
func (*IterableValue) GetStringWithDefault ¶
func (iv *IterableValue) GetStringWithDefault(key string, defaultValue string) string
GetStringWithDefault returns a string value by key or path with a default fallback. Supports path navigation with dot notation and array indices.
func (*IterableValue) GetWithDefault ¶
func (iv *IterableValue) GetWithDefault(key string, defaultValue any) any
GetWithDefault returns a value by key or path with a default fallback. Supports path navigation with dot notation and array indices.
func (*IterableValue) IsEmpty ¶
func (iv *IterableValue) IsEmpty(key string) bool
IsEmpty checks whether a specific key's or path's value is empty. Supports path navigation with dot notation and array indices.
func (*IterableValue) IsEmptyData ¶ added in v1.0.8
func (iv *IterableValue) IsEmptyData() bool
IsEmptyData checks if the whole value is empty (for backward compatibility)
func (*IterableValue) IsNull ¶
func (iv *IterableValue) IsNull(key string) bool
IsNull checks whether a specific key's or path's value is null. Supports path navigation with dot notation and array indices.
func (*IterableValue) IsNullData ¶ added in v1.0.8
func (iv *IterableValue) IsNullData() bool
IsNullData checks if the whole value is null (for backward compatibility)
func (*IterableValue) Release ¶ added in v1.2.0
func (iv *IterableValue) Release()
Release returns the IterableValue to the pool
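The accessor methods above can be sketched together in one hedged usage example (the data and paths are illustrative; Release is deferred per the pooling note above):

```go
package main

import (
	"fmt"

	"github.com/cybergodev/json"
)

func main() {
	iv := json.NewIterableValue(map[string]any{
		"user": map[string]any{"name": "Ada", "age": 36},
	})
	defer iv.Release() // return the wrapper to the pool when done

	fmt.Println(iv.GetString("user.name"))           // dot-notation path
	fmt.Println(iv.GetIntWithDefault("user.age", 0)) // typed, with fallback
	fmt.Println(iv.Exists("user.email"))             // existence check
}
```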
type Iterator ¶
type Iterator struct {
// contains filtered or unexported fields
}
Iterator represents an iterator over JSON data
func NewIterator ¶
NewIterator creates a new Iterator for traversing arrays and objects in the provided data. The opts parameter is reserved for future use and is currently ignored.
type IteratorControl ¶
type IteratorControl int
IteratorControl represents control flags for iteration operations. Used by Foreach* functions to control iteration flow.
const (
	// IteratorNormal continues iteration normally
	IteratorNormal IteratorControl = iota
	// IteratorContinue skips the current item and continues iteration
	IteratorContinue
	// IteratorBreak stops iteration entirely
	IteratorBreak
)
type JSONLConfig ¶ added in v1.2.0
type JSONLConfig struct {
BufferSize int // Buffer size for reading (default: 64KB)
MaxLineSize int // Maximum line size (default: 1MB)
SkipEmpty bool // Skip empty lines (default: true)
SkipComments bool // Skip lines starting with # or // (default: false)
ContinueOnErr bool // Continue processing on parse errors (default: false)
Processor *Processor // Optional custom processor (default: global processor)
}
JSONLConfig holds configuration for JSONL processing
func DefaultJSONLConfig ¶ added in v1.2.0
func DefaultJSONLConfig() JSONLConfig
DefaultJSONLConfig returns the default JSONL configuration
type JSONLProcessor ¶ added in v1.2.0
type JSONLProcessor struct {
// contains filtered or unexported fields
}
JSONLProcessor processes JSON Lines (JSONL) data. It uses buffered reading and object pooling for efficiency.
func NewJSONLProcessor ¶ added in v1.2.0
func NewJSONLProcessor(reader io.Reader) *JSONLProcessor
NewJSONLProcessor creates a new JSONL processor with default configuration.
func NewJSONLProcessorWithConfig ¶ added in v1.2.0
func NewJSONLProcessorWithConfig(reader io.Reader, config JSONLConfig) *JSONLProcessor
NewJSONLProcessorWithConfig creates a new JSONL processor with the specified configuration.
Example:
cfg := json.DefaultJSONLConfig()
cfg.SkipEmpty = false
cfg.Processor = customProcessor
proc := json.NewJSONLProcessorWithConfig(reader, cfg)
func (*JSONLProcessor) Err ¶ added in v1.2.0
func (j *JSONLProcessor) Err() error
Err returns any error encountered during processing
func (*JSONLProcessor) GetStats ¶ added in v1.2.0
func (j *JSONLProcessor) GetStats() JSONLStats
GetStats returns current processing statistics
func (*JSONLProcessor) Release ¶ added in v1.2.0
func (j *JSONLProcessor) Release()
Release releases resources held by the JSONLProcessor.
IMPORTANT:
- Call this after all processing is complete
- For StreamLinesParallel, ensure workers have finished before calling Release
- After Release, the processor must not be used
Resource Management:
- Sets stopped flag to halt any pending operations
- Clears internal processor reference to allow garbage collection
func (*JSONLProcessor) Stop ¶ added in v1.2.0
func (j *JSONLProcessor) Stop()
Stop stops the JSONL processor
func (*JSONLProcessor) StreamLines ¶ added in v1.2.0
func (j *JSONLProcessor) StreamLines(fn func(lineNum int, data any) bool) error
StreamLines processes JSONL data line by line. The callback receives the line number and parsed data; return false from the callback to stop iteration.
func (*JSONLProcessor) StreamLinesParallel ¶ added in v1.2.0
func (j *JSONLProcessor) StreamLinesParallel(fn func(lineNum int, data any) error, workers int) (err error)
StreamLinesParallel processes JSONL data in parallel using a worker pool. Parallel processing benefits CPU-bound per-line operations.
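A minimal streaming sketch using the StreamLines signature documented above (the input reader and per-line handling are illustrative):

```go
package main

import (
	"fmt"
	"strings"

	"github.com/cybergodev/json"
)

func main() {
	r := strings.NewReader("{\"id\":1}\n{\"id\":2}\n")
	proc := json.NewJSONLProcessor(r)
	defer proc.Release()

	err := proc.StreamLines(func(lineNum int, data any) bool {
		fmt.Println(lineNum, data)
		return true // return false to stop early
	})
	if err != nil {
		// handle error
	}
}
```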
type JSONLStats ¶ added in v1.2.0
JSONLStats holds JSONL processing statistics.
type JSONLWriter ¶ added in v1.2.0
type JSONLWriter struct {
// contains filtered or unexported fields
}
JSONLWriter writes JSON Lines format to an io.Writer
func NewJSONLWriter ¶ added in v1.2.0
func NewJSONLWriter(writer io.Writer) *JSONLWriter
NewJSONLWriter creates a new JSONL writer
func (*JSONLWriter) Err ¶ added in v1.2.0
func (w *JSONLWriter) Err() error
Err returns any error encountered during writing
func (*JSONLWriter) Stats ¶ added in v1.2.0
func (w *JSONLWriter) Stats() JSONLStats
Stats returns writing statistics
func (*JSONLWriter) Write ¶ added in v1.2.0
func (w *JSONLWriter) Write(data any) error
Write writes a single JSON value as a line
func (*JSONLWriter) WriteAll ¶ added in v1.2.0
func (w *JSONLWriter) WriteAll(data []any) error
WriteAll writes multiple JSON values as lines
func (*JSONLWriter) WriteRaw ¶ added in v1.2.0
func (w *JSONLWriter) WriteRaw(line []byte) error
WriteRaw writes a raw JSON line (already encoded)
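The three write methods above can be sketched together; this is a hedged example assuming only the documented signatures (the buffer destination is illustrative):

```go
package main

import (
	"bytes"
	"fmt"

	"github.com/cybergodev/json"
)

func main() {
	var buf bytes.Buffer
	w := json.NewJSONLWriter(&buf)

	_ = w.Write(map[string]any{"id": 1})                                 // one value per line
	_ = w.WriteAll([]any{map[string]any{"id": 2}, map[string]any{"id": 3}}) // many values
	_ = w.WriteRaw([]byte(`{"id":4}`))                                   // pre-encoded line

	if err := w.Err(); err != nil {
		// handle the first write error encountered
	}
	fmt.Print(buf.String())
}
```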
type JsonsError ¶
type JsonsError struct {
Op string `json:"op"` // Operation that failed
Path string `json:"path"` // JSON path where error occurred
Message string `json:"message"` // Human-readable error message
Err error `json:"err"` // Underlying error
}
JsonsError represents a JSON processing error with essential context
func (*JsonsError) Error ¶
func (e *JsonsError) Error() string
func (*JsonsError) Is ¶
func (e *JsonsError) Is(target error) bool
Is implements error matching for Go 1.13+ error handling. It compares the Op, Path, and Err fields for complete equality. Note: Message is intentionally excluded because it is derived from the other fields.
func (*JsonsError) Unwrap ¶
func (e *JsonsError) Unwrap() error
Unwrap returns the underlying error for error chain support
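Since JsonsError supports Is and Unwrap, it composes with the standard errors package. A hedged sketch (whether this particular Get call returns a *JsonsError is an assumption for illustration):

```go
package main

import (
	"errors"
	"fmt"

	"github.com/cybergodev/json"
)

func main() {
	_, err := json.Get(`{"user":{}}`, "user.name.first")

	var je *json.JsonsError
	if errors.As(err, &je) {
		// Structured context: failed operation, JSON path, and message.
		fmt.Println(je.Op, je.Path, je.Message)
	}
}
```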
type LargeFileConfig ¶ added in v1.2.0
type LargeFileConfig struct {
ChunkSize int64 // Size of each chunk in bytes
MaxMemory int64 // Maximum memory to use
BufferSize int // Buffer size for reading
SamplingEnabled bool // Enable sampling for very large files
SampleSize int // Number of samples to take
}
LargeFileConfig holds configuration for large file processing
func DefaultLargeFileConfig ¶ added in v1.2.0
func DefaultLargeFileConfig() LargeFileConfig
DefaultLargeFileConfig returns the default configuration
type LargeFileProcessor ¶ added in v1.2.0
type LargeFileProcessor struct {
// contains filtered or unexported fields
}
LargeFileProcessor handles processing of large JSON files
func NewLargeFileProcessor ¶ added in v1.2.0
func NewLargeFileProcessor(config LargeFileConfig) *LargeFileProcessor
NewLargeFileProcessor creates a new large file processor
func (*LargeFileProcessor) ProcessFile ¶ added in v1.2.0
func (lfp *LargeFileProcessor) ProcessFile(filename string, fn func(item any) error) error
ProcessFile processes a large JSON file efficiently
func (*LargeFileProcessor) ProcessFileChunked ¶ added in v1.2.0
func (lfp *LargeFileProcessor) ProcessFileChunked(filename string, chunkSize int, fn func(chunk []any) error) error
ProcessFileChunked processes a large JSON file in chunks
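A hedged sketch of the streaming callback pattern documented above; "data.json" is a placeholder path, and the per-item handling is illustrative:

```go
package main

import (
	"log"

	"github.com/cybergodev/json"
)

func main() {
	lfp := json.NewLargeFileProcessor(json.DefaultLargeFileConfig())

	// The callback receives one item at a time, so the whole
	// file never has to fit in memory at once.
	err := lfp.ProcessFile("data.json", func(item any) error {
		// process item
		return nil
	})
	if err != nil {
		log.Fatal(err)
	}
}
```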
type LazyParser ¶ added in v1.2.0
type LazyParser struct {
// contains filtered or unexported fields
}
LazyParser provides lazy JSON parsing that supports both JSON objects and arrays
func NewLazyParser ¶ added in v1.2.0
func NewLazyParser(data []byte) *LazyParser
NewLazyParser creates a new lazy parser
func (*LazyParser) Error ¶ added in v1.3.0
func (lp *LazyParser) Error() error
Error returns any parsing error, triggering parse if needed
func (*LazyParser) Get ¶ added in v1.2.0
func (lp *LazyParser) Get(path string) (any, error)
Get retrieves a value at the given path
func (*LazyParser) GetObject ¶ added in v1.3.0
func (lp *LazyParser) GetObject() (map[string]any, error)
GetObject returns the parsed data as a map (only for JSON objects). Returns ErrTypeMismatch if the parsed JSON is not an object.
func (*LazyParser) GetValue ¶ added in v1.2.2
func (lp *LazyParser) GetValue() (any, error)
GetValue returns all parsed data as interface{} (supports any JSON type)
func (*LazyParser) IsArray ¶ added in v1.2.2
func (lp *LazyParser) IsArray() bool
IsArray returns true if the parsed JSON is an array
func (*LazyParser) IsObject ¶ added in v1.2.2
func (lp *LazyParser) IsObject() bool
IsObject returns true if the parsed JSON is an object
func (*LazyParser) IsParsed ¶ added in v1.2.0
func (lp *LazyParser) IsParsed() bool
IsParsed returns whether the JSON has been parsed
func (*LazyParser) Parse ¶ added in v1.3.0
func (lp *LazyParser) Parse() (any, error)
Parse forces parsing and returns the parsed data
func (*LazyParser) Parsed ¶ added in v1.3.0
func (lp *LazyParser) Parsed() any
Parsed returns the parsed data without forcing parsing. Returns nil if not yet parsed.
func (*LazyParser) Raw ¶ added in v1.2.0
func (lp *LazyParser) Raw() []byte
Raw returns the raw JSON bytes
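A short sketch of the lazy-parsing behavior described above (the document and path are illustrative; actual outputs are not asserted):

```go
package main

import (
	"fmt"

	"github.com/cybergodev/json"
)

func main() {
	lp := json.NewLazyParser([]byte(`{"items":[1,2,3]}`))
	fmt.Println(lp.IsParsed()) // parsing is deferred until first access

	v, err := lp.Get("items[1]") // triggers the parse
	if err != nil {
		// handle error
	}
	fmt.Println(v, lp.IsObject())
}
```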
type Marshaler ¶
Marshaler is the interface implemented by types that can marshal themselves into valid JSON.
type MarshalerError ¶
type MarshalerError struct {
Type reflect.Type
Err error
// contains filtered or unexported fields
}
MarshalerError represents an error from calling a MarshalJSON or MarshalText method.
func (*MarshalerError) Error ¶
func (e *MarshalerError) Error() string
func (*MarshalerError) Unwrap ¶
func (e *MarshalerError) Unwrap() error
type MergeMode ¶ added in v1.3.0
MergeMode defines the merge strategy for combining JSON objects and arrays. This is a type alias to internal.MergeMode to ensure consistency across the codebase.
type NDJSONProcessor ¶ added in v1.2.0
type NDJSONProcessor struct {
// contains filtered or unexported fields
}
NDJSONProcessor processes newline-delimited JSON files
func NewNDJSONProcessor ¶ added in v1.2.0
func NewNDJSONProcessor(bufferSize int) *NDJSONProcessor
NewNDJSONProcessor creates a new NDJSON processor
func (*NDJSONProcessor) ProcessFile ¶ added in v1.2.0
func (np *NDJSONProcessor) ProcessFile(filename string, fn func(lineNum int, obj map[string]any) error) error
ProcessFile processes an NDJSON file line by line
func (*NDJSONProcessor) ProcessReader ¶ added in v1.2.0
func (np *NDJSONProcessor) ProcessReader(reader io.Reader, fn func(lineNum int, obj map[string]any) error) error
ProcessReader processes NDJSON from a reader
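A minimal sketch using the ProcessReader signature documented above (the buffer size and input are illustrative):

```go
package main

import (
	"fmt"
	"strings"

	"github.com/cybergodev/json"
)

func main() {
	np := json.NewNDJSONProcessor(64 * 1024) // 64KB read buffer

	r := strings.NewReader("{\"a\":1}\n{\"a\":2}\n")
	err := np.ProcessReader(r, func(lineNum int, obj map[string]any) error {
		fmt.Println(lineNum, obj["a"])
		return nil
	})
	if err != nil {
		// handle error
	}
}
```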
type Number ¶
type Number string
Number represents a JSON number literal.
type ParallelIterator ¶ added in v1.2.0
type ParallelIterator struct {
// contains filtered or unexported fields
}
ParallelIterator processes arrays in parallel using worker goroutines
func NewParallelIterator ¶ added in v1.2.0
func NewParallelIterator(data []any, workers int) *ParallelIterator
NewParallelIterator creates a new parallel iterator
func (*ParallelIterator) Close ¶ added in v1.3.0
func (it *ParallelIterator) Close()
Close releases resources associated with the ParallelIterator. It exists for API consistency and to document the cleanup pattern; the internal semaphore channel is garbage collected automatically once the iterator is no longer referenced.
func (*ParallelIterator) Filter ¶ added in v1.2.0
func (it *ParallelIterator) Filter(predicate func(int, any) bool) []any
Filter filters elements in parallel using a predicate function. Returns a new slice containing the elements that pass the predicate.
func (*ParallelIterator) ForEach ¶ added in v1.2.0
func (it *ParallelIterator) ForEach(fn func(int, any) error) error
ForEach processes each element in parallel using the provided function. The function receives the index and value of each element. Returns the first error encountered, or nil if all operations succeed.
func (*ParallelIterator) ForEachBatch ¶ added in v1.2.0
ForEachBatch processes elements in batches in parallel. Each batch is processed by a single goroutine.
func (*ParallelIterator) ForEachBatchWithContext ¶ added in v1.3.0
func (it *ParallelIterator) ForEachBatchWithContext(ctx context.Context, batchSize int, fn func(int, []any) error) error
ForEachBatchWithContext processes elements in batches with context support for cancellation. Each batch is processed by a single goroutine.
func (*ParallelIterator) ForEachWithContext ¶ added in v1.3.0
ForEachWithContext processes each element in parallel with context support for cancellation. The function receives the index and value of each element. Returns the first error encountered, or ctx.Err() if the context is cancelled.
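A hedged sketch of cancellable parallel iteration; the ForEachWithContext signature is assumed to mirror ForEach with a leading context argument, since only the name is shown above:

```go
package main

import (
	"context"
	"fmt"
	"time"

	"github.com/cybergodev/json"
)

func main() {
	it := json.NewParallelIterator([]any{1, 2, 3, 4}, 2) // 2 workers
	defer it.Close()

	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
	defer cancel()

	// Assumed signature: ForEachWithContext(ctx, func(index int, value any) error) error
	err := it.ForEachWithContext(ctx, func(i int, v any) error {
		fmt.Println(i, v)
		return nil
	})
	if err != nil {
		// first worker error, or ctx.Err() on cancellation
	}
}
```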
type ParsedJSON ¶ added in v1.2.0
type ParsedJSON struct {
// contains filtered or unexported fields
}
ParsedJSON represents a pre-parsed JSON document that can be reused for multiple operations. This is a performance optimization for scenarios where the same JSON is queried multiple times: pre-parsing avoids repeated parsing overhead.
func (*ParsedJSON) Data ¶ added in v1.2.0
func (p *ParsedJSON) Data() any
Data returns the underlying parsed data
type PathParser ¶
type PathParser interface {
// ParsePath parses a path string into segments.
ParsePath(path string) ([]PathSegment, error)
}
PathParser parses path strings into segments. Implement to provide custom path syntax support.
type PathSecurityChecker ¶ added in v1.3.0
type PathSecurityChecker struct {
// contains filtered or unexported fields
}
PathSecurityChecker handles path security validation: it validates JSON path syntax and checks for security issues.
func NewPathSecurityChecker ¶ added in v1.3.0
func NewPathSecurityChecker(maxPathLength int) *PathSecurityChecker
NewPathSecurityChecker creates a new path security checker.
func (*PathSecurityChecker) Validate ¶ added in v1.3.0
func (pc *PathSecurityChecker) Validate(path string) error
Validate checks the path for security issues.
type PathSegment ¶
type PathSegment = internal.PathSegment
PathSegment represents a parsed path segment. This is an alias to internal.PathSegment. Methods like HasStart(), HasEnd(), etc. are available through the internal type.
func NewAppendSegment ¶ added in v1.3.0
func NewAppendSegment() PathSegment
NewAppendSegment creates an append segment.
func NewArrayIndexSegment ¶ added in v1.3.0
func NewArrayIndexSegment(index int) PathSegment
NewArrayIndexSegment creates an array index access segment.
func NewArraySliceSegment ¶ added in v1.3.0
func NewArraySliceSegment(start, end, step int, hasStart, hasEnd, hasStep bool) PathSegment
NewArraySliceSegment creates an array slice segment.
func NewExtractSegment ¶ added in v1.3.0
func NewExtractSegment(key string, flat bool) PathSegment
NewExtractSegment creates an extraction segment.
func NewPropertySegment ¶ added in v1.3.0
func NewPropertySegment(key string) PathSegment
NewPropertySegment creates a property access segment.
func NewRecursiveSegment ¶ added in v1.3.0
func NewRecursiveSegment() PathSegment
NewRecursiveSegment creates a recursive descent segment.
func NewWildcardSegment ¶ added in v1.3.0
func NewWildcardSegment() PathSegment
NewWildcardSegment creates a wildcard segment.
type PathSegmentFlags ¶ added in v1.3.0
type PathSegmentFlags = internal.PathSegmentFlags
PathSegmentFlags are bit flags for path segment options. This is an alias to internal.PathSegmentFlags.
const (
	// PathFlagNegative indicates a negative array index.
	PathFlagNegative PathSegmentFlags = internal.PathFlagNegative
	// PathFlagWildcard indicates a wildcard segment.
	PathFlagWildcard PathSegmentFlags = internal.PathFlagWildcard
	// PathFlagFlat indicates flat extraction mode.
	PathFlagFlat PathSegmentFlags = internal.PathFlagFlat
	// PathFlagHasStart indicates slice has start value.
	PathFlagHasStart PathSegmentFlags = internal.PathFlagHasStart
	// PathFlagHasEnd indicates slice has end value.
	PathFlagHasEnd PathSegmentFlags = internal.PathFlagHasEnd
	// PathFlagHasStep indicates slice has step value.
	PathFlagHasStep PathSegmentFlags = internal.PathFlagHasStep
)
Path flag constants - aliases to internal package
type PathSegmentType ¶ added in v1.3.0
type PathSegmentType = internal.PathSegmentType
PathSegmentType identifies the type of a path segment. This is an alias to internal.PathSegmentType.
const (
	// PathSegmentProperty represents object property access (e.g., "user.name").
	PathSegmentProperty PathSegmentType = internal.PropertySegment
	// PathSegmentArrayIndex represents array index access (e.g., "items[0]").
	PathSegmentArrayIndex PathSegmentType = internal.ArrayIndexSegment
	// PathSegmentArraySlice represents array slice (e.g., "items[1:5:2]").
	PathSegmentArraySlice PathSegmentType = internal.ArraySliceSegment
	// PathSegmentWildcard represents wildcard access (e.g., "items[*]").
	PathSegmentWildcard PathSegmentType = internal.WildcardSegment
	// PathSegmentRecursive represents recursive descent (e.g., "**.name").
	PathSegmentRecursive PathSegmentType = internal.RecursiveSegment
	// PathSegmentFilter represents filter expression (e.g., "items[?(@.active)]").
	PathSegmentFilter PathSegmentType = internal.FilterSegment
	// PathSegmentExtract represents field extraction (e.g., "{name,email}").
	PathSegmentExtract PathSegmentType = internal.ExtractSegment
	// PathSegmentAppend represents append operation (e.g., "items[+]").
	PathSegmentAppend PathSegmentType = internal.AppendSegment
)
Path segment type constants - aliases to internal package
type PathValidator ¶ added in v1.3.0
PathValidator validates path syntax before navigation. Implement to add custom path validation rules.
type PatternChecker ¶ added in v1.3.0
type PatternChecker struct {
// contains filtered or unexported fields
}
PatternChecker handles dangerous pattern detection in JSON content: it detects and reports patterns such as XSS payloads and prototype pollution.
func NewPatternChecker ¶ added in v1.3.0
func NewPatternChecker(fullSecurityScan bool) *PatternChecker
NewPatternChecker creates a new pattern checker.
func (*PatternChecker) Check ¶ added in v1.3.0
func (pc *PatternChecker) Check(jsonStr string) error
Check performs pattern validation on the JSON string. Returns error if dangerous patterns are detected.
type PatternLevel ¶ added in v1.3.0
type PatternLevel int
PatternLevel represents the severity level of a dangerous pattern.
const (
	// PatternLevelCritical always blocks the operation.
	// Use for patterns that pose immediate security risks (e.g., prototype pollution).
	PatternLevelCritical PatternLevel = iota
	// PatternLevelWarning blocks in strict mode, logs warning in lenient mode.
	// Use for patterns that may indicate malicious intent but have legitimate uses.
	PatternLevelWarning
	// PatternLevelInfo logs but never blocks.
	// Use for audit/tracking purposes without interrupting operations.
	PatternLevelInfo
)
func (PatternLevel) String ¶ added in v1.3.0
func (pl PatternLevel) String() string
String returns the string representation of PatternLevel.
type PatternRegistry ¶ added in v1.3.0
type PatternRegistry interface {
// Add registers a new dangerous pattern.
Add(pattern DangerousPattern)
// Remove unregisters a pattern by its pattern string.
Remove(pattern string)
// List returns all registered patterns.
List() []DangerousPattern
// ListByLevel returns patterns filtered by severity level.
ListByLevel(level PatternLevel) []DangerousPattern
// Clear removes all registered patterns.
Clear()
}
PatternRegistry manages dangerous patterns with thread-safe operations.
type Processor ¶
type Processor struct {
// contains filtered or unexported fields
}
Processor is the main JSON processing engine with thread safety and performance optimization
func New ¶
New creates a new JSON processor with the given configuration. If no configuration is provided, uses default configuration. Returns an error if the configuration is invalid.
Example:
// Using default configuration
processor, err := json.New()

// With custom configuration
cfg := json.DefaultConfig()
cfg.CreatePaths = true
processor, err := json.New(cfg)

// Using preset configuration
processor, err := json.New(json.SecurityConfig())
func (*Processor) AddHook ¶ added in v1.3.0
AddHook adds an operation hook to the processor. Hooks are called before and after each operation. Multiple hooks can be added and are executed in order (Before) and reverse order (After).
Example:
type LoggingHook struct{}
func (h *LoggingHook) Before(ctx json.HookContext) error {
log.Printf("before %s", ctx.Operation)
return nil
}
func (h *LoggingHook) After(ctx json.HookContext, result any, err error) (any, error) {
log.Printf("after %s", ctx.Operation)
return result, err
}
p := json.MustNew()
p.AddHook(&LoggingHook{})
func (*Processor) BatchDeleteOptimized ¶ added in v1.2.0
BatchDeleteOptimized performs multiple Delete operations efficiently
func (*Processor) BatchSetOptimized ¶ added in v1.2.0
BatchSetOptimized performs multiple Set operations efficiently
func (*Processor) Close ¶
Close closes the processor and cleans up resources. This method is idempotent and thread-safe.
func (*Processor) CompactBuffer ¶ added in v1.1.0
CompactBuffer appends to dst the JSON-encoded src with insignificant space characters elided. Compatible with encoding/json.Compact with optional Config support.
func (*Processor) CompactBytes ¶ added in v1.3.0
CompactBytes appends to dst the JSON-encoded src with insignificant space characters elided. This is an alias for CompactBuffer without optional Config, providing encoding/json.Compact compatibility. Use this method when you need the exact encoding/json.Compact signature.
Example:
var buf bytes.Buffer
err := processor.CompactBytes(&buf, []byte(`{"name": "Alice"}`))
func (*Processor) CompactString ¶ added in v1.3.0
CompactString removes whitespace from JSON string. This is an alias for Compact for consistency with the package-level CompactString function.
func (*Processor) CompilePath ¶ added in v1.2.0
func (p *Processor) CompilePath(path string) (*internal.CompiledPath, error)
CompilePath compiles a JSON path string into a CompiledPath for fast repeated operations. The returned CompiledPath can be reused for multiple Get/Set/Delete operations.
func (*Processor) DeleteClean ¶ added in v1.3.0
DeleteClean removes a value from JSON and cleans up null placeholders. This is the unified API for delete-with-cleanup operations.
Example:
result, err := processor.DeleteClean(data, "users[0].profile")
func (*Processor) Encode ¶ added in v1.1.0
Encode converts any Go value to a JSON string. This is a convenience method that matches the package-level Encode signature.
func (*Processor) EncodeBatch ¶
EncodeBatch encodes multiple key-value pairs as a JSON object. This method accepts variadic Config for unified API pattern.
Example:
result, err := processor.EncodeBatch(pairs, json.PrettyConfig())
func (*Processor) EncodeFields ¶
EncodeFields encodes struct fields selectively based on field names. This method accepts variadic Config for unified API pattern.
Example:
result, err := processor.EncodeFields(value, []string{"name", "email"}, json.PrettyConfig())
func (*Processor) EncodePretty ¶ added in v1.1.0
EncodePretty converts any Go value to a pretty-formatted JSON string. This is a convenience method that matches the package-level EncodePretty signature.
func (*Processor) EncodeStream ¶
EncodeStream encodes multiple values as a JSON array stream. This method accepts variadic Config for unified API pattern.
Example:
result, err := processor.EncodeStream(values, json.PrettyConfig())
func (*Processor) EncodeWithConfig ¶
EncodeWithConfig converts any Go value to JSON string with full configuration control. PERFORMANCE: Uses FastEncoder for simple types to avoid reflection overhead.
Example:
// Default configuration
result, err := processor.EncodeWithConfig(data)

// With custom configuration
cfg := json.DefaultConfig()
cfg.Pretty = true
result, err := processor.EncodeWithConfig(data, cfg)

// With preset configuration
result, err := processor.EncodeWithConfig(data, json.PrettyConfig())
func (*Processor) FastDelete ¶ added in v1.2.0
FastDelete is an optimized Delete operation for simple paths
func (*Processor) FastGetMultiple ¶ added in v1.2.0
FastGetMultiple performs multiple Get operations with a single parse
func (*Processor) FastSet ¶ added in v1.2.0
FastSet is an optimized Set operation for simple paths. It uses pooled resources and optimized marshaling.
func (*Processor) Foreach ¶
func (p *Processor) Foreach(jsonStr string, fn func(key any, item *IterableValue))
Foreach iterates over JSON arrays or objects using this processor
func (*Processor) ForeachNested ¶ added in v1.0.9
func (p *Processor) ForeachNested(jsonStr string, fn func(key any, item *IterableValue))
ForeachNested recursively iterates over all nested JSON structures. This method traverses through all nested objects and arrays.
func (*Processor) ForeachReturn ¶
func (p *Processor) ForeachReturn(jsonStr string, fn func(key any, item *IterableValue)) (string, error)
ForeachReturn iterates over JSON arrays or objects and returns the resulting JSON string. This is useful for iteration combined with transformation.
func (*Processor) ForeachWithPath ¶
func (p *Processor) ForeachWithPath(jsonStr, path string, fn func(key any, item *IterableValue)) error
ForeachWithPath iterates over JSON arrays or objects at a specific path using this processor. This allows using custom processor configurations (security limits, nesting depth, etc.).
func (*Processor) ForeachWithPathAndControl ¶ added in v1.0.9
func (p *Processor) ForeachWithPathAndControl(jsonStr, path string, fn func(key any, value any) IteratorControl) error
ForeachWithPathAndControl iterates with control over iteration flow
func (*Processor) ForeachWithPathAndIterator ¶ added in v1.0.9
func (p *Processor) ForeachWithPathAndIterator(jsonStr, path string, fn func(key any, item *IterableValue, currentPath string) IteratorControl) error
ForeachWithPathAndIterator iterates over JSON at a path with path information
func (*Processor) FormatCompact ¶ added in v1.1.0
FormatCompact removes whitespace from JSON string (alias for Compact)
func (*Processor) GetArray ¶ added in v1.1.0
GetArray retrieves an array value from JSON at the specified path
func (*Processor) GetBool ¶ added in v1.1.0
GetBool retrieves a bool value from JSON at the specified path
func (*Processor) GetBoolOr ¶ added in v1.3.0
GetBoolOr retrieves a bool value from JSON at the specified path with a default fallback. Returns defaultValue if: path not found, value is null, or type conversion fails.
func (*Processor) GetCompiled ¶ added in v1.2.0
GetCompiled retrieves a value from JSON using a pre-compiled path. PERFORMANCE: skips path parsing for faster repeated operations.
func (*Processor) GetCompiledExists ¶ added in v1.2.0
func (p *Processor) GetCompiledExists(data any, cp *internal.CompiledPath) bool
GetCompiledExists checks if a path exists in pre-parsed JSON data using a compiled path
func (*Processor) GetCompiledFromParsed ¶ added in v1.2.0
GetCompiledFromParsed retrieves a value from pre-parsed JSON data using a compiled path. PERFORMANCE: no JSON parsing overhead; uses already parsed data.
func (*Processor) GetFloat ¶ added in v1.3.0
GetFloat retrieves a float64 value from JSON at the specified path
func (*Processor) GetFloatOr ¶ added in v1.3.0
GetFloatOr retrieves a float64 value from JSON at the specified path with a default fallback. Returns defaultValue if: path not found, value is null, or type conversion fails.
func (*Processor) GetFromParsed ¶ added in v1.2.0
GetFromParsed retrieves a value from a pre-parsed JSON document at the specified path. This is significantly faster than Get() for repeated queries on the same JSON.
OPTIMIZED: Skips JSON parsing, goes directly to path navigation.
func (*Processor) GetFromParsedData ¶ added in v1.2.0
GetFromParsedData retrieves a value from already-parsed data. It uses the processor's path navigation without re-parsing.
func (*Processor) GetHealthStatus ¶
func (p *Processor) GetHealthStatus() HealthStatus
GetHealthStatus returns the current health status
func (*Processor) GetInt ¶ added in v1.1.0
GetInt retrieves an int value from JSON at the specified path
func (*Processor) GetIntOr ¶ added in v1.3.0
GetIntOr retrieves an int value from JSON at the specified path with a default fallback. Returns defaultValue if: path not found, value is null, or type conversion fails.
func (*Processor) GetMultiple ¶
func (p *Processor) GetMultiple(jsonStr string, paths []string, opts ...Config) (map[string]any, error)
GetMultiple retrieves multiple values from JSON using multiple path expressions
func (*Processor) GetObject ¶ added in v1.1.0
GetObject retrieves an object value from JSON at the specified path
func (*Processor) GetString ¶ added in v1.1.0
GetString retrieves a string value from JSON at the specified path
func (*Processor) GetStringOr ¶ added in v1.3.0
GetStringOr retrieves a string value from JSON at the specified path with a default fallback. Returns defaultValue if: path not found, value is null, or type conversion fails.
func (*Processor) HTMLEscapeBuffer ¶ added in v1.1.0
HTMLEscapeBuffer appends to dst the JSON-encoded src with HTML-safe escaping. Replaces &, <, and > with \u0026, \u003c, and \u003e for safe HTML embedding. Compatible with encoding/json.HTMLEscape with optional Config support.
func (*Processor) IndentBuffer ¶ added in v1.1.0
func (p *Processor) IndentBuffer(dst *bytes.Buffer, src []byte, prefix, indent string, opts ...Config) error
IndentBuffer appends to dst an indented form of the JSON-encoded src. Compatible with encoding/json.Indent with optional Config support.
func (*Processor) IndentBytes ¶ added in v1.3.0
IndentBytes appends to dst an indented form of the JSON-encoded src. This is an alias for IndentBuffer without optional Config, providing encoding/json.Indent compatibility. Use this method when you need the exact encoding/json.Indent signature.
Example:
var buf bytes.Buffer
err := processor.IndentBytes(&buf, []byte(`{"name":"Alice"}`), "", " ")
func (*Processor) LoadFromFile ¶
LoadFromFile loads JSON data from a file and returns the raw JSON string.
func (*Processor) LoadFromFileAsData ¶ added in v1.1.0
LoadFromFileAsData loads JSON data from a file and returns the parsed data structure.
func (*Processor) LoadFromReader ¶
LoadFromReader loads JSON data from an io.Reader and returns the raw JSON string.
func (*Processor) LoadFromReaderAsData ¶ added in v1.1.0
LoadFromReaderAsData loads JSON data from an io.Reader and returns the parsed data structure.
func (*Processor) Marshal ¶
Marshal converts any Go value to JSON bytes (similar to json.Marshal). PERFORMANCE: uses FastEncoder for simple types to avoid reflection overhead.
func (*Processor) MarshalIndent ¶
MarshalIndent converts any Go value to indented JSON bytes (similar to json.MarshalIndent)
func (*Processor) MarshalToFile ¶ added in v1.0.6
MarshalToFile converts data to JSON and saves it to the specified file using Config. This is the unified API that accepts variadic Config.
Example:
err := processor.MarshalToFile("data.json", data, json.PrettyConfig())
func (*Processor) Parse ¶
Parse parses a JSON string into the provided target with improved error handling
func (*Processor) ParseAny ¶ added in v1.3.0
ParseAny parses a JSON string and returns the result as any. This method provides the same behavior as the package-level Parse function. Use Parse when you need to unmarshal into a specific target type.
Example:
data, err := processor.ParseAny(`{"name": "Alice"}`)
func (*Processor) PreParse ¶ added in v1.2.0
func (p *Processor) PreParse(jsonStr string, opts ...Config) (*ParsedJSON, error)
PreParse parses a JSON string and returns a ParsedJSON object that can be reused for multiple Get operations. This is a performance optimization for scenarios where the same JSON is queried multiple times.
OPTIMIZED: Pre-parsing avoids repeated JSON parsing overhead for repeated queries.
Example:
parsed, err := processor.PreParse(jsonStr)
if err != nil { return err }
value1, _ := processor.GetFromParsed(parsed, "path1")
value2, _ := processor.GetFromParsed(parsed, "path2")
func (*Processor) Prettify ¶ added in v1.3.0
Prettify formats JSON string with indentation. This is the recommended method for formatting JSON strings.
func (*Processor) ProcessBatch ¶
func (p *Processor) ProcessBatch(operations []BatchOperation, opts ...Config) ([]BatchResult, error)
ProcessBatch processes multiple operations in a single batch
func (*Processor) SafeGet ¶
func (p *Processor) SafeGet(jsonStr, path string) AccessResult
SafeGet performs a type-safe get operation with comprehensive error handling
func (*Processor) SaveToFile ¶
SaveToFile saves data to a JSON file using Config. This is the unified API that accepts variadic Config.
Example:
err := processor.SaveToFile("data.json", data, json.PrettyConfig())
func (*Processor) SaveToWriter ¶
SaveToWriter saves data to an io.Writer using Config. This is the unified API that accepts variadic Config.
Example:
var buf bytes.Buffer
err := processor.SaveToWriter(&buf, data, json.PrettyConfig())
func (*Processor) Set ¶
Set sets a value in JSON at the specified path. Returns:
- On success: modified JSON string and nil error
- On failure: original unmodified JSON string and error information
func (*Processor) SetCreate ¶ added in v1.3.0
SetCreate sets a value at the specified path, creating intermediate paths as needed. This is the unified API for set-with-path-creation operations.
Example:
result, err := processor.SetCreate(data, "users[0].profile.name", "Alice")
func (*Processor) SetFromParsed ¶ added in v1.2.0
func (p *Processor) SetFromParsed(parsed *ParsedJSON, path string, value any, opts ...Config) (*ParsedJSON, error)
SetFromParsed modifies a pre-parsed JSON document at the specified path. Returns a new ParsedJSON with the modified data (original is not modified).
OPTIMIZED: Skips JSON parsing, works directly on parsed data.
func (*Processor) SetMultiple ¶
func (p *Processor) SetMultiple(jsonStr string, updates map[string]any, opts ...Config) (string, error)
SetMultiple sets multiple values in JSON using a map of path-value pairs. Returns:
- On success: modified JSON string and nil error
- On failure: original unmodified JSON string and error information
func (*Processor) SetMultipleCreate ¶ added in v1.3.0
func (p *Processor) SetMultipleCreate(jsonStr string, updates map[string]any, opts ...Config) (string, error)
SetMultipleCreate sets multiple values, creating intermediate paths as needed. This is the unified API for batch set-with-path-creation operations.
Example:
result, err := processor.SetMultipleCreate(data, map[string]any{"user.name": "Alice", "user.age": 30})
func (*Processor) ToJsonString ¶
ToJsonString converts any Go value to JSON string with HTML escaping (safe for web)
func (*Processor) ToJsonStringPretty ¶
ToJsonStringPretty converts any Go value to pretty JSON string with HTML escaping
func (*Processor) ToJsonStringStandard ¶
ToJsonStringStandard converts any Go value to compact JSON string without HTML escaping
func (*Processor) Unmarshal ¶
Unmarshal parses the JSON-encoded data and stores the result in the value pointed to by v. This method is fully compatible with encoding/json.Unmarshal. PERFORMANCE: Fast path for simple cases to avoid string conversion overhead.
func (*Processor) UnmarshalFromFile ¶ added in v1.0.6
UnmarshalFromFile reads JSON data from the specified file and unmarshals it into the provided value.
func (*Processor) ValidBytes ¶ added in v1.1.0
ValidBytes validates JSON format from a byte slice (matches the encoding/json.Valid signature). This method provides compatibility with the standard library's json.Valid function.
func (*Processor) ValidateSchema ¶
func (p *Processor) ValidateSchema(jsonStr string, schema *Schema, opts ...Config) ([]ValidationError, error)
ValidateSchema validates JSON data against a schema
func (*Processor) WarmupCache ¶
func (p *Processor) WarmupCache(jsonStr string, paths []string, opts ...Config) (*WarmupResult, error)
WarmupCache pre-loads commonly used paths into cache to improve first-access performance
type Result ¶ added in v1.3.0
type Result[T any] struct {
	Value  T     // The result value (exported for backward compatibility)
	Exists bool  // Whether the path exists
	Error  error // Error if any
}
Result represents a type-safe operation result with comprehensive error handling. This is the unified type for all type-safe operations.
Example:
result := json.GetResult[string](data, "user.name")
if result.Ok() {
name := result.Unwrap()
}
// Or with default
name := json.GetResult[string](data, "user.name").UnwrapOr("unknown")
type SamplingReader ¶ added in v1.2.0
type SamplingReader struct {
// contains filtered or unexported fields
}
SamplingReader samples data from large JSON arrays
func NewSamplingReader ¶ added in v1.2.0
func NewSamplingReader(reader io.Reader, sampleSize int) *SamplingReader
NewSamplingReader creates a new sampling reader
func (*SamplingReader) Sample ¶ added in v1.2.0
func (sr *SamplingReader) Sample(fn func(index int, item any) bool) error
Sample reads a sample of items from a JSON array using reservoir sampling. The reservoir sampling algorithm ensures uniform random sampling distribution: each item in the array has an equal probability of being included in the sample.
func (*SamplingReader) TotalRead ¶ added in v1.2.0
func (sr *SamplingReader) TotalRead() int64
TotalRead returns the total number of items read
type Schema ¶
type Schema struct {
Type string `json:"type,omitempty"`
Properties map[string]*Schema `json:"properties,omitempty"`
Items *Schema `json:"items,omitempty"`
Required []string `json:"required,omitempty"`
MinLength int `json:"minLength,omitempty"`
MaxLength int `json:"maxLength,omitempty"`
Minimum float64 `json:"minimum,omitempty"`
Maximum float64 `json:"maximum,omitempty"`
Pattern string `json:"pattern,omitempty"`
Format string `json:"format,omitempty"`
AdditionalProperties bool `json:"additionalProperties,omitempty"`
MinItems int `json:"minItems,omitempty"`
MaxItems int `json:"maxItems,omitempty"`
UniqueItems bool `json:"uniqueItems,omitempty"`
Enum []any `json:"enum,omitempty"`
Const any `json:"const,omitempty"`
MultipleOf float64 `json:"multipleOf,omitempty"`
ExclusiveMinimum bool `json:"exclusiveMinimum,omitempty"`
ExclusiveMaximum bool `json:"exclusiveMaximum,omitempty"`
Title string `json:"title,omitempty"`
Description string `json:"description,omitempty"`
Default any `json:"default,omitempty"`
Examples []any `json:"examples,omitempty"`
// contains filtered or unexported fields
}
Schema represents a JSON schema for validation
func DefaultSchema ¶
func DefaultSchema() *Schema
DefaultSchema returns a default schema configuration. All zero values are omitted for brevity; only non-zero defaults are set.
func NewSchemaWithConfig ¶ added in v1.3.0
func NewSchemaWithConfig(cfg SchemaConfig) *Schema
NewSchemaWithConfig creates a new Schema with the provided configuration. This is the recommended way to create configured Schema instances.
func (*Schema) HasMaxItems ¶
HasMaxItems returns true if MaxItems constraint is explicitly set
func (*Schema) HasMaxLength ¶
HasMaxLength returns true if MaxLength constraint is explicitly set
func (*Schema) HasMaximum ¶
HasMaximum returns true if Maximum constraint is explicitly set
func (*Schema) HasMinItems ¶
HasMinItems returns true if MinItems constraint is explicitly set
func (*Schema) HasMinLength ¶
HasMinLength returns true if MinLength constraint is explicitly set
func (*Schema) HasMinimum ¶
HasMinimum returns true if Minimum constraint is explicitly set
type SchemaConfig ¶ added in v1.3.0
type SchemaConfig struct {
Type string
Properties map[string]*Schema
Items *Schema
Required []string
MinLength *int
MaxLength *int
Minimum *float64
Maximum *float64
Pattern string
Format string
AdditionalProperties *bool
MinItems *int
MaxItems *int
UniqueItems bool
Enum []any
Const any
MultipleOf *float64
ExclusiveMinimum *bool
ExclusiveMaximum *bool
Title string
Description string
Default any
Examples []any
}
SchemaConfig provides configuration options for creating a Schema. This follows the Config pattern as required by the design guidelines.
type SecurityValidator ¶ added in v1.0.6
type SecurityValidator interface {
Validator
// ValidateContent checks raw content for security issues.
// This is called on the raw input before JSON parsing.
ValidateContent(content string) error
}
SecurityValidator checks for dangerous patterns in content. Extends Validator with content-specific security checks.
type Stats ¶
type Stats struct {
CacheSize int64 `json:"cache_size"`
CacheMemory int64 `json:"cache_memory"`
MaxCacheSize int `json:"max_cache_size"`
HitCount int64 `json:"hit_count"`
MissCount int64 `json:"miss_count"`
HitRatio float64 `json:"hit_ratio"`
CacheTTL time.Duration `json:"cache_ttl"`
CacheEnabled bool `json:"cache_enabled"`
IsClosed bool `json:"is_closed"`
MemoryEfficiency float64 `json:"memory_efficiency"`
OperationCount int64 `json:"operation_count"`
ErrorCount int64 `json:"error_count"`
}
Stats provides processor performance statistics
type StreamIterator ¶ added in v1.2.0
type StreamIterator struct {
// contains filtered or unexported fields
}
StreamIterator provides memory-efficient iteration over large JSON arrays. It processes elements one at a time without loading the entire array into memory.
func NewStreamIterator ¶ added in v1.2.0
func NewStreamIterator(reader io.Reader) *StreamIterator
NewStreamIterator creates a stream iterator from a reader with default settings
func NewStreamIteratorWithConfig ¶ added in v1.2.0
func NewStreamIteratorWithConfig(reader io.Reader, config StreamIteratorConfig) *StreamIterator
NewStreamIteratorWithConfig creates a stream iterator with custom configuration. PERFORMANCE: a configurable buffer size improves throughput for large JSON streams.
func (*StreamIterator) Err ¶ added in v1.2.0
func (si *StreamIterator) Err() error
Err returns any error encountered during iteration
func (*StreamIterator) Index ¶ added in v1.2.0
func (si *StreamIterator) Index() int
Index returns the current index
func (*StreamIterator) Next ¶ added in v1.2.0
func (si *StreamIterator) Next() bool
Next advances to the next element. It returns true if there is a next element, false otherwise.
func (*StreamIterator) Value ¶ added in v1.2.0
func (si *StreamIterator) Value() any
Value returns the current element
type StreamIteratorConfig ¶ added in v1.2.0
type StreamIteratorConfig struct {
BufferSize int // Buffer size for underlying reader (default: 32KB)
ReadAhead bool // Enable read-ahead buffering for improved performance
}
StreamIteratorConfig holds configuration options for StreamIterator
type StreamObjectIterator ¶ added in v1.2.0
type StreamObjectIterator struct {
// contains filtered or unexported fields
}
StreamObjectIterator provides memory-efficient iteration over JSON objects
func NewStreamObjectIterator ¶ added in v1.2.0
func NewStreamObjectIterator(reader io.Reader) *StreamObjectIterator
NewStreamObjectIterator creates a stream object iterator from a reader
func (*StreamObjectIterator) Err ¶ added in v1.2.0
func (soi *StreamObjectIterator) Err() error
Err returns any error encountered
func (*StreamObjectIterator) Key ¶ added in v1.2.0
func (soi *StreamObjectIterator) Key() string
Key returns the current key
func (*StreamObjectIterator) Next ¶ added in v1.2.0
func (soi *StreamObjectIterator) Next() bool
Next advances to the next key-value pair
func (*StreamObjectIterator) Value ¶ added in v1.2.0
func (soi *StreamObjectIterator) Value() any
Value returns the current value
type StreamingProcessor ¶ added in v1.2.0
type StreamingProcessor struct {
// contains filtered or unexported fields
}
StreamingProcessor handles large JSON files efficiently
func NewStreamingProcessor ¶ added in v1.2.0
func NewStreamingProcessor(reader io.Reader, bufferSize int) *StreamingProcessor
NewStreamingProcessor creates a streaming processor for large JSON
func (*StreamingProcessor) Close ¶ added in v1.3.0
func (sp *StreamingProcessor) Close() error
Close releases any resources held by the streaming processor. Note: this does NOT close the underlying reader; the caller owns it. Provided for API consistency and future extensibility. PERFORMANCE: returns the processor to a pool for reuse.
func (*StreamingProcessor) GetStats ¶ added in v1.2.0
func (sp *StreamingProcessor) GetStats() StreamingStats
GetStats returns streaming statistics
func (*StreamingProcessor) StreamArray ¶ added in v1.2.0
func (sp *StreamingProcessor) StreamArray(fn func(index int, item any) bool) error
StreamArray streams array elements one at a time. This is memory-efficient for large arrays.
func (*StreamingProcessor) StreamArrayChunked ¶ added in v1.2.0
func (sp *StreamingProcessor) StreamArrayChunked(chunkSize int, fn func([]any) error) error
StreamArrayChunked streams array elements in chunks for memory-efficient processing. The chunkSize parameter controls how many elements are processed at once.
func (*StreamingProcessor) StreamObject ¶ added in v1.2.0
func (sp *StreamingProcessor) StreamObject(fn func(key string, value any) bool) error
StreamObject streams object key-value pairs
func (*StreamingProcessor) StreamObjectChunked ¶ added in v1.2.0
func (sp *StreamingProcessor) StreamObjectChunked(chunkSize int, fn func(map[string]any) error) error
StreamObjectChunked streams object key-value pairs in chunks for memory-efficient processing. The chunkSize parameter controls how many pairs are processed at once.
type StreamingStats ¶ added in v1.2.0
StreamingStats tracks streaming processing statistics
type StructureValidator ¶ added in v1.3.0
type StructureValidator struct {
// contains filtered or unexported fields
}
StructureValidator handles JSON structure and nesting validation. RESPONSIBILITY: Validates JSON syntax, structure, and depth limits.
func NewStructureValidator ¶ added in v1.3.0
func NewStructureValidator(maxNestingDepth int64) *StructureValidator
NewStructureValidator creates a new structure validator.
func (*StructureValidator) ValidateNesting ¶ added in v1.3.0
func (sv *StructureValidator) ValidateNesting(jsonStr string) error
ValidateNesting checks the nesting depth of the JSON.
func (*StructureValidator) ValidateStructure ¶ added in v1.3.0
func (sv *StructureValidator) ValidateStructure(jsonStr string) error
ValidateStructure checks the JSON structure for validity.
type SyntaxError ¶
type SyntaxError struct {
Offset int64 // error occurred after reading Offset bytes
// contains filtered or unexported fields
}
SyntaxError is a description of a JSON syntax error. Unmarshal will return a SyntaxError if the JSON can't be parsed.
func (*SyntaxError) Error ¶
func (e *SyntaxError) Error() string
type TextMarshaler ¶
TextMarshaler is the interface implemented by an object that can marshal itself into a textual form.
MarshalText encodes the receiver into UTF-8-encoded text and returns the result.
type TextUnmarshaler ¶
TextUnmarshaler is the interface implemented by an object that can unmarshal a textual representation of itself.
UnmarshalText must be able to decode the form generated by MarshalText. UnmarshalText must copy the text if it wishes to retain the text after returning.
type Token ¶
type Token any
Token holds a value of one of these types:
Delim, for the four JSON delimiters [ ] { }
bool, for JSON booleans
float64, for JSON numbers
Number, for JSON numbers
string, for JSON string literals
nil, for JSON null
type TypeEncoder ¶ added in v1.3.0
type TypeEncoder interface {
// Encode converts a specific type to its JSON representation.
// Return the JSON string (including quotes for strings) or an error.
Encode(v reflect.Value) (string, error)
}
TypeEncoder handles encoding for specific reflect.Types. Register via Config.CustomTypeEncoders map.
Example:
type TimeEncoder struct{}
func (e *TimeEncoder) Encode(v reflect.Value) (string, error) {
t := v.Interface().(time.Time)
return `"` + t.Format(time.RFC3339) + `"`, nil
}
cfg := json.DefaultConfig()
cfg.CustomTypeEncoders = map[reflect.Type]json.TypeEncoder{
reflect.TypeOf(time.Time{}): &TimeEncoder{},
}
type UnmarshalTypeError ¶
type UnmarshalTypeError struct {
Value string // description of JSON value - "bool", "array", "number -5"
Type reflect.Type // type of Go value it could not be assigned to
Offset int64 // error occurred after reading Offset bytes
Struct string // name of the root type containing the field
Field string // the full path from root node to the value
Err error // may be nil
}
UnmarshalTypeError describes a JSON value that was not appropriate for a value of a specific Go type.
func (*UnmarshalTypeError) Error ¶
func (e *UnmarshalTypeError) Error() string
func (*UnmarshalTypeError) Unwrap ¶
func (e *UnmarshalTypeError) Unwrap() error
type Unmarshaler ¶
Unmarshaler is the interface implemented by types that can unmarshal a JSON description of themselves. The input can be assumed to be a valid encoding of a JSON value. UnmarshalJSON must copy the JSON data if it wishes to retain the data after returning.
By convention, to approximate the behavior of Unmarshal itself, Unmarshalers implement UnmarshalJSON([]byte("null")) as a no-op.
type UnsupportedTypeError ¶
UnsupportedTypeError is returned by Marshal when attempting to encode an unsupported value type.
func (*UnsupportedTypeError) Error ¶
func (e *UnsupportedTypeError) Error() string
type UnsupportedValueError ¶
UnsupportedValueError is returned by Marshal when attempting to encode an unsupported value.
func (*UnsupportedValueError) Error ¶
func (e *UnsupportedValueError) Error() string
type ValidationChain ¶ added in v1.3.0
type ValidationChain []Validator
ValidationChain runs multiple validators in sequence. Stops at the first error encountered.
func (ValidationChain) Validate ¶ added in v1.3.0
func (vc ValidationChain) Validate(jsonStr string) error
Validate executes all validators in order, stopping at first error.
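The stop-at-first-error behavior can be sketched with local stand-ins for the interface (hypothetical names for illustration; the real Validator and ValidationChain live in this package):

```go
package main

import (
	"errors"
	"fmt"
)

// validator is a local mirror of the package's Validator interface.
type validator interface {
	Validate(jsonStr string) error
}

type notEmptyValidator struct{}

func (notEmptyValidator) Validate(s string) error {
	if s == "" {
		return errors.New("empty input")
	}
	return nil
}

type sizeValidator struct{ max int }

func (v sizeValidator) Validate(s string) error {
	if len(s) > v.max {
		return errors.New("too large")
	}
	return nil
}

// chain runs validators in order, stopping at the first error,
// matching ValidationChain's documented behavior.
type chain []validator

func (c chain) Validate(s string) error {
	for _, v := range c {
		if err := v.Validate(s); err != nil {
			return err
		}
	}
	return nil
}

func main() {
	c := chain{notEmptyValidator{}, sizeValidator{max: 10}}
	fmt.Println(c.Validate(`{"a":1}`)) // nil: passes both validators
	fmt.Println(c.Validate(``))        // first validator fails, second never runs
}
```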
type ValidationError ¶
ValidationError represents a schema validation error
func ValidateSchema ¶
func ValidateSchema(jsonStr string, schema *Schema, cfg ...Config) ([]ValidationError, error)
ValidateSchema validates JSON data against a schema
func (*ValidationError) Error ¶
func (ve *ValidationError) Error() string
type Validator ¶ added in v1.0.6
type Validator interface {
// Validate checks JSON string for issues.
// Returns nil if valid, or an error describing the problem.
Validate(jsonStr string) error
}
Validator validates JSON input before processing. Implement this interface to add custom validation logic.
Example:
type SizeValidator struct { MaxSize int64 }
func (v *SizeValidator) Validate(jsonStr string) error {
if int64(len(jsonStr)) > v.MaxSize {
return fmt.Errorf("JSON exceeds max size: %d", v.MaxSize)
}
return nil
}
type WarmupResult ¶
type WarmupResult struct {
TotalPaths int `json:"total_paths"`
Successful int `json:"successful"`
Failed int `json:"failed"`
SuccessRate float64 `json:"success_rate"`
FailedPaths []string `json:"failed_paths,omitempty"`
}
WarmupResult represents the result of a cache warmup operation
func WarmupCache ¶ added in v1.1.0
func WarmupCache(jsonStr string, paths []string, cfg ...Config) (*WarmupResult, error)
WarmupCache pre-warms the cache for frequently accessed paths. This can improve performance for subsequent operations on the same JSON.