json

package module
v1.4.1
Published: May 7, 2026 License: MIT Imports: 29 Imported by: 0

README

cybergodev/json


A high-performance, thread-safe Go JSON processing library with 100% encoding/json compatibility. Powerful path syntax, type safety, streaming processing, production-grade performance.

Chinese documentation | www.cybergo.dev/json


Why cybergodev/json

Feature | encoding/json | cybergodev/json
Path-based access | Manual unmarshal | json.Get(data, "users[0].name")
Negative index | - | items[-1] gets last element
Flatten nested arrays | - | users{flat:tags}
JSON Pointer (RFC 6901) | - | /users/0/name
Type-safe defaults | - | GetString(data, "path", "default")
Streaming large files | - | Built-in streaming processors
Memory pooling | - | sync.Pool for hot paths
Path caching | - | Smart cache with TTL
Batch operations | - | ProcessBatch() for bulk work
100% compatibility | Native | Drop-in replacement (signatures extended with optional config)

Features

  • 100% Compatible - Drop-in replacement for encoding/json, all standard function signatures supported with optional config extension
  • Powerful Paths - Dot notation, array slicing, field extraction, JSON Pointer (RFC 6901)
  • High Performance - Smart caching, memory pooling, optimized hot paths
  • Type Safe - Generics support with GetTyped[T], built-in defaults, AccessResult type conversion
  • Feature Rich - Batch operations, streaming, file I/O, schema validation, deep merge, JSONL
  • Production Ready - Thread-safe, comprehensive error handling, security hardened, health monitoring

Installation

go get github.com/cybergodev/json

Requirements: Go 1.25 or later

Import:

import "github.com/cybergodev/json"

Quick Start

package main

import (
    "fmt"
    "github.com/cybergodev/json"
)

func main() {
    data := `{"user": {"name": "Alice", "age": 28, "tags": ["premium", "verified"]}}`

    // Simple field access (returns value directly, no error)
    name := json.GetString(data, "user.name")
    fmt.Println(name) // "Alice"

    // Type-safe retrieval with generics
    age := json.GetTyped[int](data, "user.age", 0)
    fmt.Println(age) // 28

    // With default value (no panic on missing path)
    email := json.GetTyped[string](data, "user.email", "unknown@example.com")
    fmt.Println(email) // "unknown@example.com"

    // Negative indexing (last element)
    lastTag, _ := json.Get(data, "user.tags[-1]")
    fmt.Println(lastTag) // "verified"

    // Modify data
    updated, _ := json.Set(data, "user.age", 29)
    newAge := json.GetInt(updated, "user.age")
    fmt.Println(newAge) // 29

    // 100% encoding/json compatible
    bytes, _ := json.Marshal(map[string]any{"status": "ok"})
    fmt.Println(string(bytes)) // {"status":"ok"}
}

Path Syntax Reference

Syntax | Description | Example
.property | Access property | user.name -> "Alice"
[n] | Array index | items[0] -> first element
[-n] | Negative index (from end) | items[-1] -> last element
[start:end] | Array slice | items[1:3] -> elements 1-2
[start:end:step] | Slice with step | items[::2] -> every other element
[+] | Append to array | items[+] -> append position
{field} | Extract field from all elements | users{name} -> ["Alice", "Bob"]
{flat:field} | Flatten nested arrays | users{flat:tags} -> merge all tags
/pointer | JSON Pointer (RFC 6901) | /users/0/name -> "Alice"

Core API

Data Retrieval
// Basic getters - return value directly, accept optional default
// When path is missing or type mismatches: returns zero value, or default if provided
json.Get(data, "user.name")            // (any, error)
json.GetString(data, "user.name")      // string
json.GetInt(data, "user.age")          // int
json.GetFloat(data, "user.score")      // float64
json.GetBool(data, "user.active")      // bool
json.GetArray(data, "user.tags")       // []any
json.GetObject(data, "user.profile")   // map[string]any

// Type-safe generic retrieval
json.GetTyped[string](data, "user.name", "default")
json.GetTyped[[]int](data, "numbers", nil)
json.GetTyped[User](data, "user", User{})  // custom struct

// Typed getters with defaults
json.GetString(data, "user.name", "Anonymous")
json.GetInt(data, "user.age", 0)
json.GetBool(data, "user.active", false)
json.GetFloat(data, "user.score", 0.0)
json.GetTyped[[]any](data, "user.tags", []any{})

// Safe access with result type and type conversion
result := json.SafeGet(data, "user.age")
if result.Ok() {
    age, _ := result.AsInt()
    fmt.Println(age)
}

// Batch retrieval
results, err := json.GetMultiple(data, []string{"user.name", "user.age"})

Data Modification
// Basic set - returns modified JSON on success, original data on failure
result, err := json.Set(data, "user.name", "Bob")

// Auto-create paths with config
cfg := json.DefaultConfig()
cfg.CreatePaths = true
result, err := json.Set(data, "user.profile.level", "gold", cfg)

// Append to array
result, _ := json.Set(data, "user.tags[+]", "new-tag")

// Batch set
result, _ := json.SetMultiple(data, map[string]any{
    "user.name": "Bob",
    "user.age":  30,
})

// Delete
result, err := json.Delete(data, "user.temp")

// Delete with null cleanup
result, err = json.DeleteClean(data, "user.temp")

// Set with auto-create (creates intermediate paths)
result, err = json.SetCreate(data, "user.profile.level", "gold")

// Batch set with auto-create
result, err = json.SetMultipleCreate(data, map[string]any{
    "user.profile.level": "gold",
    "user.profile.badge": "star",
})

Encoding and Formatting
// Standard encoding (100% compatible)
bytes, _ := json.Marshal(data)
json.Unmarshal(bytes, &target)
bytes, _ := json.MarshalIndent(data, "", "  ")

// Quick formatting
pretty, _ := json.Prettify(jsonStr)       // pretty print

// Compact/minify using encoding/json compatible buffer API
var buf bytes.Buffer
json.Compact(&buf, []byte(jsonStr))
compact := buf.String()

// Encoding with config
cfg := json.DefaultConfig()
cfg.Pretty = true
cfg.SortKeys = true
result, _ := json.Encode(data, cfg)

// Preset configs
result, _ := json.Encode(data, json.PrettyConfig())

File Operations
// Load and save (package-level functions)
jsonStr, _ := json.LoadFromFile("data.json")
json.SaveToFile("output.json", data, json.PrettyConfig())

// Struct/Map serialization
json.MarshalToFile("user.json", user)
json.UnmarshalFromFile("user.json", &user)

// Write to any io.Writer
json.SaveToWriter(writer, data, cfg)

// File iteration (process without loading entire file)
err := json.ForeachFile("data.json", func(key any, item *json.IterableValue) error {
    fmt.Println(item.GetString("name"))
    return nil
})

err = json.ForeachFileChunked("large.json", 100, func(chunk []*json.IterableValue) error {
    // Process 100 items at a time
    return nil
})

// Processor-based file operations with full config support
processor, _ := json.New(json.DefaultConfig())
defer processor.Close()
jsonStr, _ = processor.LoadFromFile("data.json")
_ = processor.SaveToFile("output.json", data, json.PrettyConfig())

JSON Utilities
// Compare two JSON strings
equal, _  := json.CompareJSON(json1, json2)

// Union merge (default) - combines all keys
merged, _ := json.MergeJSON(json1, json2)

// Intersection merge - only common keys
cfg := json.DefaultConfig()
cfg.MergeMode = json.MergeIntersection
merged, _ = json.MergeJSON(json1, json2, cfg)

// Difference merge - keys in json1 but not json2
cfg.MergeMode = json.MergeDifference
merged, _ = json.MergeJSON(json1, json2, cfg)

// Merge multiple JSON objects
merged, _ = json.MergeMany([]string{json1, json2, json3})

Configuration

Custom Configuration
cfg := json.DefaultConfig()     // start from safe defaults
cfg.EnableCache = true
cfg.MaxCacheSize = 256
cfg.CacheTTL = 5 * time.Minute
cfg.MaxConcurrency = 50
cfg.CreatePaths = true          // auto-create paths on Set
cfg.CleanupNulls = true         // cleanup nulls after Delete

processor, err := json.New(cfg)
if err != nil {
    // handle configuration error
}
defer processor.Close()

// Use processor methods
result, _ := processor.Get(jsonStr, "user.name")
stats := processor.GetStats()
health := processor.GetHealthStatus()
processor.ClearCache()

Preset Configurations
cfg := json.DefaultConfig()   // balanced defaults
cfg := json.SecurityConfig()  // for untrusted input
cfg := json.PrettyConfig()    // for pretty output

Advanced Features

Iteration
// Basic iteration
json.Foreach(data, func(key any, item *json.IterableValue) {
    name := item.GetString("name")
    fmt.Printf("Key: %v, Name: %s\n", key, name)
})

// With path
json.ForeachWithPath(data, "users", func(key any, item *json.IterableValue) {
    name := item.GetString("name")
    fmt.Printf("Key: %v, Name: %s\n", key, name)
})

// Nested iteration (specify nested field path)
json.ForeachNested(data, func(key any, item *json.IterableValue) {
    item.ForeachNested("items", func(nestedKey any, nestedItem *json.IterableValue) {
        fmt.Printf("Nested: %v\n", nestedItem.Get("id"))
    })
})

// Error-returning iteration (use ForeachFile for file-based error callbacks)
err := json.ForeachWithPath(data, "users", func(key any, item *json.IterableValue) {
    if item.IsNull("id") {
        log.Printf("warning: missing id at key %v", key)
    }
})

// Iterator control (break / continue)
_ = json.ForeachWithPathAndControl(data, "users", func(key any, value any) json.IteratorControl {
    if key.(int) > 5 {
        return json.IteratorBreak
    }
    return json.IteratorNormal
})

// Iterate and return original JSON string
result, err := json.ForeachReturn(data, func(key any, item *json.IterableValue) {
    // iterate over all elements; original JSON string is returned unchanged
})

Batch Operations
data := `{"user": {"name": "Alice", "age": 28, "temp": "value"}}`

operations := []json.BatchOperation{
    {Type: "get", JSONStr: data, Path: "user.name", ID: "op1"},
    {Type: "set", JSONStr: data, Path: "user.age", Value: 25},
    {Type: "delete", JSONStr: data, Path: "user.temp"},
}
results, err := json.ProcessBatch(operations)

Schema Validation
schema := &json.Schema{
    Type:     "object",
    Required: []string{"name", "email"},
    Properties: map[string]*json.Schema{
        "name":  {Type: "string", MinLength: 1, MaxLength: 100},
        "email": {Type: "string", Format: "email"},
        "age":   {Type: "integer", Minimum: 0, Maximum: 150},
    },
}

validationErrors, err := json.ValidateSchema(jsonStr, schema)

The Schema type supports JSON Schema Draft 7 subset fields: Type, Properties, Required, Items, Pattern, AdditionalProperties, Enum, Const, Format, MinLength, MaxLength, Minimum, Maximum, ExclusiveMinimum, ExclusiveMaximum, MultipleOf, MinItems, MaxItems, UniqueItems, Title, Description, Default.


PreParse and CompiledPath
processor, _ := json.New(json.DefaultConfig())
defer processor.Close()

// Pre-parse once, query many times (avoids re-parsing)
parsed, _ := processor.PreParse(jsonStr)
name, _    := processor.GetFromParsed(parsed, "user.name")
age, _     := processor.GetFromParsed(parsed, "user.age")
updated, _ := processor.SetFromParsed(parsed, "user.age", 30)

// Compile a path for fast repeated access
compiled, _ := processor.CompilePath("user.profile.settings.theme")
value, _    := processor.GetCompiled(jsonStr, compiled)

Encode Utilities
// EncodeStream - encode slice as JSON array
streamJSON, _ := json.EncodeStream(users, json.PrettyConfig())

// EncodeBatch - encode key-value pairs as JSON object
batchJSON, _ := json.EncodeBatch(pairs, cfg)

// EncodeFields - encode only specific fields (filter sensitive data)
fieldsJSON, _ := json.EncodeFields(user, []string{"id", "name", "email"}, cfg)

JSONL Processing
// Convert between JSON array and JSONL
jsonlData, _    := json.ToJSONL(records)          // []any -> JSONL bytes
jsonlString, _  := json.ToJSONLString(records)    // []any -> JSONL string
records, _      := json.ParseJSONL(jsonlData)     // JSONL bytes -> []any

// Stream JSONL from reader
processor, _ := json.New(json.DefaultConfig())
defer processor.Close()
err := processor.StreamJSONL(reader, func(lineNum int, item *json.IterableValue) error {
    fmt.Printf("Line %d: %s\n", lineNum, item.GetString("id"))
    return nil
})

// JSONL writer
writer := json.NewJSONLWriter(bufWriter)
writer.Write(record1)
writer.Write(record2)
writer.WriteAll(records)
stats := writer.Stats() // LinesProcessed, BytesWritten

// NDJSON file processor
ndjson := json.NewNDJSONProcessor(json.DefaultConfig())
err = ndjson.ProcessFile("data.ndjson", func(lineNum int, obj map[string]any) error {
    fmt.Printf("Line %d: %v\n", lineNum, obj["id"])
    return nil
})

// JSONL filter, map, reduce
filtered, _ := processor.FilterJSONL(reader, func(item *json.IterableValue) bool {
    return item.GetBool("active")
})
mapped, _   := processor.MapJSONL(reader, func(lineNum int, item *json.IterableValue) (any, error) {
    return map[string]any{"id": item.GetString("id")}, nil
})
result, _   := processor.ReduceJSONL(reader, 0, func(acc any, item *json.IterableValue) any {
    count := item.GetInt("count")
    return acc.(int) + count
})

Streaming Iterators
// Stream large JSON arrays without loading into memory
reader := strings.NewReader(largeJSONArray)
iter := json.NewStreamIterator(reader)
for iter.Next() {
    item := iter.Value()
    // process item
}

// Stream JSON objects (key-value pairs)
objIter := json.NewStreamObjectIterator(reader)
for objIter.Next() {
    key, value := objIter.Key(), objIter.Value()
}

// Batch processing with in-memory data
batchIter := json.NewBatchIterator(items, json.DefaultConfig())
for batchIter.HasNext() {
    batch := batchIter.NextBatch()
}

Parallel Processing
// ParallelIterator - parallel processing with worker pool
items := []any{"a", "b", "c", "d"}
iter := json.NewParallelIterator(items)

// Process items in parallel
err := iter.ForEach(func(idx int, item any) error {
    // process each item concurrently
    return nil
})

// Process in batches
err = iter.ForEachBatch(2, func(batchIdx int, batch []any) error {
    // process batch of items
    return nil
})

// Parallel JSONL streaming
processor, _ := json.New(json.DefaultConfig())
defer processor.Close()
err = processor.StreamJSONLParallel(reader, 4, func(lineNum int, item *json.IterableValue) error {
    // Process item with 4 parallel workers
    return nil
})

Hooks
processor, _ := json.New(json.DefaultConfig())
defer processor.Close()

// Logging hook - takes any type with Info(string, ...any) method
processor.AddHook(json.LoggingHook(slog.Default()))

// Timing hook - takes an interface with Record(op string, duration time.Duration)
type MetricsRecorder struct{}
func (m *MetricsRecorder) Record(op string, d time.Duration) { /* record */ }
processor.AddHook(json.TimingHook(&MetricsRecorder{}))

// Validation hook - takes func(jsonStr, path string) error
processor.AddHook(json.ValidationHook(func(jsonStr, path string) error {
    if len(path) > 100 {
        return fmt.Errorf("path too long: %s", path)
    }
    return nil
}))

// Error hook - takes func(ctx json.HookContext, err error) error
processor.AddHook(json.ErrorHook(func(ctx json.HookContext, err error) error {
    log.Printf("operation %s on path %s failed: %v", ctx.Operation, ctx.Path, err)
    return err // return original or transformed error
}))

// Custom hook using HookFunc
processor.AddHook(json.HookFunc{
    BeforeFn: func(ctx json.HookContext) error {
        fmt.Printf("before: %s %s\n", ctx.Operation, ctx.Path)
        return nil
    },
    AfterFn: func(ctx json.HookContext, result any, err error) (any, error) {
        fmt.Printf("after: %s (err=%v)\n", ctx.Operation, err)
        return result, err
    },
})

Common Use Cases


API Response Processing
apiResponse := `{
    "status": "success",
    "data": {
        "users": [{"id": 1, "name": "Alice", "permissions": ["read", "write"]}],
        "pagination": {"total": 25, "page": 1}
    }
}`

// Quick extraction
status := json.GetString(apiResponse, "status")
total  := json.GetInt(apiResponse, "data.pagination.total")

// Extract all user names
names, _ := json.Get(apiResponse, "data.users{name}")
// Result: ["Alice"]

// Flatten all permissions
permissions, _ := json.Get(apiResponse, "data.users{flat:permissions}")
// Result: ["read", "write"]

// JSON Pointer access
name, _ := json.Get(apiResponse, "/data/users/0/name")
// Result: "Alice"

Configuration Management
config := `{
    "database": {"host": "localhost", "port": 5432},
    "cache": {"enabled": true}
}`

// Type-safe with defaults
dbHost        := json.GetString(config, "database.host", "localhost")
dbPort        := json.GetInt(config, "database.port", 5432)
cacheEnabled  := json.GetBool(config, "cache.enabled", false)

// Dynamic update
updated, _ := json.SetMultiple(config, map[string]any{
    "database.host": "prod-db.example.com",
    "cache.ttl":     3600,
})

Multi-Source Data Merge
// Merge configs from multiple sources
defaults := `{"timeout": 30, "retries": 3, "debug": false}`
file     := `{"timeout": 60, "debug": true}`
env      := `{"retries": 5}`

// Union merge (default behavior)
merged, _ := json.MergeMany([]string{defaults, file, env})
// Result: {"timeout": 60, "retries": 5, "debug": true}

Performance Monitoring

// Package-level monitoring
stats := json.GetStats()
fmt.Printf("Operations: %d\n", stats.OperationCount)
fmt.Printf("Cache Hit Rate: %.2f%%\n", stats.HitRatio*100)

health := json.GetHealthStatus()
fmt.Printf("Health Status: %v\n", health.Healthy)

// Cache management
json.ClearCache()

// Cache warmup - preload paths for faster access
paths := []string{"user.name", "user.age", "user.profile"}
result, _ := json.WarmupCache(jsonStr, paths)

// Processor-level monitoring
processor, _ := json.New(json.DefaultConfig())
defer processor.Close()
stats := processor.GetStats()
health := processor.GetHealthStatus()
processor.ClearCache()

Migrating from encoding/json

Simply change the import:

// Before
import "encoding/json"

// After
import "github.com/cybergodev/json"

All standard functions are fully compatible:

  • json.Marshal() / json.Unmarshal()
  • json.MarshalIndent()
  • json.NewEncoder() / json.NewDecoder()
  • json.Valid()
  • json.Compact() / json.Indent() / json.HTMLEscape()

Compatible types: Encoder, Decoder, Number, Token, Delim, SyntaxError, UnmarshalTypeError, InvalidUnmarshalError, UnsupportedTypeError, UnsupportedValueError, MarshalerError.

Note: RawMessage is not currently re-exported. Use encoding/json.RawMessage if needed.

See Compatibility Guide for full details.


Security Configuration

// For handling untrusted JSON input
secureConfig := json.SecurityConfig()
// Features:
// - Full security scanning enabled
// - Conservative size limits (max 10MB)
// - Strict mode validation
// - Prototype pollution protection
// - Path traversal protection

processor, _ := json.New(secureConfig)
defer processor.Close()

Security Utilities
// Register custom dangerous patterns
json.RegisterDangerousPattern(json.DangerousPattern{
    Pattern: "eval\\(",
    Name:    "eval-injection",
    Level:   json.PatternLevelCritical,
})

// List all registered patterns
patterns := json.ListDangerousPatterns()

// Safe error reporting (no internal details leaked)
safeMsg := json.SafeError(err)

// Redact sensitive paths in logs
redacted := json.RedactedPath("user.password")

Error Handling
// Sentinel errors for programmatic checks
errors.Is(err, json.ErrInvalidJSON)
errors.Is(err, json.ErrPathNotFound)
errors.Is(err, json.ErrTypeMismatch)
errors.Is(err, json.ErrSizeLimit)
errors.Is(err, json.ErrSecurityViolation)

// Structured error with context
var jsonErr *json.JsonsError
if errors.As(err, &jsonErr) {
    fmt.Printf("op=%s path=%s: %s\n", jsonErr.Op, jsonErr.Path, jsonErr.Message)
}

See Security Guide for detailed security best practices.


Example Code

File | Description
1_basic_usage.go | Core operations
2_advanced_features.go | Complex paths, nested extraction
3_production_ready.go | Thread-safe patterns
4_error_handling.go | Error handling patterns
5_encoding_options.go | Encoding configuration
6_validation.go | Schema validation
7_type_conversion.go | Type conversion
8_helper_functions.go | Helper utilities
9_iterator_functions.go | Iteration patterns
10_file_operations.go | File I/O
11_with_defaults.go | Default value handling
12_advanced_delete.go | Delete operations
13_batch_operations.go | Batch processing and caching
14_streaming_iterators.go | Streaming iterators
15_jsonl_processing.go | JSONL format processing
16_hooks_and_security.go | Hooks and security patterns
17_advanced_patterns.go | PreParse, CompiledPath, advanced patterns

# Run individual examples (build tag required)
go run -tags=example examples/1_basic_usage.go
go run -tags=example examples/2_advanced_features.go


License

MIT License - See LICENSE file for details.

Documentation

Overview

Package json provides a high-performance, thread-safe JSON processing library with 100% encoding/json compatibility and advanced path operations.

The package uses an internal package for implementation details:

  • internal: Private implementation including path parsing, navigation, extraction, caching, array utilities, security helpers, and encoding utilities

Most users can simply import the root package:

import "github.com/cybergodev/json"

Basic Usage

Simple operations (100% compatible with encoding/json):

data, err := json.Marshal(value)
err = json.Unmarshal(data, &target)

Advanced path operations:

value, err := json.Get(`{"user":{"name":"John"}}`, "user.name")
result, err := json.Set(`{"user":{}}`, "user.age", 30)

Type-safe operations:

name := json.GetString(jsonStr, "user.name", "")
age := json.GetInt(jsonStr, "user.age", 0)

Advanced processor for complex operations:

processor, err := json.New() // Use default config
if err != nil {
    // handle error
}
defer processor.Close()
value, err := processor.Get(jsonStr, "complex.path[0].field")

Configuration

Use DefaultConfig and optional parameters for custom configuration:

cfg := json.DefaultConfig()
cfg.EnableCache = true
processor, err := json.New(cfg)
if err != nil {
    // handle error
}
defer processor.Close()

Key Features

  • 100% encoding/json compatibility - drop-in replacement
  • High-performance path operations with smart caching
  • Thread-safe concurrent operations
  • Type-safe generic operations with Go generics
  • Memory-efficient resource pooling
  • Production-ready error handling and validation

Package Structure

The package is organized with all public API in the root package:

  • Core types: Processor, Config
  • Error types: JsonsError, various error constructors
  • Encoding types: Number

Implementation details are in the internal/ package:

  • Path parsing and navigation utilities
  • Extraction and segment handling
  • Cache and array utilities
  • Security and encoding helpers

Core Types Organization

Core types are organized in the following files:

  • types.go: All type definitions (Config, Stats, Schema, Result[T], etc.)
  • processor.go: Processor struct, constructor, lifecycle, typed getters
  • operation.go: Set/Delete implementations, array operations, fast paths
  • recursive.go: Unified recursive processing, segment handlers
  • path.go: Path parsing and navigation
  • encoding.go: JSON encoding/decoding, streaming, schema validation
  • api.go: Package-level API functions and delegation helpers
  • file.go: File operations and NDJSON processing
  • iterator.go: Iteration utilities and streaming iterators
  • security.go: Security validation and dangerous pattern detection
  • helpers.go: Type conversion, deep copy, merge, compare utilities
  • errors.go: Error types, sentinels, and classification
  • interfaces.go: Extension interfaces (hooks, custom encoders, validators)
  • processor_streamjsonl.go: JSONL streaming operations
  • performance.go: Path cache warmup, decoder pool, bulk operations

Package json provides extension interfaces for customizing JSON processing behavior. These interfaces enable users to inject custom validation, encoding, and middleware logic.


Constants

const (

	// DefaultMaxJSONSize is the maximum allowed JSON input size (100MB).
	// JSON documents larger than this will be rejected with ErrSizeLimit.
	// Adjust via Config.MaxJSONSize if processing larger documents.
	DefaultMaxJSONSize = 100 * 1024 * 1024

	// DefaultMaxNestingDepth is the maximum allowed JSON nesting depth (200).
	// Deeply nested JSON may indicate malicious input.
	// Adjust via Config.MaxNestingDepthSecurity if needed.
	DefaultMaxNestingDepth = 200

	// DefaultMaxPathDepth is the maximum allowed path depth (50).
	// Limits the number of segments in a path like "a.b.c.d...".
	// Adjust via Config.MaxPathDepth if needed.
	DefaultMaxPathDepth = 50

	// DefaultMaxConcurrency is the maximum concurrent operations (50).
	// Limits parallel operations to prevent resource exhaustion.
	// Adjust via Config.MaxConcurrency for high-throughput scenarios.
	DefaultMaxConcurrency = 50

	// DefaultMaxSecuritySize is the maximum size for security validation (10MB).
	// JSON documents larger than this use sampling-based security checks.
	DefaultMaxSecuritySize = 10 * 1024 * 1024

	// DefaultMaxObjectKeys is the maximum number of keys in an object (100000).
	// Objects with more keys will be rejected to prevent memory exhaustion.
	DefaultMaxObjectKeys = 100000

	// DefaultMaxArrayElements is the maximum number of elements in an array (100000).
	// Arrays with more elements will be rejected to prevent memory exhaustion.
	DefaultMaxArrayElements = 100000

	// DefaultMaxBatchSize is the maximum number of operations in a batch (2000).
	// Larger batches will be rejected to prevent memory exhaustion.
	DefaultMaxBatchSize = 2000

	// DefaultParallelThreshold is the threshold for parallel processing (10).
	// Operations below this count use sequential processing.
	DefaultParallelThreshold = 10

	// DefaultCacheTTL is the default cache entry time-to-live (5 minutes).
	// Cached values expire after this duration.
	// Adjust via Config.CacheTTL if needed.
	DefaultCacheTTL = 5 * time.Minute
)

Configuration constants with optimized defaults for production workloads.
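
The "Adjust via Config.X" notes above can be applied together when the defaults don't fit a workload. A hedged sketch of a trusted bulk-import setup; the field names follow the constant comments, and the specific values are illustrative:

```go
// Raise limits for a trusted bulk-import job.
cfg := json.DefaultConfig()
cfg.MaxJSONSize = 500 * 1024 * 1024   // accept documents larger than the 100MB default
cfg.MaxNestingDepthSecurity = 300     // allow deeper nesting than the 200 default
cfg.MaxConcurrency = 100              // more parallel operations than the default 50
cfg.CacheTTL = 10 * time.Minute       // keep cache entries longer than the 5-minute default

processor, err := json.New(cfg)
if err != nil {
    // handle configuration error
}
defer processor.Close()
```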

const (
	// MergeUnion performs union merge - combines all keys/elements (default)
	// For objects: all keys from both, conflicts resolved by override value
	// For arrays: all elements from both with deduplication
	MergeUnion = internal.MergeUnion

	// MergeIntersection performs intersection merge - only common keys/elements
	// For objects: only keys present in both, values from override
	// For arrays: only elements present in both arrays
	MergeIntersection = internal.MergeIntersection

	// MergeDifference performs difference merge - keys/elements only in base
	// For objects: keys in base but not in override
	// For arrays: elements in base but not in override
	MergeDifference = internal.MergeDifference
)

Merge mode constants - re-exported from internal package for public API

Variables

var (
	// ErrInvalidJSON indicates that the input is not valid JSON.
	// This error is returned when JSON parsing fails due to syntax errors,
	// malformed structures, or invalid UTF-8 encoding.
	ErrInvalidJSON = errors.New("invalid JSON format")

	// ErrPathNotFound indicates that the specified path does not exist
	// in the JSON structure. This can occur when accessing nested keys
	// that don't exist or using array indices out of bounds.
	ErrPathNotFound = errors.New("path not found")

	// ErrTypeMismatch indicates that the value at the path is not of the expected type.
	// For example, trying to get a string when the value is a number.
	ErrTypeMismatch = errors.New("type mismatch")

	// ErrInvalidPath indicates that the path syntax is invalid.
	// Paths should use format: "key.subkey" or "array[0]".
	ErrInvalidPath = errors.New("invalid path format")

	// ErrProcessorClosed indicates that the processor has been closed
	// and cannot accept new operations. Create a new processor with New().
	ErrProcessorClosed = errors.New("processor is closed")

	// ErrSizeLimit indicates that the JSON size exceeds the configured limit.
	// Increase MaxJSONSize in Config to handle larger inputs.
	ErrSizeLimit = errors.New("size limit exceeded")

	// ErrDepthLimit indicates that the JSON nesting depth exceeds the configured limit.
	// Increase MaxNestingDepth in Config for deeply nested structures.
	ErrDepthLimit = errors.New("depth limit exceeded")

	// ErrConcurrencyLimit indicates that the concurrent operation count exceeds the limit.
	// Increase MaxConcurrency in Config for high-concurrency scenarios.
	ErrConcurrencyLimit = errors.New("concurrency limit exceeded")

	// ErrSecurityViolation indicates that potentially dangerous content was detected.
	// This includes prototype pollution patterns and other security risks.
	ErrSecurityViolation = errors.New("security violation detected")

	// ErrUnsupportedPath indicates that the path operation is not supported.
	// This may occur with invalid path segments or operations.
	ErrUnsupportedPath = errors.New("unsupported path operation")

	// ErrOperationTimeout indicates that an operation exceeded its timeout duration.
	// Consider increasing timeout or optimizing the operation.
	ErrOperationTimeout = errors.New("operation timeout")

	// ErrResourceExhausted indicates that system resources are exhausted.
	// This may indicate a memory leak or excessive resource usage.
	ErrResourceExhausted = errors.New("system resources exhausted")
)

Primary errors for common cases.

Functions

func ClearCache added in v1.1.0

func ClearCache()

ClearCache clears the processor's internal cache.

func Compact

func Compact(dst *bytes.Buffer, src []byte, cfg ...Config) error

Compact appends to dst the JSON-encoded src with insignificant space characters elided. This function is 100% compatible with encoding/json.Compact. Accepts optional Config to control compact behavior (e.g., number preservation).

Example:

// encoding/json compatible usage (no Config)
var buf bytes.Buffer
err := json.Compact(&buf, []byte(`{"name": "Alice"}`))

// With configuration
cfg := json.DefaultConfig()
cfg.PreserveNumbers = true
err = json.Compact(&buf, []byte(jsonStr), cfg)

func CompareJSON added in v1.3.0

func CompareJSON(json1, json2 string) (bool, error)

CompareJSON compares two JSON strings for equality by parsing and normalizing them. This function handles numeric precision differences and key ordering.

Example:

equal, err := json.CompareJSON(`{"a":1}`, `{"a":1.0}`)
// equal == true

func Delete

func Delete(jsonStr, path string, cfg ...Config) (string, error)

Delete deletes a value from JSON at the specified path.

Errors:

  • ErrInvalidJSON: jsonStr is not valid JSON
  • ErrPathNotFound: path does not exist
  • ErrInvalidPath: path syntax is invalid

func DeleteClean added in v1.4.0

func DeleteClean(jsonStr, path string, cfg ...Config) (string, error)

DeleteClean deletes a value from JSON and cleans up null values and empty arrays. This is equivalent to calling Delete with CleanupNulls and CompactArrays enabled.

Example:

result, err := json.DeleteClean(data, "users[0].profile")

func Encode

func Encode(value any, cfg ...Config) (string, error)

Encode converts any Go value to JSON string. For configuration options, use EncodeWithConfig.
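
Example (illustrative data):

jsonStr, err := json.Encode(map[string]any{"name": "Alice", "age": 30})
// jsonStr holds the compact JSON encoding of the map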

func EncodeBatch added in v1.1.0

func EncodeBatch(pairs map[string]any, cfg ...Config) (string, error)

EncodeBatch encodes multiple key-value pairs as a JSON object. This is the unified API that replaces EncodeBatchWithOpts.

Example:

result, err := json.EncodeBatch(map[string]any{"name": "Alice", "age": 30})

func EncodeFields added in v1.1.0

func EncodeFields(value any, fields []string, cfg ...Config) (string, error)

EncodeFields encodes specific fields from a struct or map. This is the unified API that replaces EncodeFieldsWithOpts.

Example:

result, err := json.EncodeFields(user, []string{"name", "email"})

func EncodePretty

func EncodePretty(value any, cfg ...Config) (string, error)

EncodePretty converts any Go value to pretty-printed JSON string. This is the package-level equivalent of Processor.EncodePretty().

Example:

result, err := json.EncodePretty(data)

// With custom configuration
cfg := json.DefaultConfig()
cfg.Indent = "    "
result, err := json.EncodePretty(data, cfg)

func EncodeStream added in v1.1.0

func EncodeStream(values any, cfg ...Config) (string, error)

EncodeStream encodes multiple values as a JSON array. This is the unified API that replaces EncodeStreamWithOpts.

Example:

result, err := json.EncodeStream([]any{1, 2, 3}, json.PrettyConfig())

func EncodeWithConfig added in v1.3.0

func EncodeWithConfig(value any, cfg ...Config) (string, error)

EncodeWithConfig converts any Go value to JSON string using the unified Config. This is the recommended way to encode JSON with configuration.

Example:

// Default configuration
result, err := json.EncodeWithConfig(data)

// Pretty output
result, err := json.EncodeWithConfig(data, json.PrettyConfig())

// Security-focused output
result, err := json.EncodeWithConfig(data, json.SecurityConfig())

// Custom configuration
cfg := json.DefaultConfig()
cfg.Pretty = true
cfg.SortKeys = true
result, err := json.EncodeWithConfig(data, cfg)

func Foreach

func Foreach(jsonStr string, fn func(key any, item *IterableValue), cfg ...Config)

Foreach iterates over JSON arrays or objects with a simplified signature (for test compatibility). Accepts optional Config for consistency with Processor.Foreach.
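
Example (a sketch; data is illustrative):

json.Foreach(`[{"name":"Alice"},{"name":"Bob"}]`, func(key any, item *json.IterableValue) {
    fmt.Println(key, item.GetString("name"))
})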

func ForeachFile added in v1.4.0

func ForeachFile(filePath string, fn func(key any, item *IterableValue) error) error

ForeachFile iterates over JSON arrays or objects directly from a file. The callback returns an error to signal iteration control:

  • nil: continue iteration
  • item.Break(): stop iteration without error
  • other error: stop iteration and return the error

Example:

err := json.ForeachFile("data.json", func(key any, item *json.IterableValue) error {
    fmt.Println(item.GetString("name"))
    return nil // continue
})

func ForeachFileChunked added in v1.4.0

func ForeachFileChunked(filePath string, chunkSize int, fn func(chunk []*IterableValue) error) error

ForeachFileChunked iterates over JSON arrays from a file in chunks (batches). This is useful for batch processing large datasets.

Example:

err := json.ForeachFileChunked("data.json", 100, func(chunk []*json.IterableValue) error {
    // Process batch of 100 items
    for _, item := range chunk {
        fmt.Println(item.GetString("name"))
    }
    return nil
})

func ForeachFileNested added in v1.4.0

func ForeachFileNested(filePath string, fn func(key any, item *IterableValue) error) error

ForeachFileNested recursively iterates over all nested JSON structures from a file.

Example:

err := json.ForeachFileNested("data.json", func(key any, item *json.IterableValue) error {
    fmt.Printf("Key: %v, Type: %T\n", key, item.Value)
    return nil
})

func ForeachFileWithPath added in v1.4.0

func ForeachFileWithPath(filePath, path string, fn func(key any, item *IterableValue) error) error

ForeachFileWithPath iterates over JSON arrays or objects at a specific path from a file.

Example:

err := json.ForeachFileWithPath("data.json", ".users", func(key any, item *json.IterableValue) error {
    fmt.Println(item.GetString("name"))
    return nil
})

func ForeachJSONL added in v1.4.0

func ForeachJSONL(reader io.Reader, fn func(lineNum int, item *IterableValue) error) error

ForeachJSONL iterates over JSONL data with IterableValue callback.

Example:

err := json.ForeachJSONL(reader, func(lineNum int, item *json.IterableValue) error {
	fmt.Printf("Line: %d, Value: %v\n", lineNum, item.GetData())
	return nil
})

func ForeachNested

func ForeachNested(jsonStr string, fn func(key any, item *IterableValue), cfg ...Config)

ForeachNested iterates over nested JSON structures. Accepts optional Config for consistency with Processor.ForeachNested.

func ForeachNestedWithError added in v1.4.0

func ForeachNestedWithError(jsonStr string, fn func(key any, item *IterableValue) error) error

ForeachNestedWithError recursively iterates over all nested JSON structures with error-returning callback.

Example:

err := json.ForeachNestedWithError(jsonStr, func(key any, item *json.IterableValue) error {
    fmt.Printf("Key: %v\n", key)
    return nil
})

func ForeachReturn

func ForeachReturn(jsonStr string, fn func(key any, item *IterableValue), cfg ...Config) (string, error)

ForeachReturn is a variant that returns an error (for compatibility with test expectations). Accepts optional Config for consistency with Processor.ForeachReturn.

func ForeachWithError added in v1.4.0

func ForeachWithError(jsonStr, path string, fn func(key any, item *IterableValue) error) error

ForeachWithError iterates over JSON arrays or objects with error-returning callback. The callback returns an error to signal iteration control:

  • nil: continue iteration
  • item.Break(): stop iteration without error
  • other error: stop iteration and return the error

Example:

err := json.ForeachWithError(jsonStr, ".", func(key any, item *json.IterableValue) error {
    if item.GetInt("id") == targetId {
        return item.Break() // stop iteration
    }
    return nil // continue
})

func ForeachWithPath

func ForeachWithPath(jsonStr, path string, fn func(key any, item *IterableValue), cfg ...Config) error

ForeachWithPath iterates over JSON arrays or objects with a simplified signature (for test compatibility). Accepts optional Config for consistency with Processor.ForeachWithPath.

func ForeachWithPathAndControl added in v1.0.8

func ForeachWithPathAndControl(jsonStr, path string, fn func(key any, value any) IteratorControl, cfg ...Config) error

ForeachWithPathAndControl iterates over JSON arrays or objects and applies a function. This is the 3-parameter version used by most code. Accepts optional Config for consistency with Processor.ForeachWithPathAndControl.
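
Example (a sketch; path and data are illustrative):

err := json.ForeachWithPathAndControl(jsonStr, ".users", func(key any, value any) json.IteratorControl {
    fmt.Printf("Key: %v, Value: %v\n", key, value)
    return json.IteratorNormal // continue
})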

func ForeachWithPathAndIterator added in v1.1.0

func ForeachWithPathAndIterator(jsonStr, path string, fn func(key any, item *IterableValue, currentPath string) IteratorControl) error

ForeachWithPathAndIterator iterates over JSON at a path with path information in the callback. The callback receives the current path and returns IteratorControl to control iteration flow.

Example:

err := json.ForeachWithPathAndIterator(jsonStr, ".users", func(key any, item *json.IterableValue, currentPath string) json.IteratorControl {
    fmt.Printf("Path: %s, Key: %v\n", currentPath, key)
    return json.IteratorNormal // continue
})

func Get

func Get(jsonStr, path string, cfg ...Config) (any, error)

Get retrieves a value from JSON at the specified path. Returns the value as any and requires type assertion.

Errors:

  • ErrInvalidJSON: jsonStr is not valid JSON
  • ErrPathNotFound: path does not exist in the JSON structure
  • ErrProcessorClosed: processor has been closed

Example:

value, err := json.Get(`{"user":{"name":"Alice"}}`, "user.name")
if err != nil {
    // Handle error
}
name := value.(string)

func GetArray

func GetArray(jsonStr, path string, defaultValue ...[]any) []any

GetArray retrieves an array value from JSON at the specified path. Returns defaultValue if provided, otherwise nil when: path not found, value is null, or type conversion fails.

func GetBool

func GetBool(jsonStr, path string, defaultValue ...bool) bool

GetBool retrieves a bool value from JSON at the specified path. Returns defaultValue if provided, otherwise false when: path not found, value is null, or type conversion fails.

func GetFloat added in v1.3.0

func GetFloat(jsonStr, path string, defaultValue ...float64) float64

GetFloat retrieves a float64 value from JSON at the specified path. Returns defaultValue if provided, otherwise 0.0 when: path not found, value is null, or type conversion fails.

func GetInt

func GetInt(jsonStr, path string, defaultValue ...int) int

GetInt retrieves an int value from JSON at the specified path. Returns defaultValue if provided, otherwise 0 when: path not found, value is null, or type conversion fails.

func GetMultiple

func GetMultiple(jsonStr string, paths []string, cfg ...Config) (map[string]any, error)

GetMultiple retrieves multiple values from JSON at the specified paths. Returns a map of path to value for each successfully retrieved path.

Errors:

  • ErrInvalidJSON: jsonStr is not valid JSON
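
Example (illustrative paths):

values, err := json.GetMultiple(`{"user":{"name":"Alice","age":28}}`, []string{"user.name", "user.age"})
// values maps each resolved path to its value, e.g. values["user.name"] == "Alice"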

func GetObject

func GetObject(jsonStr, path string, defaultValue ...map[string]any) map[string]any

GetObject retrieves an object value from JSON at the specified path. Returns defaultValue if provided, otherwise nil when: path not found, value is null, or type conversion fails.

func GetString

func GetString(jsonStr, path string, defaultValue ...string) string

GetString retrieves a string value from JSON at the specified path. Returns defaultValue if provided, otherwise "" when: path not found, value is null, or type conversion fails.
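
The typed getters above share the same default-value pattern. A combined sketch (data is illustrative):

data := `{"user":{"name":"Alice","age":28,"active":true}}`
name := json.GetString(data, "user.name")            // "Alice"
age := json.GetInt(data, "user.age")                 // 28
active := json.GetBool(data, "user.active")          // true
city := json.GetString(data, "user.city", "unknown") // "unknown" (default)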

func GetTyped

func GetTyped[T any](jsonStr, path string, defaultValue ...T) T

GetTyped retrieves a typed value from JSON at the specified path. Returns defaultValue if provided, otherwise zero value of T when: path not found, value is null, or type conversion fails.

Example:

name := json.GetTyped[string](data, "user.name", "unknown")
age := json.GetTyped[int](data, "user.age", 0)
name = json.GetTyped[string](data, "user.name") // returns "" if not found

func GetWithContext added in v1.4.1

func GetWithContext(ctx context.Context, jsonStr, path string, cfg ...Config) (any, error)

GetWithContext retrieves a value from JSON with context support for cancellation. This is the context-aware version of Get() that respects context cancellation and timeout deadlines.

Example:

ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
defer cancel()
value, err := json.GetWithContext(ctx, `{"user":{"name":"Alice"}}`, "user.name")

func HTMLEscape

func HTMLEscape(dst *bytes.Buffer, src []byte, cfg ...Config)

HTMLEscape appends to dst the JSON-encoded src with <, >, &, U+2028, and U+2029 characters escaped. This function is 100% compatible with encoding/json.HTMLEscape. Accepts optional Config for consistent API pattern.

Example:

var buf bytes.Buffer
json.HTMLEscape(&buf, []byte(`{"url":"<script>alert(1)</script>"}`))

func Indent

func Indent(dst *bytes.Buffer, src []byte, prefix, indent string, cfg ...Config) error

Indent appends to dst an indented form of the JSON-encoded src. This function is 100% compatible with encoding/json.Indent. Accepts optional Config for controlling indentation behavior.

Example:

var buf bytes.Buffer
err := json.Indent(&buf, []byte(`{"name":"Alice"}`), "", "  ")

func LoadFromFile

func LoadFromFile(filePath string, cfg ...Config) (string, error)

LoadFromFile loads JSON data from a file with optional configuration. It uses the default processor and supports Config options such as security validation.

func LoadFromReader added in v1.4.0

func LoadFromReader(reader io.Reader, cfg ...Config) (string, error)

LoadFromReader loads JSON data from an io.Reader with size limiting. The reader is limited to MaxJSONSize to prevent excessive memory usage.

Example:

file, _ := os.Open("data.json")
defer file.Close()
jsonStr, err := json.LoadFromReader(file)

func MapJSONL added in v1.4.0

func MapJSONL(reader io.Reader, fn func(lineNum int, item *IterableValue) (any, error)) ([]any, error)

MapJSONL maps JSONL data into a new format using a mapping function.

Example:

result, err := json.MapJSONL(reader, func(lineNum int, item *json.IterableValue) (any, error) {
	return map[string]any{
		"name": item.GetString("name"),
		"age":  item.GetInt("age"),
	}, nil
})

func Marshal

func Marshal(value any) ([]byte, error)

Marshal returns the JSON encoding of value. This function is 100% compatible with encoding/json.Marshal. For configuration options, use EncodeWithConfig or Processor.Marshal with cfg parameter.
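
Example (standard encoding/json semantics):

b, err := json.Marshal(map[string]int{"age": 30})
// b contains `{"age":30}`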

func MarshalIndent

func MarshalIndent(v any, prefix, indent string) ([]byte, error)

MarshalIndent is like Marshal but applies indentation to format the output. This function is 100% compatible with encoding/json.MarshalIndent. For configuration options, use EncodeWithConfig or Processor.MarshalIndent with cfg parameter.

func MarshalToFile added in v1.0.6

func MarshalToFile(filePath string, data any, cfg ...Config) error

MarshalToFile marshals data to JSON and writes to a file. This is the unified API that replaces MarshalToFileWithOpts.

Example:

err := json.MarshalToFile("data.json", myStruct, json.PrettyConfig())

func MergeJSON added in v1.3.0

func MergeJSON(json1, json2 string, cfg ...Config) (string, error)

MergeJSON merges two JSON objects using deep merge strategy. For nested objects, it recursively merges keys according to Config.MergeMode. For primitive values and arrays, the value from json2 takes precedence.

Merge modes (Config.MergeMode, defaults to MergeUnion):

  • MergeUnion: combines all keys from both objects (default)
  • MergeIntersection: only keys present in both objects
  • MergeDifference: keys in json1 but not in json2

Example:

// Union merge (default)
result, err := json.MergeJSON(a, b)

// Intersection merge
cfg := json.DefaultConfig()
cfg.MergeMode = json.MergeIntersection
result, err := json.MergeJSON(a, b, cfg)

// Difference merge
cfg.MergeMode = json.MergeDifference
result, err := json.MergeJSON(a, b, cfg)

func MergeMany added in v1.4.0

func MergeMany(jsons []string, cfg ...Config) (string, error)

MergeMany merges multiple JSON objects using the unified Config pattern. Uses Config.MergeMode to determine the merge strategy (default: MergeUnion). Returns an error if fewer than two JSON strings are provided.

Example:

// Union merge (default)
result, err := json.MergeMany([]string{config1, config2, config3})

// Intersection merge
cfg := json.DefaultConfig()
cfg.MergeMode = json.MergeIntersection
result, err := json.MergeMany([]string{config1, config2, config3}, cfg)

func Parse added in v1.3.0

func Parse(jsonStr string, target any, cfg ...Config) error

Parse parses a JSON string into the target variable. This is the unified package-level method matching Processor.Parse().

target must be a non-nil pointer. For parsing to any, use ParseAny() instead.

Example:

// Parse into a map
var obj map[string]any
err := json.Parse(jsonStr, &obj)

// Parse into a struct
var user User
err := json.Parse(jsonStr, &user)

// With configuration
cfg := json.DefaultConfig()
cfg.PreserveNumbers = true
err := json.Parse(jsonStr, &data, cfg)

func ParseAny added in v1.4.0

func ParseAny(jsonStr string, cfg ...Config) (any, error)

ParseAny parses a JSON string and returns the root value as any. This is the unified name matching Processor.ParseAny().

For unmarshaling into a specific target type, use Parse() instead.

Example:

// Parse to any (uses default processor)
data, err := json.ParseAny(jsonStr)

// With configuration (uses config-cached processor)
cfg := json.SecurityConfig()
data, err := json.ParseAny(jsonStr, cfg)

func ParseJSONL added in v1.2.0

func ParseJSONL(data []byte, cfg ...Config) ([]any, error)

ParseJSONL parses JSONL data from a byte slice. Uses Config.JSONLSkipComments and Config.JSONLContinueOnErr for processing options.

Example:

// Basic usage
data, err := json.ParseJSONL(jsonlBytes)

// With custom config
cfg := json.DefaultConfig()
cfg.JSONLSkipComments = true
cfg.JSONLContinueOnErr = true
data, err := json.ParseJSONL(jsonlBytes, cfg)

func Prettify added in v1.3.0

func Prettify(jsonStr string, cfg ...Config) (string, error)

Prettify formats JSON string with pretty indentation. This is the recommended function for formatting JSON strings.

Example:

pretty, err := json.Prettify(`{"name":"Alice","age":30}`)
// Output:
// {
//   "name": "Alice",
//   "age": 30
// }

func RedactedPath added in v1.4.0

func RedactedPath(path string) string

RedactedPath returns a redacted version of a path for safe inclusion in error messages. SECURITY: Prevents path content from leaking into logs or error responses.

func ReduceJSONL added in v1.4.0

func ReduceJSONL(reader io.Reader, initial any, fn func(acc any, item *IterableValue) any) (any, error)

ReduceJSONL reduces JSONL data to a single aggregated result using a reducer function.

Example:

totalAge, err := json.ReduceJSONL(reader, 0, func(acc any, item *json.IterableValue) any {
	return acc.(int) + item.GetInt("age")
})

func RegisterDangerousPattern added in v1.3.0

func RegisterDangerousPattern(pattern DangerousPattern)

RegisterDangerousPattern adds a pattern to the global registry. Patterns registered here are checked in addition to default patterns.

Example:

json.RegisterDangerousPattern(json.DangerousPattern{
    Pattern: "malicious_keyword",
    Name:    "Custom dangerous pattern",
    Level:   json.PatternLevelCritical,
})

func SafeError added in v1.4.0

func SafeError(err error) string

SafeError returns a client-safe error message that omits internal details. Use this when returning errors to external clients (HTTP responses, APIs). The full Error() output may include path names and internal structure that should not be exposed to clients (CWE-209).

Example:

result, err := processor.Get(jsonStr, "users.admin.password")
if err != nil {
    // Do not expose: "JSON get failed at path 'users.admin.password': ..."
    // Instead return: "path not found"
    http.Error(w, json.SafeError(err), http.StatusBadRequest)
}

func SaveToFile

func SaveToFile(filePath string, data any, cfg ...Config) error

SaveToFile saves JSON data to a file with optional configuration. This is the unified API that replaces SaveToFileWithOpts.

Example:

// Simple save
err := json.SaveToFile("data.json", data)

// With pretty printing
cfg := json.PrettyConfig()
err := json.SaveToFile("data.json", data, cfg)

func SaveToWriter added in v1.3.0

func SaveToWriter(writer io.Writer, data any, cfg ...Config) error

SaveToWriter writes JSON data to an io.Writer. This is the unified API that replaces SaveToWriterWithOpts.

Example:

var buf bytes.Buffer
err := json.SaveToWriter(&buf, data, json.PrettyConfig())

func Set

func Set(jsonStr, path string, value any, cfg ...Config) (string, error)

Set sets a value in JSON at the specified path. Creates intermediate paths if Config.CreatePaths is true.

Returns:

  • On success: modified JSON string and nil error
  • On failure: original unmodified JSON string and error information

Errors:

  • ErrInvalidJSON: jsonStr is not valid JSON
  • ErrInvalidPath: path syntax is invalid
  • ErrPathNotFound: path does not exist and CreatePaths is false
  • ErrTypeMismatch: cannot set value at path due to type conflict
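
Example (a sketch; data and path are illustrative):

result, err := json.Set(`{"user":{"name":"Alice"}}`, "user.name", "Bob")
// on success, user.name in result is "Bob"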

func SetCreate added in v1.4.0

func SetCreate(jsonStr, path string, value any, cfg ...Config) (string, error)

SetCreate sets a value in JSON at the specified path, creating intermediate paths as needed. This is equivalent to calling Set with CreatePaths enabled.

Example:

result, err := json.SetCreate(data, "users[0].profile.name", "Alice")

func SetGlobalProcessor

func SetGlobalProcessor(processor *Processor)

SetGlobalProcessor sets a custom global processor for package-level operations. The processor is used by all package-level functions (Get, Set, Delete, Marshal, etc.). Passing nil is a no-op. The previous processor is closed before being replaced. This function is thread-safe.

Example:

cfg := json.DefaultConfig()
cfg.EnableCache = true
processor, _ := json.New(cfg)
json.SetGlobalProcessor(processor)

// Now all package-level operations use the custom processor
data, err := json.Get(`{"name":"Alice"}`, "name")

func SetMultiple

func SetMultiple(jsonStr string, updates map[string]any, cfg ...Config) (string, error)

SetMultiple sets multiple values using a map of path-value pairs. Creates intermediate paths if Config.CreatePaths is true.

Errors:

  • ErrInvalidJSON: jsonStr is not valid JSON
  • ErrInvalidPath: any path syntax is invalid
  • ErrPathNotFound: path does not exist and CreatePaths is false
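
Example (illustrative paths and values):

result, err := json.SetMultiple(`{"user":{"name":"Alice","age":28}}`, map[string]any{
    "user.name": "Bob",
    "user.age":  30,
})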

func SetMultipleCreate added in v1.4.0

func SetMultipleCreate(jsonStr string, updates map[string]any, cfg ...Config) (string, error)

SetMultipleCreate sets multiple values, creating intermediate paths as needed. This is equivalent to calling SetMultiple with CreatePaths enabled.

Example:

result, err := json.SetMultipleCreate(data, map[string]any{"user.name": "Alice", "user.age": 30})

func ShutdownGlobalProcessor

func ShutdownGlobalProcessor()

ShutdownGlobalProcessor closes and removes the global processor. After calling this, package-level operations will create a new default processor on first use. Call this for clean shutdown in long-running services. This function is thread-safe.

Example:

// At application shutdown
json.ShutdownGlobalProcessor()

func StreamJSONL added in v1.4.0

func StreamJSONL(reader io.Reader, fn func(lineNum int, item *IterableValue) error) error

StreamJSONL streams JSONL data from a reader with IterableValue callback support.

Example:

err := json.StreamJSONL(reader, func(lineNum int, item *json.IterableValue) error {
	name := item.GetString("name")
	fmt.Printf("Line %d: name=%s\n", lineNum, name)
	return nil // continue processing
})

func StreamJSONLChunked added in v1.4.0

func StreamJSONLChunked(reader io.Reader, chunkSize int, fn func(chunk []*IterableValue) error) error

StreamJSONLChunked processes JSONL data in chunks for memory-efficient processing.

Example:

err := json.StreamJSONLChunked(reader, 1000, func(chunk []*json.IterableValue) error {
	// Process chunk of 1000 items
	return nil
})

func StreamJSONLFile added in v1.4.0

func StreamJSONLFile(filename string, fn func(lineNum int, item *IterableValue) error) error

StreamJSONLFile streams JSONL data from a file with IterableValue callback.

Example:

err := json.StreamJSONLFile("data.jsonl", func(lineNum int, item *json.IterableValue) error {
	fmt.Printf("Line %d: %v\n", lineNum, item.GetData())
	return nil
})

func StreamJSONLParallel added in v1.4.0

func StreamJSONLParallel(reader io.Reader, workers int, fn func(lineNum int, item *IterableValue) error) error

StreamJSONLParallel processes JSONL data in parallel with multiple workers.

Example:

err := json.StreamJSONLParallel(reader, 4, func(lineNum int, item *json.IterableValue) error {
	// Process each item in parallel
	return nil
})

func StreamJSONLParallelWithContext added in v1.4.0

func StreamJSONLParallelWithContext(ctx context.Context, reader io.Reader, workers int, fn func(lineNum int, item *IterableValue) error) error

StreamJSONLParallelWithContext processes JSONL data in parallel with context support for cancellation. See Processor.StreamJSONLParallelWithContext for details.

func StreamLinesInto added in v1.2.0

func StreamLinesInto[T any](reader io.Reader, fn func(lineNum int, data T) error, cfg ...Config) ([]T, error)

StreamLinesInto processes JSONL data into a slice of typed values. Uses generics for type-safe processing. The optional cfg parameter allows customization using the unified Config pattern.

Example:

// Default settings
results, err := json.StreamLinesInto[MyType](reader, func(lineNum int, data MyType) error {
    fmt.Printf("Line %d: %+v\n", lineNum, data)
    return nil
})

// With custom configuration
cfg := json.DefaultConfig()
cfg.JSONLSkipEmpty = false
cfg.JSONLSkipComments = true
results, err := json.StreamLinesInto[MyType](reader, processFunc, cfg)

func ToJSONL added in v1.2.0

func ToJSONL(data []any, cfg ...Config) ([]byte, error)

ToJSONL converts a slice of values to JSONL format. Uses Config.EscapeHTML for encoding options.

Example:

// Basic usage
jsonl, err := json.ToJSONL([]any{map[string]any{"id": 1}, map[string]any{"id": 2}})

// With custom config
cfg := json.DefaultConfig()
cfg.EscapeHTML = true
jsonl, err := json.ToJSONL(data, cfg)

func ToJSONLString added in v1.2.0

func ToJSONLString(data []any, cfg ...Config) (string, error)

ToJSONLString converts a slice of values to JSONL format string. Uses Config.EscapeHTML and Config.Pretty for encoding options.

Example:

// Basic usage
jsonlStr, err := json.ToJSONLString(data)

// With custom config
cfg := json.DefaultConfig()
cfg.EscapeHTML = true
jsonlStr, err := json.ToJSONLString(data, cfg)

func Unmarshal

func Unmarshal(data []byte, value any) error

Unmarshal parses the JSON-encoded data and stores the result in value. This function is 100% compatible with encoding/json.Unmarshal. For configuration options, use Processor.Unmarshal with cfg parameter.
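
Example (standard encoding/json semantics):

var user struct {
    Name string `json:"name"`
}
err := json.Unmarshal([]byte(`{"name":"Alice"}`), &user)
// user.Name == "Alice"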

func UnmarshalFromFile added in v1.0.6

func UnmarshalFromFile(filePath string, v any, cfg ...Config) error

UnmarshalFromFile reads JSON from a file and unmarshals it into v. This is a convenience function that combines file reading and unmarshalling. Uses the default processor for security validation and decoding.

Parameters:

  • filePath: file path to read JSON from
  • v: pointer to the target variable where JSON will be unmarshaled
  • cfg: optional Config for security validation and processing

Returns error if file reading fails or JSON cannot be unmarshaled.

func UnregisterDangerousPattern added in v1.3.0

func UnregisterDangerousPattern(pattern string)

UnregisterDangerousPattern removes a pattern from the global registry.

func Valid

func Valid(data []byte) bool

Valid reports whether data is valid JSON. This function is 100% compatible with encoding/json.Valid. Delegates to Processor.ValidBytes for consistent []byte → bool behavior.
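
Example (standard encoding/json semantics):

json.Valid([]byte(`{"name":"Alice"}`)) // true
json.Valid([]byte(`{invalid`))         // false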

func ValidWithConfig added in v1.4.0

func ValidWithConfig(jsonStr string, cfg ...Config) (bool, error)

ValidWithConfig reports whether the JSON string is valid with configuration. Returns both the validation result and any error that occurred during validation. This is the unified API for validation with configuration.

Example:

cfg := json.SecurityConfig()
valid, err := json.ValidWithConfig(jsonStr, cfg)

Types

type AccessResult added in v1.3.0

type AccessResult struct {
	Value  any    // The result value (exported for backward compatibility)
	Exists bool   // Whether the path exists
	Type   string // Runtime type info (for debugging)
}

AccessResult represents the result of a dynamic access operation. It extends Result[any] with type conversion methods for safe type handling.

Example:

result := processor.SafeGet(data, "user.age")
age, err := result.AsInt()
name, err := result.AsString()

func SafeGet added in v1.4.0

func SafeGet(jsonStr, path string, cfg ...Config) AccessResult

SafeGet performs a type-safe get operation returning an AccessResult with type conversion methods (AsString, AsInt, AsFloat64, AsBool). Accepts optional Config for controlling validation, security, and caching behavior.

Example:

result := json.SafeGet(data, "user.age")
if result.Ok() {
    age, _ := result.AsInt()
}

func (AccessResult) AsBool added in v1.3.0

func (r AccessResult) AsBool() (bool, error)

AsBool safely converts the result to bool. Unlike ConvertToBool, this method is stricter and only accepts bool and string types. Use ConvertToBool directly if you need more permissive conversion (e.g., int to bool).

func (AccessResult) AsFloat64 added in v1.3.0

func (r AccessResult) AsFloat64() (float64, error)

AsFloat64 safely converts the result to float64 with precision checks. Unlike convertToFloat64, this method is stricter and does NOT convert bool to float64. Use convertToFloat64 directly if you need more permissive conversion.

func (AccessResult) AsInt added in v1.3.0

func (r AccessResult) AsInt() (int, error)

AsInt safely converts the result to int with overflow and precision checks. Unlike convertToInt, this method is stricter and does NOT convert bool to int. Use convertToInt directly if you need more permissive conversion.

func (AccessResult) AsString added in v1.3.0

func (r AccessResult) AsString() (string, error)

AsString safely converts the result to string. Returns ErrTypeMismatch if the value is not a string type. Use AsStringConverted() for explicit type conversion with formatting.

func (AccessResult) AsStringConverted added in v1.3.0

func (r AccessResult) AsStringConverted() (string, error)

AsStringConverted converts the result to string using fmt.Sprintf formatting. Use this when you explicitly want string representation of any type. For strict type checking, use AsString() instead.

func (AccessResult) Ok added in v1.3.0

func (r AccessResult) Ok() bool

Ok returns true if the value exists.

func (AccessResult) Unwrap added in v1.3.0

func (r AccessResult) Unwrap() any

Unwrap returns the value or nil if it doesn't exist.

func (AccessResult) UnwrapOr added in v1.3.0

func (r AccessResult) UnwrapOr(defaultValue any) any

UnwrapOr returns the value or the provided default if it doesn't exist.

type BatchIterator added in v1.2.0

type BatchIterator struct {
	// contains filtered or unexported fields
}

BatchIterator processes arrays in batches for efficient bulk operations.

func NewBatchIterator added in v1.2.0

func NewBatchIterator(data []any, cfg ...Config) *BatchIterator

NewBatchIterator creates a new batch iterator. The optional cfg parameter allows customization using the unified Config pattern. When config is provided, cfg.MaxBatchSize is used as the batch size.

Example:

// Default settings (batch size = 100)
iter := json.NewBatchIterator(data)

// With custom batch size
cfg := json.DefaultConfig()
cfg.MaxBatchSize = 50
iter := json.NewBatchIterator(data, cfg)

// Legacy pattern (backward compatible)
iter := json.NewBatchIteratorWithSize(data, 50)

func (*BatchIterator) CurrentIndex added in v1.2.0

func (it *BatchIterator) CurrentIndex() int

CurrentIndex returns the current position in the array.

func (*BatchIterator) HasNext added in v1.2.0

func (it *BatchIterator) HasNext() bool

HasNext returns true if there are more batches to process.

func (*BatchIterator) NextBatch added in v1.2.0

func (it *BatchIterator) NextBatch() []any

NextBatch returns the next batch of elements. Returns nil when no more batches are available.

func (*BatchIterator) Remaining added in v1.2.0

func (it *BatchIterator) Remaining() int

Remaining returns the number of remaining elements.

func (*BatchIterator) Reset added in v1.2.0

func (it *BatchIterator) Reset()

Reset resets the iterator to the beginning.

func (*BatchIterator) TotalBatches added in v1.2.0

func (it *BatchIterator) TotalBatches() int

TotalBatches returns the total number of batches. Returns 0 if batch size is not positive.
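
A typical iteration loop (a sketch; data is any []any slice):

iter := json.NewBatchIterator(data)
for iter.HasNext() {
    batch := iter.NextBatch()
    // process batch ([]any)
}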

type BatchOperation

type BatchOperation struct {
	Type    string `json:"type"`
	JSONStr string `json:"json_str"`
	Path    string `json:"path"`
	Value   any    `json:"value"`
	ID      string `json:"id"`
}

BatchOperation represents a single operation in a batch.

type BatchResult

type BatchResult struct {
	ID     string `json:"id"`
	Result any    `json:"result"`
	Error  error  `json:"error"`
}

BatchResult represents the result of a batch operation.

func ProcessBatch added in v1.1.0

func ProcessBatch(operations []BatchOperation, cfg ...Config) ([]BatchResult, error)

ProcessBatch processes multiple JSON operations in a single batch. This is more efficient than processing each operation individually.

type CheckResult

type CheckResult struct {
	Healthy bool   `json:"healthy"`
	Message string `json:"message"`
}

CheckResult represents the result of a single health check

type CompiledPath added in v1.4.0

type CompiledPath = internal.CompiledPath

CompiledPath represents a pre-parsed JSON path for fast repeated operations. Use Processor.CompilePath() to create instances.

Example:

cp, err := processor.CompilePath("users[0].name")
defer cp.Release()
value, err := processor.GetCompiled(jsonStr, cp)

type Config

type Config struct {
	// ===== Cache Settings =====
	MaxCacheSize int           `json:"max_cache_size"`
	CacheTTL     time.Duration `json:"cache_ttl"`
	EnableCache  bool          `json:"enable_cache"`
	CacheResults bool          `json:"cache_results"` // Per-operation caching

	// ===== Size Limits =====
	MaxJSONSize  int64 `json:"max_json_size"`
	MaxPathDepth int   `json:"max_path_depth"`
	MaxBatchSize int   `json:"max_batch_size"`

	// ===== Security Limits =====
	MaxNestingDepthSecurity   int   `json:"max_nesting_depth"`
	MaxSecurityValidationSize int64 `json:"max_security_validation_size"`
	MaxObjectKeys             int   `json:"max_object_keys"`
	MaxArrayElements          int   `json:"max_array_elements"`
	// FullSecurityScan enables full (non-sampling) security validation for all JSON input.
	//
	// When false (default): Large JSON (>4KB) uses optimized scanning with:
	//   - Rolling window scan (32KB windows) over the entire JSON content
	//   - Suspicious character density sampling (4KB beginning, middle, and end regions)
	//   - Critical patterns (__proto__, constructor, prototype) always fully scanned
	//   - High suspicious density triggers automatic full scan
	//
	// When true: All JSON is fully scanned regardless of size.
	//
	// SECURITY RECOMMENDATION: Enable FullSecurityScan when:
	//   - Processing untrusted input from external sources
	//   - Handling sensitive data (authentication, financial, personal)
	//   - Building public-facing APIs or web services
	//   - Compliance requirements mandate full content inspection
	//
	// PERFORMANCE NOTE: Full scanning adds ~10-30% overhead for JSON >100KB.
	// For trusted internal services with large JSON payloads, sampling mode is acceptable.
	FullSecurityScan bool `json:"full_security_scan"`

	// ===== Concurrency =====
	MaxConcurrency    int `json:"max_concurrency"`
	ParallelThreshold int `json:"parallel_threshold"`

	// ===== Processing Options =====
	EnableValidation bool `json:"enable_validation"`
	StrictMode       bool `json:"strict_mode"`
	CreatePaths      bool `json:"create_paths"`
	CleanupNulls     bool `json:"cleanup_nulls"`
	CompactArrays    bool `json:"compact_arrays"`
	ContinueOnError  bool `json:"continue_on_error"` // Continue on batch errors

	// ===== Input/Output Options =====
	AllowComments    bool `json:"allow_comments"`
	PreserveNumbers  bool `json:"preserve_numbers"`
	ValidateInput    bool `json:"validate_input"`
	ValidateFilePath bool `json:"validate_file_path"`
	SkipValidation   bool `json:"skip_validation"` // Skip validation for trusted input

	// ===== Encoding Options =====
	Pretty          bool            `json:"pretty"`
	Indent          string          `json:"indent"`
	Prefix          string          `json:"prefix"`
	EscapeHTML      bool            `json:"escape_html"`
	SortKeys        bool            `json:"sort_keys"`
	ValidateUTF8    bool            `json:"validate_utf8"`
	MaxDepth        int             `json:"max_depth"`
	DisallowUnknown bool            `json:"disallow_unknown"`
	FloatPrecision  int             `json:"float_precision"`
	FloatTruncate   bool            `json:"float_truncate"`
	DisableEscaping bool            `json:"disable_escaping"`
	EscapeUnicode   bool            `json:"escape_unicode"`
	EscapeSlash     bool            `json:"escape_slash"`
	EscapeNewlines  bool            `json:"escape_newlines"`
	EscapeTabs      bool            `json:"escape_tabs"`
	IncludeNulls    bool            `json:"include_nulls"`
	CustomEscapes   map[rune]string `json:"custom_escapes,omitempty"`

	// ===== Observability =====
	EnableMetrics     bool `json:"enable_metrics"`
	EnableHealthCheck bool `json:"enable_health_check"`

	// ===== Large File Processing =====
	// ChunkSize is the size of each chunk when processing large files.
	// Default: 1MB (1024 * 1024 bytes)
	ChunkSize int64 `json:"chunk_size"`

	// MaxMemory is the maximum memory to use for large file processing.
	// Default: 100MB (100 * 1024 * 1024 bytes)
	MaxMemory int64 `json:"max_memory"`

	// BufferSize is the buffer size for reading large files.
	// Default: 64KB (64 * 1024 bytes)
	BufferSize int `json:"buffer_size"`

	// SamplingEnabled enables sampling for very large files.
	// When true, only a subset of data is validated for security.
	// Default: true
	SamplingEnabled bool `json:"sampling_enabled"`

	// SampleSize is the number of samples to take when sampling is enabled.
	// Default: 1000
	SampleSize int `json:"sample_size"`

	// JSONLBufferSize is the buffer size for reading JSONL files.
	// Default: 64KB (64 * 1024 bytes)
	JSONLBufferSize int `json:"jsonl_buffer_size"`

	// JSONLMaxLineSize is the maximum allowed line size for JSONL files.
	// Default: 1MB (1024 * 1024 bytes)
	JSONLMaxLineSize int `json:"jsonl_max_line_size"`

	// JSONLSkipEmpty skips empty lines when processing JSONL files.
	// Default: true
	JSONLSkipEmpty bool `json:"jsonl_skip_empty"`

	// JSONLSkipComments skips lines starting with # or //.
	// Default: false
	JSONLSkipComments bool `json:"jsonl_skip_comments"`

	// JSONLContinueOnErr continues processing on parse errors.
	// Default: false
	JSONLContinueOnErr bool `json:"jsonl_continue_on_err"`

	// JSONLWorkers is the number of parallel workers for JSONL processing.
	// Default: 4
	JSONLWorkers int `json:"jsonl_workers"`

	// JSONLChunkSize is the chunk size for batched JSONL processing.
	// Default: 1000
	JSONLChunkSize int `json:"jsonl_chunk_size"`

	// JSONLMaxMemory is the maximum memory for JSONL file processing in bytes.
	// Default: 100MB
	JSONLMaxMemory int64 `json:"jsonl_max_memory"`

	// ===== Merge Options =====
	// MergeMode controls how JSON documents are merged by MergeJSON and MergeMany.
	// Default: MergeUnion (combine all keys/elements)
	MergeMode MergeMode `json:"merge_mode"`

	// CustomEncoder replaces the default encoder entirely.
	// If set, Encode operations use this encoder instead of the built-in one.
	CustomEncoder CustomEncoder

	// CustomTypeEncoders provides encoding for specific types.
	// Keys are reflect.Type values; values implement TypeEncoder.
	CustomTypeEncoders map[reflect.Type]TypeEncoder

	// CustomValidators run before operations.
	// All validators must pass for the operation to proceed.
	CustomValidators []Validator

	// AdditionalDangerousPatterns adds security patterns beyond defaults.
	// These are checked in addition to built-in patterns.
	// SECURITY: Critical patterns (__proto__, constructor[, prototype.) are always
	// enforced and cannot be disabled.
	AdditionalDangerousPatterns []DangerousPattern

	// DisableDefaultPatterns disables built-in warning-level security patterns
	// (HTML tags, event handlers, etc.) but has no effect on critical patterns.
	// SECURITY: Critical patterns (__proto__, constructor[, prototype.) are always
	// enforced regardless of this setting. These patterns are too dangerous to disable.
	DisableDefaultPatterns bool

	// Hooks provide before/after interception for operations.
	Hooks []Hook

	// CustomPathParser replaces the default path parser.
	// If set, path parsing uses this parser instead of the built-in one.
	CustomPathParser PathParser
}

Config holds all configuration for the JSON processor. Start with DefaultConfig() and modify as needed.

Example:

cfg := json.DefaultConfig()
cfg.CreatePaths = true
cfg.Pretty = true
result, err := json.Set(data, "path", value, cfg)

func DefaultConfig

func DefaultConfig() Config

DefaultConfig returns the default configuration. Creates a new instance each time to allow modifications without affecting other callers.

func PrettyConfig added in v1.3.0

func PrettyConfig() Config

PrettyConfig returns a Config for pretty-printed JSON output. This is the unified version that returns Config instead of EncodeConfig.

Example:

result, err := json.Encode(data, json.PrettyConfig())

func SecurityConfig added in v1.3.0

func SecurityConfig() Config

SecurityConfig returns a configuration with enhanced security settings for processing untrusted input from external sources.

This is the recommended configuration for:

  • Public APIs and web services
  • User-submitted data
  • External webhooks
  • Authentication endpoints
  • Financial data processing

Key characteristics:

  • Full security scan enabled for all input
  • Strict mode enabled for predictable parsing
  • Conservative limits for untrusted payloads
  • Caching enabled for repeated operations

This function provides a single entry point for security-focused configuration.

func (*Config) AddDangerousPattern added in v1.3.0

func (c *Config) AddDangerousPattern(pattern DangerousPattern)

AddDangerousPattern adds a security pattern to the configuration.

func (*Config) AddHook added in v1.3.0

func (c *Config) AddHook(hook Hook)

AddHook adds an operation hook to the configuration. Hooks are executed in order for Before and in reverse order for After.

func (*Config) AddValidator added in v1.3.0

func (c *Config) AddValidator(validator Validator)

AddValidator adds a custom validator to the configuration. Validators are executed in order; all must pass for operations to proceed.

func (*Config) Clone added in v1.0.6

func (c *Config) Clone() *Config

Clone creates a copy of the configuration. Performs a deep copy of reference types (maps, slices). Returns a pointer to avoid unnecessary copying of the large Config struct.

NOTE: Interface fields (CustomEncoder, CustomPathParser) are shallow-copied as they typically contain stateless or singleton implementations. CustomTypeEncoders, CustomValidators, AdditionalDangerousPatterns, and Hooks are deep-copied as they may be modified independently.

func (*Config) Validate added in v1.0.6

func (c *Config) Validate() error

Validate validates the configuration and applies corrections. This is the single source of truth for config validation; it delegates to ValidateWithWarnings to avoid duplication. Invalid values are auto-corrected (e.g., negative sizes are clamped to minimums), and nil is returned after corrections are applied; use ValidateWithWarnings for visibility into what changed.

func (*Config) ValidateWithWarnings added in v1.3.0

func (c *Config) ValidateWithWarnings() []ConfigWarning

ValidateWithWarnings validates the configuration and returns warnings for any modifications made. This is useful for debugging configuration issues or informing users about automatic adjustments.

Example:

cfg := json.DefaultConfig()
cfg.MaxJSONSize = -1 // Invalid value
warnings := cfg.ValidateWithWarnings()
for _, w := range warnings {
    fmt.Printf("%s: %s\n", w.Field, w.Reason)
}

type ConfigWarning added in v1.3.0

type ConfigWarning struct {
	Field    string // The field that was modified
	OldValue any    // The original value (may be nil for invalid values)
	NewValue any    // The corrected value
	Reason   string // Why the modification was made
}

ConfigWarning represents a configuration modification made during validation.

type CustomEncoder

type CustomEncoder interface {
	// Encode converts a Go value to JSON string.
	Encode(value any) (string, error)
}

CustomEncoder provides custom JSON encoding capability. Implement this interface to replace the default encoder entirely.

Example:

type UpperCaseEncoder struct{}

func (e *UpperCaseEncoder) Encode(value any) (string, error) {
    s, err := json.Encode(value) // delegate to the default encoder
    if err != nil {
        return "", err
    }
    return strings.ToUpper(s), nil
}

cfg := json.DefaultConfig()
cfg.CustomEncoder = &UpperCaseEncoder{}

type DangerousPattern added in v1.3.0

type DangerousPattern struct {
	// Pattern is the substring to detect in input.
	Pattern string

	// Name is a human-readable description of the security risk.
	Name string

	// Level determines how the pattern is handled.
	Level PatternLevel
}

DangerousPattern represents a security risk pattern to detect.

func ListDangerousPatterns added in v1.3.0

func ListDangerousPatterns() []DangerousPattern

ListDangerousPatterns returns all registered custom patterns.

type Decoder

type Decoder struct {
	// contains filtered or unexported fields
}

Decoder reads and decodes JSON values from an input stream. This type is fully compatible with encoding/json.Decoder.

func NewDecoder

func NewDecoder(r io.Reader, cfg ...Config) *Decoder

NewDecoder returns a new decoder that reads from r. This function is fully compatible with encoding/json.NewDecoder.

The optional cfg parameter allows customization of decoding behavior. If no configuration is provided, default settings are used.

Example:

// Default decoder
decoder := json.NewDecoder(reader)

// With custom configuration
cfg := json.DefaultConfig()
cfg.DisallowUnknown = true
decoder := json.NewDecoder(reader, cfg)

func (*Decoder) Buffered

func (dec *Decoder) Buffered() io.Reader

func (*Decoder) Decode

func (dec *Decoder) Decode(v any) error

Decode reads the next JSON-encoded value from its input and stores it in v.

func (*Decoder) DisallowUnknownFields

func (dec *Decoder) DisallowUnknownFields()

func (*Decoder) InputOffset

func (dec *Decoder) InputOffset() int64

func (*Decoder) More

func (dec *Decoder) More() bool

func (*Decoder) Token

func (dec *Decoder) Token() (Token, error)

Token returns the next JSON token in the input stream. At the end of the input stream, Token returns nil, io.EOF.

func (*Decoder) UseNumber

func (dec *Decoder) UseNumber()

type Delim

type Delim rune

Delim is a JSON delimiter.

func (Delim) String

func (d Delim) String() string

type Encoder

type Encoder struct {
	// contains filtered or unexported fields
}

Encoder writes JSON values to an output stream. This type is fully compatible with encoding/json.Encoder.

func NewEncoder

func NewEncoder(w io.Writer, cfg ...Config) *Encoder

NewEncoder returns a new encoder that writes to w. This function is fully compatible with encoding/json.NewEncoder.

The optional cfg parameter allows customization of encoding behavior. If no configuration is provided, default settings are used.

Example:

// Default encoder
encoder := json.NewEncoder(writer)

// With configuration
cfg := json.DefaultConfig()
cfg.Pretty = true
encoder := json.NewEncoder(writer, cfg)

func (*Encoder) Encode

func (enc *Encoder) Encode(v any) error

func (*Encoder) SetEscapeHTML

func (enc *Encoder) SetEscapeHTML(on bool)

SetEscapeHTML specifies whether problematic HTML characters should be escaped inside JSON quoted strings. The default behavior is to escape &, <, and > to \u0026, \u003c, and \u003e to avoid certain safety problems that can arise when embedding JSON in HTML.

In non-HTML settings where the escaping interferes with the readability of the output, SetEscapeHTML(false) disables this behavior.

func (*Encoder) SetIndent

func (enc *Encoder) SetIndent(prefix, indent string)

SetIndent instructs the encoder to format each subsequent encoded value as if indented by the package-level function Indent(dst, src, prefix, indent). Calling SetIndent("", "") disables indentation.

type HealthStatus

type HealthStatus struct {
	Timestamp time.Time              `json:"timestamp"`
	Healthy   bool                   `json:"healthy"`
	Checks    map[string]CheckResult `json:"checks"`
}

HealthStatus represents the health status of the processor

func GetHealthStatus added in v1.1.0

func GetHealthStatus() HealthStatus

GetHealthStatus returns the health status of the default processor.

type Hook added in v1.3.0

type Hook interface {
	// Before is called before an operation.
	// Return error to abort the operation.
	Before(ctx HookContext) error

	// After is called after an operation completes.
	// Modify result or error as needed.
	After(ctx HookContext, result any, err error) (any, error)
}

Hook intercepts operations before/after execution. Implement this interface to add cross-cutting concerns like logging, metrics, tracing, or request transformation.

Example:

type LoggingHook struct{ logger *slog.Logger }

func (h *LoggingHook) Before(ctx json.HookContext) error {
    h.logger.Info("operation starting", "op", ctx.Operation, "path", ctx.Path)
    return nil
}

func (h *LoggingHook) After(ctx json.HookContext, result any, err error) (any, error) {
    h.logger.Info("operation completed", "op", ctx.Operation, "error", err)
    return result, err
}

func ErrorHook added in v1.3.0

func ErrorHook(handler func(ctx HookContext, err error) error) Hook

ErrorHook creates a hook that intercepts errors. The handler can transform errors or log them.

Example:

p.AddHook(json.ErrorHook(func(ctx json.HookContext, err error) error {
    sentry.CaptureException(err)
    return err // return original or transformed error
}))

func LoggingHook added in v1.3.0

func LoggingHook(logger interface{ Info(msg string, args ...any) }) Hook

LoggingHook creates a hook that logs all operations. The logger must implement Info(msg string, args ...any).

Example:

p.AddHook(json.LoggingHook(slog.Default()))

func TimingHook added in v1.3.0

func TimingHook(recorder interface {
	Record(op string, duration time.Duration)
}) Hook

TimingHook creates a hook that records operation duration. The recorder must implement Record(op string, duration time.Duration).

Example:

p.AddHook(json.TimingHook(myMetricsRecorder))

func ValidationHook added in v1.3.0

func ValidationHook(validator func(jsonStr, path string) error) Hook

ValidationHook creates a hook that validates input before operations. Return error from the validator to abort the operation.

Example:

p.AddHook(json.ValidationHook(func(jsonStr, path string) error {
    if len(jsonStr) > 1_000_000 {
        return errors.New("JSON too large")
    }
    return nil
}))

type HookContext added in v1.3.0

type HookContext struct {
	// Operation is the type of operation being performed.
	// Values: "get", "set", "delete", "marshal", "unmarshal"
	Operation string

	// JSONStr is the input JSON string (may be empty for marshal).
	//
	// SECURITY WARNING: This field may contain sensitive data (passwords,
	// tokens, API keys, PII). Do NOT log this value. Only inspect specific
	// paths if needed. Use Operation and Path for logging purposes.
	JSONStr string

	// Path is the target path (may be empty for marshal/unmarshal).
	Path string

	// Value is the value for set operations.
	Value any

	// Config is the active configuration.
	Config *Config

	// StartTime is when the operation started (set before After is called).
	StartTime time.Time
}

HookContext provides context for operation hooks.

type HookFunc added in v1.3.0

type HookFunc struct {
	BeforeFn func(ctx HookContext) error
	AfterFn  func(ctx HookContext, result any, err error) (any, error)
}

HookFunc is an adapter to use ordinary functions as Hooks. Useful for simple hooks that don't need both Before and After.

Example:

// Only need After
p.AddHook(&json.HookFunc{
    AfterFn: func(ctx json.HookContext, result any, err error) (any, error) {
        log.Printf("%s completed in %v", ctx.Operation, time.Since(ctx.StartTime))
        return result, err
    },
})

func (*HookFunc) After added in v1.3.0

func (h *HookFunc) After(ctx HookContext, result any, err error) (any, error)

After calls the AfterFn if set, otherwise returns the original result.

func (*HookFunc) Before added in v1.3.0

func (h *HookFunc) Before(ctx HookContext) error

Before calls the BeforeFn if set, otherwise returns nil.

type InvalidUnmarshalError

type InvalidUnmarshalError struct {
	Type reflect.Type
}

InvalidUnmarshalError describes an invalid argument passed to Unmarshal. (The argument to Unmarshal must be a non-nil pointer.)

func (*InvalidUnmarshalError) Error

func (e *InvalidUnmarshalError) Error() string

type IterableValue

type IterableValue struct {
	// contains filtered or unexported fields
}

IterableValue wraps a value to provide convenient access methods during iteration. Used by Foreach and ForeachKey callback functions to provide structured access. It holds no processor or iterator references, which avoids resource leaks.

Example:

err := processor.Foreach(data, "items", func(item json.IterableValue) error {
    name, _ := item.GetString("name")
    age, _ := item.GetInt("age")
    fmt.Printf("Name: %s, Age: %d\n", name, age)
    return nil
})

func CollectJSONL added in v1.4.0

func CollectJSONL(reader io.Reader) ([]*IterableValue, error)

CollectJSONL collects all JSONL items into a slice.

Example:

items, err := json.CollectJSONL(reader)
for _, item := range items {
	fmt.Println(item.GetString("name"))
}

func FilterJSONL added in v1.4.0

func FilterJSONL(reader io.Reader, predicate func(item *IterableValue) bool) ([]*IterableValue, error)

FilterJSONL filters JSONL data based on a predicate function.

Example:

adults, err := json.FilterJSONL(reader, func(item *json.IterableValue) bool {
	return item.GetInt("age") >= 18
})

func FirstJSONL added in v1.4.0

func FirstJSONL(reader io.Reader, predicate func(item *IterableValue) bool) (*IterableValue, bool, error)

FirstJSONL returns the first JSONL item that matches a predicate.

Example:

user, found, err := json.FirstJSONL(reader, func(item *json.IterableValue) bool {
	return item.GetString("name") == "Alice"
})

func (*IterableValue) Break

func (iv *IterableValue) Break() error

Break returns a signal to stop iteration without error. Use it in a ForeachFile/ForeachFileChunked callback to exit early.

Example:

processor.ForeachFile("data.json", func(key any, item *json.IterableValue) error {
    if item.GetInt("id") == targetId {
        return item.Break() // stop iteration
    }
    return nil // continue
})

func (*IterableValue) Exists

func (iv *IterableValue) Exists(key string) bool

Exists checks if a key or path exists in the object. Supports path navigation with dot notation and array indices.

func (*IterableValue) ForeachNested

func (iv *IterableValue) ForeachNested(path string, fn func(key any, item *IterableValue))

ForeachNested iterates over nested JSON structures with a path

func (*IterableValue) Get

func (iv *IterableValue) Get(path string) any

Get returns a value by path (supports dot notation and array indices)

func (*IterableValue) GetArray

func (iv *IterableValue) GetArray(key string) []any

GetArray returns an array value by key or path. Supports path navigation with dot notation and array indices.

func (*IterableValue) GetBool

func (iv *IterableValue) GetBool(key string) bool

GetBool returns a bool value by key or path. Supports path navigation with dot notation and array indices.

func (*IterableValue) GetBoolWithDefault

func (iv *IterableValue) GetBoolWithDefault(key string, defaultValue bool) bool

GetBoolWithDefault returns a bool value by key or path with a default fallback. Supports path navigation with dot notation and array indices.

func (*IterableValue) GetData added in v1.2.0

func (iv *IterableValue) GetData() any

GetData returns the underlying data

func (*IterableValue) GetFloat64

func (iv *IterableValue) GetFloat64(key string) float64

GetFloat64 returns a float64 value by key or path. Supports path navigation with dot notation and array indices.

func (*IterableValue) GetFloat64WithDefault

func (iv *IterableValue) GetFloat64WithDefault(key string, defaultValue float64) float64

GetFloat64WithDefault returns a float64 value by key or path with a default fallback. Supports path navigation with dot notation and array indices.

func (*IterableValue) GetInt

func (iv *IterableValue) GetInt(key string) int

GetInt returns an int value by key or path. Supports path navigation with dot notation and array indices (e.g., "user.age" or "users[0].id").

func (*IterableValue) GetIntWithDefault

func (iv *IterableValue) GetIntWithDefault(key string, defaultValue int) int

GetIntWithDefault returns an int value by key or path with a default fallback. Supports path navigation with dot notation and array indices.

func (*IterableValue) GetObject

func (iv *IterableValue) GetObject(key string) map[string]any

GetObject returns an object value by key or path. Supports path navigation with dot notation and array indices.

func (*IterableValue) GetString

func (iv *IterableValue) GetString(key string) string

GetString returns a string value by key or path. Supports path navigation with dot notation and array indices (e.g., "user.address.city" or "users[0].name").

func (*IterableValue) GetStringWithDefault

func (iv *IterableValue) GetStringWithDefault(key string, defaultValue string) string

GetStringWithDefault returns a string value by key or path with a default fallback. Supports path navigation with dot notation and array indices.

func (*IterableValue) GetWithDefault

func (iv *IterableValue) GetWithDefault(key string, defaultValue any) any

GetWithDefault returns a value by key or path with a default fallback. Supports path navigation with dot notation and array indices.

func (*IterableValue) IsEmpty

func (iv *IterableValue) IsEmpty(key string) bool

IsEmpty checks if a specific key's or path's value is empty. Supports path navigation with dot notation and array indices.

func (*IterableValue) IsEmptyData added in v1.0.8

func (iv *IterableValue) IsEmptyData() bool

IsEmptyData checks if the whole value is empty (for backward compatibility)

func (*IterableValue) IsNull

func (iv *IterableValue) IsNull(key string) bool

IsNull checks if a specific key's or path's value is null. Supports path navigation with dot notation and array indices.

func (*IterableValue) IsNullData added in v1.0.8

func (iv *IterableValue) IsNullData() bool

IsNullData checks if the whole value is null (for backward compatibility)

func (*IterableValue) Release added in v1.2.0

func (iv *IterableValue) Release()

Release returns the IterableValue to the pool

type Iterator

type Iterator struct {
	// contains filtered or unexported fields
}

Iterator represents an iterator over JSON data for sequential access. Supports iteration over both arrays and objects. Thread-safe for single goroutine use; for concurrent access, create separate iterators.

Example:

// Iterate over an array
data, _ := json.ParseAny(`[1, 2, 3]`)
iter := json.NewIterator(data)
for iter.HasNext() {
    value, _ := iter.Next()
    fmt.Println(value)
}

// Iterate over an object
data, _ := json.ParseAny(`{"a": 1, "b": 2}`)
iter := json.NewIterator(data)
for iter.HasNext() {
    value, _ := iter.Next()
    fmt.Println(value)
}

func NewIterator

func NewIterator(data any, cfg ...Config) *Iterator

NewIterator creates a new Iterator over the provided data. Creates an iterator for traversing arrays and objects.

The optional cfg parameter is reserved for future configuration options and maintains API consistency with other constructors. Currently no configuration options affect Iterator behavior.

Example:

data, _ := json.ParseAny(`{"name": "Alice", "age": 30}`)
iter := json.NewIterator(data)
for iter.HasNext() {
    value, ok := iter.Next()
    if !ok {
        break
    }
    fmt.Println(value)
}

func (*Iterator) HasNext added in v1.0.8

func (it *Iterator) HasNext() bool

HasNext checks if there are more elements to iterate. Returns true if the iterator has not reached the end of the data. For arrays, checks if position < array length. For objects, checks if position < number of keys.

func (*Iterator) Next added in v1.0.8

func (it *Iterator) Next() (any, bool)

Next returns the next element and advances the iterator. Returns (value, true) if an element is available, or (nil, false) at the end. For arrays, returns the array element at the current position. For objects, returns the value at the current key position.

func (*Iterator) Reset added in v1.4.0

func (it *Iterator) Reset()

Reset clears the iterator state and releases cached resources. After calling Reset, the iterator can be reused with new data via ResetWith. This is useful for reducing allocations when iterating over multiple JSON structures.

Example:

iter := json.NewIterator(data1)
for iter.HasNext() {
    iter.Next()
}
iter.Reset() // Clear cached keys
iter.ResetWith(data2) // Reuse iterator with new data

func (*Iterator) ResetWith added in v1.4.0

func (it *Iterator) ResetWith(data any)

ResetWith clears the iterator state and initializes it with new data. This allows reusing the iterator to avoid allocations.

Example:

iter := json.NewIterator(data1)
// ... iterate over data1 ...
iter.ResetWith(data2) // Reuse iterator with new data
// ... iterate over data2 ...

type IteratorControl

type IteratorControl int

IteratorControl represents control flags for iteration operations. Used by Foreach* functions to control iteration flow.

const (
	IteratorNormal IteratorControl = iota // continue normally
	// IteratorContinue skips the current item and continues iteration
	IteratorContinue
	// IteratorBreak stops iteration entirely
	IteratorBreak
)

type JSONLStats added in v1.2.0

type JSONLStats struct {
	LinesProcessed int64
	BytesWritten   int64
}

JSONLStats holds statistics for JSONL (JSON Lines) processing. Used by JSONLWriter to track lines and bytes written.

type JSONLWriter added in v1.2.0

type JSONLWriter struct {
	// contains filtered or unexported fields
}

JSONLWriter writes JSON Lines (NDJSON) format to an io.Writer. Each value is written as a single line of JSON, suitable for log files, data pipelines, and streaming applications.

Example:

file, _ := os.Create("output.jsonl")
writer := json.NewJSONLWriter(file)
defer file.Close()

writer.Write(map[string]any{"name": "Alice", "id": 1})
writer.Write(map[string]any{"name": "Bob", "id": 2})

func NewJSONLWriter added in v1.2.0

func NewJSONLWriter(writer io.Writer, cfg ...Config) *JSONLWriter

NewJSONLWriter creates a new JSONL writer that writes to the provided io.Writer. HTML escaping is controlled by Config.EscapeHTML (default: false for performance).

Example:

var buf bytes.Buffer
writer := json.NewJSONLWriter(&buf)
writer.Write(map[string]any{"event": "login", "user": "alice"})

With custom config:

cfg := json.DefaultConfig()
cfg.EscapeHTML = true
writer := json.NewJSONLWriter(&buf, cfg)

func (*JSONLWriter) Err added in v1.2.0

func (w *JSONLWriter) Err() error

Err returns any error encountered during previous write operations. Returns nil if no errors have occurred.

func (*JSONLWriter) Stats added in v1.2.0

func (w *JSONLWriter) Stats() JSONLStats

Stats returns writing statistics including lines processed and bytes written.

func (*JSONLWriter) Write added in v1.2.0

func (w *JSONLWriter) Write(data any) error

Write writes a single JSON value as a line to the underlying writer. Each value is followed by a newline character. Returns any error encountered during writing.

Example:

writer.Write(map[string]any{"name": "Alice"})  // Writes: {"name":"Alice"}\n
writer.Write([]int{1, 2, 3})                   // Writes: [1,2,3]\n

func (*JSONLWriter) WriteAll added in v1.2.0

func (w *JSONLWriter) WriteAll(data []any) error

WriteAll writes multiple JSON values as separate lines. Stops and returns the first error encountered.

Example:

writer.WriteAll([]any{
    map[string]any{"id": 1},
    map[string]any{"id": 2},
})

func (*JSONLWriter) WriteRaw added in v1.2.0

func (w *JSONLWriter) WriteRaw(line []byte) error

WriteRaw writes a raw JSON line that is already encoded. A newline is appended if the line does not already end with one. Use this for pre-encoded JSON to avoid re-encoding overhead.

Example:

writer.WriteRaw([]byte(`{"pre":"encoded"}`))

type JsonsError

type JsonsError struct {
	Op      string `json:"op"`      // Operation that failed
	Path    string `json:"path"`    // JSON path where error occurred
	Message string `json:"message"` // Human-readable error message
	Err     error  `json:"err"`     // Underlying error
}

JsonsError represents a JSON processing error with essential context

func (*JsonsError) Error

func (e *JsonsError) Error() string

func (*JsonsError) Is

func (e *JsonsError) Is(target error) bool

Is implements error matching for Go 1.13+ error handling. Compares Op, Path, and Err fields for complete equality. Note: Message is intentionally excluded as it is derived from the other fields.

func (*JsonsError) Unwrap

func (e *JsonsError) Unwrap() error

Unwrap returns the underlying error for error chain support

type MarshalerError

type MarshalerError struct {
	Type reflect.Type
	Err  error
	// contains filtered or unexported fields
}

MarshalerError represents an error from calling a MarshalJSON or MarshalText method.

func (*MarshalerError) Error

func (e *MarshalerError) Error() string

func (*MarshalerError) Unwrap

func (e *MarshalerError) Unwrap() error

type MergeMode added in v1.3.0

type MergeMode = internal.MergeMode

MergeMode defines the merge strategy for combining JSON objects and arrays. This is a type alias to internal.MergeMode to ensure consistency across the codebase.

type NDJSONProcessor added in v1.2.0

type NDJSONProcessor struct {
	// contains filtered or unexported fields
}

NDJSONProcessor processes newline-delimited JSON files

func NewNDJSONProcessor added in v1.2.0

func NewNDJSONProcessor(cfg ...Config) *NDJSONProcessor

NewNDJSONProcessor creates a new NDJSON processor. The optional cfg parameter allows customization using the unified Config pattern. When config is provided, cfg.JSONLBufferSize is used as the buffer size.

Example:

// Default settings
processor := json.NewNDJSONProcessor()

// With custom buffer size
cfg := json.DefaultConfig()
cfg.JSONLBufferSize = 128 * 1024
processor := json.NewNDJSONProcessor(cfg)

func (*NDJSONProcessor) ProcessFile added in v1.2.0

func (np *NDJSONProcessor) ProcessFile(filename string, fn func(lineNum int, obj map[string]any) error) error

ProcessFile processes an NDJSON file line by line
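For illustration, a line-by-line processing sketch (the file name and the "event" field are assumptions):

```go
processor := json.NewNDJSONProcessor()
err := processor.ProcessFile("events.ndjson", func(lineNum int, obj map[string]any) error {
    fmt.Printf("line %d: %v\n", lineNum, obj["event"])
    return nil // continue with the next line
})
```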

func (*NDJSONProcessor) ProcessReader added in v1.2.0

func (np *NDJSONProcessor) ProcessReader(reader io.Reader, fn func(lineNum int, obj map[string]any) error) error

ProcessReader processes NDJSON from a reader. Enforces per-line size limits and nesting depth checks to prevent DoS attacks.
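A minimal sketch using an in-memory reader, assuming the same processor as above:

```go
input := strings.NewReader("{\"id\":1}\n{\"id\":2}\n")
err := processor.ProcessReader(input, func(lineNum int, obj map[string]any) error {
    fmt.Printf("line %d: id=%v\n", lineNum, obj["id"])
    return nil
})
```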

type Number

type Number string

Number represents a JSON number literal.

func (Number) Float64

func (n Number) Float64() (float64, error)

Float64 returns the number as a float64.

func (Number) Int64

func (n Number) Int64() (int64, error)

Int64 returns the number as an int64.

func (Number) String

func (n Number) String() string

String returns the literal text of the number.

type ParallelIterator added in v1.2.0

type ParallelIterator struct {
	// contains filtered or unexported fields
}

ParallelIterator processes arrays in parallel using worker goroutines

func NewParallelIterator added in v1.2.0

func NewParallelIterator(data []any, cfg ...Config) *ParallelIterator

NewParallelIterator creates a new parallel iterator. The optional cfg parameter allows customization using the unified Config pattern. When config is provided, cfg.MaxConcurrency is used as the worker count.

Example:

// Default settings (workers = 4)
iter := json.NewParallelIterator(data)

// With custom worker count
cfg := json.DefaultConfig()
cfg.MaxConcurrency = 8
iter := json.NewParallelIterator(data, cfg)

// Legacy pattern (backward compatible)
iter := json.NewParallelIteratorWithWorkers(data, 8)

func (*ParallelIterator) Close added in v1.3.0

func (it *ParallelIterator) Close()

Close releases resources associated with the ParallelIterator. Signals any running goroutines to stop and waits for them to finish.

func (*ParallelIterator) Filter added in v1.2.0

func (it *ParallelIterator) Filter(predicate func(int, any) bool) []any

Filter filters elements in parallel using a predicate function. Returns a new slice containing the elements that pass the predicate.
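For illustration, a sketch that keeps even numbers (assumes data is an []any of decoded JSON, where numbers arrive as float64):

```go
iter := json.NewParallelIterator(data)
defer iter.Close()
evens := iter.Filter(func(i int, v any) bool {
    n, ok := v.(float64)
    return ok && int(n)%2 == 0
})
```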

func (*ParallelIterator) ForEach added in v1.2.0

func (it *ParallelIterator) ForEach(fn func(int, any) error) error

ForEach processes each element in parallel using the provided function. The function receives the index and value of each element. Returns the first error encountered, or nil if all operations succeed.
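A minimal sketch, reusing an iterator created with NewParallelIterator:

```go
err := iter.ForEach(func(i int, v any) error {
    fmt.Printf("item %d: %v\n", i, v)
    return nil
})
```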

func (*ParallelIterator) ForEachBatch added in v1.2.0

func (it *ParallelIterator) ForEachBatch(batchSize int, fn func(int, []any) error) error

ForEachBatch processes elements in batches in parallel. Each batch is processed by a single goroutine.
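For illustration, a sketch of batched processing; the meaning of the int argument (assumed here to be the batch's starting index) is an assumption:

```go
err := iter.ForEachBatch(100, func(start int, batch []any) error {
    fmt.Printf("batch at %d has %d items\n", start, len(batch))
    return nil
})
```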

func (*ParallelIterator) ForEachBatchWithContext added in v1.3.0

func (it *ParallelIterator) ForEachBatchWithContext(ctx context.Context, batchSize int, fn func(int, []any) error) error

ForEachBatchWithContext processes elements in batches in parallel, with context support for cancellation and graceful goroutine termination. Each batch is processed by a single goroutine.

func (*ParallelIterator) ForEachWithContext added in v1.3.0

func (it *ParallelIterator) ForEachWithContext(ctx context.Context, fn func(int, any) error) error

ForEachWithContext processes each element in parallel, with context support for cancellation and graceful goroutine termination. The function receives the index and value of each element. Returns the first error encountered, or ctx.Err() if the context is cancelled.

func (*ParallelIterator) Map added in v1.2.0

func (it *ParallelIterator) Map(transform func(int, any) (any, error)) ([]any, error)

Map applies a transformation function to each element in parallel. Returns a new slice with the transformed values.
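For illustration, a sketch that doubles numeric elements (assumes decoded JSON numbers, i.e. float64):

```go
doubled, err := iter.Map(func(i int, v any) (any, error) {
    n, ok := v.(float64)
    if !ok {
        return nil, fmt.Errorf("item %d is not a number", i)
    }
    return n * 2, nil
})
```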

type ParsedJSON added in v1.2.0

type ParsedJSON struct {
	// contains filtered or unexported fields
}

ParsedJSON represents a pre-parsed JSON document that can be reused for multiple operations. This is a performance optimization for scenarios where the same JSON is queried multiple times. OPTIMIZED: pre-parsing avoids re-parsing overhead on each query.

func (*ParsedJSON) Data added in v1.2.0

func (p *ParsedJSON) Data() any

Data returns the underlying parsed data

func (*ParsedJSON) Release added in v1.4.1

func (p *ParsedJSON) Release()

Release releases resources held by ParsedJSON. After calling Release, Data() returns nil and the processor reference is cleared. Call this when finished with a pre-parsed JSON document to allow GC of the processor.

type PathParser

type PathParser interface {
	// ParsePath parses a path string into segments.
	ParsePath(path string) ([]PathSegment, error)
}

PathParser parses path strings into segments. Implement to provide custom path syntax support.

type PathSegment

type PathSegment = internal.PathSegment

PathSegment represents a parsed path segment. This is an alias to internal.PathSegment.

type PatternLevel added in v1.3.0

type PatternLevel int

PatternLevel represents the severity level of a dangerous pattern.

const (
	// PatternLevelCritical always blocks the operation.
	// Use for patterns that pose immediate security risks (e.g., prototype pollution).
	PatternLevelCritical PatternLevel = iota

	// PatternLevelWarning blocks in strict mode, logs warning in lenient mode.
	// Use for patterns that may indicate malicious intent but have legitimate uses.
	PatternLevelWarning

	// PatternLevelInfo logs but never blocks.
	// Use for audit/tracking purposes without interrupting operations.
	PatternLevelInfo
)

func (PatternLevel) String added in v1.3.0

func (pl PatternLevel) String() string

String returns the string representation of PatternLevel.

type Processor

type Processor struct {
	// contains filtered or unexported fields
}

Processor is the main JSON processing engine with thread safety and performance optimization

func New

func New(cfg ...Config) (*Processor, error)

New creates a new JSON processor with the given configuration. If no configuration is provided, uses default configuration.

Returns an error if the configuration is invalid (see Config.Validate). Always call Close() when done to release resources.

Example:

// Using default configuration
processor, err := json.New()
if err != nil {
    // Handle configuration error
}
defer processor.Close()

// With custom configuration
cfg := json.DefaultConfig()
cfg.CreatePaths = true
cfg.EnableCache = true
processor, err := json.New(cfg)

// Using preset configuration
processor, err := json.New(json.SecurityConfig())

func (*Processor) AddHook added in v1.3.0

func (p *Processor) AddHook(hook Hook)

AddHook adds an operation hook to the processor. Hooks are called before and after each operation. Multiple hooks can be added and are executed in order (Before) and reverse order (After).

Example:

type LoggingHook struct{}
func (h *LoggingHook) Before(ctx json.HookContext) error {
    log.Printf("before %s", ctx.Operation)
    return nil
}
func (h *LoggingHook) After(ctx json.HookContext, result any, err error) (any, error) {
    log.Printf("after %s", ctx.Operation)
    return result, err
}

processor, err := json.New()
if err != nil {
    return err
}
defer processor.Close()
processor.AddHook(&LoggingHook{})

func (*Processor) ClearCache

func (p *Processor) ClearCache()

ClearCache clears all cached data

func (*Processor) Close

func (p *Processor) Close() error

Close closes the processor and cleans up resources. This method is idempotent and thread-safe. After Close is called, all operations on the processor will return ErrProcessorClosed.

IMPORTANT: Always call Close() to release resources:

processor, err := json.New()
if err != nil {
    return err
}
defer processor.Close()

func (*Processor) CollectJSONL added in v1.4.0

func (p *Processor) CollectJSONL(reader io.Reader) ([]*IterableValue, error)

CollectJSONL collects all JSONL items into a slice

Example:

processor, _ := json.New()
defer processor.Close()

items, err := processor.CollectJSONL(reader)
for _, item := range items {
	fmt.Println(item.GetString("name"))
}

func (*Processor) Compact

func (p *Processor) Compact(jsonStr string, cfg ...Config) (string, error)

Compact removes whitespace from JSON string. This is useful for minimizing JSON size for transmission or storage. The result is a single-line JSON string with no unnecessary whitespace.

Errors:

  • ErrProcessorClosed: processor has been closed
  • ErrInvalidJSON: jsonStr is not valid JSON
  • ErrSizeLimit: JSON exceeds MaxJSONSize

Example:

compact, err := processor.Compact(`{
    "name": "Alice",
    "age": 30
}`)
// Output: {"name":"Alice","age":30}

func (*Processor) CompactBuffer added in v1.1.0

func (p *Processor) CompactBuffer(dst *bytes.Buffer, src []byte, cfg ...Config) error

CompactBuffer appends to dst the JSON-encoded src with insignificant space characters elided. Compatible with encoding/json.Compact with optional Config support. This is the buffer-based counterpart to Compact, matching the encoding/json.Compact signature.

Example:

var buf bytes.Buffer
err := processor.CompactBuffer(&buf, []byte(`{"name": "Alice"}`))

func (*Processor) CompilePath added in v1.2.0

func (p *Processor) CompilePath(path string) (*CompiledPath, error)

CompilePath compiles a JSON path string into a CompiledPath for fast repeated operations. The returned CompiledPath can be reused for multiple Get/Set/Delete operations. Call Release() on the returned CompiledPath when done to return it to the pool.
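For illustration, a compile-once, query-many sketch (documents is an assumed slice of JSON strings):

```go
cp, err := processor.CompilePath("users[0].name")
if err != nil {
    return err
}
defer cp.Release()
for _, doc := range documents {
    name, _ := processor.GetCompiled(doc, cp) // skips path parsing each time
    _ = name
}
```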

func (*Processor) Delete

func (p *Processor) Delete(jsonStr, path string, cfg ...Config) (string, error)

Delete removes a value from JSON at the specified path
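A minimal sketch (the "temp" field is an assumption):

```go
result, err := processor.Delete(`{"name":"Alice","temp":123}`, "temp")
// on success, result no longer contains the "temp" field
```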

func (*Processor) DeleteClean added in v1.3.0

func (p *Processor) DeleteClean(jsonStr, path string, cfg ...Config) (string, error)

DeleteClean removes a value from JSON and cleans up null placeholders. This is the unified API for delete-with-cleanup operations.

Example:

result, err := processor.DeleteClean(data, "users[0].profile")

func (*Processor) Encode added in v1.1.0

func (p *Processor) Encode(value any, config ...Config) (string, error)

Encode converts any Go value to a JSON string. This is a convenience method that matches the package-level Encode signature.

func (*Processor) EncodeBatch

func (p *Processor) EncodeBatch(pairs map[string]any, cfg ...Config) (string, error)

EncodeBatch encodes multiple key-value pairs as a JSON object. This method accepts variadic Config for unified API pattern.

Example:

result, err := processor.EncodeBatch(pairs, json.PrettyConfig())

func (*Processor) EncodeFields

func (p *Processor) EncodeFields(value any, fields []string, cfg ...Config) (string, error)

EncodeFields encodes struct fields selectively based on field names. This method accepts variadic Config for unified API pattern.

Example:

result, err := processor.EncodeFields(value, []string{"name", "email"}, json.PrettyConfig())

func (*Processor) EncodePretty added in v1.1.0

func (p *Processor) EncodePretty(value any, config ...Config) (string, error)

EncodePretty converts any Go value to a pretty-formatted JSON string. This is a convenience method that matches the package-level EncodePretty signature.

func (*Processor) EncodeStream

func (p *Processor) EncodeStream(values any, cfg ...Config) (string, error)

EncodeStream encodes multiple values as a JSON array stream. This method accepts variadic Config for unified API pattern.

Example:

result, err := processor.EncodeStream(values, json.PrettyConfig())

func (*Processor) EncodeWithConfig

func (p *Processor) EncodeWithConfig(value any, cfg ...Config) (string, error)

EncodeWithConfig converts any Go value to JSON string with full configuration control. PERFORMANCE: Uses FastEncoder for simple types to avoid reflection overhead.

Example:

// Default configuration
result, err := processor.EncodeWithConfig(data)

// With custom configuration
cfg := json.DefaultConfig()
cfg.Pretty = true
result, err := processor.EncodeWithConfig(data, cfg)

// With preset configuration
result, err := processor.EncodeWithConfig(data, json.PrettyConfig())

func (*Processor) FilterJSONL added in v1.4.0

func (p *Processor) FilterJSONL(reader io.Reader, predicate func(item *IterableValue) bool) ([]*IterableValue, error)

FilterJSONL filters JSONL data based on a predicate function

Example:

processor, _ := json.New()
defer processor.Close()

adults, err := processor.FilterJSONL(reader, func(item *json.IterableValue) bool {
	return item.GetInt("age") >= 18
})

func (*Processor) FirstJSONL added in v1.4.0

func (p *Processor) FirstJSONL(reader io.Reader, predicate func(item *IterableValue) bool) (*IterableValue, bool, error)

FirstJSONL returns the first JSONL item that matches a predicate

Example:

processor, _ := json.New()
defer processor.Close()

user, found, err := processor.FirstJSONL(reader, func(item *json.IterableValue) bool {
	return item.GetString("name") == "Alice"
})

func (*Processor) Foreach

func (p *Processor) Foreach(jsonStr string, fn func(key any, item *IterableValue))

Foreach iterates over JSON arrays or objects using this processor

func (*Processor) ForeachFile added in v1.4.0

func (p *Processor) ForeachFile(filePath string, fn func(key any, item *IterableValue) error) error

ForeachFile iterates over JSON arrays or objects directly from a file. The callback returns an error to signal iteration control:

  • nil: continue iteration
  • item.Break(): stop iteration without error
  • other error: stop iteration and return the error

Example:

err := processor.ForeachFile("data.json", func(key any, item *json.IterableValue) error {
    fmt.Println(item.GetString("name"))
    return nil // continue
})

func (*Processor) ForeachFileChunked added in v1.4.0

func (p *Processor) ForeachFileChunked(filePath string, chunkSize int, fn func(chunk []*IterableValue) error) error

ForeachFileChunked iterates over JSON arrays from a file in chunks (batches). This is useful for batch processing large datasets.

Example:

err := processor.ForeachFileChunked("data.json", 100, func(chunk []*json.IterableValue) error {
    // Process batch of 100 items
    for _, item := range chunk {
        fmt.Println(item.GetString("name"))
    }
    return nil
})

func (*Processor) ForeachFileNested added in v1.4.0

func (p *Processor) ForeachFileNested(filePath string, fn func(key any, item *IterableValue) error) error

ForeachFileNested recursively iterates over all nested JSON structures from a file.

Example:

err := processor.ForeachFileNested("data.json", func(key any, item *json.IterableValue) error {
    fmt.Printf("Key: %v, Type: %T\n", key, item.Value)
    return nil
})

func (*Processor) ForeachFileWithPath added in v1.4.0

func (p *Processor) ForeachFileWithPath(filePath, path string, fn func(key any, item *IterableValue) error) error

ForeachFileWithPath iterates over JSON arrays or objects at a specific path from a file.

Example:

err := processor.ForeachFileWithPath("data.json", ".users", func(key any, item *json.IterableValue) error {
    fmt.Println(item.GetString("name"))
    return nil
})

func (*Processor) ForeachJSONL added in v1.4.0

func (p *Processor) ForeachJSONL(reader io.Reader, fn func(lineNum int, item *IterableValue) error) error

ForeachJSONL iterates over JSONL data with IterableValue callback (similar to Foreach)

Example:

processor, _ := json.New()
defer processor.Close()

err := processor.ForeachJSONL(reader, func(lineNum int, item *json.IterableValue) error {
	fmt.Printf("Line: %d, Value: %v\n", lineNum, item.GetData())
	return nil
})

func (*Processor) ForeachNested added in v1.0.9

func (p *Processor) ForeachNested(jsonStr string, fn func(key any, item *IterableValue))

ForeachNested recursively iterates over all nested JSON structures. This method traverses all nested objects and arrays.

func (*Processor) ForeachNestedWithError added in v1.4.0

func (p *Processor) ForeachNestedWithError(jsonStr string, fn func(key any, item *IterableValue) error) error

ForeachNestedWithError recursively iterates over all nested JSON structures with error-returning callback.

Example:

err := processor.ForeachNestedWithError(jsonStr, func(key any, item *json.IterableValue) error {
    fmt.Printf("Key: %v\n", key)
    return nil
})

func (*Processor) ForeachReturn

func (p *Processor) ForeachReturn(jsonStr string, fn func(key any, item *IterableValue)) (string, error)

ForeachReturn iterates over JSON arrays or objects and returns the resulting JSON string. This is useful for iteration with transformation.

func (*Processor) ForeachWithError added in v1.4.0

func (p *Processor) ForeachWithError(jsonStr, path string, fn func(key any, item *IterableValue) error) error

ForeachWithError iterates over JSON arrays or objects with error-returning callback. The callback returns an error to signal iteration control:

  • nil: continue iteration
  • errBreak (via item.Break()): stop iteration without error
  • other error: stop iteration and return the error

Example:

err := processor.ForeachWithError(jsonStr, ".", func(key any, item *json.IterableValue) error {
    if item.GetInt("id") == targetId {
        return item.Break() // stop iteration
    }
    return nil // continue
})

func (*Processor) ForeachWithPath

func (p *Processor) ForeachWithPath(jsonStr, path string, fn func(key any, item *IterableValue)) error

ForeachWithPath iterates over JSON arrays or objects at a specific path using this processor. This allows using custom processor configurations (security limits, nesting depth, etc.).

func (*Processor) ForeachWithPathAndControl added in v1.0.9

func (p *Processor) ForeachWithPathAndControl(jsonStr, path string, fn func(key any, value any) IteratorControl) error

ForeachWithPathAndControl iterates with control over iteration flow

func (*Processor) ForeachWithPathAndIterator added in v1.0.9

func (p *Processor) ForeachWithPathAndIterator(jsonStr, path string, fn func(key any, item *IterableValue, currentPath string) IteratorControl) error

ForeachWithPathAndIterator iterates over JSON at a path with path information

func (*Processor) Get

func (p *Processor) Get(jsonStr, path string, cfg ...Config) (result any, err error)

Get retrieves a value from JSON using a path expression, with performance optimizations.
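A minimal sketch of path-based retrieval:

```go
value, err := processor.Get(`{"user":{"name":"Alice"}}`, "user.name")
// on success, value holds "Alice"
```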

func (*Processor) GetArray added in v1.1.0

func (p *Processor) GetArray(jsonStr, path string, defaultValue ...[]any) []any

GetArray retrieves an array value from JSON at the specified path. Returns defaultValue if provided, otherwise nil when: path not found, value is null, or type conversion fails.

func (*Processor) GetBool added in v1.1.0

func (p *Processor) GetBool(jsonStr, path string, defaultValue ...bool) bool

GetBool retrieves a bool value from JSON at the specified path. Returns defaultValue if provided, otherwise false when: path not found, value is null, or type conversion fails.

func (*Processor) GetCompiled added in v1.2.0

func (p *Processor) GetCompiled(jsonStr string, cp *CompiledPath) (any, error)

GetCompiled retrieves a value from JSON using a pre-compiled path. PERFORMANCE: Skips path parsing for faster repeated operations.

func (*Processor) GetConfig

func (p *Processor) GetConfig() Config

GetConfig returns a copy of the processor configuration

func (*Processor) GetFloat added in v1.3.0

func (p *Processor) GetFloat(jsonStr, path string, defaultValue ...float64) float64

GetFloat retrieves a float64 value from JSON at the specified path. Returns defaultValue if provided, otherwise 0.0 when: path not found, value is null, or type conversion fails.

func (*Processor) GetFromParsed added in v1.2.0

func (p *Processor) GetFromParsed(parsed *ParsedJSON, path string, cfg ...Config) (any, error)

GetFromParsed retrieves a value from a pre-parsed JSON document at the specified path. This is significantly faster than Get() for repeated queries on the same JSON.

OPTIMIZED: Skips JSON parsing, goes directly to path navigation.

func (*Processor) GetHealthStatus

func (p *Processor) GetHealthStatus() HealthStatus

GetHealthStatus returns the current health status

func (*Processor) GetInt added in v1.1.0

func (p *Processor) GetInt(jsonStr, path string, defaultValue ...int) int

GetInt retrieves an int value from JSON at the specified path. Returns defaultValue if provided, otherwise 0 when: path not found, value is null, or type conversion fails.

func (*Processor) GetMultiple

func (p *Processor) GetMultiple(jsonStr string, paths []string, cfg ...Config) (map[string]any, error)

GetMultiple retrieves multiple values from JSON using multiple path expressions
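For illustration, a sketch assuming the returned map is keyed by the requested paths:

```go
values, err := processor.GetMultiple(jsonStr, []string{"user.name", "user.age"})
if err == nil {
    name := values["user.name"]
    _ = name
}
```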

func (*Processor) GetObject added in v1.1.0

func (p *Processor) GetObject(jsonStr, path string, defaultValue ...map[string]any) map[string]any

GetObject retrieves an object value from JSON at the specified path. Returns defaultValue if provided, otherwise nil when: path not found, value is null, or type conversion fails.

func (*Processor) GetStats

func (p *Processor) GetStats() Stats

GetStats returns processor performance statistics

func (*Processor) GetString added in v1.1.0

func (p *Processor) GetString(jsonStr, path string, defaultValue ...string) string

GetString retrieves a string value from JSON at the specified path. Returns defaultValue if provided, otherwise "" when: path not found, value is null, or type conversion fails.
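For illustration, the typed getters (GetString, GetInt, GetBool, GetFloat, GetArray, GetObject) share the same default-value pattern; the paths below are assumptions:

```go
name := processor.GetString(jsonStr, "user.name", "anonymous")
age := processor.GetInt(jsonStr, "user.age")          // 0 if missing
active := processor.GetBool(jsonStr, "user.active", true)
```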

func (*Processor) GetWithContext added in v1.4.1

func (p *Processor) GetWithContext(ctx context.Context, jsonStr, path string, cfg ...Config) (any, error)

GetWithContext retrieves a value from JSON with context support for cancellation. This is the context-aware version of Get() that respects context cancellation and timeout deadlines.

Example:

ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
defer cancel()
value, err := processor.GetWithContext(ctx, jsonStr, "user.name")

func (*Processor) HTMLEscape added in v1.4.0

func (p *Processor) HTMLEscape(dst *bytes.Buffer, src []byte, cfg ...Config)

HTMLEscape appends to dst the JSON-encoded src with HTML-safe escaping. Performs character-level escaping of <, >, &, U+2028, and U+2029 without re-encoding. Compatible with encoding/json.HTMLEscape.

Example:

var buf bytes.Buffer
processor.HTMLEscape(&buf, []byte(`{"url":"<script>alert(1)</script>"}`))

func (*Processor) Indent added in v1.4.0

func (p *Processor) Indent(dst *bytes.Buffer, src []byte, prefix, indent string, cfg ...Config) error

Indent appends to dst an indented form of the JSON-encoded src. Compatible with encoding/json.Indent with optional Config support.

Example:

var buf bytes.Buffer
err := processor.Indent(&buf, []byte(`{"name":"Alice"}`), "", "  ")

func (*Processor) IsClosed

func (p *Processor) IsClosed() bool

IsClosed returns true if the processor has been closed or close timed out. In both states the processor should not accept new operations.

func (*Processor) LoadFromFile

func (p *Processor) LoadFromFile(filePath string, cfg ...Config) (string, error)

LoadFromFile loads JSON data from a file and returns the raw JSON string. The file path is validated for security (path traversal, symlinks, etc.).

Errors:

  • ErrProcessorClosed: processor has been closed
  • ErrSecurityViolation: path contains traversal or unsafe patterns
  • ErrSizeLimit: file exceeds MaxJSONSize
  • File system errors (wrapped in JsonsError)

Example:

jsonStr, err := processor.LoadFromFile("data.json")
if err != nil {
    // Handle error
}

func (*Processor) LoadFromReader

func (p *Processor) LoadFromReader(reader io.Reader, cfg ...Config) (string, error)

LoadFromReader loads JSON data from an io.Reader and returns the raw JSON string. The reader is limited to MaxJSONSize to prevent excessive memory usage.

Errors:

  • ErrProcessorClosed: processor has been closed
  • ErrSizeLimit: data exceeds MaxJSONSize
  • Reader errors (wrapped in JsonsError)

Example:

file, _ := os.Open("data.json")
defer file.Close()
jsonStr, err := processor.LoadFromReader(file)

func (*Processor) MapJSONL added in v1.4.0

func (p *Processor) MapJSONL(reader io.Reader, fn func(lineNum int, item *IterableValue) (any, error)) ([]any, error)

MapJSONL maps JSONL data into a new format using a mapping function

Example:

processor, _ := json.New()
defer processor.Close()

result, err := processor.MapJSONL(reader, func(lineNum int, item *json.IterableValue) (any, error) {
	// Transform each item
	return map[string]any{
		"name": item.GetString("name"),
		"age":  item.GetInt("age"),
	}, nil
})

func (*Processor) Marshal

func (p *Processor) Marshal(value any, cfg ...Config) ([]byte, error)

Marshal converts any Go value to JSON bytes (similar to json.Marshal). PERFORMANCE: uses FastEncoder for simple types to avoid reflection overhead, and encodeWithConfigToBytes for complex types to avoid a string round-trip.

func (*Processor) MarshalIndent

func (p *Processor) MarshalIndent(value any, prefix, indent string, cfg ...Config) ([]byte, error)

MarshalIndent converts any Go value to indented JSON bytes (similar to json.MarshalIndent)

func (*Processor) MarshalToFile added in v1.0.6

func (p *Processor) MarshalToFile(path string, data any, cfg ...Config) error

MarshalToFile converts data to JSON and saves it to the specified file using Config. This is the unified API that accepts variadic Config. Creates parent directories if they don't exist.

Errors:

  • ErrProcessorClosed: processor has been closed
  • ErrSecurityViolation: path contains traversal or unsafe patterns
  • ErrInvalidJSON: data cannot be marshaled
  • File system errors (wrapped in JsonsError)

Example:

// Simple save
err := processor.MarshalToFile("data.json", data)

// Pretty-printed save
err := processor.MarshalToFile("data.json", data, json.PrettyConfig())

func (*Processor) Parse

func (p *Processor) Parse(jsonStr string, target any, cfg ...Config) error

Parse parses a JSON string into the provided target with improved error handling. This is the core parsing method that supports both standard and number-preserving modes.

Parameters:

  • jsonStr: the JSON string to parse
  • target: pointer to the target variable where parsed data will be stored
  • cfg: optional Config for parsing options (e.g., PreserveNumbers)

Errors:

  • ErrProcessorClosed: processor has been closed
  • ErrInvalidJSON: jsonStr is not valid JSON
  • ErrSizeLimit: JSON exceeds MaxJSONSize
  • ErrTypeMismatch: JSON structure doesn't match target type

Example:

// Parse into map
var obj map[string]any
err := processor.Parse(`{"name":"Alice"}`, &obj)

// Parse into struct
type User struct { Name string }
var user User
err := processor.Parse(`{"name":"Alice"}`, &user)

// Parse with number preservation
cfg := json.DefaultConfig()
cfg.PreserveNumbers = true
var data any
err := processor.Parse(`{"price":19.99}`, &data, cfg)

func (*Processor) ParseAny added in v1.3.0

func (p *Processor) ParseAny(jsonStr string, cfg ...Config) (any, error)

ParseAny parses a JSON string and returns the result as any. This method provides the same behavior as the package-level Parse function. Use Parse when you need to unmarshal into a specific target type.

Errors:

  • ErrProcessorClosed: processor has been closed
  • ErrInvalidJSON: jsonStr is not valid JSON
  • ErrSizeLimit: JSON exceeds MaxJSONSize

Example:

data, err := processor.ParseAny(`{"name": "Alice"}`)
if err != nil {
    // Handle error
}
obj := data.(map[string]any)

func (*Processor) PreParse added in v1.2.0

func (p *Processor) PreParse(jsonStr string, cfg ...Config) (*ParsedJSON, error)

PreParse parses a JSON string and returns a ParsedJSON object that can be reused for multiple Get operations. This is a performance optimization for scenarios where the same JSON is queried multiple times.

OPTIMIZED: pre-parsing avoids re-parsing overhead on each query.

Call Release() on the returned ParsedJSON when finished to free the processor reference.

Example:

parsed, err := processor.PreParse(jsonStr)
if err != nil { return err }
defer parsed.Release()
value1, _ := processor.GetFromParsed(parsed, "path1")
value2, _ := processor.GetFromParsed(parsed, "path2")

func (*Processor) Prettify added in v1.3.0

func (p *Processor) Prettify(jsonStr string, cfg ...Config) (string, error)

Prettify formats JSON string with indentation. This is the recommended method for formatting JSON strings. Uses default indentation of 2 spaces, configurable via Config.Indent and Config.Prefix.

Errors:

  • ErrProcessorClosed: processor has been closed
  • ErrInvalidJSON: jsonStr is not valid JSON
  • ErrSizeLimit: JSON exceeds MaxJSONSize

Example:

pretty, err := processor.Prettify(`{"name":"Alice","age":30}`)
// Output:
// {
//   "name": "Alice",
//   "age": 30
// }

// Custom indentation
cfg := json.DefaultConfig()
cfg.Indent = "    " // 4 spaces
pretty, err := processor.Prettify(jsonStr, cfg)

func (*Processor) ProcessBatch

func (p *Processor) ProcessBatch(operations []BatchOperation, cfg ...Config) ([]BatchResult, error)

ProcessBatch processes multiple operations in a single batch

func (*Processor) ReduceJSONL added in v1.4.0

func (p *Processor) ReduceJSONL(reader io.Reader, initial any, fn func(acc any, item *IterableValue) any) (any, error)

ReduceJSONL reduces JSONL data to a single aggregated result using a reducer function. The accumulator starts with the initial value and is updated by the reducer function.

Example:

processor, _ := json.New()
defer processor.Close()

totalAge, err := processor.ReduceJSONL(reader, 0, func(acc any, item *json.IterableValue) any {
	return acc.(int) + item.GetInt("age") // initial value 0 is an int, so assert int
})

func (*Processor) SafeGet

func (p *Processor) SafeGet(jsonStr, path string, cfg ...Config) AccessResult

SafeGet performs a type-safe get operation with comprehensive error handling. Accepts optional Config for controlling validation, security, and caching behavior.

func (*Processor) SaveToFile

func (p *Processor) SaveToFile(filePath string, data any, cfg ...Config) error

SaveToFile saves data to a JSON file using Config. This is the unified API that accepts variadic Config. Creates parent directories if they don't exist.

Errors:

  • ErrProcessorClosed: processor has been closed
  • ErrSecurityViolation: path contains traversal or unsafe patterns
  • ErrInvalidJSON: data contains invalid JSON string
  • File system errors (wrapped in JsonsError)

Example:

// Simple save
err := processor.SaveToFile("data.json", data)

// Pretty-printed save
err := processor.SaveToFile("data.json", data, json.PrettyConfig())

func (*Processor) SaveToWriter

func (p *Processor) SaveToWriter(writer io.Writer, data any, cfg ...Config) error

SaveToWriter saves data to an io.Writer using Config. This is the unified API that accepts variadic Config.

Errors:

  • ErrProcessorClosed: processor has been closed
  • ErrInvalidJSON: data contains invalid JSON string
  • Writer errors (wrapped in JsonsError)

Example:

var buf bytes.Buffer
err := processor.SaveToWriter(&buf, data, json.PrettyConfig())

func (*Processor) Set

func (p *Processor) Set(jsonStr, path string, value any, cfg ...Config) (string, error)

Set sets a value in JSON at the specified path.

Returns:

  • On success: modified JSON string and nil error
  • On failure: original unmodified JSON string and error information
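Based on the signature above, a minimal sketch (the dot-notation path follows the examples elsewhere in this reference):

```go
// Set a nested field; on error the original string is returned unchanged.
updated, err := processor.Set(`{"user":{"name":"Alice"}}`, "user.name", "Bob")
if err != nil {
    log.Fatal(err)
}
fmt.Println(updated) // user.name is now "Bob"
```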

func (*Processor) SetCreate added in v1.3.0

func (p *Processor) SetCreate(jsonStr, path string, value any, cfg ...Config) (string, error)

SetCreate sets a value at the specified path, creating intermediate paths as needed. This is the unified API for set-with-path-creation operations.

Example:

result, err := processor.SetCreate(data, "users[0].profile.name", "Alice")

func (*Processor) SetFromParsed added in v1.2.0

func (p *Processor) SetFromParsed(parsed *ParsedJSON, path string, value any, cfg ...Config) (*ParsedJSON, error)

SetFromParsed modifies a pre-parsed JSON document at the specified path. Returns a new ParsedJSON with the modified data (original is not modified).

OPTIMIZED: Skips JSON parsing, works directly on parsed data.
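A sketch combining PreParse and SetFromParsed (both documented in this reference); releasing the returned ParsedJSON as well is an assumption based on the PreParse contract:

```go
parsed, err := processor.PreParse(jsonStr)
if err != nil {
    return err
}
defer parsed.Release()

// Modify without re-parsing; the original parsed document is untouched.
modified, err := processor.SetFromParsed(parsed, "user.age", 31)
if err != nil {
    return err
}
defer modified.Release()
```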

func (*Processor) SetLogger

func (p *Processor) SetLogger(logger *slog.Logger)

SetLogger sets a custom structured logger for the processor

func (*Processor) SetMultiple

func (p *Processor) SetMultiple(jsonStr string, updates map[string]any, cfg ...Config) (string, error)

SetMultiple sets multiple values in JSON using a map of path-value pairs.

Returns:

  • On success: modified JSON string and nil error
  • On failure: original unmodified JSON string and error information
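A minimal sketch based on the signature above:

```go
// Apply several updates in one call; all paths use the library's path syntax.
updates := map[string]any{
    "user.name": "Bob",
    "user.age":  31,
}
updated, err := processor.SetMultiple(jsonStr, updates)
if err != nil {
    log.Fatal(err)
}
```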

func (*Processor) SetMultipleCreate added in v1.3.0

func (p *Processor) SetMultipleCreate(jsonStr string, updates map[string]any, cfg ...Config) (string, error)

SetMultipleCreate sets multiple values, creating intermediate paths as needed. This is the unified API for batch set-with-path-creation operations.

Example:

result, err := processor.SetMultipleCreate(data, map[string]any{"user.name": "Alice", "user.age": 30})

func (*Processor) StreamJSONL added in v1.4.0

func (p *Processor) StreamJSONL(reader io.Reader, fn func(lineNum int, item *IterableValue) error) error

StreamJSONL streams JSONL data from a reader with IterableValue callback support.

This method provides line-by-line processing of JSONL (NDJSON) files with full IterableValue support for type-safe data access.

Example:

processor, _ := json.New()
defer processor.Close()

err := processor.StreamJSONL(reader, func(lineNum int, item *json.IterableValue) error {
	name := item.GetString("name")
	age := item.GetInt("age")
	fmt.Printf("Line %d: name=%s, age=%d\n", lineNum, name, age)
	return nil // continue processing
	// return item.Break() // to stop iteration
})

func (*Processor) StreamJSONLChunked added in v1.4.0

func (p *Processor) StreamJSONLChunked(reader io.Reader, chunkSize int, fn func(chunk []*IterableValue) error) error

StreamJSONLChunked processes JSONL data in chunks for memory-efficient processing. The chunk size is configurable via the chunkSize parameter.

Example:

processor, _ := json.New()
defer processor.Close()

err := processor.StreamJSONLChunked(reader, 1000, func(chunk []*IterableValue) error {
	// Process chunk of 1000 items
	return nil
})

func (*Processor) StreamJSONLFile added in v1.4.0

func (p *Processor) StreamJSONLFile(filename string, fn func(lineNum int, item *IterableValue) error) error

StreamJSONLFile streams JSONL data from a file with IterableValue callback

Example:

processor, _ := json.New()
defer processor.Close()

err := processor.StreamJSONLFile("data.jsonl", func(lineNum int, item *json.IterableValue) error {
	fmt.Printf("Line %d: %v\n", lineNum, item.GetData())
	return nil
})

func (*Processor) StreamJSONLParallel added in v1.4.0

func (p *Processor) StreamJSONLParallel(reader io.Reader, workers int, fn func(lineNum int, item *IterableValue) error) error

StreamJSONLParallel processes JSONL data in parallel with multiple workers. This method provides parallel processing of JSONL files with configurable worker count.

Example:

processor, _ := json.New()
defer processor.Close()

err := processor.StreamJSONLParallel(reader, 4, func(lineNum int, item *json.IterableValue) error {
	// Process each item in parallel
	return nil
})

func (*Processor) StreamJSONLParallelWithContext added in v1.4.0

func (p *Processor) StreamJSONLParallelWithContext(ctx context.Context, reader io.Reader, workers int, fn func(lineNum int, item *IterableValue) error) error

StreamJSONLParallelWithContext processes JSONL data in parallel with context support for cancellation. Workers and the scanner goroutine respect context cancellation; the context parameter prevents goroutine leaks when the reader or callback blocks.

Example:

ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
defer cancel()
err := processor.StreamJSONLParallelWithContext(ctx, reader, 4, func(lineNum int, item *json.IterableValue) error {
    return nil
})

func (*Processor) Unmarshal

func (p *Processor) Unmarshal(data []byte, value any, cfg ...Config) error

Unmarshal parses the JSON-encoded data and stores the result in the value pointed to by value. This method is fully compatible with encoding/json.Unmarshal, with a fast path for simple cases that avoids string conversion overhead.
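Since the method is compatible with encoding/json.Unmarshal, the familiar struct-tag pattern applies; a minimal sketch:

```go
// Decode into a tagged struct exactly as with encoding/json.
var user struct {
    Name string `json:"name"`
    Age  int    `json:"age"`
}
if err := processor.Unmarshal([]byte(`{"name":"Alice","age":28}`), &user); err != nil {
    log.Fatal(err)
}
fmt.Println(user.Name, user.Age)
```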

func (*Processor) UnmarshalFromFile added in v1.0.6

func (p *Processor) UnmarshalFromFile(path string, v any, cfg ...Config) error

UnmarshalFromFile reads JSON data from the specified file and unmarshals it into the provided value. The file path is validated for security before reading.

Parameters:

  • path: file path to read JSON from
  • v: pointer to the target variable where JSON will be unmarshaled
  • cfg: optional Config for security validation and processing

Errors:

  • ErrProcessorClosed: processor has been closed
  • ErrInvalidJSON: file content is not valid JSON
  • ErrSecurityViolation: path contains traversal or unsafe patterns
  • ErrSizeLimit: file exceeds MaxJSONSize
  • File system errors (wrapped in JsonsError)

Example:

var config Config
err := processor.UnmarshalFromFile("config.json", &config)

func (*Processor) Valid

func (p *Processor) Valid(jsonStr string, cfg ...Config) (bool, error)

Valid validates JSON format without parsing the entire structure. Returns (true, nil) if valid, (false, error) if invalid. The error contains details about what makes the JSON invalid.

Errors:

  • ErrProcessorClosed: processor has been closed
  • ErrInvalidJSON: jsonStr is not valid JSON (returned with false)
  • ErrSizeLimit: JSON exceeds MaxJSONSize

Example:

valid, err := processor.Valid(`{"name":"Alice"}`)
if err != nil {
    // Handle validation error
}
if valid {
    // JSON is valid
}

func (*Processor) ValidBytes added in v1.1.0

func (p *Processor) ValidBytes(data []byte) bool

ValidBytes validates JSON format from a byte slice, matching the encoding/json.Valid signature for compatibility with the standard library.

func (*Processor) ValidateSchema

func (p *Processor) ValidateSchema(jsonStr string, schema *Schema, cfg ...Config) ([]ValidationError, error)

ValidateSchema validates JSON data against a schema

func (*Processor) WarmupCache

func (p *Processor) WarmupCache(jsonStr string, paths []string, cfg ...Config) (*WarmupResult, error)

WarmupCache pre-loads commonly used paths into cache to improve first-access performance
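A sketch using the documented WarmupResult fields to report how many paths were cached:

```go
// Pre-load frequently accessed paths before serving queries.
result, err := processor.WarmupCache(jsonStr, []string{"user.name", "user.tags[0]"})
if err != nil {
    log.Fatal(err)
}
fmt.Printf("warmed %d/%d paths (%.0f%% success)\n",
    result.Successful, result.TotalPaths, result.SuccessRate*100)
```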

type Result added in v1.3.0

type Result[T any] struct {
	Value  T     // The result value (exported for backward compatibility)
	Exists bool  // Whether the path exists
	Error  error // Error if any
}

Result represents a type-safe operation result with comprehensive error handling. This is the unified type for all type-safe operations.

Example:

result := json.GetResult[string](data, "user.name")
if result.Ok() {
    name := result.Unwrap()
}
// Or with default
name := json.GetResult[string](data, "user.name").UnwrapOr("unknown")

func (Result[T]) Ok added in v1.3.0

func (r Result[T]) Ok() bool

Ok returns true if the result is valid (no error and exists).

func (Result[T]) Unwrap added in v1.3.0

func (r Result[T]) Unwrap() T

Unwrap returns the value or zero value if there's an error or value doesn't exist. For panic behavior, use Must() instead.

func (Result[T]) UnwrapOr added in v1.3.0

func (r Result[T]) UnwrapOr(defaultValue T) T

UnwrapOr returns the value or the provided default if there's an error or value doesn't exist.

type Schema

type Schema struct {
	// Type specifies the JSON type: "object", "array", "string", "number", "integer", "boolean", "null".
	Type string `json:"type,omitempty"`

	// Properties defines the schema for each property when Type is "object".
	Properties map[string]*Schema `json:"properties,omitempty"`

	// Items defines the schema for array elements when Type is "array".
	Items *Schema `json:"items,omitempty"`

	// Required lists property names that must be present (for objects).
	Required []string `json:"required,omitempty"`

	// MinLength is the minimum string length (for strings).
	MinLength int `json:"minLength,omitempty"`

	// MaxLength is the maximum string length (for strings).
	MaxLength int `json:"maxLength,omitempty"`

	// Minimum is the minimum numeric value (for numbers/integers).
	Minimum float64 `json:"minimum,omitempty"`

	// Maximum is the maximum numeric value (for numbers/integers).
	Maximum float64 `json:"maximum,omitempty"`

	// Pattern is a regex pattern that the string must match (for strings).
	Pattern string `json:"pattern,omitempty"`

	// Format specifies a semantic format: "email", "uri", "date", "date-time", etc.
	Format string `json:"format,omitempty"`

	// AdditionalProperties controls whether extra properties are allowed (for objects).
	AdditionalProperties bool `json:"additionalProperties,omitempty"`

	// MinItems is the minimum number of items (for arrays).
	MinItems int `json:"minItems,omitempty"`

	// MaxItems is the maximum number of items (for arrays).
	MaxItems int `json:"maxItems,omitempty"`

	// UniqueItems requires all array elements to be unique (for arrays).
	UniqueItems bool `json:"uniqueItems,omitempty"`

	// Enum restricts the value to one of the specified values.
	Enum []any `json:"enum,omitempty"`

	// Const requires the value to equal this exact value.
	Const any `json:"const,omitempty"`

	// MultipleOf requires the value to be a multiple of this number.
	MultipleOf float64 `json:"multipleOf,omitempty"`

	// ExclusiveMinimum excludes the minimum value itself (for numbers).
	ExclusiveMinimum bool `json:"exclusiveMinimum,omitempty"`

	// ExclusiveMaximum excludes the maximum value itself (for numbers).
	ExclusiveMaximum bool `json:"exclusiveMaximum,omitempty"`

	// Title is a human-readable title for the schema.
	Title string `json:"title,omitempty"`

	// Description is a human-readable description of the schema.
	Description string `json:"description,omitempty"`

	// Default is the default value for the property.
	Default any `json:"default,omitempty"`

	// Examples provides example values for documentation.
	Examples []any `json:"examples,omitempty"`
	// contains filtered or unexported fields
}

Schema represents a JSON schema for validation. Supports a subset of JSON Schema Draft 7 for validating JSON structures.

Example:

schema := &json.Schema{
    Type:     "object",
    Required: []string{"name", "email"},
    Properties: map[string]*json.Schema{
        "name":  {Type: "string", MinLength: 1},
        "email": {Type: "string", Format: "email"},
        "age":   {Type: "integer", Minimum: 0},
    },
}
errs, err := processor.ValidateSchema(jsonStr, schema)

func DefaultSchema

func DefaultSchema() *Schema

DefaultSchema returns a default schema configuration. All zero values are omitted for brevity; only non-zero defaults are set.

func NewSchemaWithConfig added in v1.3.0

func NewSchemaWithConfig(cfg SchemaConfig) *Schema

NewSchemaWithConfig creates a new Schema with the provided configuration. This is the recommended way to create configured Schema instances.

type SchemaConfig added in v1.3.0

type SchemaConfig struct {
	Type                 string
	Properties           map[string]*Schema
	Items                *Schema
	Required             []string
	MinLength            *int
	MaxLength            *int
	Minimum              *float64
	Maximum              *float64
	Pattern              string
	Format               string
	AdditionalProperties *bool
	MinItems             *int
	MaxItems             *int
	UniqueItems          bool
	Enum                 []any
	Const                any
	MultipleOf           *float64
	ExclusiveMinimum     *bool
	ExclusiveMaximum     *bool
	Title                string
	Description          string
	Default              any
	Examples             []any
}

SchemaConfig provides configuration options for creating a Schema. This follows the Config pattern as required by the design guidelines.

func DefaultSchemaConfig added in v1.4.0

func DefaultSchemaConfig() SchemaConfig

DefaultSchemaConfig returns the default configuration for creating a Schema. This follows the unified Config pattern as required by the design guidelines.

Example:

cfg := json.DefaultSchemaConfig()
cfg.Type = "object"
cfg.Required = []string{"name", "email"}
schema := json.NewSchemaWithConfig(cfg)

type Stats

type Stats struct {
	CacheSize        int64         `json:"cache_size"`
	CacheMemory      int64         `json:"cache_memory"`
	MaxCacheSize     int           `json:"max_cache_size"`
	HitCount         int64         `json:"hit_count"`
	MissCount        int64         `json:"miss_count"`
	HitRatio         float64       `json:"hit_ratio"`
	CacheTTL         time.Duration `json:"cache_ttl"`
	CacheEnabled     bool          `json:"cache_enabled"`
	IsClosed         bool          `json:"is_closed"`
	MemoryEfficiency float64       `json:"memory_efficiency"`
	OperationCount   int64         `json:"operation_count"`
	ErrorCount       int64         `json:"error_count"`
}

Stats provides processor performance statistics

func GetStats added in v1.1.0

func GetStats() Stats

GetStats returns statistics about the default processor.
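A sketch reading a few of the documented Stats fields to monitor cache effectiveness:

```go
// Inspect default-processor cache behavior at runtime.
stats := json.GetStats()
fmt.Printf("cache hit ratio: %.2f (hits=%d, misses=%d, ops=%d)\n",
    stats.HitRatio, stats.HitCount, stats.MissCount, stats.OperationCount)
```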

type StreamIterator added in v1.2.0

type StreamIterator struct {
	// contains filtered or unexported fields
}

StreamIterator provides memory-efficient iteration over large JSON arrays. It processes elements one at a time without loading the entire array into memory.

func NewStreamIterator added in v1.2.0

func NewStreamIterator(reader io.Reader, cfg ...Config) *StreamIterator

NewStreamIterator creates a stream iterator from a reader with default settings. The optional cfg parameter allows customization using the unified Config pattern.

Example:

// Default settings
iter := json.NewStreamIterator(reader)

// With custom configuration
cfg := json.DefaultConfig()
cfg.BufferSize = 64 * 1024
iter := json.NewStreamIterator(reader, cfg)

func (*StreamIterator) Err added in v1.2.0

func (si *StreamIterator) Err() error

Err returns any error encountered during iteration

func (*StreamIterator) Index added in v1.2.0

func (si *StreamIterator) Index() int

Index returns the current index

func (*StreamIterator) Next added in v1.2.0

func (si *StreamIterator) Next() bool

Next advances to the next element. Returns true if there is a next element, false otherwise.

func (*StreamIterator) Value added in v1.2.0

func (si *StreamIterator) Value() any

Value returns the current element
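The methods above compose into the usual iterator loop; a sketch assuming each array element decodes to any:

```go
// Iterate a large JSON array one element at a time.
iter := json.NewStreamIterator(reader)
for iter.Next() {
    fmt.Printf("element %d: %v\n", iter.Index(), iter.Value())
}
if err := iter.Err(); err != nil {
    log.Fatal(err)
}
```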

type StreamObjectIterator added in v1.2.0

type StreamObjectIterator struct {
	// contains filtered or unexported fields
}

StreamObjectIterator provides memory-efficient iteration over JSON objects

func NewStreamObjectIterator added in v1.2.0

func NewStreamObjectIterator(reader io.Reader, cfg ...Config) *StreamObjectIterator

NewStreamObjectIterator creates a stream object iterator from a reader. The optional cfg parameter allows customization using the unified Config pattern. When config is provided, cfg.BufferSize is used for buffered reading.

Example:

// Default settings
iter := json.NewStreamObjectIterator(reader)

// With custom buffer size
cfg := json.DefaultConfig()
cfg.BufferSize = 128 * 1024
iter := json.NewStreamObjectIterator(reader, cfg)

func (*StreamObjectIterator) Err added in v1.2.0

func (soi *StreamObjectIterator) Err() error

Err returns any error encountered

func (*StreamObjectIterator) Key added in v1.2.0

func (soi *StreamObjectIterator) Key() string

Key returns the current key

func (*StreamObjectIterator) Next added in v1.2.0

func (soi *StreamObjectIterator) Next() bool

Next advances to the next key-value pair

func (*StreamObjectIterator) Value added in v1.2.0

func (soi *StreamObjectIterator) Value() any

Value returns the current value
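The same loop shape applies to objects, pairing Key with Value; a sketch:

```go
// Iterate a large JSON object one key-value pair at a time.
iter := json.NewStreamObjectIterator(reader)
for iter.Next() {
    fmt.Printf("%s = %v\n", iter.Key(), iter.Value())
}
if err := iter.Err(); err != nil {
    log.Fatal(err)
}
```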

type SyntaxError

type SyntaxError struct {
	Offset int64 // error occurred after reading Offset bytes
	// contains filtered or unexported fields
}

SyntaxError is a description of a JSON syntax error. Unmarshal will return a SyntaxError if the JSON can't be parsed.

func (*SyntaxError) Error

func (e *SyntaxError) Error() string

type Token

type Token any

Token holds a value of one of these types:

  • Delim, for the four JSON delimiters [ ] { }
  • bool, for JSON booleans
  • float64, for JSON numbers
  • Number, for JSON numbers
  • string, for JSON string literals
  • nil, for JSON null
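Since the package advertises drop-in compatibility with encoding/json, a token loop presumably mirrors the standard library's Decoder.Token (an assumption; Decoder and NewDecoder are documented elsewhere in this reference):

```go
// Walk the token stream, printing each token's dynamic type.
dec := json.NewDecoder(strings.NewReader(`{"a":1,"b":[true,null]}`))
for {
    tok, err := dec.Token()
    if err == io.EOF {
        break
    }
    if err != nil {
        log.Fatal(err)
    }
    fmt.Printf("%T: %v\n", tok, tok)
}
```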

type TypeEncoder added in v1.3.0

type TypeEncoder interface {
	// Encode converts a specific type to its JSON representation.
	// Return the JSON string (including quotes for strings) or an error.
	Encode(v reflect.Value) (string, error)
}

TypeEncoder handles encoding for specific reflect.Types. Register via Config.CustomTypeEncoders map.

Example:

type TimeEncoder struct{}
func (e *TimeEncoder) Encode(v reflect.Value) (string, error) {
    t := v.Interface().(time.Time)
    return `"` + t.Format(time.RFC3339) + `"`, nil
}

cfg := json.DefaultConfig()
cfg.CustomTypeEncoders = map[reflect.Type]json.TypeEncoder{
    reflect.TypeOf(time.Time{}): &TimeEncoder{},
}

type UnmarshalTypeError

type UnmarshalTypeError struct {
	Value  string       // description of JSON value - "bool", "array", "number -5"
	Type   reflect.Type // type of Go value it could not be assigned to
	Offset int64        // error occurred after reading Offset bytes
	Struct string       // name of the root type containing the field
	Field  string       // the full path from root node to the value
	Err    error        // may be nil
}

UnmarshalTypeError describes a JSON value that was not appropriate for a value of a specific Go type.

func (*UnmarshalTypeError) Error

func (e *UnmarshalTypeError) Error() string

func (*UnmarshalTypeError) Unwrap

func (e *UnmarshalTypeError) Unwrap() error

type UnsupportedTypeError

type UnsupportedTypeError struct {
	Type reflect.Type
}

UnsupportedTypeError is returned by Marshal when attempting to encode an unsupported value type.

func (*UnsupportedTypeError) Error

func (e *UnsupportedTypeError) Error() string

type UnsupportedValueError

type UnsupportedValueError struct {
	Value reflect.Value
	Str   string
}

UnsupportedValueError is returned by Marshal when attempting to encode an unsupported value.

func (*UnsupportedValueError) Error

func (e *UnsupportedValueError) Error() string

type ValidationError

type ValidationError struct {
	// Path is the JSON path where the validation error occurred.
	Path string `json:"path"`
	// Message describes the validation failure.
	Message string `json:"message"`
}

ValidationError represents a schema validation error. It includes the path where the error occurred and a descriptive message.

Example:

errs, err := processor.ValidateSchema(jsonStr, schema)
if err != nil {
    // Handle processing error
}
for _, ve := range errs {
    fmt.Printf("Error at %s: %s\n", ve.Path, ve.Message)
}

func ValidateSchema

func ValidateSchema(jsonStr string, schema *Schema, cfg ...Config) ([]ValidationError, error)

ValidateSchema validates JSON data against a schema

func (*ValidationError) Error

func (ve *ValidationError) Error() string

type Validator added in v1.0.6

type Validator interface {
	// Validate checks JSON string for issues.
	// Returns nil if valid, or an error describing the problem.
	Validate(jsonStr string) error
}

Validator validates JSON input before processing. Implement this interface to add custom validation logic.

Example:

type SizeValidator struct { MaxSize int64 }
func (v *SizeValidator) Validate(jsonStr string) error {
    if int64(len(jsonStr)) > v.MaxSize {
        return fmt.Errorf("JSON exceeds max size: %d", v.MaxSize)
    }
    return nil
}

type WarmupResult

type WarmupResult struct {
	TotalPaths  int      `json:"total_paths"`
	Successful  int      `json:"successful"`
	Failed      int      `json:"failed"`
	SuccessRate float64  `json:"success_rate"`
	FailedPaths []string `json:"failed_paths,omitempty"`
}

WarmupResult represents the result of a cache warmup operation

func WarmupCache added in v1.1.0

func WarmupCache(jsonStr string, paths []string, cfg ...Config) (*WarmupResult, error)

WarmupCache pre-warms the cache for frequently accessed paths. This can improve performance for subsequent operations on the same JSON.
