sdutil

Published: Jul 29, 2025 License: MIT Imports: 11 Imported by: 1

README

Secret Detection Utility Module

This module provides reusable utility functions for Secret Detection Clients (CLI/Server/Adapters) to abstract non-domain-specific tasks.

Utility        Purpose
file_iter      Lazily walks through file paths (files/directories) sequentially and applies any input filters.
data_chunker   Chunks data/files and lazily iterates/streams each chunk to the consumer.
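
The two utilities are designed to compose: walk files with file_iter, then feed each file to data_chunker. The sketch below is illustrative only; the import path is a placeholder, the chunk size and skip patterns are arbitrary, and FileIter.Do is assumed to return iter.Seq[FileIterResult], since its signature is not shown in the index below.

// Hedged usage sketch: placeholder import path; FileIter.Do's return type is assumed.
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	sdutil "example.com/sdutil" // placeholder import path
)

func main() {
	// Walk ".", skipping anything under testdata, skipping .git directories,
	// and including file sizes in the results.
	files := sdutil.NewFileIter([]string{"."}, []string{"testdata/**"}, true, true)

	// Chunk each file into pieces of at most 64 KiB for scanning.
	chunker, err := sdutil.NewDataChunker(64 * 1024)
	if err != nil {
		log.Fatal(err)
	}
	defer chunker.Close()

	for res := range files.Do() { // assumed to yield FileIterResult values
		if res.Err != nil || res.Skipped {
			continue
		}
		f, err := os.Open(res.FilePath)
		if err != nil {
			log.Printf("open %s: %v", res.FilePath, err)
			continue
		}
		for el := range chunker.Chunk(context.Background(), f) {
			if el.Error != nil {
				log.Printf("read %s: %v", res.FilePath, el.Error)
				break
			}
			fmt.Printf("%s: %d bytes at offset %d\n",
				res.FilePath, len(el.Chunk.Data), el.Chunk.Offset)
		}
		f.Close()
	}
}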

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type ChunkInfo

type ChunkInfo struct {
	Data   []byte
	Offset uint64
}

ChunkInfo represents a chunk of data along with its offset.

type ChunkIterEl

type ChunkIterEl struct {
	Chunk ChunkInfo
	Error error
}

func (ChunkIterEl) Empty

func (el ChunkIterEl) Empty() bool

type DataChunker

type DataChunker struct {
	// Size of chunk to be created
	ChunkSize int
	// contains filtered or unexported fields
}

DataChunker chunks payloads for scanning.

func NewDataChunker

func NewDataChunker(chunkSize int) (*DataChunker, error)

func (*DataChunker) Chunk

func (p *DataChunker) Chunk(ctx context.Context, r io.Reader) iter.Seq[ChunkIterEl]

Chunk creates an iterator that reads any source implementing io.Reader in chunks of the configured size.
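
A minimal sketch of draining this iterator, checking each element's Error before using its Chunk; the import path is a placeholder.

package sdutil_test

import (
	"context"
	"fmt"
	"log"
	"strings"

	sdutil "example.com/sdutil" // placeholder import path
)

func ExampleDataChunker_Chunk() {
	chunker, err := sdutil.NewDataChunker(4)
	if err != nil {
		log.Fatal(err)
	}
	defer chunker.Close()

	for el := range chunker.Chunk(context.Background(), strings.NewReader("abcdefgh")) {
		if el.Error != nil {
			log.Fatal(el.Error)
		}
		fmt.Printf("offset=%d data=%q\n", el.Chunk.Offset, el.Chunk.Data)
	}
}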

func (*DataChunker) ChunkByteStream

func (p *DataChunker) ChunkByteStream(
	ctx context.Context,
	data []byte,
	bufferSize int,
) chan StreamChunkInfo

ChunkByteStream converts the chunk iterator into a stream of StreamChunkInfo values. It closes the stream when done or when the context is cancelled. If an error occurs while reading, it sends the error through the stream. It returns an unbuffered channel to keep chunk consumption synchronized.
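
A sketch of consuming the stream for an in-memory payload; the import path is a placeholder, and the bufferSize value passed here is a guess since its semantics are not described above.

package sdutil_test

import (
	"context"
	"fmt"
	"log"

	sdutil "example.com/sdutil" // placeholder import path
)

func ExampleDataChunker_ChunkByteStream() {
	chunker, err := sdutil.NewDataChunker(1024)
	if err != nil {
		log.Fatal(err)
	}
	defer chunker.Close()

	// Cancelling this context stops the producer and closes the stream.
	ctx, cancel := context.WithCancel(context.Background())
	defer cancel()

	payload := make([]byte, 10*1024) // e.g. a payload already held in memory
	for sc := range chunker.ChunkByteStream(ctx, payload, 0) { // bufferSize 0 is an assumption
		if sc.Error != nil {
			log.Printf("chunking failed: %v", sc.Error)
			break
		}
		fmt.Printf("chunk of %d bytes at offset %d\n", len(sc.Data), sc.Offset)
	}
}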

func (*DataChunker) ChunkBytes

func (p *DataChunker) ChunkBytes(data []byte) iter.Seq[ChunkInfo]

ChunkBytes creates an iterator over the given data, splitting it into chunks. The iterator provides chunks of size up to the given chunk size. If the data size is less than the chunk size, it yields the entire data as a single chunk.
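
A minimal sketch; the import path is a placeholder.

package sdutil_test

import (
	"fmt"
	"log"

	sdutil "example.com/sdutil" // placeholder import path
)

func ExampleDataChunker_ChunkBytes() {
	chunker, err := sdutil.NewDataChunker(3)
	if err != nil {
		log.Fatal(err)
	}
	defer chunker.Close()

	for c := range chunker.ChunkBytes([]byte("abcdefgh")) {
		fmt.Printf("offset=%d data=%q\n", c.Offset, c.Data)
	}
	// With an 8-byte input and a chunk size of 3, this is expected to yield
	// chunks at offsets 0, 3, and 6 (the last one shorter than ChunkSize).
}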

func (*DataChunker) ChunkStream

func (p *DataChunker) ChunkStream(
	ctx context.Context,
	r io.Reader,
	bufferSize int,
) chan StreamChunkInfo

ChunkStream creates a stream of chunks from the given io.Reader. It closes the stream when done or when the context is cancelled. If an error occurs while reading, it sends the error through the stream. It returns an unbuffered channel to keep chunk consumption synchronized.
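
A sketch of streaming chunks from an arbitrary io.Reader such as an open file or request body; the package name and scanReader helper are hypothetical, the import path is a placeholder, and the bufferSize value is a guess.

package scanner // hypothetical package, for illustration only

import (
	"context"
	"fmt"
	"io"

	sdutil "example.com/sdutil" // placeholder import path
)

// scanReader streams r in 32 KiB chunks and prints each chunk's offset.
// It is a hypothetical helper, not part of the module.
func scanReader(ctx context.Context, r io.Reader) error {
	chunker, err := sdutil.NewDataChunker(32 * 1024)
	if err != nil {
		return err
	}
	defer chunker.Close()

	for sc := range chunker.ChunkStream(ctx, r, 0) { // bufferSize 0 is an assumption
		if sc.Error != nil {
			return sc.Error
		}
		fmt.Printf("scanned %d bytes at offset %d\n", len(sc.Data), sc.Offset)
	}
	return nil
}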

func (*DataChunker) Close

func (p *DataChunker) Close()

type FileIter

type FileIter struct {
	// contains filtered or unexported fields
}

FileIter holds the paths to be iterated and the skip paths, and is used by the fileIter function to iterate through paths while skipping the specified ones. This is useful when iterating over a set of paths while excluding certain paths by glob pattern or direct match.

func NewFileIter

func NewFileIter(paths, skipPaths []string, skipGitDir, includeSize bool) FileIter

func (FileIter) Do

Do creates an iterator over a list of file paths, skipping specified skip paths.
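
A sketch of walking paths with glob-based skips; Do's signature is not shown above, so it is assumed here to yield FileIterResult values via iter.Seq, and the import path is a placeholder.

package sdutil_test

import (
	"fmt"
	"log"

	sdutil "example.com/sdutil" // placeholder import path
)

func ExampleFileIter_Do() {
	// Walk two roots, skip node_modules and anything matching *.min.js,
	// skip .git directories, and include file sizes in each result.
	it := sdutil.NewFileIter(
		[]string{"./cmd", "./internal"},
		[]string{"**/node_modules/**", "*.min.js"},
		true, // skipGitDir
		true, // includeSize
	)

	for res := range it.Do() { // assumed to yield FileIterResult values
		switch {
		case res.Err != nil:
			log.Printf("walk error: %v", res.Err)
		case res.Skipped:
			// Path matched a skip pattern; nothing to scan.
		default:
			fmt.Printf("%s (%d bytes)\n", res.FilePath, res.Size)
		}
	}
}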

type FileIterResult

type FileIterResult struct {
	FilePath string
	Err      error
	Size     int64 // If includeSize is true, this will be populated
	Skipped  bool
}

FileIterResult represents the result of a single iteration over payloads.

type StreamChunkInfo

type StreamChunkInfo struct {
	ChunkInfo // Embedded ChunkInfo

	Error error
}

StreamChunkInfo encapsulates a data chunk and any error encountered during chunk streaming.
