gdl

package module
v1.4.0
Published: Oct 11, 2025 License: MIT Imports: 13 Imported by: 0

README

gdl - Go Downloader

A fast, concurrent, and feature-rich file downloader library and CLI tool written in Go.

✨ Features

  • 🚀 Optimized Performance: Smart defaults with adaptive concurrency based on file size
  • ⚡ Bandwidth Control: Rate limiting with human-readable formats (1MB/s, 500k, etc.); a parsing sketch follows this list
  • 📊 Progress Tracking: Real-time progress with multiple display formats
  • 🔄 Resume Support: Automatic resume of interrupted downloads
  • 🌐 Protocol Support: HTTP/HTTPS with custom headers and proxy support
  • 🛑️ Error Handling: Comprehensive error handling with smart retry logic
  • ⚡ Cross-Platform: Works on Linux, macOS, Windows, and ARM (including Apple Silicon, Raspberry Pi, and ARM servers)
  • 🔧 Dual Interface: Both library API and command-line tool
  • 📱 User-Friendly: Interactive prompts and helpful error messages
  • 🔌 Plugin System: Extensible plugin architecture for custom functionality
  • 🎯 Event-Driven: Hook into download lifecycle events
  • 📊 Performance Monitoring: Metrics collection and aggregation for production use
  • 🔐 Security: Built-in security constraints and validation
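
On the library side, Options.MaxRate takes a plain bytes-per-second value, while the CLI accepts the human-readable forms listed above. The following is a minimal user-side sketch (a hypothetical parseRate helper, not part of gdl) showing one way to convert such strings before passing them to the library:

package main

import (
    "fmt"
    "strconv"
    "strings"
)

// parseRate converts human-readable rate strings such as "1MB/s" or "500k"
// into the plain bytes-per-second value expected by gdl.Options.MaxRate.
// This helper is illustrative only and is not part of the gdl API.
func parseRate(s string) (int64, error) {
    s = strings.ToLower(strings.TrimSuffix(strings.TrimSpace(s), "/s"))
    multiplier := int64(1)
    switch {
    case strings.HasSuffix(s, "gb"), strings.HasSuffix(s, "g"):
        multiplier = 1 << 30
    case strings.HasSuffix(s, "mb"), strings.HasSuffix(s, "m"):
        multiplier = 1 << 20
    case strings.HasSuffix(s, "kb"), strings.HasSuffix(s, "k"):
        multiplier = 1 << 10
    }
    value, err := strconv.ParseFloat(strings.TrimRight(s, "bgkm"), 64)
    if err != nil {
        return 0, fmt.Errorf("invalid rate %q: %w", s, err)
    }
    return int64(value * float64(multiplier)), nil
}

func main() {
    for _, r := range []string{"1MB/s", "500k"} {
        bps, _ := parseRate(r)
        fmt.Printf("%s -> %d bytes/s\n", r, bps) // 1MB/s -> 1048576, 500k -> 512000
    }
}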

📈 Performance

gdl picks its download strategy automatically based on file size; a short sketch of these defaults follows the list below:

Benchmark-Optimized Smart Defaults
  • Small files (<1MB): Lightweight mode - minimal overhead, 60-90% of curl speed ⚡
  • Small-medium files (1-10MB): 2 concurrent connections
  • Medium files (10-100MB): 4 concurrent connections
  • Large files (100MB-1GB): 8 concurrent connections
  • Very large files (>1GB): 16 concurrent connections
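
For illustration, the defaults above can be read as a simple size-to-concurrency mapping. This is a sketch of the documented behavior, not gdl's internal implementation:

package main

import "fmt"

// defaultConcurrency mirrors the documented smart defaults: it returns the
// number of concurrent connections gdl is described as using for a file of
// the given size.
func defaultConcurrency(size int64) int {
    const (
        MB = int64(1) << 20
        GB = int64(1) << 30
    )
    switch {
    case size < 1*MB:
        return 1 // lightweight mode: a single connection with minimal overhead
    case size < 10*MB:
        return 2
    case size < 100*MB:
        return 4
    case size <= 1*GB:
        return 8
    default:
        return 16
    }
}

func main() {
    for _, size := range []int64{512 << 10, 5 << 20, 50 << 20, 500 << 20, 2 << 30} {
        fmt.Printf("%10d bytes -> %2d connection(s)\n", size, defaultConcurrency(size))
    }
}
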
Real-World Performance

Based on benchmarks against curl and wget:

File Size gdl (optimized) gdl (baseline) curl wget
100KB 90% ⚡ 30% 100% 90%
500KB 80% ⚡ 20% 100% 80%
1MB 80% 80% 100% 130%
10MB 85% 🚀 70% 100% 90%
50MB 90% 🚀 60% 100% 85%

Performance as % of curl speed (baseline = 100%). Higher is better.

Technical Optimizations
  • Zero-Copy I/O: Linux sendfile for files >10MB reduces CPU by 20-30%
  • Buffer Pooling: 4-tier memory pool (8KB/64KB/1MB/4MB) reduces allocations by 50-90% (see the sketch after this list)
  • Advanced Connection Pool: DNS caching, TLS session resumption, CDN optimization
  • Lightweight Mode: Minimal overhead HTTP client for files <1MB (3-6x faster)
  • Optimized HTTP Client: Enhanced connection pooling and HTTP/2 support
  • Adaptive Chunk Sizing: Dynamic buffer sizes (8KB-1MB) based on file size
  • Memory Efficiency: 50-90% less memory usage with advanced pooling
  • CI Performance Testing: Automated regression detection with 10% threshold
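
The buffer-pooling bullet above can be pictured as a set of sync.Pool tiers. The sketch below shows the general technique with hypothetical getBuffer/putBuffer helpers; it is not gdl's internal code:

package main

import (
    "fmt"
    "sync"
)

// Four size classes, matching the 8KB/64KB/1MB/4MB tiers described above.
var tiers = []int{8 << 10, 64 << 10, 1 << 20, 4 << 20}

var pools = func() []*sync.Pool {
    ps := make([]*sync.Pool, len(tiers))
    for i, size := range tiers {
        size := size
        ps[i] = &sync.Pool{New: func() any { b := make([]byte, size); return &b }}
    }
    return ps
}()

// getBuffer returns a pooled buffer of at least n bytes and the tier it came from.
func getBuffer(n int) (*[]byte, int) {
    for i, size := range tiers {
        if n <= size {
            return pools[i].Get().(*[]byte), i
        }
    }
    return pools[len(pools)-1].Get().(*[]byte), len(pools) - 1
}

// putBuffer returns a buffer to its tier so later calls reuse it instead of allocating.
func putBuffer(b *[]byte, tier int) { pools[tier].Put(b) }

func main() {
    buf, tier := getBuffer(32 << 10) // needs 32KB, receives the 64KB tier
    fmt.Printf("got %d-byte buffer from tier %d\n", len(*buf), tier)
    putBuffer(buf, tier)
}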

📦 Installation

As a CLI tool
Go Install
go install github.com/forest6511/gdl/cmd/gdl@latest
Homebrew (macOS/Linux)
brew tap forest6511/tap
brew install forest6511/tap/gdl

Note: Use the full tap name forest6511/tap/gdl to avoid conflicts with the GNOME gdl package.

Binary Downloads

Download pre-built binaries from GitHub Releases

As a library
go get github.com/forest6511/gdl

🚀 Quick Start

CLI Usage
# Simple download (uses smart defaults)
gdl https://example.com/file.zip

# Override smart defaults with custom settings
gdl --concurrent 8 --chunk-size 2MB -o myfile.zip https://example.com/file.zip

# With bandwidth limiting
gdl --max-rate 1MB/s https://example.com/large-file.zip
gdl --max-rate 500k https://example.com/file.zip  # Smart concurrency still applies

# With custom headers and resume
gdl -H "Authorization: Bearer token" --resume https://example.com/file.zip

# Using plugins
gdl plugin install oauth2-auth
gdl plugin list
gdl --plugin oauth2-auth https://secure-api.example.com/file.zip
Library Usage
package main

import (
    "bytes"
    "context"
    "fmt"

    "github.com/forest6511/gdl"
    "github.com/forest6511/gdl/pkg/events"
)

func main() {
    // Simple download using Download function
    stats, err := gdl.Download(context.Background(), 
        "https://example.com/file.zip", "file.zip")
    if err != nil {
        panic(err)
    }
    fmt.Printf("Downloaded %d bytes in %v\n", stats.BytesDownloaded, stats.Duration)
    
    // Download with progress callback and bandwidth limiting using DownloadWithOptions
    options := &gdl.Options{
        MaxConcurrency: 4,
        MaxRate: 1024 * 1024, // 1MB/s rate limit
        ProgressCallback: func(p gdl.Progress) {
            fmt.Printf("Progress: %.1f%% Speed: %.2f MB/s\n", 
                p.Percentage, float64(p.Speed)/1024/1024)
        },
    }
    
    stats, err = gdl.DownloadWithOptions(context.Background(),
        "https://example.com/file.zip", "file.zip", options)
    if err == nil {
        fmt.Printf("Download completed successfully! Average speed: %.2f MB/s\n", 
            float64(stats.AverageSpeed)/1024/1024)
    }
    
    // Download to memory using DownloadToMemory
    data, stats, err := gdl.DownloadToMemory(context.Background(),
        "https://example.com/small-file.txt")
    if err == nil {
        fmt.Printf("Downloaded %d bytes to memory in %v\n", len(data), stats.Duration)
    }
    
    // Download to any io.Writer using DownloadToWriter
    var buffer bytes.Buffer
    stats, err = gdl.DownloadToWriter(context.Background(),
        "https://example.com/data.json", &buffer)
    if err == nil {
        fmt.Printf("Downloaded to buffer: %d bytes\n", stats.BytesDownloaded)
    }
        
    // Resume a partial download using DownloadWithResume
    stats, err = gdl.DownloadWithResume(context.Background(),
        "https://example.com/large-file.zip", "large-file.zip")
    if err == nil && stats.Resumed {
        fmt.Printf("Successfully resumed download: %d bytes\n", stats.BytesDownloaded)
    }
        
    // Get file information without downloading using GetFileInfo
    fileInfo, err := gdl.GetFileInfo(context.Background(),
        "https://example.com/file.zip")
    if err == nil {
        fmt.Printf("File size: %d bytes\n", fileInfo.Size)
    }
    
    // Using the extensible Downloader with plugins
    downloader := gdl.NewDownloader()
    
    // Register a custom protocol handler.
    // customProtocolHandler is a user-defined protocols.ProtocolHandler (not shown here).
    if err = downloader.RegisterProtocol(customProtocolHandler); err != nil {
        panic(err)
    }

    // Use middleware.
    // rateLimitingMiddleware is a user-defined middleware.Middleware (not shown here).
    downloader.UseMiddleware(rateLimitingMiddleware)
    
    // Register event listeners
    downloader.On(events.EventDownloadStarted, func(event events.Event) {
        fmt.Printf("Download started: %s\n", event.Data["url"])
    })
    
    // Download with plugins and middleware
    stats, err = downloader.Download(context.Background(),
        "https://example.com/file.zip", "file.zip", options)
    if err == nil {
        fmt.Printf("Plugin-enhanced download completed: %d bytes\n", stats.BytesDownloaded)
    }
}

📚 Complete API Documentation

📋 Complete CLI Reference

📖 Examples

Complete working examples are available in the examples/ directory. Run each of the commands below from the repository root:

Running Examples
# Core functionality examples
cd examples/01_basic_download && go run main.go
cd examples/02_concurrent_download && go run main.go
cd examples/03_progress_tracking && go run main.go
cd examples/04_resume_functionality && go run main.go
cd examples/05_error_handling && go run main.go
cd examples/06_production_usage && go run main.go

# Interface examples
cd examples/cli
chmod +x *.sh
./basic_cli_examples.sh
./advanced_cli_examples.sh

# Integration demo
cd examples/integration
go run feature_demo.go

# Plugin examples (each plugin is built from its own directory)
(cd examples/plugins/auth/oauth2 && go build -buildmode=plugin -o oauth2.so)
(cd examples/plugins/storage/s3 && go build -buildmode=plugin -o s3.so)

🔄 Feature Parity Matrix

Feature CLI Library Description
Basic download ✅ ✅ Simple URL to file download
Custom destination ✅ ✅ Specify output filename/path
Overwrite existing ✅ ✅ Force overwrite of existing files
Create directories ✅ ✅ Auto-create parent directories
Concurrent downloads ✅ ✅ Multiple simultaneous connections
Custom chunk size ✅ ✅ Configurable download chunks
Bandwidth throttling ✅ ✅ Rate limiting with human-readable formats
Single-threaded mode ✅ ✅ Disable concurrent downloads
Resume downloads ✅ ✅ Continue interrupted downloads
Retry on failure ✅ ✅ Automatic retry with backoff
Custom retry settings ✅ ✅ Configure retry attempts/delays
Custom headers ✅ ✅ Add custom HTTP headers
Custom User-Agent ✅ ✅ Set custom User-Agent string
Proxy support ✅ ✅ HTTP proxy configuration
SSL verification control ✅ ✅ Skip SSL certificate verification
Redirect handling ✅ ✅ Follow HTTP redirects
Timeout configuration ✅ ✅ Set request/download timeouts
Progress display ✅ ❌ Visual progress bars
Progress callbacks ❌ ✅ Programmatic progress updates
Multiple progress formats ✅ ❌ Simple/detailed/JSON progress
Quiet mode ✅ ✅ Suppress output
Verbose mode ✅ ✅ Detailed logging
Download to memory ❌ ✅ Download directly to memory
Download to writer ❌ ✅ Download to any io.Writer
File info retrieval ❌ ✅ Get file metadata without download
Error handling ✅ ✅ Robust error handling and recovery
Comprehensive errors ✅ ✅ Detailed error information
Error suggestions ✅ ❌ User-friendly error suggestions
Multilingual messages ✅ ❌ Localized error messages
Interactive prompts ✅ ❌ User confirmation prompts
Disk space checking ✅ ❌ Pre-download space verification
Network diagnostics ✅ ❌ Network connectivity testing
Signal handling ✅ ❌ Graceful shutdown on signals
Plugin system ✅ ✅ Extensible plugin architecture
Custom protocols ✅ ✅ Plugin-based protocol handlers
Middleware support ❌ ✅ Request/response processing
Event system ❌ ✅ Download lifecycle events
Custom storage ❌ ✅ Pluggable storage backends
Performance monitoring ❌ ✅ Metrics collection and aggregation
Legend
  • ✅ Fully supported - Feature is available and fully functional
  • ❌ Not applicable - Feature doesn't make sense in this context
Key Concepts: Resume vs. Retry

The terms "Resume" and "Retry" sound similar but handle different situations. Understanding the difference is key to using gdl effectively.

  • Purpose: Retry is automatic recovery from temporary errors; Resume is manual continuation after an intentional stop.
  • Trigger: Retry is triggered by network or server errors; Resume is triggered by the existence of an incomplete file.
  • Control: Retry is controlled by the number of attempts (RetryAttempts); Resume is enabled or disabled (EnableResume).
  • Result: Retry is reported in stats.Retries (a count); Resume is reported in stats.Resumed (true/false).
Scenario: Combining Resume and Retry
  1. Day 1: You start downloading a 10GB file. It gets to 5GB, and you stop the program (e.g., with Ctrl+C). (-> Interruption)
  2. Day 2: You run the same command again with resume enabled. gdl detects the incomplete 5GB file and starts downloading from that point. (-> This is a Resumed download)
  3. During the download of the remaining 5GB, your network connection briefly drops, causing a timeout error.
  4. gdl automatically waits a moment and re-attempts the failed request. (-> This is a Retry)
  5. The download then completes successfully.

In this case, the final DownloadStats would be Resumed: true and Retries: 1.
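
Expressed with the library options and stats described above, that scenario looks like this:

package main

import (
    "context"
    "fmt"

    "github.com/forest6511/gdl"
)

func main() {
    opts := &gdl.Options{
        EnableResume:  true, // continue from an existing partial file if one is found
        RetryAttempts: 3,    // automatically re-attempt transient network/server errors
    }

    stats, err := gdl.DownloadWithOptions(context.Background(),
        "https://example.com/large-file.zip", "large-file.zip", opts)
    if err != nil {
        fmt.Printf("Download failed: %v\n", err)
        return
    }

    // After the run described above, you would expect Resumed: true and Retries: 1.
    fmt.Printf("Resumed: %t, Retries: %d\n", stats.Resumed, stats.Retries)
}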

πŸ—οΈ Architecture

For the complete project structure, see Directory Structure.

Key Components
  • Core Engine (internal/core): Main download orchestration
  • Concurrency Manager (internal/concurrent): Parallel download coordination
  • Rate Limiter (pkg/ratelimit): Bandwidth throttling and rate control
  • Resume Engine (internal/resume): Download resumption and partial file handling
  • Progress System (pkg/progress): Real-time progress tracking
  • Error Framework (pkg/errors): Comprehensive error handling
  • Network Layer (internal/network): HTTP client and diagnostics
  • Storage Layer (internal/storage): File system operations
  • Plugin System (pkg/plugin): Extensible plugin architecture
  • Event System (pkg/events): Download lifecycle events
  • Middleware Layer (pkg/middleware): Request/response processing
  • Protocol Registry (pkg/protocols): Custom protocol handlers
  • Monitoring System (pkg/monitoring): Performance metrics and analytics
  • CLI Interface (cmd/gdl): Command-line tool implementation
Plugin Architecture
┌─────────────────────────────────────────────────────────────┐
│                       gdl Core                              │
├─────────────────────────────────────────────────────────────┤
│                 Plugin Manager                              │
├─────────────────────────────────────────────────────────────┤
│  Auth Plugins  │ Protocol Plugins │ Storage Plugins         │
├────────────────┼──────────────────┼─────────────────────────┤
│ Transform      │ Hook Plugins     │ Custom Plugins          │
│ Plugins        │                  │                         │
└─────────────────────────────────────────────────────────────┘

Plugin Types:

  • Authentication Plugins: OAuth2, API keys, custom auth schemes
  • Protocol Plugins: FTP, S3, custom protocols
  • Storage Plugins: Cloud storage, databases, custom backends
  • Transform Plugins: Compression, encryption, format conversion
  • Hook Plugins: Pre/post processing, logging, analytics

🧪 Testing

The project includes comprehensive testing:

Quick Development Testing
# Run all tests (basic development)
go test ./...

# Run tests with race detection (recommended)
go test -race ./...

# Run with coverage
go test -coverprofile=coverage.out ./...

# Run benchmarks
go test -bench=. ./...

# Run performance benchmarks with optimization comparisons
go test -bench=BenchmarkDownloadWith -benchmem ./internal/core/

# Check for performance regressions
./scripts/performance_check.sh
# Complete local CI validation (recommended before pushing)
./scripts/local-ci-check.sh

# OR use Makefile targets
make pre-push           # Format + all CI checks
make ci-check          # All CI checks without formatting
Cross-Platform Testing
# Test all platforms locally (requires: brew install act)
make test-ci-all       # Ubuntu + Windows + macOS
make test-ci-ubuntu    # Ubuntu only
make test-ci-windows   # Windows only  
make test-ci-macos     # macOS only

# Quick cross-compilation check
make test-cross-compile
When to Use What
Purpose Command Use Case
Development go test ./... Quick feedback during coding
Safety Check go test -race ./... Detect race conditions
Before Push ./scripts/local-ci-check.sh Full CI validation
Cross-Platform make test-ci-all Test Windows/macOS compatibility
Coverage go test -coverprofile=... Coverage analysis
Test Coverage
  • Unit tests: All packages have >90% coverage
  • Integration tests: Real HTTP download scenarios
  • CLI tests: Command-line interface functionality
  • Benchmark tests: Performance regression detection
  • Race detection: Concurrent safety verification

🤝 Contributing

We welcome contributions! Please see CONTRIBUTING.md for guidelines.

Development Setup
# Clone the repository
git clone https://github.com/forest6511/gdl.git
cd gdl

# Install dependencies
go mod download

# Install golangci-lint (if not already installed)
go install github.com/golangci/golangci-lint/cmd/golangci-lint@latest

# Install act for local CI testing (optional but recommended)
brew install act

# Verify everything works (CI equivalent check)
make ci-check

# Run tests
go test ./...

# Build CLI
go build -o gdl ./cmd/gdl/
Pre-commit Checks

Before committing and pushing changes, always run these checks locally to avoid CI failures:

# Run lint checks (essential before commit/push)
golangci-lint run

# Run tests with race detection
go test -race ./...

# Run tests with coverage
go test -coverprofile=coverage.out ./...
🔄 CI Compatibility

To prevent "works locally but fails in CI" issues, use these CI-equivalent commands:

# RECOMMENDED: Run complete pre-push validation
make pre-push        # Formats code AND runs all CI checks

# Alternative: Run CI checks without formatting
make ci-check        # All CI checks locally

# Cross-platform testing with act (requires: brew install act)
make test-ci-all     # Test Ubuntu, Windows, macOS locally
make test-ci-ubuntu  # Test Ubuntu CI locally
make test-ci-windows # Test Windows CI locally  
make test-ci-macos   # Test macOS CI locally

# Fix formatting issues automatically
make fix-and-commit  # Auto-fix formatting and create commit if needed

# Quick cross-compilation check
make test-cross-compile  # Fast Windows/macOS build verification

# Individual commands
make ci-format       # Format code to CI standards
make ci-vet          # go vet (excluding examples)
make ci-test-core    # core library tests with race detection

⚠️ Important: If the pre-commit hook reformats your code, it will stop the commit. Add the changes and commit again:

git add .
git commit

A pre-commit hook is automatically installed that runs CI-equivalent checks on every commit, preventing most CI failures.

Important: Always run make ci-check locally before pushing to ensure all checks pass. This prevents CI pipeline failures and maintains code quality standards.

🔧 Developer Tools

Git Hooks
# Setup commit message validation and pre-commit checks
./scripts/setup-git-hooks.sh
Release Management
# Prepare a new release
./scripts/prepare-release.sh v0.10.0
# Edit CHANGELOG.md with release notes
./scripts/prepare-release.sh --release v0.10.0
Local Testing
# Test GitHub Actions locally with act
act push -j quick-checks          # Fast validation
act push -W .github/workflows/main.yml --dryrun  # Full CI dry run

📚 Documentation

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

πŸ™ Acknowledgments

  • Go community for excellent libraries and tools
  • Contributors who helped improve this project
  • Users who provided feedback and bug reports

Made with ❤️ in Go

Documentation

Index

Examples

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type DownloadStats

type DownloadStats struct {
	// URL is the source URL that was downloaded.
	URL string

	// Filename is the destination filename or path.
	Filename string

	// TotalSize is the total size of the downloaded file in bytes.
	TotalSize int64

	// BytesDownloaded is the number of bytes successfully downloaded.
	BytesDownloaded int64

	// StartTime is when the download started.
	StartTime time.Time

	// EndTime is when the download completed or failed.
	EndTime time.Time

	// Duration is the total time taken for the download.
	Duration time.Duration

	// AverageSpeed is the average download speed in bytes per second.
	AverageSpeed int64

	// Retries is the number of retry attempts that were made.
	Retries int

	// Success indicates whether the download completed successfully.
	Success bool

	// Error contains any error that occurred during download.
	Error error

	// Resumed indicates whether this download was resumed from a partial file.
	Resumed bool

	// ChunksUsed indicates the number of concurrent chunks used for download.
	ChunksUsed int
}

DownloadStats contains statistics about a download operation.

func Download

func Download(ctx context.Context, url, dest string) (*DownloadStats, error)

Download downloads a file from URL to destination path.

Example:

ctx := context.Background()
stats, err := gdl.Download(ctx, "https://example.com/file.zip", "./downloads/file.zip")
if err != nil {
    log.Fatal(err)
}
fmt.Printf("Downloaded %d bytes in %v\n", stats.BytesDownloaded, stats.Duration)
Example

Example demonstrates basic usage.

package main

import (
	"context"
	"fmt"

	"github.com/forest6511/gdl"
)

func main() {
	ctx := context.Background()

	_, err := gdl.Download(ctx, "https://example.com/file.txt", "downloaded.txt")
	if err != nil {
		fmt.Printf("Download failed: %v\n", err)
	}
}

func DownloadToMemory

func DownloadToMemory(ctx context.Context, url string) ([]byte, *DownloadStats, error)

DownloadToMemory downloads to memory and returns bytes.

Example:

ctx := context.Background()
data, _, err := gdl.DownloadToMemory(ctx, "https://example.com/api/data.json")
if err != nil {
    log.Fatal(err)
}
fmt.Printf("Downloaded %d bytes\n", len(data))
Example

Example demonstrates downloading to memory.

package main

import (
	"context"
	"fmt"

	"github.com/forest6511/gdl"
)

func main() {
	ctx := context.Background()

	data, _, err := gdl.DownloadToMemory(ctx, "https://example.com/data.json")
	if err != nil {
		fmt.Printf("Download failed: %v\n", err)
		return
	}

	fmt.Printf("Downloaded %d bytes\n", len(data))
}

func DownloadToWriter

func DownloadToWriter(ctx context.Context, url string, w io.Writer) (*DownloadStats, error)

DownloadToWriter downloads to an io.Writer.

Example:

ctx := context.Background()
var buf bytes.Buffer
_, err := gdl.DownloadToWriter(ctx, "https://example.com/data.txt", &buf)
if err != nil {
    log.Fatal(err)
}
fmt.Println("Downloaded data:", buf.String())
Example

Example demonstrates downloading to a custom writer.

package main

import (
	"context"
	"fmt"
	"os"

	"github.com/forest6511/gdl"
	"github.com/forest6511/gdl/pkg/validation"
)

func init() {

	validation.SetConfig(validation.TestConfig())
}

func main() {
	ctx := context.Background()

	file, err := os.Create("output.txt")
	if err != nil {
		fmt.Printf("Failed to create file: %v\n", err)
		return
	}
	defer func() { _ = file.Close() }()

	_, err = gdl.DownloadToWriter(ctx, "https://example.com/content.txt", file)
	if err != nil {
		fmt.Printf("Download failed: %v\n", err)
	}
}

func DownloadWithOptions

func DownloadWithOptions(ctx context.Context, url, dest string, opts *Options) (*DownloadStats, error)

DownloadWithOptions downloads with custom options.

Example:

ctx := context.Background()
opts := &gdl.Options{
    MaxConcurrency: 4,
    EnableResume: true,
    ProgressCallback: func(p gdl.Progress) {
        fmt.Printf("Downloaded: %.2f%%\n", p.Percentage)
    },
}
_, err := gdl.DownloadWithOptions(ctx, "https://example.com/file.zip", "./file.zip", opts)
if err != nil {
    log.Fatal(err)
}
Example

Example demonstrates download with options.

package main

import (
	"context"
	"fmt"

	"github.com/forest6511/gdl"
)

func main() {
	ctx := context.Background()
	opts := &gdl.Options{
		MaxConcurrency: 4,
		EnableResume:   true,
		ProgressCallback: func(p gdl.Progress) {
			fmt.Printf("Downloaded: %.2f%%\n", p.Percentage)
		},
	}

	_, err := gdl.DownloadWithOptions(ctx, "https://example.com/file.zip", "file.zip", opts)
	if err != nil {
		fmt.Printf("Download failed: %v\n", err)
	}
}

func DownloadWithResume

func DownloadWithResume(ctx context.Context, url, dest string) (*DownloadStats, error)

DownloadWithResume downloads a file with resume support.

Example:

ctx := context.Background()
_, err := gdl.DownloadWithResume(ctx, "https://example.com/large-file.zip", "./large-file.zip")
if err != nil {
    log.Fatal(err)
}
// If interrupted, running again will resume from where it left off

type Downloader

type Downloader struct {
	// contains filtered or unexported fields
}

Downloader provides an extensible download client with plugin support.

func NewDownloader

func NewDownloader() *Downloader

NewDownloader creates a new Downloader with plugin support.

func (*Downloader) Download

func (d *Downloader) Download(ctx context.Context, url, dest string, opts *Options) (*DownloadStats, error)

Download downloads a file using the configured plugins and middleware.

func (*Downloader) DownloadToWriter

func (d *Downloader) DownloadToWriter(ctx context.Context, url string, w io.Writer, opts *Options) (*DownloadStats, error)

DownloadToWriter downloads to an io.Writer with plugin support.

func (*Downloader) GetFileInfo

func (d *Downloader) GetFileInfo(ctx context.Context, url string) (*FileInfo, error)

GetFileInfo retrieves file information with plugin support.

func (*Downloader) On

func (d *Downloader) On(event events.EventType, handler events.EventListener)

On registers an event listener.

func (*Downloader) RegisterProtocol

func (d *Downloader) RegisterProtocol(handler protocols.ProtocolHandler) error

RegisterProtocol registers a custom protocol handler.

func (*Downloader) SetStorageBackend

func (d *Downloader) SetStorageBackend(name string, backend storage.StorageBackend) error

SetStorageBackend sets the storage backend.

func (*Downloader) UseMiddleware

func (d *Downloader) UseMiddleware(m middleware.Middleware)

UseMiddleware adds middleware to the chain.

func (*Downloader) UsePlugin

func (d *Downloader) UsePlugin(p plugin.Plugin) error

UsePlugin registers and initializes a plugin.

type FileInfo

type FileInfo struct {
	Size           int64
	Filename       string
	ContentType    string
	LastModified   time.Time
	SupportsRanges bool
}

FileInfo contains information about a remote file.

func GetFileInfo

func GetFileInfo(ctx context.Context, url string) (*FileInfo, error)

GetFileInfo retrieves file information without downloading.

Example:

ctx := context.Background()
info, err := gdl.GetFileInfo(ctx, "https://example.com/file.zip")
if err != nil {
    log.Fatal(err)
}
fmt.Printf("File: %s, Size: %d bytes, Type: %s\n", info.Filename, info.Size, info.ContentType)
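
SupportsRanges can also inform the options you pass before downloading. A sketch, with placeholder URL and size threshold (whether gdl actually splits a given download into chunks remains up to the library):

ctx := context.Background()
info, err := gdl.GetFileInfo(ctx, "https://example.com/file.zip")
if err != nil {
    log.Fatal(err)
}

opts := &gdl.Options{}
if info.SupportsRanges && info.Size > 10<<20 {
    // The server accepts range requests, so a large file can be fetched in chunks.
    opts.MaxConcurrency = 4
}

stats, err := gdl.DownloadWithOptions(ctx, "https://example.com/file.zip", "file.zip", opts)
if err != nil {
    log.Fatal(err)
}
fmt.Printf("Downloaded %d bytes using %d chunk(s)\n", stats.BytesDownloaded, stats.ChunksUsed)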

type Options

type Options struct {
	ProgressCallback  ProgressCallback
	MaxConcurrency    int
	ChunkSize         int64
	EnableResume      bool
	RetryAttempts     int
	Timeout           time.Duration
	UserAgent         string
	Headers           map[string]string
	CreateDirs        bool
	OverwriteExisting bool
	Quiet             bool
	Verbose           bool
	MaxRate           int64 // Maximum download rate in bytes per second (0 = unlimited)
}

Options defines download options.
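
For reference, a fuller Options value using only the fields above (the URL, header, and timing values are placeholders):

ctx := context.Background()
opts := &gdl.Options{
    MaxConcurrency:    4,
    ChunkSize:         1 << 20, // 1MB chunks
    EnableResume:      true,
    RetryAttempts:     3,
    Timeout:           30 * time.Second,
    UserAgent:         "my-app/1.0",
    Headers:           map[string]string{"Authorization": "Bearer <token>"},
    CreateDirs:        true,
    OverwriteExisting: false,
    MaxRate:           2 << 20, // limit to 2 MB/s
}

stats, err := gdl.DownloadWithOptions(ctx, "https://example.com/file.zip", "downloads/file.zip", opts)
if err != nil {
    log.Fatal(err)
}
fmt.Printf("Finished in %v\n", stats.Duration)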

type Progress

type Progress struct {
	TotalSize       int64
	BytesDownloaded int64
	Speed           int64
	Percentage      float64
	TimeElapsed     time.Duration
	TimeRemaining   time.Duration
}

Progress represents the download progress.

type ProgressCallback

type ProgressCallback func(Progress)

ProgressCallback is a function that receives progress updates.
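
For example, a callback that prints a single status line from the Progress fields (the output format is arbitrary):

opts := &gdl.Options{
    ProgressCallback: func(p gdl.Progress) {
        fmt.Printf("\r%.1f%% (%d/%d bytes) %.2f MB/s, ETA %s",
            p.Percentage, p.BytesDownloaded, p.TotalSize,
            float64(p.Speed)/1024/1024, p.TimeRemaining.Round(time.Second))
    },
}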

Directories

Path Synopsis
cmd
gdl command
Package main provides the command-line interface for the gdl download tool.
examples
01_basic_download command
Package main demonstrates basic download functionality using the gdl library.
02_concurrent_download command
Package main demonstrates concurrent download functionality.
03_progress_tracking command
Package main demonstrates advanced progress tracking functionality.
04_resume_functionality command
Package main demonstrates resume functionality for interrupted downloads.
05_error_handling command
Package main demonstrates comprehensive error handling capabilities.
06_production_usage command
Package main demonstrates production-ready usage patterns for the gdl library.
integration command
Package main provides a comprehensive demonstration of all gdl features This program shows both library and CLI integration working together
library_api command
internal
core
Package core provides the core implementation of the gdl download functionality.
network
Package network provides network diagnostics and health checking capabilities.
recovery
Package recovery provides intelligent failure analysis and recovery mechanisms.
resume
Package resume provides functionality for resuming interrupted downloads.
retry
Package retry provides retry strategies and mechanisms for handling transient failures.
testing
Package testing provides error simulation tools for testing download scenarios.
pkg
cli
config
Package config provides configuration management for the gdl download tool.
errors
Package errors defines custom error types and sentinel errors for the gdl download library.
help
Package help provides context-sensitive help and guidance for the gdl CLI tool.
monitoring
Package monitoring provides performance monitoring and metrics collection for gdl downloads.
progress
Package progress provides progress tracking functionality for downloads.
ratelimit
Package ratelimit provides bandwidth throttling functionality for downloads.
types
Package types defines the core types and interfaces for the gdl download library.
ui
Package ui provides user interface formatting and interaction utilities.
validation
Package validation provides input validation functions for public APIs.
