thinkingnet

module v0.0.0-...-06c276a

Published: Dec 5, 2025 License: MIT

README

ThinkingNet - Production AI Library for Go

A comprehensive, production-ready machine learning library for Go, featuring neural networks, traditional ML algorithms, and data processing utilities.

Features

  • 🚀 High-Performance Computing: Optimized operations achieving 300M+ ops/sec, inspired by py.fast.calc.py
  • ⚡ Parallel Processing: Multi-core activation functions, batch processing, and matrix operations
  • 💾 Memory Optimization: Advanced memory pooling with 3.5x speedup and detailed statistics
  • 🔧 Vectorized Operations: SIMD-like operations with loop unrolling for better performance (see the sketch after this list)
  • 📊 Comprehensive Benchmarking: Built-in performance testing and comparison tools
  • 🧠 Neural Networks: Dense layers, activation functions, optimizers, and loss functions
  • 🤖 Traditional ML: Clustering, dimensionality reduction, classification, and regression
  • 📈 Data Processing: Preprocessing, encoding, scaling, and dataset utilities
  • 🎮 Reinforcement Learning: Q-learning and Deep Q-Networks (DQN)
  • 🏭 Production Ready: Error handling, validation, testing, and documentation

Project Structure

thinkingnet/
├── pkg/
│   ├── core/           # Core interfaces and types
│   ├── nn/             # Neural network components
│   ├── optimizers/     # Optimization algorithms
│   ├── losses/         # Loss functions
│   ├── activations/    # Activation functions
│   ├── layers/         # Layer implementations
│   ├── models/         # Model implementations
│   ├── preprocessing/  # Data preprocessing utilities
│   ├── algorithms/     # Traditional ML algorithms
│   ├── metrics/        # Evaluation metrics
│   ├── datasets/       # Dataset generators and utilities
│   └── utils/          # Common utilities
├── examples/           # Example applications
├── docs/               # Documentation
├── tests/              # Integration tests
└── benchmarks/         # Performance benchmarks

Installation

go get github.com/blackmoon87/thinkingnet

Performance Highlights

ThinkingNet-Go is heavily optimized for speed. Highlights from the built-in benchmarks:

  • 🔥 1.43 billion operations/second with the ultra-fast processor
  • ⚡ 1.01B ReLU ops/sec with the UltraFast activation processor
  • 🚀 897M Sigmoid ops/sec using optimized lookup tables
  • 💾 9x speedup with memory pooling for matrix operations
  • 🔧 Automatic ultra-fast optimization for large tensors and batch processing
  • 📊 Built-in benchmarking for performance monitoring and optimization

// Quick performance demo (fmt added for the print call below)
import (
    "fmt"

    "github.com/blackmoon87/thinkingnet/pkg/core"
)

// Run comprehensive benchmarks
core.RunQuickBenchmark()

// High-performance operations
processor := core.GetHighPerformanceProcessor()
opsPerSecond := processor.PerformOperations(100_000_000) // 100M ops
fmt.Printf("Achieved %.0f operations per second\n", opsPerSecond)

📊 Detailed Benchmark Results

The following benchmarks were run on Windows/amd64 with 8 CPU cores using Go 1.25.5.

⚡ Activation Functions Performance

Function    Implementation   Speed (ops/sec)
---------   --------------   ---------------
ReLU        UltraFast        1.01 Billion
Sigmoid     UltraFast        897 Million
ReLU        Parallel         682 Million
Sigmoid     Parallel         356 Million
Tanh        Parallel         228 Million
ReLU        Direct           230 Million
LeakyReLU   Direct           160 Million
ELU         Direct           97 Million
Sigmoid     Direct           86 Million
Swish       Direct           81 Million
Tanh        Direct           76 Million
GELU        Direct           24 Million
🚀 High-Performance Operations

Operation                   Speed
-------------------------   ------------------------
UltraFast Processor         1.43 Billion ops/sec
HighPerformance Processor   438 Million ops/sec
Batch Processing            14.7 Million samples/sec
💾 Memory Management

Mode                     Speed             Improvement
----------------------   ---------------   -----------
With Memory Pooling      112,661 ops/sec   9x faster
Without Memory Pooling   12,537 ops/sec    baseline
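
The pooling API inside ThinkingNet is not spelled out in this README, so here is a minimal sketch of the general technique using Go's standard sync.Pool (the essence of buffer reuse, not the library's implementation):

package main

import (
    "fmt"
    "sync"
)

// bufPool hands out reusable scratch buffers so hot paths stop
// allocating a fresh slice per operation. Illustrative only.
var bufPool = sync.Pool{
    New: func() any { return make([]float64, 64*64) },
}

// sumScaled uses a pooled buffer as scratch space and returns
// the sum of in scaled by k, without a per-call allocation.
func sumScaled(in []float64, k float64) float64 {
    buf := bufPool.Get().([]float64)[:len(in)]
    defer bufPool.Put(buf[:cap(buf)]) // hand the buffer back for reuse
    var total float64
    for i, v := range in {
        buf[i] = v * k // scratch write, as a matrix op would do
        total += buf[i]
    }
    return total
}

func main() {
    in := make([]float64, 64*64)
    for i := range in {
        in[i] = 1
    }
    fmt.Println(sumScaled(in, 2)) // 8192
}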
๐Ÿ“ Tensor Operations (100 iterations)
Operation 64x64 128x128 256x256 512x512
Transpose 183K ops/s - 5.8K ops/s 1.1K ops/s
Scale 95K ops/s - 8.7K ops/s 1.9K ops/s
Addition 25K ops/s - 3.5K ops/s 993 ops/s
Subtraction 36K ops/s 5.3K ops/s 3.6K ops/s 954 ops/s
Element Mul 22K ops/s 12.5K ops/s 4K ops/s 1K ops/s
Matrix Mul 14K ops/s 295 ops/s 33 ops/s 3 ops/s
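
The steep drop in the Matrix Mul row is the usual O(n³) cost of dense multiplication. ThinkingNet lists gonum.org/v1/gonum/mat as a dependency (see Version Information further below); the following standalone gonum sketch shows the same kind of operation, as an illustration rather than the library's internal code path:

package main

import (
    "fmt"

    "gonum.org/v1/gonum/mat"
)

func main() {
    // Build two small dense matrices and multiply them; the same
    // O(n^3) dense kernel is what dominates at 512x512.
    a := mat.NewDense(2, 2, []float64{1, 2, 3, 4})
    b := mat.NewDense(2, 2, []float64{5, 6, 7, 8})
    var c mat.Dense
    c.Mul(a, b)
    fmt.Println(mat.Formatted(&c)) // [19 22; 43 50]
}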
🔬 ML Algorithms (5,000 samples, 50 features)

Algorithm               Execution Time
---------------------   --------------
PCA                     13.5 ms ⚡
K-Means (10 clusters)   259 ms
Linear Regression       2.65 s
Logistic Regression     2.83 s
DBSCAN                  11.76 s
🧠 Neural Network Training

Metric                Value
-------------------   ------------------
Architecture          128→256→128→64→10
Training Samples      10,000
Epochs                50
Forward Pass          ~5,897 samples/sec
Full Training         ~3,336 samples/sec
Total Training Time   ~2.5 minutes
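
For reference, the benchmarked architecture can be assembled with the Sequential API shown in the Quick Start section below. The hidden-layer activations here are assumptions (the benchmark does not state them), and the 128-unit input width comes from the data shape rather than an explicit layer:

// Sketch of the 128→256→128→64→10 network using the Sequential API.
// Activation choices are illustrative; the benchmark does not list them.
model := models.NewSequential()
model.AddLayer(layers.NewDense(256, activations.NewReLU()))
model.AddLayer(layers.NewDense(128, activations.NewReLU()))
model.AddLayer(layers.NewDense(64, activations.NewReLU()))
model.AddLayer(layers.NewDense(10, activations.NewSigmoid()))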
📈 Preprocessing Operations

Operation          Speed
----------------   -----------
Train-Test Split   484 ops/sec
MinMax Scaling     220 ops/sec
Standard Scaling   163 ops/sec
๐Ÿ† Summary
  • Total Operations Tested: 771.8 Million
  • Average Performance: 1.63 Million ops/sec
  • Test Success Rate: 100% (67/67 tests passed)
  • Total Benchmark Duration: ~8 minutes
Running Benchmarks

From the shell, run the comprehensive stress test:

cd examples/stress_test_demo
go run main.go

Or run the quick benchmarks from Go code:

import "github.com/blackmoon87/thinkingnet/pkg/core"

core.RunQuickBenchmark()
core.RunUltraFastBenchmark()

Quick Start

Get started with ThinkingNet in just a few lines of code using our simplified helper functions:

package main

import (
    "fmt"
    "github.com/blackmoon87/thinkingnet/pkg/core"
    "github.com/blackmoon87/thinkingnet/pkg/models"
    "github.com/blackmoon87/thinkingnet/pkg/layers"
    "github.com/blackmoon87/thinkingnet/pkg/activations"
    "github.com/blackmoon87/thinkingnet/pkg/optimizers"
    "github.com/blackmoon87/thinkingnet/pkg/losses"
)

func main() {
    // Create sample data using helper function
    X := core.EasyTensor([][]float64{
        {0, 0}, {0, 1}, {1, 0}, {1, 1},
    })
    y := core.EasyTensor([][]float64{
        {0}, {1}, {1}, {0},
    })

    // Create a simple neural network
    model := models.NewSequential()
    model.AddLayer(layers.NewDense(4, activations.NewReLU()))
    model.AddLayer(layers.NewDense(1, activations.NewSigmoid()))
    
    // Compile the model
    model.Compile(optimizers.NewAdam(0.01), losses.NewBinaryCrossEntropy())
    
    // Train with sensible defaults using EasyTrain
    history, err := model.EasyTrain(X, y)
    if err != nil {
        fmt.Printf("Training error: %v\n", err)
        return
    }
    
    // Make predictions using EasyPredict
    predictions, err := model.EasyPredict(X)
    if err != nil {
        fmt.Printf("Prediction error: %v\n", err)
        return
    }
    
    fmt.Println("Training completed!")
    fmt.Printf("Final loss: %.4f\n", history.Loss[len(history.Loss)-1])
    fmt.Println("Predictions:", predictions)
}
Before vs After: Simplified API

Before (Traditional approach):

// Complex configuration required
config := core.TrainingConfig{
    Epochs:          50,
    BatchSize:       32,
    ValidationSplit: 0.2,
    Shuffle:         true,
    Verbose:         1,
}
history, err := model.Fit(X, y, config)

// Manual data preprocessing
scaler := preprocessing.NewStandardScaler()
scaler.Fit(X)
X_scaled := scaler.Transform(X)

// Complex algorithm setup
lr := algorithms.NewLinearRegression(
    algorithms.WithLinearLearningRate(0.01),
    algorithms.WithLinearMaxIterations(1000),
    algorithms.WithLinearTolerance(1e-6),
)

After (Simplified with helper functions):

// One-liner training with sensible defaults
history, err := model.EasyTrain(X, y)

// One-liner data preprocessing
X_scaled := preprocessing.EasyStandardScale(X)

// One-liner algorithm creation
lr := algorithms.EasyLinearRegression()
Common Use Cases
1. Linear Regression
import "github.com/blackmoon87/thinkingnet/pkg/algorithms"

// Create and train a linear regression model
lr := algorithms.EasyLinearRegression()
err := lr.Fit(X, y)
predictions := lr.Predict(X_test)
2. Classification
import "github.com/blackmoon87/thinkingnet/pkg/algorithms"

// Create and train a logistic regression model
clf := algorithms.EasyLogisticRegression()
err := clf.Fit(X, y)
predictions := clf.Predict(X_test)
3. Clustering
import "github.com/blackmoon87/thinkingnet/pkg/algorithms"

// Create and fit K-means clustering
kmeans := algorithms.EasyKMeans(3) // 3 clusters
labels := kmeans.Fit(X)
4. Data Preprocessing
import "github.com/blackmoon87/thinkingnet/pkg/preprocessing"

// Scale your data
X_scaled := preprocessing.EasyStandardScale(X)
X_minmax := preprocessing.EasyMinMaxScale(X)

// Split your data
XTrain, XTest, yTrain, yTest := preprocessing.EasySplit(X, y, 0.2)

Troubleshooting

Common Issues and Solutions
1. Model Not Compiled Error

Error: "Model not compiled" (the library reports this bilingually in Arabic and English)

Solution: Always compile your model before training:

model.Compile(optimizers.NewAdam(0.01), losses.NewBinaryCrossEntropy())
2. Invalid Input Data

Error: "Invalid input data" (the library reports this bilingually in Arabic and English)

Solution: Check your data format and dimensions:

// Ensure data is in correct format
X := core.EasyTensor([][]float64{
    {1.0, 2.0}, // Each row is a sample
    {3.0, 4.0}, // Each column is a feature
})
3. Dimension Mismatch

Error: Tensor dimension mismatch

Solution: Verify input and output dimensions match your model:

// For binary classification, ensure y has shape [samples, 1]
y := core.EasyTensor([][]float64{
    {0}, {1}, {1}, {0}, // Single column for binary labels
})
4. Training Not Converging

Problem: Loss not decreasing during training

Solutions:

  • Try different learning rates: optimizers.NewAdam(0.001) or optimizers.NewAdam(0.1)
  • Scale your input data: X_scaled := preprocessing.EasyStandardScale(X)
  • Check for NaN values in your data
  • Increase the number of epochs in training config
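
As a sketch combining these fixes with APIs shown elsewhere in this README (EasyStandardScale, NewAdam, and the TrainingConfig from the "Before" example above), assuming your model is already constructed:

// Scale inputs, lower the learning rate, and train for longer.
X_scaled := preprocessing.EasyStandardScale(X)
model.Compile(optimizers.NewAdam(0.001), losses.NewBinaryCrossEntropy())
history, err := model.Fit(X_scaled, y, core.TrainingConfig{
    Epochs:          200, // more epochs than the defaults
    BatchSize:       32,
    ValidationSplit: 0.2,
    Shuffle:         true,
    Verbose:         1, // watch the loss to confirm it decreases
})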
5. Memory Issues with Large Datasets

Problem: Out of memory errors

Solutions:

  • Use smaller batch sizes in training config
  • Process data in chunks
  • Use the memory pooling features for better efficiency
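
Chunked processing needs no library support. A minimal plain-Go sketch, where the chunk size and the process function are placeholders:

const chunkSize = 1024
for start := 0; start < len(samples); start += chunkSize {
    end := start + chunkSize
    if end > len(samples) {
        end = len(samples)
    }
    process(samples[start:end]) // hypothetical per-chunk handler
}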
6. Import Path Issues

Problem: Cannot import packages

Solution: Use the correct import paths:

import (
    "github.com/blackmoon87/thinkingnet/pkg/core"
    "github.com/blackmoon87/thinkingnet/pkg/models"
    "github.com/blackmoon87/thinkingnet/pkg/algorithms"
    "github.com/blackmoon87/thinkingnet/pkg/preprocessing"
)
Getting Help
  1. Check the examples: Look at files in the examples/ directory for working code
  2. Read error messages: Our bilingual error messages provide specific guidance
  3. Use helper functions: Start with Easy* functions for common tasks
  4. Check data shapes: Use tensor.Shape() to verify dimensions
  5. Enable verbose logging: Set Verbose: 1 in training config for detailed output
Performance Tips
  1. Use helper functions: They include optimized defaults
  2. Scale your data: Always preprocess with EasyStandardScale() or EasyMinMaxScale()
  3. Batch processing: Use appropriate batch sizes (32-128 typically work well; see the sketch after this list)
  4. Memory pooling: The library automatically uses memory pooling for better performance
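
Putting tips 2 and 3 together with the helpers shown earlier, a hedged sketch (model construction and compilation as in the Quick Start):

// Scale first, then train with a batch size in the 32-128 range.
X_scaled := preprocessing.EasyMinMaxScale(X)
history, err := model.Fit(X_scaled, y, core.TrainingConfig{
    Epochs:    50,
    BatchSize: 64,
    Shuffle:   true,
})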

Development Status

This library is currently under active development. The core interfaces and basic functionality have been implemented.

Completed Components
  • Core interfaces and types
  • Error handling framework
  • Tensor abstraction
  • Configuration system
  • Basic utilities
In Progress
  • Neural network layers
  • Optimizers
  • Loss functions
  • Model implementations
  • Data preprocessing
  • Traditional ML algorithms

Contributing

This is a production refactoring of an existing AI library. Please see the implementation tasks in .kiro/specs/production-ai-library/tasks.md for current development priorities.

License

MIT License

Copyright (c) 2025 [blackmoon87] <[blackmoon@mail.com]>

Permission is hereby granted, free of charge, to any person obtaining a copy...

Architecture

The library follows a modular, interface-driven design with the following principles:

  • Interface-based: All major components implement well-defined interfaces
  • Error handling: Comprehensive error types with context
  • Memory efficient: Matrix pooling and reuse where possible
  • Extensible: Plugin architecture for custom components
  • Production ready: Validation, testing, and documentation
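
The interface definitions themselves live in pkg/core and are not reproduced in this README, so the following is a hypothetical sketch of the plugin pattern; the interface name and method set are assumptions, not the library's actual API:

// Illustrative only: the real interface in pkg/core may differ.
type Activation interface {
    Forward(x float64) float64  // value of the activation at x
    Backward(x float64) float64 // its derivative, for backprop
}

// LeakyStep is a custom component. Anything that satisfies the
// interface can be swapped into a layer without touching core code.
type LeakyStep struct{ Slope float64 }

func (a LeakyStep) Forward(x float64) float64 {
    if x > 0 {
        return 1
    }
    return a.Slope * x
}

func (a LeakyStep) Backward(x float64) float64 {
    if x > 0 {
        return 0
    }
    return a.Slope
}

This is the sense in which the design is extensible: new optimizers, losses, and layers are added by implementing the corresponding interface rather than by modifying existing packages.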

ThinkingNet Go Library - Import Guide & Case Study

Overview

This guide demonstrates how to import and use the ThinkingNet Go library from GitHub in your projects, based on real testing scenarios and common use cases.

Quick Start - Importing from GitHub

Step 1: Initialize Your Project
# Create a new directory for your project
mkdir my-thinkingnet-project
cd my-thinkingnet-project

# Initialize Go module
go mod init my-thinkingnet-project
Step 2: Import ThinkingNet Library
# Import the latest version from GitHub
go get github.com/blackmoon87/thinkingnet@latest
Step 3: Handle Dependencies

If you encounter missing dependencies (like gonum), run:

# Clean up and download all dependencies
go mod tidy
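
After go mod tidy, your go.mod should look roughly like this. The thinkingnet pseudo-version below is the one listed under Version Information at the end of this guide, and the gonum line (with an illustrative version number) is pulled in automatically as an indirect dependency:

module my-thinkingnet-project

go 1.19

require github.com/blackmoon87/thinkingnet v0.0.0-20250914203955-eec5893249ba

require gonum.org/v1/gonum v0.14.0 // indirect; exact version will vary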

Basic Usage Example

Create a main.go file with the following content:

package main

import (
    "fmt"
    "log"

    "github.com/blackmoon87/thinkingnet/pkg/core"
    "github.com/blackmoon87/thinkingnet/pkg/preprocessing"
)

func main() {
    fmt.Println("Testing ThinkingNet library...")
    
    // Initialize the neural network
    network := core.NewNeuralNetwork([]int{2, 4, 1})
    if network == nil {
        log.Fatal("Failed to create neural network")
    }
    
    // Create a data processor (unused for now; Go rejects unused
    // variables, so the blank identifier keeps this compiling)
    processor := preprocessing.NewDataProcessor()
    _ = processor
    
    // Example: Simple XOR problem
    inputs := [][]float64{
        {0, 0},
        {0, 1},
        {1, 0},
        {1, 1},
    }
    
    targets := [][]float64{
        {0},
        {1},
        {1},
        {0},
    }
    _ = targets // referenced once training is wired up
    
    fmt.Println("Library loaded successfully!")
    fmt.Printf("Network created with %d layers\n", len(network.Layers))
    fmt.Printf("Training data: %d samples\n", len(inputs))
}

Case Study: Real-World Implementation

Problem: Binary Classification

Let's implement a simple binary classifier using ThinkingNet:

package main

import (
    "fmt"
    "math/rand"
    "time"

    "github.com/blackmoon87/thinkingnet/pkg/core"
    "github.com/blackmoon87/thinkingnet/pkg/algorithms"
)

func main() {
    // Seed the global RNG (rand.Seed is deprecated since Go 1.20
    // but still functional; newer code can drop this line)
    rand.Seed(time.Now().UnixNano())
    
    // Create network: 2 inputs, 1 hidden layer (4 neurons), 1 output
    network := core.NewNeuralNetwork([]int{2, 4, 1})
    
    // Training configuration
    config := &algorithms.TrainingConfig{
        LearningRate: 0.1,
        Epochs:      1000,
        BatchSize:   4,
    }
    
    // Sample dataset (XOR problem)
    trainData := [][]float64{
        {0, 0, 0}, // input1, input2, expected_output
        {0, 1, 1},
        {1, 0, 1},
        {1, 1, 0},
    }
    
    // Prepare training data
    inputs := make([][]float64, len(trainData))
    targets := make([][]float64, len(trainData))
    
    for i, data := range trainData {
        inputs[i] = data[:2]
        targets[i] = []float64{data[2]}
    }
    
    // Train the network
    trainer := algorithms.NewTrainer(network, config)
    history := trainer.Train(inputs, targets)
    
    // Evaluate results
    fmt.Println("Training completed!")
    fmt.Printf("Final loss: %.6f\n", history.FinalLoss)
    
    // Test predictions
    fmt.Println("\nPredictions:")
    for i, input := range inputs {
        prediction := network.Predict(input)
        expected := targets[i][0]
        fmt.Printf("Input: %v, Expected: %.0f, Predicted: %.3f\n", 
                   input, expected, prediction[0])
    }
}

Common Issues & Solutions

Issue 1: Missing Dependencies

Error: missing go.sum entry for module providing package gonum.org/v1/gonum/mat

Solution:

go mod tidy
Issue 2: Package Not Main

Error: package command-line-arguments is not a main package

Solution: Ensure your main.go file has:

package main

func main() {
    // Your code here
}
Issue 3: Unused Variables

Error: declared and not used: variable_name

Solution: Either use the variables or remove them:

// Remove unused variables or use them
_ = processor // Use blank identifier if needed temporarily

Advanced Usage Patterns

1. Custom Network Architecture
// Create a deeper network for complex problems
network := core.NewNeuralNetwork([]int{10, 64, 32, 16, 1})
2. Data Preprocessing
processor := preprocessing.NewDataProcessor()
normalizedData := processor.Normalize(rawData)
3. Performance Monitoring
evaluator := metrics.NewEvaluator()
accuracy := evaluator.Accuracy(predictions, targets)
fmt.Printf("Model accuracy: %.2f%%\n", accuracy*100)

Best Practices

  1. Always run go mod tidy after importing new packages
  2. Use appropriate network sizes - start small and scale up
  3. Normalize your data before training
  4. Monitor training progress with metrics
  5. Test with validation data to avoid overfitting
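
Practice 5 is the only one without a snippet in this guide, so here is a plain-Go hold-out split sketch (no library calls assumed; shuffle your rows first if the data is ordered):

// holdOut reserves the last frac of samples for validation,
// e.g. frac = 0.2 keeps 80% for training.
func holdOut(inputs, targets [][]float64, frac float64) (trX, vaX, trY, vaY [][]float64) {
    n := int(float64(len(inputs)) * (1 - frac))
    return inputs[:n], inputs[n:], targets[:n], targets[n:]
}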

Version Information

  • Library Version: v0.0.0-20250914203955-eec5893249ba
  • Go Version: Compatible with Go 1.19+
  • Dependencies: gonum.org/v1/gonum/mat

Next Steps

  1. Check out ADVANCED_USAGE.md for more complex examples
  2. Review examples/ directory for specific use cases
  3. Read CONTRIBUTING.md to contribute to the project

Support

For issues and questions:

  • Check existing examples in the examples/ directory
  • Review the documentation files
  • Create an issue on the GitHub repository

This guide is based on real testing and usage scenarios. Last updated: September 2025

Directories

Path                   Synopsis
examples/
  activation_demo      command
  callbacks_demo       command
  clustering_demo      command
  complex_pattern_demo command: Package main demonstrates complex pattern learning with detailed generalization report.
  easy_usage_demo      command
  metrics_demo         command
  pca_demo             command
  split_demo           command
  stress_test_demo     command: Package main provides a comprehensive heavy stress test for ThinkingNet-Go library.
  tensor_demo          command
pkg/
  activations          Package activations provides activation functions for neural networks.
  callbacks            Package callbacks provides training callback implementations for monitoring and controlling the training process.
  core                 Package core defines the fundamental interfaces and types for the ThinkingNet AI library.
  layers               Package layers provides neural network layer implementations.
  losses               Package losses provides loss function implementations for the ThinkingNet AI library.
  models               Package models provides neural network model implementations.
  optimizers           Package optimizers provides optimization algorithms for neural network training.
  preprocessing        Package preprocessing provides data preprocessing utilities for machine learning.
  utils                Package utils provides common utility functions for the ThinkingNet library.
