Documentation ¶
Overview ¶
Package nn provides neural network layers and building blocks.
This package contains:
- Layers: Linear, Conv2D, MaxPool2D
- Activations: ReLU, Sigmoid, Tanh
- Loss functions: CrossEntropyLoss, MSELoss
- Utilities: Sequential, Module interface, Parameter
- Initialization: Xavier, Zeros, Ones, Randn
Basic Usage ¶
import (
    "github.com/born-ml/born/nn"
    "github.com/born-ml/born/backend/cpu"
)

func main() {
    backend := cpu.New()

    // Build a simple MLP
    model := nn.NewSequential(
        nn.NewLinear(784, 128, backend),
        nn.NewReLU(),
        nn.NewLinear(128, 10, backend),
    )

    // Forward pass
    output := model.Forward(input)
}
Layers ¶
Linear: Fully connected layer with Xavier initialization
layer := nn.NewLinear(inFeatures, outFeatures, backend)
Conv2D: 2D convolutional layer using the im2col algorithm
conv := nn.NewConv2D(inChannels, outChannels, kernelH, kernelW, stride, padding, useBias, backend)
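The im2col lowering mentioned above turns convolution into a matrix multiplication by copying each kernel-sized patch of the input into one column. A minimal plain-Go sketch for a single-channel input with stride 1 and no padding (illustrative only, not the package's implementation):

```go
package main

import "fmt"

// im2col extracts kH x kW patches (stride 1, no padding) from an
// h x w single-channel image and lays each patch out as one column.
// The result has shape (kH*kW) x (outH*outW), flattened row-major,
// so a conv becomes: weights (outC x kH*kW) times this matrix.
func im2col(img []float32, h, w, kH, kW int) []float32 {
	outH, outW := h-kH+1, w-kW+1
	cols := make([]float32, kH*kW*outH*outW)
	for ky := 0; ky < kH; ky++ {
		for kx := 0; kx < kW; kx++ {
			row := ky*kW + kx
			for oy := 0; oy < outH; oy++ {
				for ox := 0; ox < outW; ox++ {
					cols[row*(outH*outW)+oy*outW+ox] = img[(oy+ky)*w+(ox+kx)]
				}
			}
		}
	}
	return cols
}

func main() {
	// 3x3 image, 2x2 kernel -> 4 overlapping patches of 4 values each.
	img := []float32{1, 2, 3, 4, 5, 6, 7, 8, 9}
	fmt.Println(im2col(img, 3, 3, 2, 2))
}
```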
MaxPool2D: 2D max pooling layer
pool := nn.NewMaxPool2D(kernelSize, stride, backend)
Activations ¶
Common activation functions:
relu := nn.NewReLU()
sigmoid := nn.NewSigmoid()
tanh := nn.NewTanh()
Loss Functions ¶
CrossEntropyLoss: For classification tasks (numerically stable)
criterion := nn.NewCrossEntropyLoss(backend)
loss := criterion.Forward(logits, labels)
MSELoss: For regression tasks
criterion := nn.NewMSELoss(backend)
loss := criterion.Forward(predictions, targets)
Sequential Models ¶
Build models by composing layers:
model := nn.NewSequential(
    nn.NewLinear(784, 256, backend),
    nn.NewReLU(),
    nn.NewLinear(256, 128, backend),
    nn.NewReLU(),
    nn.NewLinear(128, 10, backend),
)
Parameter Management ¶
Access model parameters for optimization:
params := model.Parameters()
for _, param := range params {
    fmt.Println(param.Name(), param.Tensor().Shape())
}
Index ¶
- func Accuracy[B tensor.Backend](logits *tensor.Tensor[float32, B], targets *tensor.Tensor[int32, B]) float32
- func CrossEntropyBackward[B tensor.Backend](logits *tensor.Tensor[float32, B], targets *tensor.Tensor[int32, B], backend B) *tensor.Tensor[float32, B]
- func Ones[B tensor.Backend](shape tensor.Shape, backend B) *tensor.Tensor[float32, B]
- func Randn[B tensor.Backend](shape tensor.Shape, backend B) *tensor.Tensor[float32, B]
- func Xavier[B tensor.Backend](fanIn, fanOut int, shape tensor.Shape, backend B) *tensor.Tensor[float32, B]
- func Zeros[B tensor.Backend](shape tensor.Shape, backend B) *tensor.Tensor[float32, B]
- type Conv2D
- type CrossEntropyLoss
- type Embedding
- type Linear
- type MSELoss
- type MaxPool2D
- type Module
- type Parameter
- type RMSNorm
- type ReLU
- type Sequential
- type SiLU
- type Sigmoid
- type Tanh
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func Accuracy ¶
func Accuracy[B tensor.Backend](
    logits *tensor.Tensor[float32, B],
    targets *tensor.Tensor[int32, B],
) float32
Accuracy computes the classification accuracy.
Example:
acc := nn.Accuracy(predictions, labels)
fmt.Printf("Accuracy: %.2f%%\n", acc*100)
func CrossEntropyBackward ¶
func CrossEntropyBackward[B tensor.Backend](
    logits *tensor.Tensor[float32, B],
    targets *tensor.Tensor[int32, B],
    backend B,
) *tensor.Tensor[float32, B]
CrossEntropyBackward computes the backward pass for cross-entropy loss.
func Ones ¶
func Ones[B tensor.Backend](shape tensor.Shape, backend B) *tensor.Tensor[float32, B]
Ones initializes a tensor with ones.
Example:
backend := cpu.New()
weights := nn.Ones(tensor.Shape{128, 784}, backend)
func Randn ¶
func Randn[B tensor.Backend](shape tensor.Shape, backend B) *tensor.Tensor[float32, B]
Randn initializes a tensor with random values from N(0, 1).
Example:
backend := cpu.New()
weights := nn.Randn(tensor.Shape{128, 784}, backend)
Types ¶
type Conv2D ¶
Conv2D represents a 2D convolutional layer.
func NewConv2D ¶
func NewConv2D[B tensor.Backend](
    inChannels, outChannels int,
    kernelH, kernelW int,
    stride, padding int,
    useBias bool,
    backend B,
) *Conv2D[B]
NewConv2D creates a new 2D convolutional layer.
Example:
backend := cpu.New()
// in_channels=1, out_channels=32, kernel=3x3, stride=1, padding=1, useBias=true
conv := nn.NewConv2D(1, 32, 3, 3, 1, 1, true, backend)
type CrossEntropyLoss ¶
type CrossEntropyLoss[B tensor.Backend] = nn.CrossEntropyLoss[B]
CrossEntropyLoss represents the cross-entropy loss for classification.
func NewCrossEntropyLoss ¶
func NewCrossEntropyLoss[B tensor.Backend](backend B) *CrossEntropyLoss[B]
NewCrossEntropyLoss creates a new cross-entropy loss function.
Example:
backend := cpu.New()
criterion := nn.NewCrossEntropyLoss(backend)
loss := criterion.Forward(logits, labels)
type Embedding ¶ added in v0.3.0
Embedding represents a lookup table for embeddings.
func NewEmbedding ¶ added in v0.3.0
NewEmbedding creates a new embedding layer.
Example:
backend := cpu.New()
embed := nn.NewEmbedding(50000, 768, backend) // vocab=50000, dim=768
tokenIds := tensor.FromSlice([]int32{1, 5, 10}, tensor.Shape{1, 3}, backend)
embeddings := embed.Forward(tokenIds) // [1, 3, 768]
type MSELoss ¶
MSELoss represents the mean squared error loss for regression.
func NewMSELoss ¶
NewMSELoss creates a new MSE loss function.
Example:
backend := cpu.New()
criterion := nn.NewMSELoss(backend)
loss := criterion.Forward(predictions, targets)
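MSE is the mean of the squared differences between predictions and targets. A one-function sketch of the math in plain Go (illustrative, not the package's implementation):

```go
package main

import "fmt"

// mse returns mean((pred[i] - target[i])^2) over all elements.
func mse(pred, target []float64) float64 {
	var sum float64
	for i := range pred {
		d := pred[i] - target[i]
		sum += d * d
	}
	return sum / float64(len(pred))
}

func main() {
	// Squared errors: 0, 0, 4 -> mean is 4/3.
	fmt.Println(mse([]float64{1, 2, 3}, []float64{1, 2, 5}))
}
```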
type Sequential ¶
type Sequential[B tensor.Backend] = nn.Sequential[B]
Sequential represents a sequential container of modules.
func NewSequential ¶
func NewSequential[B tensor.Backend](modules ...Module[B]) *Sequential[B]
NewSequential creates a new sequential model.
Example:
backend := cpu.New()
model := nn.NewSequential(
    nn.NewLinear(784, 128, backend),
    nn.NewReLU(),
    nn.NewLinear(128, 10, backend),
)
type SiLU ¶ added in v0.3.0
SiLU represents the Sigmoid Linear Unit (SiLU/Swish) activation function. SiLU(x) = x * sigmoid(x).
type Sigmoid ¶
Sigmoid represents the Sigmoid activation function.
func NewSigmoid ¶
NewSigmoid creates a new Sigmoid activation layer.
Example:
sigmoid := nn.NewSigmoid()