Documentation
Overview ¶
Package autodiff provides automatic differentiation capabilities.
This package implements reverse-mode automatic differentiation (backpropagation) using a gradient tape. It wraps any backend to add autodiff capabilities.
Example:

    package main

    import (
        "github.com/born-ml/born/autodiff"
        "github.com/born-ml/born/backend/cpu"
        "github.com/born-ml/born/tensor"
    )

    func main() {
        // Wrap the CPU backend with autodiff
        base := cpu.New()
        backend := autodiff.New(base)

        // Use for training
        x := tensor.Zeros[float32](tensor.Shape{2, 3}, backend)
        y := x.Add(x) // Operations are recorded on the tape

        // Compute gradients
        grads := backend.Backward(y.Raw())
        _ = grads // e.g. feed into an optimizer step
    }
Index ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func Backward ¶
func Backward[T tensor.DType, B BackwardCapable](t *tensor.Tensor[T, B], backend B) map[*tensor.RawTensor]*tensor.RawTensor
Backward computes gradients of t via backpropagation over the recorded tape, returning a map from each recorded raw tensor to its gradient.
func NoGrad ¶ added in v0.3.0
NoGrad disables gradient recording for operations within the function. Use this for inference or operations that shouldn't be differentiated.
Example:

    backend := autodiff.New(cpu.New())

    autodiff.NoGrad(backend, func() {
        // Operations here won't be recorded for gradients
        output := model.Forward(input)
    })
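A guard like NoGrad is commonly implemented as a recording flag that operations consult before appending to the tape. The sketch below shows that pattern in miniature; the backend struct, its fields, and the noGrad helper are illustrative assumptions, not this package's actual types.

```go
package main

import "fmt"

// backend holds a recording flag that each operation checks
// before appending itself to the tape. (Illustrative only.)
type backend struct {
	recording bool
	tape      []string // recorded op names, for demonstration
}

// record appends an op to the tape only while recording is enabled.
func (b *backend) record(op string) {
	if b.recording {
		b.tape = append(b.tape, op)
	}
}

// noGrad disables recording for the duration of fn, then restores
// the previous state, so guards can nest safely.
func noGrad(b *backend, fn func()) {
	prev := b.recording
	b.recording = false
	defer func() { b.recording = prev }()
	fn()
}

func main() {
	b := &backend{recording: true}
	b.record("add") // recorded
	noGrad(b, func() {
		b.record("matmul") // skipped: recording is off
	})
	b.record("mul") // recorded again
	fmt.Println(b.tape) // [add mul]
}
```

Saving and restoring the previous flag (rather than unconditionally re-enabling it) is what allows one no-grad scope to be nested inside another without accidentally turning recording back on.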
Types ¶
type Backend ¶
type Backend[B tensor.Backend] = autodiff.AutodiffBackend[B]
Backend is the autodiff-enabled backend.
type BackwardCapable ¶
type BackwardCapable = autodiff.BackwardCapable
BackwardCapable is the interface for backends that support backpropagation.
type GradientTape ¶
type GradientTape = autodiff.GradientTape
GradientTape records operations for automatic differentiation.
func NewGradientTape ¶
func NewGradientTape() *GradientTape
NewGradientTape creates a new gradient tape.