optimizer

package
v0.3.0
Published: Aug 25, 2025 License: Apache-2.0 Imports: 8 Imported by: 0

Documentation

Overview

Package optimizer provides various optimization algorithms for neural networks.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type AdamW

type AdamW[T tensor.Numeric] struct {
	// contains filtered or unexported fields
}

AdamW implements the AdamW optimization algorithm: Adam with decoupled weight decay.

func NewAdamW

func NewAdamW[T tensor.Numeric](engine compute.Engine[T], learningRate, beta1, beta2, epsilon, weightDecay T) *AdamW[T]

NewAdamW returns an AdamW optimizer configured with the given compute engine, learning rate, beta1 and beta2 moment coefficients, epsilon, and weight-decay factor.

func (*AdamW[T]) Step

func (a *AdamW[T]) Step(ctx context.Context, params []*graph.Parameter[T]) error

Step updates the parameters based on their gradients.
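
A minimal usage sketch. The import paths below are hypothetical placeholders (the real paths depend on this module); the engine would come from the module's compute package in practice.

package main

import (
	"context"
	"log"

	"example.com/m/compute"   // hypothetical import path
	"example.com/m/graph"     // hypothetical import path
	"example.com/m/optimizer" // hypothetical import path
)

func main() {
	var engine compute.Engine[float32] // obtained from the compute package in practice

	// Commonly used Adam-style hyperparameters: lr=1e-3, beta1=0.9,
	// beta2=0.999, epsilon=1e-8, weight decay=0.01.
	opt := optimizer.NewAdamW[float32](engine, 1e-3, 0.9, 0.999, 1e-8, 0.01)

	var params []*graph.Parameter[float32] // parameters with gradients already populated
	if err := opt.Step(context.Background(), params); err != nil {
		log.Fatal(err)
	}
}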

type Optimizer

type Optimizer[T tensor.Numeric] interface {
	Step(ctx context.Context, params []*graph.Parameter[T]) error
}

Optimizer defines the interface for optimization algorithms.
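
Because both AdamW and SGD satisfy Optimizer, training code can be written against the interface. A sketch, where trainStep is a hypothetical helper and imports are as above:

// trainStep applies one optimization step using any Optimizer
// implementation in this package, e.g. *AdamW[T] or *SGD[T].
func trainStep[T tensor.Numeric](ctx context.Context, opt optimizer.Optimizer[T], params []*graph.Parameter[T]) error {
	return opt.Step(ctx, params)
}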

type SGD

type SGD[T tensor.Numeric] struct {
	// contains filtered or unexported fields
}

SGD implements the stochastic gradient descent optimizer.

func NewSGD

func NewSGD[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T], learningRate float32) *SGD[T]

NewSGD returns an SGD optimizer configured with the given compute engine, arithmetic ops, and learning rate.

func (*SGD[T]) Clip

func (s *SGD[T]) Clip(ctx context.Context, params []*graph.Parameter[T], threshold float32)

Clip clips the parameters' gradients to the given threshold.

func (*SGD[T]) Step

func (s *SGD[T]) Step(ctx context.Context, params []*graph.Parameter[T]) error

Step updates the parameters based on their gradients.
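
A sketch of the typical Clip-then-Step sequence, assuming engine (a compute.Engine[float32]), ops (a numeric.Arithmetic[float32]), ctx, and params are already in scope (all hypothetical names):

sgd := optimizer.NewSGD[float32](engine, ops, 0.01)

// Clip gradients before applying the update to bound the step size.
sgd.Clip(ctx, params, 1.0)
if err := sgd.Step(ctx, params); err != nil {
	// handle error
}

Clipping before Step is optional; Step alone applies the plain SGD update.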
