activations

package v0.3.0
Published: Aug 25, 2025 License: Apache-2.0 Imports: 10 Imported by: 1

Documentation

Overview

Package activations provides activation function layers for neural networks.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func BuildFastGelu added in v0.3.0

func BuildFastGelu[T tensor.Float](
	engine compute.Engine[T],
	_ numeric.Arithmetic[T],
	_ string,
	_ map[string]*graph.Parameter[T],
	_ map[string]interface{},
) (graph.Node[T], error)

BuildFastGelu constructs a FastGelu layer for the registry.

Types

type ActivationLayer

type ActivationLayer[T tensor.Numeric] interface {
	Forward(ctx context.Context, inputs ...*tensor.TensorNumeric[T]) (*tensor.TensorNumeric[T], error)
	Backward(ctx context.Context, mode types.BackwardMode, outputGradient *tensor.TensorNumeric[T], inputs ...*tensor.TensorNumeric[T]) ([]*tensor.TensorNumeric[T], error)
}

ActivationLayer defines the interface for activation layers used in tests.

type BaseActivation

type BaseActivation[T tensor.Numeric] struct {
	graph.NoParameters[T]
	// contains filtered or unexported fields
}

BaseActivation provides common functionality for unary activation functions.

func NewBaseActivation

func NewBaseActivation[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T], opType string, opts ...BaseActivationOption[T]) *BaseActivation[T]

NewBaseActivation creates a new base activation for the given operation type; the forward and backward operations are supplied via options.

func NewReLU

func NewReLU[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T]) *BaseActivation[T]

NewReLU creates a new ReLU activation function.

func NewSigmoid

func NewSigmoid[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T]) *BaseActivation[T]

NewSigmoid creates a new Sigmoid activation function.

func NewTanh

func NewTanh[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T]) *BaseActivation[T]

NewTanh creates a new Tanh activation function.
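The three constructors above wrap the classic element-wise activations. As a self-contained sketch, the scalar formulas they apply are standard (how the package maps them over a tensor via compute.Engine is not shown here, and the helper names below are illustrative, not the package's API):

```go
package main

import (
	"fmt"
	"math"
)

// relu returns max(0, x).
func relu(x float64) float64 { return math.Max(0, x) }

// sigmoid returns 1 / (1 + e^-x).
func sigmoid(x float64) float64 { return 1 / (1 + math.Exp(-x)) }

// tanh returns the hyperbolic tangent of x.
func tanh(x float64) float64 { return math.Tanh(x) }

func main() {
	for _, x := range []float64{-1, 0, 1} {
		fmt.Printf("x=%v relu=%v sigmoid=%.4f tanh=%.4f\n", x, relu(x), sigmoid(x), tanh(x))
	}
}
```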

func (*BaseActivation[T]) Attributes added in v0.3.0

func (b *BaseActivation[T]) Attributes() map[string]interface{}

Attributes returns the attributes of the activation.

func (*BaseActivation[T]) Backward

func (b *BaseActivation[T]) Backward(ctx context.Context, mode types.BackwardMode, outputGradient *tensor.TensorNumeric[T], _ ...*tensor.TensorNumeric[T]) ([]*tensor.TensorNumeric[T], error)

Backward performs the backward pass of the activation function.

func (*BaseActivation[T]) Forward

func (b *BaseActivation[T]) Forward(ctx context.Context, inputs ...*tensor.TensorNumeric[T]) (*tensor.TensorNumeric[T], error)

Forward performs the forward pass of the activation function.

func (*BaseActivation[T]) OpType added in v0.3.0

func (b *BaseActivation[T]) OpType() string

OpType returns the operation type of the activation.

func (*BaseActivation[T]) OutputShape

func (b *BaseActivation[T]) OutputShape() []int

OutputShape returns the output shape of the activation.

type BaseActivationOption added in v0.3.0

type BaseActivationOption[T tensor.Numeric] func(*BaseActivationOptions[T])

BaseActivationOption is a function that applies an option to BaseActivationOptions.

func WithBackwardOp added in v0.3.0

func WithBackwardOp[T tensor.Numeric](op func(T) T) BaseActivationOption[T]

WithBackwardOp sets the backward operation for the BaseActivation.

func WithForwardOp added in v0.3.0

func WithForwardOp[T tensor.Numeric](op func(T) T) BaseActivationOption[T]

WithForwardOp sets the forward operation for the BaseActivation.
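WithForwardOp and WithBackwardOp follow Go's functional-options pattern. A minimal, self-contained sketch of that pattern (the names options, withForwardOp, and newActivation here are illustrative stand-ins, not this package's unexported internals):

```go
package main

import "fmt"

// options collects the configurable pieces, mirroring BaseActivationOptions.
type options struct {
	forwardOp  func(float64) float64
	backwardOp func(float64) float64
}

// option mutates an options value, mirroring BaseActivationOption.
type option func(*options)

func withForwardOp(op func(float64) float64) option {
	return func(o *options) { o.forwardOp = op }
}

func withBackwardOp(op func(float64) float64) option {
	return func(o *options) { o.backwardOp = op }
}

// newActivation applies each option in order, as constructors taking
// variadic options conventionally do.
func newActivation(opts ...option) *options {
	o := &options{}
	for _, opt := range opts {
		opt(o)
	}
	return o
}

func main() {
	reluFwd := func(x float64) float64 {
		if x > 0 {
			return x
		}
		return 0
	}
	reluBwd := func(x float64) float64 {
		if x > 0 {
			return 1
		}
		return 0
	}
	a := newActivation(withForwardOp(reluFwd), withBackwardOp(reluBwd))
	fmt.Println(a.forwardOp(2), a.backwardOp(-3)) // 2 0
}
```

The pattern lets callers set only the options they need while the constructor supplies defaults for the rest.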

type BaseActivationOptions added in v0.3.0

type BaseActivationOptions[T tensor.Numeric] struct {
	ForwardOp  func(T) T
	BackwardOp func(T) T
}

BaseActivationOptions holds configuration options for BaseActivation.

type FastGelu added in v0.3.0

type FastGelu[T tensor.Float] struct {
	// contains filtered or unexported fields
}

FastGelu is an approximation of the GELU activation function.

func NewFastGelu added in v0.3.0

func NewFastGelu[T tensor.Float](engine compute.Engine[T]) *FastGelu[T]

NewFastGelu creates a new FastGelu layer.
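"FastGelu" usually denotes the tanh-based approximation of GELU. A self-contained sketch of that formula follows; whether this package uses exactly these constants is an assumption:

```go
package main

import (
	"fmt"
	"math"
)

// fastGelu computes the common tanh approximation of GELU:
// 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715*x^3))).
func fastGelu(x float64) float64 {
	const c = 0.7978845608028654 // sqrt(2/pi)
	return 0.5 * x * (1 + math.Tanh(c*(x+0.044715*x*x*x)))
}

func main() {
	for _, x := range []float64{-2, -1, 0, 1, 2} {
		fmt.Printf("fastGelu(%v) = %.6f\n", x, fastGelu(x))
	}
}
```

For large positive x the result approaches x; for large negative x it approaches 0, matching exact GELU's behavior.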

func (*FastGelu[T]) Attributes added in v0.3.0

func (g *FastGelu[T]) Attributes() map[string]interface{}

Attributes returns nil for the FastGelu layer.

func (*FastGelu[T]) Backward added in v0.3.0

func (g *FastGelu[T]) Backward(_ context.Context, mode types.BackwardMode, _ *tensor.TensorNumeric[T], _ ...*tensor.TensorNumeric[T]) ([]*tensor.TensorNumeric[T], error)

Backward applies the backward pass of the FastGelu layer.

func (*FastGelu[T]) Forward added in v0.3.0

func (g *FastGelu[T]) Forward(ctx context.Context, inputs ...*tensor.TensorNumeric[T]) (*tensor.TensorNumeric[T], error)

Forward applies the forward pass of the FastGelu layer.

func (*FastGelu[T]) OpType added in v0.3.0

func (g *FastGelu[T]) OpType() string

OpType returns the operation type of the FastGelu layer.

func (*FastGelu[T]) OutputShape added in v0.3.0

func (g *FastGelu[T]) OutputShape() []int

OutputShape returns the output shape of the layer.

func (*FastGelu[T]) Parameters added in v0.3.0

func (g *FastGelu[T]) Parameters() []*graph.Parameter[T]

Parameters returns the learnable parameters of the layer.

type LeakyReLU

type LeakyReLU[T tensor.Numeric] struct {
	graph.NoParameters[T]
	// contains filtered or unexported fields
}

LeakyReLU implements the Leaky Rectified Linear Unit activation function.

func NewLeakyReLU

func NewLeakyReLU[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T], opts ...LeakyReLUOption[T]) *LeakyReLU[T]

NewLeakyReLU creates a new LeakyReLU activation function.

func (*LeakyReLU[T]) Attributes added in v0.3.0

func (l *LeakyReLU[T]) Attributes() map[string]interface{}

Attributes returns the attributes of the LeakyReLU layer.

func (*LeakyReLU[T]) Backward

func (l *LeakyReLU[T]) Backward(ctx context.Context, mode types.BackwardMode, outputGradient *tensor.TensorNumeric[T], _ ...*tensor.TensorNumeric[T]) ([]*tensor.TensorNumeric[T], error)

Backward computes the gradients for the LeakyReLU activation.

func (*LeakyReLU[T]) Forward

func (l *LeakyReLU[T]) Forward(ctx context.Context, inputs ...*tensor.TensorNumeric[T]) (*tensor.TensorNumeric[T], error)

Forward computes the LeakyReLU activation for the given input.

func (*LeakyReLU[T]) OpType added in v0.3.0

func (l *LeakyReLU[T]) OpType() string

OpType returns the operation type of the LeakyReLU layer.

func (*LeakyReLU[T]) OutputShape

func (l *LeakyReLU[T]) OutputShape() []int

OutputShape returns the output shape of the LeakyReLU layer.

type LeakyReLUOption added in v0.3.0

type LeakyReLUOption[T tensor.Numeric] func(*LeakyReLUOptions[T])

LeakyReLUOption is a function that applies an option to LeakyReLUOptions.

func WithAlpha added in v0.3.0

func WithAlpha[T tensor.Numeric](alpha float64) LeakyReLUOption[T]

WithAlpha sets the alpha parameter for LeakyReLU.
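The element-wise rule parameterized by alpha is the standard one: pass positive inputs through, scale negative inputs by alpha. A self-contained sketch (0.01 is a common choice for alpha; the default this package uses when WithAlpha is omitted is not documented here):

```go
package main

import "fmt"

// leakyReLU returns x when x > 0 and alpha*x otherwise.
func leakyReLU(x, alpha float64) float64 {
	if x > 0 {
		return x
	}
	return alpha * x
}

func main() {
	fmt.Println(leakyReLU(3, 0.01))  // positive inputs pass through: 3
	fmt.Println(leakyReLU(-2, 0.01)) // negative inputs are scaled: -0.02
}
```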

type LeakyReLUOptions added in v0.3.0

type LeakyReLUOptions[T tensor.Numeric] struct {
	Alpha float64
}

LeakyReLUOptions holds configuration options for LeakyReLU.

type ReLU

type ReLU[T tensor.Numeric] struct {
	*BaseActivation[T]
}

ReLU implements the Rectified Linear Unit activation function.

type Sigmoid

type Sigmoid[T tensor.Numeric] struct {
	*BaseActivation[T]
}

Sigmoid implements the sigmoid activation function.

type SwiGLU

type SwiGLU[T tensor.Numeric] struct {
	// contains filtered or unexported fields
}

SwiGLU implements the SwiGLU activation function.

func NewSwiGLU

func NewSwiGLU[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T], opts ...SwiGLUOption[T]) *SwiGLU[T]

NewSwiGLU creates a new SwiGLU activation layer.

func (*SwiGLU[T]) Attributes added in v0.3.0

func (s *SwiGLU[T]) Attributes() map[string]interface{}

Attributes returns the attributes.

func (*SwiGLU[T]) Backward

func (s *SwiGLU[T]) Backward(ctx context.Context, mode types.BackwardMode, dOut *tensor.TensorNumeric[T], inputs ...*tensor.TensorNumeric[T]) ([]*tensor.TensorNumeric[T], error)

Backward computes the gradients for SwiGLU.

func (*SwiGLU[T]) Forward

func (s *SwiGLU[T]) Forward(ctx context.Context, inputs ...*tensor.TensorNumeric[T]) (*tensor.TensorNumeric[T], error)

Forward computes the SwiGLU activation. The input tensor's last dimension must be 2 * feature_dim.
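SwiGLU conventionally splits that doubled last dimension into two halves a and b and returns swish(a) * b element-wise, where swish(x) = x * sigmoid(x). A self-contained sketch on a flat slice; which half acts as the gate, and the split order, are assumptions about this implementation:

```go
package main

import (
	"fmt"
	"math"
)

// swiGLU splits x (length 2*featureDim) into halves a and b and
// returns swish(a) * b, where swish(v) = v * sigmoid(v).
func swiGLU(x []float64) []float64 {
	d := len(x) / 2
	out := make([]float64, d)
	for i := 0; i < d; i++ {
		a, b := x[i], x[d+i]
		swish := a / (1 + math.Exp(-a)) // equals a * sigmoid(a)
		out[i] = swish * b
	}
	return out
}

func main() {
	fmt.Println(swiGLU([]float64{0, 1, 2, 3})) // featureDim = 2
}
```

The output's last dimension is feature_dim, half the input's, which is why OutputShape differs from the input shape for this layer.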

func (*SwiGLU[T]) OpType added in v0.3.0

func (s *SwiGLU[T]) OpType() string

OpType returns the operation type.

func (*SwiGLU[T]) OutputShape

func (s *SwiGLU[T]) OutputShape() []int

OutputShape returns the output shape of SwiGLU.

func (*SwiGLU[T]) Parameters

func (s *SwiGLU[T]) Parameters() []*graph.Parameter[T]

Parameters returns an empty slice as SwiGLU has no trainable parameters.

type SwiGLUOption added in v0.3.0

type SwiGLUOption[T tensor.Numeric] func(*SwiGLUOptions[T])

SwiGLUOption is a function that applies an option to SwiGLUOptions.

type SwiGLUOptions added in v0.3.0

type SwiGLUOptions[T tensor.Numeric] struct {
}

SwiGLUOptions holds configuration options for SwiGLU.

type Tanh

type Tanh[T tensor.Numeric] struct {
	*BaseActivation[T]
}

Tanh implements the hyperbolic tangent activation function.
