Documentation ¶
Overview ¶
Package activations provides neural network activation functions.
Index ¶
- type ActivationLayer
- type BaseActivation
  - func NewBaseActivation[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T], forwardOp, backwardOp func(T) T) *BaseActivation[T]
  - func (b *BaseActivation[T]) Backward(ctx context.Context, outputGradient *tensor.Tensor[T]) ([]*tensor.Tensor[T], error)
  - func (b *BaseActivation[T]) Forward(ctx context.Context, inputs ...*tensor.Tensor[T]) (*tensor.Tensor[T], error)
  - func (b *BaseActivation[T]) OutputShape() []int
- type LeakyReLU
  - func NewLeakyReLU[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T], alpha float64) *LeakyReLU[T]
  - func (l *LeakyReLU[T]) Backward(ctx context.Context, outputGradient *tensor.Tensor[T]) ([]*tensor.Tensor[T], error)
  - func (l *LeakyReLU[T]) Forward(ctx context.Context, inputs ...*tensor.Tensor[T]) (*tensor.Tensor[T], error)
- type ReLU
- type Sigmoid
- type SwiGLU
  - func (s *SwiGLU[T]) Backward(ctx context.Context, dOut *tensor.Tensor[T], inputs ...*tensor.Tensor[T]) ([]*tensor.Tensor[T], error)
  - func (s *SwiGLU[T]) Forward(ctx context.Context, inputs ...*tensor.Tensor[T]) (*tensor.Tensor[T], error)
  - func (s *SwiGLU[T]) OutputShape(inputShapes ...[]int) ([]int, error)
  - func (s *SwiGLU[T]) Parameters() []graph.Parameter[T]
- type Tanh
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type ActivationLayer ¶
type ActivationLayer[T tensor.Numeric] interface {
	Forward(ctx context.Context, inputs ...*tensor.Tensor[T]) (*tensor.Tensor[T], error)
	Backward(ctx context.Context, outputGradient *tensor.Tensor[T]) ([]*tensor.Tensor[T], error)
}
ActivationLayer defines the interface for activation layers used in tests.
type BaseActivation ¶
type BaseActivation[T tensor.Numeric] struct {
	graph.NoParameters[T]
	// contains filtered or unexported fields
}
BaseActivation provides common functionality for unary activation functions.
func NewBaseActivation ¶
func NewBaseActivation[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T], forwardOp, backwardOp func(T) T) *BaseActivation[T]
NewBaseActivation creates a new base activation with the given forward and backward operations.
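As an illustration of the forwardOp/backwardOp pattern (a sketch only, not code from this package, with T fixed to float64 and the names tanhForward/tanhBackward invented here), a tanh pair could look like:

package main

import (
	"fmt"
	"math"
)

// tanhForward and tanhBackward form the kind of elementwise pair that
// NewBaseActivation expects for its forwardOp and backwardOp arguments.
func tanhForward(x float64) float64 { return math.Tanh(x) }

// d/dx tanh(x) = 1 - tanh^2(x)
func tanhBackward(x float64) float64 {
	t := math.Tanh(x)
	return 1 - t*t
}

func main() {
	fmt.Println(tanhForward(0.5))  // ~0.4621
	fmt.Println(tanhBackward(0.5)) // ~0.7864
}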
func (*BaseActivation[T]) Backward ¶
func (b *BaseActivation[T]) Backward(ctx context.Context, outputGradient *tensor.Tensor[T]) ([]*tensor.Tensor[T], error)
Backward performs the backward pass of the activation function.
func (*BaseActivation[T]) Forward ¶
func (b *BaseActivation[T]) Forward(ctx context.Context, inputs ...*tensor.Tensor[T]) (*tensor.Tensor[T], error)
Forward performs the forward pass of the activation function.
func (*BaseActivation[T]) OutputShape ¶
func (b *BaseActivation[T]) OutputShape() []int
OutputShape returns the output shape of the activation.
type LeakyReLU ¶
type LeakyReLU[T tensor.Numeric] struct {
	graph.NoParameters[T]
	// contains filtered or unexported fields
}
LeakyReLU implements the Leaky Rectified Linear Unit activation function.
func NewLeakyReLU ¶
func NewLeakyReLU[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T], alpha float64) *LeakyReLU[T]
NewLeakyReLU creates a new LeakyReLU activation function with negative-slope coefficient alpha.
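For reference, Forward and Backward below apply the Leaky ReLU rule elementwise: f(x) = x for x > 0 and f(x) = alpha*x otherwise, so the derivative is 1 for x > 0 and alpha otherwise. A minimal float64 sketch of the rule, independent of this package's tensor types:

package main

import "fmt"

// leakyReLU applies the elementwise rule: x for x > 0, alpha*x otherwise.
func leakyReLU(x, alpha float64) float64 {
	if x > 0 {
		return x
	}
	return alpha * x
}

// leakyReLUGrad is the derivative of the rule above: 1 for x > 0, alpha otherwise.
func leakyReLUGrad(x, alpha float64) float64 {
	if x > 0 {
		return 1
	}
	return alpha
}

func main() {
	fmt.Println(leakyReLU(2, 0.01), leakyReLU(-2, 0.01))         // 2 -0.02
	fmt.Println(leakyReLUGrad(2, 0.01), leakyReLUGrad(-2, 0.01)) // 1 0.01
}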
func (*LeakyReLU[T]) Backward ¶
func (l *LeakyReLU[T]) Backward(ctx context.Context, outputGradient *tensor.Tensor[T]) ([]*tensor.Tensor[T], error)
Backward computes the gradients for the LeakyReLU activation.
func (*LeakyReLU[T]) Forward ¶
func (l *LeakyReLU[T]) Forward(ctx context.Context, inputs ...*tensor.Tensor[T]) (*tensor.Tensor[T], error)
Forward computes the LeakyReLU activation for the given input.
func (*LeakyReLU[T]) OutputShape ¶
OutputShape returns the output shape of the LeakyReLU layer.
type ReLU ¶
type ReLU[T tensor.Numeric] struct {
	*BaseActivation[T]
}
ReLU implements the Rectified Linear Unit activation function.
type Sigmoid ¶
type Sigmoid[T tensor.Numeric] struct {
	*BaseActivation[T]
}
Sigmoid implements the sigmoid activation function.
func NewSigmoid ¶
NewSigmoid creates a new Sigmoid activation function.
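For reference, the sigmoid function is σ(x) = 1 / (1 + e^(-x)), and its derivative takes the convenient form σ(x) * (1 - σ(x)). A minimal float64 sketch of the pair, independent of this package's tensor types:

package main

import (
	"fmt"
	"math"
)

// sigmoid computes 1 / (1 + e^-x).
func sigmoid(x float64) float64 {
	return 1 / (1 + math.Exp(-x))
}

// sigmoidGrad uses the identity d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x)).
func sigmoidGrad(x float64) float64 {
	s := sigmoid(x)
	return s * (1 - s)
}

func main() {
	fmt.Println(sigmoid(0))     // 0.5
	fmt.Println(sigmoidGrad(0)) // 0.25
}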
type SwiGLU ¶
SwiGLU implements the SwiGLU activation function.
func (*SwiGLU[T]) Backward ¶
func (s *SwiGLU[T]) Backward(ctx context.Context, dOut *tensor.Tensor[T], inputs ...*tensor.Tensor[T]) ([]*tensor.Tensor[T], error)
Backward computes the gradients for SwiGLU.
func (*SwiGLU[T]) Forward ¶
func (s *SwiGLU[T]) Forward(ctx context.Context, inputs ...*tensor.Tensor[T]) (*tensor.Tensor[T], error)
Forward computes the SwiGLU activation. Input: a tensor whose last dimension is 2 * feature_dim.
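The standard SwiGLU formulation splits the input in half along its last dimension and gates one half with the other: out = SiLU(a) ⊙ b, where SiLU(x) = x * σ(x). A minimal sketch over a flat float64 slice; which half acts as the gate is an assumption here, so check the implementation for the exact split convention:

package main

import (
	"fmt"
	"math"
)

// silu computes x * sigmoid(x), also known as Swish.
func silu(x float64) float64 {
	return x / (1 + math.Exp(-x))
}

// swiglu treats the first half of x as the gate and the second half as the
// value: out[i] = silu(x[i]) * x[i+d], with d = len(x)/2. Which half gates
// is an assumption; the package may use the opposite convention.
func swiglu(x []float64) []float64 {
	d := len(x) / 2
	out := make([]float64, d)
	for i := 0; i < d; i++ {
		out[i] = silu(x[i]) * x[i+d]
	}
	return out
}

func main() {
	// Last dimension 4 = 2 * feature_dim, so the output has feature_dim 2.
	fmt.Println(swiglu([]float64{1, 2, 3, 4})) // [~2.193 ~7.046]
}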
func (*SwiGLU[T]) OutputShape ¶
func (s *SwiGLU[T]) OutputShape(inputShapes ...[]int) ([]int, error)
OutputShape returns the output shape of SwiGLU. Input shape is (..., 2 * feature_dim); output shape is (..., feature_dim).
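A minimal sketch of that shape rule (the helper name and error text are illustrative, not this package's internals):

package main

import "fmt"

// swigluOutputShape halves the last dimension: (..., 2*f) -> (..., f).
func swigluOutputShape(in []int) ([]int, error) {
	if len(in) == 0 {
		return nil, fmt.Errorf("empty shape")
	}
	last := in[len(in)-1]
	if last%2 != 0 {
		return nil, fmt.Errorf("last dimension %d is not even", last)
	}
	out := append([]int{}, in[:len(in)-1]...)
	return append(out, last/2), nil
}

func main() {
	shape, _ := swigluOutputShape([]int{32, 10, 1024})
	fmt.Println(shape) // [32 10 512]
}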
func (*SwiGLU[T]) Parameters ¶
func (s *SwiGLU[T]) Parameters() []graph.Parameter[T]
Parameters returns an empty slice, as SwiGLU has no trainable parameters.