Documentation ¶
Overview ¶
Package activations provides neural network activation function layers.
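A minimal usage sketch (illustrative only; import paths are omitted, and constructing a compute.Engine, numeric.Arithmetic, and input tensor depends on the rest of the framework):

func applyReLU[T tensor.Numeric](
	ctx context.Context,
	engine compute.Engine[T],
	ops numeric.Arithmetic[T],
	x *tensor.TensorNumeric[T],
) (*tensor.TensorNumeric[T], error) {
	// NewReLU wires the element-wise ReLU forward/backward ops into a BaseActivation.
	relu := activations.NewReLU[T](engine, ops)
	return relu.Forward(ctx, x)
}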
Index ¶
- func BuildFastGelu[T tensor.Float](engine compute.Engine[T], _ numeric.Arithmetic[T], _ string, ...) (graph.Node[T], error)
- type ActivationLayer
- type BaseActivation
- func NewBaseActivation[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T], opType string, ...) *BaseActivation[T]
- func NewReLU[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T]) *BaseActivation[T]
- func NewSigmoid[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T]) *BaseActivation[T]
- func NewTanh[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T]) *BaseActivation[T]
- func (b *BaseActivation[T]) Attributes() map[string]interface{}
- func (b *BaseActivation[T]) Backward(ctx context.Context, mode types.BackwardMode, ...) ([]*tensor.TensorNumeric[T], error)
- func (b *BaseActivation[T]) Forward(ctx context.Context, inputs ...*tensor.TensorNumeric[T]) (*tensor.TensorNumeric[T], error)
- func (b *BaseActivation[T]) OpType() string
- func (b *BaseActivation[T]) OutputShape() []int
- type BaseActivationOption
- func WithBackwardOp[T tensor.Numeric](op func(T) T) BaseActivationOption[T]
- func WithForwardOp[T tensor.Numeric](op func(T) T) BaseActivationOption[T]
- type BaseActivationOptions
- type FastGelu
- func (g *FastGelu[T]) Attributes() map[string]interface{}
- func (g *FastGelu[T]) Backward(_ context.Context, mode types.BackwardMode, _ *tensor.TensorNumeric[T], ...) ([]*tensor.TensorNumeric[T], error)
- func (g *FastGelu[T]) Forward(ctx context.Context, inputs ...*tensor.TensorNumeric[T]) (*tensor.TensorNumeric[T], error)
- func (g *FastGelu[T]) OpType() string
- func (g *FastGelu[T]) OutputShape() []int
- func (g *FastGelu[T]) Parameters() []*graph.Parameter[T]
- type LeakyReLU
- func NewLeakyReLU[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T], opts ...LeakyReLUOption[T]) *LeakyReLU[T]
- func (l *LeakyReLU[T]) Attributes() map[string]interface{}
- func (l *LeakyReLU[T]) Backward(ctx context.Context, mode types.BackwardMode, ...) ([]*tensor.TensorNumeric[T], error)
- func (l *LeakyReLU[T]) Forward(ctx context.Context, inputs ...*tensor.TensorNumeric[T]) (*tensor.TensorNumeric[T], error)
- func (l *LeakyReLU[T]) OpType() string
- func (l *LeakyReLU[T]) OutputShape() []int
- type LeakyReLUOption
- type LeakyReLUOptions
- type ReLU
- type Sigmoid
- type SwiGLU
- func NewSwiGLU[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T], opts ...SwiGLUOption[T]) *SwiGLU[T]
- func (s *SwiGLU[T]) Attributes() map[string]interface{}
- func (s *SwiGLU[T]) Backward(ctx context.Context, mode types.BackwardMode, dOut *tensor.TensorNumeric[T], ...) ([]*tensor.TensorNumeric[T], error)
- func (s *SwiGLU[T]) Forward(ctx context.Context, inputs ...*tensor.TensorNumeric[T]) (*tensor.TensorNumeric[T], error)
- func (s *SwiGLU[T]) OpType() string
- func (s *SwiGLU[T]) OutputShape() []int
- func (s *SwiGLU[T]) Parameters() []*graph.Parameter[T]
- type SwiGLUOption
- type SwiGLUOptions
- type Tanh
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func BuildFastGelu ¶ added in v0.3.0
func BuildFastGelu[T tensor.Float](engine compute.Engine[T], _ numeric.Arithmetic[T], _ string, ...) (graph.Node[T], error)
BuildFastGelu builds a FastGelu node for use in a computation graph.
Types ¶
type ActivationLayer ¶
type ActivationLayer[T tensor.Numeric] interface {
	Forward(ctx context.Context, inputs ...*tensor.TensorNumeric[T]) (*tensor.TensorNumeric[T], error)
	Backward(ctx context.Context, mode types.BackwardMode, outputGradient *tensor.TensorNumeric[T], inputs ...*tensor.TensorNumeric[T]) ([]*tensor.TensorNumeric[T], error)
}
ActivationLayer defines the interface for activation layers used in tests.
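Because every layer in this package satisfies ActivationLayer, code can be written against the interface. A hypothetical helper that runs one forward/backward pass over any activation (the choice of types.BackwardMode value is up to the caller):

func forwardBackward[T tensor.Numeric](
	ctx context.Context,
	layer activations.ActivationLayer[T],
	mode types.BackwardMode,
	x, dOut *tensor.TensorNumeric[T],
) ([]*tensor.TensorNumeric[T], error) {
	// Run the forward pass first; implementations may cache state needed by Backward.
	if _, err := layer.Forward(ctx, x); err != nil {
		return nil, err
	}
	// dOut is the gradient of the loss with respect to the layer's output.
	return layer.Backward(ctx, mode, dOut, x)
}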
type BaseActivation ¶
type BaseActivation[T tensor.Numeric] struct {
	graph.NoParameters[T]
	// contains filtered or unexported fields
}
BaseActivation provides common functionality for unary activation functions.
func NewBaseActivation ¶
func NewBaseActivation[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T], opType string, opts ...BaseActivationOption[T]) *BaseActivation[T]
NewBaseActivation creates a new base activation with the given forward and backward operations.
func NewReLU ¶
func NewReLU[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T]) *BaseActivation[T]
NewReLU creates a new ReLU activation function.
func NewSigmoid ¶
func NewSigmoid[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T]) *BaseActivation[T]
NewSigmoid creates a new Sigmoid activation function.
func NewTanh ¶
func NewTanh[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T]) *BaseActivation[T]
NewTanh creates a new Tanh activation function.
func (*BaseActivation[T]) Attributes ¶ added in v0.3.0
func (b *BaseActivation[T]) Attributes() map[string]interface{}
Attributes returns the attributes of the activation.
func (*BaseActivation[T]) Backward ¶
func (b *BaseActivation[T]) Backward(ctx context.Context, mode types.BackwardMode, outputGradient *tensor.TensorNumeric[T], _ ...*tensor.TensorNumeric[T]) ([]*tensor.TensorNumeric[T], error)
Backward performs the backward pass of the activation function.
func (*BaseActivation[T]) Forward ¶
func (b *BaseActivation[T]) Forward(ctx context.Context, inputs ...*tensor.TensorNumeric[T]) (*tensor.TensorNumeric[T], error)
Forward performs the forward pass of the activation function.
func (*BaseActivation[T]) OpType ¶ added in v0.3.0
func (b *BaseActivation[T]) OpType() string
OpType returns the operation type of the activation.
func (*BaseActivation[T]) OutputShape ¶
func (b *BaseActivation[T]) OutputShape() []int
OutputShape returns the output shape of the activation.
type BaseActivationOption ¶ added in v0.3.0
type BaseActivationOption[T tensor.Numeric] func(*BaseActivationOptions[T])
BaseActivationOption is a function that applies an option to BaseActivationOptions.
func WithBackwardOp ¶ added in v0.3.0
func WithBackwardOp[T tensor.Numeric](op func(T) T) BaseActivationOption[T]
WithBackwardOp sets the backward operation for the BaseActivation.
func WithForwardOp ¶ added in v0.3.0
func WithForwardOp[T tensor.Numeric](op func(T) T) BaseActivationOption[T]
WithForwardOp sets the forward operation for the BaseActivation.
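Together, WithForwardOp and WithBackwardOp let NewBaseActivation define a new element-wise activation without a dedicated type. A sketch for T = float32 defining Softsign (hypothetical example; it assumes the backward op receives the pre-activation input, which the documentation here does not state):

func newSoftsign(engine compute.Engine[float32], ops numeric.Arithmetic[float32]) *activations.BaseActivation[float32] {
	abs := func(x float32) float32 {
		if x < 0 {
			return -x
		}
		return x
	}
	return activations.NewBaseActivation(engine, ops, "Softsign",
		// softsign(x) = x / (1 + |x|)
		activations.WithForwardOp[float32](func(x float32) float32 { return x / (1 + abs(x)) }),
		// d/dx softsign(x) = 1 / (1 + |x|)²
		activations.WithBackwardOp[float32](func(x float32) float32 {
			d := 1 + abs(x)
			return 1 / (d * d)
		}),
	)
}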
type BaseActivationOptions ¶ added in v0.3.0
BaseActivationOptions holds configuration options for BaseActivation.
type FastGelu ¶ added in v0.3.0
FastGelu is an approximation of the GELU activation function.
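For reference, a scalar sketch of the widely used tanh-based approximation of GELU; the layer's exact formula may differ in detail:

import "math"

// fastGeluScalar computes 0.5·x·(1 + tanh(√(2/π)·(x + 0.044715·x³))).
func fastGeluScalar(x float64) float64 {
	const sqrt2OverPi = 0.7978845608028654 // √(2/π)
	return 0.5 * x * (1 + math.Tanh(sqrt2OverPi*(x+0.044715*x*x*x)))
}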
func NewFastGelu ¶ added in v0.3.0
NewFastGelu creates a new FastGelu layer.
func (*FastGelu[T]) Attributes ¶ added in v0.3.0
func (g *FastGelu[T]) Attributes() map[string]interface{}
Attributes returns nil for the FastGelu layer.
func (*FastGelu[T]) Backward ¶ added in v0.3.0
func (g *FastGelu[T]) Backward(_ context.Context, mode types.BackwardMode, _ *tensor.TensorNumeric[T], _ ...*tensor.TensorNumeric[T]) ([]*tensor.TensorNumeric[T], error)
Backward applies the backward pass of the FastGelu layer.
func (*FastGelu[T]) Forward ¶ added in v0.3.0
func (g *FastGelu[T]) Forward(ctx context.Context, inputs ...*tensor.TensorNumeric[T]) (*tensor.TensorNumeric[T], error)
Forward applies the forward pass of the FastGelu layer.
func (*FastGelu[T]) OpType ¶ added in v0.3.0
func (g *FastGelu[T]) OpType() string
OpType returns the operation type of the FastGelu layer.
func (*FastGelu[T]) OutputShape ¶ added in v0.3.0
func (g *FastGelu[T]) OutputShape() []int
OutputShape returns the output shape of the layer.
func (*FastGelu[T]) Parameters ¶ added in v0.3.0
func (g *FastGelu[T]) Parameters() []*graph.Parameter[T]
Parameters returns the learnable parameters of the layer.
type LeakyReLU ¶
type LeakyReLU[T tensor.Numeric] struct {
	graph.NoParameters[T]
	// contains filtered or unexported fields
}
LeakyReLU implements the Leaky Rectified Linear Unit activation function.
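Element-wise, leaky ReLU keeps a small negative slope alpha instead of zeroing negative inputs. A scalar sketch (alpha is presumably configured through LeakyReLUOptions; the exact option is not shown here):

// leakyReLUScalar returns x for x > 0 and alpha·x otherwise.
func leakyReLUScalar(x, alpha float32) float32 {
	if x > 0 {
		return x
	}
	return alpha * x
}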
func NewLeakyReLU ¶
func NewLeakyReLU[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T], opts ...LeakyReLUOption[T]) *LeakyReLU[T]
NewLeakyReLU creates a new LeakyReLU activation function.
func (*LeakyReLU[T]) Attributes ¶ added in v0.3.0
func (l *LeakyReLU[T]) Attributes() map[string]interface{}
Attributes returns the attributes of the LeakyReLU layer.
func (*LeakyReLU[T]) Backward ¶
func (l *LeakyReLU[T]) Backward(ctx context.Context, mode types.BackwardMode, outputGradient *tensor.TensorNumeric[T], _ ...*tensor.TensorNumeric[T]) ([]*tensor.TensorNumeric[T], error)
Backward computes the gradients for the LeakyReLU activation.
func (*LeakyReLU[T]) Forward ¶
func (l *LeakyReLU[T]) Forward(ctx context.Context, inputs ...*tensor.TensorNumeric[T]) (*tensor.TensorNumeric[T], error)
Forward computes the LeakyReLU activation for the given input.
func (*LeakyReLU[T]) OpType ¶ added in v0.3.0
func (l *LeakyReLU[T]) OpType() string
OpType returns the operation type of the LeakyReLU layer.
func (*LeakyReLU[T]) OutputShape ¶
func (l *LeakyReLU[T]) OutputShape() []int
OutputShape returns the output shape of the LeakyReLU layer.
type LeakyReLUOption ¶ added in v0.3.0
type LeakyReLUOption[T tensor.Numeric] func(*LeakyReLUOptions[T])
LeakyReLUOption is a function that applies an option to LeakyReLUOptions.
type LeakyReLUOptions ¶ added in v0.3.0
LeakyReLUOptions holds configuration options for LeakyReLU.
type ReLU ¶
type ReLU[T tensor.Numeric] struct {
	*BaseActivation[T]
}
ReLU implements the Rectified Linear Unit activation function.
type Sigmoid ¶
type Sigmoid[T tensor.Numeric] struct {
	*BaseActivation[T]
}
Sigmoid implements the sigmoid activation function.
type SwiGLU ¶
SwiGLU implements the SwiGLU activation function.
func NewSwiGLU ¶
func NewSwiGLU[T tensor.Numeric](engine compute.Engine[T], ops numeric.Arithmetic[T], opts ...SwiGLUOption[T]) *SwiGLU[T]
NewSwiGLU creates a new SwiGLU activation layer.
func (*SwiGLU[T]) Attributes ¶ added in v0.3.0
func (s *SwiGLU[T]) Attributes() map[string]interface{}
Attributes returns the attributes of the SwiGLU layer.
func (*SwiGLU[T]) Backward ¶
func (s *SwiGLU[T]) Backward(ctx context.Context, mode types.BackwardMode, dOut *tensor.TensorNumeric[T], inputs ...*tensor.TensorNumeric[T]) ([]*tensor.TensorNumeric[T], error)
Backward computes the gradients for SwiGLU.
func (*SwiGLU[T]) Forward ¶
func (s *SwiGLU[T]) Forward(ctx context.Context, inputs ...*tensor.TensorNumeric[T]) (*tensor.TensorNumeric[T], error)
Forward computes the SwiGLU activation. The input is a tensor whose last dimension is 2 * feature_dim.
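A scalar sketch of the common SwiGLU formulation, in which each row is split in half into (a, b) and combined as swish(a)·b with swish(a) = a·σ(a); the layer's exact variant (e.g. a learned or fixed beta) may differ:

import "math"

// swiGLURow halves the last dimension: input length 2*d, output length d.
func swiGLURow(row []float32) []float32 {
	d := len(row) / 2
	out := make([]float32, d)
	for i := 0; i < d; i++ {
		a, b := row[i], row[i+d]
		sigma := float32(1 / (1 + math.Exp(-float64(a))))
		out[i] = a * sigma * b // swish(a) · b
	}
	return out
}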
func (*SwiGLU[T]) OpType ¶
func (s *SwiGLU[T]) OpType() string
OpType returns the operation type of the SwiGLU layer.
func (*SwiGLU[T]) OutputShape ¶
func (s *SwiGLU[T]) OutputShape() []int
OutputShape returns the output shape of SwiGLU.
func (*SwiGLU[T]) Parameters ¶
func (s *SwiGLU[T]) Parameters() []*graph.Parameter[T]
Parameters returns an empty slice as SwiGLU has no trainable parameters.
type SwiGLUOption ¶ added in v0.3.0
type SwiGLUOption[T tensor.Numeric] func(*SwiGLUOptions[T])
SwiGLUOption is a function that applies an option to SwiGLUOptions.
type SwiGLUOptions ¶ added in v0.3.0
SwiGLUOptions holds configuration options for SwiGLU.
type Tanh ¶
type Tanh[T tensor.Numeric] struct {
	*BaseActivation[T]
}
Tanh implements the hyperbolic tangent activation function.