roberta

package
v0.1.14 Latest
Warning

This package is not in the latest version of its module.

Published: Nov 9, 2020 License: Apache-2.0 Imports: 12 Imported by: 0

Documentation

Overview

Package roberta implements the RoBERTa transformer model.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type RobertaClassificationHead

type RobertaClassificationHead struct {
	// contains filtered or unexported fields
}

RobertaClassificationHead holds data for the Roberta classification head.

func NewRobertaClassificationHead

func NewRobertaClassificationHead(p nn.Path, config *bert.BertConfig) *RobertaClassificationHead

NewRobertaClassificationHead creates a new RobertaClassificationHead.

func (*RobertaClassificationHead) ForwardT

func (ch *RobertaClassificationHead) ForwardT(hiddenStates ts.Tensor, train bool) ts.Tensor

ForwardT performs a forward pass through the model.

type RobertaEmbeddings

type RobertaEmbeddings struct {
	// contains filtered or unexported fields
}

RobertaEmbeddings holds the embedding struct for the Roberta model. It also implements the `BertEmbedding` interface for Roberta models.

func NewRobertaEmbeddings

func NewRobertaEmbeddings(p nn.Path, config *bert.BertConfig) *RobertaEmbeddings

NewRobertaEmbeddings creates a new RobertaEmbeddings.

Params:

  • `p` - Variable store path for the root of the BertEmbeddings model
  • `config` - `BertConfig` object defining the model architecture and vocab/hidden size.

func (*RobertaEmbeddings) ForwardT

func (re *RobertaEmbeddings) ForwardT(inputIds, tokenTypeIds, positionIds, inputEmbeds ts.Tensor, train bool) (ts.Tensor, error)

ForwardT performs a forward pass through the embedding layer. This differs from the original BERT embeddings in how the position ids are calculated when not provided.

Params:

  • `inputIds`: Optional input tensor of shape (batch size, sequence length). If None, pre-computed embeddings must be provided (see `inputEmbeds`)
  • `tokenTypeIds`: Optional segment id of shape (batch size, sequence length). Convention is value of 0 for the first sentence (incl. [SEP]) and 1 for the second sentence. If None set to 0.
  • `positionIds`: Optional position ids of shape (batch size, sequence length). If None, will be incremented from 0.
  • `inputEmbeds`: Optional pre-computed input embeddings of shape (batch size, sequence length, hidden size). If None, input ids must be provided (see `inputIds`)
  • `train`: boolean flag to turn on/off the dropout layers in the model. Should be set to false for inference.

Return:

  • `embeddedOutput`: tensor of shape (batch size, sequence length, hidden size)

type RobertaForMaskedLM

type RobertaForMaskedLM struct {
	// contains filtered or unexported fields
}

RobertaForMaskedLM holds data for Roberta masked language model.

Base RoBERTa model with a RoBERTa masked language model head to predict missing tokens.

func NewRobertaForMaskedLM

func NewRobertaForMaskedLM(p nn.Path, config *bert.BertConfig) *RobertaForMaskedLM

NewRobertaForMaskedLM builds a new RobertaForMaskedLM.

func (*RobertaForMaskedLM) Forward

func (mlm *RobertaForMaskedLM) Forward(inputIds, mask, tokenTypeIds, positionIds, inputEmbeds, encoderHiddenStates, encoderMask ts.Tensor, train bool) (output ts.Tensor, hiddenStates, attentions []ts.Tensor, err error)

Forward performs a forward pass through the model.

Params:

  • `inputIds`: Optional input tensor of shape (batch size, sequence length). If None, pre-computed embeddings must be provided (see inputEmbeds).
  • `mask`: Optional mask of shape (batch size, sequence length). Masked positions have value 0, non-masked positions value 1. If None, set to 1.
  • `tokenTypeIds`: Optional segment id of shape (batch size, sequence length). Convention is value of 0 for the first sentence (incl. </s>) and 1 for the second sentence. If None set to 0.
  • `positionIds`: Optional position ids of shape (batch size, sequence length). If None, will be incremented from 0.
  • `inputEmbeds`: Optional pre-computed input embeddings of shape (batch size, sequence length, hidden size). If None, input ids must be provided (see inputIds).
  • `encoderHiddenStates`: Optional encoder hidden state of shape (batch size, encoder sequence length, hidden size). If the model is defined as a decoder and the encoder hidden states is not None, used in the cross-attention layer as keys and values (query from the decoder).
  • `encoderMask`: Optional encoder attention mask of shape (batch size, encoder sequence length). If the model is defined as a decoder and the *encoder_hidden_states* is not None, used to mask encoder values. Positions with value 0 will be masked.
  • `train`: boolean flag to turn on/off the dropout layers in the model. Should be set to false for inference.

Returns:

  • `output`: tensor of shape (batch size, sequence length, vocab size)
  • `hiddenStates`: optional slice of tensors of length numHiddenLayers with shape (batch size, sequence length, hidden size).
  • `attentions`: optional slice of tensors of length num hidden layers with shape (batch size, sequence length, hidden size).
  • `err`: error
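
Given the output shape above, reading off the predicted token for a masked position reduces to an argmax over the vocabulary dimension. A minimal pure-Go sketch (`topToken` is a hypothetical helper, not part of this package; real code would extract the scores from the returned `ts.Tensor`):

```go
package main

import "fmt"

// topToken picks the vocabulary index with the highest score for a single
// sequence position. scores is assumed to be one vocab-sized row of the
// model output for one batch element at the masked position.
func topToken(scores []float64) int {
	best := 0
	for i, s := range scores {
		if s > scores[best] {
			best = i
		}
	}
	return best
}

func main() {
	fmt.Println(topToken([]float64{0.1, 2.5, 0.3, 1.9}))
}
```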

func (*RobertaForMaskedLM) Load

func (mlm *RobertaForMaskedLM) Load(modelNameOrPath string, config interface{ pretrained.Config }, params map[string]interface{}, vs nn.VarStore) error

Load loads the model from a file or model name. It also updates default configuration parameters if provided. This method implements the `PretrainedModel` interface.

type RobertaForMultipleChoice

type RobertaForMultipleChoice struct {
	// contains filtered or unexported fields
}

RobertaForMultipleChoice holds data for Roberta multiple choice model.

Input should be in form of `<s> Context </s> Possible choice </s>`. The choice is made along the batch axis, assuming all elements of the batch are alternatives to be chosen from for a given context.
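
The input layout above can be sketched as plain string formatting. `multipleChoiceInput` is a hypothetical helper for illustration only, not part of this package; a real pipeline would tokenize these strings before batching them:

```go
package main

import "fmt"

// multipleChoiceInput builds one `<s> Context </s> Possible choice </s>`
// string per candidate choice; the batch axis then ranges over the choices
// for a single context, as described above.
func multipleChoiceInput(context string, choices []string) []string {
	inputs := make([]string, len(choices))
	for i, c := range choices {
		inputs[i] = fmt.Sprintf("<s> %s </s> %s </s>", context, c)
	}
	return inputs
}

func main() {
	for _, s := range multipleChoiceInput("The sky is", []string{"blue", "green"}) {
		fmt.Println(s)
	}
}
```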

func NewRobertaForMultipleChoice

func NewRobertaForMultipleChoice(p nn.Path, config *bert.BertConfig) *RobertaForMultipleChoice

NewRobertaForMultipleChoice creates a new RobertaForMultipleChoice model.

func (*RobertaForMultipleChoice) ForwardT

func (mc *RobertaForMultipleChoice) ForwardT(inputIds, mask, tokenTypeIds, positionIds ts.Tensor, train bool) (output ts.Tensor, hiddenStates, attentions []ts.Tensor, err error)

ForwardT performs a forward pass through the model.

func (*RobertaForMultipleChoice) Load

func (mc *RobertaForMultipleChoice) Load(modelNameOrPath string, config interface{ pretrained.Config }, params map[string]interface{}, vs nn.VarStore) error

Load loads the model from a file or model name. It also updates default configuration parameters if provided.

This method implements `PretrainedModel` interface.

type RobertaForQuestionAnswering

type RobertaForQuestionAnswering struct {
	// contains filtered or unexported fields
}

RobertaForQuestionAnswering constructs layers for Roberta question answering model.

func NewRobertaForQuestionAnswering

func NewRobertaForQuestionAnswering(p nn.Path, config *bert.BertConfig) *RobertaForQuestionAnswering

NewRobertaForQuestionAnswering creates a new RobertaForQuestionAnswering model.

func (*RobertaForQuestionAnswering) ForwardT

func (qa *RobertaForQuestionAnswering) ForwardT(inputIds, mask, tokenTypeIds, positionIds, inputEmbeds ts.Tensor, train bool) (startScores, endScores ts.Tensor, hiddenStates, attentions []ts.Tensor, err error)

ForwardT performs a forward pass through the model.

func (*RobertaForQuestionAnswering) Load

func (qa *RobertaForQuestionAnswering) Load(modelNameOrPath string, config interface{ pretrained.Config }, params map[string]interface{}, vs nn.VarStore) error

Load loads the model from a file or model name. It also updates default configuration parameters if provided.

This method implements `PretrainedModel` interface.

type RobertaForSequenceClassification

type RobertaForSequenceClassification struct {
	// contains filtered or unexported fields
}

RobertaForSequenceClassification holds data for Roberta sequence classification model. It's used for performing sentence or document-level classification.

func NewRobertaForSequenceClassification

func NewRobertaForSequenceClassification(p nn.Path, config *bert.BertConfig) *RobertaForSequenceClassification

NewRobertaForSequenceClassification creates a new RobertaForSequenceClassification model.

func (*RobertaForSequenceClassification) ForwardT

func (sc *RobertaForSequenceClassification) ForwardT(inputIds, mask, tokenTypeIds, positionIds, inputEmbeds ts.Tensor, train bool) (labels ts.Tensor, hiddenStates, attentions []ts.Tensor, err error)

ForwardT performs a forward pass through the model.

func (*RobertaForSequenceClassification) Load

func (sc *RobertaForSequenceClassification) Load(modelNameOrPath string, config interface{ pretrained.Config }, params map[string]interface{}, vs nn.VarStore) error

Load loads the model from a file or model name. It also updates default configuration parameters if provided.

This method implements `PretrainedModel` interface.

type RobertaForTokenClassification

type RobertaForTokenClassification struct {
	// contains filtered or unexported fields
}

RobertaForTokenClassification holds data for Roberta token classification model.

func NewRobertaForTokenClassification

func NewRobertaForTokenClassification(p nn.Path, config *bert.BertConfig) *RobertaForTokenClassification

NewRobertaForTokenClassification creates a new RobertaForTokenClassification model.

func (*RobertaForTokenClassification) ForwardT

func (tc *RobertaForTokenClassification) ForwardT(inputIds, mask, tokenTypeIds, positionIds, inputEmbeds ts.Tensor, train bool) (output ts.Tensor, hiddenStates, attentions []ts.Tensor, err error)

ForwardT performs a forward pass through the model.

func (*RobertaForTokenClassification) Load

func (tc *RobertaForTokenClassification) Load(modelNameOrPath string, config interface{ pretrained.Config }, params map[string]interface{}, vs nn.VarStore) error

Load loads the model from a file or model name. It also updates default configuration parameters if provided.

This method implements `PretrainedModel` interface.

type RobertaLMHead

type RobertaLMHead struct {
	// contains filtered or unexported fields
}

RobertaLMHead holds data of Roberta LM head.

func NewRobertaLMHead

func NewRobertaLMHead(p nn.Path, config *bert.BertConfig) *RobertaLMHead

NewRobertaLMHead creates a new RobertaLMHead.

func (*RobertaLMHead) Forward

func (rh *RobertaLMHead) Forward(hiddenStates ts.Tensor) ts.Tensor

Forward performs a forward pass through the RobertaLMHead model.

type Tokenizer

type Tokenizer struct {
	*tokenizer.Tokenizer
}

Tokenizer holds data for Roberta tokenizer.

func NewTokenizer

func NewTokenizer() *Tokenizer

NewTokenizer creates a new Roberta tokenizer.

func (*Tokenizer) Load

func (t *Tokenizer) Load(vocabNameOrPath, mergesNameOrPath string, params map[string]interface{}) error

Load loads the Roberta tokenizer from pretrained vocab and merges files.
