Documentation
Index
Constants
This section is empty.
Variables
This section is empty.
Functions
Types
type ModelForTokenClassification
type ModelForTokenClassification struct {
	*bert.ModelForTokenClassification
}
func (*ModelForTokenClassification) Classify
func (m *ModelForTokenClassification) Classify(tokens []string) []mat.Tensor
Classify returns the logits for each token.
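Since the input is a slice of already-split tokens and the output is one logits tensor per token, a minimal sketch can illustrate the alignment. The import path and the classifyTokens helper below are assumptions made for illustration, not part of the documented API:

// Illustrative example file; the import path is an assumption and may need
// to be adjusted to the actual module layout.
package example

import (
	"fmt"

	bertclassifier "github.com/nlpodyssey/cybertron/pkg/tasks/tokenclassification/bert"
)

// classifyTokens is a hypothetical helper around the low-level Classify
// method: it feeds already-split tokens to the model and shows how the
// returned logits line up with them.
func classifyTokens(m *bertclassifier.ModelForTokenClassification, tokens []string) {
	logits := m.Classify(tokens)
	// One logits tensor is returned per input token, aligned by index;
	// an argmax over each tensor's values picks the predicted label.
	for i := range logits {
		fmt.Printf("token %q -> logits tensor %d of %d\n", tokens[i], i+1, len(logits))
	}
}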
func (*ModelForTokenClassification) EncodeAndReduce
func (m *ModelForTokenClassification) EncodeAndReduce(tokens []string) []mat.Tensor
type TokenClassification
type TokenClassification struct {
	// Model is the model used for token classification.
	Model *ModelForTokenClassification
	// Tokenizer is the tokenizer used to tokenize the input text.
	Tokenizer *wordpiecetokenizer.WordPieceTokenizer
	// Labels is the list of labels used for classification.
	Labels []string
	// contains filtered or unexported fields
}
TokenClassification is a token classification model.
func LoadTokenClassification
func LoadTokenClassification(modelPath string) (*TokenClassification, error)
LoadTokenClassification loads the model, the embeddings, and the tokenizer from the given directory and returns a TokenClassification ready for use.
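A minimal loading sketch. The import path is an assumption (it may differ in the actual module layout) and the model directory is a placeholder for an already-converted model:

package main

import (
	"log"

	// Assumed import path for the package documented on this page.
	bertclassifier "github.com/nlpodyssey/cybertron/pkg/tasks/tokenclassification/bert"
)

func main() {
	// "/path/to/model-dir" is a placeholder: the directory must contain the
	// converted model, the embeddings and the tokenizer files.
	task, err := bertclassifier.LoadTokenClassification("/path/to/model-dir")
	if err != nil {
		log.Fatalf("loading token classification model: %v", err)
	}
	log.Printf("model loaded with %d labels", len(task.Labels))
}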
func (*TokenClassification) Classify
func (m *TokenClassification) Classify(_ context.Context, text string, parameters tokenclassification.Parameters) (tokenclassification.Response, error)
Classify returns the classification of the given text.
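A hedged end-to-end sketch built around this method. The classifyText helper and the import paths are assumptions; the zero-value tokenclassification.Parameters simply keeps the task's defaults, and the response is printed verbatim to avoid assuming its exact field names:

// Illustrative helper file; the import paths are assumptions.
package example

import (
	"context"
	"fmt"

	"github.com/nlpodyssey/cybertron/pkg/tasks/tokenclassification"
	bertclassifier "github.com/nlpodyssey/cybertron/pkg/tasks/tokenclassification/bert"
)

// classifyText runs the task on raw text using a previously loaded
// TokenClassification (see LoadTokenClassification above).
func classifyText(ctx context.Context, task *bertclassifier.TokenClassification, text string) error {
	resp, err := task.Classify(ctx, text, tokenclassification.Parameters{})
	if err != nil {
		return err
	}
	// The Response carries the classified tokens with their labels.
	fmt.Printf("%+v\n", resp)
	return nil
}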