neural

package module
v0.1.1
Published: May 9, 2021 License: GPL-3.0 Imports: 3 Imported by: 0

README

gopher-learn


Quickstart

What is gopher-learn?

  • Artificial neural network written in Golang with training / testing framework
  • Rich measurement mechanisms to control the training
  • Examples for fast understanding
  • Can also be used for iterative online learning (using the online module), e.g. for autonomous agents

Install

  go get github.com/breskos/gopher-learn
  go get github.com/breskos/gopher-learn/persist
  go get github.com/breskos/gopher-learn/learn
  go get github.com/breskos/gopher-learn/engine
  go get github.com/breskos/gopher-learn/evaluation
  go get github.com/breskos/gopher-learn/online

The gopher-learn engine

The engine helps you optimize the learning process. It starts with a high learning rate to make fast progress in the beginning; after some rounds of learning (epochs) the learning rate declines (decay). During the process the best network found so far is saved. After the engine has finished training, you receive this ready-to-go network.
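
As a rough mental model of the decay, the sketch below shows how a learning rate shrinks over epochs when a small decay factor is applied each round. This is only an illustration under the assumption of a multiplicative per-epoch decay; the engine's exact schedule may differ.

  // Illustrative only: assumes a multiplicative per-epoch decay;
  // the engine's actual schedule may differ.
  learningRate := 0.6
  decay := 0.005
  for epoch := 0; epoch < 100; epoch++ {
      learningRate *= 1.0 - decay
  }
  // after 100 epochs: learningRate ≈ 0.6 * 0.995^100 ≈ 0.36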

Modes

gopher-learn can be used to perform classification and regression. This section helps you set up both modes. In general, you have to be aware of the differences between the two modes in these steps: reading training data from file, starting the engine, using the evaluation modes, and performing in production.

Classification

Read training data from file

  data := learn.NewSet(neural.Classification)
  ok, err := data.LoadFromCSV(dataFile)

Start engine

  e := engine.NewEngine(neural.Classification, []int{hiddenNeurons}, data)
  e.SetVerbose(true)
  e.Start(neural.CriterionDistance, tries, epochs, trainingSplit, learningRate, decay)

Use evaluation mode

  evaluation.PrintSummary("name of class1")
  evaluation.PrintSummary("name of class2")
  evaluation.PrintConfusionMatrix()

Perform in production

  x := net.CalculateWinnerLabel(vector)

Regression

Important note: use regression only with target values between 0 and 1.
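
If your raw target values are not in that range, a simple min-max scaling before building the data set gets them there. This is a generic preprocessing sketch and not part of gopher-learn:

  // scaleTargets maps raw target values linearly into [0, 1].
  // Generic preprocessing helper, not part of gopher-learn; assumes max > min.
  func scaleTargets(targets []float64) []float64 {
      min, max := targets[0], targets[0]
      for _, t := range targets {
          if t < min {
              min = t
          }
          if t > max {
              max = t
          }
      }
      scaled := make([]float64, len(targets))
      for i, t := range targets {
          scaled[i] = (t - min) / (max - min)
      }
      return scaled
  }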

Read training data from file

  data := learn.NewSet(neural.Regression)
  ok, err := data.LoadFromCSV(dataFile)

Start engine

  e := engine.NewEngine(neural.Regression, []int{hiddenNeurons}, data)
  e.SetVerbose(true)
  e.Start(neural.CriterionDistance, tries, epochs, trainingSplit, learningRate, decay)

Use evaluation mode

  evaluation.GetRegressionSummary()

Perform in production

  x := net.Calculate(vector)

Criteria

To let the engine decide on the best model, a few criteria are implemented. They are listed below together with a short note on when to use them; a small sketch contrasting accuracy and balanced accuracy follows the example below:

  • CriterionAccuracy - uses simple accuracy to decide on the best model. Not suitable for unbalanced data sets.
  • CriterionBalancedAccuracy - uses balanced accuracy. Suitable for unbalanced data sets.
  • CriterionFMeasure - uses the F1 score. Suitable for unbalanced data sets.
  • CriterionSimple - uses the number of correctly classified samples divided by all classified samples. Suitable for regression with thresholding.
  • CriterionDistance - uses the distance between the ideal output and the current output. Suitable for regression.
  ...
  e := engine.NewEngine(neural.Classification, []int{100}, data)
  e.Start(neural.CriterionDistance, tries, epochs, trainingSplit, learningRate, decay)
  ...
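
To see why plain accuracy can be misleading on unbalanced data, here is a small numeric comparison of accuracy and balanced accuracy (illustrative arithmetic only, not gopher-learn's evaluation code):

  // 95 samples of class A (90 classified correctly), 5 samples of class B (1 correct).
  correctA, totalA := 90.0, 95.0
  correctB, totalB := 1.0, 5.0
  accuracy := (correctA + correctB) / (totalA + totalB) // 0.91 - looks good
  balanced := (correctA/totalA + correctB/totalB) / 2.0 // ~0.57 - exposes the weak class B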

Some more basics

Train a network using the engine
import (
	"fmt"

	"github.com/breskos/gopher-learn"
	"github.com/breskos/gopher-learn/engine"
	"github.com/breskos/gopher-learn/learn"
	"github.com/breskos/gopher-learn/persist"
)

const (
	dataFile      = "data.csv"
	networkFile   = "network.json"
	tries         = 1
	epochs        = 100
	trainingSplit = 0.7
	learningRate  = 0.6
	decay         = 0.005
	hiddenNeurons = 20
)

func main() {
	data := learn.NewSet(neural.Classification)
	ok, err := data.LoadFromCSV(dataFile)
	if !ok || err != nil {
		fmt.Printf("something went wrong -> %v\n", err)
		return
	}
	e := engine.NewEngine(neural.Classification, []int{hiddenNeurons}, data)
	e.SetVerbose(true)
	e.Start(neural.CriterionDistance, tries, epochs, trainingSplit, learningRate, decay)
	network, evaluation := e.GetWinner()

	evaluation.PrintSummary("name of class1")
	evaluation.PrintSummary("name of class2")

	err = persist.ToFile(networkFile, network)
	if err != nil {
		fmt.Printf("error while saving network: %v\n", err)
	}
	network2, err := persist.FromFile(networkFile)
	if err != nil {
		fmt.Printf("error while loading network: %v\n", err)
		return
	}
	// check the network with the first sample
	w := network2.CalculateWinnerLabel(data.Samples[0].Vector)
	fmt.Printf("%v -> %v\n", data.Samples[0].Label, w)

	fmt.Println(" * Confusion Matrix *")
	evaluation.PrintConfusionMatrix()
}
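
If you need the scores for all classes instead of only the winner, the network also offers CalculateLabels (see the package documentation below); a minimal continuation of the example above could look like this:

  // Scores for all labels of the first sample (continuation of the example above).
  scores := network2.CalculateLabels(data.Samples[0].Vector)
  for label, score := range scores {
      fmt.Printf("%s: %.3f\n", label, score)
  }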

Create simple network for classification

  import "github.com/breskos/gopher-learn"
  // Network has 9 enters and 3 layers
  // ( 9 neurons, 9 neurons and 2 neurons).
  // Last layer is network output (2 neurons).
  // For these last neurons we need labels (like: spam, nospam, positive, negative)
  labels := make(map[int]string)
  labels[0] = "positive"
  labels[1] = "negative"
  n := neural.NewNetwork(9, []int{9,9,2}, map[int])
  // Randomize sypaseses weights
  n.RandomizeSynapses()

  // now you can calculate on this network (of course it is not trained yet)
  // (for the training you can use then engine)
  result := n.Calculate([]float64{0,1,0,1,1,1,0,1,0})
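
You can also swap the activation function for the whole network before calculating, using the documented SetActivationFunction and NewLogisticFunc. Note that the meaning of the parameter passed to NewLogisticFunc (assumed here to control the steepness of the curve) is an assumption:

  // Optionally set a custom activation function for the whole network.
  // The parameter of NewLogisticFunc is assumed to control the steepness of the logistic curve.
  n.SetActivationFunction(neural.NewLogisticFunc(1.0))
  result = n.Calculate([]float64{0, 1, 0, 1, 1, 1, 0, 1, 0})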

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func LogisticFunc

func LogisticFunc(x, a float64) float64

LogisticFunc returns the value of the logistic function for x and a

Types

type ActivationFunction

type ActivationFunction func(float64) float64

ActivationFunction function definition of activation functions
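
Because ActivationFunction is simply func(float64) float64, any function with that shape can be plugged into a network via the documented SetActivationFunction. A small sketch (the variable net stands for any *Network built elsewhere; requires the "math" import):

  // Any func(float64) float64 can act as an ActivationFunction.
  squash := neural.ActivationFunction(func(x float64) float64 {
      return (math.Tanh(x) + 1) / 2 // squashes values into (0, 1)
  })
  net.SetActivationFunction(squash)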

func NewLogisticFunc

func NewLogisticFunc(a float64) ActivationFunction

NewLogisticFunc returns a logistic ActivationFunction with parameter a

type Criterion

type Criterion int
const (
	// Accuracy decides evaluation by accuracy
	Accuracy Criterion = iota
	// BalancedAccuracy decides evaluation by balanced accuracy
	BalancedAccuracy
	// FMeasure decides evaluation by f-measure
	FMeasure
	// Simple decides on simple wrong/correct ratio
	Simple
	// Distance decides evaluation by distance to ideal output
	Distance
)

type Enter

type Enter struct {
	OutSynapses []*Synapse
	Input       float64 `json:"-"`
}

func NewEnter

func NewEnter() *Enter

func (*Enter) ConnectTo

func (e *Enter) ConnectTo(layer *Layer)

func (*Enter) SetInput

func (e *Enter) SetInput(val float64)

func (*Enter) Signal

func (e *Enter) Signal()

func (*Enter) SynapseTo

func (e *Enter) SynapseTo(nTo *Neuron, weight float64)

type Layer

type Layer struct {
	Neurons []*Neuron
}

func NewLayer

func NewLayer(neurons int) *Layer

func (*Layer) Calculate

func (l *Layer) Calculate()

func (*Layer) ConnectTo

func (l *Layer) ConnectTo(layer *Layer)

type Network

type Network struct {
	Enters    []*Enter
	Layers    []*Layer
	Out       []float64 `json:"-"`
	OutLabels map[int]string
}

Network contains all the necessary information to use the neural network

func BuildNetwork

func BuildNetwork(usage NetworkType, input int, hidden []int, labels map[int]string) *Network

BuildNetwork builds a neural network from the given parameters
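
A sketch of a BuildNetwork call based on the signature above; how the output layer is derived from the labels is an assumption, not something the documentation states:

  // Parameters inferred from the signature; output-layer sizing from labels is an assumption.
  labels := map[int]string{0: "spam", 1: "nospam"}
  net := neural.BuildNetwork(neural.Classification, 9, []int{9, 9}, labels)
  out := net.Calculate([]float64{0, 1, 0, 1, 1, 1, 0, 1, 0})
  _ = out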

func NewNetwork

func NewNetwork(in int, layers []int, labels map[int]string) *Network

NewNetwork creates a new neural network

func (*Network) Calculate

func (n *Network) Calculate(enters []float64) []float64

Calculate calculates the result of an input vector

func (*Network) CalculateLabels

func (n *Network) CalculateLabels(enters []float64) map[string]float64

CalculateLabels calculates the output and returns it together with the labels of the output neurons

func (*Network) CalculateWinnerLabel

func (n *Network) CalculateWinnerLabel(enters []float64) string

CalculateWinnerLabel calculates the output and returns only the label of the winning neuron

func (*Network) ConnectEnters

func (n *Network) ConnectEnters()

ConnectEnters connects the input neurons with the first hidden layer

func (*Network) ConnectLayers

func (n *Network) ConnectLayers()

ConnectLayers connects all layers with corresponding neurons

func (*Network) RandomizeSynapses

func (n *Network) RandomizeSynapses()

RandomizeSynapses applies a random value to all synapses

func (*Network) SetActivationFunction

func (n *Network) SetActivationFunction(aFunc ActivationFunction)

SetActivationFunction sets the activation function for the network

type NetworkType

type NetworkType int
const (
	// Classification describes the mode of operation: classification
	Classification NetworkType = iota
	// Regression describes the mode of operation: regression
	Regression
)

type Neuron

type Neuron struct {
	OutSynapses        []*Synapse
	InSynapses         []*Synapse         `json:"-"`
	ActivationFunction ActivationFunction `json:"-"`
	Out                float64            `json:"-"`
}

Neuron holds the data a neuron needs

func NewNeuron

func NewNeuron() *Neuron

NewNeuron creates a new neuron

func (*Neuron) Calculate

func (n *Neuron) Calculate()

Calculate calculates the actual neuron activity

func (*Neuron) SetActivationFunction

func (n *Neuron) SetActivationFunction(aFunc ActivationFunction)

SetActivationFunction sets the activation function for the neuron

func (*Neuron) SynapseTo

func (n *Neuron) SynapseTo(nTo *Neuron, weight float64)

SynapseTo creates a new synapse to a neuron

type Synapse

type Synapse struct {
	Weight float64
	In     float64 `json:"-"`
	Out    float64 `json:"-"`
}

Synapse holds the synapse structure

func NewSynapse

func NewSynapse(weight float64) *Synapse

NewSynapse creates a new synapse

func NewSynapseFromTo

func NewSynapseFromTo(from, to *Neuron, weight float64) *Synapse

NewSynapseFromTo creates a new synapse from neuron to neuron

func (*Synapse) Signal

func (s *Synapse) Signal(value float64)

Signal activates the Synapse with an input value

Directories

Path Synopsis
examples
online-sonar command
sonar command
wine-quality command
