bgate

package v1.1.0

Published: Jul 12, 2020 License: BSD-3-Clause Imports: 15 Imported by: 0

README

BGate: Basal Ganglia Action & Thought Engagement

Documentation

Index

Constants

This section is empty.

Variables

var (
	// NeuronVars are extra neuron variables for bgate
	NeuronVars = []string{"DA", "DALrn", "ACh", "Ca", "KCa"}

	// NeuronVarsAll is the bgate collection of all neuron-level vars
	NeuronVarsAll []string

	// SynVarsAll is the bgate collection of all synapse-level vars (includes TraceSynVars)
	SynVarsAll []string
)
var (
	STNNeuronVars    = []string{"Ca", "KCa"}
	STNNeuronVarsMap map[string]int
)
var KiT_DaReceptors = kit.Enums.AddEnum(DaReceptorsN, kit.NotBitFlag, nil)
var KiT_GPLayer = kit.Types.AddType(&GPLayer{}, leabra.LayerProps)
var KiT_GPeInPrjn = kit.Types.AddType(&GPeInPrjn{}, leabra.PrjnProps)
var KiT_GPiLayer = kit.Types.AddType(&GPiLayer{}, leabra.LayerProps)
var KiT_GPiPrjn = kit.Types.AddType(&GPiPrjn{}, leabra.PrjnProps)
var KiT_Layer = kit.Types.AddType(&Layer{}, leabra.LayerProps)
var KiT_MatrixLayer = kit.Types.AddType(&MatrixLayer{}, leabra.LayerProps)
var KiT_Network = kit.Types.AddType(&Network{}, NetworkProps)
var KiT_STNLayer = kit.Types.AddType(&STNLayer{}, leabra.LayerProps)
var KiT_TANLayer = kit.Types.AddType(&TANLayer{}, leabra.LayerProps)
var KiT_VThalLayer = kit.Types.AddType(&VThalLayer{}, leabra.LayerProps)
var NetworkProps = deep.NetworkProps
var TraceSynVars = []string{"NTr", "Tr"}

Functions

func STNNeuronVarByName

func STNNeuronVarByName(varNm string) (int, error)

STNNeuronVarByName returns the index of the variable in the STNNeuron, or an error if the name is not found.
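The name-to-index lookup can be sketched as a map built once from the variable list, as STNNeuronVarsMap presumably is. A minimal standalone sketch, not the package's actual code:

```go
package main

import "fmt"

// stnNeuronVars mirrors the package's STNNeuronVars list.
var stnNeuronVars = []string{"Ca", "KCa"}

// stnNeuronVarsMap maps variable name -> index, built once at init.
var stnNeuronVarsMap = func() map[string]int {
	m := make(map[string]int, len(stnNeuronVars))
	for i, v := range stnNeuronVars {
		m[v] = i
	}
	return m
}()

// stnNeuronVarByName returns the index of the named variable, or an
// error -- the same contract as bgate.STNNeuronVarByName.
func stnNeuronVarByName(varNm string) (int, error) {
	i, ok := stnNeuronVarsMap[varNm]
	if !ok {
		return -1, fmt.Errorf("variable named: %s not found", varNm)
	}
	return i, nil
}

func main() {
	i, err := stnNeuronVarByName("KCa")
	fmt.Println(i, err) // 1 <nil>
}
```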

Types

type CaParams

type CaParams struct {
	BurstThr float32 `` /* 244-byte string literal not displayed */
	ActThr   float32 `def:"0.7" desc:"activation threshold for increment in activation above baseline that drives lower influx of Ca"`
	BurstCa  float32 `desc:"Ca level for burst level activation"`
	ActCa    float32 `` /* 187-byte string literal not displayed */
	GbarKCa  float32 `def:"20" desc:"maximal KCa conductance (actual conductance is applied to KNa channels)"`
	KCaTau   float32 `def:"40" desc:"KCa conductance time constant -- 40 from Gillies & Willshaw, 2006"`
	CaTau    float32 `def:"185.7" desc:"Ca time constant of decay to baseline -- 185.7 from Gillies & Willshaw, 2006"`
}

CaParams control the calcium dynamics in STN neurons. Gillies & Willshaw, 2006 provide a biophysically detailed simulation; we use their logistic function for computing KCa conductance from Ca, but with a simpler approximation using burst and act thresholds. KCa channels are calcium-gated potassium channels that drive the long afterhyperpolarization of STN neurons. Ca is automatically reset at each AlphaCycle. The conductance is applied to the KNa channels to take advantage of the existing infrastructure.

func (*CaParams) Defaults

func (kc *CaParams) Defaults()

func (*CaParams) KCaGFmCa

func (kc *CaParams) KCaGFmCa(ca float32) float32

KCaGFmCa returns the driving conductance for KCa channels based on given Ca level. This equation comes from Gillies & Willshaw, 2006.
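A logistic Ca-to-conductance mapping of this kind can be sketched as follows; the midpoint and gain constants below are illustrative placeholders, not the package's actual parameters from Gillies & Willshaw, 2006:

```go
package main

import (
	"fmt"
	"math"
)

// kcaGFmCa sketches a logistic (sigmoidal) mapping from Ca level to
// the driving conductance for KCa channels, in the spirit of
// Gillies & Willshaw, 2006. mid and gain are HYPOTHETICAL values.
func kcaGFmCa(ca float32) float32 {
	const mid, gain = 0.5, 10.0 // assumed logistic parameters
	return float32(1.0 / (1.0 + math.Exp(-gain*(float64(ca)-mid))))
}

func main() {
	// Conductance rises sigmoidally with Ca level.
	for _, ca := range []float32{0.1, 0.5, 0.9} {
		fmt.Printf("ca=%.1f g=%.3f\n", ca, kcaGFmCa(ca))
	}
}
```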

type DaModParams

type DaModParams struct {
	On      bool    `desc:"whether to use dopamine modulation"`
	ModGain bool    `viewif:"On" desc:"modulate gain instead of Ge excitatory synaptic input"`
	Minus   float32 `` /* 145-byte string literal not displayed */
	Plus    float32 `` /* 144-byte string literal not displayed */
	NegGain float32 `` /* 208-byte string literal not displayed */
	PosGain float32 `` /* 208-byte string literal not displayed */
}

DaModParams are parameters for the effects of dopamine (Da) based modulation, typically adding a Da-based term to the Ge excitatory synaptic input. The plus-phase parameters produce learning effects, relative to the minus-phase "performance" dopamine effects.

func (*DaModParams) Defaults

func (dm *DaModParams) Defaults()

func (*DaModParams) Gain

func (dm *DaModParams) Gain(da, gain float32, plusPhase bool) float32

Gain returns da-modulated gain value

func (*DaModParams) GainModOn

func (dm *DaModParams) GainModOn() bool

GainModOn returns true if modulating Gain

func (*DaModParams) Ge

func (dm *DaModParams) Ge(da, ge float32, plusPhase bool) float32

Ge returns da-modulated ge value
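The phase-dependent Ge modulation can be sketched as below. The multiplicative form of the Da term (gain * da * ge) and the gain values are assumptions for illustration, not the package's exact equation:

```go
package main

import "fmt"

// daModParams mirrors the On/Minus/Plus fields of bgate.DaModParams.
type daModParams struct {
	On    bool
	Minus float32 // Da gain in the minus phase ("performance")
	Plus  float32 // Da gain in the plus phase (learning)
}

// Ge sketches da-modulated excitatory input: a Da-based term, scaled
// by the phase-specific gain, is added to Ge. ASSUMED form.
func (dm *daModParams) Ge(da, ge float32, plusPhase bool) float32 {
	if !dm.On {
		return ge
	}
	gain := dm.Minus
	if plusPhase {
		gain = dm.Plus
	}
	return ge + gain*da*ge // hypothetical: Da term proportional to Ge
}

func main() {
	dm := &daModParams{On: true, Minus: 0.1, Plus: 0.2}
	fmt.Println(dm.Ge(1, 0.5, false)) // minus-phase modulation
	fmt.Println(dm.Ge(1, 0.5, true))  // stronger plus-phase modulation
}
```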

func (*DaModParams) GeModOn

func (dm *DaModParams) GeModOn() bool

GeModOn returns true if modulating Ge

type DaReceptors

type DaReceptors int

DaReceptors for D1R and D2R dopamine receptors

const (
	// D1R primarily expresses Dopamine D1 Receptors -- dopamine is excitatory and bursts of dopamine lead to increases in synaptic weight, while dips lead to decreases -- direct pathway in dorsal striatum
	D1R DaReceptors = iota

	// D2R primarily expresses Dopamine D2 Receptors -- dopamine is inhibitory and bursts of dopamine lead to decreases in synaptic weight, while dips lead to increases -- indirect pathway in dorsal striatum
	D2R

	DaReceptorsN
)
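The String/FromString pair is normally generated through kit.Enums; a standalone sketch of the equivalent behavior (hand-written name table, not the generated code):

```go
package main

import "fmt"

// DaReceptors mirrors the package's enum of dopamine receptor types.
type DaReceptors int

const (
	D1R DaReceptors = iota // direct pathway: bursts increase weights
	D2R                    // indirect pathway: bursts decrease weights
	DaReceptorsN
)

// daReceptorNames is a hand-written stand-in for the generated table.
var daReceptorNames = []string{"D1R", "D2R"}

func (i DaReceptors) String() string {
	if i < 0 || int(i) >= len(daReceptorNames) {
		return fmt.Sprintf("DaReceptors(%d)", int(i))
	}
	return daReceptorNames[i]
}

func (i *DaReceptors) FromString(s string) error {
	for j, nm := range daReceptorNames {
		if nm == s {
			*i = DaReceptors(j)
			return nil
		}
	}
	return fmt.Errorf("unknown DaReceptors name: %s", s)
}

func main() {
	var r DaReceptors
	if err := r.FromString("D2R"); err == nil {
		fmt.Println(r) // D2R
	}
}
```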

func (*DaReceptors) FromString

func (i *DaReceptors) FromString(s string) error

func (DaReceptors) MarshalJSON

func (ev DaReceptors) MarshalJSON() ([]byte, error)

func (DaReceptors) String

func (i DaReceptors) String() string

func (*DaReceptors) UnmarshalJSON

func (ev *DaReceptors) UnmarshalJSON(b []byte) error

type GPLayer

type GPLayer struct {
	Layer
}

GPLayer represents a globus pallidus layer of the BG (GPeOut, GPeIn, GPeTA), and serves as the base for the GPiLayer output nucleus.

func (*GPLayer) Defaults

func (ly *GPLayer) Defaults()

type GPeInPrjn

type GPeInPrjn struct {
	leabra.Prjn
}

GPeInPrjn must be used with GPLayer. Learns from DA and gating status.

func (*GPeInPrjn) DWt

func (pj *GPeInPrjn) DWt()

DWt computes the weight change (learning) -- on sending projections.

func (*GPeInPrjn) Defaults

func (pj *GPeInPrjn) Defaults()

type GPiLayer

type GPiLayer struct {
	GPLayer
	GateThr float32   `def:"0.2" desc:"threshold on activation to count for gating"`
	MinAct  []float32 `desc:"per-pool minimum activation value during alpha cycle"`
}

GPiLayer represents the GPi / SNr output nucleus of the BG. It gets inhibited by the MtxGo and GPeIn layers, and its minimum activation during this inhibition is measured for gating. Typically just a single unit per Pool representing a given stripe.
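The min-activation gating measure can be sketched as below. Reading GateThr as a below-threshold test (gating counts when the pool's minimum GPi activation drops under it) is an assumption from the description; the package's exact test may differ:

```go
package main

import "fmt"

// gpiGate sketches the GPiLayer gating measure: per-pool minimum
// activation tracked across the alpha cycle, compared to GateThr.
type gpiGate struct {
	GateThr float32   // threshold on activation to count for gating
	MinAct  []float32 // per-pool minimum activation during alpha cycle
}

func newGPiGate(nPools int) *gpiGate {
	g := &gpiGate{GateThr: 0.2, MinAct: make([]float32, nPools)}
	g.AlphaCycInit()
	return g
}

// AlphaCycInit resets the per-pool minima at the start of each alpha cycle.
func (g *gpiGate) AlphaCycInit() {
	for i := range g.MinAct {
		g.MinAct[i] = 1 // start high; the minimum only decreases
	}
}

// ActFmG records the running minimum of a pool's activation.
func (g *gpiGate) ActFmG(pool int, act float32) {
	if act < g.MinAct[pool] {
		g.MinAct[pool] = act
	}
}

// Gated reports whether a pool's minimum activation fell below threshold
// (i.e., GPi was sufficiently inhibited by MtxGo / GPeIn).
func (g *gpiGate) Gated(pool int) bool {
	return g.MinAct[pool] < g.GateThr
}

func main() {
	g := newGPiGate(2)
	for _, act := range []float32{0.8, 0.4, 0.1} { // pool 0 gets inhibited
		g.ActFmG(0, act)
	}
	g.ActFmG(1, 0.7) // pool 1 stays active
	fmt.Println(g.Gated(0), g.Gated(1)) // true false
}
```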

func (*GPiLayer) ActFmG

func (ly *GPiLayer) ActFmG(ltime *leabra.Time)

ActFmG computes rate-code activation from Ge, Gi, Gl conductances and updates learning running-average activations from that Act. GPiLayer extends this to compute MinAct.

func (*GPiLayer) AlphaCycInit

func (ly *GPiLayer) AlphaCycInit()

AlphaCycInit handles all initialization at the start of a new input pattern, including computing input scaling from running average activation etc. The external input should already have been presented to the network at this point.

func (*GPiLayer) Build

func (ly *GPiLayer) Build() error

Build constructs the layer state, including calling Build on the projections. You MUST have properly configured the Inhib.Pool.On setting by this point, to properly allocate Pools for the unit groups if necessary.

func (*GPiLayer) Defaults

func (ly *GPiLayer) Defaults()

func (*GPiLayer) InitActs

func (ly *GPiLayer) InitActs()

type GPiPrjn

type GPiPrjn struct {
	leabra.Prjn
	GateLrate float32 `desc:"extra learning rate multiplier for gated pools"`
}

GPiPrjn must be used with GPi recv layer, from MtxGo, GPeIn senders. Learns from DA and gating status.

func (*GPiPrjn) DWt

func (pj *GPiPrjn) DWt()

DWt computes the weight change (learning) -- on sending projections.

func (*GPiPrjn) Defaults

func (pj *GPiPrjn) Defaults()

type Layer

type Layer struct {
	leabra.Layer
	DA float32 `inactive:"+" desc:"dopamine value for this layer"`
}

Layer is the base layer type for BGate framework. Adds a dopamine variable to base Leabra layer type.

func (*Layer) GetDA

func (ly *Layer) GetDA() float32

func (*Layer) InitActs

func (ly *Layer) InitActs()

func (*Layer) SetDA

func (ly *Layer) SetDA(da float32)

func (*Layer) UnitVal1D

func (ly *Layer) UnitVal1D(varIdx int, idx int) float32

UnitVal1D returns the value of the given variable index on the given unit, using a 1-dimensional index. Returns NaN on an invalid index. This is the core unit var access method used by other methods, so it is the only one that needs to be updated for derived layer types.

func (*Layer) UnitVarIdx

func (ly *Layer) UnitVarIdx(varNm string) (int, error)

UnitVarIdx returns the index of given variable within the Neuron, according to UnitVarNames() list (using a map to lookup index), or -1 and error message if not found.

type MatrixLayer

type MatrixLayer struct {
	Layer
	DaR    DaReceptors  `desc:"dominant type of dopamine receptor -- D1R for Go pathway, D2R for NoGo"`
	Matrix MatrixParams `view:"inline" desc:"matrix parameters"`
	DALrn  float32      `inactive:"+" desc:"effective learning dopamine value for this layer: reflects DaR and Gains"`
	ACh    float32      `` /* 190-byte string literal not displayed */
}

MatrixLayer represents the dorsal matrisome MSN's that are the main Go / NoGo gating units in BG. D1R = Go, D2R = NoGo.

func (*MatrixLayer) ActFmG

func (ly *MatrixLayer) ActFmG(ltime *leabra.Time)

ActFmG computes rate-code activation from Ge, Gi, Gl conductances and updates learning running-average activations from that Act. MatrixLayer extends this to call DALrnFmDA.

func (*MatrixLayer) DALrnFmDA

func (ly *MatrixLayer) DALrnFmDA(da float32) float32

DALrnFmDA returns effective learning dopamine value from given raw DA value applying Burst and Dip Gain factors, and then reversing sign for D2R.
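This Burst/Dip gain plus D2R sign-reversal logic can be sketched as a standalone function (parameter values here are arbitrary examples, not the package's defaults):

```go
package main

import "fmt"

// daLrnFmDA sketches the effective learning dopamine computation:
// positive DA (bursts) is scaled by burstGain, negative DA (dips) by
// dipGain, and the sign is reversed for D2R layers.
func daLrnFmDA(da, burstGain, dipGain float32, d2r bool) float32 {
	if da > 0 {
		da *= burstGain
	} else {
		da *= dipGain
	}
	if d2r {
		da = -da
	}
	return da
}

func main() {
	fmt.Println(daLrnFmDA(2, 1, 0.5, false)) // D1R burst: 2
	fmt.Println(daLrnFmDA(-2, 1, 0.5, true)) // D2R dip: sign reversed: 1
}
```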

func (*MatrixLayer) Defaults

func (ly *MatrixLayer) Defaults()

func (*MatrixLayer) GetACh

func (ly *MatrixLayer) GetACh() float32

func (*MatrixLayer) InitActs

func (ly *MatrixLayer) InitActs()

func (*MatrixLayer) SetACh

func (ly *MatrixLayer) SetACh(ach float32)

func (*MatrixLayer) UnitVal1D

func (ly *MatrixLayer) UnitVal1D(varIdx int, idx int) float32

UnitVal1D returns the value of the given variable index on the given unit, using a 1-dimensional index. Returns NaN on an invalid index. This is the core unit var access method used by other methods, so it is the only one that needs to be updated for derived layer types.

func (*MatrixLayer) UnitVarIdx

func (ly *MatrixLayer) UnitVarIdx(varNm string) (int, error)

UnitVarIdx returns the index of given variable within the Neuron, according to UnitVarNames() list (using a map to lookup index), or -1 and error message if not found.

type MatrixParams

type MatrixParams struct {
	BurstGain float32 `` /* 237-byte string literal not displayed */
	DipGain   float32 `` /* 237-byte string literal not displayed */
}

MatrixParams has parameters for Dorsal Striatum Matrix computation. These are the main Go / NoGo gating units in BG, driving updating of PFC WM in PBWM.

func (*MatrixParams) Defaults

func (mp *MatrixParams) Defaults()

type MatrixTracePrjn

type MatrixTracePrjn struct {
	leabra.Prjn
	Trace  TraceParams `view:"inline" desc:"special parameters for matrix trace learning"`
	TrSyns []TraceSyn  `desc:"trace synaptic state values, ordered by the sending layer units which owns them -- one-to-one with SConIdx array"`
}

MatrixTracePrjn does dopamine-modulated, gated trace learning, for Matrix learning in PBWM context

func (*MatrixTracePrjn) Build

func (pj *MatrixTracePrjn) Build() error

func (*MatrixTracePrjn) ClearTrace

func (pj *MatrixTracePrjn) ClearTrace()

func (*MatrixTracePrjn) DWt

func (pj *MatrixTracePrjn) DWt()

DWt computes the weight change (learning) -- on sending projections.

func (*MatrixTracePrjn) Defaults

func (pj *MatrixTracePrjn) Defaults()

func (*MatrixTracePrjn) InitWts

func (pj *MatrixTracePrjn) InitWts()

func (*MatrixTracePrjn) SynVal1D

func (pj *MatrixTracePrjn) SynVal1D(varIdx int, synIdx int) float32

SynVal1D returns value of given variable index (from SynVarIdx) on given SynIdx. Returns NaN on invalid index. This is the core synapse var access method used by other methods, so it is the only one that needs to be updated for derived layer types.

func (*MatrixTracePrjn) SynVarIdx

func (pj *MatrixTracePrjn) SynVarIdx(varNm string) (int, error)

SynVarIdx returns the index of given variable within the synapse, according to *this prjn's* SynVarNames() list (using a map to lookup index), or -1 and error message if not found.

type Network

type Network struct {
	deep.Network
}

bgate.Network has methods for configuring specialized BGate network components.

func (*Network) AddBG

func (nt *Network) AddBG(prefix string, nPoolsY, nPoolsX, nNeurY, nNeurX int) (mtxGo, mtxNo, tan, gpeOut, gpeIn, gpeTA, stnp, stns, gpi, vthal leabra.LeabraLayer)

AddBG adds MtxGo, MtxNo, GPeOut, GPeIn, GPeTA, STNp, STNs, GPi, and VThal layers, with the given optional prefix. Assumes that a 4D structure will be used, with Pools representing separable gating domains. Only Matrix has more than 1 unit per Pool by default. Appropriate PoolOneToOne connections are made between layers, using standard styles.

func (*Network) AddGPeLayer

func (nt *Network) AddGPeLayer(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) *GPLayer

AddGPeLayer adds a GPLayer of given size, with given name. Assumes that a 4D structure will be used, with Pools representing separable gating domains. Typically nNeurY, nNeurX will both be 1, but could have more for noise etc.

func (*Network) AddGPiLayer

func (nt *Network) AddGPiLayer(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) *GPiLayer

AddGPiLayer adds a GPiLayer of given size, with given name. Assumes that a 4D structure will be used, with Pools representing separable gating domains. Typically nNeurY, nNeurX will both be 1, but could have more for noise etc.

func (*Network) AddMatrixLayer

func (nt *Network) AddMatrixLayer(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int, da DaReceptors) *MatrixLayer

AddMatrixLayer adds a MatrixLayer of given size, with given name. Assumes that a 4D structure will be used, with Pools representing separable gating domains. da gives the DaReceptor type (D1R = Go, D2R = NoGo)

func (*Network) AddSTNLayer

func (nt *Network) AddSTNLayer(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) *STNLayer

AddSTNLayer adds a subthalamic nucleus Layer of given size, with given name. Assumes that a 4D structure will be used, with Pools representing separable gating domains. Typically nNeurY, nNeurX will both be 1, but could have more for noise etc.

func (*Network) AddTANLayer

func (nt *Network) AddTANLayer(name string) *TANLayer

AddTANLayer adds a TANLayer, with a single neuron.

func (*Network) AddVThalLayer

func (nt *Network) AddVThalLayer(name string, nPoolsY, nPoolsX, nNeurY, nNeurX int) *VThalLayer

AddVThalLayer adds a ventral thalamus (VA/VL/VM) Layer of given size, with given name. Assumes that a 4D structure will be used, with Pools representing separable gating domains. Typically nNeurY, nNeurX will both be 1, but could have more for noise etc.

func (*Network) ConnectToMatrix

func (nt *Network) ConnectToMatrix(send, recv emer.Layer, pat prjn.Pattern) emer.Prjn

ConnectToMatrix adds a MatrixTracePrjn from given sending layer to a matrix layer

func (*Network) Defaults

func (nt *Network) Defaults()

Defaults sets all the default parameters for all layers and projections

func (*Network) NewLayer

func (nt *Network) NewLayer() emer.Layer

NewLayer returns new layer of default leabra.Layer type

func (*Network) NewPrjn

func (nt *Network) NewPrjn() emer.Prjn

NewPrjn returns new prjn of default leabra.Prjn type

func (*Network) SynVarNames

func (nt *Network) SynVarNames() []string

SynVarNames returns the names of all the variables on the synapses in this network.

func (*Network) UnitVarNames

func (nt *Network) UnitVarNames() []string

UnitVarNames returns a list of variable names available on the units in this layer

func (*Network) UpdateParams

func (nt *Network) UpdateParams()

UpdateParams updates all the derived parameters if any have changed, for all layers and projections

type STNLayer

type STNLayer struct {
	Layer
	Ca       CaParams    `` /* 186-byte string literal not displayed */
	STNNeurs []STNNeuron `` /* 149-byte string literal not displayed */
}

STNLayer represents the pausing subtype of STN neurons. These open the gating window.

func (*STNLayer) ActFmG

func (ly *STNLayer) ActFmG(ltime *leabra.Time)

func (*STNLayer) AlphaCycInit

func (ly *STNLayer) AlphaCycInit()

AlphaCycInit handles all initialization at the start of a new input pattern, including computing input scaling from running average activation etc. The external input should already have been presented to the network at this point.

func (*STNLayer) Build

func (ly *STNLayer) Build() error

Build constructs the layer state, including calling Build on the projections.

func (*STNLayer) Defaults

func (ly *STNLayer) Defaults()

func (*STNLayer) GetDA

func (ly *STNLayer) GetDA() float32

func (*STNLayer) InitActs

func (ly *STNLayer) InitActs()

func (*STNLayer) SetDA

func (ly *STNLayer) SetDA(da float32)

func (*STNLayer) UnitVal1D

func (ly *STNLayer) UnitVal1D(varIdx int, idx int) float32

UnitVal1D returns the value of the given variable index on the given unit, using a 1-dimensional index. Returns NaN on an invalid index. This is the core unit var access method used by other methods, so it is the only one that needs to be updated for derived layer types.

func (*STNLayer) UnitVarIdx

func (ly *STNLayer) UnitVarIdx(varNm string) (int, error)

UnitVarIdx returns the index of given variable within the Neuron, according to UnitVarNames() list (using a map to lookup index), or -1 and error message if not found.

type STNNeuron

type STNNeuron struct {
	Ca  float32 `` /* 169-byte string literal not displayed */
	KCa float32 `` /* 129-byte string literal not displayed */
}

STNNeuron holds the extra neuron (unit) level variables for STN computation.

func (*STNNeuron) VarByIndex

func (nrn *STNNeuron) VarByIndex(idx int) float32

VarByIndex returns variable using index (0 = first variable in STNNeuronVars list)

func (*STNNeuron) VarByName

func (nrn *STNNeuron) VarByName(varNm string) (float32, error)

VarByName returns variable by name, or error

func (*STNNeuron) VarNames

func (nrn *STNNeuron) VarNames() []string

type TANLayer

type TANLayer struct {
	leabra.Layer
	RewLay  string     `desc:"name of Reward-representing layer from which this computes ACh as absolute value"`
	SendACh rl.SendACh `desc:"list of layers to send acetylcholine to"`
	ACh     float32    `desc:"acetylcholine value for this layer"`
}

TANLayer (tonically active neurons) reads reward signals from a named source layer and sends the absolute value of that activity as the positively rectified, non-prediction-discounted reward signal computed by TANs, in the form of an acetylcholine (ACh) signal.
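The core computation reduces to rectifying the reward-layer activity into an ACh value; a minimal sketch of that step only (the actual layer also handles the named-layer lookup and SendACh distribution):

```go
package main

import (
	"fmt"
	"math"
)

// tanACh sketches the TANLayer signal: the absolute value of the
// reward-layer activity, sent on as acetylcholine.
func tanACh(rew float32) float32 {
	return float32(math.Abs(float64(rew)))
}

func main() {
	// Both positive reward and negative (punishment) produce ACh.
	fmt.Println(tanACh(0.25), tanACh(-0.75)) // 0.25 0.75
}
```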

func (*TANLayer) ActFmG

func (ly *TANLayer) ActFmG(ltime *leabra.Time)

func (*TANLayer) Build

func (ly *TANLayer) Build() error

Build constructs the layer state, including calling Build on the projections.

func (*TANLayer) CyclePost

func (ly *TANLayer) CyclePost(ltime *leabra.Time)

CyclePost is called at the end of Cycle. We use it to send ACh, which will then be active for the next cycle of processing.

func (*TANLayer) Defaults

func (ly *TANLayer) Defaults()

func (*TANLayer) GetACh

func (ly *TANLayer) GetACh() float32

func (*TANLayer) RewLayer

func (ly *TANLayer) RewLayer() (*leabra.Layer, error)

RewLayer returns the reward layer based on name

func (*TANLayer) SetACh

func (ly *TANLayer) SetACh(ach float32)

func (*TANLayer) UnitVal1D

func (ly *TANLayer) UnitVal1D(varIdx int, idx int) float32

UnitVal1D returns the value of the given variable index on the given unit, using a 1-dimensional index. Returns NaN on an invalid index. This is the core unit var access method used by other methods, so it is the only one that needs to be updated for derived layer types.

func (*TANLayer) UnitVarIdx

func (ly *TANLayer) UnitVarIdx(varNm string) (int, error)

UnitVarIdx returns the index of given variable within the Neuron, according to UnitVarNames() list (using a map to lookup index), or -1 and error message if not found.

type TraceParams

type TraceParams struct {
	CurTrlDA bool    `` /* 278-byte string literal not displayed */
	Deriv    bool    `` /* 328-byte string literal not displayed */
	Decay    float32 `` /* 168-byte string literal not displayed */
}

TraceParams has parameters for trace-based learning in the MatrixTracePrjn. A trace of synaptic co-activity is formed, and then modulated by dopamine whenever it occurs. This bridges the temporal gap between gating activity and subsequent activity, and is based biologically on synaptic tags. The trace is reset at the time of reward, based on the ACh level from TANs.
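The accumulate-then-modulate-then-reset cycle can be sketched as below. This is a deliberately minimal picture of the mechanism (the package's TraceSyn also keeps a separate NTr new-trace variable and Decay parameter, omitted here):

```go
package main

import "fmt"

// traceSyn sketches trace-based learning: co-activity accumulates into
// a trace Tr, dopamine converts the trace into a weight change, and
// the trace is cleared at reward (signaled here by ACh).
type traceSyn struct {
	Tr float32 // accumulated synaptic co-activity trace
}

// Update adds this trial's sender*receiver co-activity to the trace,
// returns the DA-modulated weight change, and clears the trace when
// ACh signals reward. The 0.5 ACh threshold is an assumption.
func (sy *traceSyn) Update(sAct, rAct, da, ach float32) (dwt float32) {
	sy.Tr += sAct * rAct
	dwt = da * sy.Tr
	if ach > 0.5 { // reward time, per the TAN ACh signal
		sy.Tr = 0
	}
	return dwt
}

func main() {
	sy := &traceSyn{}
	fmt.Println(sy.Update(0.5, 0.5, 0, 0)) // gating trial, no DA yet: 0
	fmt.Println(sy.Update(0, 0, 1, 1))     // later reward: DA * trace = 0.25
	fmt.Println(sy.Tr)                     // trace cleared at reward: 0
}
```

This illustrates how the trace bridges the temporal gap: the weight change at reward time reflects co-activity from the earlier gating trial.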

func (*TraceParams) Defaults

func (tp *TraceParams) Defaults()

func (*TraceParams) LrnFactor

func (tp *TraceParams) LrnFactor(act float32) float32

LrnFactor returns the multiplicative factor for the level of MSN activation. If Deriv is on, the factor is 2 * act * (1 - act) -- the factor of 2 compensates for the reduction in learning these terms would otherwise cause. Otherwise it is just act.
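This rule translates directly into a small function, sketched standalone here:

```go
package main

import "fmt"

// lrnFactor follows the documented rule: with deriv on, the factor is
// 2 * act * (1 - act), peaking at act = 0.5; otherwise it is just act.
func lrnFactor(act float32, deriv bool) float32 {
	if !deriv {
		return act
	}
	return 2 * act * (1 - act)
}

func main() {
	fmt.Println(lrnFactor(0.5, true))  // 2*0.5*0.5 = 0.5
	fmt.Println(lrnFactor(0.5, false)) // 0.5
}
```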

type TraceSyn

type TraceSyn struct {
	NTr float32 `` /* 136-byte string literal not displayed */
	Tr  float32 `` /* 183-byte string literal not displayed */
}

TraceSyn holds extra synaptic state for trace projections

func (*TraceSyn) VarByIndex

func (sy *TraceSyn) VarByIndex(varIdx int) float32

VarByIndex returns synapse variable by index

func (*TraceSyn) VarByName

func (sy *TraceSyn) VarByName(varNm string) float32

VarByName returns synapse variable by name

type VThalLayer

type VThalLayer struct {
	leabra.Layer
	DA float32 `inactive:"+" desc:"dopamine value for this layer"`
}

VThalLayer represents the Ventral thalamus: VA / VM / VL which receives BG gating.

func (*VThalLayer) Defaults

func (ly *VThalLayer) Defaults()

func (*VThalLayer) GetDA

func (ly *VThalLayer) GetDA() float32

func (*VThalLayer) InitActs

func (ly *VThalLayer) InitActs()

func (*VThalLayer) SetDA

func (ly *VThalLayer) SetDA(da float32)
