package generated

v0.0.5

Note: this package is not in the latest version of its module.

Published: Mar 3, 2021 License: MIT Imports: 4 Imported by: 0

Documentation

Overview

Package generated contains generated dataloader configurations.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type AddressLoader

type AddressLoader struct {
	// contains filtered or unexported fields
}

AddressLoader batches and caches requests

func NewAddressLoader

func NewAddressLoader(config AddressLoaderConfig) *AddressLoader

NewAddressLoader creates a new AddressLoader given a fetch, wait, and maxBatch

func (*AddressLoader) Clear

func (l *AddressLoader) Clear(key int64)

Clear the value at key from the cache, if it exists

func (*AddressLoader) Load

func (l *AddressLoader) Load(key int64) (*model.Address, error)

Load an Address by key, batching and caching will be applied automatically

func (*AddressLoader) LoadAll

func (l *AddressLoader) LoadAll(keys []int64) ([]*model.Address, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured
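The sub-batching behavior can be sketched as follows; `subBatches` is a hypothetical helper, not part of this package, that splits keys the way a configured MaxBatch would (0 meaning no limit, per the config comment):

```go
package main

import "fmt"

// subBatches is a hypothetical illustration of how LoadAll might split its
// keys when MaxBatch is set; maxBatch <= 0 means no limit.
func subBatches(keys []int64, maxBatch int) [][]int64 {
	if maxBatch <= 0 {
		return [][]int64{keys}
	}
	var out [][]int64
	for len(keys) > maxBatch {
		out = append(out, keys[:maxBatch])
		keys = keys[maxBatch:]
	}
	return append(out, keys)
}

func main() {
	// Five keys with MaxBatch = 2 become three sub-batches.
	fmt.Println(subBatches([]int64{1, 2, 3, 4, 5}, 2)) // [[1 2] [3 4] [5]]
}
```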

func (*AddressLoader) LoadAllThunk

func (l *AddressLoader) LoadAllThunk(keys []int64) func() ([]*model.Address, []error)

LoadAllThunk returns a function that when called will block waiting for the Addresses. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*AddressLoader) LoadThunk

func (l *AddressLoader) LoadThunk(key int64) func() (*model.Address, error)

LoadThunk returns a function that when called will block waiting for an Address. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*AddressLoader) Prime

func (l *AddressLoader) Prime(key int64, value *model.Address) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key), then call loader.Prime(key, value).)
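Prime's contract, and the clear-then-prime idiom for overwriting, can be illustrated with a hypothetical map-backed stand-in for the loader's cache (`primer` is invented here; the real loaders also synchronize access):

```go
package main

import "fmt"

// primer is a hypothetical stand-in for the loader's cache, illustrating
// Prime's contract: it only inserts when the key is absent.
type primer struct{ cache map[int64]string }

func (p *primer) Prime(key int64, value string) bool {
	if _, ok := p.cache[key]; ok {
		return false // existing entries are left untouched
	}
	p.cache[key] = value
	return true
}

func (p *primer) Clear(key int64) { delete(p.cache, key) }

func main() {
	p := &primer{cache: map[int64]string{}}
	fmt.Println(p.Prime(1, "alice")) // true: key was absent
	fmt.Println(p.Prime(1, "bob"))   // false: key exists, value unchanged
	// Forceful prime: clear the key, then prime again.
	p.Clear(1)
	fmt.Println(p.Prime(1, "bob"), p.cache[1]) // true bob
}
```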

type AddressLoaderConfig

type AddressLoaderConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []int64) ([]*model.Address, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch, 0 = no limit
	MaxBatch int
}

AddressLoaderConfig captures the config to create a new AddressLoader
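The Fetch contract above is positional: one result slot and one error slot per key, in the same order as keys. A sketch of a conforming Fetch, with `fakeDB` and `fetchAddresses` as invented stand-ins (the real Fetch would return []*model.Address rather than strings):

```go
package main

import (
	"errors"
	"fmt"
)

// fakeDB stands in for whatever backend the real Fetch would query.
var fakeDB = map[int64]string{1: "1 Main St", 2: "9 Dock Rd"}

// fetchAddresses follows the Fetch contract: one result and one error slot
// per key, aligned by position with the keys slice.
func fetchAddresses(keys []int64) ([]string, []error) {
	results := make([]string, len(keys))
	errs := make([]error, len(keys))
	for i, k := range keys {
		v, ok := fakeDB[k]
		if !ok {
			errs[i] = errors.New("address not found")
			continue
		}
		results[i] = v
	}
	return results, errs
}

func main() {
	res, errs := fetchAddresses([]int64{1, 99, 2})
	fmt.Println(res[0], res[2]) // 1 Main St 9 Dock Rd
	fmt.Println(errs[1] != nil) // true: key 99 has no entry
}
```

Keeping results aligned with keys matters because the loader hands each caller the slot at its key's position.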

type CompanyLoader

type CompanyLoader struct {
	// contains filtered or unexported fields
}

CompanyLoader batches and caches requests

func NewCompanyLoader

func NewCompanyLoader(config CompanyLoaderConfig) *CompanyLoader

NewCompanyLoader creates a new CompanyLoader given a fetch, wait, and maxBatch

func (*CompanyLoader) Clear

func (l *CompanyLoader) Clear(key int64)

Clear the value at key from the cache, if it exists

func (*CompanyLoader) Load

func (l *CompanyLoader) Load(key int64) (*model.Company, error)

Load a Company by key, batching and caching will be applied automatically

func (*CompanyLoader) LoadAll

func (l *CompanyLoader) LoadAll(keys []int64) ([]*model.Company, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*CompanyLoader) LoadAllThunk

func (l *CompanyLoader) LoadAllThunk(keys []int64) func() ([]*model.Company, []error)

LoadAllThunk returns a function that when called will block waiting for the Companies. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*CompanyLoader) LoadThunk

func (l *CompanyLoader) LoadThunk(key int64) func() (*model.Company, error)

LoadThunk returns a function that when called will block waiting for a Company. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*CompanyLoader) Prime

func (l *CompanyLoader) Prime(key int64, value *model.Company) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key), then call loader.Prime(key, value).)

type CompanyLoaderConfig

type CompanyLoaderConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []int64) ([]*model.Company, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch, 0 = no limit
	MaxBatch int
}

CompanyLoaderConfig captures the config to create a new CompanyLoader

type CompanyStringLoader

type CompanyStringLoader struct {
	// contains filtered or unexported fields
}

CompanyStringLoader batches and caches requests

func NewCompanyStringLoader

func NewCompanyStringLoader(config CompanyStringLoaderConfig) *CompanyStringLoader

NewCompanyStringLoader creates a new CompanyStringLoader given a fetch, wait, and maxBatch

func (*CompanyStringLoader) Clear

func (l *CompanyStringLoader) Clear(key string)

Clear the value at key from the cache, if it exists

func (*CompanyStringLoader) Load

func (l *CompanyStringLoader) Load(key string) (*model.Company, error)

Load a Company by key, batching and caching will be applied automatically

func (*CompanyStringLoader) LoadAll

func (l *CompanyStringLoader) LoadAll(keys []string) ([]*model.Company, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*CompanyStringLoader) LoadAllThunk

func (l *CompanyStringLoader) LoadAllThunk(keys []string) func() ([]*model.Company, []error)

LoadAllThunk returns a function that when called will block waiting for the Companies. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*CompanyStringLoader) LoadThunk

func (l *CompanyStringLoader) LoadThunk(key string) func() (*model.Company, error)

LoadThunk returns a function that when called will block waiting for a Company. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*CompanyStringLoader) Prime

func (l *CompanyStringLoader) Prime(key string, value *model.Company) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key), then call loader.Prime(key, value).)

type CompanyStringLoaderConfig

type CompanyStringLoaderConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []string) ([]*model.Company, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch, 0 = no limit
	MaxBatch int
}

CompanyStringLoaderConfig captures the config to create a new CompanyStringLoader

type DomainSliceLoader

type DomainSliceLoader struct {
	// contains filtered or unexported fields
}

DomainSliceLoader batches and caches requests

func NewDomainSliceLoader

func NewDomainSliceLoader(config DomainSliceLoaderConfig) *DomainSliceLoader

NewDomainSliceLoader creates a new DomainSliceLoader given a fetch, wait, and maxBatch

func (*DomainSliceLoader) Clear

func (l *DomainSliceLoader) Clear(key int64)

Clear the value at key from the cache, if it exists

func (*DomainSliceLoader) Load

func (l *DomainSliceLoader) Load(key int64) ([]*model.Domain, error)

Load a Domain by key, batching and caching will be applied automatically

func (*DomainSliceLoader) LoadAll

func (l *DomainSliceLoader) LoadAll(keys []int64) ([][]*model.Domain, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*DomainSliceLoader) LoadAllThunk

func (l *DomainSliceLoader) LoadAllThunk(keys []int64) func() ([][]*model.Domain, []error)

LoadAllThunk returns a function that when called will block waiting for the Domains. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*DomainSliceLoader) LoadThunk

func (l *DomainSliceLoader) LoadThunk(key int64) func() ([]*model.Domain, error)

LoadThunk returns a function that when called will block waiting for a Domain. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*DomainSliceLoader) Prime

func (l *DomainSliceLoader) Prime(key int64, value []*model.Domain) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key), then call loader.Prime(key, value).)

type DomainSliceLoaderConfig

type DomainSliceLoaderConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []int64) ([][]*model.Domain, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch, 0 = no limit
	MaxBatch int
}

DomainSliceLoaderConfig captures the config to create a new DomainSliceLoader

type PermissionChecker

type PermissionChecker struct {
	// contains filtered or unexported fields
}

PermissionChecker batches and caches requests

func NewPermissionChecker

func NewPermissionChecker(config PermissionCheckerConfig) *PermissionChecker

NewPermissionChecker creates a new PermissionChecker given a fetch, wait, and maxBatch

func (*PermissionChecker) Clear

func (l *PermissionChecker) Clear(key int64)

Clear the value at key from the cache, if it exists

func (*PermissionChecker) Load

func (l *PermissionChecker) Load(key int64) (*bool, error)

Load a bool by key, batching and caching will be applied automatically

func (*PermissionChecker) LoadAll

func (l *PermissionChecker) LoadAll(keys []int64) ([]*bool, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*PermissionChecker) LoadAllThunk

func (l *PermissionChecker) LoadAllThunk(keys []int64) func() ([]*bool, []error)

LoadAllThunk returns a function that when called will block waiting for the bools. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*PermissionChecker) LoadThunk

func (l *PermissionChecker) LoadThunk(key int64) func() (*bool, error)

LoadThunk returns a function that when called will block waiting for a bool. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*PermissionChecker) Prime

func (l *PermissionChecker) Prime(key int64, value *bool) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key), then call loader.Prime(key, value).)

type PermissionCheckerConfig

type PermissionCheckerConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []int64) ([]*bool, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch, 0 = no limit
	MaxBatch int
}

PermissionCheckerConfig captures the config to create a new PermissionChecker

type PermissionLoader

type PermissionLoader struct {
	// contains filtered or unexported fields
}

PermissionLoader batches and caches requests

func NewPermissionLoader

func NewPermissionLoader(config PermissionLoaderConfig) *PermissionLoader

NewPermissionLoader creates a new PermissionLoader given a fetch, wait, and maxBatch

func (*PermissionLoader) Clear

func (l *PermissionLoader) Clear(key int64)

Clear the value at key from the cache, if it exists

func (*PermissionLoader) Load

func (l *PermissionLoader) Load(key int64) (*bool, error)

Load a bool by key, batching and caching will be applied automatically

func (*PermissionLoader) LoadAll

func (l *PermissionLoader) LoadAll(keys []int64) ([]*bool, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*PermissionLoader) LoadAllThunk

func (l *PermissionLoader) LoadAllThunk(keys []int64) func() ([]*bool, []error)

LoadAllThunk returns a function that when called will block waiting for the bools. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*PermissionLoader) LoadThunk

func (l *PermissionLoader) LoadThunk(key int64) func() (*bool, error)

LoadThunk returns a function that when called will block waiting for a bool. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*PermissionLoader) Prime

func (l *PermissionLoader) Prime(key int64, value *bool) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key), then call loader.Prime(key, value).)

type PermissionLoaderConfig

type PermissionLoaderConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []int64) ([]*bool, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch, 0 = no limit
	MaxBatch int
}

PermissionLoaderConfig captures the config to create a new PermissionLoader

type PermissionsLoader

type PermissionsLoader struct {
	// contains filtered or unexported fields
}

PermissionsLoader batches and caches requests

func NewPermissionsLoader

func NewPermissionsLoader(config PermissionsLoaderConfig) *PermissionsLoader

NewPermissionsLoader creates a new PermissionsLoader given a fetch, wait, and maxBatch

func (*PermissionsLoader) Clear

func (l *PermissionsLoader) Clear(key int64)

Clear the value at key from the cache, if it exists

func (*PermissionsLoader) Load

func (l *PermissionsLoader) Load(key int64) ([]*permission.Permission, error)

Load a Permission by key, batching and caching will be applied automatically

func (*PermissionsLoader) LoadAll

func (l *PermissionsLoader) LoadAll(keys []int64) ([][]*permission.Permission, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*PermissionsLoader) LoadAllThunk

func (l *PermissionsLoader) LoadAllThunk(keys []int64) func() ([][]*permission.Permission, []error)

LoadAllThunk returns a function that when called will block waiting for the Permissions. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*PermissionsLoader) LoadThunk

func (l *PermissionsLoader) LoadThunk(key int64) func() ([]*permission.Permission, error)

LoadThunk returns a function that when called will block waiting for a Permission. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*PermissionsLoader) Prime

func (l *PermissionsLoader) Prime(key int64, value []*permission.Permission) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key), then call loader.Prime(key, value).)

type PermissionsLoaderConfig

type PermissionsLoaderConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []int64) ([][]*permission.Permission, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch, 0 = no limit
	MaxBatch int
}

PermissionsLoaderConfig captures the config to create a new PermissionsLoader

type RoleLoader

type RoleLoader struct {
	// contains filtered or unexported fields
}

RoleLoader batches and caches requests

func NewRoleLoader

func NewRoleLoader(config RoleLoaderConfig) *RoleLoader

NewRoleLoader creates a new RoleLoader given a fetch, wait, and maxBatch

func (*RoleLoader) Clear

func (l *RoleLoader) Clear(key int64)

Clear the value at key from the cache, if it exists

func (*RoleLoader) Load

func (l *RoleLoader) Load(key int64) ([]permission.Role, error)

Load a Role by key, batching and caching will be applied automatically

func (*RoleLoader) LoadAll

func (l *RoleLoader) LoadAll(keys []int64) ([][]permission.Role, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*RoleLoader) LoadAllThunk

func (l *RoleLoader) LoadAllThunk(keys []int64) func() ([][]permission.Role, []error)

LoadAllThunk returns a function that when called will block waiting for the Roles. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*RoleLoader) LoadThunk

func (l *RoleLoader) LoadThunk(key int64) func() ([]permission.Role, error)

LoadThunk returns a function that when called will block waiting for a Role. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*RoleLoader) Prime

func (l *RoleLoader) Prime(key int64, value []permission.Role) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key), then call loader.Prime(key, value).)

type RoleLoaderConfig

type RoleLoaderConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []int64) ([][]permission.Role, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch, 0 = no limit
	MaxBatch int
}

RoleLoaderConfig captures the config to create a new RoleLoader

type UserLoader

type UserLoader struct {
	// contains filtered or unexported fields
}

UserLoader batches and caches requests

func NewUserLoader

func NewUserLoader(config UserLoaderConfig) *UserLoader

NewUserLoader creates a new UserLoader given a fetch, wait, and maxBatch

func (*UserLoader) Clear

func (l *UserLoader) Clear(key int64)

Clear the value at key from the cache, if it exists

func (*UserLoader) Load

func (l *UserLoader) Load(key int64) (*model.User, error)

Load a User by key, batching and caching will be applied automatically

func (*UserLoader) LoadAll

func (l *UserLoader) LoadAll(keys []int64) ([]*model.User, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*UserLoader) LoadAllThunk

func (l *UserLoader) LoadAllThunk(keys []int64) func() ([]*model.User, []error)

LoadAllThunk returns a function that when called will block waiting for the Users. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*UserLoader) LoadThunk

func (l *UserLoader) LoadThunk(key int64) func() (*model.User, error)

LoadThunk returns a function that when called will block waiting for a User. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*UserLoader) Prime

func (l *UserLoader) Prime(key int64, value *model.User) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key), then call loader.Prime(key, value).)

type UserLoaderConfig

type UserLoaderConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []int64) ([]*model.User, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch, 0 = no limit
	MaxBatch int
}

UserLoaderConfig captures the config to create a new UserLoader

type UserSliceLoader

type UserSliceLoader struct {
	// contains filtered or unexported fields
}

UserSliceLoader batches and caches requests

func NewUserSliceLoader

func NewUserSliceLoader(config UserSliceLoaderConfig) *UserSliceLoader

NewUserSliceLoader creates a new UserSliceLoader given a fetch, wait, and maxBatch

func (*UserSliceLoader) Clear

func (l *UserSliceLoader) Clear(key int64)

Clear the value at key from the cache, if it exists

func (*UserSliceLoader) Load

func (l *UserSliceLoader) Load(key int64) ([]*model.User, error)

Load a User by key, batching and caching will be applied automatically

func (*UserSliceLoader) LoadAll

func (l *UserSliceLoader) LoadAll(keys []int64) ([][]*model.User, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*UserSliceLoader) LoadAllThunk

func (l *UserSliceLoader) LoadAllThunk(keys []int64) func() ([][]*model.User, []error)

LoadAllThunk returns a function that when called will block waiting for the Users. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*UserSliceLoader) LoadThunk

func (l *UserSliceLoader) LoadThunk(key int64) func() ([]*model.User, error)

LoadThunk returns a function that when called will block waiting for a User. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*UserSliceLoader) Prime

func (l *UserSliceLoader) Prime(key int64, value []*model.User) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key), then call loader.Prime(key, value).)

type UserSliceLoaderConfig

type UserSliceLoaderConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []int64) ([][]*model.User, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch, 0 = no limit
	MaxBatch int
}

UserSliceLoaderConfig captures the config to create a new UserSliceLoader

type UserStringLoader

type UserStringLoader struct {
	// contains filtered or unexported fields
}

UserStringLoader batches and caches requests

func NewUserStringLoader

func NewUserStringLoader(config UserStringLoaderConfig) *UserStringLoader

NewUserStringLoader creates a new UserStringLoader given a fetch, wait, and maxBatch

func (*UserStringLoader) Clear

func (l *UserStringLoader) Clear(key string)

Clear the value at key from the cache, if it exists

func (*UserStringLoader) Load

func (l *UserStringLoader) Load(key string) (*model.User, error)

Load a User by key, batching and caching will be applied automatically

func (*UserStringLoader) LoadAll

func (l *UserStringLoader) LoadAll(keys []string) ([]*model.User, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*UserStringLoader) LoadAllThunk

func (l *UserStringLoader) LoadAllThunk(keys []string) func() ([]*model.User, []error)

LoadAllThunk returns a function that when called will block waiting for the Users. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*UserStringLoader) LoadThunk

func (l *UserStringLoader) LoadThunk(key string) func() (*model.User, error)

LoadThunk returns a function that when called will block waiting for a User. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*UserStringLoader) Prime

func (l *UserStringLoader) Prime(key string, value *model.User) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key), then call loader.Prime(key, value).)

type UserStringLoaderConfig

type UserStringLoaderConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []string) ([]*model.User, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch, 0 = no limit
	MaxBatch int
}

UserStringLoaderConfig captures the config to create a new UserStringLoader

type UserStringSliceLoader

type UserStringSliceLoader struct {
	// contains filtered or unexported fields
}

UserStringSliceLoader batches and caches requests

func NewUserStringSliceLoader

func NewUserStringSliceLoader(config UserStringSliceLoaderConfig) *UserStringSliceLoader

NewUserStringSliceLoader creates a new UserStringSliceLoader given a fetch, wait, and maxBatch

func (*UserStringSliceLoader) Clear

func (l *UserStringSliceLoader) Clear(key string)

Clear the value at key from the cache, if it exists

func (*UserStringSliceLoader) Load

func (l *UserStringSliceLoader) Load(key string) ([]*model.User, error)

Load a User by key, batching and caching will be applied automatically

func (*UserStringSliceLoader) LoadAll

func (l *UserStringSliceLoader) LoadAll(keys []string) ([][]*model.User, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*UserStringSliceLoader) LoadAllThunk

func (l *UserStringSliceLoader) LoadAllThunk(keys []string) func() ([][]*model.User, []error)

LoadAllThunk returns a function that when called will block waiting for the Users. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*UserStringSliceLoader) LoadThunk

func (l *UserStringSliceLoader) LoadThunk(key string) func() ([]*model.User, error)

LoadThunk returns a function that when called will block waiting for a User. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*UserStringSliceLoader) Prime

func (l *UserStringSliceLoader) Prime(key string, value []*model.User) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key), then call loader.Prime(key, value).)

type UserStringSliceLoaderConfig

type UserStringSliceLoaderConfig struct {
	// Fetch is a method that provides the data for the loader
	Fetch func(keys []string) ([][]*model.User, []error)

	// Wait is how long to wait before sending a batch
	Wait time.Duration

	// MaxBatch will limit the maximum number of keys to send in one batch, 0 = no limit
	MaxBatch int
}

UserStringSliceLoaderConfig captures the config to create a new UserStringSliceLoader
