memory

package
v0.0.3-beta.8 Latest
Warning

This package is not in the latest version of its module.

Published: Dec 16, 2025 License: MIT Imports: 4 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type ByteCache

type ByteCache = Cache[[]byte]

func NewByteCache

func NewByteCache() *ByteCache

type Cache

type Cache[T any] struct {
	// contains filtered or unexported fields
}

Cache is an in-memory cache implementation backed by ristretto. It provides high-performance caching with configurable cost calculation and optional asynchronous writes.

func NewMemoryCache

func NewMemoryCache[T any]() *Cache[T]

NewMemoryCache creates a new in-memory cache with default settings. The cache uses a fixed cost of 1 per item and does not calculate actual memory usage.

func NewMemoryCacheWithCost

func NewMemoryCacheWithCost[T any](cost func(T) int64) *Cache[T]

NewMemoryCacheWithCost creates a new in-memory cache with a custom cost function. The cost function determines the memory cost of each cached item, enabling memory-based eviction policies.
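A sketch of a cost function for use with NewMemoryCacheWithCost. The JSON-size heuristic below is an illustrative assumption, not something this package prescribes; any `func(T) int64` works.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// User is an example cached value type.
type User struct {
	Name  string
	Email string
}

// costOf estimates an item's cost as its JSON-encoded size in bytes.
// This is a rough heuristic for memory-based eviction, not a precise
// memory measurement.
func costOf(u User) int64 {
	b, err := json.Marshal(u)
	if err != nil {
		return 1 // fall back to a minimal cost
	}
	return int64(len(b))
}

func main() {
	u := User{Name: "ada", Email: "ada@example.com"}
	fmt.Println(costOf(u))

	// The function would then be passed to the cache constructor:
	//   c := memory.NewMemoryCacheWithCost[User](costOf)
}
```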

func NewMemoryCacheWithRistretto

func NewMemoryCacheWithRistretto[T any](cache *ristretto.Cache[string, T], calculateCost, allowAsyncWrites bool) *Cache[T]

NewMemoryCacheWithRistretto creates a new cache wrapper around an existing ristretto cache instance. This allows for advanced configuration and sharing of cache instances across multiple Cache wrappers.

func (*Cache[T]) Close

func (m *Cache[T]) Close() error

func (*Cache[T]) Del

func (m *Cache[T]) Del(ctx context.Context, key string) error

func (*Cache[T]) DelAll

func (m *Cache[T]) DelAll(ctx context.Context) error

func (*Cache[T]) Exists

func (m *Cache[T]) Exists(ctx context.Context, key string) (bool, error)

func (*Cache[T]) Get

func (m *Cache[T]) Get(ctx context.Context, key string) (T, bool, error)

func (*Cache[T]) MultiDel

func (m *Cache[T]) MultiDel(ctx context.Context, keys []string) error

func (*Cache[T]) MultiGet

func (m *Cache[T]) MultiGet(ctx context.Context, keys []string) (map[string]T, error)

func (*Cache[T]) MultiSet

func (m *Cache[T]) MultiSet(ctx context.Context, valMap map[string]T) error

func (*Cache[T]) MultiSetWithTTL

func (m *Cache[T]) MultiSetWithTTL(ctx context.Context, valMap map[string]T, expiration time.Duration) error

func (*Cache[T]) Set

func (m *Cache[T]) Set(ctx context.Context, key string, val T) error

func (*Cache[T]) SetAllowAsyncWrites

func (m *Cache[T]) SetAllowAsyncWrites(allow bool)

SetAllowAsyncWrites configures whether the cache uses asynchronous writes. In memory.Cache, asynchronous writes are disabled by default. When enabled, Set returns without blocking, but the value is not guaranteed to be visible in the cache immediately.

func (*Cache[T]) SetWithTTL

func (m *Cache[T]) SetWithTTL(ctx context.Context, key string, val T, expiration time.Duration) error

func (*Cache[T]) UpdateMaxCost

func (m *Cache[T]) UpdateMaxCost(maxItem int64)

UpdateMaxCost updates the maximum total cost allowed in the cache. In memory.Cache, `calculateCost` is false by default, so every item has a cost of 1 regardless of its size; computing an item's real memory footprint is complex and unnecessary for most use cases. With the default cost of 1, this method therefore sets the maximum number of items the cache will hold. To limit the cache by item size instead, use NewMemoryCacheWithCost.
