Documentation
Index ¶
- type ByteCache
- type Cache
- func (m *Cache[T]) Close() error
- func (m *Cache[T]) Del(ctx context.Context, key string) error
- func (m *Cache[T]) DelAll(ctx context.Context) error
- func (m *Cache[T]) Exists(ctx context.Context, key string) (bool, error)
- func (m *Cache[T]) Get(ctx context.Context, key string) (T, bool, error)
- func (m *Cache[T]) MultiDel(ctx context.Context, keys []string) error
- func (m *Cache[T]) MultiGet(ctx context.Context, keys []string) (map[string]T, error)
- func (m *Cache[T]) MultiSet(ctx context.Context, valMap map[string]T) error
- func (m *Cache[T]) MultiSetWithTTL(ctx context.Context, valMap map[string]T, expiration time.Duration) error
- func (m *Cache[T]) Set(ctx context.Context, key string, val T) error
- func (m *Cache[T]) SetAllowAsyncWrites(allow bool)
- func (m *Cache[T]) SetWithTTL(ctx context.Context, key string, val T, expiration time.Duration) error
- func (m *Cache[T]) UpdateMaxCost(maxItem int64)
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Cache ¶
type Cache[T any] struct {
	// contains filtered or unexported fields
}
Cache is an in-memory cache backed by ristretto. It provides high-performance caching with configurable cost calculation and optional asynchronous writes.
func NewMemoryCache ¶
NewMemoryCache creates a new in-memory cache with default settings. The cache uses a fixed cost of 1 per item and does not calculate actual memory usage.
func NewMemoryCacheWithCost ¶
NewMemoryCacheWithCost creates a new in-memory cache with a custom cost function. The cost function determines the memory cost of each cached item, enabling memory-based eviction policies.
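The exact signature of NewMemoryCacheWithCost is not shown above; assuming it accepts a cost function of the shape `func(val T) int64` (matching ristretto's per-item cost), a minimal sketch of such a function charges each value its length in bytes, turning the cache's max-cost budget into a byte budget:

```go
package main

import "fmt"

// byteCost is a hypothetical cost function of the shape
// NewMemoryCacheWithCost is assumed to accept (func(val T) int64):
// it charges each cached value its length in bytes, so the cache's
// max-cost budget becomes a byte budget.
func byteCost(val []byte) int64 {
	return int64(len(val))
}

func main() {
	fmt.Println(byteCost([]byte("hello"))) // 5
}
```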
func NewMemoryCacheWithRistretto ¶
func NewMemoryCacheWithRistretto[T any](cache *ristretto.Cache[string, T], calculateCost, allowAsyncWrites bool) *Cache[T]
NewMemoryCacheWithRistretto creates a new cache wrapper around an existing ristretto cache instance. This allows for advanced configuration and sharing of cache instances across multiple Cache wrappers.
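A hedged sketch of the wrapping pattern, assuming ristretto v2's Config fields (NumCounters, MaxCost, BufferItems); the `memory` import path is hypothetical and must be adjusted to this package's actual module path:

	// import "github.com/dgraph-io/ristretto/v2"
	// import "example.com/yourmodule/memory" // assumed import path

	rc, err := ristretto.NewCache(&ristretto.Config[string, []byte]{
		NumCounters: 1e7,     // number of keys to track access frequency for
		MaxCost:     1 << 30, // total cost budget
		BufferItems: 64,      // ristretto's recommended value
	})
	if err != nil {
		return err
	}

	// calculateCost=true uses each item's ristretto cost for eviction;
	// allowAsyncWrites=false makes Set wait for the write to be applied.
	c := memory.NewMemoryCacheWithRistretto(rc, true, false)
	defer c.Close()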
func (*Cache[T]) MultiSetWithTTL ¶
func (*Cache[T]) SetAllowAsyncWrites ¶
SetAllowAsyncWrites configures whether the cache uses asynchronous writes. In memory.Cache, asynchronous writes are disabled by default. When enabled, Set does not block, but the value is not guaranteed to be visible in the cache immediately.
func (*Cache[T]) SetWithTTL ¶
func (*Cache[T]) UpdateMaxCost ¶
UpdateMaxCost updates the maximum total cost allowed in the cache. In memory.Cache, `calculateCost` is false by default, so every item has a cost of 1 regardless of its size; computing a real per-item cost is complex and unnecessary for most use cases. With that default fixed cost, use this method to limit the number of items in the cache. To limit the cache by item size instead, use NewMemoryCacheWithCost.
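The fixed-cost accounting above can be sketched numerically: with every item charged a cost of 1, the budget set by UpdateMaxCost behaves as a maximum item count. This is a self-contained stand-in tallying cost the way a fixed-cost cache would, not this package's code.

```go
package main

import "fmt"

// With calculateCost disabled (the memory.Cache default), every item is
// charged a fixed cost of 1, so the budget set by UpdateMaxCost is
// effectively a maximum item count. This stand-in tallies cost the way
// a fixed-cost cache would.
func main() {
	const perItemCost = 1
	maxCost := int64(3) // as if cache.UpdateMaxCost(3) had been called

	var used int64
	admitted := 0
	for _, key := range []string{"a", "b", "c", "d"} {
		if used+perItemCost > maxCost {
			fmt.Println("budget full, would evict before admitting", key)
			break
		}
		used += perItemCost
		admitted++
	}
	fmt.Println(admitted, "items fit within max cost", maxCost)
}
```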