hot

Published: Apr 2, 2024 License: MIT Imports: 12 Imported by: 1

README

HOT - In-memory caching


HOT Object Tracker.

A feature-complete and blazing-fast caching library for Go.

💡 Features

  • 🚀 Fast, concurrent
  • 💫 Generics
  • 🗑️ Eviction policies: LRU, LFU, 2Q
  • ⏰ TTL with jitter
  • 🔄 Stale-while-revalidate
  • ❌ Missing key caching
  • 🍕 Sharded cache
  • 🔒 Optional locking
  • 🔗 Chain of data loaders with in-flight deduplication
  • 🌶️ Cache warmup
  • 📦 Batching all the way
  • 🧩 Composable caching strategy
  • 📝 Optional copy on read and/or write
  • 📊 Stat collection

🚀 Install

go get github.com/samber/hot

This library is v0 and follows SemVer strictly.

No breaking changes will be made to exported APIs before v1.0.0.

🤠 Getting started

GoDoc: https://godoc.org/github.com/samber/hot

// TODO

🏛️ Architecture

This project is split into multiple layers to respect the separation of concerns.

Each cache layer implements the pkg/base.InMemoryCache[K, V] interface. Chaining multiple layers has a minimal cost (~1ns per call) and keeps the stack highly customizable.

It is highly recommended to use hot.HotCache[K, V] instead of the lower layers.

Eviction policies

This project provides multiple eviction policies. Each implements the pkg/base.InMemoryCache[K, V] interface.

They are not protected against concurrent access. If safety is required, wrap them in pkg/safe.SafeInMemoryCache[K comparable, V any].

Packages:

  • pkg/lru
  • pkg/lfu
  • pkg/twoqueue
  • pkg/arc

Example:

cache := lru.NewLRUCache[string, *User](100_000)

Concurrent access

The hot.HotCache[K, V] offers protection against concurrent access by default. But in some cases, unnecessary locking might just slow down a program.

Low-level cache layers are not protected by default. Use the following encapsulation to bring safety:

import (
	"github.com/samber/hot/pkg/lru"
	"github.com/samber/hot/pkg/safe"
)

cache := safe.NewSafeInMemoryCache(
	lru.NewLRUCache[string, *User](100_000),
)

Sharded cache

A sharded cache might be useful in two scenarios:

  • highly concurrent application slowed down by cache locking -> 1 lock per shard instead of 1 global lock
  • highly parallel application with no concurrency -> no lock

The sharding key must not be too costly to compute and must distribute keys evenly across shards. The hashing function must have the func(k K) uint64 signature.

A sharded cache can be created via hot.HotCache[K, V] or using a low-level layer:

import (
	"hash/fnv"

	"github.com/samber/hot/pkg/base"
	"github.com/samber/hot/pkg/lru"
	"github.com/samber/hot/pkg/safe"
	"github.com/samber/hot/pkg/sharded"
)

cache := sharded.NewShardedInMemoryCache(
	1_000, // number of shards
	func() base.InMemoryCache[string, *User] {
		return safe.NewSafeInMemoryCache(
			lru.NewLRUCache[string, *User](100_000),
		)
	},
	func(key string) uint64 {
		h := fnv.New64a()
		h.Write([]byte(key))
		return h.Sum64()
	},
)

Missing key caching

Instead of calling the loader chain every time an invalid key is requested, a "missing cache" can be enabled. Note that it won't protect your app against a DDoS attack with high-cardinality keys.

If the missing keys are infrequent, sharing the missing cache with the main cache might be reasonable:

import "github.com/samber/hot"

cache := hot.NewHotCache[string, int](hot.LRU, 100_000).
    WithMissingSharedCache().
    Build()

If the missing keys are frequent, use a dedicated cache to prevent pollution of the main cache:

import "github.com/samber/hot"

cache := hot.NewHotCache[string, int](hot.LRU, 100_000).
    WithMissingCache(hot.LFU, 50_000).
    Build()

🏎️ Benchmark

// TODO: copy here the benchmarks of the bench/ directory:
// - compare libraries
// - measure encapsulation cost
// - measure lock cost
// - measure ttl cost
// - measure size.Of cost
// - measure stats collection cost

🤝 Contributing

Don't hesitate ;)

# Install some dev dependencies
make tools

# Run tests
make test
# or
make watch-test

👤 Contributors


💫 Show your support

Give a ⭐️ if this project helped you!


📝 License

Copyright © 2023 Samuel Berthe.

This project is MIT licensed.

Documentation

Index

Constants

This section is empty.

Variables

View Source
var DebounceRevalidationFactor = 0.2

Revalidation is done in batches.

Functions

This section is empty.

Types

type CacheAlgorithm

type CacheAlgorithm int
const (
	LRU CacheAlgorithm = iota
	LFU
	TwoQueue
	ARC
)

type HotCache

type HotCache[K comparable, V any] struct {
	// contains filtered or unexported fields
}

func (*HotCache[K, V]) Algorithm

func (c *HotCache[K, V]) Algorithm() (string, string)

Algorithm returns the cache algorithm.

func (*HotCache[K, V]) Capacity

func (c *HotCache[K, V]) Capacity() (int, int)

Capacity returns the cache capacity.

func (*HotCache[K, V]) Delete

func (c *HotCache[K, V]) Delete(key K) bool

Delete removes a key from the cache.

func (*HotCache[K, V]) DeleteMany

func (c *HotCache[K, V]) DeleteMany(keys []K) map[K]bool

DeleteMany removes many keys from the cache.

func (*HotCache[K, V]) Get

func (c *HotCache[K, V]) Get(key K) (value V, ok bool, err error)

Get returns a value from the cache, a boolean indicating whether the key was found and an error when loaders fail.

func (*HotCache[K, V]) GetMany

func (c *HotCache[K, V]) GetMany(keys []K) (map[K]V, []K, error)

GetMany returns many values from the cache, a slice of missing keys and an error when loaders fail.

func (*HotCache[K, V]) GetManyWithCustomLoaders

func (c *HotCache[K, V]) GetManyWithCustomLoaders(keys []K, customLoaders LoaderChain[K, V]) (map[K]V, []K, error)

GetManyWithCustomLoaders returns many values from the cache, a slice of missing keys and an error when loaders fail.

func (*HotCache[K, V]) GetWithCustomLoaders

func (c *HotCache[K, V]) GetWithCustomLoaders(key K, customLoaders LoaderChain[K, V]) (value V, ok bool, err error)

GetWithCustomLoaders returns a value from the cache, a boolean indicating whether the key was found and an error when loaders fail.

func (*HotCache[K, V]) Has

func (c *HotCache[K, V]) Has(key K) bool

Has checks if a key exists in the cache. Missing values are not valid, even if cached.

func (*HotCache[K, V]) HasMany

func (c *HotCache[K, V]) HasMany(keys []K) map[K]bool

HasMany checks if keys exist in the cache. Missing values are not valid, even if cached.

func (*HotCache[K, V]) Janitor

func (c *HotCache[K, V]) Janitor()

Janitor runs a background goroutine to clean up the cache.

func (*HotCache[K, V]) Keys

func (c *HotCache[K, V]) Keys() []K

Keys returns all keys in the cache. Missing keys are not included.

func (*HotCache[K, V]) Len

func (c *HotCache[K, V]) Len() int

Len returns the number of items in the cache. Missing items are included.

func (*HotCache[K, V]) Peek

func (c *HotCache[K, V]) Peek(key K) (value V, ok bool)

Peek is similar to Get, but does not check expiration and does not call loaders/revalidation. Missing values are not returned, even if cached.

func (*HotCache[K, V]) PeekMany

func (c *HotCache[K, V]) PeekMany(keys []K) (map[K]V, []K)

PeekMany is similar to GetMany, but does not check expiration and does not call loaders/revalidation. Missing values are not returned, even if cached.

func (*HotCache[K, V]) Purge

func (c *HotCache[K, V]) Purge()

Purge removes all items from the cache.

func (*HotCache[K, V]) Range

func (c *HotCache[K, V]) Range(f func(K, V) bool)

Range calls a function for each key/value pair in the cache. The callback should be kept short because it is called while holding a read lock. Missing values are not included. @TODO: loop over missingCache? Use a different callback?

func (*HotCache[K, V]) Set

func (c *HotCache[K, V]) Set(key K, v V)

Set adds a value to the cache. If the key already exists, its value is updated. It uses the default ttl or none.

func (*HotCache[K, V]) SetMany

func (c *HotCache[K, V]) SetMany(items map[K]V)

SetMany adds many values to the cache. If the keys already exist, values are updated. It uses the default ttl or none.

func (*HotCache[K, V]) SetManyWithTTL

func (c *HotCache[K, V]) SetManyWithTTL(items map[K]V, ttl time.Duration)

SetManyWithTTL adds many values to the cache. If the keys already exist, values are updated. It uses the given ttl.

func (*HotCache[K, V]) SetMissing

func (c *HotCache[K, V]) SetMissing(key K)

SetMissing adds a key to the `missing` cache. If the key already exists, its value is dropped. It uses the default ttl or none.

func (*HotCache[K, V]) SetMissingMany

func (c *HotCache[K, V]) SetMissingMany(missingKeys []K)

SetMissingMany adds many keys to the `missing` cache. If the keys already exist, values are dropped. It uses the default ttl or none.

func (*HotCache[K, V]) SetMissingManyWithTTL

func (c *HotCache[K, V]) SetMissingManyWithTTL(missingKeys []K, ttl time.Duration)

SetMissingManyWithTTL adds many keys to the `missing` cache. If the keys already exist, values are dropped. It uses the given ttl.

func (*HotCache[K, V]) SetMissingWithTTL

func (c *HotCache[K, V]) SetMissingWithTTL(key K, ttl time.Duration)

SetMissingWithTTL adds a key to the `missing` cache. If the key already exists, its value is dropped. It uses the given ttl.

func (*HotCache[K, V]) SetWithTTL

func (c *HotCache[K, V]) SetWithTTL(key K, v V, ttl time.Duration)

SetWithTTL adds a value to the cache. If the key already exists, its value is updated. It uses the given ttl.

func (*HotCache[K, V]) StopJanitor

func (c *HotCache[K, V]) StopJanitor()

func (*HotCache[K, V]) Values

func (c *HotCache[K, V]) Values() []V

Values returns all values in the cache. Missing values are not included.

func (*HotCache[K, V]) WarmUp

func (c *HotCache[K, V]) WarmUp(loader func() (map[K]V, []K, error)) error

WarmUp loads all keys from the loader and sets them in the cache.

type HotCacheConfig

type HotCacheConfig[K comparable, V any] struct {
	// contains filtered or unexported fields
}

func NewHotCache

func NewHotCache[K comparable, V any](algorithm CacheAlgorithm, capacity int) HotCacheConfig[K, V]

func (HotCacheConfig[K, V]) Build

func (cfg HotCacheConfig[K, V]) Build() *HotCache[K, V]

func (HotCacheConfig[K, V]) WithCopyOnRead

func (cfg HotCacheConfig[K, V]) WithCopyOnRead(copyOnRead func(V) V) HotCacheConfig[K, V]

func (HotCacheConfig[K, V]) WithCopyOnWrite

func (cfg HotCacheConfig[K, V]) WithCopyOnWrite(copyOnWrite func(V) V) HotCacheConfig[K, V]

func (HotCacheConfig[K, V]) WithJanitor

func (cfg HotCacheConfig[K, V]) WithJanitor() HotCacheConfig[K, V]

func (HotCacheConfig[K, V]) WithJitter

func (cfg HotCacheConfig[K, V]) WithJitter(jitter float64) HotCacheConfig[K, V]

func (HotCacheConfig[K, V]) WithLoaders

func (cfg HotCacheConfig[K, V]) WithLoaders(loaders ...Loader[K, V]) HotCacheConfig[K, V]

func (HotCacheConfig[K, V]) WithMissingCache

func (cfg HotCacheConfig[K, V]) WithMissingCache(algorithm CacheAlgorithm, capacity int) HotCacheConfig[K, V]

WithMissingCache enables caching of missing keys. The missing keys are stored in a separate cache.

func (HotCacheConfig[K, V]) WithMissingSharedCache

func (cfg HotCacheConfig[K, V]) WithMissingSharedCache() HotCacheConfig[K, V]

WithMissingSharedCache enables caching of missing keys. The missing cache is shared with the main cache.

func (HotCacheConfig[K, V]) WithRevalidation

func (cfg HotCacheConfig[K, V]) WithRevalidation(stale time.Duration, loaders ...Loader[K, V]) HotCacheConfig[K, V]

func (HotCacheConfig[K, V]) WithSharding

func (cfg HotCacheConfig[K, V]) WithSharding(nbr uint64, fn sharded.Hasher[K]) HotCacheConfig[K, V]

func (HotCacheConfig[K, V]) WithTTL

func (cfg HotCacheConfig[K, V]) WithTTL(ttl time.Duration) HotCacheConfig[K, V]

func (HotCacheConfig[K, V]) WithWarmUp

func (cfg HotCacheConfig[K, V]) WithWarmUp(fn func() (map[K]V, []K, error)) HotCacheConfig[K, V]

func (HotCacheConfig[K, V]) WithoutLocking

func (cfg HotCacheConfig[K, V]) WithoutLocking() HotCacheConfig[K, V]

type Loader

type Loader[K comparable, V any] func(keys []K) (found map[K]V, err error)

type LoaderChain

type LoaderChain[K comparable, V any] []Loader[K, V]

type Metrics

type Metrics struct {
	Insertion prometheus.Counter
	Eviction  prometheus.Counter

	Hit  prometheus.Counter
	Miss prometheus.Counter

	Length prometheus.Gauge
	Weight prometheus.Gauge

	// settings
	// SettingsCapacity        prometheus.Gauge
	SettingsTTL    prometheus.Gauge
	SettingsJitter prometheus.Gauge
	SettingsStale  prometheus.Gauge
	// SettingsMissingCapacity prometheus.Gauge
	SettingsMissingTTL   prometheus.Gauge
	SettingsMissingStale prometheus.Gauge
}

@TODO:
- Should be simple atomic counters and gauges.
- If metrics are disabled, no need to collect them (use a no-op implementation).
- If prometheus metrics are enabled, collect them in a separate goroutine.
- Collect revalidation count+delay.
- Add a label for the cache name.
- Add comments to metric declarations.
- Weight should be duplicated in order to include *item[V] weight.

func NewMetrics

func NewMetrics(ttl time.Duration, jitter float64, stale time.Duration) *Metrics

Directories

  • pkg/arc
  • pkg/lfu
  • pkg/lru
