fastcache


fastcache - fast off-heap thread-safe in-memory cache for Go

Features
  • Fast. Performance scales on multi-core CPUs. See benchmark results below.
  • Thread-safe. Concurrent goroutines may read and write into a single cache instance.
  • Fastcache is designed for storing a large number of items without GC overhead.
  • Fastcache automatically evicts old entries when it reaches the maximum size set at creation.
  • Simple API. See the quick-start sketch below.
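
A minimal quick-start sketch, assuming the module is imported as github.com/VictoriaMetrics/fastcache and using only the New, Set, Get, Del and Reset calls documented below; the 128MB figure is illustrative:

package main

import (
	"fmt"

	"github.com/VictoriaMetrics/fastcache"
)

func main() {
	// Create a cache capped at roughly 128MB; old entries are evicted
	// automatically once the cache grows beyond this size.
	c := fastcache.New(128 * 1024 * 1024)

	c.Set([]byte("foo"), []byte("bar"))

	// Get appends the stored value to dst; pass nil to obtain a fresh slice.
	v := c.Get(nil, []byte("foo"))
	fmt.Printf("foo=%q\n", v)

	c.Del([]byte("foo"))
	c.Reset() // remove all remaining entries
}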
Benchmarks

Fastcache performance is compared with BigCache and with the standard Go map.

GOMAXPROCS=4 go test ./lib/fastcache/ -run=111 -bench=. -benchtime=10s
goos: linux
goarch: amd64
pkg: github.com/VictoriaMetrics/VictoriaMetrics/lib/fastcache
BenchmarkBigCacheSet-4      	    2000	  11534981 ns/op	   5.68 MB/s	 4660371 B/op	       6 allocs/op
BenchmarkBigCacheGet-4      	    2000	   6950758 ns/op	   9.43 MB/s	  684169 B/op	  131076 allocs/op
BenchmarkBigCacheSetGet-4   	    2000	  11381738 ns/op	   6.33 MB/s	 4712794 B/op	   13112 allocs/op
BenchmarkCacheSet-4         	    3000	   3965871 ns/op	  16.52 MB/s	    7459 B/op	       2 allocs/op
BenchmarkCacheGet-4         	    5000	   2822254 ns/op	  23.22 MB/s	    4475 B/op	       1 allocs/op
BenchmarkCacheSetGet-4      	    3000	   5219393 ns/op	  13.81 MB/s	    7461 B/op	       2 allocs/op
BenchmarkStdMapSet-4        	    2000	  12009655 ns/op	   5.46 MB/s	  268430 B/op	   65537 allocs/op
BenchmarkStdMapGet-4        	    5000	   2747309 ns/op	  23.85 MB/s	    2563 B/op	      13 allocs/op
BenchmarkStdMapSetGet-4     	     300	  44032198 ns/op	   1.64 MB/s	  303889 B/op	   65543 allocs/op
PASS

As the results show, fastcache is faster than BigCache in all cases, and faster than the standard Go map on workloads with inserts.

Limitations
  • Keys and values must be byte slices.
  • The combined size of a (key, value) entry cannot exceed 64KB. See the size-check sketch after this list.
  • There is no cache expiration. Entries are evicted from the cache only on overflow.
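
Since oversized entries cannot be stored, callers that accept arbitrary input may want to check the combined size before calling Set. A minimal sketch; the package name, maxEntrySize and setChecked are illustrative and not part of the fastcache API, and the strict check is a conservative approximation of the documented 64KB limit:

package cacheutil

import (
	"fmt"

	"github.com/VictoriaMetrics/fastcache"
)

// maxEntrySize mirrors the documented 64KB limit on the combined (key, value) size.
const maxEntrySize = 64 * 1024

// setChecked rejects entries that would exceed the limit instead of storing them.
func setChecked(c *fastcache.Cache, k, v []byte) error {
	if len(k)+len(v) >= maxEntrySize {
		return fmt.Errorf("entry too large: %d bytes", len(k)+len(v))
	}
	c.Set(k, v)
	return nil
}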
Architecture details

The cache uses ideas from BigCache:

  • The cache consists of many buckets, each with its own lock. This helps scale performance on multi-core CPUs, since multiple CPUs may concurrently access distinct buckets.
  • Each bucket consists of a hash(key) -> (key, value) position map and 64KB-sized byte slices (chunks) holding encoded (key, value) entries. Each bucket contains only O(chunksCount) pointers. For instance, a 16GB cache would contain ~1M pointers, while a similarly-sized map[string][]byte with short keys and values would contain ~1B pointers, leading to huge GC overhead.

64KB-sized chunks reduce memory fragmentation and total memory usage compared to a single big chunk per bucket.
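
A simplified model of this layout, with illustrative names (bucketsCount, chunkSize, bucketFor) and an FNV hash; the real fastcache internals differ in hashing, eviction and index encoding:

package sketch

import (
	"hash/fnv"
	"sync"
)

const (
	bucketsCount = 512       // number of independently locked buckets (illustrative)
	chunkSize    = 64 * 1024 // each chunk holds encoded (key, value) entries
)

type bucket struct {
	mu     sync.Mutex
	index  map[uint64]int // hash(key) -> offset of the encoded entry in chunks
	chunks [][]byte       // fixed-size chunks; only O(chunksCount) pointers per bucket
}

type cache struct {
	buckets [bucketsCount]bucket
}

// bucketFor picks a bucket by key hash, so goroutines touching different keys
// mostly contend on different locks.
func (c *cache) bucketFor(k []byte) *bucket {
	h := fnv.New64a()
	h.Write(k)
	return &c.buckets[h.Sum64()%bucketsCount]
}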

Users

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Cache

type Cache struct {
	// contains filtered or unexported fields
}

Cache is a fast off-heap thread-safe in-memory cache.

GC performance doesn't depend on the cache size.

Use New for creating new cache instance. Concurrent goroutines may call any Cache methods on the same cache instance.

func New

func New(maxBytes int) *Cache

New returns a new cache with the given maxBytes capacity in bytes.

maxBytes must be smaller than the available RAM size for the app, since the cache holds data in memory.

If maxBytes is less than 64MB, then the minimum cache capacity of 64MB is used.
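
For example (the sizes below are illustrative):

// Cap the cache at roughly 1GB of stored entries.
c := fastcache.New(1024 * 1024 * 1024)

// Values below 64MB fall back to the 64MB minimum capacity.
small := fastcache.New(1024) // effectively a 64MB cache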

func (*Cache) Del

func (c *Cache) Del(k []byte)

Del deletes the value for the given key k from the cache.

k contents may be modified after returning from Del.

func (*Cache) Get

func (c *Cache) Get(dst, k []byte) []byte

Get appends the value for the given key k to dst and returns the result.

k contents may be modified after returning from Get.
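
Because Get appends to dst, a single buffer can be reused across lookups to avoid a fresh allocation on every call. A short sketch, where keys and process are hypothetical placeholders:

var buf []byte
for _, k := range keys {
	// Reset the length (not the capacity) so the same backing array is reused.
	buf = c.Get(buf[:0], k)
	process(buf)
}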

func (*Cache) Reset

func (c *Cache) Reset()

Reset removes all the items from the cache.

func (*Cache) Set

func (c *Cache) Set(k, v []byte)

Set stores (k, v) in the cache.

The stored entry may be evicted at any time, either due to cache overflow or due to an unlikely hash collision. Pass a higher maxBytes value to New if added items disappear frequently.

k and v contents may be modified after returning from Set.
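
Since an entry may be evicted at any time, callers typically treat the cache as best-effort and fall back to the authoritative source on a miss. A sketch of this pattern, where loadValue is a hypothetical loader; note that this sketch treats an empty result as a miss and so cannot distinguish a genuinely empty stored value from a missing entry:

func cachedValue(c *fastcache.Cache, k []byte) []byte {
	if v := c.Get(nil, k); len(v) > 0 {
		return v
	}
	v := loadValue(k) // hypothetical: fetch from DB, file, network, ...
	c.Set(k, v)
	return v
}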

func (*Cache) UpdateStats

func (c *Cache) UpdateStats(s *Stats)

UpdateStats adds cache stats to s.

type Stats

type Stats struct {
	// GetCalls is the number of Get calls.
	GetCalls uint64

	// SetCalls is the number of Set calls.
	SetCalls uint64

	// Misses is the number of cache misses.
	Misses uint64

	// Collisions is the number of cache collisions.
	Collisions uint64

	// EntriesCount is the current number of entries in the cache.
	EntriesCount uint64

	// BytesSize is the current size of the cache in bytes.
	BytesSize uint64
}

Stats represents cache stats.

Use Cache.UpdateStats for obtaining fresh stats from the cache.
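
A short sketch of reading the stats, e.g. to derive a hit ratio; the ratio is computed by the caller, not by fastcache:

var s fastcache.Stats
c.UpdateStats(&s)

hitRatio := 0.0
if s.GetCalls > 0 {
	hitRatio = 1 - float64(s.Misses)/float64(s.GetCalls)
}
fmt.Printf("entries=%d, bytes=%d, hit ratio=%.2f\n", s.EntriesCount, s.BytesSize, hitRatio)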
