HybridCache

A multi-level cache library with cache stampede prevention for Go

Overview

import "github.com/cshum/hybridcache"

// Redis cache adapter based on Redigo
redisCache := cache.NewRedis(&redis.Pool{...})

// Memory cache adapter based on Ristretto
memoryCache := cache.NewMemory(1e5, 1<<29, time.Hour)
// bounded to a maximum of ~100,000 items, ~512 MB memory size, 1 hour ttl

// Hybrid cache adapter with Redis upstream + Memory downstream
hybridCache := cache.NewHybrid(redisCache, memoryCache)

The hybrid combination lets the Redis upstream coordinate across multiple servers, while the Memory downstream keeps network I/O to a minimum for the fastest response time. Should the Redis upstream fail, the Memory downstream will still operate independently without service disruption.
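
For reference, a minimal sketch of a redigo pool that could back cache.NewRedis; the address and pool limits below are placeholder assumptions, not library defaults:

// illustrative redigo pool (github.com/gomodule/redigo/redis);
// the address and pool sizes are placeholders
pool := &redis.Pool{
	MaxIdle:     3,
	IdleTimeout: 240 * time.Second,
	Dial: func() (redis.Conn, error) {
		return redis.Dial("tcp", "localhost:6379")
	},
}
redisCache := cache.NewRedis(pool)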

// wrap function call with hybrid cache
cacheFunc := cache.NewFunc(hybridCache, time.Second*20, time.Minute, time.Hour)
// 20 seconds execution timeout, 1 minute fresh-for timeout, 1 hour ttl
var items []*Items
someKey := fmt.Sprintf("key-%d", id)
if err := cacheFunc.Do(ctx, someKey, func(ctx context.Context) (interface{}, error) {
	return someHeavyOperations(ctx, id)
}, &items); err != nil {
	return err
}
for _, item := range items {
	...
}


// use as an HTTP middleware
cacheHandler := cache.NewHTTP(hybridCache, time.Second*30, time.Minute, time.Hour*12).Handler
// 30 seconds request timeout, 1 minute fresh-for timeout, 12 hour ttl
// setting a high ttl will enable "always online" in case of service disruption.
// content is lazily refreshed in the background (goroutine) after the fresh-for timeout

r := chi.NewRouter()
r.Use(cacheHandler)
r.Mount("/", myWebServices)
http.ListenAndServe(":3001", r)

A simple function wrapper or HTTP middleware gives you the following under the hood:

  • Lazy background refresh with timeout - after the fresh-for timeout is exceeded, the next cache hit triggers a refresh in a goroutine, with a context deadline detached from the parent and based on the wait-for timeout.
  • Cache stampede prevention - uses singleflight for in-memory call suppression and SET NX for Redis (see the sketch after this list).
  • Marshal and unmarshal options for function calls - defaults to msgpack, with options to configure your own.
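
To illustrate the call suppression, a minimal sketch reusing cacheFunc, Items and someHeavyOperations from the earlier example: many concurrent callers requesting the same key should result in only one execution of the wrapped function.

// a minimal sketch of stampede prevention; only one someHeavyOperations
// call is expected to run even though many goroutines request the same key
var wg sync.WaitGroup
for i := 0; i < 10; i++ {
	wg.Add(1)
	go func() {
		defer wg.Done()
		var items []*Items
		if err := cacheFunc.Do(ctx, "key-42", func(ctx context.Context) (interface{}, error) {
			return someHeavyOperations(ctx, 42)
		}, &items); err != nil {
			log.Println(err)
		}
	}()
}
wg.Wait()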

Conditional caching with cache.ErrNoCache:

var items []*Items
someKey := fmt.Sprintf("key-%d", id)
if err := cacheFunc.Do(ctx, someKey, func(ctx context.Context) (interface{}, error) {
	start := time.Now()
	items, err := someHeavyOperations(ctx, id)
	if err != nil {
		return nil, err
	}
	if time.Since(start) < time.Millisecond*300 {
		// no cache if response within 300 milliseconds
		return items, cache.ErrNoCache
	}
	return items, nil
}, &items); err != nil {
	// ErrNoCache will NOT appear outside
	return err
}

More options:

cacheFunc := cache.NewFunc(hybridCache, time.Second*20, time.Minute, time.Hour)
cacheFunc.Marshal = json.Marshal
cacheFunc.Unmarshal = json.Unmarshal
// custom Marshal and Unmarshal functions, default msgpack


h := cache.NewHTTP(hybridCache, time.Second*30, time.Minute, time.Hour*12)
h.RequestKey = func(r *http.Request) string {
	// cache key by url excluding query params
	return strings.Split(r.URL.String(), "?")[0]
}
h.AcceptRequest = func(r *http.Request) bool {
	if strings.Contains(r.URL.RawQuery, "nocache") {
		// no cache if nocache appears in query
		return false
	}
	return true
}
h.AcceptResponse = func(res *http.Response) bool {
	if res.StatusCode != http.StatusOK {
		// no cache if status code not 200
		return false
	}
	if res.ContentLength >= 1<<20 {
		// no cache if body is 1 MB or larger
		return false
	}
	return true
}
cacheHandler := h.Handler