Module ncache
Normalization cache (many-to-1 cache)
Use case: storing values by a specific key, except that the same key can have many different representations.
For example IPv6 addresses, where [::1], [::0:1], and [::0001] all refer to the same address. To cache values you need normalization of the key to a single, unique value for all the variants the key can have.
Since the normalization is needed on every cache lookup, it can become too expensive. Hence this library caches not only the values, but also the normalization results, which means that every key-variant only needs to be normalized once.
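The scheme can be sketched in plain Lua. The sketch below illustrates the idea only; it is not the module's actual implementation, and the name sketch_new is made up:

```lua
-- Illustrative sketch of a normalization cache (NOT the actual ncache
-- implementation). The normalizer runs at most once per key-variant;
-- later lookups find the normalized key in key_cache directly.
local function sketch_new(normalizer)
  local key_cache = {}    -- raw key        -> normalized key
  local value_cache = {}  -- normalized key -> value
  local cache = {}

  function cache.set(key, value)
    local nkey = key_cache[key] or normalizer(key)
    key_cache[key] = nkey          -- remember the normalization result
    value_cache[nkey] = value
  end

  function cache.get(key)
    local nkey = key_cache[key] or normalizer(key)
    key_cache[key] = nkey
    return value_cache[nkey]
  end

  return cache
end

local cache = sketch_new(tonumber)
cache.set("5", "five")
print(cache.get(5))   --> "five": 5 and "5" normalize to the same key
```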
When creating a new cache you provide a normalization function, and optionally the cache instances for the two internal caches:
- key_cache : the cache that links the key variant to the normalized key.
- value_cache : the cache that holds the values, indexed by the normalized key.
You can either provide an OpenResty LRU cache, or none, in which case a simple Lua table based cache will be used. In the latter case you have to watch memory usage, as it might grow uncontrollably.
When to use which caches in the new call:
key_cache = nil, value_cache = nil
Data will only be removed when explicitly calling delete. So only use this when the number of normalized keys and their variants is limited; otherwise both caches can grow uncontrolled.
key_cache = nil, value_cache = resty-lru
This protects against too many values, but not against too many variants of a single key, since the key_cache can still grow uncontrolled. When a value gets evicted from the value_cache, all its key-variants will also be removed from the key_cache, based on weak-table references.
key_cache = lru, value_cache = nil
Use this if the number of normalized keys is limited, but the number of variants is not. Whenever a value gets deleted, its key-variants are 'abandoned': they will not be removed from memory immediately, but since they live in an lru cache, they will slowly be evicted there.
key_cache = lru, value_cache = lru
This protects against both types of uncontrolled memory growth. Here too, if a value gets deleted, the key-variants will be abandoned, waiting for the lru mechanism to evict them.
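The weak-table cleanup mentioned for the resty-lru value_cache case can be illustrated in plain Lua. This is a sketch of the mechanism only, not the module's actual internals:

```lua
-- Sketch of weak-table based cleanup (illustrative only). The key_cache
-- holds its entries weakly; each entry is an anchor object kept alive
-- by the corresponding value_cache record. Once the value is evicted,
-- the anchor becomes unreachable and the garbage collector clears all
-- key-variants pointing at it.
local key_cache = setmetatable({}, { __mode = "v" })  -- weak values
local value_cache = {}

do
  local anchor = { nkey = "0:0:0:0:0:0:0:1" }
  key_cache["[::1]"]    = anchor
  key_cache["[::0:1]"]  = anchor
  key_cache["[::0001]"] = anchor
  value_cache["0:0:0:0:0:0:0:1"] = { value = "server1", anchor = anchor }
end

value_cache["0:0:0:0:0:0:0:1"] = nil   -- the value gets evicted
collectgarbage("collect")
collectgarbage("collect")
print(key_cache["[::1]"])  --> nil: the key-variants were collected too
```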
Example 1:
A cache of versioned items based on Semantic Versioning. Many input versions, in different formats, will lead to a limited number of compatible object versions to return. Since the versions will (most likely) be defined in code, both the requested versions and the returned versions will be limited. In this case use the Lua-table based caches by providing nil for both of them, since there is no risk of memory exhaustion.
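A hypothetical normalizer for this use case might reduce every accepted notation to a canonical "major.minor" string. The pattern and the name version_normalizer are assumptions for illustration, not part of the module:

```lua
-- Hypothetical normalizer: maps "v1.2", "1.2.0", "1.2.7" etc. to the
-- compatible version "1.2". Purely illustrative.
local function version_normalizer(v)
  local major, minor = tostring(v):match("^v?(%d+)%.(%d+)")
  if not major then
    return nil, "not a valid version: " .. tostring(v)
  end
  return major .. "." .. minor
end

-- would be used as: local cache = ncache.new(version_normalizer)
```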
Example 2:
Matching incoming requested IPv6 addresses to a limited number of upstream servers. Since we know the upstream servers beforehand, or through some configuration directive, they will be limited.
But the incoming IPv6 addresses are user-provided, and hence one can expect every possible representation of each address to appear at some point (and there are a lot!). Hence for the value_cache (with a limited number of normalized addresses for the upstream services) we can use the Lua-table based cache. But for the key_cache, storing the mapping of every raw key to its normalized key, we must protect against overruns, and hence we use an lru-cache.
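For illustration, a heavily simplified IPv6 normalizer could look like the sketch below. This is an assumption-laden toy (no zone IDs, no embedded IPv4, minimal validation); production code should use a proper address parser:

```lua
-- Toy IPv6 normalizer (illustrative only): strips brackets, lowercases,
-- expands "::", and removes leading zeros per group, so "[::1]",
-- "[::0:1]", and "[::0001]" all normalize to the same string.
local function ipv6_normalizer(addr)
  local s = tostring(addr):lower():gsub("[%[%]]", "")
  local function split(part)
    local t = {}
    for g in part:gmatch("[^:]+") do t[#t + 1] = g end
    return t
  end
  local groups = {}
  local left, right = s:match("^(.-)::(.*)$")
  if left then
    local l, r = split(left), split(right)
    for _, g in ipairs(l) do groups[#groups + 1] = g end
    for _ = 1, 8 - #l - #r do groups[#groups + 1] = "0" end  -- fill "::"
    for _, g in ipairs(r) do groups[#groups + 1] = g end
  else
    groups = split(s)
  end
  if #groups ~= 8 then return nil, "not a valid IPv6 address" end
  for i, g in ipairs(groups) do
    local n = tonumber(g, 16)
    if not n or n > 0xffff then return nil, "not a valid IPv6 address" end
    groups[i] = string.format("%x", n)  -- drop leading zeros
  end
  return table.concat(groups, ":")
end
```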
NOTE: besides the above on cache types, it is important to realize that even if there is no value in the cache, looking one up will still normalize the key. It will store the raw key, the normalized key, and the fact that there is no value for that key. So repeatedly looking up a non-existing key will only normalize the key once. The cost of this optimization is that memory is still used to store the non-existing entry, and hence memory usage grows. Keep this in mind when picking the proper cache types.
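The note above can be demonstrated with a small sketch in plain Lua. The sentinel approach here is one way to illustrate the described behavior, not necessarily how the module implements it internally:

```lua
-- Sketch of negative caching (illustrative only): even a miss stores
-- the normalization result plus a sentinel, so the normalizer runs at
-- most once per key-variant, at the cost of memory for the miss entry.
local NO_VALUE = {}  -- unique sentinel: "looked up, but nothing stored"

local key_cache, value_cache = {}, {}
local normalize_calls = 0

local function get(normalizer, key)
  local nkey = key_cache[key]
  if nkey == nil then
    normalize_calls = normalize_calls + 1
    nkey = normalizer(key)
    key_cache[key] = nkey          -- the normalization result is cached...
  end
  local v = value_cache[nkey]
  if v == nil then
    value_cache[nkey] = NO_VALUE   -- ...and so is the miss itself
    return nil, "key not found"
  elseif v == NO_VALUE then
    return nil, "key not found"
  end
  return v
end

get(tonumber, "42")
get(tonumber, "42")
assert(normalize_calls == 1)  -- normalized once, despite two misses
```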
Info:
- Copyright: 2018 Thijs Schreijer
- License: MIT
- Author: Thijs Schreijer
Functions
- delete (key): Deletes a key/value from the cache.
- flush_all (): Clears the cache.
- get (key): Gets a value from the cache.
- new (normalizer, key_cache, value_cache, value_cache_non_evicting): Creates a new instance of the normalization cache.
- raw_set (raw_key, value): Sets a value in the cache, under its raw key.
- set (key, value): Sets a value in the cache.
Functions
- delete (key)
Deletes a key/value from the cache.
The accompanying value will also be deleted, and all other variants of key will be evicted. To keep the normalization cache of all the key-variants, use set to set the value to nil.
Parameters:
- key: the raw key in a normalizable format
Returns:
true, or nil + error
Usage:
local cache = ncache.new(tonumber)
cache:set(5, "value 5")
print(cache:get(5))   -- "value 5"
cache:set(5, nil)
print(cache:get(5))   -- nil
cache:delete(5)
print(cache:get(5))   -- nil, "key not found"
- flush_all ()
Clears the cache. Removes all values as well as all variants of the normalized keys.
Returns:
true
- get (key)
Gets a value from the cache.
Note: if there is no value, the normalization results will still be stored, so even if nothing is in the cache, memory usage may increase when only getting. To undo this, explicitly delete the key.
Parameters:
- key: the raw key in a normalizable format
Returns:
the value, or nil + error. Note that nil is a valid value, and that the error will be "key not found" if the (normalized) key wasn't found.
Usage:
local cache = ncache.new(tonumber)
cache:set(5, "value 5")
print(cache:get(5))    -- "value 5"
print(cache:get("5"))  -- "value 5"
print(cache:get(6))    -- nil, "key not found"
cache:set(6, nil)
print(cache:get(6))    -- nil
- new (normalizer, key_cache, value_cache, value_cache_non_evicting)
Creates a new instance of the normalization cache.
The cache objects are optional, and are API compatible with the OpenResty lru-cache. If not provided, simple table based caches will be created, without an lru safety mechanism. The value_cache_non_evicting parameter provides a small performance gain if the provided value_cache never evicts any values.
Parameters:
- normalizer: (function) a function that normalizes a key value to a common, unique, non-nil representation
- key_cache: (optional) cache object (get, set, delete, flush_all) where the relation between the raw keys and normalized keys will be stored.
- value_cache: (optional) cache object (get, set, delete, flush_all) where the relation between the normalized keys and values is stored.
- value_cache_non_evicting: (boolean, optional) set to true if the value_cache provided will never evict data by itself.
Returns:
normalization cache
Usage:
-- sample normalizer function
local normalizer = function(key)
  -- normalize everything to a proper number
  local key = tonumber(key)
  if key then return key end
  return nil, "key was not coercable to a number"
end
local cache = ncache.new(normalizer)
- raw_set (raw_key, value)
Sets a value in the cache, under its raw key.
When storing the value, the normalizer function will not be invoked.
Parameters:
- raw_key: the normalized/raw key
- value: the value to store (can be nil)
Returns:
true
Usage:
local cache = ncache.new(tonumber)
cache:raw_set(5, "value 5")
cache:raw_set("5", "why 5?")
print(cache:get(5))    -- "value 5"
print(cache:get("5"))  -- "why 5?"
- set (key, value)
Sets a value in the cache.
Note: nil is a valid value to set; use delete to remove an entry.
Parameters:
- key: the raw key in a normalizable format
- value: the value to store (can be nil)
Returns:
true on success, nil + error on error
Usage:
local cache = ncache.new(tonumber)
cache:set(5, "value 5")
cache:set("5", "why 5?")
print(cache:get(5))    -- "why 5?"
print(cache:get("5"))  -- "why 5?"