YunSuk-Yeo / clru-rs

An LRU cache implementation with constant-time operations and weighted semantics.

CLru

Another LRU cache implementation in Rust. It has two main characteristics that differentiate it from other implementations:

  1. It is backed by a HashMap: it offers O(1) time complexity (amortized average) for common operations like:

    • get / get_mut
    • put / pop
    • peek / peek_mut
  2. It is a weighted cache: each key-value pair has a weight, and the capacity serves both as:

    • a limit on the number of elements
    • and a limit on the total weight of its elements

    using the following formula:

    CLruCache::len + CLruCache::weight <= CLruCache::capacity

Even though most operations don't depend on the number of elements in the cache, CLruCache::put_with_weight has a special behavior: because it needs to make room for the new element, it evicts as many of the least recently used elements as necessary. In the worst case, this can fully empty the cache. Additionally, if the weight of the new element is too big, the insertion can fail.
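
As an illustration, here is a minimal sketch (not taken from the crate's documentation) with a hypothetical StrLenScale that weighs each entry by the byte length of its value; it assumes that put_with_weight returns an Err when an element can never fit:

use std::num::NonZeroUsize;
use clru::{CLruCache, CLruCacheConfig, WeightScale};

// Hypothetical scale for illustration: an entry weighs as many units as its value has bytes.
struct StrLenScale;

impl WeightScale<String, String> for StrLenScale {
    fn weight(&self, _key: &String, value: &String) -> usize {
        value.len()
    }
}

let mut cache = CLruCache::with_config(
    CLruCacheConfig::new(NonZeroUsize::new(8).unwrap()).with_scale(StrLenScale),
);

// Both entries fit: len (2) + weight (2 + 3) <= capacity (8).
assert_eq!(cache.put_with_weight("a".to_string(), "xx".to_string()).unwrap(), None);
assert_eq!(cache.put_with_weight("b".to_string(), "yyy".to_string()).unwrap(), None);

// Adding a weight of 2 more would bring the total to len (3) + weight (7) = 10 > 8,
// so the least recently used entry ("a") is evicted to make room.
assert_eq!(cache.put_with_weight("c".to_string(), "zz".to_string()).unwrap(), None);
assert!(cache.get("a").is_none());
assert!(cache.get("b").is_some());

// A value heavier than the whole cache can never fit, so the insertion fails.
assert!(cache.put_with_weight("d".to_string(), "way too heavy".to_string()).is_err());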

For the common case of an LRU cache whose elements don't have a weight, a default ZeroWeightScale is provided and unlocks some useful APIs (the first example below uses this default scale).

Disclaimer

Most of the API, documentation, examples and tests have been heavily inspired by the lru crate. I want to thank jeromefroe for his work, without which this crate would probably never have been released.

Differences with lru

The main differences are:

  • Smaller amount of unsafe code. Unsafe code is not bad in itself as long as it is thoroughly reviewed and understood, but it can be surprisingly hard to get right. Reducing the amount of unsafe code should hopefully reduce bugs and undefined behavior.
  • API closer to the standard HashMap collection, which allows lookups with a Borrow-ed version of the key (e.g. a cache keyed by String can be queried with a plain &str, as sketched below).
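
For instance, a minimal sketch of this Borrow-based lookup (mirroring the examples below):

use std::num::NonZeroUsize;
use clru::CLruCache;

let mut cache = CLruCache::new(NonZeroUsize::new(4).unwrap());
cache.put("key".to_string(), 1);

// The cache is keyed by String, but a plain &str is enough for the lookup.
assert_eq!(cache.get("key"), Some(&1));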

Example

Below are simple examples of how to instantiate and use this LRU cache.

Using the default ZeroWeightScale:

use std::num::NonZeroUsize;
use clru::CLruCache;

// Create a cache that holds at most 2 unweighted elements.
let mut cache = CLruCache::new(NonZeroUsize::new(2).unwrap());
cache.put("apple".to_string(), 3);
cache.put("banana".to_string(), 2);

assert_eq!(cache.get("apple"), Some(&3));
assert_eq!(cache.get("banana"), Some(&2));
assert!(cache.get("pear").is_none());

assert_eq!(cache.put("banana".to_string(), 4), Some(2));
assert_eq!(cache.put("pear".to_string(), 5), None);

assert_eq!(cache.get("pear"), Some(&5));
assert_eq!(cache.get("banana"), Some(&4));
assert!(cache.get("apple").is_none());

// Values can be modified in place through get_mut.
{
    let v = cache.get_mut("banana").unwrap();
    *v = 6;
}

assert_eq!(cache.get("banana"), Some(&6));

Using a custom WeightScale implementation:

use std::num::NonZeroUsize;
use clru::{CLruCache, CLruCacheConfig, WeightScale};

// A scale that weighs each entry by the byte length of its value.
struct CustomScale;

impl WeightScale<String, &str> for CustomScale {
    fn weight(&self, _key: &String, value: &&str) -> usize {
        value.len()
    }
}

// With this scale, capacity 6 bounds the element count plus the total weight of the values.
let mut cache = CLruCache::with_config(
    CLruCacheConfig::new(NonZeroUsize::new(6).unwrap()).with_scale(CustomScale),
);

assert_eq!(cache.put_with_weight("apple".to_string(), "red").unwrap(), None);
assert_eq!(
    cache.put_with_weight("apple".to_string(), "green").unwrap(),
    Some("red")
);

assert_eq!(cache.len(), 1);
assert_eq!(cache.get("apple"), Some(&"green"));

Tests

Each contribution is tested with the regular compiler, Miri, and 4 flavors of sanitizer (address, memory, thread and leak). This should help catch bugs sooner rather than later.

TODO

  • improve documentation and add examples

License

MIT License
