bitfaster / BitFaster.Caching

High performance, thread-safe in-memory caching primitives for .NET

Doesn't look like capacity eviction is happening properly (might be specific to the initial population of the warm queue)

Kritner opened this issue

Given this unit test:

[Fact]
public void WhenItemEvictedDueToCapacityShouldNoLongerBeRetrievable()
{
    for (var i = 1; i < 100; i++)
    {
        lru.AddOrUpdate(i, i.ToString());
    }

    bool found;
    string value;

    // Last 3 items should be retrievable
    found = lru.TryGet(99, out value);
    found.Should().BeTrue();
    value.Should().Be(99.ToString());
    
    found = lru.TryGet(98, out value);
    found.Should().BeTrue();
    value.Should().Be(98.ToString());
    
    found = lru.TryGet(97, out value);
    found.Should().BeTrue();
    value.Should().Be(97.ToString());

    // First 3 items should not be retrievable
    found = lru.TryGet(1, out value);
    found.Should().BeFalse();
    
    found = lru.TryGet(2, out value);
    found.Should().BeFalse(); // fails at this point
    
    found = lru.TryGet(3, out value);
    found.Should().BeFalse();
}
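
For reference, lru here is a field on the test class. The fixture isn't shown in the issue, so the setup below is a guess: the class name is hypothetical, and the capacity of 9 is a placeholder, not the value used in the report.

using BitFaster.Caching.Lru;

public class LruEvictionTests
{
    // hypothetical fixture: the issue doesn't show how lru is constructed;
    // ConcurrentLru takes a capacity in its constructor, and 9 is illustrative
    private readonly ConcurrentLru<int, string> lru = new(9);

    // ... tests from above ...
}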

Are my assumptions incorrect here?

Getting this as the keys present in the LRU when debugging:

[screenshot: debugger view of the keys currently present in the LRU]

It looks like the first entry, 1, is successfully evicted, but entries 2, 3, and 4 are for some reason staying in the queue.

FWIW, it looks like items 2, 3, and 4 are for some reason "stuck" in the warm queue:

[screenshot: debugger view of the warm queue containing items 2, 3, and 4]

If I change the test to this instead, I get a passing test:

[Fact]
public void WhenItemEvictedDueToCapacityShouldNoLongerBeRetrievable()
{
    for (var i = 1; i < 100; i++)
    {
        lru.AddOrUpdate(i, i.ToString());

        // sanity check that the newly added item can be grabbed from the cache - note that this will set its accessed property
        lru.TryGet(i, out _).Should().BeTrue();
    }
    
    bool found;
    string value;

    // Last 3 items should be retrievable
    found = lru.TryGet(99, out value);
    found.Should().BeTrue();
    value.Should().Be(99.ToString());
    
    found = lru.TryGet(98, out value);
    found.Should().BeTrue();
    value.Should().Be(98.ToString());
    
    found = lru.TryGet(97, out value);
    found.Should().BeTrue();
    value.Should().Be(97.ToString());

    // First 3 items should not be retrievable
    found = lru.TryGet(1, out value);
    found.Should().BeFalse();
    
    found = lru.TryGet(2, out value);
    found.Should().BeFalse();
    
    found = lru.TryGet(3, out value);
    found.Should().BeFalse();
}

I'm not sure if this is expected behavior or not. Not having read up on the hot/warm/cold policy, I would imagine it's unintended, but I wanted to call it out just in case.

That's expected. ConcurrentLru is a pseudo-LRU; item order doesn't follow a strict LRU sequence.

Once the cache is warmed up, the warm queue requires a second access before admitting new items. This is a simple heuristic that favors frequently used items and filters out scans. Despite its simplicity, this eviction policy yields a high hit rate, and hit rate is the key characteristic, regardless of item order.
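
To make the heuristic concrete, here is a single-threaded sketch of the warm-queue cycle described above. It is illustrative only, not BitFaster's actual implementation: each entry carries an accessed bit that a cache hit sets, and when the warm queue cycles, entries whose bit is set are recycled into warm while the rest are demoted to cold.

using System.Collections.Generic;

class Entry<K, V>
{
    public K Key = default!;
    public V Value = default!;
    public bool WasAccessed; // set on a cache hit, cleared when the entry cycles
}

static class WarmCycle
{
    // Cycle one entry out of a full warm queue: recycle it if it was
    // re-accessed since it last moved, otherwise demote it to cold.
    public static void Cycle<K, V>(Queue<Entry<K, V>> warm, Queue<Entry<K, V>> cold)
    {
        var candidate = warm.Dequeue();

        if (candidate.WasAccessed)
        {
            // the "second access" earns another trip through warm
            candidate.WasAccessed = false;
            warm.Enqueue(candidate);
        }
        else
        {
            // never re-read: demote toward the cold queue and eventual eviction
            cold.Enqueue(candidate);
        }
    }
}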

In your second test all items are touched twice, so you effectively get FIFO order.
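
To see the difference end to end, here is a condensed sketch of the two access patterns against ConcurrentLru itself. The capacity of 9 is illustrative, not the value from the report:

using System;
using BitFaster.Caching.Lru;

var singleTouch = new ConcurrentLru<int, string>(9); // capacity is a placeholder
var doubleTouch = new ConcurrentLru<int, string>(9);

for (var i = 1; i < 100; i++)
{
    // written once, never read: the accessed bit stays clear, so which
    // items survive is up to the pseudo-LRU's hot/warm/cold cycling
    singleTouch.AddOrUpdate(i, i.ToString());

    // written then read: the accessed bit is set on every item, so once
    // the cache is warm the queues cycle uniformly and eviction is FIFO
    doubleTouch.AddOrUpdate(i, i.ToString());
    doubleTouch.TryGet(i, out _);
}

Console.WriteLine(singleTouch.TryGet(2, out _)); // may print True: 2 can get "stuck" in warm
Console.WriteLine(doubleTouch.TryGet(2, out _)); // False: evicted in insertion order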

Cool, thanks for the explanation!