epicweb-dev / cachified

πŸ€‘ wrap virtually everything that can store by key to act as cache with ttl/max-age, stale-while-validate, parallel fetch protection and type-safety support

Repository: https://github.com/epicweb-dev/cachified

bug in Batch requesting

uwemaurer opened this issue Β· comments

I am using cachified in a web server, and some requests were taking very long: looking at the logs, simple lookups that should take milliseconds sometimes completed only after several minutes or even hours.

I tried to reproduce the problem starting from the batch example in the README.

import cachified, { createBatch } from "@epic-web/cachified";

const cache = new Map();

function delay(ms: number) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

async function getFreshValues(idsThatAreNotInCache: number[]) {
  await delay(500);
  console.log(`fetching ${idsThatAreNotInCache}`);
  return idsThatAreNotInCache.map((a) => `hello ${a}`);
}

function getUsersWithId(ids: number[]) {
  const batch = createBatch(getFreshValues);

  return Promise.all(
    ids.map((id) =>
      cachified({
        cache,
        getFreshValue: batch.add(id),
        key: `entry-${id}`,
        ttl: 60_000,
      }),
    ),
  );
}

async function main() {
  const a = getUsersWithId([1, 2, 3]).then((results) => {
    console.log(results);
  });
  
  // const input = [1, 2, 3]; // fully cached
  // const input = [5, 6, 7]; // all new
  const input = [1, 2, 5]; // partially cached
  const b = getUsersWithId(input).then((results) => {
    console.log(results);
  });

  await a;
  await b;

  console.log("waiting 10s");
  await delay(10000);
}

await main();

When I run it with the "fully cached" input, it works as expected and prints:

fetching 1,2,3
[ 'hello 1', 'hello 2', 'hello 3' ]
[ 'hello 1', 'hello 2', 'hello 3' ]
waiting 10s

With the "all new" input it also works and prints:

fetching 1,2,3
[ 'hello 1', 'hello 2', 'hello 3' ]
fetching 5,6,7
[ 'hello 5', 'hello 6', 'hello 7' ]
waiting 10s

But now the problem: when the input is "partially cached", it prints

fetching 1,2,3
[ 'hello 1', 'hello 2', 'hello 3' ]
Warning: Detected unsettled top-level await at /home/uwe/cache-test.ts:49
await main();
^

I think this is what happened in our web server: the promise for a partially cached batch request somehow never completes, or takes very long to do so.
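The symptoms are consistent with the batch waiting for every registered id to actually request a fresh value before submitting the fetch; ids that are answered from the cache never invoke their `getFreshValue`, so the batch never fires and the remaining promises hang. The following is a self-contained toy sketch of that *suspected* mechanism — it is not cachified's real implementation, just an illustration:

```typescript
// Toy batch: submits only once every added id has requested its value.
// If some ids are served from a cache and never request one, `fetchAll`
// is never called and the uncached ids' promises hang forever.
function createToyBatch(fetchAll: (ids: number[]) => Promise<string[]>) {
  const pending: { id: number; resolve: (v: string) => void }[] = [];
  let added = 0;
  let requested = 0;
  return {
    add(id: number) {
      added++; // every id is registered with the batch up front
      // The returned function plays the role of getFreshValue:
      // it is only ever called when the key is NOT in the cache.
      return () =>
        new Promise<string>((resolve) => {
          pending.push({ id, resolve });
          requested++;
          if (requested === added) {
            // Submit only when *all* registered ids have checked in.
            fetchAll(pending.map((p) => p.id)).then((values) =>
              values.forEach((v, i) => pending[i].resolve(v)),
            );
          }
        });
    },
  };
}

const cache = new Map<number, string>([[1, 'hello 1']]);
const batch = createToyBatch(async (ids) => ids.map((id) => `hello ${id}`));

async function lookup(id: number): Promise<string> {
  const getFreshValue = batch.add(id); // every id is registered
  const cached = cache.get(id);
  if (cached !== undefined) return cached; // fresh value never requested
  return getFreshValue();
}

// id 1 is cached, id 5 is not: the batch waits for 2 check-ins but only
// ever sees 1, so the promise for id 5 never settles.
```

In this sketch the "partially cached" case hangs for exactly the reason observed above: the submit condition (`requested === added`) can never become true.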

I also tried without auto-submitting, like this:

function getUsersWithId(ids: number[]) {
  const batch = createBatch(getFreshValues, false);

  const result = Promise.all(
    ids.map((id) =>
      cachified({
        cache,
        getFreshValue: batch.add(id),
        key: `entry-${id}`,
        ttl: 60_000,
      }),
    ),
  );
  return batch.submit().then(() => result);
}

Then the problem also happens for the "fully cached" input.

And one more bug: if there is a duplicated key in the batch, like this:

async function main() {
  await getUsersWithId([1, 2, 3, 1]).then((results) => {
    console.log(results);
  });
}

then it also doesn't return any results and instead gives the warning `Warning: Detected unsettled top-level await`.
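One plausible mechanism for the duplicate-key case (an assumption on my part, though it matches the maintainer's later note that it was the same bug): cachified's parallel-fetch protection deduplicates concurrent requests for the same key, so the second `entry-1` lookup piggybacks on the first in-flight request and its `getFreshValue` is never invoked — again leaving the batch waiting for a check-in that never comes. A toy sketch of that deduplication, not cachified's actual code:

```typescript
// Parallel-fetch protection in miniature: concurrent lookups for the
// same key share one in-flight promise, so getFreshValue runs only once.
const inFlight = new Map<string, Promise<string>>();
let freshCalls = 0; // counts actual getFreshValue invocations

function lookupOnce(key: string, getFreshValue: () => Promise<string>) {
  let p = inFlight.get(key);
  if (!p) {
    p = getFreshValue().finally(() => inFlight.delete(key));
    inFlight.set(key, p);
  }
  return p; // second concurrent caller reuses the first promise
}

// For ids [1, 2, 3, 1], batch.add would be called four times, but the
// second 'entry-1' reuses the first one's in-flight promise: only three
// getFreshValue calls ever reach a batch that expects four.
```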

Hi @uwemaurer, thank you for reporting this. I can reproduce the issue and am looking into it.

Definitely a good catch. Should be resolved in the next version.

πŸŽ‰ This issue has been resolved in version 5.2.1 πŸŽ‰

Your semantic-release bot πŸ“¦πŸš€

Thank you!

Oh, I haven't actually looked into the duplicated-key case yet... gonna do that soon:

async function main() {
  await getUsersWithId([1, 2, 3, 1]).then((results) => {
    console.log(results);
  });
}

This was the same bug, and it was fixed in 5.2.1.