bvaughn / suspense

Utilities for working with React Suspense

Home Page: https://suspense.vercel.app/

Feedback

cevr opened this issue

Sorry if this isn't the best place to put this; feel free to move it to a discussion!

Here's a little app I made for demonstration purposes: https://stackblitz.com/edit/vitejs-vite-v6umf3?file=src/App.tsx

After using this library for a couple of days, I've gathered some feedback:

  • Would it be possible to add an invalidate function to the cache?
    The implementation I currently use is:
import type { Cache } from "suspense";

// Drop the cached record for these params, then immediately re-fetch it.
function invalidate<TParams extends any[], TValue>(
  cache: Cache<TParams, TValue>,
  ...args: TParams
) {
  cache.evict(...args);
  cache.prefetch(...args);
}
  • Would it be possible to notify subscribers when an eviction happens? Currently no notification is sent, unless it's an evictAll, I believe.
  • Similarly, evicting and then forcing an update does not cause the Suspense fallback to show. Is that intended? You can see it in the app above by going to the posts and clicking "evict current page".
  • The current eviction policy seems unpredictable and aggressive, sometimes causing surprising evictions. In the app above, if you go to the posts, you'll see that pressing the next button sometimes blows away the entire cache, forcing a Suspense fallback. You'll also notice it in the users demo (loading a new user sometimes evicts all previously loaded users).

One possible strategy would be to let the user provide their own implementation of the internal cache store, leaving the eviction policy up to them. I would personally use an LRU cache, but others might prefer a different strategy; providing that inversion of control could satisfy all use cases. What do you think? A rough sketch of the idea is below.
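
As a purely hypothetical example, the kind of store a user could plug in might look like the following. CacheStorage and LRUCacheStorage are illustrative names, not part of the current API:

// Hypothetical: the shape of a user-supplied storage layer. None of these
// names exist in the current API; they only illustrate the idea.
interface CacheStorage<TValue> {
  get(key: string): TValue | undefined;
  has(key: string): boolean;
  set(key: string, value: TValue): void;
  delete(key: string): boolean;
  clear(): void;
}

// A minimal LRU store built on Map, which preserves insertion order.
class LRUCacheStorage<TValue> implements CacheStorage<TValue> {
  private map = new Map<string, TValue>();

  constructor(private readonly maxSize: number) {}

  get(key: string): TValue | undefined {
    if (!this.map.has(key)) {
      return undefined;
    }
    const value = this.map.get(key)!;
    // Re-insert to mark this entry as most recently used.
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }

  has(key: string): boolean {
    return this.map.has(key);
  }

  set(key: string, value: TValue): void {
    if (this.map.has(key)) {
      this.map.delete(key);
    }
    this.map.set(key, value);
    if (this.map.size > this.maxSize) {
      // Evict the least recently used entry (the first key in insertion order).
      const oldestKey = this.map.keys().next().value;
      if (oldestKey !== undefined) {
        this.map.delete(oldestKey);
      }
    }
  }

  delete(key: string): boolean {
    return this.map.delete(key);
  }

  clear(): void {
    this.map.clear();
  }
}

A hypothetical option on createCache could then accept a store with this shape, and the library would never have to guess an eviction policy.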

  • I commonly want to know whether the current cache item is revalidating (without showing a Suspense fallback). How does an API like this sound for read:
const [value, revalidating] = Cache.read()

My current implementation uses this helper function:

import { useCacheStatus } from "suspense";
import type { Cache } from "suspense";

// Suspends on a cache miss, but also reports whether the cached value is
// currently being revalidated in the background.
function useRead<TParams extends any[], TValue>(
  cache: Cache<TParams, TValue>,
  ...args: TParams
) {
  const value = cache.read(...args);
  const status = useCacheStatus(cache, ...args);
  return [value, status === 'pending'] as const;
}
  • Provide built-in hooks? This one is not a big deal, and philosophically I can understand it not being part of the cache, but something like useCacheStatus(cache, ...) could become Cache.useStatus(...), etc.
  • The current getValueIfCached throws if the record was rejected, which makes it cumbersome in some cases. Thoughts on providing a cache.peek function that returns the raw record or undefined?
  • In my implementation I paired it with a cache.evictIf(predicate, ...args) function, which was useful in error boundaries to evict a rejected record before resetting the boundary (a sketch of both helpers follows this list).
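
For reference, here's a minimal userland sketch of both helpers. It assumes the cache exposes a getStatus(...args) method and that a Status type is exported (useCacheStatus suggests they exist, but treat the names and signatures here as guesses rather than the library's API):

import type { Cache, Status } from "suspense";

// Like getValueIfCached, but never throws: a rejected record reads as a miss.
function peek<TParams extends any[], TValue>(
  cache: Cache<TParams, TValue>,
  ...args: TParams
): TValue | undefined {
  try {
    return cache.getValueIfCached(...args);
  } catch {
    // getValueIfCached currently re-throws a rejected record's error.
    return undefined;
  }
}

// Evict only when the record's current status matches the predicate, e.g.
// evictIf((status) => status === "rejected", userId) before resetting an
// error boundary.
function evictIf<TParams extends any[], TValue>(
  cache: Cache<TParams, TValue>,
  predicate: (status: Status) => boolean,
  ...args: TParams
): void {
  if (predicate(cache.getStatus(...args))) {
    cache.evict(...args);
  }
}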

Some other points:

  • subscribeToStatus calls the subscription callback immediately, causing an unnecessary render on mount when used with useSyncExternalStore.
  • The type inference feels lacking; I think the load function should provide all the type information needed instead of requiring the generics to be supplied manually (see the sketch after this list).
  • not-started seems like an odd status. I believe it effectively means a cache miss, unless a record can actually be set with the not-started status? If not, I think undefined is a better descriptor for a cache miss.
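
To illustrate the inference point, here's a rough sketch of a createCache-style signature where both generics flow from the load callback. The function name, option shape, and User type are made up for the example; the real options object is more involved:

import type { Cache } from "suspense";

type User = { id: string; name: string };

// Hypothetical, simplified signature: TParams and TValue are inferred from
// the load callback instead of being written out at the call site.
declare function createCacheInferred<TParams extends any[], TValue>(options: {
  load: (...params: TParams) => Promise<TValue>;
}): Cache<TParams, TValue>;

// Inferred as Cache<[string], User>, with no manual type arguments.
const userCache = createCacheInferred({
  load: async (userId: string): Promise<User> => {
    const response = await fetch(`/api/users/${userId}`);
    return (await response.json()) as User;
  },
});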

Mind moving this to one or more discussion threads? I think it's better suited for that.