janl / mustache.js

Minimal templating with {{mustaches}} in JavaScript

Home Page: https://mustache.github.io

Allow internal cache to be disabled

AndrewLeedham opened this issue · comments

I am running into memory limits when compiling lots of mustache.js templates in one go (for testing purposes). I want to disable mustache's internal cache and implement my own that would evict stale templates. This is currently not possible; the only API available is cache clearing. Has this been considered for this project? If not, does it seem like a fit for the library? Happy to put in a PR.
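
For context, here is a minimal stand-in (not the real mustache.js implementation) illustrating why heavy template churn exhausts memory, assuming the cache-key scheme discussed later in this thread: every distinct template string adds an entry that is never evicted.

```javascript
// Stand-in for mustache.js's internal cache -- a plain object whose keys
// follow the template + ':' + tags.join(':') scheme and are never evicted.
const cache = {};

function parse(template, tags) {
  const cacheKey = template + ':' + (tags || ['{{', '}}']).join(':');
  if (cache[cacheKey] == null) {
    cache[cacheKey] = ['tokens for ' + template]; // placeholder for parseTemplate()
  }
  return cache[cacheKey];
}

// A test suite rendering many unique templates grows the cache without bound.
for (let i = 0; i < 1000; i++) {
  parse('Hello {{name' + i + '}}!');
}
console.log(Object.keys(cache).length); // 1000 entries, none reclaimable
```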

Possible workarounds:

  • Clear the cache after every component render; this seems inefficient.
  • Monkey-patch the Writer.prototype.parse function to skip the caching. This is the best current solution, but having it as a first-class API would be better.
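
The monkey-patch workaround can be sketched as follows. `Writer` below is a stand-in with the same `parse`/`clearCache` shape as mustache.js's `Writer`; in a real project you would patch `mustache.Writer.prototype` instead:

```javascript
// Stand-in mirroring the shape of mustache.js's Writer (parse + clearCache).
function Writer() { this.cache = {}; }

Writer.prototype.clearCache = function () { this.cache = {}; };

Writer.prototype.parse = function (template) {
  if (this.cache[template] == null) {
    this.cache[template] = ['tokens for ' + template]; // placeholder parse
  }
  return this.cache[template];
};

// The monkey patch: wrap the original parse and clear the cache afterwards,
// so no cached entry outlives the call that created it.
const originalParse = Writer.prototype.parse;
Writer.prototype.parse = function (template, tags) {
  const tokens = originalParse.call(this, template, tags);
  this.clearCache();
  return tokens;
};

const writer = new Writer();
writer.parse('Hello {{name}}!');
console.log(Object.keys(writer.cache).length); // 0 -- nothing retained
```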

That's an interesting question!

I remember there have been questions about making the cache smarter and offering different cache strategies, but we've been reluctant to implement & maintain that as part of this package.

Opening up for others to override the caching strategy is a different story, and probably something we should have asked ourselves before? 🤷‍♂ I totally agree with your suggestion of exposing this as a first-class API, rather than forcing others to monkey-patch the inner workings.

I'd happily review a PR that enables this 👍 In general, as few & focused changes as possible would be better, compared to turning the source upside down to make this happen. Let me know if you have more questions or want input.

Thanks for getting back to me @phillipj. I have put a PR in with a basic disabling mechanism, but it does not particularly address the monkey patching concern, just makes it easier. See 88457ab#diff-66c3acc3931379ca3437082f30ec7aecR175-R185 for an example of how a user would override the cache.

Perhaps a better approach would be to split the cache retrieval into a separate function, then expose a function that accepts a predicate which handles caching?
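
That predicate idea could be sketched roughly like this (a hypothetical API, not mustache.js's actual one): retrieval and storing become separate functions, and a user-supplied predicate decides whether a parsed template gets cached at all.

```javascript
// Hypothetical predicate-based caching API -- not part of mustache.js.
const cache = {};

// Default predicate: cache everything. A user could swap in their own,
// e.g. only cache small templates, or cache nothing at all.
let shouldCache = function (cacheKey, tokens) { return true; };

function getCached(cacheKey) { return cache[cacheKey]; }

function storeCached(cacheKey, tokens) {
  if (shouldCache(cacheKey, tokens)) cache[cacheKey] = tokens;
}

function parse(template) {
  const cacheKey = template; // simplified; the real key also includes tags
  let tokens = getCached(cacheKey);
  if (tokens == null) {
    tokens = ['tokens for ' + template]; // placeholder for parseTemplate()
    storeCached(cacheKey, tokens);
  }
  return tokens;
}

// Disable caching entirely by supplying an always-false predicate.
shouldCache = function () { return false; };
parse('Hello {{name}}!');
console.log(Object.keys(cache).length); // 0
```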

Your PR surely looks like it delivers on the need for disabling the cache completely when needed.

Perhaps a better approach would be to split the cache retrieval into a separate function, then expose a function that accepts a predicate which handles caching?

That's also an interesting idea! Thinking out loud, without considering how that would be implemented in practice: extracting the cache retrieval & storing into separate methods could potentially allow other caching strategies to be provided, without those strategies having to be implemented in mustache.js itself.

Building upon that idea and wanting to disable the cache, that could be accomplished by providing either a predicate as you mention or a no-op cache storing method 🤔

Whether we actually need and should think about solving these use cases is up for discussion though.

Perhaps just allow disabling for now; then a user can monkey-patch parse if necessary. Provide an example somewhere? Then open an issue to discuss an API for providing a custom caching strategy?

FYI I haven't forgotten about this issue. Still revisiting the pros & cons here from time to time.

With the amount of downloads this package gets, I'm deliberately a bit conservative when it comes to adding new external API methods, as they'll have to be supported for a long time going forward.

No worries, completely understand. I currently have a workaround (clearing the cache after each parse), so this is not blocking. Happy to update the PR with whatever the outcome is :)

Alrighty, so here are some early ramblings that have started to grow on me. I tried not to overengineer this, but rather to focus on keep-it-simple-stupid while still being able to:

  1. allow the internal template cache to be configurable by the using project
  2. disable the internal template cache completely

Inspired by your changes in #731, rather than only enabling/disabling the current template cache logic, what about allowing a template cache object to be provided or set to undefined to disable it?

Such a template cache object needs three methods (.set(key, value) | .get(key) | .clear()) and provides a good amount of headroom, both for ourselves internally going forward and for using projects to be creative when they need to.

The changes to Writer.parse() could look a lot like your proposed solution:

diff --git a/mustache.mjs b/mustache.mjs
index abbbd6d..74efe27 100644
--- a/mustache.mjs
+++ b/mustache.mjs
@@ -497,10 +497,13 @@ Writer.prototype.clearCache = function clearCache () {
 Writer.prototype.parse = function parse (template, tags) {
   var cache = this.cache;
   var cacheKey = template + ':' + (tags || mustache.tags).join(':');
-  var tokens = cache[cacheKey];
+  var isCacheEnabled = typeof cache !== 'undefined';
+  var tokens = isCacheEnabled ? cache.get(cacheKey) : undefined;

-  if (tokens == null)
-    tokens = cache[cacheKey] = parseTemplate(template, tags);
+  if (tokens == undefined) {
+    tokens = parseTemplate(template, tags);
+    isCacheEnabled && cache.set(cacheKey, tokens);
+  }

   return tokens;
 };

A few pseudo-examples of how different implementations of the template cache could look:

const mustache = require('mustache');

// 1: Use a Map (default in the future?)
mustache.templateCache = new Map();

// 2: Default built-in cache
mustache.templateCache = {
  _cache: {},
  set: function (key, value) {
    this._cache[key] = value;
  },
  get: function (key) {
    return this._cache[key];
  },
  clear: function () {
    this._cache = {};
  }
};

// 3: Disable cache
mustache.templateCache = undefined;

// 4: Use lru-cache
const LRU = require('lru-cache');
const lruCache = new LRU(50);
lruCache.clear = lruCache.reset.bind(lruCache);

mustache.templateCache = lruCache;

Any thoughts? Don't hesitate to shout if it doesn't make sense to you.

Forgot to add that disabling the cache via the somewhat low-level mustache.templateCache = undefined operation feels acceptable, because adjusting this internal cache feels like an "advanced feature" to me, knowing we haven't been flooded with requests to make this happen throughout the years this package has existed.

I really like this. A very elegant solution.

Are you happy for me to update the PR with this?

Cool, glad to hear it makes sense to someone else as well 😄

Are you happy for me to update the PR with this?

That would be perfect!

@AndrewLeedham and @phillipj, would you mind if I ask for an example of how Mustache led to the memory leak, and what your custom cache looks like?

@timratha I am currently using the 4th example @phillipj provided here: #730 (comment)

The issue itself is harder to convey; it was happening in CI, where the container would occasionally run out of memory and fail. We initially fixed it by increasing the container size and increasing Node's memory limit, but after doing that a second time I thought I would try to debug why the usage was so high. That research led me to this issue :)