ssbc / muxrpc

lightweight multiplexed rpc

*Memo() functions

pfrazee opened this issue

While working through some code, I noticed that it would be convenient to have memoization on RPC methods so that code paths can "fetch once" and then reuse the cached response in subsequent calls.

Consider this snippet:

function render() {
  whoami(function(err, prof) {
    if (err) return console.error(err)
    document.body.querySelector('#user-pubkey').textContent = prof.public
  })
}
var _whoami = null
function whoami(cb) {
  if (_whoami) return cb.apply(null, _whoami)
  rpc.ssb.whoami(function(err, keys) {
    _whoami = [err, keys]
    cb(err, keys)
  })
}

It contains a fair amount of boilerplate for response caching, which we could remove by generating Memo funcs automatically:

function render() {
  rpc.ssb.whoamiMemo(function(err, prof) {
    if (err) return console.error(err)
    document.body.querySelector('#user-pubkey').textContent = prof.public
  })
}
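(A generator for such Memo variants is not something muxrpc provides; the following is only a sketch of what one could do. The name addMemoVariants, the enumeration of rpc.ssb's methods, and the restriction to methods that take just a callback are all assumptions.)

// sketch: add a *Memo() variant for every plain async method on an api object
// (hypothetical helper; only suitable for methods that take a single callback,
//  and it caches errors as well, like the hand-rolled whoami wrapper above)
function addMemoVariants (api) {
  Object.keys(api).forEach(function (name) {
    var fn = api[name]
    if (typeof fn !== 'function') return
    var cached
    api[name + 'Memo'] = function (cb) {
      if (cached) return cb.apply(null, cached)
      fn.call(api, function (err, value) {
        cached = [err, value]
        cb(err, value)
      })
    }
  })
}

addMemoVariants(rpc.ssb) // rpc.ssb.whoamiMemo would now be available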

Or, possibly, by using a monkeypatch:

var memo = require('rpc-memoize')

function render() {
  memo(rpc.ssb.whoami, function(err, prof) {
    if (err) return console.error(err)
    document.body.querySelector('#user-pubkey').textContent = prof.public
  })
}

I'm leaning toward the monkeypatch.

I don't think this should be a feature of muxrpc... there are only some calls that should work like this, and that depends on the application... it would be better to have a memoizer module that was decoupled from muxrpc and then the application can use that where it wants to.

something more like this:

function memoize (fun) {
  var value
  return function (cb) {
    // serve the cached value if we already have one
    if (value) return cb(null, value)
    fun(function (err, _value) {
      // only cache successful results; errors pass through uncached
      if (err) cb(err)
      else cb(null, value = _value)
    })
  }
}

//then apply this function to anything you want to memoize

var whoami = memoize(rpc.whoami)

This eliminates the boilerplate and allows you to create cached functions for just the calls that you make a lot...

but also, what benefit will this have? latency isn't a problem since the calls are only on the local machine.
it doesn't really seem like it calls for HTTP-style caching to me

Your version looks good. I'll put something like that together.

latency isn't a problem since the calls are only on the local machine.

Generally that's true, but I was putting this call in the render function of a small script. I didn't want the render latency, but I also didn't want the boilerplate I described.
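
For reference, the render snippet from the top of this thread, rewritten with the standalone memoize helper, might look something like this sketch (the .bind call is an assumption, in case rpc.ssb.whoami depends on its this binding):

// memoize the RPC call once, then reuse it in every render
var whoami = memoize(rpc.ssb.whoami.bind(rpc.ssb)) // .bind is an assumption

function render() {
  whoami(function (err, prof) {
    if (err) return console.error(err)
    document.body.querySelector('#user-pubkey').textContent = prof.public
  })
}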

closing