baconjs / bacon.js

Functional reactive programming library for TypeScript and JavaScript

Home Page: https://baconjs.github.io

Wrap ES 2018 Asynchronous Iterators

semmel opened this issue · comments

I'd like to consume ES 2018 asynchronous iterators (see also the 2ality intro on the topic) as Bacon.EventStreams by wrapping the iteration with Bacon.fromBinder. This general interface would allow Bacon.js to work with new data sources. E.g. Node.js Readable streams are async iterators now (although here one could use Bacon.fromEvent('data') too).

However, I cannot decide how to reflect the loss of subscribers in the Bacon.EventStream in the iteration of the AsyncIterable.

1.) Should the re-attachment of EventStream subscribers resume iteration at the point where the last subscriber was detached, i.e. should the iteration be paused? Or,
2.) should the iterator be returned, i.e. allowed to clean up, and the stream ended, once the subscriber count reaches 0?

Quick example:

var ten = countDown(10); // countDown is an async generator function 
var tenStream = pausableFromAsyncIterator(ten); // implemented using Bacon.fromBinder
tenStream.take(2).log()
// --> 9, 8, <end>
tenStream.take(2).doEnd(() => ten.return()).log()
// --> 7, 6, <end>

// Or
var tenStream = fromAsyncIterator(countDown(10)); // implemented using Bacon.fromBinder
tenStream.take(2).log()
// --> 9, 8, <end>
tenStream.take(2).log()
// --> <end>

What is the Bacon.js way to do it?

Let me formulate the question differently:
Iterators are stateful (i.e. they keep the current "position" of the iteration). Should we expose this statefulness by making the Bacon.EventStream stateful as well?

I know that, for instance, Bacon.fromArray and Bacon.sequentially commence publishing array items from the position the loop was at when the last subscriber was detached. Also, reading a file from the beginning every time might not be what one wants. BUT 1.) would mean that we need to share the AsyncIterator with other code parts (e.g. .doEnd(() => asyncIterator.return())) in order to manage its lifetime, or risk leaking memory. This would add complexity which I don't like.

On the other hand, 2.) might not be what a programmer familiar with iterators would expect. BUT: is unsubscribing and later re-subscribing to the same Bacon.EventStream really a programming pattern Bacon.js should support?

The problem, of course, is that iterators don't really match the Bacon definition of an EventStream, where each event occurs exactly once and at a certain time (neither do the fromArray streams). To consume async iterators with Bacon.js we will have to do some impedance matching, and to me the latter suggestion seems better, because it won't cause leaking.

The problem, of course, is that iterators don't really match the Bacon definition of an EventStream, where each event occurs exactly once and at a certain time (neither do the fromArray streams).

As I understand it, an iterator enables the consumer to pull items from a data source. This is indeed different from Bacon.js, which pushes events to the consumer (subscriber).

Asynchronous iterators, on the other hand, mix in Promises: the consumer's pull instructs the data source when to commence work, but the source controls when the particular data item is delivered.
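A tiny standalone illustration of that handshake (countDown here is the same hypothetical async generator used in the examples in this thread): each next() call is the consumer's pull, while the Promise it returns is settled at the source's pace:

```javascript
// Hypothetical countDown async generator, as used in the examples above.
async function* countDown(n) {
  for (let i = n - 1; i >= 0; i--) {
    yield i;
  }
}

async function demo() {
  const it = countDown(3);
  const first = await it.next();  // consumer pulls; source decides when to settle
  const second = await it.next();
  console.log(first, second);
  // { value: 2, done: false } { value: 1, done: false }
  return [first.value, second.value];
}

demo();
```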

Is this not simply a lazy data source? And since Bacon EventStreams are lazy it could work just so without "impedance matching"?

the latter suggestion seems better, because it won't cause leaking.

I favour it too.

we will have to do some impedance matching

Now I know what you mean. You want .delay, .throttle, .debounce etc. to not buffer or discard events, but to actually pull data from the iterator delayed, throttled etc.! Hmm, that's tougher to do. Also conceptually: each it.next() pull returns a Promise of the event value, but with varying settle time.
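As a hedged sketch of that "actually pull delayed" idea, independent of Bacon (throttlePull is an invented name, not part of any API): instead of discarding or buffering already-pulled events, the adapter simply waits before asking the source for the next one:

```javascript
// Hypothetical pull-based throttle: wait `ms` between pulls, so the
// source is only asked for a value when the consumer is ready for one.
async function* throttlePull(asyncIterable, ms) {
  for await (const value of asyncIterable) {
    yield value;
    await new Promise(resolve => setTimeout(resolve, ms));
  }
}

// Same hypothetical countDown generator as in the examples above.
async function* countDown(n) {
  for (let i = n - 1; i >= 0; i--) {
    yield i;
  }
}

async function run() {
  const seen = [];
  for await (const v of throttlePull(countDown(3), 10)) {
    seen.push(v);
  }
  return seen;
}

run().then(console.log); // [ 2, 1, 0 ]
```

Note the difference to a push-based throttle: no event is ever produced just to be dropped; the source itself stays idle between pulls.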

My idea was just to poll the iterator as quickly as possible (until Bacon.noMore) and then let the usual Bacon machinery do the event processing as always.

For now I use a rather naive recursive adapter (see implementation below) which lets me digest asynchronous iterables with Bacon.js like this:

Bacon.once(path.join(os.homedir(), 'tmp/hello.txt'))
   .flatMap(_ba.fromAsyncGeneratorFunction(
      path => fs.createReadStream(path, {encoding: 'utf8'})
   ))
   .log()
// --> Hello World! <end> (or whatever the file contains)

Implemented like so:

const
isAsyncIterable = obj => Symbol.asyncIterator in Object(obj),

/**
 * @typedef BaconAdapters.fromAsyncGeneratorFunction
 * @description As fast as possible pulls values from the iterator returned by the generator function.
 * The returned EventStream is the sequence of the pulled values. When the EventStream has no more subscribers
 * the pulling is finished on the next value yielded by the iterator.
 * @function
 * @template T
 * @param {function(...*): (AsyncIterator<T> | AsyncIterable<T>)} generatorFn
 * @return {function(...*): Bacon.EventStream<T>}
 * @example
 * async function* countDown(n) {
 *    for (let i = n - 1; i >= 0; i--) {
 *       yield await new Promise(resolve => setTimeout(resolve, 1000, i));
 *    }
 * }
 * fromAsyncGeneratorFunction(countDown)(3)
 * // -> 2, 1, 0, <end>
 */
fromAsyncGeneratorFunction = generator => (...args) => Bacon.fromBinder(sink => {
   const
      iteratorOrIterable = generator(...args),
      
      /** @type {AsyncIterator} */
      iterator = isAsyncIterable(iteratorOrIterable)
         ? iteratorOrIterable[Symbol.asyncIterator]()
         : iteratorOrIterable,
      
      pullNext = () => {
         iterator.next().then(
            ({value, done}) => {
               if (done) {
                  sink(new Bacon.End());
                  return;
               }
               if (sink(value) !== Bacon.noMore) {
                  pullNext();
               }
            },
            error => {
               if (sink(new Bacon.Error(error)) !== Bacon.noMore) {
                  pullNext();
               }
            }
         );
      };

   pullNext();
   return () => {
      // return() is optional in the iterator protocol; guard before calling it
      if (typeof iterator.return === 'function') {
         iterator.return();
      }
   };
});
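The unsubscribe callback above relies on iterator.return() giving the generator a chance to clean up. A standalone sketch (no Bacon involved) of that mechanism, which is what makes option 2.) leak-free:

```javascript
let cleanedUp = false;

async function* source() {
  try {
    yield 1;
    yield 2;
    yield 3;
  } finally {
    // runs when the consumer calls return(), e.g. on unsubscribe
    cleanedUp = true;
  }
}

async function takeTwo() {
  const it = source();
  const a = (await it.next()).value;
  const b = (await it.next()).value;
  await it.return(); // analogous to the fromBinder unsubscribe callback
  return { a, b, cleanedUp };
}

takeTwo().then(console.log); // { a: 1, b: 2, cleanedUp: true }
```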

Pardon me for not being very active wrt Bacon.js issues and pull requests. I don't have much time and energy to put into this. Btw, I'd be very happy to get some fellow maintainers merging and releasing improvements. Right now I can take care of outright bugs and fixes, but anything that requires more than an hour will most likely be dropped.

@raimohanska No need to apologise!
I just needed some feedback that my idea is not crazy or if I had missed something important.

Right now I can take care of outright bugs and fixes but anything that requires more than an hour will most likely drop.

👍 That's excellent nevertheless!

Bacon.js is a well-working reactive library without great omissions. Thanks to .fromBinder, .withStateMachine and .transform, users can extend the functionality. Bug fixes are basically all that's needed to keep my daily work going 😄