nodeca / pako

high speed zlib port to javascript, works in browser & node.js

Home Page: http://nodeca.github.io/pako/

No way to read Deflate data in chunks.

jlivak opened this issue

Unless I'm missing something, because Deflate.chunks is marked internal, there is no way to read all of the compressed data that has been written/flushed so far and then clear the chunks array (which is needed when handling large objects that can't all be held in memory at once). You currently have to keep everything in memory and read from Deflate.result at the end.

My organization forked pako and added a flushResultBuffer() method to Deflate, which returns all of the current chunks and then clears the chunks array. Curious whether that's something it would make sense to PR back into pako proper, or whether there's a way to read partial data that I've missed.
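Roughly, the method looks like this (a simplified sketch based on the description above; the actual implementation in our fork may differ):

```js
// Patch the exported Deflate class so compressed chunks can be drained
// incrementally instead of accumulating until .result is built at the end.
const { Deflate } = require('pako');

Deflate.prototype.flushResultBuffer = function () {
  const pending = this.chunks; // Uint8Array pieces collected by the default onData()
  this.chunks = [];            // drop the references so the memory can be reclaimed
  return pending;
};
```

A caller can then drain flushResultBuffer() periodically between push() calls and write the pieces out as it goes.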

https://github.com/nodeca/pako/blob/master/lib/deflate.js#L277 - you can override .onData() on the instance, or even create your own wrapper around the zlib functions. I don't see a problem here.
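For example, a minimal sketch (writeChunkSomewhere and the input pieces are placeholders):

```js
// Override onData() so each compressed chunk is handed off as soon as it is
// produced, instead of being buffered into .chunks / .result.
const pako = require('pako');

const deflator = new pako.Deflate({ level: 6 });

// Placeholder consumer: in real code, write to a file/socket instead.
function writeChunkSomewhere(chunk) {
  process.stdout.write(chunk);
}

deflator.onData = function (chunk) {
  writeChunkSomewhere(chunk); // chunk is a Uint8Array
};

deflator.onEnd = function (status) {
  if (status !== 0) throw new Error('deflate failed with status ' + status);
};

// Placeholder input; pass true on the last push to finish the stream.
const firstPiece = new Uint8Array(1024);
const lastPiece = new Uint8Array(1024);
deflator.push(firstPiece, false);
deflator.push(lastPiece, true);
```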

I guess it's just a matter of there not being an obvious way to achieve this without overriding the default behavior, and whether that's something people think is worth adding to the base project.

I think it's better to keep things as is, to keep the API as simple as possible.

Sounds good. I'll just leave a link to our fork here, then, in case anyone finds this issue in the future and wants an example of how to achieve this.

https://github.com/inkarnaterpg/pako