stream reading big files
oleks12345 opened this issue · comments
Hi,
I have a problem with ZipEntry: it does not have any option to be read asynchronously.
Could you add reading it part by part? Let's say call a function with X MB of data, wait for the function to finish, then continue reading. That would allow reading big files in a zip without filling all the RAM.
In the project I'm working on we handle files of a few GB (read them line by line, calculate something, and go to the next line). Since strings roughly double in size in JS, they get really big, too big to fit in most people's RAM.
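To make the request concrete, here is a sketch of the kind of interface I mean. Note that `readChunked` and the mock entry are entirely hypothetical; nothing like this exists in the library today, and the mock just slices an in-memory buffer to show the control flow:

```javascript
// Hypothetical interface sketch; `readChunked` does not exist in the library.
// A mock entry backed by an in-memory buffer stands in for a real ZipEntry.
function makeMockEntry(data) {
  return {
    async readChunked(chunkSize, onChunk) {
      for (let off = 0; off < data.length; off += chunkSize) {
        // Wait for the consumer before producing the next chunk,
        // so at most one chunk is in memory at a time.
        await onChunk(data.subarray(off, off + chunkSize));
      }
    },
  };
}

// Usage: consume a "file" in 4-byte pieces.
const entry = makeMockEntry(Buffer.from('abcdefghij'));
const pieces = [];
entry.readChunked(4, async (chunk) => { pieces.push(chunk.toString()); })
  .then(() => console.log(pieces)); // [ 'abcd', 'efgh', 'ij' ]
```

With a real entry, each chunk would be decompressed on demand instead of sliced from a buffer, but the caller-side shape would be the same.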
Sorry I never replied to this. Honestly, this request came across as "please do a bunch of free work".
In any case, I agree it would be a nice feature. Unfortunately, it's not a small amount of work: refactoring the decompression code so it can save its state, piping that through workers, etc., would be a huge undertaking.