maxhumber / gif

The matplotlib Animation Extension

Could this work on generators instead of lists?

antonimmo opened this issue · comments

Since the code uses the indexes [0] and [1:] to take the first element and then the rest of the list, I understand generators won't work here.
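For illustration, here is a minimal standalone demonstration (not from the gif codebase) of why that indexing fails on a generator:

```python
frames = (n for n in range(3))  # a generator, standing in for a sequence of frames

try:
    frames[0]  # sequences support subscripting; generators do not
except TypeError as e:
    print("not subscriptable:", e)

try:
    frames[1:]  # slicing fails the same way
except TypeError:
    print("slicing a generator also raises TypeError")
```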

You could say the solution is simply to convert the generator into a list, but for my use case (2500k frames @ 150 dpi) allocating that list takes a lot of memory. Actually, it might work the other way around.

I'd suggest something like:

def save(frames, path, duration=100):
    # iter() works for both lists and other iterables (such as generators)
    frames = iter(frames)
    first_frame = next(frames)

    first_frame.save(
        path,
        save_all=True,
        append_images=frames,  # the first frame was already consumed by next()
        optimize=True,
        duration=duration,
        loop=0,
    )
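The suggestion above relies only on the iterator protocol, so lists and generators behave identically; a standalone sketch (with integers standing in for PIL Image frames):

```python
def split_first(frames):
    # iter() accepts any iterable; next() pulls the first item,
    # leaving the remaining items in the iterator (lazily, for generators)
    frames = iter(frames)
    first = next(frames)
    return first, frames

# A list and a generator produce the same result
first, rest = split_first([10, 20, 30])
assert (first, list(rest)) == (10, [20, 30])

# The generator is never materialized as a whole list
first, rest = split_first(n * n for n in range(4))
assert (first, list(rest)) == (0, [1, 4, 9])
```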

Any thoughts?

Hey. I tested this with a generator and it seems to work fine. The caveat is that for large frames, or a big number of them, memory seems to keep growing. It may be due to how the garbage collector works, but that's outside the scope of this issue.

In terms of usability, it works pretty well. I still have to test it with arrays, but I'm positive it will work.

Hi antonimmo,

Thanks for your interest in gif and for experimenting on an improvement! Generators are a nice idea, but I'm not sure they're well suited for this specific function...

gif frames are created by the @gif.frame decorator, which saves everything into an io.BytesIO() buffer. If memory is a concern, you might instead write each frame to disk (in a temporary folder), loop through and read each one with a generator, save the GIF, and then delete the temporary folder.
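A rough sketch of that temp-folder idea (hypothetical helpers, not part of gif; raw bytes stand in for rendered frame buffers):

```python
import os
import shutil
import tempfile

def spill_frames(frames):
    # hypothetical helper: write each in-memory frame buffer to a temp folder
    tmpdir = tempfile.mkdtemp()
    paths = []
    for i, data in enumerate(frames):
        path = os.path.join(tmpdir, f"frame_{i:05d}.bin")
        with open(path, "wb") as f:
            f.write(data)
        paths.append(path)
    return tmpdir, paths

def stream_frames(paths):
    # generator: only one frame's bytes live in memory at a time
    for path in paths:
        with open(path, "rb") as f:
            yield f.read()

# Usage: spill to disk, stream back lazily, then delete the temp folder
tmpdir, paths = spill_frames([b"frame0", b"frame1"])
assert list(stream_frames(paths)) == [b"frame0", b"frame1"]
shutil.rmtree(tmpdir)
```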

So if io.BytesIO() keeps using memory until completion, what's the difference with loading temporary frames (as you suggest), if the buffer will grow anyway?

Options:

  1. Keep in memory - faster, but could bump up against memory constraints (unlikely? most computers have 8 GB of RAM?)
  2. Write to disk - slower, but would eliminate potential memory issues

I think gif should remain with Option 1. However, if you (or someone else) would like to submit a PR so that users could choose either implementation (maybe as a global setting?), I would be receptive!
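If someone does take that PR on, a global setting could be as simple as a module-level flag that picks the buffer type per frame. This is a hypothetical sketch, not the actual gif API:

```python
import io
import tempfile

# hypothetical module-level setting: "memory" (current behaviour) or "disk"
buffer_mode = "memory"

def _new_frame_buffer():
    # each captured frame would be written into one of these buffers;
    # both support the same file-like read/write/seek interface
    if buffer_mode == "disk":
        # spooled file: held in memory until it exceeds max_size, then on disk
        return tempfile.SpooledTemporaryFile(max_size=0, suffix=".png")
    return io.BytesIO()

buf = _new_frame_buffer()
buf.write(b"\x89PNG...")  # placeholder bytes, not a real PNG
buf.seek(0)
assert buf.read().startswith(b"\x89PNG")
```

Because both buffer types share the file-like interface, the rest of the save pipeline would not need to change.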

But closing for now...

Well, I'm working with images that big on a 64 GB RAM machine, and I still ran out of memory. That's why I raised this issue in the first place.

I'll do my best to follow your advice and attempt to add the option to save to file, though I believe it may run into issues similar to the ones generators face.