Kaixhin / Autoencoders

Torch implementations of various types of autoencoders


Loading large files

amirid opened this issue

Hi,
I am trying to train various autoencoders on my own data. I was able to train several models with a small number of training samples (say, 100). However, when I increase the number of training samples, training crashes from lack of memory with the following error message:

PANIC: unprotected error in call to Lua API (not enough memory)

Is there a way to load the data lazily, as with Python generators, and train the model incrementally?

P.S. Even after decreasing the batch size to 1, the code still crashes; the problem is the size of the data file itself.
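
For reference, a minimal sketch of the kind of incremental loading being asked about: pre-split the dataset into chunk files once, then stream one chunk at a time during training so only a fraction of the data is ever resident in memory. The file names, chunk count, and the `train` step below are placeholders, not code from this repo:

```lua
require 'torch'

-- One-off preprocessing (run once, on a machine with enough memory):
--   for c = 1, nChunks do
--     torch.save('chunk_' .. c .. '.t7',
--                fullData:narrow(1, (c - 1) * chunkSize + 1, chunkSize))
--   end

local nChunks, batchSize = 10, 100  -- hypothetical: 10 chunk files on disk

for epoch = 1, 5 do
  for c = 1, nChunks do
    -- Load only one chunk into memory at a time
    local chunk = torch.load('chunk_' .. c .. '.t7')
    for b = 1, chunk:size(1) - batchSize + 1, batchSize do
      local batch = chunk:narrow(1, b, batchSize)  -- view, no copy
      -- train(batch)  -- placeholder for one optimisation step
    end
    chunk = nil
    collectgarbage()  -- release the chunk before loading the next one
  end
end
```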

This repo is for reference code, so I will not be including efficient data loaders in it. You can find Torch packages to help you with this: torch-dataset and torchnet.
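
As one illustration of the torchnet route, its `tnt.ListDataset` can load each sample from disk on demand, and wrapping it in a `tnt.BatchDataset` yields mini-batches without the full set ever being in memory. A minimal sketch, assuming one sample per `.t7` file (the path scheme and dataset size are made up):

```lua
require 'torch'
local tnt = require 'torchnet'

local nSamples = 10000  -- hypothetical dataset size

local dataset = tnt.BatchDataset{
  batchsize = 128,
  dataset = tnt.ListDataset{
    list = torch.range(1, nSamples):long(),
    load = function(idx)
      -- Loads a single sample lazily; the path scheme is an assumption
      return { input = torch.load(string.format('sample_%d.t7', idx)) }
    end,
  },
}

for sample in tnt.DatasetIterator{dataset = dataset}() do
  -- sample.input is a mini-batch of 128 samples; feed it to the training step
end
```

If disk reads become the bottleneck, torchnet's `tnt.ParallelDatasetIterator` can perform the loading in background threads with the same dataset definition.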