adamreeve / npTDMS

NumPy based Python module for reading TDMS files produced by LabVIEW

Home Page: http://nptdms.readthedocs.io

Trouble converting large .tdms files

adhigoel opened this issue · comments

Hey Adam,

I'm trying to convert 4 large TDMS files (100–500 MB each) at once and I've run into a few issues I'm hoping you can help with. Firstly, when I try to convert all the files in sequence and store them in a list, the 4th file runs into a memory allocation error:

numpy.core._exceptions._ArrayMemoryError: Unable to allocate 5.28 MiB for an array with shape (692313,) and data type int64

The requested allocation is usually relatively small (<10 MiB). I've tried changing the order of the files and increasing my virtual memory and IDE settings, but with no luck. Through some experimenting, it seems like there is a memory limit of ~1024 MiB somewhere on TdmsFile objects.

I tried an alternative approach where I process each file sequentially (then delete the object), but I can't get the as_dataframe() method to work. I run into a similar error:

numpy.core._exceptions._ArrayMemoryError: Unable to allocate 302. MiB for an array with shape (47, 841472) and data type float64

The allocation here is usually larger (>300 MiB), but I don't see any spikes in RAM usage or anything. Any idea what the issue is? These files are defragmented, and I've tried a few IDEs with no luck.
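For reference, the allocation sizes in those tracebacks match the reported array shapes exactly (shape × 8 bytes for int64/float64), so each individual request really is modest. A quick back-of-the-envelope check in plain Python (`array_mib` is just a throwaway helper for this illustration, not part of npTDMS):

```python
def array_mib(shape, itemsize=8):
    """MiB needed for an array of the given shape (int64/float64 are 8 bytes each)."""
    n = 1
    for dim in shape:
        n *= dim
    return n * itemsize / 2**20

# The two failing allocations from the tracebacks above:
print(f"{array_mib((692313,)):.2f} MiB")     # -> 5.28 MiB
print(f"{array_mib((47, 841472)):.1f} MiB")  # -> 301.7 MiB
```

That such small requests fail suggests the process as a whole is running out of address space, rather than any single array being too big.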

Thanks!

Discovered my interpreter had been inadvertently switched to 32-bit Python. Upgraded to 64-bit and all my imports worked.
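For anyone who lands here with the same symptoms: a 32-bit interpreter is capped at roughly 2 GiB of address space (which lines up with the ~1024 MiB ceiling observed above, once fragmentation and the interpreter's own footprint eat into it), and you can check the bitness from the standard library:

```python
import struct
import sys

# Pointer size in bits: 32 on a 32-bit build, 64 on a 64-bit build.
bits = struct.calcsize("P") * 8
print(f"{bits}-bit Python")

# An equivalent check:
print(sys.maxsize > 2**32)  # True only on a 64-bit interpreter
```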

Apologies for not replying sooner, but I'm glad you found the problem, and thanks for following up with the solution.