datafile.py killed on huge snmpwalk files
landy2005 opened this issue · comments
I have a big snmpwalk dump:
$ ls -lh myagent.snmpwalk
-rw-rw-r-- 1 mstupalov mstupalov 221M Jun 9 12:41 myagent.snmpwalk
When I try to convert it to snmprec, the datafile script gets killed for an unknown reason (probably out of memory):
$ datafile.py --ignore-broken-records --escaped-strings --source-record-type=snmpwalk --input-file=myagent.snmpwalk --output-file=myagent.snmprec
# Input file #0, processing records from the beginning till the end
Killed
I searched the repository and found no code related to 'Killed'. I would suggest you check your OS.
Funny..
Of course "Killed" is an OS message, printed when the process runs out of memory (the kernel OOM killer).
I was able to convert the file by breaking it into 4 parts..
But I don't think this is the correct way; the script's memory usage and cleanup at runtime should be checked.
Just to be sure, I also tried running it without the additional parameters (--ignore-broken-records --escaped-strings), with the same result.
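For reference, the split-and-convert workaround can be sketched roughly as below. File names and chunk sizes are illustrative, and the conversion step is wrapped in a function whose body here is a plain `cp` stand-in so the pipeline can be tried without snmpsim installed; in the real case the body would be the `datafile.py` invocation shown above. Note that `split -l` can cut a multi-line snmpwalk value at a chunk boundary, which is one reason this is only a workaround.

```shell
#!/bin/sh
# Per-chunk conversion step. For the real run, replace the body with:
#   datafile.py --source-record-type=snmpwalk \
#       --input-file="$1" --output-file="$2"
# Here it is a stand-in copy so the surrounding pipeline is runnable.
convert_chunk() { cp "$1" "$2"; }

# Stand-in for the real 221M dump (one record per line).
seq 1 100000 > myagent.snmpwalk

# Split into line-aligned chunks named part_aa, part_ab, ...
# Beware: snmpwalk values spanning several lines can be cut here.
split -l 25000 myagent.snmpwalk part_

# Convert each chunk separately, keeping peak memory bounded.
for f in part_*; do
    convert_chunk "$f" "$f.snmprec"
done

# snmprec files must stay sorted by OID; simple concatenation only
# preserves that because the original walk was already OID-ordered.
cat part_*.snmprec > myagent.snmprec
rm -f part_*
```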
@landy2005 could you share a sample file for the issue?