notable / dumper

Library for extracting attachments, notes and metadata out of formats used by popular note-taking apps.

JavaScript heap out of memory

msbentley opened this issue · comments

I have just installed enex-dump (Ubuntu 18.04) and tried to convert an ENEX file containing a few hundred notes (~373 MB, since it contains quite a few attachments). The following is the output:

mbentley@bpcl03:~/Desktop$ enex-dump --src notes.enex --dst notes

<--- Last few GCs --->

[17938:0x55d3d37c3220]     1977 ms: Mark-sweep 762.3 (800.8) -> 760.6 (800.8) MB, 9.5 / 0.0 ms  allocation failure GC in old space requested
[17938:0x55d3d37c3220]     1991 ms: Mark-sweep 760.6 (800.8) -> 760.5 (765.8) MB, 14.3 / 0.0 ms  last resort GC in old space requested
[17938:0x55d3d37c3220]     2007 ms: Mark-sweep 760.5 (765.8) -> 760.5 (765.8) MB, 15.3 / 0.0 ms  last resort GC in old space requested


<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x22aaad598fe1 <JSObject>
    2: replace(this=0x79116f82201 <Very long string[390176708]>,0x1f5d1adc22d9 <JSRegExp <String[15]: <!--[\s\S]*?-->>>,0x22aaad582801 <String[0]: >)
    3: getTraversalObj [/usr/local/lib/node_modules/enex-dump/node_modules/fast-xml-parser/src/xmlstr2xmlnode.js:72] [bytecode=0x395c79c21b11 offset=45](this=0x3330a949851 <Object map = 0x32869ea758f9>,xmlData=0x79116f82201 <Very long string[390176...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
 1: node::Abort() [node]
 2: 0x55d3d16f8011 [node]
 3: v8::Utils::ReportOOMFailure(char const*, bool) [node]
 4: v8::internal::V8::FatalProcessOutOfMemory(char const*, bool) [node]
 5: v8::internal::Factory::NewRawTwoByteString(int, v8::internal::PretenureFlag) [node]
 6: 0x55d3d15adb54 [node]
 7: v8::internal::Runtime_StringReplaceGlobalRegExpWithString(int, v8::internal::Object**, v8::internal::Isolate*) [node]
 8: 0x5bab79040bd

Aborted (core dumped)

Any clues? Thanks!!

@msbentley It looks like you ran out of memory. Do you have any huge attachments, or just many smaller ones?

I don't know off-hand - probably the largest are some tens of MB. I can check! Is there a way to increase the memory limit when using enex-dump?

This will have to be fixed by reworking the app's internals: currently we read and write everything at once, whereas we should do it incrementally.

In the meantime, you may be able to raise Node's heap limit (hardware permitting) with one of the following commands:

node --max-old-space-size=4096 "$(readlink -f "$(which enex-dump)")"
node --max-old-space-size=4096 $(which enex-dump)
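If invoking node directly is awkward, recent Node.js versions also accept `--max-old-space-size` through the `NODE_OPTIONS` environment variable (a standard Node.js feature, not specific to enex-dump), so the tool can be run unchanged:

```shell
# Raise V8's old-space limit to 4 GB for every node process
# started from this shell, then run the tool as usual.
export NODE_OPTIONS="--max-old-space-size=4096"
enex-dump --src notes.enex --dst notes
```

Note that older Node releases restrict which flags `NODE_OPTIONS` may carry, so if this is rejected, fall back to passing the flag to node directly as above.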

Ahh yes, that makes sense - I'll try the changes above tomorrow and report back.

Sorry for the delay - with your hint for increasing memory, enex-dump runs fine - thanks!

I'm reopening, as this should be fixed on the library side; it should just work.

I think this issue got fixed for the most part in the rewrite.