cyanfish / naps2

Scan documents to PDF and more, as simply as possible.

Home Page: https://www.naps2.com


crash during PDF output w/OCR (out of memory) on Debian

aurbus opened this issue · comments

Describe the bug
When I attempt to output a PDF with OCR, NAPS2 fills up all my RAM (16 GB) and swap space (4 or 5 GB, I think), and then crashes.

I am running Debian testing and NAPS2 7.3.1, and have been able to export PDFs this way in the past without issues.

Running dmesg -k lists the following output, which might help:

[ 1091.226375] oom-kill:constraint=CONSTRAINT_NONE,nodemask=(null),cpuset=/,mems_allowed=0,global_oom,task_memcg=/user.slice/user-1000.slice/user@1000.service/app.slice/app-naps2-1ed33cac23da4357822cf6923a59fc8d.scope,task=naps2,pid=4537,uid=1000

[ 1091.226550] Out of memory: Killed process 4537 (naps2) total-vm:290692092kB, anon-rss:11396184kB, file-rss:5552kB, shmem-rss:40152kB, UID:1000 pgtables:23344kB oom_score_adj:200
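To make the kernel's kB figures readable, here is a quick sketch (values copied from the oom-kill line above; the conversion itself is just kB to GB/MB):

```shell
# Convert the oom-kill figures above into human-readable units.
# total-vm is virtual address space; anon-rss is what actually occupied RAM+swap.
awk 'BEGIN {
    printf "total-vm:  %.1f GB (virtual address space reserved)\n", 290692092/1024/1024
    printf "anon-rss:  %.1f GB (resident memory, filled RAM+swap)\n", 11396184/1024/1024
    printf "shmem-rss: %.1f MB (shared memory)\n", 40152/1024
}'
```

The striking number is the ~277 GB of virtual address space against ~11 GB resident, which suggests something reserved far more memory than it ever touched.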

To Reproduce
Export a PDF with OCR.

Expected behavior
PDF would export properly

Screenshots
n/a

Desktop (please complete the following information):
Debian testing "trixie", NAPS2 7.3.1

Some questions:

  • How many PDF pages?
  • What resolution/dpi did you scan at?
  • How many CPU cores/threads does your system have?

Also, can you try running taskset -c 0-1 naps2 and see if that helps?
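A sketch of that suggestion, assuming util-linux's taskset is installed; pinning the process to two cores presumably limits how many OCR workers run in parallel, which may cap peak memory use ("naps2" is the binary name from the log above):

```shell
# Check that taskset (from util-linux) is available, then inspect affinity.
command -v taskset >/dev/null || { echo "taskset not found (util-linux)"; exit 0; }
taskset -cp $$              # print the current shell's CPU affinity as a sanity check
# Launch NAPS2 restricted to cores 0 and 1:
# taskset -c 0-1 naps2
```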

The PDF is 671 pages, but I have had the same problem with PDFs that are about half as many pages.

The image of each page is fairly large, about 2000x4000 pixels.

I have an i7-1165G7: 4 cores, 8 threads, 2.8 GHz base clock, 28 W TDP.

I should note that I have been able to export PDFs more than twice as long, with pages 2-3x the resolution of these, with no issues. I also tried with NAPS2 7.3.0 and 7.2 and had the same issue. I suspect it may be something with Debian rather than NAPS2, but I am not 100% sure, and thought this was the best place to start.

I'm not sure; the log seems to claim that NAPS2 is using 11 GB of memory, but I can't get it past 500 MB-1 GB (top -o %MEM) myself running a similar test.
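For reproducing that measurement, a small sketch that samples a process's resident and virtual memory once per second, so the growth pattern (steady climb versus sudden spike) is visible before the OOM killer fires; "naps2" is the process name from the log, and pgrep/ps are standard tools:

```shell
# Sample the NAPS2 process's memory every second until it exits.
pid=$(pgrep -n naps2 || true)    # newest matching process; empty if not running
while [ -n "$pid" ] && kill -0 "$pid" 2>/dev/null; do
    ps -o rss=,vsz= -p "$pid" | awk '{printf "rss=%.0f MB  vsz=%.0f MB\n", $1/1024, $2/1024}'
    sleep 1
done
```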

If this is the native package you could try the flatpak (or vice versa) and see if there is any difference. You could also try playing around with the images you're saving - maybe there's something about a particular image that's causing the issue.

I tried with version 7.3.1 on Windows, and I was able to process all the images in question with no issue, RAM usage never surpassing about 1.2 GB. I think it might be something with Debian, so I will look into it and see what it could be. Thank you for the help, and the fantastic program :)