perwendel / spark

A simple, expressive web framework for Java. Spark has a Kotlin DSL: https://github.com/perwendel/spark-kotlin

Uploading large-size file = Out of Memory

lepe opened this issue · comments

I found this Stack Overflow question from 2016 about a similar issue.

I'm uploading a 2GB file using relatively simple code to upload files to my server.
I noticed that the memory limit is hit before execution even reaches my code. I was wondering: what does Spark do during the upload? Is the stream stored in memory before being written to a file? That seems unfeasible to me, but I haven't located the part of the code that does it.
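For contrast, here is what I would expect to happen: the body is copied to disk in fixed-size chunks, so memory use stays constant regardless of upload size. This is a minimal plain-`java.io` sketch; the class and method names are mine, not Spark's.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class StreamToDisk {
    // Copy a request body to disk through a small fixed buffer,
    // so heap usage is bounded by the buffer size, not the upload size.
    static long copyToFile(InputStream in, Path target) throws IOException {
        long total = 0;
        byte[] buf = new byte[8192];
        try (OutputStream out = Files.newOutputStream(target)) {
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
                total += n;
            }
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // 1 MiB stand-in for a large upload body.
        byte[] payload = new byte[1 << 20];
        Path tmp = Files.createTempFile("upload", ".bin");
        long written = copyToFile(new ByteArrayInputStream(payload), tmp);
        System.out.println(written); // prints 1048576
        Files.delete(tmp);
    }
}
```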

Is this a known limitation, a bug, or something that should never happen (and therefore a mistake on my part)?

Can someone tell me where that code is implemented? I'd like to take a look. Thanks.

Following the code, the issue does not appear to be in Spark's implementation. I believe org.eclipse.jetty.util.MultiPartInputStreamParser reads the stream and stores it in a MultiMap<Part> object that is held in memory.
Memory usage climbs when I call spark.Request.getParts(), which simply calls javax.servlet.http.HttpServletRequest.getParts(), which calls org.eclipse.jetty.server.Request.getParts(), eventually invoking MultiPartInputStreamParser at some point during the upload.
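For anyone landing here with the same symptom: Jetty's multipart handling spills parts larger than the MultipartConfigElement fileSizeThreshold to a temp file on disk instead of keeping them in memory, so setting that config on the raw request before calling getPart/getParts is the usual workaround. A sketch, assuming a standard Spark route (the paths and part name "file" are illustrative):

```java
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import javax.servlet.MultipartConfigElement;

import static spark.Spark.post;

public class UploadRoute {
    public static void main(String[] args) {
        post("/upload", (request, response) -> {
            // Tell Jetty where to spill multipart data. The fourth argument
            // (fileSizeThreshold) is the size above which a part is written
            // to a temp file rather than buffered on the heap; with the
            // one-arg constructor it defaults to 0, i.e. everything on disk.
            request.raw().setAttribute("org.eclipse.jetty.multipartConfig",
                    new MultipartConfigElement("/tmp", -1L, -1L, 1024 * 1024));

            // Stream the part straight to its destination; no full copy in memory.
            try (InputStream is = request.raw().getPart("file").getInputStream()) {
                Files.copy(is, Paths.get("/tmp/upload.bin"),
                        StandardCopyOption.REPLACE_EXISTING);
            }
            return "OK";
        });
    }
}
```

Without that attribute (or with a huge threshold), the parser has nowhere to spill and keeps the part bytes on the heap, which matches the OOM described above.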

In short, I'm closing this issue, as it seems to have nothing to do with Spark.