Sending a very large number of files: parallel tasks are using a lot of memory
antonioc57 opened this issue · comments
Antonio Carlos Lima commented
I have a problem with this solution for sending many files in parallel. The logic follows these steps:
- Fetch a batch of files to send
- Create history records, update their status, and read the files
- Zip the files
- Send the files
The rule is to send a maximum of 5 requests simultaneously; on each iteration, check whether there are files to send, and otherwise cancel the execution.
It works as expected (apparently), but my biggest problem is the number of files to send, which can reach 30,000. This generates excessive memory usage. I suspect my tasks are not being canceled as they should be, or that I'm doing something wrong.
```js
import { call, fork, join, cancel, all } from 'redux-saga/effects';

function* pendingForShipping() {
  let tasks = [];
  let batch = 0;
  while (true) {
    if (batch < 5) {
      const filesForSending = yield call(generateBatchFiles);
      if (!filesForSending || filesForSending.length === 0) {
        // No more files: wait for the in-flight uploads, then stop.
        if (yield join([...tasks])) {
          yield all(tasks.map((task) => cancel(task)));
          tasks.length = 0;
          yield cancel(); // cancel this saga itself
        }
      }
      const files = yield call(
        createHistoryUpdateStatusReadFile,
        filesForSending
      );
      const zipFile = yield call(generateZipFile, files);
      const saga = yield fork(upload, zipFile);
      tasks = [...tasks, saga];
      batch += 1;
    } else {
      // Batch of 5 full: wait for all uploads, then start a new batch.
      if (yield join([...tasks])) {
        yield all(tasks.map((task) => cancel(task)));
        tasks.length = 0;
        batch = 0;
      }
    }
  }
}
```
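Independent of redux-saga, the intended behavior (at most 5 uploads in flight, stop when no files remain) can be sketched as a fixed worker pool over plain promises. This is a minimal sketch, not the original code: `processAll`, `worker`, and `limit` are illustrative names. A fixed pool never accumulates one task handle per file, which matters when the file count reaches 30,000.

```js
// Bounded-concurrency sketch: `limit` workers pull items from a shared
// queue, so at most `limit` uploads run at any moment and finished work
// leaves no lingering task references behind.
async function processAll(items, worker, limit = 5) {
  const queue = [...items];
  const results = [];
  async function run() {
    while (queue.length > 0) {
      const item = queue.shift();       // take the next pending item
      results.push(await worker(item)); // at most `limit` in flight
    }
  }
  // Start `limit` workers; each exits on its own when the queue drains.
  await Promise.all(Array.from({ length: limit }, run));
  return results;
}
```

The same shape can be expressed in redux-saga by forking a fixed number of worker sagas that `take` from a shared channel, rather than forking one task per batch item and joining/canceling them in groups.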