Ubuntu 18 not generating files when maxEntriesPerFile is set, or for sitemaps with more than 50000 URLs
windson opened this issue · comments
Do you want to request a feature or report a bug?
bug
What is the current behavior?
On Windows, when maxEntriesPerFile is set, the behavior works as expected: the generator produces sitemap_parts{}.xml files and a consolidated sitemap.xml. The same configuration does not work on Ubuntu 18. I can see the part files (named sitemap_somehash) in the /tmp directory on Ubuntu, but the process never produces the consolidated sitemap.xml or the part XMLs in the target location.
For a small website with about 50 URLs and without maxEntriesPerFile, it works as expected on both Ubuntu 18 and Windows 10.
If the current behavior is a bug, please provide the steps to reproduce.
const SitemapGenerator = require('sitemap-generator');

const site = 'ifsc.icalci';

// create generator
const generator = SitemapGenerator('https://www.example.com', {
  stripQuerystring: false,
  maxDepth: 0,
  filepath: './sitemap.xml',
  priorityMap: [1.0, 0.9],
  maxEntriesPerFile: 12, // even without this option, sites with more than 50000 URLs behave as reported
  timeout: 5000,
  maxConcurrency: 7,
  userAgent: 'Node/SitemapGenerator',
  lastMod: true,
  changeFreq: 'monthly'
});

// https://github.com/lgraubner/sitemap-generator/pull/76
let cnt = 0;
console.time(site);

// register event listeners
generator.on('add', (url) => {
  cnt++;
  // console.log(url)
  if (cnt % 500 === 0) {
    console.log('added ' + cnt.toString() + ' urls');
  }
});

generator.on('error', (error) => {
  console.log(error);
  console.log("Failed at " + cnt.toString() + " processing URL :'( ");
});

generator.on('done', () => {
  // sitemaps created
  console.log('Done adding ' + cnt.toString() + ' urls successfully');
  console.timeEnd(site);
});

// start the crawler
generator.start();
What is the expected behavior?
The generator should produce the sitemap part files along with the index sitemap.xml on Ubuntu 18, as it does on Windows 10.