lgraubner / sitemap-generator

Easily create XML sitemaps for your website.

Crashed on 404

tinyjin opened this issue

Do you want to request a feature or report a bug?
feature

What is the current behavior?
When the crawler hits a 404 page, the process crashes because an error is thrown.

If the current behavior is a bug, please provide the steps to reproduce.
It is not a bug, but I would prefer that no error is thrown.
Even if the crawler hits a 404 page, it must not crash.

Or do you already provide a way to keep crawling without crashing when a 404 page is hit?

What is the expected behavior?
Crawling should continue without crashing, even when a 404 page is encountered.

When does this crash happen? Right on start? If the initial page is a 404, the crawler crashes because it cannot proceed without any new links. If it's crashing during the process, this might be a bug, because it should only emit an error event.
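
For reference, a minimal sketch of listening for that error event, assuming the factory-style API shown in the current sitemap-generator README; the URL and file path are placeholders, and exact option and event names may differ for the version used in this issue:

```js
const SitemapGenerator = require('sitemap-generator');

// Placeholder URL for illustration only.
const generator = SitemapGenerator('https://example.com', {
  filepath: './sitemap.xml',
});

// Errors hit while crawling (e.g. a 404 on a linked page) should be
// reported through the 'error' event rather than crashing the process.
generator.on('error', (error) => {
  console.error('Crawl error:', error);
});

generator.on('done', () => {
  console.log('Sitemap written to ./sitemap.xml');
});

generator.start();
```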

Oh, I'm sorry for the late reply.
The crash occurs only at the very beginning.
So it is not a bug!

The key point of my question is as follows.

I'm developing a crawler that crawls several sites.
I need to collect sitemaps from each of them, so I decided to use your super cool module sitemap-generator.
My system collects the sitemaps in succession, so the crawler must not crash midway through.

In conclusion, could you support handling the error and continuing, instead of throwing it?

Ah, now I understand.
Why don't you just wrap it in a try { .. } catch { .. }? I think that error still makes sense.
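
A minimal sketch of that suggestion for crawling several sites in succession, assuming the crash on a 404 start page is a synchronous throw from start() (as the try/catch suggestion implies); the site list is hypothetical and the exact API may differ by version:

```js
const SitemapGenerator = require('sitemap-generator');

// Hypothetical list of sites to crawl; replace with your own.
const sites = ['https://example.com', 'https://example.org'];

for (const url of sites) {
  try {
    const generator = SitemapGenerator(url);

    // Errors on individual pages are emitted as events and
    // do not abort the crawl.
    generator.on('error', (error) => {
      console.error(`Error while crawling ${url}:`, error);
    });

    generator.on('done', () => {
      console.log(`Finished ${url}`);
    });

    generator.start();
  } catch (err) {
    // If the start page itself is a 404 the generator throws; catching
    // it here lets the loop continue with the next site.
    console.error(`Could not crawl ${url}:`, err);
  }
}
```

Note that start() kicks off an asynchronous crawl, so this loop starts all sites at once; for strictly sequential crawling you would wait for each generator's done (or error) event before starting the next.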

Closed due to inactivity.