Control maximum number of concurrent pages to be crawled
mrmashal opened this issue
MohamadReza Mash'al commented
Is there an option to control the number of concurrent pages taken out of the scheduler? I tried MaxConcurrentThreads, but it seems to have no effect.
I need to set it to 1 while debugging. Do you have any suggestions?
Steven commented
Hi, MaxConcurrentThreads does exactly that. Is it possible the site you are crawling cannot handle more than 1 or 2 concurrent requests? I.e., Abot cannot crawl any faster than the site is able to respond.
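For reference, a minimal sketch of limiting the crawl to one concurrent page via CrawlConfiguration, assuming Abot v1's synchronous PoliteWebCrawler API (the target URL and the MaxPagesToCrawl cap are placeholders, not values from this thread):

```csharp
using System;
using Abot.Crawler;
using Abot.Poco;

class Program
{
    static void Main()
    {
        // Process a single page at a time, which makes stepping
        // through handlers in a debugger much easier.
        var config = new CrawlConfiguration
        {
            MaxConcurrentThreads = 1, // one page taken from the scheduler at a time
            MaxPagesToCrawl = 25      // hypothetical cap to keep the debug run short
        };

        var crawler = new PoliteWebCrawler(config);

        // Log each completed page so the serialized order is visible.
        crawler.PageCrawlCompleted += (sender, e) =>
            Console.WriteLine("Crawled: " + e.CrawledPage.Uri);

        CrawlResult result = crawler.Crawl(new Uri("http://example.com"));

        Console.WriteLine(result.ErrorOccurred
            ? "Crawl ended with error: " + result.ErrorException.Message
            : "Crawl completed.");
    }
}
```

With MaxConcurrentThreads set to 1, pages should complete strictly one after another; if requests still appear serialized at higher values, the bottleneck is likely the target site's response time rather than the crawler's concurrency setting.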