sjdirect / abot

Cross Platform C# web crawler framework built for speed and flexibility. Please star this project!

Control maximum number of concurrent pages to be crawled

mrmashal opened this issue · comments

Is there an option to control the number of concurrent pages taken out of the scheduler? I tried MaxConcurrentThreads, but it seems to have no effect.

I need to set it to 1 while debugging. Do you have any suggestions?

Hi, MaxConcurrentThreads does exactly that. Is it possible the site you are crawling cannot handle more than 1 or 2 concurrent requests? I.e., Abot cannot crawl any faster than the site is able to respond.
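
For reference, here is a minimal sketch of setting MaxConcurrentThreads to 1 for a debug run. It assumes the Abot 2.x API (the Abot2.Crawler and Abot2.Poco namespaces); the target URL and page limit are placeholders:

```csharp
using System;
using System.Threading.Tasks;
using Abot2.Crawler;
using Abot2.Poco;

class Program
{
    static async Task Main()
    {
        // Limit the crawl to a single concurrent page request,
        // which makes stepping through in a debugger predictable.
        var config = new CrawlConfiguration
        {
            MaxConcurrentThreads = 1,
            MaxPagesToCrawl = 10 // small cap for a debug run (arbitrary value)
        };

        var crawler = new PoliteWebCrawler(config);
        crawler.PageCrawlCompleted += (s, e) =>
            Console.WriteLine($"Crawled {e.CrawledPage.Uri}");

        // https://example.com is a placeholder target
        var result = await crawler.CrawlAsync(new Uri("https://example.com"));
        Console.WriteLine($"Crawl completed, error occurred: {result.ErrorOccurred}");
    }
}
```

With MaxConcurrentThreads = 1, pages are taken out of the scheduler one at a time, so breakpoints fire in a deterministic order.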