bbc / wraith

Wraith — A responsive screenshot comparison tool

Home Page: http://bbc-news.github.io/wraith/

[Feature request] Ability to limit depth of spidering

geordiemhall opened this issue · comments

Hi there – just thought I'd drop a feature request that'd simplify our workflow a bit and allow us to use the spidering built into wraith.

It would be great if you could limit the depth of the spider search – basically cap how many levels of sub-pages it follows when building the list of paths.

E.g. specifying a spider_depth of 0 would mean it only includes links found on the homepage, and never goes any deeper. A depth of 1 would include links on the homepage plus links on any pages linked from the homepage, and so on.
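To make the proposal concrete, here's a rough sketch of how it might sit in a capture config. This is purely illustrative – spider_depth doesn't exist yet, and the other keys are just the usual wraith YAML options (exact names vary between versions):

```yaml
# Sketch only: spider_depth is the proposed key and is not implemented.
domains:
  main: "http://www.example.com"

# Proposed option: how many levels of links to follow from the start page.
# 0 = only take links found on the homepage itself; 1 = also follow links
# found on those pages; and so on.
spider_depth: 0

screen_widths:
  - 320
  - 768
  - 1280

directory: "shots"
history_dir: "shots_history"   # baseline shots for history-mode comparisons
```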

Our use case is that we're using history mode to compare sites before and after a deployment. We want to capture more pages than just the homepage, but we also don't want to capture every page on a site, since that can be 1000+ pages and would take far too long for the sort of differences we're expecting. What we really want is a representative sample of the "main" pages on the site, which is usually the homepage plus any pages linked to from the homepage (so a depth of zero).

Thanks!