clemfromspace / scrapy-selenium

Scrapy middleware to handle javascript pages using selenium

Keep browser window open after scraping?

wondering639 opened this issue

How can one keep the browser window open after scraping has finished (or aborted)? Thanks!

@wondering639 There's only a single browser instance created for the lifecycle of the downloader middleware. I suppose all you'd need to do is avoid running this statement when your crawl is finished: https://github.com/clemfromspace/scrapy-selenium/blob/develop/scrapy_selenium/middlewares.py#L139

You can probably accomplish that by subclassing SeleniumMiddleware and overriding spider_closed(), without having to modify any code in the package itself.
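A minimal sketch of that subclassing approach. The stand-in base class below mimics scrapy-selenium's SeleniumMiddleware (whose spider_closed() calls self.driver.quit()) so the example runs on its own; in a real project you would subclass scrapy_selenium.SeleniumMiddleware directly and register your subclass in DOWNLOADER_MIDDLEWARES instead of the original.

```python
class SeleniumMiddleware:
    """Stand-in for scrapy_selenium.SeleniumMiddleware, shown only so
    this sketch is self-contained. The real middleware closes the
    browser in spider_closed()."""

    def __init__(self, driver):
        self.driver = driver

    def spider_closed(self):
        # This is the call that closes the browser window.
        self.driver.quit()


class KeepBrowserOpenMiddleware(SeleniumMiddleware):
    """Subclass that skips driver.quit() so the window stays open
    after the crawl finishes or is aborted."""

    def spider_closed(self):
        # Deliberately do nothing: the parent's quit() is never run,
        # so the Selenium-driven browser remains open.
        pass
```

Note that skipping quit() means the browser process is never cleaned up by Scrapy, so you become responsible for closing it yourself once you're done inspecting the page.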