scrapy / scrapyd

A service daemon to run Scrapy spiders

Home Page: https://scrapyd.readthedocs.io/en/stable/

scrapyd schedule API with environment variables

ahaffar opened this issue

  • Scrapyd: 1.2.1
  • Scrapy: 2.5.0
  • OS: Docker (slim-buster)
  • Python: 3.8

Running the spider directly with Scrapy works fine, e.g. scrapy crawl myspider -a. But when the same spider is scheduled through the Scrapyd schedule API, e.g. curl http://localhost:6800/schedule.json -d project=prj_spiders -d spider=zz_spider, the process launched by scrapyd cannot find any of my custom environment variables.
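For reference, schedule.json also accepts per-run settings and arbitrary extra fields that are passed to the spider as arguments, so a value can be handed to each run explicitly instead of relying on the daemon's environment. A minimal sketch using Python's requests library (the api_token field is a hypothetical spider argument, not something from this report):

    import requests

    # Schedule a crawl through scrapyd's schedule.json endpoint.
    # Fields scrapyd does not recognise itself (here "api_token") are
    # forwarded to the spider as keyword arguments, so secrets can be
    # supplied per run rather than through the daemon's environment.
    response = requests.post(
        "http://localhost:6800/schedule.json",
        data={
            "project": "prj_spiders",
            "spider": "zz_spider",
            "setting": "DOWNLOAD_DELAY=2",      # optional per-run Scrapy setting
            "api_token": "value-from-caller",   # hypothetical spider argument
        },
    )
    print(response.json())  # e.g. {"status": "ok", "jobid": "..."}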

I tried copying the environment variables into /etc/environment and /etc/profile.d/secrets.sh, but neither worked. 😢
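For context, the spider-side lookup presumably looks something like the sketch below; the variable name API_TOKEN and the spider body are assumptions, not taken from the report. Accepting the value as a spider argument as well gives schedule.json a way to supply it directly when the environment fallback comes back empty:

    import os

    import scrapy

    class ZZSpider(scrapy.Spider):
        name = "zz_spider"

        def __init__(self, api_token=None, *args, **kwargs):
            super().__init__(*args, **kwargs)
            # Prefer a value passed as a spider argument (scrapy crawl -a /
            # schedule.json), and fall back to the process environment. The
            # environment fallback is what stays empty when the scrapyd-launched
            # process does not inherit the variable.
            self.api_token = api_token or os.environ.get("API_TOKEN")

        def start_requests(self):
            yield scrapy.Request(
                "https://example.com/",
                headers={"Authorization": f"Bearer {self.api_token}"},
            )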

Please advise.

This issue was caused by sudo not passing environment variables through to the launched process.
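For what it's worth, sudo's default env_reset policy hands the child a stripped-down environment, which matches the behaviour described here. A minimal sketch (using a plain subprocess in place of the sudo/scrapyd launch, with a hypothetical API_TOKEN variable) of why the value never reaches the spider:

    import os
    import subprocess
    import sys

    os.environ["API_TOKEN"] = "secret"  # set in the parent (shell, Dockerfile, etc.)

    # By default the child inherits the parent's environment, so the variable is visible:
    subprocess.run([sys.executable, "-c",
                    "import os; print(os.environ.get('API_TOKEN'))"])  # prints: secret

    # With an explicit, stripped-down env dict (roughly what sudo's env_reset does),
    # the variable never reaches the child:
    subprocess.run([sys.executable, "-c",
                    "import os; print(os.environ.get('API_TOKEN'))"],
                   env={"PATH": os.environ.get("PATH", "")})  # prints: None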

Closing the issue.