We have a fairly standard Scrapy project, laid out something like this:
    project/
        setup.py
        scrapy.cfg
        SOME_DIR_WITH_PYTHON_MODULE/
            __init__.py
        project/
            settings.py
            pipelines.py
            __init__.py
            spiders/
                __init__.py
                somespider.py
Everything works fine when we run it from the command line:

    scrapy crawl somespider
But when we deploy it and run it with Scrapyd, it fails to import the code from SOME_DIR_WITH_PYTHON_MODULE. For some reason it just doesn't see the code there.
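For context on why the two environments might differ: scrapyd-deploy packages the project into an egg built from setup.py, so only the packages listed there end up on Scrapyd's import path, while a direct `scrapy crawl` run has the project root on sys.path. If that is the cause (an assumption, since our setup.py isn't shown here), a sketch of a setup.py that pulls in every package under the project root might look like this; the `name` and `version` values are placeholders:

    # setup.py -- sketch; assumes SOME_DIR_WITH_PYTHON_MODULE sits at the
    # project root next to scrapy.cfg and contains an __init__.py
    from setuptools import setup, find_packages

    setup(
        name="project",
        version="1.0",
        # find_packages() collects every directory that has an __init__.py,
        # including SOME_DIR_WITH_PYTHON_MODULE, so it gets into the egg
        packages=find_packages(),
        entry_points={"scrapy": ["settings = project.settings"]},
    )

With only `packages=["project", "project.spiders"]` (or similar) in setup.py, the extra directory would be silently left out of the egg, which would match the symptom we see.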
We tried importing it in pipelines.py, both like this:

    from project.SOME_DIR_WITH_PYTHON_MODULE import *

and like this:

    from SOME_DIR_WITH_PYTHON_MODULE import *

Neither worked under Scrapyd, although both work when the spider is run directly from the command line with scrapy crawl.
What should we do to make it work?