running 'scrapy crawl' with more than one spider is not supported
Package:
scrapy

Exception Class:
UsageError
Raise code
# Excerpt from scrapy/commands/crawl.py; these are methods of the crawl Command class.
from scrapy.exceptions import UsageError

def short_desc(self):
    return "Run a spider"

def run(self, args, opts):
    if len(args) < 1:
        raise UsageError()
    elif len(args) > 1:
        # 'scrapy crawl' accepts exactly one spider name
        raise UsageError("running 'scrapy crawl' with more than one spider is not supported")
    spname = args[0]

    crawl_defer = self.crawler_process.crawl(spname, **opts.spargs)

    # A failure already stored on the deferred means the crawl could not start
    if getattr(crawl_defer, 'result', None) is not None and issubclass(crawl_defer.result.type, Exception):
        self.exitcode = 1
    else:
        self.crawler_process.start()
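
The run() method enforces exactly one positional argument: calling the command with no spider name raises a bare UsageError, and passing two or more names (for example, scrapy crawl spider1 spider2) raises the UsageError quoted above. The supported way to run several spiders in one process is Scrapy's CrawlerProcess API; a sketch follows the source link below.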
Links to the raise (1)
https://github.com/scrapy/scrapy/blob/ee682af3b06d48815dbdaa27c1177b94aaf679e1/scrapy/commands/crawl.py#L19
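
A minimal workaround sketch, assuming two hypothetical spider classes (Spider1 and Spider2). Scrapy's documented CrawlerProcess API can schedule multiple spiders in the same process, which is exactly what the crawl command refuses to do:

import scrapy
from scrapy.crawler import CrawlerProcess

class Spider1(scrapy.Spider):
    name = "spider1"  # hypothetical example spider
    start_urls = ["https://example.com"]

    def parse(self, response):
        yield {"url": response.url}

class Spider2(scrapy.Spider):
    name = "spider2"  # hypothetical example spider
    start_urls = ["https://example.org"]

    def parse(self, response):
        yield {"url": response.url}

process = CrawlerProcess()
process.crawl(Spider1)  # schedule both spiders before starting the reactor
process.crawl(Spider2)
process.start()  # blocks until both crawls finish

Inside an existing project, constructing the process as CrawlerProcess(get_project_settings()) (with get_project_settings imported from scrapy.utils.project) picks up the project settings and also lets you schedule spiders by name.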