
Terminate Scrapy Crawl Manually

When I try to run a spider in Scrapy after having previously run it with other parameters, I get this error message: crawl: error: running 'scrapy crawl' with more than one spider is

Solution 1:

I use an incremented counter to break out of the loop when I'm testing:

def parse(self, response):
    i = 0
    for sel in response.xpath('something'):
        if i > 2:
            break
        # something
        i += 1
        # something
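Besides a manual counter, Scrapy lets a callback stop the whole crawl by raising the CloseSpider exception (from scrapy.exceptions). The sketch below shows that control flow with a stand-in exception class, so it runs without Scrapy installed; the names selectors and limit are illustrative, not part of any API.

```python
# Sketch of stopping a crawl from inside a callback by raising CloseSpider.
# In a real spider you would use: from scrapy.exceptions import CloseSpider
# A stand-in class is defined here so the flow can be shown without Scrapy.

class CloseSpider(Exception):
    """Stand-in for scrapy.exceptions.CloseSpider."""
    def __init__(self, reason="cancelled"):
        super().__init__(reason)
        self.reason = reason

def parse(selectors, limit=3):
    """Yield at most `limit` results, then ask the engine to stop."""
    for i, sel in enumerate(selectors):
        if i >= limit:
            # Scrapy's engine catches this and shuts the spider down cleanly
            raise CloseSpider(reason="test limit reached")
        yield sel
```

In a real spider the engine catches CloseSpider, finishes pending requests, and closes the spider with the given reason logged in the stats.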

Solution 2:

I suspect you are passing the command-line arguments in the wrong way. Simply running scrapy crawl <spidername> will work fine. If you pass extra positional arguments without the proper option specifiers (for example, spider arguments need -a key=value), Scrapy interprets them as additional spider names and raises this error.

To terminate all running Scrapy processes on Linux, you can find and kill them with the following command in a terminal:

pkill scrapy 

On Windows, use PsKill from the Sysinternals suite instead.
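A gentler alternative to killing the process is Scrapy's built-in CloseSpider extension, which stops a crawl on its own once a configured threshold is reached. A sketch of the relevant settings (the threshold values here are examples, not recommendations):

```python
# settings.py -- thresholds for Scrapy's built-in CloseSpider extension
# (example values; set only the ones you need)
CLOSESPIDER_ITEMCOUNT = 10   # close after 10 items have been scraped
CLOSESPIDER_PAGECOUNT = 20   # close after 20 responses have been crawled
CLOSESPIDER_TIMEOUT = 60     # close after the spider has run for 60 seconds
```

With any of these set, the spider closes cleanly and the close reason appears in the crawl stats, so no external kill command is needed.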
