I'm using Scrapy with Python 2.7.9, and my spider can crawl the data normally.
But the data is too big, so I want to split one spider's crawl into several parts and write each part to a different CSV file.
For example: in total I have 110,000 web pages, and I want the spider to write them out in batches of 30,000, 30,000, 30,000, and 20,000 respectively.
How can I do this? Can I do it in the spider alone, or does it belong in a pipeline?
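For reference, here is a minimal sketch of the kind of item pipeline I have in mind. The class name, batch sizes, and filename pattern are placeholders, and I'm not sure this is the idiomatic Scrapy way to do it:

```python
import csv


class SplitCSVPipeline(object):
    """Item pipeline sketch: writes items to a sequence of CSV files,
    starting a new file once the current batch is full.
    Batch sizes and the filename pattern below are placeholders."""

    def __init__(self, batch_sizes=(30000, 30000, 30000, 20000),
                 name_pattern='items_part%d.csv'):
        self.batch_sizes = list(batch_sizes)
        self.name_pattern = name_pattern
        self.batch_index = 0   # which output file we are on
        self.written = 0       # items written to the current file
        self.file = None
        self.writer = None

    def open_spider(self, spider):
        self._open_next_file()

    def close_spider(self, spider):
        if self.file:
            self.file.close()

    def _open_next_file(self):
        if self.file:
            self.file.close()
        filename = self.name_pattern % (self.batch_index + 1)
        # On Python 2.7, use open(filename, 'wb') instead
        self.file = open(filename, 'w', newline='')
        self.writer = None  # header is written lazily from the first item

    def process_item(self, item, spider):
        row = dict(item)
        if self.writer is None:
            self.writer = csv.DictWriter(self.file, fieldnames=sorted(row))
            self.writer.writeheader()
        self.writer.writerow(row)
        self.written += 1
        # Current batch is full and more batches remain: rotate the file
        if (self.batch_index < len(self.batch_sizes) - 1
                and self.written >= self.batch_sizes[self.batch_index]):
            self.batch_index += 1
            self.written = 0
            self._open_next_file()
        return item
```

I assume I would then enable it in `settings.py` via `ITEM_PIPELINES`, but I don't know whether this pipeline approach is better than doing the splitting inside the spider itself.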
Any help would be appreciated.