Running multiple Scrapy spiders in sequence
阿新 • Published: 2018-02-04
# -*- coding:utf-8 -*-

from scrapy import cmdline
from scrapy.cmdline import execute
import sys, time, os

# Runs every spider: os.system() blocks until each command finishes,
# so the second crawl starts after the first one completes
os.system('scrapy crawl ccdi')
os.system('scrapy crawl ccxi')
# -----------------------------------------------------

# Only the first spider runs: cmdline.execute() calls sys.exit()
# when the crawl finishes, so the second call is never reached
cmdline.execute('scrapy crawl ccdi'.split())
cmdline.execute('scrapy crawl ccxi'.split())
# -----------------------------------------------------

# Only the first spider runs here as well, for the same reason
sys.path.append(os.path.dirname(os.path.abspath(__file__)))
execute(["scrapy", "crawl", "shanghaione"])
time.sleep(30)

sys.path.append(os.path.dirname(os.path.abspath(__file__)))
execute(["scrapy", "crawl", "shanghaitwo"])