SpiderKeeper Usage Guide

Install the dependencies:
pip install scrapy
pip install scrapyd
pip install scrapyd-client
pip install spiderkeeper
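To confirm all four packages installed cleanly, you can query pip (a quick sanity check; pip show accepts multiple package names):

# Print the installed version and location of each package
pip show scrapy scrapyd scrapyd-client spiderkeeper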
scrapy.cfg configuration
[settings]
default = crawler.settings

[deploy:worker]
url = http://0.0.0.0:6800/
project = crawler
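The url under [deploy:worker] must point at the scrapyd instance you are about to start (0.0.0.0 effectively means the local machine on most systems; http://localhost:6800/ is the safer choice). To check that scrapyd-client picks the target up, list the targets from the project root; this is a standard scrapyd-client command:

# List deploy targets read from scrapy.cfg
scrapyd-deploy -l
# Expected: worker    http://0.0.0.0:6800/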
1. In the top-level directory of the crawler project, run:
scrapyd
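scrapyd serves on port 6800 by default. You can confirm it is up via its HTTP API (daemonstatus.json is a standard scrapyd endpoint):

# Verify the scrapyd daemon is accepting requests
curl http://127.0.0.1:6800/daemonstatus.json
# A healthy instance answers with {"status": "ok", ...}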

2. In a manager directory for SpiderKeeper (create it first), run:
spiderkeeper
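SpiderKeeper's CLI also takes flags to bind the web UI and point it at scrapyd; the sketch below uses flags from SpiderKeeper's --help output together with its default admin/admin login (verify the flags against your installed version):

# Serve the UI on port 5000 and connect to the local scrapyd
spiderkeeper --server=http://localhost:6800 --port=5000 --username=admin --password=admin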

3. In the top-level directory of the crawler project, run:
scrapyd-deploy --build-egg output.egg
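--build-egg only packages the project into output.egg for manual upload in step 7. As an alternative, scrapyd-client can push the egg straight to the worker target from scrapy.cfg (scrapyd must already be running; note this registers the project with scrapyd directly, not with SpiderKeeper):

# Build and deploy to the scrapyd target named "worker" in one step
scrapyd-deploy worker -p crawler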

4. Open in your browser:
http://127.0.0.1:5000

5. Create a project; any name works, e.g. work.

6. Open in your browser:
http://127.0.0.1:5000/project/1/spider/deploy

7. Select the output.egg file and submit it.
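SpiderKeeper forwards the uploaded egg to scrapyd; the equivalent direct call is scrapyd's addversion.json endpoint (shown only for illustration; the version string is arbitrary):

# Upload output.egg to scrapyd via its HTTP API
curl http://127.0.0.1:6800/addversion.json -F project=crawler -F version=1.0 -F egg=@output.egg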

8. Add a job, set its schedule, and start it running.
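A scheduled run ultimately calls scrapyd's schedule.json endpoint with the project and spider name; as a rough illustration (the spider name example_spider is a placeholder for one of your own):

# Manually trigger one spider run on scrapyd
curl http://127.0.0.1:6800/schedule.json -d project=crawler -d spider=example_spider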

Reposted from: https://juejin.im/post/5d023d7df265da1bd4247ab7
