Downloader Middleware
What it is typically used for:
- Rotating proxy IPs
- Rotating Cookies
- Rotating the User-Agent
- Automatic retries
Proxy Middleware
When a Scrapy project is created, a middlewares.py file is generated automatically; the plural "s" hints that this one file can hold many middlewares. Put simply, each class is one middleware.
```python
import random

class ProxyMiddleware:
    def process_request(self, request, spider):
        # Pick a random proxy from the PROXIES list in settings.py
        # (access settings via spider.settings; there is no importable
        # global settings object in Scrapy)
        proxy = random.choice(spider.settings.getlist('PROXIES'))
        request.meta['proxy'] = proxy
```
Then add the following to settings.py:
```python
# ↓↓ add the proxy list
PROXIES = [
    "http://112.87.69.135:9999",
    "https://125.123.153.131:3000",
]

# ↓↓ activate the middlewares (uncomment the dict)
DOWNLOADER_MIDDLEWARES = {
    # This entry is generated by default; the larger the number, the lower the priority
    'projectName.middlewares.ProjectnameDownloaderMiddleware': 543,
    'projectName.middlewares.ProxyMiddleware': 544,
}
```
You can check your current IP address here; the site also lets you append /10 to the URL to jump to page ten.
You can also scrape free-proxy sites to build a small pool of your own and draw proxies from it at random. (If money is no object, just buy proxies; paid ones die far less often.)
- Write a small crawler that scrapes the major free-proxy sites, validates each proxy, and saves the working ones to a database or Redis
- In process_request, fetch one proxy from the database at random
- Periodically re-validate the pool and purge dead proxies
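The three steps above can be sketched as a small pool class. This is only an illustrative design, with an in-memory dict standing in for the Redis/database storage mentioned above; the class and method names are my own, not from any library:

```python
import random
import time

class ProxyPool:
    """Toy proxy pool: in a real setup the dict below would be
    Redis or a database table."""

    def __init__(self):
        # proxy URL -> timestamp of the last successful validation
        self._alive = {}

    def add(self, proxy):
        # Step 1: a scraper validates a proxy, then stores it here
        self._alive[proxy] = time.time()

    def random_proxy(self):
        # Step 2: what process_request would call to get one proxy
        return random.choice(list(self._alive)) if self._alive else None

    def prune(self, validate, max_age=600):
        # Step 3: periodic sweep dropping stale or dead proxies
        now = time.time()
        for proxy, checked in list(self._alive.items()):
            if now - checked > max_age or not validate(proxy):
                del self._alive[proxy]
            else:
                self._alive[proxy] = now

pool = ProxyPool()
pool.add("http://112.87.69.135:9999")
pool.add("https://125.123.153.131:3000")
picked = pool.random_proxy()
# Toy validator for demonstration: keep only plain-http proxies
pool.prune(lambda p: p.startswith("http://"))
```

In a middleware, `process_request` would just call `pool.random_proxy()` and assign the result to `request.meta['proxy']`.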
User-Agent Middleware
Almost the same pattern as the proxy middleware:
```python
import random

class UAMiddleware:
    def process_request(self, request, spider):
        # Pick a random UA from the USER_AGENT_LIST in settings.py
        ua = random.choice(spider.settings.getlist('USER_AGENT_LIST'))
        request.headers['User-Agent'] = ua
```
Then activate it in settings.py. Visit here to see your own UA, as well as the page-ten URL.
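Activation works exactly like the proxy middleware; a settings.py fragment might look like this (the module path and priority number are assumptions following the earlier example):

```python
DOWNLOADER_MIDDLEWARES = {
    # Path assumed: adjust projectName to match your project
    'projectName.middlewares.UAMiddleware': 545,
}
```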
A pool of commonly used UA strings for reference:
```python
USER_AGENT_LIST = [
    'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36 OPR/26.0.1656.60',
    'Opera/8.0 (Windows NT 5.1; U; en)',
    'Mozilla/5.0 (Windows NT 5.1; U; en; rv:1.8.1) Gecko/20061208 Firefox/2.0.0 Opera 9.50',
    'Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; en) Opera 9.50',
    'Mozilla/5.0 (Windows NT 6.1; WOW64; rv:34.0) Gecko/20100101 Firefox/34.0',
    'Mozilla/5.0 (X11; U; Linux x86_64; zh-CN; rv:1.9.2.10) Gecko/20100922 Ubuntu/10.10 (maverick) Firefox/3.6.10',
    'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/534.57.2 (KHTML, like Gecko) Version/5.1.7 Safari/534.57.2',
    'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.71 Safari/537.36',
    'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.64 Safari/537.11',
    'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.133 Safari/534.16',
    'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/30.0.1599.101 Safari/537.36',
    'Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko',
    'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.11 (KHTML, like Gecko) Chrome/20.0.1132.11 TaoBrowser/2.0 Safari/536.11',
    'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/21.0.1180.71 Safari/537.1 LBBROWSER',
    'Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; QQDownload 732; .NET4.0C; .NET4.0E)',
    'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.84 Safari/535.11 SE 2.X MetaSr 1.0',
    'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Trident/4.0; SV1; QQDownload 732; .NET4.0C; .NET4.0E; SE 2.X MetaSr 1.0)',
]
```
Cookies Middleware
Cookies are usually what keeps you logged in. You can store multiple sets of Cookies in Redis and then, in the middleware's process_request method, set:

```python
request.cookies = cookies
```
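Put together, such a middleware might look like the sketch below. The in-memory COOKIES_POOL list (and its placeholder session values) stands in for the Redis storage described above, and is an assumption of this example:

```python
import random

class CookiesMiddleware:
    """Sketch: rotate logged-in cookie sets per request."""

    # Stand-in for cookies fetched from Redis; values are placeholders
    COOKIES_POOL = [
        {'sessionid': 'abc123'},
        {'sessionid': 'def456'},
    ]

    def process_request(self, request, spider):
        # Scrapy merges request.cookies into the outgoing Cookie header
        request.cookies = random.choice(self.COOKIES_POOL)
```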
Integrating Selenium via a Middleware
Example code:
```python
import time

from selenium import webdriver
from scrapy.http import HtmlResponse

class SeleniumMiddleware:
    def __init__(self):
        self.driver = webdriver.Chrome('./chromedriver')

    def process_request(self, request, spider):
        # Only take over requests from the spider that needs JS rendering
        if spider.name == 'seleniumSpider':
            self.driver.get(request.url)
            time.sleep(2)  # crude wait for the page to render
            body = self.driver.page_source
            # Returning a Response here short-circuits the normal download:
            # Scrapy hands this HtmlResponse straight to the spider
            return HtmlResponse(self.driver.current_url, body=body,
                                encoding='utf-8', request=request)
```
