Note: please don't scrape large amounts of data — this is for learning purposes only.
Analysis:
- Analyze the business requirement (in this example, rental housing listings)
- Find the relevant pages (using Lianjia as the example)
- Analyze the URLs, find the content we need, and establish a connection
- Locate the data
- Store the data
First, open the Lianjia homepage, click "Rent" (租房), and press F12 to inspect the page for the information we need. As shown below:
Page 1 URL: https://bj.lianjia.com/zufang/
Page 2 URL: https://bj.lianjia.com/zufang/pg2/
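The page-2 URL suggests a `pg{n}` pattern for pagination. A quick sketch for generating the page URLs (it is an assumption here that `pg1` resolves to the same listings as the bare `/zufang/` path):

```python
# Build URLs for the first few result pages.
# Assumption: /zufang/pg1 returns the same page as /zufang/.
base = 'https://bj.lianjia.com/zufang/pg{}'
urls = [base.format(page) for page in range(1, 4)]
print(urls)
```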
Then locate the information we need, as shown in the screenshot below.
Now let's implement the analysis in code: fetch the data, then locate each field.
Key code:
```python
# Build the URL for each page number
url = 'https://bj.lianjia.com/zufang/pg{}'.format(page)

# Locate the data with XPath
...
html_pipei = html_ele.xpath('//ul[@id="house-lst"]/li')
for pipei_one in html_pipei:
    title = pipei_one.xpath('./div[2]/h2/a')[0].text
    region = pipei_one.xpath('./div[2]/div[1]/div[1]/a/span')[0].text
    ...
```
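One caveat: indexing XPath results with `[0]` raises `IndexError` whenever a listing `<li>` lacks the expected layout (an ad block, for instance). A small defensive helper, demonstrated on a minimal HTML fragment that mimics the assumed list structure (the fragment and helper name are illustrative, not part of the original code):

```python
from lxml import etree

def first_text(node, path, default=''):
    """Return the text of the first match for `path`, or `default` if none."""
    matches = node.xpath(path)
    return matches[0].text if matches else default

# Minimal fragment mimicking the assumed structure of one listing <li>
html = etree.HTML('''
<ul id="house-lst">
  <li><div></div><div><h2><a>Sample listing</a></h2></div></li>
</ul>
''')
for li in html.xpath('//ul[@id="house-lst"]/li'):
    title = first_text(li, './div[2]/h2/a')
    print(title)
```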
The complete code:
```python
import requests
from lxml import etree
import pymysql


class Mysql(object):
    '''Wrapper class for database operations'''

    def __init__(self):
        '''Connect to the database and create a cursor'''
        self.db = pymysql.connect(host="localhost", user="root", password="8888", database="test")
        self.cursor = self.db.cursor()

    def mysql_op(self, sql, data):
        '''Execute a MySQL statement'''
        self.cursor.execute(sql, data)
        self.db.commit()

    def __del__(self):
        '''Close the cursor and the connection'''
        self.cursor.close()
        self.db.close()


# Database helper instance
Insert = Mysql()

# SQL statement to execute (parameterized, so values are escaped by the driver)
sql = '''INSERT INTO lianjia (title, region, zone, meters, location, price) VALUES(%s, %s, %s, %s, %s, %s)'''

# Request headers
headers = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.99 Safari/537.36'
}


def download_msg():
    for page in range(1, 2):
        url = 'https://bj.lianjia.com/zufang/pg{}'.format(page)
        responses = requests.get(url, headers=headers)
        html = responses.text
        # Parse with lxml and locate each listing via XPath
        html_ele = etree.HTML(html)
        html_pipei = html_ele.xpath('//ul[@id="house-lst"]/li')
        for pipei_one in html_pipei:
            title = pipei_one.xpath('./div[2]/h2/a')[0].text
            region = pipei_one.xpath('./div[2]/div[1]/div[1]/a/span')[0].text
            zone = pipei_one.xpath('./div[2]/div[1]/div[1]/span[1]/span')[0].text
            meters = pipei_one.xpath('./div[2]/div[1]/div[1]/span[2]')[0].text
            location = pipei_one.xpath('./div[2]/div[1]/div[1]/span[3]')[0].text
            price = pipei_one.xpath('.//div[@class="price"]/span')[0].text
            data = (title, region, zone, meters, location, price)
            Insert.mysql_op(sql, data)


if __name__ == '__main__':
    download_msg()
```
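The INSERT statement assumes a `lianjia` table already exists in the `test` database. A plausible schema matching the INSERT's column list (the column types and lengths are an assumption — adjust as needed):

```sql
-- Hypothetical schema; only the column names are taken from the INSERT above
CREATE TABLE IF NOT EXISTS lianjia (
    id INT AUTO_INCREMENT PRIMARY KEY,
    title VARCHAR(255),
    region VARCHAR(64),
    zone VARCHAR(64),
    meters VARCHAR(32),
    location VARCHAR(64),
    price VARCHAR(32)
);
```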