Scraping 20 Million Douban Book Summaries with Python: (7) Proxy IPs

Author: 曹波波 | Published 2017-11-08 16:13

    This is the full debugging process, which I have organized into notes and share here:
    Scraping 20 Million Douban Book Summaries with Python: (1) Analyzing the target API
    Scraping 20 Million Douban Book Summaries with Python: (2) Simple Python requests with urllib2
    Scraping 20 Million Douban Book Summaries with Python: (3) Exception handling
    Scraping 20 Million Douban Book Summaries with Python: (4) Multi-process concurrency
    Scraping 20 Million Douban Book Summaries with Python: (5) Database design
    Scraping 20 Million Douban Book Summaries with Python: (6) Database operation class
    Scraping 20 Million Douban Book Summaries with Python: (7) Proxy IPs
    Scraping 20 Million Douban Book Summaries with Python: (8) Summary

    Proxy IPs

    If we hit the Douban book API from a single IP, and do it with the multi-process approach described earlier, it is practically a DoS attack, and blocking us would be trivial for Douban.

    So we need to grab a batch of usable proxy IPs from dedicated proxy-list sites and route our requests through them; that makes getting blocked much less likely.
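
    With urllib2, sending a request through a proxy only takes a ProxyHandler-based opener. Here is a minimal sketch of the pattern used throughout this part (the proxy address is a placeholder; substitute a working ip:port):

    # -*- coding:utf-8 -*-
    # Minimal sketch: one request to the Douban book API routed through an HTTPS proxy.
    # '1.2.3.4:8080' is a placeholder proxy, not a real one.
    import urllib2
    
    proxy = urllib2.ProxyHandler({'https': '1.2.3.4:8080'})
    opener = urllib2.build_opener(proxy)
    urllib2.install_opener(opener)  # every urllib2.urlopen call now goes through the proxy
    
    req = urllib2.Request('https://api.douban.com/v2/book/10554308',
                          headers={'User-Agent': 'Mozilla/5.0'})
    print urllib2.urlopen(req, timeout=10).read()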

    Below is the module I wrote specifically for fetching proxy IPs:

    # -*- coding:utf-8 -*-
    import urllib2
    import gc
    import socket
    import functools
    import ssl
    import sys
    from bs4 import BeautifulSoup
    import Crawler
    
    default_encoding = 'utf-8'
    if sys.getdefaultencoding() != default_encoding:
        # reload(sys) restores setdefaultencoding, which Python 2 removes at startup
        reload(sys)
        sys.setdefaultencoding(default_encoding)
    
    sys.path.append("..")
    # Fail fast on dead proxies: cap every socket operation (urllib2 included) at 20 seconds
    socket.setdefaulttimeout(20.0)
    
    
    # Debug print hook: silent by default, flip the comment below to enable verbose logging
    def cb_print(msg):
        # print msg
        pass
    
    # Force ssl.wrap_socket to use TLSv1
    def sslwrap(func):
        @functools.wraps(func)
        def bar(*args, **kw):
            kw['ssl_version'] = ssl.PROTOCOL_TLSv1
            return func(*args, **kw)
        return bar
    
    ssl.wrap_socket = sslwrap(ssl.wrap_socket)
    
    ip_arr = []
    
    # Fetch up to 2000 https proxies from the daxiangdaili API; the response is one ip:port per line
    def get_ip_arr():
        gc.enable()
        try:
            url = 'http://vtp.daxiangdaili.com/ip/?tid=559609709731038&num=2000&delay=1&protocol=https'
            headers = {"User-Agent": "Mozilla/5.0"}
            req = urllib2.Request(url, headers=headers)
            res = urllib2.urlopen(req, timeout=20)
            res = res.read()
            ips_arr = res.split('\r\n')
            return ips_arr
        except Exception as e:
            cb_print('ip_arr_error:{}'.format(e))
        gc.collect()
    
    # Scrape page `index` of www.66ip.cn and return its proxies as 'ip:port' strings
    def get_66_ip(index):
        gc.enable()
        try:
            url = 'http://www.66ip.cn/'+str(index)
            headers = {"User-Agent": "Mozilla/5.0"}
            req = urllib2.Request(url, headers=headers)
            res = urllib2.urlopen(req, timeout=20)
            res = res.read()
            # print res
            soup = BeautifulSoup(res, "html.parser")
            table_arr = soup('table')
            ip_soup_arr = table_arr[len(table_arr)-1]('tr')
            ips_arr = []
            for it in ip_soup_arr:
                if it != ip_soup_arr[0]:  # skip the table header row
                    ip = it('td')[0].string
                    port = it('td')[1].string
                    ip_port = ip + ':' + port
                    ips_arr.append(ip_port)
            return ips_arr
        except Exception as e:
            cb_print('ip_arr_error:{}'.format(e))
        gc.collect()
    
    
    # Scrape the first page of www.xicidaili.com/wn/ and return its proxies as 'ip:port' strings
    def get_xici_ip():
        gc.enable()
        try:
            url = 'http://www.xicidaili.com/wn/'
            headers = {"User-Agent": "Mozilla/5.0"}
            req = urllib2.Request(url, headers=headers)
            res = urllib2.urlopen(req, timeout=20)
            res = res.read()
            soup = BeautifulSoup(res, "html.parser")
            table_arr = soup('table')
            ip_soup_arr = table_arr[len(table_arr) - 1]('tr')
            ips_arr = []
            for it in ip_soup_arr:
                if it != ip_soup_arr[0]:  # skip the table header row
                    ip = it('td')[1].string
                    port = it('td')[2].string
                    ip_port = ip + ':' + port
                    ips_arr.append(ip_port)
            return ips_arr
        except Exception as e:
            cb_print('ip_arr_error:{}'.format(e))
        gc.collect()
    
    # Quick test (uncomment to run):
    # ip_arr = get_xici_ip()
    # print ip_arr
    
    

    Running the test:

    # Test: fetch proxies from xici and print them
    ip_arr = get_xici_ip()
    print ip_arr
    

    The output looks like this:


    [Screenshot: 屏幕快照 2017-11-08 下午3.46.09.png — the list of proxies printed by the test]

    That gives us a pool of proxy IPs: during normal crawling, whenever one IP stops working, we simply switch to the next proxy in the pool.
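
    One simple way to organize that switch is to keep an index into the proxy pool and advance it whenever a request through the current proxy fails. The helper below is only a rough sketch of the idea (fetch_with_rotation and current are illustrative names, not part of the crawler code):

    # Rough sketch of proxy rotation: on failure, advance to the next 'ip:port' in the pool.
    current = [0]  # mutable index so the position is kept across calls
    
    def fetch_with_rotation(ips, url, headers, max_tries=5):
        for _ in range(max_tries):
            ip = ips[current[0] % len(ips)]
            try:
                proxy = urllib2.ProxyHandler({'https': ip})
                opener = urllib2.build_opener(proxy)
                req = urllib2.Request(url, headers=headers)
                return opener.open(req, timeout=5).read()
            except Exception:
                current[0] += 1  # this proxy failed, move on to the next one
        return None  # every attempt failed

    Using opener.open directly, instead of install_opener, keeps the proxy choice local to each call rather than changing urllib2's global opener.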

    Next, let's write a test function.

    It uses the agent_arr array below so that requests appear to come from a variety of client environments; a short note on picking one at random follows the array.

    agent_arr = [
        "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.11 (KHTML, like Gecko) Chrome/20.0.1132.11 TaoBrowser/2.0 Safari/536.11",
        "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/21.0.1180.71 Safari/537.1 LBBROWSER",
        "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E; LBBROWSER)",
        "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; QQDownload 732; .NET4.0C; .NET4.0E; LBBROWSER)",
        "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.84 Safari/535.11 LBBROWSER",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; WOW64; Trident/5.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E) ",
        "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E; QQBrowser/7.0.3698.400) ",
        "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; QQDownload 732; .NET4.0C; .NET4.0E) ",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Trident/4.0; SV1; QQDownload 732; .NET4.0C; .NET4.0E; 360SE) ",
        "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; QQDownload 732; .NET4.0C; .NET4.0E) ",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; WOW64; Trident/5.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E) ",
        "Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/21.0.1180.89 Safari/537.1",
        "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/21.0.1180.89 Safari/537.1",
        "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; QQDownload 732; .NET4.0C; .NET4.0E) ",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; WOW64; Trident/5.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E) ",
        "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E) ",
        "Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.84 Safari/535.11 SE 2.X MetaSr 1.0",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Trident/4.0; SV1; QQDownload 732; .NET4.0C; .NET4.0E; SE 2.X MetaSr 1.0) ",
        "Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:16.0) Gecko/20121026 Firefox/16.0",
        "Mozilla/5.0 (iPad; U; CPU OS 4_2_1 like Mac OS X; zh-cn) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8C148 Safari/6533.18.5",
        "Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:2.0b13pre) Gecko/20110307 Firefox/4.0b13pre",
        "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:16.0) Gecko/20100101 Firefox/16.0",
        "Mozilla/5.0 (Windows; U; Windows NT 6.1; zh-CN; rv:1.9.2.15) Gecko/20110303 Firefox/3.6.15",
        "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.64 Safari/537.11",
        "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.64 Safari/537.11",
        "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.133 Safari/534.16",
        "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Win64; x64; Trident/5.0)",
        "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)",
        "Mozilla/5.0 (X11; U; Linux x86_64; zh-CN; rv:1.9.2.10) Gecko/20100922 Ubuntu/10.10 (maverick) Firefox/3.6.10",
        "Mozilla/5.0 (Linux; U; Android 2.2.1; zh-cn; HTC_Wildfire_A3333 Build/FRG83D) AppleWebKit/533.1 (KHTML, like Gecko) Version/4.0 Mobile Safari/533.1"
    ]
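
    In the test function below an agent is picked by index with random.randint; random.choice would do the same thing a little more directly:

    import random
    
    # pick one User-Agent string at random for the next request
    agent = random.choice(agent_arr)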
    

    The proxy IPs just scraped from the proxy sites are hard-coded here; in practice you would fetch them first and then validate them (a sketch of that follows the list).

    ips = ['42.202.130.246:3128',
           '119.90.63.3:3128',
           '61.158.111.142:53281',
           '14.211.34.194:9999',
           '61.160.208.222:8080',
           '112.228.215.122:8118',
           '118.178.239.41:3128',
           '58.62.86.216:9999']
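
    As a sketch of that fetch-then-validate idea (check_proxy and get_valid_ips are illustrative names, not part of the original code), the hard-coded list could instead be built like this:

    def check_proxy(ip):
        # Return True if a small test request through this proxy succeeds.
        try:
            proxy = urllib2.ProxyHandler({'https': ip})
            opener = urllib2.build_opener(proxy)
            req = urllib2.Request('https://api.douban.com/v2/book/10554308',
                                  headers={'User-Agent': 'Mozilla/5.0'})
            opener.open(req, timeout=5).read()
            return True
        except Exception:
            return False
    
    def get_valid_ips():
        # Fetch candidates from the proxy site and keep only the ones that actually work.
        candidates = get_xici_ip() or []
        return [ip for ip in candidates if check_proxy(ip)]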
    

    Here is a function that checks proxy availability, trying each proxy IP in turn:

    import random  # needed to pick a random User-Agent below
    
    def test_ip(num):
        gc.enable()
        for ip in ips:
            cb_print(ip)
            y = random.randint(0, len(agent_arr) - 1)
            agent = agent_arr[y]
            try:
                # route this attempt through the candidate proxy
                proxy = urllib2.ProxyHandler({'https': ip})
                opener = urllib2.build_opener(proxy)
                urllib2.install_opener(opener)
                url = 'https://api.douban.com/v2/book/' + str(num)
                headers = {"User-Agent": agent}
                req = urllib2.Request(url, headers=headers)
                res = urllib2.urlopen(req, timeout=5)
                res = res.read().encode("utf-8")
                print ('result: ' + res)
            except Exception as e:
                if not e:
                    cb_print('e = can not get e!')
                elif isinstance(e, urllib2.URLError):
                    if format(e) == 'HTTP Error 404: Not Found':
                        print ('insert_none_book_id')
                        # SqlOperation.insert_none_book_id(num)
                        continue
                    else:
                        cb_print('urllib2.URLError = ' + format(e))
                else:
                    print ('insert_error_book_id')
                    # SqlOperation.insert_error_book_id(num)
                    continue
            finally:
                gc.collect()
        cb_print('end!')
    
    # Check book id 10554308 through each proxy in the list
    test_ip(10554308)
    

    The output looks like this:


    [Screenshot: 屏幕快照 2017-11-08 下午4.09.39.png — the API responses returned through each proxy]

    All of the proxy IPs scraped earlier turned out to be usable.
