aiohttp: Asynchronous Web Crawling

Author: 派派森森 | Published 2019-01-20 20:12

    Making a request

    All the examples below assume `import asyncio` and `import aiohttp`, and use the `loop.run_until_complete(asyncio.wait(...))` pattern that was current when this article was written (on Python 3.7+, `asyncio.run()` is the shorter equivalent).

    import asyncio
    import aiohttp

    async def fetch():
        async with aiohttp.ClientSession() as session:
            async with session.get('https://www.baidu.com') as response:
                print(await response.text())

    loop = asyncio.get_event_loop()
    tasks = [fetch()]
    loop.run_until_complete(asyncio.wait(tasks))

    Adding request parameters

    params = {'key': 'value', 'page': 10}

    async def fetch():
        async with aiohttp.ClientSession() as session:
            async with session.get('https://www.baidu.com/s', params=params) as response:
                print(response.url)  # the final URL, with the query string appended

    loop = asyncio.get_event_loop()
    tasks = [fetch()]
    loop.run_until_complete(asyncio.wait(tasks))

    Custom User-Agent

    url = 'http://httpbin.org/user-agent'
    headers = {'User-Agent': 'test_user_agent'}

    async def fetch():
        async with aiohttp.ClientSession() as session:
            async with session.get(url, headers=headers) as response:
                print(await response.text())

    loop = asyncio.get_event_loop()
    tasks = [fetch()]
    loop.run_until_complete(asyncio.wait(tasks))

    Custom cookies

    url = 'http://httpbin.org/cookies'
    cookies = {'cookies_name': 'test_cookies'}

    async def fetch():
        async with aiohttp.ClientSession() as session:
            async with session.get(url, cookies=cookies) as response:
                print(await response.text())

    loop = asyncio.get_event_loop()
    tasks = [fetch()]
    loop.run_until_complete(asyncio.wait(tasks))

    POSTing form data

    url = 'http://httpbin.org/post'
    payload = {'username': 'zhang', 'password': '123456'}

    async def fetch():
        async with aiohttp.ClientSession() as session:
            async with session.post(url, data=payload) as response:
                print(await response.text())

    loop = asyncio.get_event_loop()
    tasks = [fetch()]
    loop.run_until_complete(asyncio.wait(tasks))

    POSTing a file

    url = 'http://httpbin.org/post'
    files = {'file': open('test.txt', 'rb')}

    async def fetch():
        async with aiohttp.ClientSession() as session:
            async with session.post(url, data=files) as response:
                print(await response.text())

    loop = asyncio.get_event_loop()
    tasks = [fetch()]
    loop.run_until_complete(asyncio.wait(tasks))

    Using a proxy

    url = 'http://python.org'

    async def fetch():
        async with aiohttp.ClientSession() as session:
            async with session.get(url, proxy='http://some.proxy.com') as response:
                print(response.status)

    loop = asyncio.get_event_loop()
    tasks = [fetch()]
    loop.run_until_complete(asyncio.wait(tasks))

    Proxy with authentication

    url = 'http://python.org'

    async def fetch():
        async with aiohttp.ClientSession() as session:
            proxy_auth = aiohttp.BasicAuth('user', 'pass')
            async with session.get(url, proxy='http://some.proxy.com', proxy_auth=proxy_auth) as response:
                print(response.status)

    loop = asyncio.get_event_loop()
    tasks = [fetch()]
    loop.run_until_complete(asyncio.wait(tasks))

    # Embedding the credentials in the proxy URL also works:
    url = 'http://python.org'

    async def fetch():
        async with aiohttp.ClientSession() as session:
            async with session.get(url, proxy='http://user:pass@some.proxy.com') as response:
                print(response.status)

    loop = asyncio.get_event_loop()
    tasks = [fetch()]
    loop.run_until_complete(asyncio.wait(tasks))
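The point of an asynchronous crawler is issuing many requests concurrently rather than one after another. A minimal sketch of that pattern with `asyncio.gather`, using a stand-in `fetch` so it runs without a network (a real crawler would replace the `asyncio.sleep` with the `session.get` calls shown above; the URLs here are made up):

```python
import asyncio

async def fetch(url):
    # Stand-in for an aiohttp request; a real crawler would do
    # `async with session.get(url) as response: return await response.text()`
    await asyncio.sleep(0.1)  # simulate network latency
    return 'body of %s' % url

async def main():
    urls = ['https://example.com/page/%d' % i for i in range(5)]
    # gather schedules all five fetches at once, so total wall time
    # is roughly 0.1 s instead of 5 * 0.1 s run sequentially
    return await asyncio.gather(*(fetch(u) for u in urls))

results = asyncio.run(main())
print(results)
```

`gather` preserves input order, so `results[0]` corresponds to the first URL even if its response arrives last.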


Original article: https://www.haomeiwen.com/subject/prrfjqtx.html