Sanic performance comparison

Author: 霡霂976447044 | Published 2019-07-03 14:16

    Sanic is an asynchronous web framework built on uvloop; in my experience its performance beats the big three Python web frameworks.

    Install the wrk load-testing tool

    git clone https://github.com/wg/wrk.git
    cd wrk
    make
    

    Test environment

    python3.6.8
    tornado==6.0.2
    sanic==19.3.1
    CPU: Intel(R) Core(TM) i5-8250U CPU @ 1.60GHz

    Benchmark command (10 threads, 200 concurrent connections, 1 minute):

    wrk -t10 -d1m -c200 http://127.0.0.1:8080
    

    1. Testing the Python standard library's HTTPServer

    from http.server import BaseHTTPRequestHandler, HTTPServer


    class GetHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Answer every GET with a plain "Hello World" body
            message = "Hello World"
            self.send_response(200)
            self.end_headers()
            self.wfile.write(message.encode('utf-8'))


    if __name__ == '__main__':
        server = HTTPServer(('localhost', 8080), GetHandler)
        server.serve_forever()
    

    wrk -t10 -d1m -c200 http://127.0.0.1:8080
    Results:

    Running 1m test @ http://127.0.0.1:8080
      10 threads and 200 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency     6.25ms   61.93ms   1.67s    99.04%
        Req/Sec   556.71    549.62     1.76k    67.14%
      4261 requests in 1.01m, 424.44KB read
      Socket errors: connect 0, read 4261, write 0, timeout 0
    Requests/sec:     70.28
    Transfer/sec:      7.00KB
    

    Only about 70 requests per second. The stock HTTPServer is synchronous and serves one connection at a time, and the read errors wrk reports are most likely connections being closed after every response (no keep-alive by default).
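
    As a rough illustration (not part of the original benchmark), the same handler can be made to serve connections concurrently by mixing in socketserver.ThreadingMixIn; the ThreadingHTTPServer class below is defined locally because Python 3.6 does not ship one:

    from http.server import BaseHTTPRequestHandler, HTTPServer
    from socketserver import ThreadingMixIn


    class ThreadingHTTPServer(ThreadingMixIn, HTTPServer):
        # One thread per connection; don't let open connections block shutdown
        daemon_threads = True


    class GetHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"Hello World")


    if __name__ == '__main__':
        ThreadingHTTPServer(('localhost', 8080), GetHandler).serve_forever()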

    2. Testing Sanic

    from sanic import Sanic
    from sanic.response import json

    app = Sanic()


    @app.route("/")
    async def test(request):
        return json({"message": "hello world"})


    if __name__ == "__main__":
        app.run(host="0.0.0.0", debug=False, port=8080, access_log=False, workers=1)
    

    This run uses a single worker process, i.e. one CPU core. Results:

    Running 1m test @ http://127.0.0.1:8080
      10 threads and 200 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency    10.58ms    2.46ms  48.66ms   86.05%
        Req/Sec     1.90k   343.11     4.43k    74.63%
      1136502 requests in 1.00m, 146.32MB read
    Requests/sec:  18915.42
    Transfer/sec:      2.44MB
    

    Sanic handles roughly 19k requests per second.

    3. Testing Tornado

    import tornado.ioloop
    import tornado.web
    
    class MainHandler(tornado.web.RequestHandler):
        def get(self):
            self.write("Hello, world")
    
    def make_app():
        return tornado.web.Application([
            (r"/", MainHandler),
        ])
    
    if __name__ == "__main__":
        app = make_app()
        app.listen(8080)
        tornado.ioloop.IOLoop.current().start()
    

    The Tornado handler returns plain text rather than JSON; testing it as-is, the results:

    Running 1m test @ http://127.0.0.1:8080
      10 threads and 200 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency    66.14ms    9.37ms 159.04ms   91.26%
        Req/Sec   304.24     88.16   600.00     30.50%
      181609 requests in 1.00m, 35.85MB read
    Requests/sec:   3021.95
    Transfer/sec:    610.88KB
    

    About 3k requests per second.
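
    A hedged aside that was not measured in the original post: Tornado 6 runs on asyncio's default event loop, so in principle it can also be pointed at uvloop by installing the uvloop event-loop policy before the IOLoop starts. A minimal sketch:

    import asyncio

    import uvloop
    import tornado.ioloop
    import tornado.web


    class MainHandler(tornado.web.RequestHandler):
        def get(self):
            self.write("Hello, world")


    if __name__ == "__main__":
        # Swap asyncio's default loop for uvloop before Tornado creates its IOLoop
        asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
        app = tornado.web.Application([(r"/", MainHandler)])
        app.listen(8080)
        tornado.ioloop.IOLoop.current().start()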

    4. Testing Go's standard library net/http

    package main
    
    import (
        "encoding/json"
        "fmt"
        "log"
        "net/http"
        "runtime"
        "time"
    )
    
    type Hello struct {
    }
    
    type Msg struct {
        Message string
    }
    
    func (h Hello) ServeHTTP(w http.ResponseWriter, r *http.Request) {
        time.Sleep(1) // sleeps 1 nanosecond, effectively a no-op
        msg := Msg{Message: "Hello world"}
        s, _ := json.Marshal(msg)
        fmt.Fprint(w, string(s))
    }
    
    func main() {
        h := Hello{}
        runtime.GOMAXPROCS(1) // limit Go to a single core, matching Sanic's one worker
        err := http.ListenAndServe("localhost:8080", h)
        if err != nil {
            log.Fatal(err)
        }
    }
    

    Here GOMAXPROCS is limited to a single core, putting Go on equal footing with Sanic's one worker. Results:

    Running 1m test @ http://127.0.0.1:8080
      10 threads and 200 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency     6.14ms   17.53ms 343.37ms   98.64%
        Req/Sec     4.54k   821.84    18.65k    80.78%
      2713334 requests in 1.00m, 367.44MB read
    Requests/sec:  45151.63
    Transfer/sec:      6.11MB
    

    About 45k requests per second. Even with Sanic switched to multiple worker processes and Go freed from the one-core limit, a gap of roughly 2x would remain between them.
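
    For reference, a minimal sketch (not benchmarked here) of the multi-worker Sanic setup that comparison assumes, spawning one worker process per CPU core:

    import multiprocessing

    from sanic import Sanic
    from sanic.response import json

    app = Sanic()


    @app.route("/")
    async def test(request):
        return json({"message": "hello world"})


    if __name__ == "__main__":
        # One worker process per CPU core instead of workers=1
        app.run(host="0.0.0.0", port=8080, access_log=False,
                workers=multiprocessing.cpu_count())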

    5. Testing Node.js

    var http = require('http');

    http.createServer(function (request, response) {

        // Send the HTTP headers
        // HTTP status: 200 OK
        // Content type: text/plain
        response.writeHead(200, {'Content-Type': 'text/plain'});

        // Send the response body "Hello World"
        response.end('Hello World\n');
    }).listen(8080);

    Results:
    
    Running 1m test @ http://127.0.0.1:8080
      10 threads and 200 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency     5.86ms    1.70ms  85.42ms   95.21%
        Req/Sec     3.46k   494.21     5.75k    89.76%
      2065598 requests in 1.00m, 307.31MB read
    Requests/sec:  34378.55
    Transfer/sec:      5.11MB
    

    Roughly 34k requests per second. Node.js is also built on libuv, but Python as a language is simply slower. Put another way, Sanic is a full framework, and its request-handling logic is considerably more complex than Go's net/http or Node's http module, so each request naturally needs more time.

    uvloop

    Sanic uses uvloop as its asyncio event loop implementation. uvloop itself is very fast; it is written in Cython. Paired with an HTTP parser written in C, its HTTP performance is on par with Node.js and can even compete with Go.
    https://magic.io/blog/uvloop-blazing-fast-python-networking/
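
    A minimal sketch of what "using uvloop" means in plain asyncio code, assuming the uvloop package is installed (Sanic does this wiring internally):

    import asyncio

    import uvloop


    async def main():
        # Ordinary asyncio code now runs on uvloop's event loop
        await asyncio.sleep(0.1)
        print(type(asyncio.get_event_loop()))  # <class 'uvloop.Loop'>


    if __name__ == "__main__":
        # Replace asyncio's default event loop with uvloop's implementation
        asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
        asyncio.get_event_loop().run_until_complete(main())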

    [Screenshot: benchmark chart referenced below (截图_2019-07-03_14-11-17.png)]

    Thoughts

    Could everything outside the business logic be written in C and exposed to Python through bindings, as in the chart above where the HTTP parsing is handled by httptools, so that the machine's performance is used as fully as possible?
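
    A rough sketch of that idea, assuming the httptools package and its callback-style parser API: the C-backed parser chews through the raw bytes while only the callbacks run as Python code.

    import httptools


    class ParserCallbacks:
        # httptools invokes these as it walks through the raw request bytes
        def on_url(self, url: bytes):
            print("url:", url)

        def on_header(self, name: bytes, value: bytes):
            print("header:", name, value)

        def on_message_complete(self):
            print("request fully parsed")


    if __name__ == "__main__":
        parser = httptools.HttpRequestParser(ParserCallbacks())
        parser.feed_data(b"GET /hello HTTP/1.1\r\nHost: localhost\r\n\r\n")
        print("method:", parser.get_method())  # b'GET'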

    Python's execution speed may not be the fastest, but once development speed is weighed in, choosing Python leaves you more time for everything else.

    -----------------<If anything in this article is incorrect, or you have suggestions, feel free to comment below>-----------------
