Nginx Stress Testing

Author: 小六的昵称已被使用 | Published 2019-08-13 10:15

    I. Installing the Testing Tools

    1. Install the ab testing tool

    1. Installation

    yum -y install httpd-tools
    
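    A quick way to confirm the install succeeded is to print the version:

    ab -V
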

    2. Basic usage

    ab -c 12 -n 500000 http://192.168.30.171/
        -n requests     Number of requests to perform
        -c concurrency  Number of multiple requests to make at a time
    
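    Two more ab flags worth knowing: -k enables HTTP keep-alive, and -t caps the run by wall-clock time (when both -t and -n are given, the run stops at whichever limit is hit first). A sketch:

    ## keep-alive, at most 30 seconds or 500000 requests
    ab -k -t 30 -n 500000 -c 12 http://192.168.30.171/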

    2. Install wrk

    https://github.com/wg/wrk

    1. Installation

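    Building wrk from source needs git, a C toolchain, and the OpenSSL development headers; on CentOS 7 (assumed here to match the test machines) they can be installed first with:

    yum -y install git gcc make openssl-devel
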
    git clone https://github.com/wg/wrk.git
    cd wrk
    make
    

    2. Basic usage

    ./wrk -t12 -c400 -d30s --latency http://192.168.30.171/index.html
    
        [root@temp-02 wrk]# ./wrk 
        Usage: wrk <options> <url>                            
          Options:                                            
            -c, --connections <N>  Connections to keep open
            -d, --duration    <T>  Duration of test
            -t, --threads     <N>  Number of threads to use
            -s, --script      <S>  Load Lua script file
            -H, --header      <H>  Add header to request
                --latency          Print latency statistics
                --timeout     <T>  Socket/request timeout
            -v, --version          Print version details
    
          Numeric arguments may include a SI unit (1k, 1M, 1G)
          Time arguments may include a time unit (2s, 2m, 2h)
    
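    The two unit rules above combine naturally with the options; for example (values here are illustrative, not the settings used in the tests below):

    ## 1k connections, 2-minute duration, 5-second socket timeout
    ./wrk -t12 -c1k -d2m --timeout 5s --latency http://192.168.30.171/
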

    II. Test Results

    1. Test server configuration

    CPU     8 cores
    Memory  8 GB
    
    [root@temp-01 logs]# cat /etc/centos-release
    CentOS Linux release 7.6.1810 (Core)
    
    [root@temp-01 logs]# /opt/tengine/sbin/nginx -v
    Tengine version: Tengine/2.2.2 (nginx/1.8.1)
    
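    The hardware figures above can be confirmed with standard tools (a sketch; output will vary by machine):

    lscpu | grep '^CPU(s):'     ## logical core count
    free -h                     ## total memory
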

    2. Comparison with Nginx access logging disabled and enabled

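    The only nginx.conf difference between the two scenarios below is the access_log directive; a minimal sketch of the relevant settings (the log path is hypothetical, and the rest of the config is assumed rather than taken from the original post):

    worker_processes  auto;
    events {
        worker_connections  65535;
    }
    http {
        ## scenario 1: access log disabled
        access_log  off;
        ## scenario 2: access log enabled (hypothetical path, default format)
        # access_log  logs/access.log;

        server {
            listen  80;
        }
    }
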
    1. Nginx access log disabled

    ## worker_connections 65535;
    ## worker_processes auto;
    ## Run 1
    [root@temp-02 wrk]# ./wrk -t12 -c400 -d30s --latency http://192.168.30.171/
    Running 30s test @ http://192.168.30.171/
      12 threads and 400 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency     1.94ms   10.17ms 403.93ms   98.95%
        Req/Sec    27.81k     2.44k   43.53k    87.21%
      Latency Distribution
         50%    1.06ms
         75%    1.23ms
         90%    1.48ms
         99%   13.10ms
      9977664 requests in 30.09s, 7.38GB read
    Requests/sec: 331545.69
    Transfer/sec:    251.04MB
    
    ## Run 2
    [root@temp-02 wrk]# ./wrk -t12 -c400 -d30s --latency http://192.168.30.171/
    Running 30s test @ http://192.168.30.171/
      12 threads and 400 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency     2.87ms   16.54ms 802.98ms   98.32%
        Req/Sec    27.47k     3.03k   37.21k    91.08%
      Latency Distribution
         50%    1.06ms
         75%    1.24ms
         90%    1.53ms
         99%   68.43ms
      9866325 requests in 30.10s, 7.30GB read
    Requests/sec: 327792.62
    Transfer/sec:    248.20MB
    
    ## Run 3
    [root@temp-02 wrk]# ./wrk -t12 -c400 -d30s --latency http://192.168.30.171/
    Running 30s test @ http://192.168.30.171/
      12 threads and 400 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency     2.08ms   11.78ms 805.37ms   98.85%
        Req/Sec    27.59k     2.61k   39.17k    88.74%
      Latency Distribution
         50%    1.07ms
         75%    1.25ms
         90%    1.52ms
         99%   19.67ms
      9903039 requests in 30.10s, 7.32GB read
    Requests/sec: 329010.43
    Transfer/sec:    249.12MB
    

    2. Nginx access log enabled

    ## worker_connections 65535;
    ## worker_processes auto;
    ## Run 1
    [root@temp-02 wrk]# ./wrk -t12 -c400 -d30s --latency http://192.168.30.171/
    Running 30s test @ http://192.168.30.171/
      12 threads and 400 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency     2.75ms    8.54ms 258.51ms   97.01%
        Req/Sec    20.78k     2.13k   54.21k    87.58%
      Latency Distribution
         50%    1.16ms
         75%    2.27ms
         90%    5.12ms
         99%   20.35ms
      7431933 requests in 30.10s, 5.50GB read
    Requests/sec: 246936.87
    Transfer/sec:    186.97MB
    
    ## Run 2
    [root@temp-02 wrk]# ./wrk -t12 -c400 -d30s --latency http://192.168.30.171/
    Running 30s test @ http://192.168.30.171/
      12 threads and 400 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency     3.38ms   12.65ms 403.54ms   97.85%
        Req/Sec    21.99k     2.10k   35.44k    91.63%
      Latency Distribution
         50%    1.02ms
         75%    2.25ms
         90%    5.43ms
         99%   48.83ms
      7895936 requests in 30.10s, 5.84GB read
    Requests/sec: 262328.65
    Transfer/sec:    198.63MB
    
    ## Run 3
    [root@temp-02 wrk]# ./wrk -t12 -c400 -d30s --latency http://192.168.30.171/
    Running 30s test @ http://192.168.30.171/
      12 threads and 400 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency     3.05ms   16.21ms 885.71ms   98.74%
        Req/Sec    21.78k     2.86k   29.75k    89.57%
      Latency Distribution
         50%    1.09ms
         75%    2.11ms
         90%    4.38ms
         99%   27.41ms
      7816029 requests in 30.10s, 5.78GB read
    Requests/sec: 259669.41
    Transfer/sec:    196.61MB
    

    3. Summary

    With access logging disabled, throughput holds steady above roughly 327,000 requests/sec.
    With access logging enabled, it hovers around 250,000 requests/sec, a drop of more than 70,000.
    

    3. Impact of worker_processes on performance

    The test configuration is identical to [1. Nginx access log disabled] above, except that worker_processes=auto is changed to worker_processes=1 (and, in later runs, other fixed values); everything else is the same.

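    Since worker_processes=auto resolves to the machine's logical core count, it helps to check that count before pinning fixed values; a quick sketch:

    nproc    ## logical CPU count; this is what worker_processes=auto resolves to
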
    1. worker_processes=1

    ## Run 1
    [root@temp-02 wrk]# ./wrk -t12 -c400 -d30s --latency http://192.168.30.171/
    Running 30s test @ http://192.168.30.171/
      12 threads and 400 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency    36.07ms   97.95ms   1.83s    89.11%
        Req/Sec     5.12k     2.52k   25.83k    73.33%
      Latency Distribution
         50%    1.12ms
         75%    7.30ms
         90%  142.08ms
         99%  353.08ms
      1764512 requests in 30.10s, 1.30GB read
      Socket errors: connect 0, read 0, write 0, timeout 69
    Requests/sec:  58622.25
    Transfer/sec:     44.39MB
    
    ## Run 2
    [root@temp-02 wrk]# ./wrk -t12 -c400 -d30s --latency http://192.168.30.171/
    Running 30s test @ http://192.168.30.171/
      12 threads and 400 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency    34.94ms   92.78ms   1.84s    88.59%
        Req/Sec     5.08k     2.71k   24.16k    67.80%
      Latency Distribution
         50%    1.11ms
         75%    5.95ms
         90%  140.54ms
         99%  326.47ms
      1755504 requests in 30.10s, 1.30GB read
      Socket errors: connect 0, read 0, write 0, timeout 47
    Requests/sec:  58331.95
    Transfer/sec:     44.17MB
    
    ## Run 3
    [root@temp-02 wrk]# ./wrk -t12 -c400 -d30s --latency http://192.168.30.171/
    Running 30s test @ http://192.168.30.171/
      12 threads and 400 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency    40.46ms  116.94ms   1.86s    91.15%
        Req/Sec     5.04k     2.16k   25.99k    74.91%
      Latency Distribution
         50%    1.13ms
         75%   11.83ms
         90%  146.93ms
         99%  456.04ms
      1774127 requests in 30.10s, 1.31GB read
      Socket errors: connect 0, read 0, write 0, timeout 84
    Requests/sec:  58946.37
    Transfer/sec:     44.63MB
    

    2. worker_processes=16

    ## Run 1
    [root@temp-02 wrk]# ./wrk -t12 -c400 -d30s --latency http://192.168.30.171/
    Running 30s test @ http://192.168.30.171/
      12 threads and 400 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency     2.09ms    8.61ms 321.66ms   98.20%
        Req/Sec    26.25k     3.01k   42.40k    91.68%
      Latency Distribution
         50%    1.05ms
         75%    1.41ms
         90%    2.23ms
         99%   19.83ms
      9420675 requests in 30.10s, 6.97GB read
    Requests/sec: 312980.77
    Transfer/sec:    236.98MB
    
    ## Run 2
    [root@temp-02 wrk]# ./wrk -t12 -c400 -d30s --latency http://192.168.30.171/
    Running 30s test @ http://192.168.30.171/
      12 threads and 400 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency     2.38ms   11.58ms 212.98ms   98.62%
        Req/Sec    26.85k     2.28k   37.55k    88.25%
      Latency Distribution
         50%    1.04ms
         75%    1.37ms
         90%    2.05ms
         99%   24.42ms
      9641055 requests in 30.10s, 7.13GB read
    Requests/sec: 320308.21
    Transfer/sec:    242.53MB
    
    ## Run 3
    [root@temp-02 wrk]# ./wrk -t12 -c400 -d30s --latency http://192.168.30.171/
    Running 30s test @ http://192.168.30.171/
      12 threads and 400 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency     2.35ms   10.73ms 209.90ms   98.44%
        Req/Sec    26.68k     2.63k   43.19k    92.27%
      Latency Distribution
         50%    1.04ms
         75%    1.41ms
         90%    2.38ms
         99%   21.99ms
      9577321 requests in 30.10s, 7.08GB read
    Requests/sec: 318220.12
    Transfer/sec:    240.95MB
    

    3. worker_processes=4

    ## Run 1
    [root@temp-02 wrk]# ./wrk -t12 -c400 -d30s --latency http://192.168.30.171/
    Running 30s test @ http://192.168.30.171/
      12 threads and 400 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency     9.73ms   25.08ms 874.22ms   89.77%
        Req/Sec    15.11k     2.62k   26.25k    78.02%
      Latency Distribution
         50%    0.98ms
         75%    4.01ms
         90%   35.52ms
         99%   93.07ms
      5417171 requests in 30.10s, 4.01GB read
    Requests/sec: 179959.48
    Transfer/sec:    136.26MB
    
    ## Run 2
    [root@temp-02 wrk]# ./wrk -t12 -c400 -d30s --latency http://192.168.30.171/
    Running 30s test @ http://192.168.30.171/
      12 threads and 400 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency     9.84ms   27.29ms 853.33ms   90.77%
        Req/Sec    14.22k     2.50k   25.67k    72.18%
      Latency Distribution
         50%  714.00us
         75%    3.32ms
         90%   34.50ms
         99%  116.57ms
      5091237 requests in 30.10s, 3.76GB read
    Requests/sec: 169145.06
    Transfer/sec:    128.07MB
    
    ## Run 3
    [root@temp-02 wrk]# ./wrk -t12 -c400 -d30s --latency http://192.168.30.171/
    Running 30s test @ http://192.168.30.171/
      12 threads and 400 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency    10.23ms   28.65ms 859.30ms   90.72%
        Req/Sec    14.30k     2.81k   29.18k    74.66%
      Latency Distribution
         50%  721.00us
         75%    3.62ms
         90%   36.44ms
         99%  114.73ms
      5102272 requests in 30.10s, 3.77GB read
    Requests/sec: 169509.31
    Transfer/sec:    128.35MB
    

    4. CPU upgraded to 16 cores, everything else unchanged, worker_processes=16

    ## Run 1
    Running 30s test @ http://192.168.30.171/
      12 threads and 400 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency     1.70ms    6.68ms 209.37ms   98.42%
        Req/Sec    26.68k     2.73k   34.98k    75.71%
      Latency Distribution
         50%    1.02ms
         75%    1.40ms
         90%    2.04ms
         99%   11.04ms
      9574645 requests in 30.06s, 7.08GB read
    Requests/sec: 318514.85
    Transfer/sec:    241.17MB
    
    ## Run 2
    [root@temp-02 wrk]# ./wrk -t12 -c400 -d30s --latency http://192.168.30.171/
    Running 30s test @ http://192.168.30.171/
      12 threads and 400 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency     2.07ms   10.01ms 213.09ms   98.90%
        Req/Sec    27.42k     3.24k   40.85k    83.20%
      Latency Distribution
         50%    0.99ms
         75%    1.35ms
         90%    2.02ms
         99%   13.23ms
      9842622 requests in 30.10s, 7.28GB read
    Requests/sec: 327037.08
    Transfer/sec:    247.62MB
    
    ## Run 3
    [root@temp-02 wrk]# ./wrk -t12 -c400 -d30s --latency http://192.168.30.171/
    Running 30s test @ http://192.168.30.171/
      12 threads and 400 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency     1.41ms    5.08ms 204.71ms   98.58%
        Req/Sec    28.88k     2.97k   41.41k    74.83%
      Latency Distribution
         50%    0.95ms
         75%    1.27ms
         90%    1.77ms
         99%    8.16ms
      10368344 requests in 30.10s, 7.67GB read
    Requests/sec: 344482.51
    Transfer/sec:    260.83MB
    

    5. Summary

    worker_processes=1:  holds at just under 60,000 requests/sec
    worker_processes=4:  holds at around 170,000 requests/sec
    worker_processes=16: although the worker count was raised to 16, the CPU still has only 8 cores,
                         so performance does not improve; throughput stays around 320,000 requests/sec
    worker_processes=16 with the CPU upgraded to 16 cores: still around 320,000 requests/sec,
                         because at roughly 320,000 requests/sec against the default Nginx page the
                         network bandwidth usage already reaches 1000M, so the bottleneck at that
                         point is no longer the CPU
    
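    To see where the network sits relative to the CPU during a run, interface throughput can be watched live; a sketch that assumes the sysstat package is installed:

    ## per-second rx/tx statistics for every network interface
    sar -n DEV 1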
