Django notes: asynchronous tasks with Celery

Author: 倔犟的贝壳 | Published 2021-11-10 12:26

    In a backend application, we often need to run work asynchronously — for example, jobs that should not block the request/response cycle.

    Next, following the official documentation, let's get familiar with basic Celery usage.

    • Step 1: Install Celery
    pip install celery
    
    • Step 2: Choose a broker (message queue)

    Celery requires a solution to send and receive messages; usually this comes in the form of a separate service called a message broker.
    Celery is for running tasks asynchronously, so it needs something to hold the queued task messages — that is the broker. (The results that tasks return are stored separately, in a result backend, configured below.)

    We can choose one of the following brokers (I use Redis here):

    • RabbitMQ

    • Redis

    • Other brokers
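    Whichever broker you choose, Celery identifies it by a URL. The formats below are illustrative — the hosts, ports, credentials and DB/vhost values are examples, not requirements:

```python
# Illustrative broker URL formats (example values, adjust to your setup):
REDIS_BROKER = "redis://127.0.0.1:6379/0"                # redis://host:port/db_index
RABBITMQ_BROKER = "amqp://guest:guest@127.0.0.1:5672//"  # amqp://user:password@host:port/vhost
```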

    • Step 3: Create a task
      Create a file named tasks.py and define a task in it:

    from celery import Celery
    import time
    #backend: where each task's result is stored
    #broker: the transport that holds the queued task messages
    app = Celery('tasks', backend='redis://127.0.0.1', broker='redis://127.0.0.1')
    
    @app.task
    def add(x, y):
        print("=====start add======")
        time.sleep(1) # simulate 1 second of work
        return x + y
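    To build intuition for what the @app.task decorator and .delay() give us, here is a toy, broker-free analogue — my own sketch, not how Celery is actually implemented: a decorator that attaches a delay() method which runs the function in a background thread and returns a handle with ready()/get(), mimicking the shape of Celery's AsyncResult.

```python
import threading

class ToyResult:
    """Minimal stand-in for Celery's AsyncResult: ready() and get()."""
    def __init__(self):
        self._done = threading.Event()
        self._value = None

    def ready(self):
        return self._done.is_set()

    def get(self):
        self._done.wait()  # block until the background run finishes
        return self._value

def toy_task(fn):
    """Attach a .delay() that runs fn in a background thread."""
    def delay(*args, **kwargs):
        result = ToyResult()
        def run():
            result._value = fn(*args, **kwargs)
            result._done.set()
        threading.Thread(target=run, daemon=True).start()
        return result
    fn.delay = delay
    return fn

@toy_task
def add(x, y):
    return x + y

direct = add(1, 2)        # calling the function directly stays synchronous
handle = add.delay(4, 4)  # .delay() returns immediately with a handle
total = handle.get()      # blocks until the background thread finishes
```

Note that the decorated function is still a plain callable; only the delay() path goes through the "queue" (here, just a thread).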
    
    • Step 4: Start the Celery worker server
      As I understand it, this registers our tasks and starts a worker process, e.g. via: celery -A tasks worker --loglevel=INFO. After startup the console shows:
     -------------- celery@huanghuandeMacBook-Pro.local v5.1.2 (sun-harmonics)
    --- ***** ----- 
    -- ******* ---- macOS-10.16-x86_64-i386-64bit 2021-11-10 11:48:12
    - *** --- * --- 
    - ** ---------- [config]
    - ** ---------- .> app:         tasks:0x7fac8f6f5eb0
    - ** ---------- .> transport:   redis://127.0.0.1:6379//
    - ** ---------- .> results:     redis://127.0.0.1/
    - *** --- * --- .> concurrency: 4 (prefork)
    -- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
    --- ***** ----- 
     -------------- [queues]
                    .> celery           exchange=celery(direct) key=celery
                    
    
    [tasks]
      . tasks.add
    
    [2021-11-10 11:48:12,906: INFO/MainProcess] Connected to redis://127.0.0.1:6379//
    [2021-11-10 11:48:12,913: INFO/MainProcess] mingle: searching for neighbors
    [2021-11-10 11:48:13,935: INFO/MainProcess] mingle: all alone
    [2021-11-10 11:48:13,946: INFO/MainProcess] celery@huanghuandeMacBook-Pro.local ready.
    

    As we can see, the worker has successfully connected to our local Redis, and the [tasks] list includes our task, tasks.add.

    • Step 5: Execute tasks asynchronously
    import tasks
    import time
    start = time.perf_counter()
    
    # submit two tasks
    result = tasks.add.delay(4,4) # note: call add via its delay method, otherwise it is still a synchronous call
    result2 = tasks.add.delay(3,3)
    
    print('is task ready:%s' % result.ready())
    print('is task2 ready:%s' % result2.ready())
    
    run_result = result.get()
    run_result2 = result2.get()
    
    print("task result :%s" % run_result)
    print("task result2 :%s" % run_result2)
    
    end = time.perf_counter()
    print("spend time:{}".format(end - start))
    

    Running the code above prints:

    is task ready:False
    is task2 ready:False
    task result :8
    task result2 :6
    spend time:1.296107417
    

    The total time is about 1.3 s. The add task sleeps for 1 s, so executing both calls synchronously would take at least 2 s — the asynchronous execution clearly worked. This is even clearer in the worker server's output:

    [2021-11-10 11:57:18,244: INFO/MainProcess] Task tasks.add[2de99f66-69e4-45e3-8a36-26aa92dd21e0] received
    [2021-11-10 11:57:18,245: INFO/MainProcess] Task tasks.add[562c9a4a-c66c-42e6-85cd-7deac122edc8] received
    [2021-11-10 11:57:18,246: WARNING/ForkPoolWorker-3] =====start add======
    [2021-11-10 11:57:18,246: WARNING/ForkPoolWorker-2] =====start add======
    [2021-11-10 11:57:18,246: WARNING/ForkPoolWorker-3] 
    
    [2021-11-10 11:57:18,247: WARNING/ForkPoolWorker-2] 
    
    [2021-11-10 11:57:19,250: INFO/ForkPoolWorker-2] Task tasks.add[2de99f66-69e4-45e3-8a36-26aa92dd21e0] succeeded in 1.0037125489999426s: 8
    [2021-11-10 11:57:19,250: INFO/ForkPoolWorker-3] Task tasks.add[562c9a4a-c66c-42e6-85cd-7deac122edc8] succeeded in 1.0037003060000416s: 6
    

    The worker received the two tasks almost simultaneously and started executing them in parallel. Note the received timestamps versus the start-of-execution times above — we print a line when execution starts.
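    The timing argument can be reproduced without any broker, using only the standard library. This is a rough analogue of the worker's process pool — an illustration of why two overlapping 1-second tasks finish in about 1 second, not Celery itself:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def add(x, y):
    time.sleep(1)  # simulate the 1-second task body from tasks.py
    return x + y

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=2) as pool:   # two "workers"
    f1 = pool.submit(add, 4, 4)  # submit() returns immediately, like .delay()
    f2 = pool.submit(add, 3, 3)
    results = (f1.result(), f2.result())  # block for results, like .get()
elapsed = time.perf_counter() - start
# The two 1-second calls overlap, so elapsed is close to 1 s rather than 2 s.
```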

    At this point we have asynchronous tasks working with Django. But can we monitor task execution — how many tasks are running, which have finished, and what their results are? There are many ways to do this; here I introduce a very convenient one, Flower, which provides a web-based graphical interface.

    Flower is a real-time web based monitor and administration tool for Celery.

    First, install Flower:

    pip install flower
    

    Then start Flower:

    celery -A tasks flower --broker=redis://127.0.0.1:6379/0
    

    On successful startup it prints:

     [I 211110 12:18:47 command:152] Visit me at http://localhost:5555
    [I 211110 12:18:47 command:159] Broker: redis://127.0.0.1:6379//
    [I 211110 12:18:47 command:160] Registered tasks: 
        ['celery.accumulate',
         'celery.backend_cleanup',
         'celery.chain',
         'celery.chord',
         'celery.chord_unlock',
         'celery.chunks',
         'celery.group',
         'celery.map',
         'celery.starmap',
         'tasks.add']
    [I 211110 12:18:47 mixins:226] Connected to redis://127.0.0.1:6379//
    

    We can now visit http://localhost:5555; the interface looks like this:

    [flower.png: screenshot of the Flower web dashboard]

    the End

    Source: https://www.haomeiwen.com/subject/jqmcmltx.html