Django: Building a Scheduled Task Planning and Dispatch System with Celery

Author: rollingstarky | Published 2020-05-08 21:26

    1. Environment Setup

    Set up the runtime environment:

    $ python -m venv env
    $ source ./env/bin/activate
    $ pip install django-celery-beat django-celery-results redis
    

    Initialize the project:

    $ django-admin startproject schedule_task
    $ cd schedule_task
    $ django-admin startapp schedules
    

    In the schedule_task/settings.py configuration file, change ALLOWED_HOSTS = [] to ALLOWED_HOSTS = ['*'].
    Run the web service: $ python manage.py runserver 0.0.0.0:8000
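
    For reference, the relevant line in settings.py then looks like this (a development-only convenience; do not use '*' in production):

    # schedule_task/settings.py
    ALLOWED_HOSTS = ['*']    # accept requests from any host (development only)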


    2. Enabling django_celery_beat and django_celery_results

    Under the INSTALLED_APPS setting in schedule_task/settings.py, add the following three applications:

    INSTALLED_APPS = [
        ...
        'schedules',
        'django_celery_results',
        'django_celery_beat'
    ]
    

    Here django_celery_results stores the results of Celery task executions in the database.
    django_celery_beat records predefined task execution rules in the database (for example, run once every minute), together with the specific tasks to be executed under those rules.
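
    As a quick illustration (not part of the original setup steps), once results are being stored they can be inspected through the TaskResult model that django_celery_results provides, for example from a Django shell:

    # Hedged sketch: inspecting stored task results via django_celery_results.
    # TaskResult backs the "Task results" table shown in the admin.
    from django_celery_results.models import TaskResult

    for r in TaskResult.objects.order_by('-date_done')[:5]:
        print(r.task_id, r.status, r.result)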

    Run the database migrations and create a superuser:

    $ python manage.py migrate
    $ python manage.py createsuperuser
    

    3. The Admin Backend

    Start the web service and log in to the admin site at http://127.0.0.1:8000/admin with the superuser created in the previous step. The interface looks like this:

    [Screenshot: Django Admin]

    In this interface, under CELERY RESULTS is the database table created by django_celery_results for storing task results.

    Under PERIODIC TASKS are the database tables created by django_celery_beat for storing Celery tasks and their execution rules. They mean the following:

    • Clocked: rules that trigger at one specific point in time
    • Crontabs: rules using a syntax similar to crontab on Linux
    • Intervals: rules defining the interval at which a task repeats
    • Periodic tasks: the concrete tasks to be executed, each associated with an execution rule defined in one of the other tables (Clocked, Crontabs, Intervals, Solar events)
    • Solar events: rules based on solar movements such as sunrise and sunset

    For example, to define a rule that fires once every 10 seconds:


    [Screenshot: Intervals - Add Interval]
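
    The same rule can also be created programmatically through the models django_celery_beat exposes; a minimal sketch:

    # Hedged sketch: creating the 10-second interval in code instead of
    # through the admin, using django_celery_beat's IntervalSchedule model.
    from django_celery_beat.models import IntervalSchedule

    schedule, created = IntervalSchedule.objects.get_or_create(
        every=10,
        period=IntervalSchedule.SECONDS,
    )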

    4. Creating Celery Tasks

    Celery tasks have to be created by hand in the source code; see the official documentation, Using Celery With Django, for details. In brief:

    schedule_task/schedule_task/celery.py

    # schedule_task/schedule_task/celery.py
    from __future__ import absolute_import, unicode_literals
    import os
    from celery import Celery
    
    # set the default Django settings module for the 'celery' program.
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'schedule_task.settings')
    app = Celery('schedule_task')
    
    # - namespace='CELERY' means all celery-related configuration keys
    #   should have a `CELERY_` prefix.
    app.config_from_object('django.conf:settings', namespace='CELERY')
    
    # Load task modules from all registered Django app configs.
    app.autodiscover_tasks()
    

    schedule_task/schedule_task/__init__.py

    # schedule_task/schedule_task/__init__.py
    from __future__ import absolute_import, unicode_literals
    
    # This will make sure the app is always imported when
    # Django starts so that shared_task will use this app.
    from .celery import app as celery_app
    
    __all__ = ('celery_app',)
    

    schedule_task/schedules/tasks.py

    # schedule_task/schedules/tasks.py
    from __future__ import absolute_import, unicode_literals
    from celery import shared_task
    
    @shared_task(bind=True)
    def debug_task(self):
        return f'Hello Celery, the task id is: {self.request.id}'
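
    Before attaching a schedule, the task can be sanity-checked by hand; a minimal sketch from a Django shell, assuming a worker is already running with the broker configured below:

    # Hedged sketch: triggering debug_task manually to verify the setup.
    from schedules.tasks import debug_task

    result = debug_task.delay()       # enqueue the task via the broker
    print(result.get(timeout=10))     # 'Hello Celery, the task id is: ...'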
    

    Use Redis as the message broker, the database Django is already configured with as the result backend, and DatabaseScheduler as Celery's task scheduler:

    schedule_task/schedule_task/settings.py

    # schedule_task/schedule_task/settings.py
    # ...
    CELERY_RESULT_BACKEND = 'django-db'
    CELERY_BROKER_URL = 'redis://127.0.0.1:6379/0'
    CELERY_BEAT_SCHEDULER = 'django_celery_beat.schedulers:DatabaseScheduler'
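
    To confirm these settings are picked up, an optional check (not in the original article) is to inspect the Celery app object from a Django shell:

    # Hedged sketch: verifying that the CELERY_-prefixed settings were
    # loaded into the app configured in schedule_task/celery.py.
    from schedule_task.celery import app

    print(app.conf.broker_url)       # redis://127.0.0.1:6379/0
    print(app.conf.result_backend)   # django-db
    print(app.conf.beat_scheduler)   # django_celery_beat.schedulers:DatabaseScheduler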
    

    Now go back to the admin site and associate the debug_task task with the rule that runs every 10 seconds:

    [Screenshot: Periodic tasks - Add periodic task]

    Only the basic information needs to be filled in: pick the associated task and its schedule. Beyond that, other parameters of the periodic task can be set as needed (see the sketch after this list), such as:

    • Start time
    • Whether it runs only once
    • Arguments passed to the task
    • Expiry time
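
    For reference, the same association can also be made in code; a minimal sketch using django_celery_beat's PeriodicTask model (the display name 'debug every 10s' is illustrative):

    # Hedged sketch: attaching debug_task to the 10-second interval,
    # including the optional parameters listed above.
    import json
    from django.utils import timezone
    from django_celery_beat.models import IntervalSchedule, PeriodicTask

    schedule, _ = IntervalSchedule.objects.get_or_create(
        every=10, period=IntervalSchedule.SECONDS)

    PeriodicTask.objects.create(
        interval=schedule,                    # the execution rule
        name='debug every 10s',               # illustrative display name
        task='schedules.tasks.debug_task',    # dotted path of the task
        args=json.dumps([]),                  # arguments passed to the task
        start_time=timezone.now(),            # start (effective) time
        one_off=False,                        # whether it runs only once
    )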

    5. Running and Testing

    For the system to work, three services need to be running at the same time:

    • Web service: python manage.py runserver 0.0.0.0:8000
    • Celery worker: celery -A schedule_task worker -l info
    • Celery Beat: celery -A schedule_task beat -l info

    Once the services are up, the output looks like this:

    1. Celery Beat continuously monitors the scheduled task definitions stored in the database and hands any task whose trigger condition is met over to a Celery worker:
    $ celery -A schedule_task beat -l info
    celery beat v4.4.2 (cliffs) is starting.
    __    -    ... __   -        _
    LocalTime -> 2020-05-08 03:44:41
    Configuration ->
        . broker -> redis://127.0.0.1:6379/0
        . loader -> celery.loaders.app.AppLoader
        . scheduler -> django_celery_beat.schedulers.DatabaseScheduler
    
        . logfile -> [stderr]@%INFO
        . maxinterval -> 5.00 seconds (5s)
    [2020-05-08 03:44:41,578: INFO/MainProcess] beat: Starting...
    [2020-05-08 03:44:41,578: INFO/MainProcess] Writing entries...
    [2020-05-08 03:44:46,745: INFO/MainProcess] Writing entries...
    [2020-05-08 03:44:51,594: INFO/MainProcess] Scheduler: Sending due task debug_task (schedules.tasks.debug_task)
    [2020-05-08 03:45:01,585: INFO/MainProcess] Scheduler: Sending due task debug_task (schedules.tasks.debug_task)
    [2020-05-08 03:45:11,587: INFO/MainProcess] Scheduler: Sending due task debug_task (schedules.tasks.debug_task)
    [2020-05-08 03:45:21,588: INFO/MainProcess] Scheduler: Sending due task debug_task (schedules.tasks.debug_task)
    [2020-05-08 03:45:31,591: INFO/MainProcess] Scheduler: Sending due task debug_task (schedules.tasks.debug_task)
    
    2. The Celery worker executes the tasks sent over by Beat, prints the result of each run, and saves it to the result backend (i.e. the database):
    $ celery -A schedule_task worker -l info
    
    [tasks]
      . schedules.tasks.debug_task
    
    [2020-05-08 03:44:05,521: INFO/MainProcess] Connected to redis://127.0.0.1:6379/0
    [2020-05-08 03:44:05,529: INFO/MainProcess] mingle: searching for neighbors
    [2020-05-08 03:44:06,546: INFO/MainProcess] mingle: all alone
    [2020-05-08 03:44:06,558: INFO/MainProcess] celery@mirrors ready.
    [2020-05-08 03:44:51,607: INFO/MainProcess] Received task: schedules.tasks.debug_task[3d6b77bb-d4b7-4a5d-b05f-3b85e5dafce7]
    [2020-05-08 03:44:51,687: INFO/ForkPoolWorker-1] Task schedules.tasks.debug_task[3d6b77bb-d4b7-4a5d-b05f-3b85e5dafce7] succeeded in 0.07936301361769438s: 'Hello Celery, the task id is: 3d6b77bb-d4b7-4a5d-b05f-3b85e5dafce7'
    [2020-05-08 03:45:01,588: INFO/MainProcess] Received task: schedules.tasks.debug_task[a097dc02-71c9-4cab-9871-92ed1a7f2f45]
    [2020-05-08 03:45:01,660: INFO/ForkPoolWorker-1] Task schedules.tasks.debug_task[a097dc02-71c9-4cab-9871-92ed1a7f2f45] succeeded in 0.07120843604207039s: 'Hello Celery, the task id is: a097dc02-71c9-4cab-9871-92ed1a7f2f45'
    [2020-05-08 03:45:11,590: INFO/MainProcess] Received task: schedules.tasks.debug_task[1b0dfc23-d3cc-495a-b306-9d1defe4b119]
    [2020-05-08 03:45:11,659: INFO/ForkPoolWorker-1] Task schedules.tasks.debug_task[1b0dfc23-d3cc-495a-b306-9d1defe4b119] succeeded in 0.0677587790414691s: 'Hello Celery, the task id is: 1b0dfc23-d3cc-495a-b306-9d1defe4b119'
    
    The task results page in the admin backend: [Screenshot: Celery Results › Task results]

    The task results page displays UTC timestamps by default. To change the timezone, edit the schedule_task/schedule_task/settings.py configuration file:

    TIME_ZONE = 'Asia/Shanghai'
    

    P.S. In actual testing, this timezone setting only affects the times displayed in the task results table on the web page; the times saved in the task results database table are still UTC. For further development, call the astimezone method on the returned datetime objects to convert them.
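
    A minimal sketch of that conversion, assuming Python 3.9+'s zoneinfo module (pytz works similarly on older versions):

    # Hedged sketch: converting the UTC timestamps stored by
    # django_celery_results into Asia/Shanghai local time.
    from zoneinfo import ZoneInfo
    from django_celery_results.models import TaskResult

    for r in TaskResult.objects.order_by('-date_done')[:5]:
        local = r.date_done.astimezone(ZoneInfo('Asia/Shanghai'))
        print(r.task_id, local.isoformat())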

    References

    • Celery 4.4.2 documentation: First Steps with Django
    • django-celery-beat: Database-backed Periodic Tasks
    • django-celery-results: Celery Result Backends for Django
