Celery with Django: Every Basic of Celery in One Post

pysyntaxguru
3 min read · May 27, 2021


Running periodic tasks is an important job in web development, but unfortunately Django doesn't have a built-in solution for them.

In this situation, we have to rely on Celery to perform scheduled tasks, but Celery's use cases are huge and not limited to periodic jobs.

The Celery developer team describes it as follows:

Celery is an asynchronous task queue/job queue based on distributed message passing. It is focused on real-time operations but supports scheduling as well. The execution units called tasks are executed concurrently on one or more worker servers using multiprocessing, Eventlet, or gevent. Tasks can execute asynchronously (in the background) or synchronously (wait until ready).

What is celery?

To learn more about Celery, see the official documentation.

Agenda of this post

We will demonstrate all the basic things you can do with Celery:

  • Asynchronous tasks
  • Celery worker
  • Celery beat
  • Celery scheduled tasks
  • and some other important things

Let’s get started

Before we go on, we have to know some basics and differences.

  • What is the difference between Celery beat and a Celery worker?

Celery beat is a scheduler that sends predefined tasks to a Celery worker at a given time. You need Celery beat only when you want to run your jobs on a schedule, and the Celery worker executes all the tasks sent by beat.

The Celery worker also executes the jobs that you run manually.

# Installation

pip3 install celery

pip3 install redis

# Set up Celery in your Django project

myproject/celery.py
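The embedded snippet for this file did not survive in the post; a minimal sketch following the standard Celery-with-Django layout (assuming the project is named `myproject`, as in the commands below) would look like:

```python
import os

from celery import Celery

# Point Celery at the Django settings module before creating the app.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

app = Celery('myproject')

# Load every CELERY_-prefixed setting from settings.py.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Auto-discover tasks.py modules in all installed Django apps.
app.autodiscover_tasks()
```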

and import your Celery app in the Django initializer file __init__.py

myproject/__init__.py
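This embed is also missing; the conventional contents (a sketch, assuming the `app` object from `myproject/celery.py` above) are:

```python
# Load the Celery app whenever Django starts, so shared tasks can find it.
from .celery import app as celery_app

__all__ = ('celery_app',)
```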

To use Celery beat, you need a message broker; here we use the Redis caching server. To set it up, update the settings:

myproject/settings.py

CELERY_BROKER_URL = 'redis://localhost:6379'  
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = "UTC"

If everything is OK, you can run the worker command:

celery -A myproject worker -l info

If everything is OK, you should see some information from the worker showing that it is ready.

To terminate it:

pkill -f "celery worker"

It will stop all the processes that were started by the Celery worker.

# Write a task

We can write tasks anywhere that is discoverable, but for now we will write the task inside a Django app.

To create a Django app:

python3 manage.py startapp scheduler

and add it to the INSTALLED_APPS = [] array in your settings,

then write a task there.

scheduler/tasks.py

Now re-run the worker, this time in detached mode; --detach mode is also what is used in production.

celery -A myproject worker -l info --logfile=celery.log --detach

We are saving the log output to the celery.log file.

Now it's time to test if it's working. Run the following in your terminal:

python manage.py shell  

from scheduler.tasks import see_iam_working

see_iam_working.delay("Yes I am working")

and if you print your log file with cat celery.log, you should see the output there.

Let's run the job on a schedule

In the part above, we learned how we can use Celery for asynchronous tasks.

Now we will do it periodically.

You may notice the app.conf.beat_schedule object. In its schedule key, we can use a crontab as the value; Celery has its own crontab module.
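The beat schedule embed itself is not shown in the post; a sketch, added to `myproject/celery.py`, that sends `see_iam_working` to the worker every 10 seconds (matching the 10-second output described below; the entry name and args are assumptions) might be:

```python
from celery.schedules import crontab  # available for cron-style timing

app.conf.beat_schedule = {
    'see-iam-working-every-10-seconds': {
        'task': 'scheduler.tasks.see_iam_working',
        # A plain number means "every N seconds"; for cron-style timing
        # you could use e.g. crontab(minute='*/5') instead.
        'schedule': 10.0,
        'args': ('Yes I am working',),
    },
}
```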

Now stop all the Celery processes with pkill -f "celery worker".

And run Celery beat and the worker in detached mode.

Beat will send a message to the worker to run the task at the given time.

celery -A myproject beat -l info --logfile=celery.beat.log --detach  
celery -A myproject worker -l info --logfile=celery.log --detach

If you print the output log with cat celery.beat.log, you will see it updating every 10 seconds with the scheduler message.
