Celery multi with queues set up not receiving tasks from django
July 25, 2014
Posted by forumadmin
I am running my workers with the following command:
celery -A myapp multi start 4 -l debug -Q:1-3 queue1,queue2 -Q:4 queue3
The workers start up fine, and when I run
celery inspect active_queues
the queues appear correctly assigned.
Then I start tasks from my Django app with the following code:
result = chain(task1.s(**kwargs).set(queue='queue1'), task2.s(**kwargs).set(queue='queue2'))()
I parse the result variable with result.parent to get all the task IDs and record them to the database for further inspection. Then I issue
task = AsyncResult(task.id)
for every task I start with my chain. The Celery logs don't show any tasks being received. However, when I issue a
celery purge
command, I get a message that my tasks have actually been removed from 1 queue. From here on, the AsyncResult.status of the deleted tasks continues to show up as 'PENDING', and the tasks never start.
I use rabbitmq-server as the broker with all-default configuration, and my Celery config is default as well. Strangely, in another environment the same code and commands produce different results: the workers also start, but they do receive the very same tasks and execute them without any issues. What might be the issue here?
P.S. When I start a worker the other way:
celery -A myapp worker -Q queue1,queue2,queue3 -l debug
I still can't get my tasks to execute.
The problem started to show up when I modified my chain to route the tasks with .set(queue=...) to queue1, queue2, or queue3.
Is there at least a way to see which tasks (the ones I can remove with celery purge) are waiting on a queue, and which queue they are waiting on?
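For reference, the closest I have found so far (assumption on my part: these are standard rabbitmqctl and celery CLI commands, run with the default RabbitMQ broker and the app named myapp as above, not anything specific to my project):

```shell
# Diagnostics sketch. With the default RabbitMQ broker, each Celery
# queue is a RabbitMQ queue, so the broker itself can report how many
# messages are waiting per queue. Both commands are guarded so the
# snippet is a harmless no-op on machines without the tools installed.

if command -v rabbitmqctl >/dev/null 2>&1; then
    # Prints one line per queue: "<queue name> <pending message count>"
    rabbitmqctl list_queues name messages || true
fi

if command -v celery >/dev/null 2>&1; then
    # Lists tasks that workers have prefetched but not yet started;
    # "|| true" keeps this harmless if myapp isn't importable here.
    celery -A myapp inspect reserved || true
fi
```

This only shows per-queue message counts and reserved tasks, not the full bodies of every waiting message.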