
Comments (24)

ask commented on May 25, 2024

On Thursday, April 14, 2011 at 5:31 PM, PonasNiekas wrote:

I'm having a weird issue with PostgreSQL (8.4.7) and celery (2.2.5) and Django (1.2.3).

Suddenly, all my celery tasks that touch the database are raising:

DatabaseError: server closed the connection unexpectedly
This probably means the server terminated abnormally
before or while processing the request.

Check the PostgreSQL logs, maybe? Often the reason is found there.
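
A quick way to confirm is to turn on connection logging (standard postgresql.conf settings; reload the server after changing them):

log_connections = on
log_disconnections = on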

Reply to this email directly or view it on GitHub:
https://github.com/ask/django-celery/issues/46


PonasNiekas commented on May 25, 2024

Hi ask, I updated my post with the database log. The connections are often closed immediately after being opened. So the question is why, as I'm not doing anything with connections, transactions, or the like, just creating and updating objects.
That leaves the Django ORM and celery, I guess :/.


ask commented on May 25, 2024

Does it work better with Celery 2.2.4?

If not, then I'm not sure we can rule out some other error just yet. What does "unexpectedly closed the connection" mean here? Celery doesn't close the connection until after the task returns, so maybe the query timed out or similar? (Is the query very expensive?)


PonasNiekas commented on May 25, 2024

I started having the problem while using Celery 2.2.3, then upgraded to 2.2.5 with no change.

Just to clarify:

  • reading from database works fine everywhere
  • reading/writing/modifying data in Django admin works fine
  • reading data in celery tasks works fine
  • writing/modifying data in celery tasks fails randomly with: "server closed the connection unexpectedly"

The queries that modify data are simple, and the amount of data is small.

After some debugging, I believe that the request_finished signal (which closes the db connection) is being sent before the task has actually finished. (Everything works fine if I comment out the connection-closing code in django/db/__init__.py.)

Is there any way to find out which task issues the request_finished signal, and when it does so?
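
One way to see where the signal comes from is to attach a receiver that dumps a stack trace (a minimal sketch using Django's signal API; put it in any module that is imported at startup):

import traceback

from django.core.signals import request_finished

def trace_request_finished(sender, **kwargs):
    # Print who sent request_finished and from where, to see whether
    # it fires while a task is still running.
    print("request_finished sent by %r" % (sender,))
    traceback.print_stack()

request_finished.connect(trace_request_finished)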


PonasNiekas commented on May 25, 2024

It looks like the source of the error I'm getting is in the psycopg2 module itself. For some reason it raises DatabaseError: server closed the connection unexpectedly, even though the database logs show no disconnection entry.

I moved the data to MySQL and everything works as expected.

tasks.py

import time

from celery.task import Task  # Celery 2.x class-based task API
from WebTools.controlpanel.models import IP  # import path inferred from the traceback below

class TestTask(Task):
    ignore_result = True

    def run(self):
        # A trivial single-row update; this is what fails intermittently.
        IP.objects.filter(ip="0.0.1.1").update(status=time.strftime("%Y.%m.%d %H:%M:%S"))
        return "ok"

Execution (the first time, the task runs fine):

In [4]: from WebTools.controlpanel.tasks import TestTask
In [5]: TestTask.apply_async()
Out[5]: <AsyncResult: f2f82ed2-f79e-4fdd-a40b-629f6ac33517>
In [6]: TestTask.apply_async()
Out[6]: <AsyncResult: 7a65f3c2-7e50-4dd2-b2ca-5dabe8b2f508>

celeryd.log

[2011-04-19 11:38:20,870: INFO/MainProcess] Got task from broker: WebTools.controlpanel.tasks.TestTask[f2f82ed2-f79e-4fdd-a40b-629f6ac33517]
[2011-04-19 11:38:21,095: INFO/MainProcess] Task WebTools.controlpanel.tasks.TestTask[f2f82ed2-f79e-4fdd-a40b-629f6ac33517] succeeded in 0.181342840195s: 'ok'
[2011-04-19 11:38:29,358: INFO/MainProcess] Got task from broker: WebTools.controlpanel.tasks.TestTask[7a65f3c2-7e50-4dd2-b2ca-5dabe8b2f508]
[2011-04-19 11:38:29,438: ERROR/MainProcess] Task WebTools.controlpanel.tasks.TestTask[7a65f3c2-7e50-4dd2-b2ca-5dabe8b2f508] raised exception: DatabaseError('server closed the connection unexpectedly\n\tThis probably means the server terminated abnormally\n\tbefore or while processing the request.\n',)
Traceback (most recent call last):
  File "/usr/local/lib/python2.6/dist-packages/celery/execute/trace.py", line 34, in trace
    return cls(states.SUCCESS, retval=fun(*args, **kwargs))
  File "/usr/local/lib/python2.6/dist-packages/celery/task/base.py", line 241, in __call__
    return self.run(*args, **kwargs)
  File "/usr/local/ast-tools/WebTools/../WebTools/controlpanel/tasks.py", line 352, in run
    IP.objects.filter(ip="0.0.1.1").update(status=time.strftime("%Y.%m.%d %H:%M:%S"))
  File "/usr/local/lib/python2.6/dist-packages/django/db/models/query.py", line 467, in update
    rows = query.get_compiler(self.db).execute_sql(None)
  File "/usr/local/lib/python2.6/dist-packages/django/db/models/sql/compiler.py", line 861, in execute_sql
    cursor = super(SQLUpdateCompiler, self).execute_sql(result_type)
  File "/usr/local/lib/python2.6/dist-packages/django/db/models/sql/compiler.py", line 727, in execute_sql
    cursor.execute(sql, params)
  File "/usr/local/lib/python2.6/dist-packages/django/db/backends/util.py", line 15, in execute
    return self.cursor.execute(sql, params)
  File "/usr/local/lib/python2.6/dist-packages/django/db/backends/postgresql_psycopg2/base.py", line 44, in execute
    return self.cursor.execute(query, args)
DatabaseError: server closed the connection unexpectedly
This probably means the server terminated abnormally
    before or while processing the request.

None

PostgreSQL logs (for the run where the task succeeded):

2011-04-19 11:38:20 CEST 4dad5505.678 0 LOG:  statement: UPDATE "controlpanel_ip" SET "status" = E'2011.04.19 11:38:20' WHERE HOST("controlpanel_ip"."ip") = E'0.0.1.1' 
2011-04-19 11:38:21 CEST 4dad5507.69d 0 LOG:  statement: BEGIN; SET TRANSACTION ISOLATION LEVEL READ COMMITTED
2011-04-19 11:38:21 CEST 4dad5507.69d 0 LOG:  statement: SELECT "djcelery_taskstate"."id",     "djcelery_taskstate"."state", "djcelery_taskstate"."task_id", "djcelery_taskstate"."name", "djcelery_taskstate"."tstamp", "djcelery_taskstate"."args", "djcelery_taskstate"."kwargs", "djcelery_taskstate"."eta", "djcelery_taskstate"."expires", "djcelery_taskstate"."result", "djcelery_taskstate"."traceback", "djcelery_taskstate"."runtime", "djcelery_taskstate"."worker_id", "djcelery_taskstate"."hidden" FROM "djcelery_taskstate" WHERE "djcelery_taskstate"."task_id" = E'f2f82ed2-f79e-4fdd-a40b-629f6ac33517' 
2011-04-19 11:38:21 CEST 4dad5505.678 1117441 LOG:  statement: COMMIT
2011-04-19 11:38:21 CEST 4dad5505.678 0 LOG:  disconnection: session time: 0:12:55.559 user=ast database=astdb host=10.47.247.155 port=36178
2011-04-19 11:38:21 CEST 4dad5507.69d 0 LOG:  statement: SELECT "djcelery_taskstate"."id", "djcelery_taskstate"."state", "djcelery_taskstate"."task_id", "djcelery_taskstate"."name", "djcelery_taskstate"."tstamp", "djcelery_taskstate"."args", "djcelery_taskstate"."kwargs", "djcelery_taskstate"."eta", "djcelery_taskstate"."expires", "djcelery_taskstate"."result", "djcelery_taskstate"."traceback", "djcelery_taskstate"."runtime", "djcelery_taskstate"."worker_id", "djcelery_taskstate"."hidden" FROM "djcelery_taskstate" WHERE "djcelery_taskstate"."task_id" = E'f2f82ed2-f79e-4fdd-a40b-629f6ac33517' 
2011-04-19 11:38:21 CEST 4dad5507.69d 0 LOG:  statement: SAVEPOINT s1225172112_x1
2011-04-19 11:38:21 CEST 4dad5507.69d 0 LOG:  statement: INSERT INTO "djcelery_taskstate" ("state", "task_id", "name", "tstamp", "args", "kwargs", "eta", "expires", "result", "traceback", "runtime", "worker_id", "hidden") VALUES (E'STARTED', E'f2f82ed2-f79e-4fdd-a40b-629f6ac33517', E'WebTools.controlpanel.tasks.TestTask', E'2011-04-19 11:38:20.914830', E'[]', E'{}', NULL, NULL, NULL, NULL, NULL, 1, false)
2011-04-19 11:38:21 CEST 4dad5507.69d 1117442 LOG:  statement: SELECT CURRVAL('"djcelery_taskstate_id_seq"')
2011-04-19 11:38:21 CEST 4dad5507.69d 1117442 LOG:  statement: RELEASE SAVEPOINT s1225172112_x1
2011-04-19 11:38:21 CEST 4dad5507.69d 1117442 LOG:  statement: COMMIT
2011-04-19 11:38:23 CEST 4dad5507.69d 0 LOG:  statement: BEGIN; SET TRANSACTION ISOLATION LEVEL READ COMMITTED
2011-04-19 11:38:23 CEST 4dad5507.69d 0 LOG:  statement: SELECT "djcelery_taskstate"."id", "djcelery_taskstate"."state", "djcelery_taskstate"."task_id", "djcelery_taskstate"."name", "djcelery_taskstate"."tstamp", "djcelery_taskstate"."args", "djcelery_taskstate"."kwargs", "djcelery_taskstate"."eta", "djcelery_taskstate"."expires", "djcelery_taskstate"."result", "djcelery_taskstate"."traceback", "djcelery_taskstate"."runtime", "djcelery_taskstate"."worker_id", "djcelery_taskstate"."hidden" FROM "djcelery_taskstate" WHERE "djcelery_taskstate"."task_id" = E'f2f82ed2-f79e-4fdd-a40b-629f6ac33517' 
2011-04-19 11:38:23 CEST 4dad5507.69d 0 LOG:  statement: SELECT (1) AS "a" FROM "djcelery_taskstate" WHERE "djcelery_taskstate"."id" = 98785  LIMIT 1
2011-04-19 11:38:23 CEST 4dad5507.69d 0 LOG:  statement: UPDATE "djcelery_taskstate" SET "state" = E'SUCCESS', "task_id" = E'f2f82ed2-f79e-4fdd-a40b-629f6ac33517', "name" = E'WebTools.controlpanel.tasks.TestTask', "tstamp" = E'2011-04-19 11:38:21.095244', "args" = E'[]', "kwargs" = E'{}', "eta" = NULL, "expires" = NULL, "result" = E'''ok''', "traceback" = NULL, "runtime" = 0.18134284019470215, "worker_id" = 1, "hidden" = false WHERE "djcelery_taskstate"."id" = 98785 
2011-04-19 11:38:23 CEST 4dad5507.69d 1117444 LOG:  statement: COMMIT

The second run, where it fails:

2011-04-19 11:38:31 CEST 4dad5507.69d 0 LOG:  statement: BEGIN; SET TRANSACTION ISOLATION LEVEL READ COMMITTED
2011-04-19 11:38:31 CEST 4dad5507.69d 0 LOG:  statement: SELECT "djcelery_workerstate"."id", "djcelery_workerstate"."hostname", "djcelery_workerstate"."last_heartbeat" FROM "djcelery_workerstate" WHERE "djcelery_workerstate"."hostname" = E'ast-tool02' 
2011-04-19 11:38:31 CEST 4dad5507.69d 0 LOG:  statement: SELECT (1) AS "a" FROM "djcelery_workerstate" WHERE "djcelery_workerstate"."id" = 1  LIMIT 1
2011-04-19 11:38:31 CEST 4dad5507.69d 0 LOG:  statement: UPDATE "djcelery_workerstate" SET "hostname" = E'ast-tool02', "last_heartbeat" = E'2011-04-19 11:38:29.438096' WHERE "djcelery_workerstate"."id" = 1 
2011-04-19 11:38:31 CEST 4dad5507.69d 1117445 LOG:  statement: COMMIT
2011-04-19 11:38:31 CEST 4dad5507.69d 0 LOG:  statement: BEGIN; SET TRANSACTION ISOLATION LEVEL READ COMMITTED
2011-04-19 11:38:31 CEST 4dad5507.69d 0 LOG:  statement: SELECT "djcelery_taskstate"."id", "djcelery_taskstate"."state", "djcelery_taskstate"."task_id", "djcelery_taskstate"."name", "djcelery_taskstate"."tstamp", "djcelery_taskstate"."args", "djcelery_taskstate"."kwargs", "djcelery_taskstate"."eta", "djcelery_taskstate"."expires", "djcelery_taskstate"."result", "djcelery_taskstate"."traceback", "djcelery_taskstate"."runtime", "djcelery_taskstate"."worker_id", "djcelery_taskstate"."hidden" FROM "djcelery_taskstate" WHERE "djcelery_taskstate"."task_id" = E'7a65f3c2-7e50-4dd2-b2ca-5dabe8b2f508' 
2011-04-19 11:38:31 CEST 4dad5507.69d 0 LOG:  statement: SELECT "djcelery_taskstate"."id", "djcelery_taskstate"."state", "djcelery_taskstate"."task_id", "djcelery_taskstate"."name", "djcelery_taskstate"."tstamp", "djcelery_taskstate"."args", "djcelery_taskstate"."kwargs", "djcelery_taskstate"."eta", "djcelery_taskstate"."expires", "djcelery_taskstate"."result", "djcelery_taskstate"."traceback", "djcelery_taskstate"."runtime", "djcelery_taskstate"."worker_id", "djcelery_taskstate"."hidden" FROM "djcelery_taskstate" WHERE "djcelery_taskstate"."task_id" = E'7a65f3c2-7e50-4dd2-b2ca-5dabe8b2f508' 
2011-04-19 11:38:31 CEST 4dad5507.69d 0 LOG:  statement: SAVEPOINT s1225172112_x1
2011-04-19 11:38:31 CEST 4dad5507.69d 0 LOG:  statement: INSERT INTO "djcelery_taskstate" ("state", "task_id", "name", "tstamp", "args", "kwargs", "eta", "expires", "result", "traceback", "runtime", "worker_id", "hidden") VALUES (E'FAILURE', E'7a65f3c2-7e50-4dd2-b2ca-5dabe8b2f508', E'WebTools.controlpanel.tasks.TestTask', E'2011-04-19 11:38:29.438096', E'[]', E'{}', NULL, NULL, E'DatabaseError(''server closed the connection unexpectedly\\n\\tThis probably means the server terminated abnormally\\n\\tbefore or while processing the request.\\n'',)', E'Traceback (most recent call last):
#011  File "/usr/local/lib/python2.6/dist-packages/celery/execute/trace.py", line 34, in trace
#011    return cls(states.SUCCESS, retval=fun(*args, **kwargs))
#011  File "/usr/local/lib/python2.6/dist-packages/celery/task/base.py", line 241, in __call__
#011    return self.run(*args, **kwargs)
#011  File "/usr/local/ast-tools/WebTools/../WebTools/controlpanel/tasks.py", line 352, in run
#011    IP.objects.filter(ip="0.0.1.1").update(status=time.strftime("%Y.%m.%d %H:%M:%S"))
#011  File "/usr/local/lib/python2.6/dist-packages/django/db/models/query.py", line 467, in update
#011    rows = query.get_compiler(self.db).execute_sql(None)
#011  File "/usr/local/lib/python2.6/dist-packages/django/db/models/sql/compiler.py", line 861, in execute_sql
#011    cursor = super(SQLUpdateCompiler, self).execute_sql(result_type)
#011  File "/usr/local/lib/python2.6/dist-packages/django/db/models/sql/compiler.py", line 727, in execute_sql
#011    cursor.execute(sql, params)
#011  File "/usr/local/lib/python2.6/dist-packages/django/db/backends/util.py", line 15, in execute
#011    return self.cursor.execute(sql, params)
#011  File "/usr/local/lib/python2.6/dist-packages/django/db/backends/postgresql_psycopg2/base.py", line 44, in execute
#011    return self.cursor.execute(query, args)
#011DatabaseError: server closed the connection unexpectedly
#011#011This probably means the server terminated abnormally
#011#011before or while processing the request.
#011
#011', NULL, 1, false)
2011-04-19 11:38:31 CEST 4dad5507.69d 1117446 LOG:  statement: SELECT CURRVAL('"djcelery_taskstate_id_seq"')
2011-04-19 11:38:31 CEST 4dad5507.69d 1117446 LOG:  statement: RELEASE SAVEPOINT s1225172112_x1
2011-04-19 11:38:31 CEST 4dad5507.69d 1117446 LOG:  statement: COMMIT

All this is with the latest psycopg2 (2.4).


chase-seibert commented on May 25, 2024

I'm having the same issue: also PostgreSQL 8.4.7, but with Django 1.3 and the latest celery. So far it only happens on celery startup, with scheduled tasks.

Edit: I resolved it on my local machine with a monkey patch in loaders.py:

# in djcelery/loaders.py, inside the DjangoLoader class
def on_task_init(self, task_id, task):
    # Close the (possibly stale) database connection before the task starts;
    # Django will transparently open a fresh one on the next query.
    self.close_database()
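
If you'd rather not edit the installed loaders.py directly, the same change can be applied as a monkey patch from your own startup code (a sketch, assuming djcelery exposes DjangoLoader and close_database as in the 2.x series):

from djcelery.loaders import DjangoLoader

def on_task_init(self, task_id, task):
    # Drop the connection before each task; Django reopens it lazily.
    self.close_database()

DjangoLoader.on_task_init = on_task_init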


aldarund commented on May 25, 2024

I faced this issue too.
The loaders.py patch works fine for me.


bradleyayers commented on May 25, 2024

This issue is affecting me too.


ask commented on May 25, 2024

OK, I added the patch given by @chase-seibert. I'd really like to know why it's needed, though.


sbywater commented on May 25, 2024

We began getting this problem recently, after a significant code change. It happens only with some tasks, and only with some arguments; a task that fails with a given set of arguments will always fail with those arguments. These tasks had all been working well until now. This is our stack:

celery 2.3.2
djcelery 2.3.3
PostgreSQL 8.4.7 on i686-pc-linux-gnu, compiled by GCC gcc-4.4.real (Ubuntu/Linaro 4.4.4-14ubuntu5) 4.4.5, 32-bit
psycopg2 2.4.2
Python 2.6.6 (r266:84292, Sep 15 2010, 15:52:39)
django 1.3.0

celeryd.log, slightly sanitized:

[2011-09-08 11:33:10,715: CRITICAL/MainProcess] Internal error <class 'django.db.utils.DatabaseError'>: server closed the connection unexpectedly
    This probably means the server terminated abnormally
    before or while processing the request.

Traceback (most recent call last):
  File "/home/django/myvirtualenv/lib/python2.6/site-packages/celery/worker/__init__.py", line 264, in process_task
    self.loglevel, self.logfile)
  File "/home/django/myvirtualenv/lib/python2.6/site-packages/celery/app/task/__init__.py", line 688, in execute
    request.execute_using_pool(pool, loglevel, logfile)
  File "/home/django/myvirtualenv/lib/python2.6/site-packages/celery/worker/job.py", line 366, in execute_using_pool
    timeout=self.task.time_limit)
  File "/home/django/myvirtualenv/lib/python2.6/site-packages/celery/concurrency/base.py", line 84, in apply_async
    target, args, kwargs))
  File "/home/django/myvirtualenv/src/django/django/db/models/base.py", line 370, in __repr__
    u = unicode(self)
  File "/home/django/myvirtualenv/myproject/myapp/models/thing_models.py", line 183, in __unicode__
    self.thing_type.thing_type_name, self.id,
  File "/home/django/myvirtualenv/src/django/django/db/models/fields/related.py", line 315, in __get__
    rel_obj = QuerySet(self.field.rel.to).using(db).get(**params)
  File "/home/django/myvirtualenv/src/django/django/db/models/query.py", line 344, in get
    num = len(clone)
  File "/home/django/myvirtualenv/src/django/django/db/models/query.py", line 82, in __len__
    self._result_cache = list(self.iterator())
  File "/home/django/myvirtualenv/src/django/django/db/models/query.py", line 273, in iterator
    for row in compiler.results_iter():
  File "/home/django/myvirtualenv/src/django/django/db/models/sql/compiler.py", line 680, in results_iter
    for rows in self.execute_sql(MULTI):
  File "/home/django/myvirtualenv/src/django/django/db/models/sql/compiler.py", line 735, in execute_sql
    cursor.execute(sql, params)
  File "/home/django/myvirtualenv/src/django/django/db/backends/postgresql_psycopg2/base.py", line 44, in execute
    return self.cursor.execute(query, args)
DatabaseError: server closed the connection unexpectedly
    This probably means the server terminated abnormally
    before or while processing the request.

Here is how the error looks in the PostgreSQL logs:

2011-09-08 13:37:52 UTC LOG:  SSL error: decryption failed or bad record mac
2011-09-08 13:37:52 UTC LOG:  could not receive data from client: Connection reset by peer
2011-09-08 13:37:52 UTC LOG:  unexpected EOF on client connection


sbywater commented on May 25, 2024

Update: we were able to fix this by reinstalling the latest version of celery.


vibrant commented on May 25, 2024

I had the same issue, and upgrading to the newest celery, django-celery, and psycopg2 didn't help. What helped was making one of my tasks stop calling sys.exit(-1). Turns out celery is not happy about such things ;)


mixmastamyk commented on May 25, 2024

Having the same issue with postgres 9.1 and pgbouncer. I upgraded celery, djcelery, and psycopg2 to the newest versions. The addition to loaders.py didn't help.


danawoodman commented on May 25, 2024

For those Googling for this issue: make sure you're running Django Celery 2.5.5+, which fixes it.


dpedowitz commented on May 25, 2024

I'm running celery 3.0.11 without Django, and this is happening for me. I've attempted to add a custom loader, but it does not appear to be registered.


mattmcgrew commented on May 25, 2024

We recently ran into this issue (we're stuck on an older version of Celery), and it turned out to be an (admittedly rather dumb) non-lazy query in a form field of a Django form class definition. The form was imported before the process forked, creating a connection in the parent that was then trashed in the child process. We changed the query to be lazy, and that fixed the issue. We had never noticed it before because the form was never imported early enough to cause the issue elsewhere; we started seeing it when we added a new task that happened to import something that imported something that imported the form in question. The shape of the bug is sketched below.
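
A hypothetical reconstruction of the pattern (model and field names are made up):

from django import forms

from myapp.models import Category  # hypothetical model

class ThingForm(forms.Form):
    # BAD: the list comprehension runs the query at import time, opening a
    # database connection in the parent process before the worker forks.
    category = forms.ChoiceField(
        choices=[(c.pk, c.name) for c in Category.objects.all()])

    # GOOD: ModelChoiceField keeps the queryset lazy, so the query runs in
    # the child process when the form is actually used.
    category_lazy = forms.ModelChoiceField(queryset=Category.objects.all())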


DylanYoung commented on May 25, 2024

This is still an issue in 3.0.17. We just had Celery go down on 6 instances running Django 1.3. It could be that a newer version of Django fixes it (none of our 1.6 instances went down).


auvipy commented on May 25, 2024

Try upgrading to 1.8.


DylanYoung commented on May 25, 2024

If only that were as easy as you make it sound. Working on it though.


auvipy commented on May 25, 2024

It is not a good idea to run unsupported software frameworks.


mrgaolei commented on May 25, 2024

@ask This bug reproduces on my server.

Django: 1.9.7
Celery: 4.0.2
PostgreSQL: 9.4


Revocsid commented on May 25, 2024

Same problem here with periodic tasks that need to fetch / update models stored in my PostgreSQL db.

Django: 1.11.1
Psycopg2: 2.7.1
Celery: 4.0.2
PostgreSQL: 9.6.2


goinnn commented on May 25, 2024

Me too. Is this related?

psycopg/psycopg2#443

This will be fixed in the next psycopg2 release (2.7.2)


dojeda commented on May 25, 2024

I've stumbled on this exact problem. It was difficult to pin down since I could not reproduce it at first.

To reproduce it, I created a dummy task that queries the database with an artificially long query, something like:

from celery import shared_task
from myapp.models import SomeModel  # placeholder import, not in the original snippet

@shared_task
def dummy():
    # 10 rows x pg_sleep(1) keeps the connection busy for ~10 seconds
    result = list(SomeModel.objects.raw('SELECT id, pg_sleep(1) FROM some_table LIMIT 10'))

This task keeps a database connection busy for about 10 seconds.
Meanwhile, I connected to my postgres database and severed the connection while the task was running with:

SELECT pid, pg_terminate_backend(pid), datid, client_addr, client_hostname, state FROM pg_stat_get_activity(NULL::integer) 
WHERE datid=(SELECT oid from pg_database where datname = 'django_db');

Here django_db is the database for my Django application.

This procedure reproduced my problem and gave me something to work with. My best workaround is to add django.db.utils.OperationalError to the autoretry_for list of my task. Since I had many tasks, was already using a custom base Task class, and did not want to change them all, I wrapped the celery.shared_task decorator:

import celery
import django.db.utils

from myapp.tasks import BaseTask  # placeholder import: my abstract base class with custom logging logic


def shared_task(*args, **kwargs):
    """Task decorator for robust tasks

    Robust tasks are tasks that will be retried 5 times if one of the
    exceptions in ...
    """
    kwargs['base'] = kwargs.get('base', None) or BaseTask
    # Add django.db.utils.OperationalError to the auto-retry list of
    # exceptions. It seems that postgres is closing some connections and this
    # makes the task fail.
    kwargs['autoretry_for'] = kwargs.get('autoretry_for', ()) + (django.db.utils.OperationalError,)
    if 'max_retries' not in kwargs:
        kwargs['max_retries'] = 5
    if 'countdown' not in kwargs and 'eta' not in kwargs:
        # TODO: the countdown might not be working on celery 4.1.0?
        # I haven't had any problems yet, but keep an eye on this issue:
        # https://github.com/celery/celery/issues/4221
        kwargs['countdown'] = 10
    return celery.shared_task(*args, **kwargs)
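
Existing task modules then only need to switch their import to pick up the wrapper (illustrative; the module path and task are hypothetical):

from myproject.task_utils import shared_task  # the wrapper defined above

@shared_task
def update_status(pk, status):
    # Retried up to 5 times, 10 seconds apart, on OperationalError.
    SomeModel.objects.filter(pk=pk).update(status=status)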

