
django-celery's Introduction


[Badges: build status, coverage, BSD license, wheel availability, Semgrep security, supported Python versions and implementations, Open Collective backers and sponsors]

Version

5.4.0 (opalescent)

Web

https://docs.celeryq.dev/en/stable/index.html

Download

https://pypi.org/project/celery/

Source

https://github.com/celery/celery/

Keywords

task, queue, job, async, rabbitmq, amqp, redis, python, distributed, actors

Donations

This project relies on your generous donations.

If you are using Celery to create a commercial product, please consider becoming our backer or our sponsor to ensure Celery's future.

For enterprise

Available as part of the Tidelift Subscription.

The maintainers of celery and thousands of other packages are working with Tidelift to deliver commercial support and maintenance for the open source dependencies you use to build your applications. Save time, reduce risk, and improve code health, while paying the maintainers of the exact dependencies you use. Learn more.

What's a Task Queue?

Task queues are used as a mechanism to distribute work across threads or machines.

A task queue's input is a unit of work called a task. Dedicated worker processes then constantly monitor the queue for new work to perform.

Celery communicates via messages, usually using a broker to mediate between clients and workers. To initiate a task, a client puts a message on the queue; the broker then delivers the message to a worker.

A Celery system can consist of multiple workers and brokers, giving way to high availability and horizontal scaling.

Celery is written in Python, but the protocol can be implemented in any language. In addition to Python there's node-celery for Node.js, a PHP client, gocelery, gopher-celery for Go, and rusty-celery for Rust.

Language interoperability can also be achieved by using webhooks in such a way that the client enqueues a URL to be requested by a worker.

What do I need?

Celery version 5.4.x runs on:

  • Python (3.8, 3.9, 3.10, 3.11, 3.12)
  • PyPy3.9+ (v7.3.12+)

This is the version of Celery that supports Python 3.8 or newer.

If you're running an older version of Python, you need to be running an older version of Celery:

  • Python 3.7: Celery 5.2 or earlier.
  • Python 3.6: Celery 5.1 or earlier.
  • Python 2.7: Celery 4.x series.
  • Python 2.6: Celery series 3.1 or earlier.
  • Python 2.5: Celery series 3.0 or earlier.
  • Python 2.4: Celery series 2.2 or earlier.

Celery is a project with minimal funding, so we don't support Microsoft Windows. It should still work, but please don't open any issues related to that platform.

Celery is usually used with a message broker to send and receive messages. The RabbitMQ and Redis transports are feature-complete, but there's also experimental support for a myriad of other solutions, including using SQLite for local development.

Celery can run on a single machine, on multiple machines, or even across datacenters.

Get Started

If this is the first time you're trying to use Celery, or if you're coming to Celery v5.4.x from a previous version, you should start with the getting-started tutorials in the documentation (First Steps with Celery and Next Steps).

You can also get started with Celery by using a hosted broker transport CloudAMQP. The largest hosting provider of RabbitMQ is a proud sponsor of Celery.

Celery is...

  • Simple

    Celery is easy to use and maintain, and does not need configuration files.

    It has an active, friendly community you can talk to for support, like at our mailing-list, or the IRC channel.

    Here's one of the simplest applications you can make:

    from celery import Celery
    
    app = Celery('hello', broker='amqp://guest@localhost//')
    
    @app.task
    def hello():
        return 'hello world'
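
    Assuming the module above is saved as hello.py and the broker is reachable, a worker can be started with "celery -A hello worker --loglevel=INFO" and the task can then be called asynchronously. A minimal sketch of the calling side (the timeout value is arbitrary, and retrieving the result requires a result backend to be configured on the app):

    from hello import hello

    # delay() sends a task message to the broker and returns an AsyncResult.
    result = hello.delay()

    # get() blocks until a worker has run the task and stored its result,
    # which only works if the app also has a result backend configured.
    print(result.get(timeout=10))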
  • Highly Available

    Workers and clients will automatically retry in the event of connection loss or failure, and some brokers support HA by way of Primary/Primary or Primary/Replica replication.

  • Fast

    A single Celery process can process millions of tasks a minute, with sub-millisecond round-trip latency (using RabbitMQ, py-librabbitmq, and optimized settings).

  • Flexible

    Almost every part of Celery can be extended or used on its own: custom pool implementations, serializers, compression schemes, logging, schedulers, consumers, producers, broker transports, and much more.

It supports...

  • Message Transports

    • RabbitMQ, Redis, Amazon SQS

  • Concurrency

    • prefork, Eventlet, gevent, single threaded (solo)

  • Result Stores

    • AMQP, Redis
    • memcached
    • SQLAlchemy, Django ORM
    • Apache Cassandra, IronCache, Elasticsearch
  • Serialization

    • pickle, json, yaml, msgpack.
    • zlib, bzip2 compression.
    • Cryptographic message signing.
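
As an illustration of how a serializer and compression scheme are selected, here is a minimal configuration sketch; the setting names are standard Celery settings, while the broker URL and the particular choices (msgpack, zlib) are only examples:

    from celery import Celery

    app = Celery('example', broker='amqp://guest@localhost//')

    # Serialize task messages with msgpack and compress them with zlib.
    # accept_content limits which serializers the workers will accept.
    app.conf.update(
        task_serializer='msgpack',
        result_serializer='msgpack',
        accept_content=['msgpack', 'json'],
        task_compression='zlib',
    )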

Framework Integration

Celery is easy to integrate with web frameworks, some of which even have integration packages:

  • Django    not needed
  • Pyramid   pyramid_celery
  • Pylons    celery-pylons
  • Flask     not needed
  • web2py    web2py-celery
  • Tornado   tornado-celery
  • FastAPI   not needed

The integration packages aren't strictly necessary, but they can make development easier, and sometimes they add important hooks like closing database connections at fork.
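
For Django specifically, no integration package is needed; the documented approach is a small celery.py module inside the project package. A minimal sketch, assuming a project package named proj (a placeholder):

    # proj/celery.py -- "proj" stands in for your Django project package
    import os

    from celery import Celery

    # Make sure Django settings are importable before configuring the app.
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

    app = Celery('proj')

    # Read CELERY_-prefixed settings from the Django settings module.
    app.config_from_object('django.conf:settings', namespace='CELERY')

    # Discover tasks.py modules in all installed Django apps.
    app.autodiscover_tasks()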

Documentation

The latest documentation is hosted at Read The Docs, containing user guides, tutorials, and an API reference.

The latest Chinese documentation is hosted at https://www.celerycn.io/ and includes user guides, tutorials, and an API reference.

Installation

You can install Celery either via the Python Package Index (PyPI) or from source.

To install using pip:


$ pip install -U Celery

Bundles

Celery also defines a group of bundles that can be used to install Celery and the dependencies for a given feature.

You can specify these in your requirements or on the pip command-line by using brackets. Multiple bundles can be specified by separating them by commas.


$ pip install "celery[redis]"

$ pip install "celery[redis,auth,msgpack]"

The following bundles are available:

Serializers

celery[auth]

for using the auth security serializer.

celery[msgpack]

for using the msgpack serializer.

celery[yaml]

for using the yaml serializer.

Concurrency

celery[eventlet]

for using the eventlet pool.

celery[gevent]

for using the gevent pool.

Transports and Backends

celery[amqp]

for using the RabbitMQ amqp python library.

celery[redis]

for using Redis as a message transport or as a result backend.
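
For example, a minimal sketch that uses Redis for both roles (the URLs assume a local Redis on the default port, and the database-number split is arbitrary):

    from celery import Celery

    app = Celery('example',
                 broker='redis://localhost:6379/0',   # task messages
                 backend='redis://localhost:6379/1')  # task results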

celery[sqs]

for using Amazon SQS as a message transport.

celery[tblib]

for using the task_remote_tracebacks feature.
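
With tblib installed, the feature is enabled through a single setting; a minimal sketch:

    from celery import Celery

    app = Celery('example', broker='amqp://guest@localhost//')

    # Task errors re-raised on the client include the worker's traceback.
    app.conf.task_remote_tracebacks = True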

celery[memcache]

for using Memcached as a result backend (using pylibmc)

celery[pymemcache]

for using Memcached as a result backend (pure-Python implementation).

celery[cassandra]

for using Apache Cassandra/Astra DB as a result backend with the DataStax driver.

celery[azureblockblob]

for using Azure Storage as a result backend (using azure-storage)

celery[s3]

for using S3 Storage as a result backend.

celery[gcs]

for using Google Cloud Storage as a result backend.

celery[couchbase]

for using Couchbase as a result backend.

celery[arangodb]

for using ArangoDB as a result backend.

celery[elasticsearch]

for using Elasticsearch as a result backend.

celery[riak]

for using Riak as a result backend.

celery[cosmosdbsql]

for using Azure Cosmos DB as a result backend (using pydocumentdb)

celery[zookeeper]

for using Zookeeper as a message transport.

celery[sqlalchemy]

for using SQLAlchemy as a result backend (supported).

celery[pyro]

for using the Pyro4 message transport (experimental).

celery[slmq]

for using the SoftLayer Message Queue transport (experimental).

celery[consul]

for using the Consul.io Key/Value store as a message transport or result backend (experimental).

celery[django]

specifies the lowest version possible for Django support.

You should probably not use this in your requirements; it's here for informational purposes only.

Downloading and installing from source

Download the latest version of Celery from PyPI:

https://pypi.org/project/celery/

You can install it by doing the following:


$ tar xvfz celery-0.0.0.tar.gz
$ cd celery-0.0.0
$ python setup.py build
# python setup.py install

The last command must be executed as a privileged user if you aren't currently using a virtualenv.

Using the development version

With pip

The Celery development version also requires the development versions of kombu, amqp, billiard, and vine.

You can install the latest snapshot of these using the following pip commands:


$ pip install https://github.com/celery/celery/zipball/main#egg=celery
$ pip install https://github.com/celery/billiard/zipball/main#egg=billiard
$ pip install https://github.com/celery/py-amqp/zipball/main#egg=amqp
$ pip install https://github.com/celery/kombu/zipball/main#egg=kombu
$ pip install https://github.com/celery/vine/zipball/main#egg=vine

With git

Please see the Contributing section.

Getting Help

Mailing list

For discussions about the usage, development, and future of Celery, please join the celery-users mailing list.

IRC

Come chat with us on IRC. The #celery channel is located at the Libera Chat network.

Bug tracker

If you have any suggestions, bug reports, or annoyances please report them to our issue tracker at https://github.com/celery/celery/issues/

Wiki

https://github.com/celery/celery/wiki

Credits

Contributors

This project exists thanks to all the people who contribute. Development of celery happens at GitHub: https://github.com/celery/celery

You're highly encouraged to participate in the development of celery. If you don't like GitHub (for some reason) you're welcome to send regular patches.

Be sure to also read the Contributing to Celery section in the documentation.


Backers

Thank you to all our backers! 🙏 [Become a backer]


Sponsors

Support this project by becoming a sponsor. Your logo will show up here with a link to your website. [Become a sponsor]

Upstash

License

This software is licensed under the New BSD License. See the LICENSE file in the top distribution directory for the full license text.

django-celery's People

Contributors

artscoop, ask, auvipy, davidfischer-ch, diegueus9, dlamotte, dongweiming, enagorny, iamjstates, ionelmc, jasonbaker, jezdez, jonashaag, justquick, kipanshi, kmike, nikolas, nuklea, nvie, pigjj, piotrsikora, public, realitycheck, rockallite, stranger6667, thedrow, vytisb, weipin, zhiwehu, zpencerq


django-celery's Issues

celery.execute.send_task lacks input validation

I use the function

celery.execute.send_task(name,...) 

to submit tasks to Celery, and if there is a typo in "name" the task is still accepted and Celery issues a task_id. Any subsequent call to

default_app.backend.get_status(task_id)

returns "PENDING" and there is no obvious way to know the submission was invalid and the task is being ignored.

CELERY_CACHE_BACKEND setting docs improvements

The docs on the CELERY_CACHE_BACKEND could be improved.

For one thing, they claim that celery can be used with pylibmc, and that in fact python-memcached will only be used if pylibmc is not installed. However, this doesn't seem to be true. If I don't have python-memcached installed, the setup described in the docs will cause the celeryd management command to fail.

Additionally, the docs make no mention of the fact that (at least in django 1.3) one has the option of using the CACHES setting to define multiple cache backends. Because CELERY_CACHE_BACKEND is passed directly into django.core.cache.get_cache, one can simply set CELERY_CACHE_BACKEND to the key for the appropriate cache, instead of trying to figure out celery-specific settings. This would be much simpler, especially since the arguments accepted by get_cache are only described in its docstring, not in the django cache documentation.
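
A sketch of the reporter's suggestion in settings.py terms (the 'celery' cache alias and the memcached location are arbitrary examples; the setting names follow the old django-celery conventions):

CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
    },
    'celery': {
        'BACKEND': 'django.core.cache.backends.memcached.PyLibMCCache',
        'LOCATION': '127.0.0.1:11211',
    },
}

CELERY_RESULT_BACKEND = 'cache'
# Passed through to django.core.cache.get_cache, so a CACHES alias works here.
CELERY_CACHE_BACKEND = 'celery'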

result.get hangs if result backend not set

Original problem:
I have two different Django projects, say projA and projB. Each has its own celery daemon running on separate queues but the same vhost. projA has a task taskA and projB has a task taskB, and I try to run taskB from inside taskA, e.g.

from celery.decorators import task
from celery.execute import send_task

@task(routing_key='taskA')
def taskA(event_id):
    # do some work, then call taskB and wait for its result
    result = send_task('taskB', routing_key='taskB')
    res = result.get(timeout=20)

I can see in the logs of projB that taskB finished within a second, but taskA keeps on waiting for the result and times out after 20 seconds.

For backend I have rabbitmq.

Solution:
Setting the result back-end fixes the problem

CELERY_RESULT_BACKEND = "amqp"
CELERY_AMQP_TASK_RESULT_EXPIRES = 1000 

IMO if the result backend is not set, result.get should throw an error or at least log a warning.

ERROR: test_reserve (djcelery.tests.test_schedulers.test_DatabaseScheduler)

Running tests with python-2.6.6 fails:

ERROR: test_reserve (djcelery.tests.test_schedulers.test_DatabaseScheduler)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/tmp/buildd/django-celery-2.2.2/djcelery/tests/test_schedulers.py", line 159, in test_reserve
    self.s[self.m1.name] = self.s.reserve(e1)
TypeError: 'TrackingScheduler' object does not support item assignment

TaskState.args and TaskState.kwargs in djcelery model are currently limited to 200 chars.

I'm using some tasks with very long kwargs. When running the camera I received this stack trace:

Traceback (most recent call last):
  File "manage.py", line 11, in <module>
    execute_manager(settings)
  File "/opt/py26/lib/python2.6/site-packages/django/core/management/__init__.py", line 438, in execute_manager
    utility.execute()
  File "/opt/py26/lib/python2.6/site-packages/django/core/management/__init__.py", line 379, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "/opt/py26/lib/python2.6/site-packages/django/core/management/base.py", line 191, in run_from_argv
    self.execute(*args, **options.__dict__)
  File "/opt/py26/lib/python2.6/site-packages/django/core/management/base.py", line 220, in execute
    output = self.handle(*args, **options)
  File "/opt/py26/lib/python2.6/site-packages/djcelery/management/commands/celerycam.py", line 19, in handle
    run_celeryev(*args, **options)
  File "/opt/py26/lib/python2.6/site-packages/celery/bin/celeryev.py", line 49, in run_celeryev
    loglevel=loglevel, logfile=logfile)
  File "/opt/py26/lib/python2.6/site-packages/celery/events/snapshot.py", line 96, in evcam
    cam.cancel()
  File "/opt/py26/lib/python2.6/site-packages/celery/events/snapshot.py", line 61, in cancel
    self._tref()
  File "/opt/py26/lib/python2.6/site-packages/celery/utils/timer2.py", line 41, in __call__
    return self.fun(*self.args, **self.kwargs)
  File "/opt/py26/lib/python2.6/site-packages/celery/utils/timer2.py", line 192, in _reschedules
    return fun(*args, **kwargs)
  File "/opt/py26/lib/python2.6/site-packages/celery/events/snapshot.py", line 57, in capture
    return self.state.freeze_while(self.shutter)
  File "/opt/py26/lib/python2.6/site-packages/celery/events/state.py", line 198, in freeze_while
    return fun(*args, **kwargs)
  File "/opt/py26/lib/python2.6/site-packages/celery/events/snapshot.py", line 53, in shutter
    self.on_shutter(self.state)
  File "/opt/py26/lib/python2.6/site-packages/djcelery/snapshot.py", line 82, in on_shutter
    map(self.handle_task, state.tasks.items())
  File "/opt/py26/lib/python2.6/site-packages/djcelery/snapshot.py", line 55, in handle_task
    "worker": worker})
  File "/opt/py26/lib/python2.6/site-packages/djcelery/snapshot.py", line 65, in update_task
    return objects.create(**dict(kwargs, **defaults))
  File "/opt/py26/lib/python2.6/site-packages/django/db/models/manager.py", line 138, in create
    return self.get_query_set().create(**kwargs)
  File "/opt/py26/lib/python2.6/site-packages/django/db/models/query.py", line 352, in create
    obj.save(force_insert=True, using=self.db)
  File "/opt/py26/lib/python2.6/site-packages/django/db/models/base.py", line 434, in save
    self.save_base(using=using, force_insert=force_insert, force_update=force_update)
  File "/opt/py26/lib/python2.6/site-packages/django/db/models/base.py", line 527, in save_base
    result = manager._insert(values, return_id=update_pk, using=using)
  File "/opt/py26/lib/python2.6/site-packages/django/db/models/manager.py", line 195, in _insert
    return insert_query(self.model, values, **kwargs)
  File "/opt/py26/lib/python2.6/site-packages/django/db/models/query.py", line 1479, in insert_query
    return query.get_compiler(using=using).execute_sql(return_id)
  File "/opt/py26/lib/python2.6/site-packages/django/db/models/sql/compiler.py", line 783, in execute_sql
    cursor = super(SQLInsertCompiler, self).execute_sql(None)
  File "/opt/py26/lib/python2.6/site-packages/django/db/models/sql/compiler.py", line 727, in execute_sql
    cursor.execute(sql, params)
  File "/opt/py26/lib/python2.6/site-packages/django/db/backends/postgresql_psycopg2/base.py", line 44, in execute
    return self.cursor.execute(query, args)
django.db.utils.DatabaseError: value too long for type character varying(200)

I was able to resolve the issue for my case by increasing the size of the TaskState.args and TaskState.kwargs to 1900. However the correct solution is probably to use a TextField.

Using 'dummy' cache backend throws a ValueError

I have the following in my settings file.

CELERY_RESULT_BACKEND = "cache"
CELERY_CACHE_BACKEND = 'dummy'

Whenever I load my tasks with this configuration, Celery throws a ValueError and dies. Traceback follows:

/usr/local/lib/python2.7/site-packages/celery/local.pyc in __getattr__(self, name)
     54         if name == '__members__':
     55             return dir(self._get_current_object())
---> 56         return getattr(self._get_current_object(), name)
     57 
     58     def __setitem__(self, key, value):

/usr/local/lib/python2.7/site-packages/kombu/utils/__init__.pyc in __get__(self, obj, type)
    219             return obj.__dict__[self.__name__]
    220         except KeyError:
--> 221             value = obj.__dict__[self.__name__] = self.__get(obj)
    222             return value
    223 

/usr/local/lib/python2.7/site-packages/celery/app/__init__.pyc in Task(self)
    161     def Task(self):
    162         """Default Task base class for this application."""
--> 163         return self.create_task_cls()
    164 
    165     def __repr__(self):

/usr/local/lib/python2.7/site-packages/celery/app/__init__.pyc in create_task_cls(self)
     59         from celery.task.base import BaseTask
     60 
---> 61         class Task(BaseTask):
     62             abstract = True
     63             app = self

/usr/local/lib/python2.7/site-packages/celery/app/__init__.pyc in Task()
     62             abstract = True
     63             app = self
---> 64             backend = self.backend
     65             exchange_type = conf.CELERY_DEFAULT_EXCHANGE_TYPE
     66             delivery_mode = conf.CELERY_DEFAULT_DELIVERY_MODE

/usr/local/lib/python2.7/site-packages/kombu/utils/__init__.pyc in __get__(self, obj, type)
    219             return obj.__dict__[self.__name__]
    220         except KeyError:
--> 221             value = obj.__dict__[self.__name__] = self.__get(obj)
    222             return value
    223 

/usr/local/lib/python2.7/site-packages/celery/app/base.pyc in backend(self)
    310         """Storing/retreiving task state.  See
    311         :class:`~celery.backend.base.BaseBackend`."""
--> 312         return self._get_backend()
    313 
    314     @cached_property

/usr/local/lib/python2.7/site-packages/celery/app/base.pyc in _get_backend(self)
    294         from celery.backends import get_backend_cls
    295         backend_cls = self.backend_cls or self.conf.CELERY_RESULT_BACKEND
--> 296         backend_cls = get_backend_cls(backend_cls, loader=self.loader)
    297         return backend_cls(app=self)
    298 

/usr/local/lib/python2.7/site-packages/celery/backends/__init__.py in get_backend_cls(backend, loader)
     26             raise ValueError("Unknown result backend: %r.  "
     27                              "Did you spell it correctly?  (%s)" % (backend,
---> 28                                                                     exc))
     29     return _backend_cache[backend]
     30 

ValueError: Unknown result backend: 'cache'.  Did you spell it correctly?  (Couldn't import 'djcelery.backends.cache.CacheBackend': need more than 1 value to unpack)

"tuple index out of range" when using task_status view on task generating an exception

In the "task_status" view, whenever I try to call this view on a task that has generated an exception, I get a "tuple index out of range" error. Specifically, the errors occurs in the dictionary where it assigns "result": str(res.args[0])

I am not sure what "res.args" is exactly, but replacing that line with 'str(res.args[0] if res.args else "")' allows me to get the exception info as expected.

open db connections

Connections are kept open, although no work is done (by the tasks; they are just waiting for the next periodic execution). I am using celery 1.0.5.

I think we can close the connection after the task, not before the
task as is done now. (comment on mailing list from Ask Solem)

DatabaseScheduler does not respect the 'enabled' setting

If I start the celery daemon and disable a task in the Django admin, everything works fine and the task is not executed anymore.

But once I restart the celery runner the task I disabled is just running again.

I think there is a self.get_schedule() missing in DatabaseScheduler.__init__ or DatabaseScheduler.setup_schedule.

Regards, Christopher

AttributeError: 'Settings' object has no attribute 'get'

After including 'djcelery' (2.2.4) in Django as described in the documentation, the instance is no longer accessible due to an "AttributeError". It seems that the .get() method used in celery/app/base.py on the Django settings is in conflict with django.conf.LazySettings, which seems to have no .get().
Traceback:
File "/usr/lib/pymodules/python2.6/django/core/handlers/base.py" in get_response
  91. request.path_info)
File "/usr/lib/pymodules/python2.6/django/core/urlresolvers.py" in resolve
  215. for pattern in self.url_patterns:
File "/usr/lib/pymodules/python2.6/django/core/urlresolvers.py" in _get_url_patterns
  244. patterns = getattr(self.urlconf_module, "urlpatterns", self.urlconf_module)
File "/usr/lib/pymodules/python2.6/django/core/urlresolvers.py" in _get_urlconf_module
  239. self._urlconf_module = import_module(self.urlconf_name)
File "/usr/lib/pymodules/python2.6/django/utils/importlib.py" in import_module
  35. __import__(name)
File "/var/www/wsgi/MyIdea/urls.py" in <module>
  4. admin.autodiscover()
File "/usr/lib/pymodules/python2.6/django/contrib/admin/__init__.py" in autodiscover
  24. import_module('%s.admin' % app)
File "/usr/lib/pymodules/python2.6/django/utils/importlib.py" in import_module
  35. __import__(name)
File "/usr/lib/pymodules/python2.6/django/contrib/auth/admin.py" in <module>
  151. admin.site.register(Group, GroupAdmin)
File "/usr/lib/pymodules/python2.6/django/contrib/admin/sites.py" in register
  90. validate(admin_class, model)
File "/usr/lib/pymodules/python2.6/django/contrib/admin/validation.py" in validate
  20. models.get_apps()
File "/usr/lib/pymodules/python2.6/django/db/models/loading.py" in get_apps
  115. self._populate()
File "/usr/lib/pymodules/python2.6/django/db/models/loading.py" in _populate
  61. self.load_app(app_name, True)
File "/usr/lib/pymodules/python2.6/django/db/models/loading.py" in load_app
  78. models = import_module('.models', app_name)
File "/usr/lib/pymodules/python2.6/django/utils/importlib.py" in import_module
  35. __import__(name)
File "/usr/lib/pymodules/python2.6/djcelery/models.py" in <module>
  12. from celery import conf
File "/usr/lib/pymodules/python2.6/celery/conf.py" in <module>
  13. conf = current_app.conf
File "/usr/lib/pymodules/python2.6/celery/local.py" in __getattr__
  56. return getattr(self._get_current_object(), name)
File "/usr/lib/pymodules/python2.6/celery/utils/__init__.py" in __get__
  423. value = obj.__dict__[self.__name__] = self.__get(obj)
File "/usr/lib/pymodules/python2.6/celery/app/base.py" in conf
  256. return self._get_config()
File "/usr/lib/pymodules/python2.6/celery/app/base.py" in _get_config
  240. [self.prepare_config(self.loader.conf), deepcopy(DEFAULTS)])
File "/usr/lib/pymodules/python2.6/celery/app/base.py" in prepare_config
  197. if not c.get("CELERY_RESULT_BACKEND"):
File "/usr/lib/pymodules/python2.6/django/utils/functional.py" in __getattr__
  277. return getattr(self._wrapped, name)

Exception Type: AttributeError at /
Exception Value: 'Settings' object has no attribute 'get'

This site can be used as an example for this error: http://myidea.fladi.at/

djcelery management commands missing

Hi,

On Ubuntu Lucid, I have installed django from the Ubuntu package manager. After that, I installed django-celery and celery with 'pip install.' When running ./manage.py help in my django project the management commands for celery are missing.

How do I get them to show up?
Thanks,
/mike

Tasks are skipped using Django 1.3 and the database backend (Postgres 9.4)

Setup is
postgres 9.4
celery 2.2.6
django-celery 2.2.4
django 1.3

If I call the same task twice during a transaction, only one gets executed and only one row is inserted in the database.

I switched the backend to "cache" (memcached) and the problem is solved, but it is problematic because there is no trace of errors. I even get a task id.

Redis backend does not appear to work

I have the following in my settings.py file.

import djcelery
djcelery.setup_loader()

BROKER_BACKEND = "redis"
BROKER_HOST = "127.0.0.1"
BROKER_PORT = 6379
BROKER_VHOST = "0"

CELERY_IMPORTS = ('app.tasks', )
CELERY_RESULT_BACKEND = 'redis'
REDIS_HOST = 6379
REDIS_PORT = 6379
REDIS_DB = 0
CELERY_IGNORE_RESULT = False

Whenever I run an async task, I am unable to view the result. When I call a task with task=method.delay(arg) and call task.result (after the task is completed), I get a long traceback which ends with the following:

/usr/local/lib/python2.7/site-packages/celery/backends/pyredis.pyc in get(self, key)
     62 
     63     def get(self, key):
---> 64         return self.client.get(key)
     65 
     66     def set(self, key, value):

/usr/local/lib/python2.7/site-packages/redis/client.pyc in get(self, name)
    364         Return the value at key ``name``, or None if the key doesn't exist
    365         """
--> 366         return self.execute_command('GET', name)
    367 
    368     def __getitem__(self, name):

/usr/local/lib/python2.7/site-packages/redis/client.pyc in execute_command(self, *args, **options)
    232         connection = pool.get_connection(command_name, **options)
    233         try:
--> 234             connection.send_command(*args)
    235             return self.parse_response(connection, command_name, **options)
    236         except ConnectionError:

/usr/local/lib/python2.7/site-packages/redis/connection.pyc in send_command(self, *args)
    204     def send_command(self, *args):
    205         "Pack and send a command to the Redis server"
--> 206         self.send_packed_command(self.pack_command(*args))
    207 
    208     def read_response(self):

/usr/local/lib/python2.7/site-packages/redis/connection.pyc in send_packed_command(self, command)
    194         "Send an already packed command to the Redis server"
    195         try:
--> 196             self._send(command)
    197         except ConnectionError:
    198             # retry the command once in case the socket connection simply


/usr/local/lib/python2.7/site-packages/redis/connection.pyc in _send(self, command)
    178         "Send the command to the socket"
    179         if not self._sock:
--> 180             self.connect()
    181         try:
    182             self._sock.sendall(command)

/usr/local/lib/python2.7/site-packages/redis/connection.pyc in connect(self)
    123             return
    124         try:
--> 125             sock = self._connect()
    126         except socket.error, e:
    127             raise ConnectionError(self._error_message(e))

/usr/local/lib/python2.7/site-packages/redis/connection.pyc in _connect(self)
    134         sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    135         sock.settimeout(self.socket_timeout)
--> 136         sock.connect((self.host, self.port))
    137         return sock
    138 

/usr/local/lib/python2.7/socket.pyc in meth(name, self, *args)
    222 
    223 def meth(name,self,*args):
--> 224     return getattr(self._sock,name)(*args)
    225 
    226 for _m in _socketmethods:

TypeError: coercing to Unicode: need string or buffer, int found

Problem with syncdb

When I drop all tables in my database with djcelery installed and run 'manage.py syncdb', I get this error:

Traceback (most recent call last):
  File "manage.py", line 15, in <module>
    execute_manager(settings)
  File "/usr/lib/pymodules/python2.6/django/core/management/__init__.py", line 438, in execute_manager
    utility.execute()
  File "/usr/lib/pymodules/python2.6/django/core/management/__init__.py", line 379, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "/usr/lib/pymodules/python2.6/django/core/management/base.py", line 191, in run_from_argv
    self.execute(*args, **options.__dict__)
  File "/usr/lib/pymodules/python2.6/django/core/management/base.py", line 218, in execute
    output = self.handle(*args, **options)
  File "/usr/lib/pymodules/python2.6/django/core/management/base.py", line 347, in handle
    return self.handle_noargs(**options)
  File "/usr/lib/pymodules/python2.6/django/core/management/commands/syncdb.py", line 55, in handle_noargs
    tables = connection.introspection.table_names()
  File "/usr/lib/pymodules/python2.6/django/db/backends/__init__.py", line 493, in table_names
    return self.get_table_list(cursor)
  File "/usr/lib/pymodules/python2.6/django/db/backends/postgresql/introspection.py", line 31, in get_table_list
    AND pg_catalog.pg_table_is_visible(c.oid)""")
  File "/usr/lib/pymodules/python2.6/django/db/backends/util.py", line 15, in execute
    return self.cursor.execute(sql, params)
  File "/usr/lib/pymodules/python2.6/django/db/backends/postgresql_psycopg2/base.py", line 44, in execute
    return self.cursor.execute(query, args)
django.db.utils.DatabaseError: current transaction is aborted, commands ignored until end of transaction block

When I comment djcelery out of INSTALLED_APPS, it runs fine.

djcelery.admin.TaskMonitor changelist view memory usage

Hi Ask,

I'm using django-celery for task monitoring + redis as a broker & result backend. The code is deployed using apache2 + mod_wsgi in daemon mode.

There is an issue with the djcelery.admin.TaskMonitor changelist view: after each page refresh, memory usage of the apache process goes up by several tens of megabytes (about 40 MB). After some random time this memory is released, but that does not happen fast enough, so several page refreshes can cause the frontend to eat all the memory and the server to swap. The tasks have large arguments (I think they can be about 100 kB of text for some tasks).

The workaround that works fine for me:

import gc
from django.contrib import admin
from djcelery.admin import TaskMonitor, TaskState

class PatchedTaskMonitor(TaskMonitor):
    def changelist_view(self, request, extra_context=None):
        result = super(PatchedTaskMonitor, self).changelist_view(request, extra_context)
        gc.collect()
        return result

admin.site.unregister(TaskState)
admin.site.register(TaskState, PatchedTaskMonitor)

I'm not submitting a pull request with gc.collect() because maybe you have a better idea about this leak and weakref in certain place can fix the underlying issue.

install error

When trying to install with easy_install I am getting the following error:

Processing django_celery-1.1.1-py2.6.egg
django-celery 1.1.1 is already the active version in easy-install.pth

Using /usr/local/lib/python2.6/dist-packages/django_celery-1.1.1-py2.6.egg
Processing dependencies for django-celery
Searching for celery>=1.1.0
Reading http://pypi.python.org/simple/celery/
Reading http://github.com/ask/celery/
No local packages or download links found for celery>=1.1.0
error: Could not find suitable distribution for Requirement.parse('celery>=1.1.0')

DatabaseScheduler is not reusing crontab schedule

I'm using Django-Celery 2.2.1. I recently started using the Celery Beat for scheduling tasks. I've set up a task in the settings file, under the CELERYBEAT_SCHEDULE var. I'm using CELERYBEAT_SCHEDULER = "djcelery.schedulers.DatabaseScheduler", so the tasks are kept in the database. I noticed something rather strange: every time I start the celery-beat process, the database is updated. While the task itself remains the same, the interval record is recreated each time, with the exact same parameters. This seems rather redundant and just creates unnecessary records in the database.

No handlers could be found for logger "multiprocessing"

On Windows 7 + Apache2 + mod_wsgi + Python 2.6

I am getting the above error when I try to run the worker server
python manage.py celeryd -l info
as described in the 'first steps with django' docs.

However I can do this:
$ python manage.py shell
Python 2.6.3 (r263:75183, Oct 5 2009, 14:41:55) [MSC v.1500 32 bit (Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
(InteractiveConsole)
>>> import multiprocessing
>>> logger = multiprocessing.get_logger()
>>> logger
<logging.Logger instance at 0x02F8C4E0>
>>>

So it seems like the obvious parts are in place. Please help!

Error on djcelerymon with Django 1.2.

When running djcelerymon from the command line with Django 1.2 installed, we get an error such as "'Command' object has no attribute 'stdout'"; the self.stdout proxy was introduced in Django 1.2.
On 1.1.x series you can run it without problems.

wsgi: os.environ["CELERY_LOADER"] must come before DJANGO_SETTINGS_MODULE

In the docs it just says:

If you’re using mod_wsgi to deploy your Django application you need to include the following in your .wsgi module:

import os
os.environ["CELERY_LOADER"] = "django"

You get import errors unless they are in this order:

os.environ["CELERY_LOADER"] = "django"
os.environ['DJANGO_SETTINGS_MODULE'] = 'mysite.settings'

(I had it the other way around before!)

Comparing booleans with integers doesn't work in PostgreSQL

Traceback (most recent call last):
  File "/usr/local/lib/python2.6/dist-packages/celery/utils/timer2.py", line 155, in apply_entry
    entry()
  File "/usr/local/lib/python2.6/dist-packages/celery/utils/timer2.py", line 41, in __call__
    return self.fun(*self.args, **self.kwargs)
  File "/usr/local/lib/python2.6/dist-packages/celery/utils/timer2.py", line 229, in _reschedules
    return fun(*args, **kwargs)
  File "/usr/local/lib/python2.6/dist-packages/celery/events/snapshot.py", line 45, in cleanup
    self.on_cleanup()
  File "/usr/local/lib/python2.6/dist-packages/djcelery/snapshot.py", line 133, in on_cleanup
    self.TaskState.objects.purge()
  File "/usr/local/lib/python2.6/dist-packages/djcelery/managers.py", line 194, in purge
    self.model._meta.db_table, ))

DatabaseError: operator does not exist: boolean = integer
LINE 1: DELETE FROM djcelery_taskstate WHERE hidden=1

Changing the query to hidden = true would work without a problem in both PostgreSQL and MySQL. But I'm not completely sure about the rest of the databases.

Is there any specific reason for executing manual deletes instead of something like: self.model.objects.all().filter(hidden=True).delete()? (not tested)

Use unicode() instead of repr() to log scheduled tasks.

Hi,

Using DatabaseBackend, if one uses unicode characters in PeriodicTasks name, celerybeat will refuse to start with traceback:

[2011-05-08 07:35:37,596: CRITICAL/MainProcess] celerybeat raised exception <type 'exceptions.UnicodeEncodeError'>:   UnicodeEncodeError('ascii', u"<ModelEntry: Zadanie akceptuj\u0105ce project.app.tasks.task(*[], **{u'when_minutes_pass': 30L}) {<freq: 15 minutes>}", 29, 30, 'ordinal not in range(128)')
Traceback (most recent call last):
  File "/sites/development/lib/python2.7/site-packages/celery/apps/beat.py", line 88, in start_scheduler
    beat.start()
  File "/sites/development/lib/python2.7/site-packages/celery/beat.py", line 367, in start
    humanize_seconds(self.scheduler.max_interval)))
  File "/sites/development/lib/python2.7/site-packages/kombu/utils/__init__.py", line 220, in     __get__
    value = obj.__dict__[self.__name__] = self.__get(obj)
  File "/sites/development/lib/python2.7/site-packages/celery/beat.py", line 407, in scheduler
    return self.get_scheduler()
  File "/sites/development/lib/python2.7/site-packages/celery/beat.py", line 402, in     get_scheduler
    lazy=lazy)
  File "/sites/development/lib/python2.7/site-packages/celery/utils/__init__.py", line 325, in     instantiate
    return get_cls_by_name(name)(*args, **kwargs)
  File "/sites/development/lib/python2.7/site-packages/djcelery/schedulers.py", line 109, in     __init__
    Scheduler.__init__(self, *args, **kwargs)
  File "/sites/development/lib/python2.7/site-packages/celery/beat.py", line 148, in __init__
    self.setup_schedule()
  File "/sites/development/lib/python2.7/site-packages/djcelery/schedulers.py", line 113, in     setup_schedule
    self.install_default_entries(self.schedule)
  File "/sites/development/lib/python2.7/site-packages/celery/beat.py", line 279, in schedule
    return self.get_schedule()
  File "/sites/development/lib/python2.7/site-packages/djcelery/schedulers.py", line 197, in     get_schedule
    for entry in self._schedule.values()))
  File "/sites/development/lib/python2.7/site-packages/djcelery/schedulers.py", line 197, in <    genexpr>
    for entry in self._schedule.values()))
UnicodeEncodeError: 'ascii' codec can't encode character u'\u0105' in position 29: ordinal not     in range(128)

In aforementioned djcelery/schedulers.py, line 197, if you use unicode instead of repr, everything should be alright. Repr unfortunately insists on converting the unicode string to bytes (at least in Python 2.x).

Best regards,
Tomek Kopczuk.

django_celery 2.2.4 transaction error with Django 1.3

Error Message:

Django Version: 1.3
Exception Type: TransactionManagementError
Exception Value: Transaction managed block ended with pending COMMIT/ROLLBACK

How to reproduce:
Initialize a DatabaseScheduler object, which ultimately will fail because the flush() method of schedulers.py uses transaction.commit_manually and doesn't do a commit/rollback before returning in the "if not self._dirty" case.

Fix:
in schedulers.py

def flush(self):
    self.logger.debug("Writing dirty entries...")
    if not self._dirty:
        transaction.commit()  # <<<<< added: commit before returning
        return

Server closed the connection unexpectedly

I'm having a weird issue with PostgreSQL (8.4.7) and celery (2.2.5) and Django (1.2.3).

Suddenly all my celery tasks that touch the database are randomly raising:

DatabaseError: server closed the connection unexpectedly
    This probably means the server terminated abnormally
    before or while processing the request.

Sometimes a task fails 3 times, and then works just fine, other times it fails randomly.

if I run:

$python manage.py shell

and create an object and save it to the database, all works fine.
Here's the traceback:

[2011-04-14 16:59:43,071: INFO/MainProcess] Got task from broker:   WebTools.controlpanel.tasks.discoverDeviceTask[15dcea93-d3ff-416d-aedd-bc6fe80a6f6b]
[2011-04-14 16:59:43,122: WARNING/PoolWorker-25] server closed the connection unexpectedly
    This probably means the server terminated abnormally
    before or while processing the request.
[2011-04-14 16:59:43,126: ERROR/MainProcess] Task WebTools.controlpanel.tasks.discoverDeviceTask[15dcea93-d3ff-416d-aedd-bc6fe80a6f6b] raised exception: DatabaseError('server closed the connection unexpectedly\n\tThis probably means the server terminated abnormally\n\tbefore or while processing the request.\n',)
Traceback (most recent call last):
  File "/usr/local/lib/python2.6/dist-packages/celery/execute/trace.py", line 34, in trace
    return cls(states.SUCCESS, retval=fun(*args, **kwargs))
  File "/usr/local/lib/python2.6/dist-packages/celery/task/base.py", line 234, in __call__
    return self.run(*args, **kwargs)
  File "/usr/local/ast-tools/WebTools/../WebTools/controlpanel/tasks.py", line 54, in wrapper
    IP.objects.filter(ip=ip).update(busy="N")
  File "/usr/local/lib/python2.6/dist-packages/django/db/models/query.py", line 467, in update
    rows = query.get_compiler(self.db).execute_sql(None)
  File "/usr/local/lib/python2.6/dist-packages/django/db/models/sql/compiler.py", line 861, in execute_sql
    cursor = super(SQLUpdateCompiler, self).execute_sql(result_type)
  File "/usr/local/lib/python2.6/dist-packages/django/db/models/sql/compiler.py", line 727, in execute_sql
    cursor.execute(sql, params)
  File "/usr/local/lib/python2.6/dist-packages/django/db/backends/util.py", line 15, in execute
    return self.cursor.execute(sql, params)
  File "/usr/local/lib/python2.6/dist-packages/django/db/backends/postgresql_psycopg2/base.py", line 50, in execute
    return self.cursor.execute(query, args)
DatabaseError: server closed the connection unexpectedly
    This probably means the server terminated abnormally
    before or while processing the request.

Here's database log:

2011-04-14 18:26:37 CEST LOG:  connection received: host=127.0.0.1 port=40295
2011-04-14 18:26:37 CEST LOG:  connection authorized: user=ast database=astdb
2011-04-14 18:26:37 CEST LOG:  disconnection: session time: 0:00:00.044 user=ast database=astdb host=127.0.0.1 port=40295
2011-04-14 18:26:54 CEST LOG:  statement: UPDATE "djcelery_workerstate" SET "hostname" = E'ast-tool01', "last_heartbeat" = E'2011-04-14 18:26:52.081696' WHERE "djcelery_workerstate"."id" = 2 
2011-04-14 18:27:02 CEST LOG:  statement: INSERT INTO "djcelery_taskstate" ("state", "task_id", "name", "tstamp", "args", "kwargs", "eta", "expires", "result", "traceback", "runtime", "worker_id", "hidden") VALUES (E'FAILURE', E'352493f7-1fe5-4216-92aa-da44478233d2', E'WebTools.TestingReports.tasks.parseTestResultTaskv2', E'2011-04-14 18:27:00.068098', E'[u''/media/sdb/AST_TestLogs/testresults/'']', E'{}', NULL, NULL, E'DatabaseError(''server closed the connection unexpectedly\\n\\tThis probably means the server terminated abnormally\\n\\tbefore or while processing the request.\\n'',)', E'Traceback (most recent call last):
      File "/usr/local/lib/python2.6/dist-packages/celery/execute/trace.py", line 34, in trace
        return cls(states.SUCCESS, retval=fun(*args, **kwargs))
      File "/usr/local/lib/python2.6/dist-packages/celery/task/base.py", line 234, in __call__
        return self.run(*args, **kwargs)
      File "/usr/local/ast-tools/WebTools/../WebTools/TestingReports/tasks.py", line 45, in wrapper
        return func(*args, **kwargs)
      File "/usr/local/ast-tools/WebTools/../WebTools/TestingReports/tasks.py", line 243, in run
        tr.save()
      File "/usr/local/ast-tools/WebTools/../WebTools/TestingReports/models.py", line 60, in save
        super(TestReport, self).save(*args, **kwargs)
      File "/usr/local/lib/python2.6/dist-packages/django/db/models/base.py", line 434, in save
        self.save_base(using=using, force_insert=force_insert, force_update=force_update)
      File "/usr/local/lib/python2.6/dist-packages/django/db/models/base.py", line 527, in save_base
        result = manager._insert(values, return_id=update_pk, using=using)
      File "/usr/local/lib/python2.6/dist-packages/django/db/models/manager.py", line 195, in _insert
        return insert_query(self.model, values, **kwargs)
      File "/usr/local/lib/python2.6/dist-packages/django/db/models/query.py", line 1479, in insert_query
        return query.get_compiler(using=using).execute_sql(return_id)
      File "/usr/local/lib/python2.6/dist-packages/django/db/models/sql/compiler.py", line 783, in execute_sql
        cursor = super(SQLInsertCompiler, self).execute_sql(None)
      File "/usr/local/lib/python2.6/dist-packages/django/db/models/sql/compiler.py", line 727, in execute_sql
        cursor.execute(sql, params)
      File "/usr/local/lib/python2.6/dist-packages/django/db/backends/postgresql_psycopg2/base.py", line 50, in execute
        return self.cursor.execute(query, args)
    DatabaseError: server closed the connection unexpectedly
        This probably means the server terminated abnormally
        before or while processing the request.
    ', NULL, 2, false)

Does anybody know what's going on?
Thanks

FAIL: test_memcache_wrapper (djcelery.tests.test_backends.test_cache.test_MemcacheWrapper)

Creating test database for alias 'default'...
.......F.........................................................
======================================================================
FAIL: test_memcache_wrapper (djcelery.tests.test_backends.test_cache.test_MemcacheWrapper)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/users/FladischerMichael/Development/Debian/django-celery/django-celery-2.2.4/djcelery/tests/test_backends/test_cache.py", line 130, in test_memcache_wrapper
    self.assertIsInstance(cache, DjangoMemcacheWrapper)
AssertionError: <django.core.cache.backends.locmem.LocMemCache object at 0x2f21490> is not an instance of <class 'djcelery.backends.cache.DjangoMemcacheWrapper'>

----------------------------------------------------------------------
Ran 65 tests in 1.849s

FAILED (failures=1)
Traceback (most recent call last):
  File "/usr/lib/python2.6/multiprocessing/util.py", line 235, in _run_finalizers
    finalizer()
  File "/usr/lib/python2.6/multiprocessing/util.py", line 174, in __call__
    res = self._callback(*self._args, **self._kwargs)
  File "/home/users/FladischerMichael/Development/Debian/django-celery/django-celery-2.2.4/djcelery/tests/test_schedulers.py", line 64, in flush
    schedulers.DatabaseScheduler.flush(self)
  File "/usr/lib/pymodules/python2.6/django/db/transaction.py", line 217, in inner
    res = func(*args, **kwargs)
  File "/home/users/FladischerMichael/Development/Debian/django-celery/django-celery-2.2.4/djcelery/schedulers.py", line 167, in flush
    self.schedule[name].save()
  File "/usr/lib/python2.6/dist-packages/celery/beat.py", line 285, in schedule
    return self.get_schedule()
  File "/home/users/FladischerMichael/Development/Debian/django-celery/django-celery-2.2.4/djcelery/schedulers.py", line 190, in get_schedule
    self.flush()
  File "/home/users/FladischerMichael/Development/Debian/django-celery/django-celery-2.2.4/djcelery/tests/test_schedulers.py", line 64, in flush
    schedulers.DatabaseScheduler.flush(self)
  File "/usr/lib/pymodules/python2.6/django/db/transaction.py", line 222, in inner
    self.__exit__(None, None, None)
  File "/usr/lib/pymodules/python2.6/django/db/transaction.py", line 207, in __exit__
    self.exiting(exc_value, self.using)
  File "/usr/lib/pymodules/python2.6/django/db/transaction.py", line 302, in exiting
    leave_transaction_management(using=using)
  File "/usr/lib/pymodules/python2.6/django/db/transaction.py", line 56, in leave_transaction_management
    connection.leave_transaction_management()
  File "/usr/lib/pymodules/python2.6/django/db/backends/__init__.py", line 115, in leave_transaction_management
    raise TransactionManagementError("Transaction managed block ended with "
TransactionManagementError: Transaction managed block ended with pending COMMIT/ROLLBACK

@periodic_task with literal as the hour does not work (task never gets called)

We are trying to dispatch a simple periodic task using django-celery with the decorator-style cron definition. Only the top two (using "*") seem to work; with all of the others, the task is NEVER fired. Very frustrating....

ok: @periodic_task(run_every=crontab(hour="*", minute="*", day_of_week=[1,2,3,4,5]))
ok: @periodic_task(run_every=crontab(hour="*", minute=[0], day_of_week=[1,2,3,4,5]))
not ok: @periodic_task(run_every=crontab(hour=16, minute=[0], day_of_week=[1,2,3,4,5]))
not ok: @periodic_task(run_every=crontab(hour="16", minute=[0], day_of_week=[1,2,3,4,5]))
not ok: @periodic_task(run_every=crontab(hour=[16], minute=[0], day_of_week=[1,2,3,4,5]))
not ok: @periodic_task(run_every=crontab(hour="16", minute="*", day_of_week=[1,2,3,4,5]))

We are testing by modifying the Linux system time and then starting celeryd; we pray a lot that the task will work, and it never does...

I am running django-celery 2.0.2

Console logging fails when logging non-string objects

When using celery from a console with local output, if the log message contains something which is not a string, the console output may fail with the following traceback.

Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/logging/__init__.py", line 838, in emit
    msg = self.format(record)
  File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/logging/__init__.py", line 715, in format
    return fmt.format(record)
  File "/Users/xxxxx/dev/python-envs/sfl1/lib/python2.7/site-packages/celery/log.py", line 42, in format
    record.msg = unicode(color(record.msg))
  File "/Users/xxxxx/dev/python-envs/sfl1/lib/python2.7/site-packages/celery/utils/term.py", line 67, in __str__
    return prefix + reduce(self._add, self.s) + suffix
TypeError: cannot concatenate 'str' and 'instance' objects
Logged from file client.py, line 648

Can't use `--eta` argument with `celeryctl`

It doesn't seem to be possible to use the --eta argument with celeryctl because the string value is never converted to a datetime, but some parts of celery assume that it is a datetime:

$ py manage.py celeryctl apply --eta=$(date) asdf
...lib/python2.6/site-packages/piston/oauth.py:42: DeprecationWarning: BaseException.message has been deprecated as of Python 2.6
  self.message = message
...lib/python2.6/site-packages/piston/authentication.py:142: DeprecationWarning: BaseException.message has been deprecated as of Python 2.6
  response = HttpResponse(err.message.encode('utf-8'))
Traceback (most recent call last):
  File "manage.py", line 18, in <module>
    execute_manager(settings)
  File "...lib/python2.6/site-packages/django/core/management/__init__.py", line 438, in execute_manager
    utility.execute()
  File "...lib/python2.6/site-packages/django/core/management/__init__.py", line 379, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "...lib/python2.6/site-packages/djcelery/management/commands/celeryctl.py", line 26, in run_from_argv
    util.execute_from_commandline(self.handle_default_options(argv)[1:])
  File "...lib/python2.6/site-packages/celery/bin/base.py", line 72, in execute_from_commandline
    return self.handle_argv(prog_name, argv[1:])
  File "...lib/python2.6/site-packages/celery/bin/celeryctl.py", line 340, in handle_argv
    return self.execute(command, argv)
  File "...lib/python2.6/site-packages/celery/bin/celeryctl.py", line 330, in execute
    cls(app=self.app).run_from_argv(self.prog_name, argv)
  File "...lib/python2.6/site-packages/celery/bin/celeryctl.py", line 73, in run_from_argv
    self(*args, **options.__dict__)
  File "...lib/python2.6/site-packages/celery/bin/celeryctl.py", line 47, in __call__
    self.run(*args, **kwargs)
  File "...lib/python2.6/site-packages/celery/bin/celeryctl.py", line 170, in run
    expires=expires)
  File "...lib/python2.6/site-packages/celery/app/base.py", line 227, in send_task
    return result_cls(new_id)
  File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/contextlib.py", line 34, in __exit__
    self.gen.throw(type, value, traceback)
  File "...lib/python2.6/site-packages/celery/app/base.py", line 287, in default_connection
    yield connection
  File "...lib/python2.6/site-packages/celery/app/base.py", line 224, in send_task
    expires=expires, **options)
  File "...lib/python2.6/site-packages/celery/app/amqp.py", line 215, in delay_task
    eta = eta and eta.isoformat()
AttributeError: 'str' object has no attribute 'isoformat'

Calling result.ready() twice before the task is completed makes result.ready() return False forever, despite the task being completed

One important note is that this doesn't happen under celery alone; it occurred after I tried the task under django-celery.
Below is output from bpython:

In [50]: res = sleep_10_sec_and_add.delay(1,9)

In [51]: res.ready()
Out[51]: False

In [52]: # 10 sec passes

In [53]: res.ready()
Out[53]: True

In [54]: 

In [55]: res = sleep_10_sec_and_add.delay(1,9)

In [56]: res.ready()
Out[56]: False

In [57]: res.ready()
Out[57]: False

In [58]: ## 10 sec passes

In [59]: res.ready()
Out[59]: False    # despite celeryd logs showing the task was completed successfully

I have provided very little information; if you cannot reproduce the bug, contact me.

Dumpdata complains celery_taskmeta doesn't exist

When attempting to dump the entire database using dumpdata, django complains that celery_taskmeta doesn't exist. Since django-celery doesn't require celery to be an installed-app, syncdb never creates these tables and thus dumpdata shouldn't have to ever dump them.

python manage.py dumpdata --settings=proj.settings > db.json
/home/ec2-user/django-trunk/django/db/models/fields/subclassing.py:80: DeprecationWarning: A Field class whose get_db_prep_save method hasn't been updated to take a connection argument.
  new_class = super(SubfieldBase, cls).__new__(cls, name, bases, attrs)
Traceback (most recent call last):
  File "manage.py", line 11, in <module>
    execute_manager(settings)
  File "/home/ec2-user/django-trunk/django/core/management/__init__.py", line 438, in execute_manager
    utility.execute()
  File "/home/ec2-user/django-trunk/django/core/management/__init__.py", line 379, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "/home/ec2-user/django-trunk/django/core/management/base.py", line 191, in run_from_argv
    self.execute(*args, **options.__dict__)
  File "/home/ec2-user/django-trunk/django/core/management/base.py", line 220, in execute
    output = self.handle(*args, **options)
  File "/home/ec2-user/django-trunk/django/core/management/commands/dumpdata.py", line 110, in handle
    objects.extend(model._default_manager.using(using).all())
  File "/home/ec2-user/django-trunk/django/db/models/query.py", line 84, in __len__
    self._result_cache.extend(list(self._iter))
  File "/home/ec2-user/django-trunk/django/db/models/query.py", line 273, in iterator
    for row in compiler.results_iter():
  File "/home/ec2-user/django-trunk/django/db/models/sql/compiler.py", line 680, in results_iter
    for rows in self.execute_sql(MULTI):
  File "/home/ec2-user/django-trunk/django/db/models/sql/compiler.py", line 735, in execute_sql
    cursor.execute(sql, params)
  File "/home/ec2-user/django-trunk/django/db/backends/mysql/base.py", line 86, in execute
    return self.cursor.execute(query, args)
  File "/usr/lib64/python2.6/site-packages/MySQLdb/cursors.py", line 173, in execute
    self.errorhandler(self, exc, value)
  File "/usr/lib64/python2.6/site-packages/MySQLdb/connections.py", line 36, in defaulterrorhandler
    raise errorclass, errorvalue
django.db.utils.DatabaseError: (1146, "Table 'swipegood.celery_taskmeta' doesn't exist")

DeprecationWarning: integer argument expected, got float

django/core/cache/backends/memcached.py:54: DeprecationWarning: integer argument expected, got float
  self._cache.set(smart_str(key), value, self._get_memcache_timeout(timeout))

backends/cache.py:

if isinstance(expires, timedelta):
    expires = timedelta_seconds(conf.TASK_RESULT_EXPIRES)

timedelta_seconds() returns a float instead of int and that upsets django.

--settings not supported with celeryctl

While this works perfectly for me:

$ python manage.py celeryd --settings=settings_foo

this does not:

$ python manage.py celeryctl status --settings=settings_foo
Usage: celeryctl status [options]

celeryctl: error: no such option: --settings

I need settings_foo because the normal settings is not using django-celery at all. I'm using 2.2.4 and celery 2.2.6.

Decorator @task_with_respect_to_language for using tasks with respect of current site's languages

What do you think about adding this to the repo?
Created following the recommendations from here: http://celeryq.org/docs/django-celery/faq.html#generating-a-template-in-a-task-doesn-t-seem-to-respect-my-i18n-settings

from django.utils import translation
from celery.decorators import task
from django.utils.functional import wraps
def task_with_respect_to_language(func):
    '''
    Decorator for tasks with respect to site's current language.
    You can use in tasks.py this decorator @task_respect_to_language instead of default @task
    Be sure that task method have kwargs argument:

        @task_respect_to_language
        def my_task(...., **kwargs):
            pass

    You can call this task this way:

        from django.utils import translation
        tasks.my_task.delay(...., language=translation.get_language())
    '''
    def wrapper(*args, **kwargs):
        language = kwargs.pop('language', None)
        prev_language = translation.get_language()
        language and translation.activate(language)
        try:
            return func(*args, **kwargs)
        finally:
            translation.activate(prev_language)

    wrapper.__doc__ = func.__doc__
    wrapper.__name__ = func.__name__
    wrapper.__module__ = func.__module__
    return wraps(func)(task(wrapper))

Generic Celery Init Script Doesn't Work With Latest Release

Hi there,

I've been doing some tinkering with the latest release of django-celery and celery, and found an issue with the generic init script: http://www.google.com/url?sa=D&q=https://raw.github.com/ask/celery/master/contrib/generic-init.d/celeryd

Assuming that I've defined a celeryd configuration file:

# /etc/default/celeryd
CELERYD_CHDIR="/var/www/partyliner"
CELERYD="/var/www/partyliner/manage.py celeryd --settings=settings.dev"

When I launch celeryd, via /etc/init.d/celeryd start I get the following error:

celeryd-multi v2.3.1
> Starting nodes...
 > celery.q.rcitelco.com: Usage: /var/www/partyliner/manage.py celeryd [options] 

Runs a Celery worker node.

/var/www/partyliner/manage.py: error: no such option: --workdir
* Child terminated with failure code 2
FAILED

The latest version of celeryd no longer accepts --workdir as an argument. The only way I'm able to get celeryd to run via the init script is to add the following to my configuration file, /etc/default/celeryd:

CELERYD_CHDIR=""

Which (since empty) forces the init script to run celeryd without the --workdir option. I can't help but feel this is a hack, however. I think that either:

  1. celery needs to be updated to allow the --workdir option again, as the docs all suggest using it and it is hard to debug for new users, or
  2. The docs / init script should be updated to not use CELERYD_CHDIR anywhere.

Stale tasks are not removed from the djcelery_periodictask table

When using djcelery.schedulers.DatabaseScheduler:

  1. Define a periodic task
  2. Run celerybeat
  3. Observe task added to djcelery_periodictask table
  4. Rename the periodic task
  5. Observe new task added to djcelery_periodictask table, but old task not removed

This also happens when removing periodic tasks—the database entries persist after the tasks no longer exist in code.

Pip install of django-celery 2.2.4 is broken as of celery 3.0 release

Hi Ask,

Ran into some trouble after trying to install django-celery 2.2.4. When I install this version, celery 3.0 is installed when pip looks for the requirements.

The issue is that there is no longer a file named functional.py within celery 3.0:

  File "/home/alfred/.hudson/jobs/beta/workspace/ve/lib/python2.6/site-packages/djcelery/models.py", line 18, in <module>
    from djcelery.managers import TaskManager, TaskSetManager, ExtendedManager
  File "/home/alfred/.hudson/jobs/beta/workspace/ve/lib/python2.6/site-packages/djcelery/managers.py", line 6, in <module>
    from celery.utils.functional import wraps
ImportError: No module named functional

This seems to be because the setup.py file for django-celery 2.2.4 has this:

install_requires=[
        "django-picklefield",
        "celery>=2.2.4",
    ],

So it seems the installer sees that dependency, but since it can be anything equal to or greater than 2.2.4, it installs celery 3.0, which no longer has this functional.py file.

After installing celery==2.2.4 first and then installing django-celery==2.2.4 as I want to, this works fine, but it breaks if I do not have celery already installed.

Perhaps I am doing something wrong, so correct me if I'm wrong. Otherwise, amazing work on celery and all your other tools/libs :-)

Thanks!
