bretth / django-pq
[UNMAINTAINED] A task queue based on the elegant Python RQ, but with a Django PostgreSQL backend.
License: BSD 2-Clause "Simplified" License
Nice project, just perfect for what I was looking for. You probably know better than I do what's going on: when using pqworker with a SerialQueue, it throws this error:
AttributeError: 'Queue' object has no attribute 'acquire_lock'
I think the problem is in queue.py, line 331:
if q.serial and not q.acquire_lock(timeout):
I'm just following the README for now:
from pq import SerialQueue
q = SerialQueue('serial')
q.enqueue_call(my_call)
I'll try to figure out if I can provide more debug info; I'm not yet sure exactly how your classes interact.
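A hedged workaround sketch, not django-pq's actual code: guard the serial-lock check so a queue object that lacks `acquire_lock` (the plain `Queue` in this report) can't raise `AttributeError`. The class and method names below are stand-ins for the ones in queue.py.

```python
# Minimal stand-ins for the classes named in the report.
class Queue:
    serial = False  # plain queues are not serial and have no lock

class SerialQueue(Queue):
    serial = True

    def acquire_lock(self, timeout):
        # The real method would take a DB-level lock; always succeed here.
        return True

def should_skip_queue(q, timeout):
    """True when a serial queue can't take its lock; never raises
    AttributeError when the object has no acquire_lock at all."""
    acquire = getattr(q, 'acquire_lock', None)
    return bool(q.serial) and (acquire is None or not acquire(timeout))
```

With a guard like this, the line at queue.py:331 would skip (rather than crash on) any queue object that claims to be serial but lacks the lock method.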
I have just started testing a project that depends on django-pq against Django 1.7.
All the builds run fine apart from the Python 3.2 one, which errors on 1.7.
E File "/home/travis/virtualenv/python3.2/lib/python3.2/site-packages/pq/migrations/0003_auto__add_field_worker_heartbeat.py", line 12
E db.add_column(u'pq_worker', 'heartbeat',
E ^
E SyntaxError: invalid syntax
It seems like when Django 1.7 parses the old migration files, it has some problem with the unicode literal. Any idea why this might be?
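For context: the `u'...'` prefix was removed in Python 3.0 and only restored in 3.3 (PEP 414), so South-era migration files written under Python 2 are a SyntaxError on Python 3.2 specifically, regardless of the Django version. A minimal sketch of a fix (a hypothetical helper, not part of django-pq) is to strip the prefix, which is harmless on Python 2:

```python
import re

def strip_u_prefix(source):
    """Rewrite u'...' / u"..." literals to plain strings so the file also
    parses on Python 3.0-3.2, where the u prefix is a SyntaxError
    (PEP 414 only restored it in Python 3.3)."""
    return re.sub(r"\bu(['\"])", r"\1", source)

line = "db.add_column(u'pq_worker', 'heartbeat',"
fixed = strip_u_prefix(line)
# The rewritten line is valid on every Python 2 and 3 version.
```

This is a blunt textual rewrite (it doesn't tokenise the source), but for machine-generated migration files it is usually sufficient.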
When I run python manage.py pqworker default
it consumes all the already-enqueued jobs the first time it runs, then shows: "*** Listening on default..."
Then I keep enqueuing more jobs, but they are never consumed. Once a job's timeout is exceeded, the worker does perform it, but moves it to the failed list because it has timed out.
What am I doing wrong?
While testing pq out I managed to break it by adding a job scheduled for the future and then immediately enqueuing a regular one. This produced the following partial stack:
File "/Users/m/Dropbox/projects/tlink/pq/worker.py", line 313, in work
result = self.dequeue_job_and_maintain_ttl(timeout)
File "/Users/m/Dropbox/projects/tlink/pq/worker.py", line 352, in dequeue_job_and_maintain_ttl
return PQ.dequeue_any(self.queues, timeout)
File "/Users/m/Dropbox/projects/tlink/pq/queue.py", line 360, in dequeue_any
if q.serial and not q.acquire_lock(timeout):
AttributeError: 'unicode' object has no attribute 'serial'
It seems to be caused by the fact that Job._get_job_or_promise returns job.queue_id as the promise, while Queue.dequeue_any expects a Queue object when it adds it to the stack (queue_stack.append(promise)).
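A sketch of what the fix might look like (names are assumed from the report above, not the actual django-pq internals): normalise the promise back into a Queue before pushing it onto the stack, since the promise is sometimes a bare queue_id string.

```python
# Stand-in Queue with a registry, so a queue_id string can be resolved
# back to a live Queue object before dequeue_any touches q.serial.
class Queue:
    _registry = {}

    def __init__(self, name, serial=False):
        self.name = name
        self.serial = serial
        Queue._registry[name] = self

    @classmethod
    def from_id(cls, queue_id):
        return cls._registry[queue_id]

def push_promise(queue_stack, promise):
    # The reported bug: promise is sometimes a queue_id string, which
    # later blows up with "'unicode' object has no attribute 'serial'".
    if not isinstance(promise, Queue):
        promise = Queue.from_id(promise)
    queue_stack.append(promise)
```

The same normalisation could instead live in Job._get_job_or_promise, so every caller of dequeue_any gets a Queue and not a string.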
While trying to figure out why the worker doesn't get notified of new jobs in the queue, I also noticed that the worker crashes after PQ_DEFAULT_WORKER_TTL seconds (minus 60 seconds, if I'm reading the code correctly).
It looks like expires_after is None in dequeue_job_and_maintain_ttl. I'm attaching the traceback; I'm running f840154. Sorry for not providing a patch.
[DEBUG] pq.worker: Registering death
Traceback (most recent call last):
File "/home/gandalf/ck/eggs/Django-1.5.1-py2.7.egg/django/core/management/base.py", line 222, in run_from_argv
self.execute(*args, **options.__dict__)
File "/home/gandalf/ck/eggs/Django-1.5.1-py2.7.egg/django/core/management/base.py", line 255, in execute
output = self.handle(*args, **options)
File "/home/gandalf/ck/parts/pq/pq/management/commands/pqworker.py", line 51, in handle
w.work(burst=options['burst'])
File "/home/gandalf/ck/parts/pq/pq/worker.py", line 290, in work
result = self.dequeue_job_and_maintain_ttl(timeout)
File "/home/gandalf/ck/parts/pq/pq/worker.py", line 324, in dequeue_job_and_maintain_ttl
if self.expires_after < now():
TypeError: can't compare datetime.datetime to NoneType
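A hedged sketch of a guard for this crash (the attribute and setting names follow the traceback above; the default TTL value here is illustrative only): treat a None expires_after as "deadline not set yet" and re-arm it from the TTL, rather than comparing None against a datetime.

```python
from datetime import datetime, timedelta

PQ_DEFAULT_WORKER_TTL = 420  # seconds; illustrative value, not the real default

def is_expired(expires_after, now=None):
    """None means the deadline was never set, so don't treat it as past."""
    if expires_after is None:
        return False
    return expires_after < (now or datetime.utcnow())

def next_deadline(now=None):
    """Re-arm the worker deadline from the TTL."""
    return (now or datetime.utcnow()) + timedelta(seconds=PQ_DEFAULT_WORKER_TTL)
```

With a check like this, dequeue_job_and_maintain_ttl would initialise the deadline on first use instead of raising TypeError.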
As per the proposed integration of rq-scheduler with rq:
schedule_call(datetime(2020, 1, 1, 3, 4), func, foo, bar=baz, interval=interval, repeat=repeat)
interval: seconds after the first scheduled date (not after the job finishes)
repeat: repeat n times, where 0/None = run once and -1 = forever
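A sketch of what that signature could look like in Python (names and semantics are assumed from the proposal above; this is not an existing django-pq API):

```python
from datetime import datetime

def schedule_call(at, func, *args, interval=None, repeat=None, **kwargs):
    """Run func(*args, **kwargs) at `at`; then every `interval` seconds,
    `repeat` more times (0/None = run once, -1 = repeat forever)."""
    # A real implementation would persist this as a scheduled Job row;
    # here we just return the schedule record for illustration.
    return {
        'at': at, 'func': func, 'args': args, 'kwargs': kwargs,
        'interval': interval, 'repeat': repeat,
    }

job = schedule_call(datetime(2020, 1, 1, 3, 4), print, 'hello',
                    interval=3600, repeat=-1)
```

Making interval and repeat keyword-only avoids the ambiguity of passing them positionally after the job's own arguments.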
Because workers share the Django db stack, they are vulnerable to exceptions in jobs. Essentially the design would be better all round if django-pq used its own connection handlers, which aren't shared with jobs. This would also make it easier to receive notification of new jobs via the postgres connection.
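One shape this could take (a sketch only; the channel name and design are assumptions, not current django-pq behaviour): the worker holds a dedicated connection and LISTENs on it for new-job notifications, while jobs keep using the normal Django stack. The helpers below build the statements such a worker and enqueuer would issue:

```python
def quote_channel(channel):
    # LISTEN/NOTIFY channels are SQL identifiers, so double-quote them
    # defensively (doubling any embedded quote characters).
    return '"%s"' % channel.replace('"', '""')

def listen_sql(channel):
    """Statement the worker runs once on its private connection."""
    return 'LISTEN %s;' % quote_channel(channel)

def notify_sql(channel):
    """Statement enqueue() would run after inserting a job row."""
    return 'NOTIFY %s;' % quote_channel(channel)
```

The worker would then block on select() over its private connection's socket instead of polling, and a job blowing up the shared Django connection would no longer take the notification channel down with it.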
Not sure whether I need to do anything about this, but I noticed it on Heroku and may need to follow up to ensure failures are handled correctly. It may be a postgres pool issue.
Traceback (most recent call last):
File "./manage.py", line 24, in <module>
execute_from_command_line(sys.argv)
File "/app/.heroku/python/lib/python2.7/site-packages/django/core/management/__init__.py", line 443, in execute_from_command_line
utility.execute()
File "/app/.heroku/python/lib/python2.7/site-packages/django/core/management/__init__.py", line 382, in execute
self.fetch_command(subcommand).run_from_argv(self.argv)
File "/app/.heroku/python/lib/python2.7/site-packages/django/core/management/base.py", line 196, in run_from_argv
self.execute(*args, **options.__dict__)
File "/app/.heroku/python/lib/python2.7/site-packages/django/core/management/base.py", line 232, in execute
output = self.handle(*args, **options)
File "/app/.heroku/python/lib/python2.7/site-packages/pq/management/commands/pqworker.py", line 42, in handle
w = Worker.create(queues, name=options.get('name'), connection=options['connection'])
File "/app/.heroku/python/lib/python2.7/site-packages/pq/worker.py", line 113, in create
w.failed_queue = get_failed_queue(connection)
File "/app/.heroku/python/lib/python2.7/site-packages/pq/queue.py", line 24, in get_failed_queue
return FailedQueue.create(connection=connection)
File "/app/.heroku/python/lib/python2.7/site-packages/pq/queue.py", line 439, in create
fq.save()
File "/app/.heroku/python/lib/python2.7/site-packages/django/db/models/base.py", line 463, in save
self.save_base(using=using, force_insert=force_insert, force_update=force_update)
File "/app/.heroku/python/lib/python2.7/site-packages/django/db/models/base.py", line 506, in save_base
self.save_base(cls=parent, origin=org, using=using)
File "/app/.heroku/python/lib/python2.7/site-packages/django/db/models/base.py", line 524, in save_base
manager.using(using).filter(pk=pk_val).exists())):
File "/app/.heroku/python/lib/python2.7/site-packages/django/db/models/query.py", line 562, in exists
return self.query.has_results(using=self.db)
File "/app/.heroku/python/lib/python2.7/site-packages/django/db/models/sql/query.py", line 441, in has_results
return bool(compiler.execute_sql(SINGLE))
File "/app/.heroku/python/lib/python2.7/site-packages/django/db/models/sql/compiler.py", line 817, in execute_sql
cursor = self.connection.cursor()
File "/app/.heroku/python/lib/python2.7/site-packages/django/db/backends/__init__.py", line 308, in cursor
cursor = util.CursorWrapper(self._cursor(), self)
File "/app/.heroku/python/lib/python2.7/site-packages/django_postgrespool/base.py", line 112, in _cursor
self.connection = db_pool.connect(**self._get_conn_params())
File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/pool.py", line 1048, in connect
return self.get_pool(*args, **kw).connect()
File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/pool.py", line 272, in connect
return _ConnectionFairy(self).checkout()
File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/pool.py", line 425, in __init__
rec = self._connection_record = pool._do_get()
File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/pool.py", line 777, in _do_get
con = self._create_connection()
File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/pool.py", line 225, in _create_connection
return _ConnectionRecord(self)
File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/pool.py", line 318, in __init__
self.connection = self.__connect()
File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/pool.py", line 368, in __connect
connection = self.__pool._creator()
File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/pool.py", line 1026, in <lambda>
self.module.connect(*args, **kw), **self.kw)
File "/app/.heroku/python/lib/python2.7/site-packages/psycopg2/__init__.py", line 179, in connect
connection_factory=connection_factory, async=async)
psycopg2.OperationalError: could not translate host name "***.compute-1.amazonaws.com" to address: Temporary failure in name resolution
I just stumbled on a small problem (in my config?). The question is whether we should fix this in django-pq, or at least document the behavior.
pqworker forks a process to do the job work, and the Worker class kills the forked process when the work is done. So the result is that failed jobs are never sent to Sentry.
What would be the best way to fix this?
edit: changed the raven version where this appeared.
Call me crazy, but I use Windows as my development platform (and Linux on servers). django-pq does not run on Windows at all (it unsurprisingly crashes on os.fork()). That's not a blocker for me, but it would be nice to list supported (or unsupported) OSes in the README.
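A hedged sketch of the startup capability check this suggests (hypothetical helpers, not existing django-pq code): os.fork does not exist on Windows, so the worker could fail fast with a clear message instead of crashing deep inside the work loop.

```python
import os

def fork_supported():
    # os.fork is only available on POSIX platforms; on Windows the
    # attribute is simply absent.
    return hasattr(os, 'fork')

def ensure_fork_supported():
    """Raise a readable error at worker startup on unsupported platforms."""
    if not fork_supported():
        raise RuntimeError(
            'django-pq workers need os.fork(), which this platform '
            '(e.g. Windows) does not provide.')
```

The pqworker management command could call ensure_fork_supported() in handle() before entering the work loop.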
I am getting this error:
PicklingError: Can't pickle <class 'libs.ckeditor.fields.CKEditorString'>: attribute lookup libs.ckeditor.fields.CKEditorString failed
When a certain model is added to the queue as an argument to a function.
I will look into it more to see what is causing it.
https://app.getsentry.com/canada/canada-heroku/group/16425192/
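This is the usual cause of that PicklingError: pickle serialises instances by recording their class's importable module path, and fails when the class can't be looked up again at that path. Custom field classes like CKEditorString often trip this. A minimal reproduction with a stand-in class:

```python
import pickle

def make_local_class():
    # A class defined at function scope can't be found by module-level
    # attribute lookup, mirroring the CKEditorString failure mode.
    class LocalString(str):
        pass
    return LocalString

def can_pickle(obj):
    try:
        pickle.dumps(obj)
        return True
    except (pickle.PicklingError, AttributeError):
        return False
```

If the enqueued argument carried only plain types (e.g. str(value) instead of the custom string subclass), the job would pickle fine.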
It appears the distribution django-pq is no longer available on PyPI.
joy:~ chief$ pip install django-pq
Downloading/unpacking django-pq
Real name of requirement django-pq is django-pq
Could not find any downloads that satisfy the requirement django-pq
No distributions at all found for django-pq
Storing complete log in /Users/chief/.pip/pip.log
I have also manually searched for django-pq on PyPI and it returns no results.