
django-pq's People

Contributors

bretth, saulshanabrook

django-pq's Issues

AttributeError: 'Queue' object has no attribute 'acquire_lock'

Nice project, just perfect for what I was looking for. You might be quicker than me at spotting what's going on: when using pqworker with a SerialQueue, it throws this error:

AttributeError: 'Queue' object has no attribute 'acquire_lock'

I think the problem is in queue.py, line 331:

            if q.serial and not q.acquire_lock(timeout):

I'm just following readme for now:

from pq import SerialQueue
q = SerialQueue('serial')
q.enqueue_call(my_call)

I'll try to figure out whether I can provide more debug info; I'm not yet sure how your classes interact.
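The crash suggests the dequeue path assumes every queue exposes acquire_lock, while plain Queue instances don't. A minimal sketch of a defensive check (class and method names are taken from the traceback; the real fix in queue.py may differ):

```python
# Sketch: only serial queues carry the lock API, so guard on both the
# flag and the method instead of assuming acquire_lock always exists.
class Queue:
    serial = False

class SerialQueue(Queue):
    serial = True

    def acquire_lock(self, timeout):
        return True  # stand-in for the real lock acquisition

def can_dequeue(q, timeout=10):
    # Same intent as `if q.serial and not q.acquire_lock(timeout)`,
    # but safe when a plain Queue reaches this code path.
    if getattr(q, 'serial', False) and hasattr(q, 'acquire_lock'):
        return q.acquire_lock(timeout)
    return True

print(can_dequeue(Queue()), can_dequeue(SerialQueue()))  # True True
```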

Django 1.7 and Python 3.2 Error

I have just started doing testing with a project that depends on django-pq for Django 1.7.

All builds run fine apart from the Python 3.2 one, which errors on Django 1.7.

E         File "/home/travis/virtualenv/python3.2/lib/python3.2/site-packages/pq/migrations/0003_auto__add_field_worker_heartbeat.py", line 12
E           db.add_column(u'pq_worker', 'heartbeat',
E                                    ^
E       SyntaxError: invalid syntax

It seems that when Django 1.7 parses the old migration files, it has a problem with the unicode literal. Any idea why this might be?
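For context, the u'' string prefix was removed in Python 3.0 and only restored in Python 3.3 by PEP 414, so any module containing u'...' fails to parse on 3.2 regardless of what Django does with it. A quick way to confirm it's the parser, not the migration machinery:

```python
def parses(src):
    """Return True if src is valid syntax on the running interpreter."""
    try:
        compile(src, '<migration>', 'exec')
        return True
    except SyntaxError:
        return False

# True on Python 2.x and 3.3+; False on Python 3.2, which matches
# the SyntaxError pointing at u'pq_worker' in the traceback above.
print(parses("db.add_column(u'pq_worker', 'heartbeat', None)"))
```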

PQ worker doesn't consume jobs as expected

When I run python manage.py pqworker default it consumes all already-enqueued jobs the first time it runs, then shows: "*** Listening on default..."
After that I keep enqueuing more jobs, but they are never consumed. Once a job's timeout is exceeded, the worker does perform it, but then moves it to the failed list because it timed out.
What am I doing wrong?

Queue.dequeue_any breaks with a promise

While testing pq out I managed to break it by adding a job scheduled for the future and then immediately enqueuing a regular one. This resulted in this partial stack:

  File "/Users/m/Dropbox/projects/tlink/pq/worker.py", line 313, in work
    result = self.dequeue_job_and_maintain_ttl(timeout)
  File "/Users/m/Dropbox/projects/tlink/pq/worker.py", line 352, in dequeue_job_and_maintain_ttl
    return PQ.dequeue_any(self.queues, timeout)
  File "/Users/m/Dropbox/projects/tlink/pq/queue.py", line 360, in dequeue_any
    if q.serial and not q.acquire_lock(timeout):
AttributeError: 'unicode' object has no attribute 'serial'

It seems to be caused by Job._get_job_or_promise returning job.queue_id as the promise, while Queue.dequeue_any expects a Queue object when it pushes the promise onto the stack (queue_stack.append(promise)).
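A self-contained toy of the described bug (hypothetical names; the registry dict stands in for an ORM lookup): the promised queue id has to be resolved back to a Queue object before it re-enters the stack, otherwise the next pass trips over a bare string.

```python
class Queue:
    def __init__(self, name, serial=False):
        self.name = name
        self.serial = serial

registry = {'default': Queue('default')}  # stand-in for an ORM lookup

def push_promise(queue_stack, promise):
    # BUG shape: queue_stack.append(promise) pushes the raw queue_id
    # string, and the later `q.serial` access raises AttributeError.
    # FIX shape: resolve the id to a Queue instance first.
    queue_stack.append(registry[promise])
    return all(hasattr(q, 'serial') for q in queue_stack)

print(push_promise([], 'default'))  # True
```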

TypeError: can't compare datetime.datetime to NoneType

While trying to figure out why the worker doesn't get notified of new jobs in the queue, I also noticed that the worker crashes after PQ_DEFAULT_WORKER_TTL seconds (minus 60 seconds, if I'm reading the code correctly).

It looks like expires_after is None in dequeue_job_and_maintain_ttl. I'm attaching a traceback; I'm running f840154. Sorry for not providing a patch.

[DEBUG] pq.worker: Registering death
Traceback (most recent call last):
  File "/home/gandalf/ck/eggs/Django-1.5.1-py2.7.egg/django/core/management/base.py", line 222, in run_from_argv
    self.execute(*args, **options.__dict__)
  File "/home/gandalf/ck/eggs/Django-1.5.1-py2.7.egg/django/core/management/base.py", line 255, in execute
    output = self.handle(*args, **options)
  File "/home/gandalf/ck/parts/pq/pq/management/commands/pqworker.py", line 51, in handle
    w.work(burst=options['burst'])
  File "/home/gandalf/ck/parts/pq/pq/worker.py", line 290, in work
    result = self.dequeue_job_and_maintain_ttl(timeout)
  File "/home/gandalf/ck/parts/pq/pq/worker.py", line 324, in dequeue_job_and_maintain_ttl
    if self.expires_after < now():
TypeError: can't compare datetime.datetime to NoneType
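A sketch of a guard for the comparison that crashes (names mirror the traceback; the real fix may instead ensure expires_after is always initialised before the work() loop starts):

```python
from datetime import datetime, timedelta, timezone

def now():
    return datetime.now(timezone.utc)

def check_ttl(expires_after, default_ttl=420):
    """Treat a missing expiry as 'renew now' instead of comparing
    None against a datetime, which raises TypeError on Python 2."""
    if expires_after is None:
        expires_after = now() + timedelta(seconds=default_ttl)
    expired = expires_after < now()
    return expires_after, expired

_, expired = check_ttl(None)
print(expired)  # False: a freshly renewed TTL is never already expired
```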

scheduled tasks api

as per the proposed integration of rq-scheduler with rq:

schedule_call(datetime(2020, 1, 1, 3, 4), func, foo, bar=baz, interval=interval, repeat=repeat)

interval: seconds after the first scheduled date (not after the job finishes)
repeat: run n times, where 0/None = no repeat and -1 = forever
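The interval/repeat semantics above can be sketched as a generator (a sketch of the proposal, not django-pq's actual API):

```python
from datetime import datetime, timedelta
from itertools import count

def fire_times(start, interval, repeat):
    """Yield run times for the proposed schedule_call semantics:
    intervals are counted from the first scheduled date (not from
    job completion); repeat=0/None runs once, repeat=-1 forever."""
    steps = count() if repeat == -1 else range((repeat or 0) + 1)
    for n in steps:
        yield start + timedelta(seconds=interval * n)

runs = list(fire_times(datetime(2020, 1, 1, 3, 4), 60, 2))
print(len(runs))  # 3 runs: the first plus two repeats, 60s apart
```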

A job with a db exception is never handled (moved to failed) by the worker

Because workers share the Django db connection stack, they are vulnerable to exceptions raised in jobs. Essentially the design would be better all round if django-pq used its own connection handlers that aren't shared with jobs. This would also make it easier to receive notification of new jobs via the postgres connection.
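A toy model of the isolation argument (no Django involved; Connection is a stand-in for a database connection handle): when the worker's bookkeeping runs on its own handle, a job that wrecks its connection can still be moved to the failed queue.

```python
class Connection:
    """Stand-in for a database connection handle."""
    def __init__(self):
        self.broken = False

    def execute(self, sql):
        if self.broken:
            raise RuntimeError('connection is unusable')
        return 'ok'

def run_job(worker_conn, job):
    job_conn = Connection()        # per-job handle, not shared
    try:
        job(job_conn)
        return 'finished'
    except Exception:
        # The worker's own handle is untouched, so the job can
        # still be recorded as failed.
        worker_conn.execute("UPDATE pq_job SET queue_id='failed'")
        return 'failed'

def bad_job(conn):
    conn.broken = True             # the job poisons *its* connection
    raise ValueError('boom')

print(run_job(Connection(), bad_job))  # failed
```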

Temporary failure in name resolution (database connection)

Not sure whether I need to do anything about this, but I noticed it on Heroku and may need to follow up to ensure failures are handled correctly. It may be a postgres pool issue.

Traceback (most recent call last):
  File "./manage.py", line 24, in <module>
    execute_from_command_line(sys.argv)
  File "/app/.heroku/python/lib/python2.7/site-packages/django/core/management/__init__.py", line 443, in execute_from_command_line
    utility.execute()
  File "/app/.heroku/python/lib/python2.7/site-packages/django/core/management/__init__.py", line 382, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "/app/.heroku/python/lib/python2.7/site-packages/django/core/management/base.py", line 196, in run_from_argv
    self.execute(*args, **options.__dict__)
  File "/app/.heroku/python/lib/python2.7/site-packages/django/core/management/base.py", line 232, in execute
    output = self.handle(*args, **options)
  File "/app/.heroku/python/lib/python2.7/site-packages/pq/management/commands/pqworker.py", line 42, in handle
    w = Worker.create(queues, name=options.get('name'), connection=options['connection'])
  File "/app/.heroku/python/lib/python2.7/site-packages/pq/worker.py", line 113, in create
    w.failed_queue = get_failed_queue(connection)
  File "/app/.heroku/python/lib/python2.7/site-packages/pq/queue.py", line 24, in get_failed_queue
    return FailedQueue.create(connection=connection)
  File "/app/.heroku/python/lib/python2.7/site-packages/pq/queue.py", line 439, in create
    fq.save()
  File "/app/.heroku/python/lib/python2.7/site-packages/django/db/models/base.py", line 463, in save
    self.save_base(using=using, force_insert=force_insert, force_update=force_update)
  File "/app/.heroku/python/lib/python2.7/site-packages/django/db/models/base.py", line 506, in save_base
    self.save_base(cls=parent, origin=org, using=using)
  File "/app/.heroku/python/lib/python2.7/site-packages/django/db/models/base.py", line 524, in save_base
    manager.using(using).filter(pk=pk_val).exists())):
  File "/app/.heroku/python/lib/python2.7/site-packages/django/db/models/query.py", line 562, in exists
    return self.query.has_results(using=self.db)
  File "/app/.heroku/python/lib/python2.7/site-packages/django/db/models/sql/query.py", line 441, in has_results
    return bool(compiler.execute_sql(SINGLE))
  File "/app/.heroku/python/lib/python2.7/site-packages/django/db/models/sql/compiler.py", line 817, in execute_sql
    cursor = self.connection.cursor()
  File "/app/.heroku/python/lib/python2.7/site-packages/django/db/backends/__init__.py", line 308, in cursor
    cursor = util.CursorWrapper(self._cursor(), self)
  File "/app/.heroku/python/lib/python2.7/site-packages/django_postgrespool/base.py", line 112, in _cursor
    self.connection = db_pool.connect(**self._get_conn_params())
  File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/pool.py", line 1048, in connect
    return self.get_pool(*args, **kw).connect()
  File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/pool.py", line 272, in connect
    return _ConnectionFairy(self).checkout()
  File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/pool.py", line 425, in __init__
    rec = self._connection_record = pool._do_get()
  File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/pool.py", line 777, in _do_get
    con = self._create_connection()
  File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/pool.py", line 225, in _create_connection
    return _ConnectionRecord(self)
  File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/pool.py", line 318, in __init__
    self.connection = self.__connect()
  File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/pool.py", line 368, in __connect
    connection = self.__pool._creator()
  File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/pool.py", line 1026, in <lambda>
    self.module.connect(*args, **kw), **self.kw)
  File "/app/.heroku/python/lib/python2.7/site-packages/psycopg2/__init__.py", line 179, in connect
    connection_factory=connection_factory, async=async)
psycopg2.OperationalError: could not translate host name "***.compute-1.amazonaws.com" to address: Temporary failure in name resolution
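One way to handle this kind of transient failure would be to retry the initial connection with backoff before giving up. A hedged sketch (the exception class here is a stand-in for psycopg2.OperationalError, and with_retries is hypothetical, not django-pq API):

```python
import time

class OperationalError(Exception):
    """Stand-in for psycopg2.OperationalError."""

def with_retries(fn, attempts=3, delay=0.01):
    """Retry transient connection failures (e.g. DNS hiccups on
    worker startup) with exponential backoff before giving up."""
    for i in range(attempts):
        try:
            return fn()
        except OperationalError:
            if i == attempts - 1:
                raise
            time.sleep(delay * (2 ** i))

calls = {'n': 0}
def flaky_connect():
    calls['n'] += 1
    if calls['n'] < 3:
        raise OperationalError('Temporary failure in name resolution')
    return 'connected'

print(with_retries(flaky_connect))  # connected on the third attempt
```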

fixing / warning with sentry error-reporting?

I just stumbled on a small problem (in my config?).
The question is whether we should fix this in django-pq, or at least document the behaviour.

  • default Sentry DSNs (in the docs, on Heroku, ...) use the https scheme
  • since getsentry/raven-python@7a69328 (which seems to be in raven 3.6.0), the threaded HTTP transport is the default for the http and https schemes
  • pqworker forks a process to do the job work
  • the threaded raven transport uses a background thread to send events to Sentry (see the source)
  • our Worker class kills the forked process when the work is done

So the result is that failed jobs are never sent to Sentry.

What should be the best solution to fix this?

  • wait somehow until the background thread is finished?
  • on registering, raise an error when an async transport is used?
  • document the behavior?

edit: changed raven-version where this appeared.
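The failure mode can be modelled without raven at all: events queued on a background sender thread are lost if the process exits before the thread drains them, so one option is to drain/join the transport before the forked child exits (a toy sketch, not raven's API):

```python
import queue
import threading

# Toy model: a background "transport" thread drains an event queue,
# as raven's threaded HTTP transport does.
events = queue.Queue()
sent = []

def sender():
    while True:
        item = events.get()
        if item is None:       # sentinel: flush complete
            break
        sent.append(item)

t = threading.Thread(target=sender, daemon=True)
t.start()
events.put('job-failed')

# The fix sketched: drain and join the transport before the child
# process exits, instead of killing it while the queue is non-empty.
events.put(None)
t.join()
print(sent)  # ['job-failed']
```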

Documentation should mention that it does not work on Windows

Call me crazy, but I use Windows as my development platform (and Linux on servers). django-pq does not run on Windows at all (it unsurprisingly crashes on os.fork()). That's not a problem overall, but it would be nice to list supported (or unsupported) OSes in the README.
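Beyond a README note, the worker could fail fast with a clear message instead of crashing mid-run. A minimal sketch (check_platform is a hypothetical helper, not part of django-pq):

```python
import os

def check_platform():
    """Refuse to start the worker on platforms without os.fork()
    (i.e. Windows), rather than crashing once work begins."""
    if not hasattr(os, 'fork'):
        raise SystemExit('pqworker requires a POSIX platform (os.fork)')
    return True

print(check_platform())  # True on Linux/macOS
```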

No distribution found for django-pq

It appears the distribution django-pq is no longer available on PyPI.

joy:~ chief$ pip install django-pq
Downloading/unpacking django-pq
Real name of requirement django-pq is django-pq
Could not find any downloads that satisfy the requirement django-pq
No distributions at all found for django-pq
Storing complete log in /Users/chief/.pip/pip.log

I have also manually searched for django-pq on PyPI and it returns no results.
