arteria / django-background-tasks
This project was forked from lilspikey/django-background-task
A database-backed work queue for Django
License: BSD 3-Clause "New" or "Revised" License
Allow running tasks in parallel
I have an existing function in utils.py that I'd like to turn into a background task. Is that possible?
(I'm calling the function from all over the place so it wouldn't be feasible to replace it with one in tasks.py)
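The usual answer is to keep the function where it is and add a thin wrapper in tasks.py. A minimal sketch of that pattern, with a stand-in decorator so the example is self-contained (the real decorator is `background_task.background`, and `do_cleanup` is a hypothetical utils.py function):

```python
def background(schedule=0):          # stand-in for background_task.background
    def wrap(fn):
        fn.schedule = schedule       # record the delay, as the real proxy would
        return fn
    return wrap

def do_cleanup(path):                # stands in for the existing utils.py function
    return "cleaned %s" % path

@background(schedule=60)
def do_cleanup_task(path):
    # Only this wrapper lives in tasks.py; utils.py and every existing
    # call site of do_cleanup() stay untouched.
    return do_cleanup(path)
```

Existing callers keep calling the utils.py function synchronously; only the places that want deferred execution switch to the wrapper.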
Use mail_admins or other built-in tools.
Using cached django-background-tasks-1.0.13.tar.gz
Complete output from command python setup.py egg_info:
Traceback (most recent call last):
File "", line 1, in
File "C:\Users\user1\AppData\Local\Temp\pip-build-1ixyhz2eon\django-background-tasks\setup.py", line 11, in
long_description=open('README.rst').read(),
UnicodeDecodeError: 'gbk' codec can't decode byte 0x93 in position 1174: illegal multibyte sequence
I'm using Windows 10 Chinese version. Install in another pure English machine didn't encounter this.
(re-use/move from customer project to open source app)
Hi,
Is there any reason why the migrations are not in the module?
I generated some delayed tasks. Then I fired up 3 task runners like this:
python -u manage.py process_tasks --duration 0 > "bg_runner1.txt" 2>&1 &
python -u manage.py process_tasks --duration 0 > "bg_runner2.txt" 2>&1 &
python -u manage.py process_tasks --duration 0 > "bg_runner3.txt" 2>&1 &
(I did 3 concurrently to simulate how I'd have it deployed on 3 production servers.)
Two of them died with this error:
Traceback (most recent call last):
File "manage.py", line 10, in <module>
execute_from_command_line(sys.argv)
File "/Users/gpinero/sx_direct_env/lib/python2.7/site-packages/django/core/management/__init__.py", line 338, in execute_from_command_line
utility.execute()
File "/Users/gpinero/sx_direct_env/lib/python2.7/site-packages/django/core/management/__init__.py", line 330, in execute
self.fetch_command(subcommand).run_from_argv(self.argv)
File "/Users/gpinero/sx_direct_env/lib/python2.7/site-packages/django/core/management/base.py", line 393, in run_from_argv
self.execute(*args, **cmd_options)
File "/Users/gpinero/sx_direct_env/lib/python2.7/site-packages/django/core/management/base.py", line 444, in execute
output = self.handle(*args, **options)
File "/Users/gpinero/portals-fe/background_task/management/commands/process_tasks.py", line 86, in handle
if not self._tasks.run_next_task():
File "/Users/gpinero/portals-fe/background_task/tasks.py", line 102, in run_next_task
return self._runner.run_next_task(self)
File "/Users/gpinero/sx_direct_env/lib/python2.7/site-packages/django/utils/decorators.py", line 145, in inner
return func(*args, **kwargs)
File "/Users/gpinero/portals-fe/background_task/tasks.py", line 229, in run_next_task
self.run_task(tasks, task)
File "/Users/gpinero/sx_direct_env/lib/python2.7/site-packages/django/utils/decorators.py", line 145, in inner
return func(*args, **kwargs)
File "/Users/gpinero/portals-fe/background_task/tasks.py", line 218, in run_task
tasks.run_task(task.task_name, args, kwargs)
File "/Users/gpinero/portals-fe/background_task/tasks.py", line 100, in run_task
self._bg_runner(proxy_task, *args, **kwargs)
File "/Users/gpinero/portals-fe/background_task/tasks.py", line 57, in bg_runner
task.reschedule(t, e, traceback)
File "/Users/gpinero/portals-fe/background_task/models.py", line 169, in reschedule
self.increment_attempts()
File "/Users/gpinero/portals-fe/background_task/models.py", line 157, in increment_attempts
self.save()
File "/Users/gpinero/portals-fe/background_task/models.py", line 208, in save
return super(Task, self).save(*arg, **kw)
File "/Users/gpinero/sx_direct_env/lib/python2.7/site-packages/django/db/models/base.py", line 734, in save
force_update=force_update, update_fields=update_fields)
File "/Users/gpinero/sx_direct_env/lib/python2.7/site-packages/django/db/models/base.py", line 762, in save_base
updated = self._save_table(raw, cls, force_insert, force_update, using, update_fields)
File "/Users/gpinero/sx_direct_env/lib/python2.7/site-packages/django/db/models/base.py", line 827, in _save_table
forced_update)
File "/Users/gpinero/sx_direct_env/lib/python2.7/site-packages/django/db/models/base.py", line 877, in _do_update
return filtered._update(values) > 0
File "/Users/gpinero/sx_direct_env/lib/python2.7/site-packages/django/db/models/query.py", line 580, in _update
return query.get_compiler(self.db).execute_sql(CURSOR)
File "/Users/gpinero/sx_direct_env/lib/python2.7/site-packages/django/db/models/sql/compiler.py", line 1062, in execute_sql
cursor = super(SQLUpdateCompiler, self).execute_sql(result_type)
File "/Users/gpinero/sx_direct_env/lib/python2.7/site-packages/django/db/models/sql/compiler.py", line 840, in execute_sql
cursor.execute(sql, params)
File "/Users/gpinero/sx_direct_env/lib/python2.7/site-packages/django/db/backends/utils.py", line 79, in execute
return super(CursorDebugWrapper, self).execute(sql, params)
File "/Users/gpinero/sx_direct_env/lib/python2.7/site-packages/django/db/backends/utils.py", line 59, in execute
self.db.validate_no_broken_transaction()
File "/Users/gpinero/sx_direct_env/lib/python2.7/site-packages/django/db/backends/base/base.py", line 327, in validate_no_broken_transaction
"An error occurred in the current transaction. You can't "
django.db.transaction.TransactionManagementError: An error occurred in the current transaction. You can't execute queries until the end of the 'atomic' block.
I get the following error and traceback when I try to run ./manage.py process_tasks when DEBUG=True.
Traceback (most recent call last):
File "./manage.py", line 11, in <module>
execute_from_command_line(sys.argv)
File "/home/chadgh/.virtualenvs/atom/lib/python2.7/site-packages/django/core/management/__init__.py", line 385, in execute_from_command_line
utility.execute()
File "/home/chadgh/.virtualenvs/atom/lib/python2.7/site-packages/django/core/management/__init__.py", line 377, in execute
self.fetch_command(subcommand).run_from_argv(self.argv)
File "/home/chadgh/.virtualenvs/atom/lib/python2.7/site-packages/django/core/management/base.py", line 288, in run_from_argv
self.execute(*args, **options.__dict__)
File "/home/chadgh/.virtualenvs/atom/lib/python2.7/site-packages/django/core/management/base.py", line 338, in execute
output = self.handle(*args, **options)
File "/home/chadgh/.virtualenvs/atom/lib/python2.7/site-packages/background_task/management/commands/process_tasks.py", line 76, in handle
autodiscover()
File "/home/chadgh/.virtualenvs/atom/lib/python2.7/site-packages/background_task/tasks.py", line 234, in autodiscover
app_path = import_module(app).__path__
File "/usr/lib64/python2.7/importlib/__init__.py", line 37, in import_module
__import__(name)
ImportError: No module named DebugToolbarConfig
As a new feature, it would be nice if I could start a process instance of background task, with an argument only to pull certain tasks out of a named queue. The task decorator would have an optional queue parameter.
That would allow the user to architect their own multiprocessing if they wanted such a mode of asynchronicity.
It would also allow more flexibility by having per-queue specific options. E.g. threading: having a per-queue setting for whether to run the tasks in that specific queue async or not (when the global BACKGROUND_TASK_RUN_ASYNC is not set).
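The proposed API could be prototyped outside the library. A toy sketch of the idea (the `queue` parameter and the registry are hypothetical, not part of django-background-tasks today):

```python
TASK_QUEUES = {}    # toy registry: queue name -> task functions

def background(queue="default"):
    """Hypothetical decorator variant with a queue parameter."""
    def wrap(fn):
        TASK_QUEUES.setdefault(queue, []).append(fn)
        return fn
    return wrap

@background(queue="mail")
def send_welcome_mail():
    return "mail sent"

@background(queue="reports")
def build_report():
    return "report built"

def process_tasks(queue):
    # A worker started with e.g. `process_tasks --queue mail` would only
    # pull tasks registered under that queue name.
    return [fn() for fn in TASK_QUEUES.get(queue, [])]
```

A per-queue worker model like this is what would make per-queue settings (such as async on/off) natural to express.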
Using Django 1.9 (I know it's not supported, but read in another issue that it should work fine?)
I was able to get it installed, and even figured out how to run the migrations.
I can get a task added to background_task correctly, but then...nothing happens.
I have ./manage.py process_tasks running in another screen as well.
What am I doing wrong?
Here is a TestCase:
class DoctArgsTestCase(unittest.TestCase):
    def setUp(self):
        @tasks.background()
        def function(d):
            pass
        self.task = function

    def test_launch_with_dict_keys_not_alphabet_order(self):
        self.assertEqual(Task.objects.count(), 0)
        self.task({22222: 2, 11111: 1})
        self.assertEqual(Task.objects.count(), 1, "Task wasn't created")
        tasks.run_next_task()
        self.assertEqual(Task.objects.count(), 0, "Task wasn't started")
If I run this test repeatedly, it sometimes fails, and the failed task remains locked forever:
def test_launch_with_dict_keys_locked(self):
    self.assertEqual(Task.objects.count(), 0)
    self.task({22222: 2, 11111: 1})
    self.assertEqual(Task.objects.count(), 1, "Task wasn't created")
    tasks.run_next_task()
    if Task.objects.count() == 1:
        # the task didn't start; check locked_by
        self.assertIsNone(Task.objects.first().locked_by, 'Task is locked')
get_query_set will be removed in the next version of django-compat. Handle this in the sources manually.
How do you suggest I might write unit tests against the tasks I create? And other code calling the tasks.
I'm thinking I'd need a way for all the tasks to be run immediately and ignore the delay parameter during unit testing?
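One common approach is to swap the decorator for a pass-through during tests, so tasks run inline and any delay is ignored. A sketch of the idea (the names here are hypothetical; in practice you would patch your tasks module's decorator, e.g. with unittest.mock.patch, before the task functions are defined):

```python
def eager_background(*decorator_args, **decorator_kwargs):
    """Pass-through stand-in for the background decorator, used in tests."""
    def wrap(fn):
        def call(*args, **kwargs):
            kwargs.pop("schedule", None)   # swallow the delay argument
            return fn(*args, **kwargs)     # run inline, right now
        return call
    return wrap

# With the real decorator patched out, a task like this runs synchronously:
@eager_background(schedule=300)
def notify(user_id):
    return "notified %s" % user_id

notify(42, schedule=300)   # executes immediately; the 300s delay is ignored
```

Code under test then exercises the real task functions without a worker process or any sleeping.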
This project seems to indicate support for Python 3. However, after installing into a Python 3 project, I'm not able to run the migrations successfully.
To reproduce:
File ".../.virtualenvs/tmp-e3f3ffba6e5aedf/lib/python3.4/site-packages/background_task/admin.py", line 2, in
from models_completed import CompletedTask
ImportError: No module named 'models_completed'
Am I doing something wrong, or is Python 3 not supported?
@Luthaf 's lilspikey#12 on the original repo (use with compat #1)
My tasks are not automatically discovered and scheduled; they only work when I run the ./manage.py process_tasks management command. Is this correct?
Here's what I have:
In my root urls.py file:
from background_task import tasks
tasks.autodiscover()
In one of my apps' tasks.py file:
from background_task import background
@background(schedule=5)
def test_background_tasks():
    print('It worked!')
And in a view I am calling the test_background_tasks function.
I would expect 'It worked!' to be printed to my console within 5 seconds, but this does not happen; it only prints if I run the ./manage.py process_tasks command.
Are tasks not automatically picked up and scheduled when the project starts up? Do I have to run the process_tasks command every time I want my scheduled tasks to be picked up? And if so, is there any way to put the process_tasks command in the background and not let it take up my console?
When making migrations, I ran into an issue where they wouldn't generate properly due to a problem importing models_completed in your admin.py:
Traceback (most recent call last):
File "manage.py", line 10, in <module>
execute_from_command_line(sys.argv)
File "C:\Program Files (x86)\Python\lib\site-packages\django\core\management\__init__.py", line 338, in execute_from_command_line
utility.execute()
File "C:\Program Files (x86)\Python\lib\site-packages\django\core\management\__init__.py", line 312, in execute
django.setup()
File "C:\Program Files (x86)\Python\lib\site-packages\django\__init__.py", line 18, in setup
apps.populate(settings.INSTALLED_APPS)
File "C:\Program Files (x86)\Python\lib\site-packages\django\apps\registry.py", line 115, in populate
app_config.ready()
File "C:\Program Files (x86)\Python\lib\site-packages\django\contrib\admin\apps.py", line 22, in ready
self.module.autodiscover()
File "C:\Program Files (x86)\Python\lib\site-packages\django\contrib\admin\__init__.py", line 24, in autodiscover
autodiscover_modules('admin', register_to=site)
File "C:\Program Files (x86)\Python\lib\site-packages\django\utils\module_loading.py", line 74, in autodiscover_modules
import_module('%s.%s' % (app_config.name, module_to_search))
File "C:\Program Files (x86)\Python\lib\importlib\__init__.py", line 109, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 2254, in _gcd_import
File "<frozen importlib._bootstrap>", line 2237, in _find_and_load
File "<frozen importlib._bootstrap>", line 2226, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 1200, in _load_unlocked
File "<frozen importlib._bootstrap>", line 1129, in _exec
File "<frozen importlib._bootstrap>", line 1471, in exec_module
File "<frozen importlib._bootstrap>", line 321, in _call_with_frames_removed
File "C:\Program Files (x86)\Python\lib\site-packages\background_task\admin.py", line 2, in <module>
from models_completed import CompletedTask
ImportError: No module named 'models_completed'
I checked and the module is definitely there, but it just wasn't getting read. I fiddled around a bit and found that the following change in admin.py fixed the issue and let migrations run:
from django.contrib import admin
# from models_completed import CompletedTask  << Old
from .models_completed import CompletedTask
from .models import Task

class TaskAdmin(admin.ModelAdmin):
    display_filter = ['task_name']
    list_display = ['task_name', 'task_params', 'run_at', 'priority', 'attempts']

admin.site.register(Task, TaskAdmin)
admin.site.register(CompletedTask)
I'd make a pull request, but I honestly don't know if this is just a me-problem. I'm running Win7 x64 using Django 1.8.2
If possible, could you guide me on using this in a production environment? I tried using a Celery periodic task to delete expired sessions, but setting up Celery in production is not easy, so I planned to use django-background-tasks instead. Could you comment on the possibilities?
or if we are idle.
This would help avoid terminating tasks during a deployment, for example.
When you try to schedule two tasks based on the same function with the same args/kwargs combination, only the first task will be executed. This happens even if the first task has exceeded its maximum of allowed attempts.
The following test fails with the current version of django-background-tasks.
from django.conf import settings
from django.test import TestCase
from background_task.tasks import tasks
from background_task.models import Task
class MultipleFailingTasksTestCase(TestCase):
    def setUp(self):
        @tasks.background(name='failing task')
        def failing_task():
            return 0/0
        self.failing_task = failing_task

    def test_multiple_failing_tasks(self):
        with self.settings(MAX_ATTEMPTS=1):
            self.assertEqual(settings.MAX_ATTEMPTS, 1)
            task1 = self.failing_task()
            task2 = self.failing_task()
            task1_id = task1.id
            task2_id = task2.id
            tasks.run_next_task()
            self.assertEqual(Task.objects.get(pk=task1_id).attempts, 1, 'task1 should have been executed')
            self.assertTrue(Task.objects.get(pk=task1_id).attempts >= settings.MAX_ATTEMPTS, 'task1 should have reached its maximum attempts')
            tasks.run_next_task()
            # Fails
            self.assertEqual(Task.objects.get(pk=task1_id).attempts, 1, 'task1 should not be executed anymore')
            # Fails
            self.assertEqual(Task.objects.get(pk=task2_id).attempts, 1, 'task2 should have been executed')
            # Fails
            self.assertTrue(Task.objects.get(pk=task2_id).attempts >= settings.MAX_ATTEMPTS, 'task2 should have reached its maximum attempts')
            tasks.run_next_task()
            # Fails
            self.assertEqual(Task.objects.get(pk=task1_id).attempts, 1, 'task1 should not be executed anymore')
            # Fails
            self.assertEqual(Task.objects.get(pk=task2_id).attempts, 1, 'task2 should not be executed anymore')
The issue seems to be caused by the fact that the SHA-1 hash used to identify tasks is based only on the task's name and parameters.
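The collision can be seen directly by mimicking how such a hash is built. A sketch of the scheme described above (the helper name is illustrative; the library derives the hash from the task name plus the JSON-encoded parameters):

```python
import hashlib
import json

def task_hash(task_name, args, kwargs):
    # Hash over the name plus the serialized parameters, as described above.
    task_params = json.dumps((args, kwargs), sort_keys=True)
    return hashlib.sha1((task_name + task_params).encode("utf-8")).hexdigest()

h1 = task_hash("failing task", [], {})
h2 = task_hash("failing task", [], {})
assert h1 == h2                                   # identical inputs -> collision
assert task_hash("failing task", [1], {}) != h1   # params do change the hash
```

Because two tasks with the same name and parameters hash identically, the second scheduled task is indistinguishable from the first; mixing in something unique per task (e.g. the row id or creation time) would avoid that.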
Use Django's built-in logging system instead of the one implemented in the management command.
I was trying the code recommended from issue 39 like this:
from django.test import TestCase
from background_task import background
from background_task.tasks import tasks

@background(schedule=5)
def my_background_task(message):
    # send an email ...
    pass

class MyBackgroundTaskTestCase(TestCase):
    def test_run_task(self):
        my_background_task(...)
        # time.sleep(5)  # See below
        tasks.run_next_task()
        print len(mail.outbox)
The problem is that tasks.run_next_task won't find the task until 5 seconds later. I was under the impression run_next_task would be a way to run pending tasks immediately even if they're scheduled for the future.
Is there a different method that does that? Would it make sense to add a method like that for unit testing?
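That matches how the lookup behaves: run_next_task() only considers tasks whose run_at is already in the past, so a task scheduled 5 seconds out is invisible until then. A schematic illustration of the predicate (the dict shape is illustrative, not the real model):

```python
from datetime import datetime, timedelta

def available_tasks(tasks, now):
    # Mirrors the queue lookup: only tasks whose run_at has already passed
    # are candidates for run_next_task().
    return [t for t in tasks if t["run_at"] <= now]

now = datetime(2016, 1, 1, 12, 0, 0)
pending = [{"name": "my_background_task", "run_at": now + timedelta(seconds=5)}]

early = available_tasks(pending, now)                         # [] - not due yet
later = available_tasks(pending, now + timedelta(seconds=5))  # [task] - now due
```

For unit tests, scheduling with schedule=0 (so run_at is effectively now) sidesteps the wait without sleeping.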
I was running into a situation where the task could not be found due to the hash value computed by
TaskManager.get_task() being different than that calculated by new_task(). I am using Python 2.7 UCS4 and Django 1.6.
The problem lies in lines 87 and 99:
task_params = json.dumps((args, kwargs))
kwargs is not an OrderedDict, so how the keys are eventually arranged in the json string is not guaranteed.
I set the line to
task_params = json.dumps((args, kwargs), sort_keys=True)
and now things work as expected.
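The fix works because sort_keys=True normalizes the serialization: without it, the JSON string follows whatever key order the dict happens to have, so two equal dicts can produce different strings (and therefore different hashes). A quick illustration:

```python
import json

a = {"b": 2, "a": 1}
b = {"a": 1, "b": 2}

# With sort_keys=True, equal dicts always serialize to one canonical string,
# regardless of insertion order or the interpreter's hash behavior.
canonical = json.dumps(a, sort_keys=True)
assert canonical == json.dumps(b, sort_keys=True)
assert canonical == '{"a": 1, "b": 2}'
```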
Any guess on when Django 1.9 might be supported?
Not sure if the tests themselves are buggy, but they have been broken since the option for asynchronous execution of tasks using threads was added. We are using the latest version (1.0.5, with async activated) in pre-production without issues.
I am using django-rules (https://github.com/dfunckt/django-rules) which requires my INSTALLED_APPS to include 'rules.apps.AutodiscoverRulesConfig'.
This does not appear to be a module but is a method that auto-discovers modules as follows -
class AutodiscoverRulesConfig(RulesConfig):
    def ready(self):
        from django.utils.module_loading import autodiscover_modules
        autodiscover_modules('rules')
When I use this in conjunction with manage.py process_tasks, I get the message ImportError: No module named AutodiscoverRulesConfig. I can rectify the problem by changing DBTaskRunner.autodiscover() as follows:
for app in settings.INSTALLED_APPS:
    if app != 'rules.apps.AutodiscoverRulesConfig':
        try:
            app_path = import_module(app).__path__
        except AttributeError:
            continue
        try:
            imp.find_module('tasks', app_path)
        except ImportError:
            continue
        import_module("%s.tasks" % app)
This works fine because rules are not used for background processes but I have difficulty deploying. Any thoughts on how I should proceed?
With django 1.9:
$ python manage.py process_tasks
.../django/core/management/__init__.py:345: RemovedInDjango110Warning:
OptionParser usage for Django management commands is deprecated, use ArgumentParser instead
self.fetch_command(subcommand).run_from_argv(self.argv)
Cannot be fixed until <1.8 support is dropped
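For reference, the ArgumentParser-style command API (Django 1.8+) replaces option_list with an add_arguments hook, and the underlying parser is standard argparse. A self-contained sketch using plain argparse (the option names mirror the --duration flag used earlier in this thread; --sleep is illustrative):

```python
import argparse

def add_arguments(parser):
    # In a management command these calls would live in
    # Command.add_arguments(self, parser) inside process_tasks.py
    # once <1.8 support is dropped.
    parser.add_argument("--duration", type=int, default=0,
                        help="run for this many seconds, then exit (0 = forever)")
    parser.add_argument("--sleep", type=float, default=5.0,
                        help="seconds to sleep when the queue is empty")

parser = argparse.ArgumentParser(prog="process_tasks")
add_arguments(parser)
opts = parser.parse_args(["--duration", "3600"])
# opts.duration is 3600; opts.sleep keeps its default of 5.0
```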
Hey all,
I want to be able to show users what jobs are sitting in the queue, and possibly let them manipulate them there ("delete" before a job is run for example)
Has anyone created a way to do this yet? I'm sure it wouldn't be that hard since everything is sitting in the database, but I don't know much about it.
Ideally it would involve some kind of AJAX. A button could push something into the queue on the page, the queue would update (without a page reload) and then you could "delete" jobs there (again, without a page reload).
If it doesn't exist I'll try my hand at it, but I have zero experience with AJAX, and it took me days to figure out django-background-tasks in the first place ;)
WARNING:root:Rescheduling Task(account.views.send_code)
Traceback (most recent call last):
File "/home/t/env/local/lib/python2.7/site-packages/background_task/tasks.py", line 172, in run_task
tasks.run_task(task.task_name, args, kwargs)
File "/home/t/env/local/lib/python2.7/site-packages/background_task/tasks.py", line 54, in run_task
task = self._tasks[task_name]
KeyError: u'account.views.send_code'
WARNING:root:Rescheduling task Task(account.views.send_code) for 0:00:06 later at 2015-05-16 18:54:01.832337+00:00
account/views.py contains:
@background(schedule=10)
def send_code(email, code):
    send_mail(
        "It works!",
        "This will get sent through Mandrill <a href=http://127.0.0.1:8000/confirm/" + code + ">confirm</a>",
        "Djrill Sender <[email protected]>", [email]
    )
Works fine without background-tasks. I am looking through the code to see where the problem lies.
Specifically, I'm running a simple email task on Python 3.4 and the task doesn't run. When I run the process_tasks management command it is just silent, even with --log-level=DEBUG.
The same tasks work fine with Python 2.7.
Hi, I'm a newbie in Django and I need help in using this project.
I downloaded (using pip install) this project. Added 'background_task' to my INSTALLED_APPS,
I added the code in my tasks.py
from background_data import background

@background(schedule=1)
def test_print():
    print("HEYY")
I am receiving an error: the word background after 'import' is underlined in red. Also, when I call manage.py makemigrations, the expected tables for background_data are not detected. Can you help me solve these issues?
background can currently only schedule a job to run at a specified time:
notify_user(user.id, schedule=timezone.now())  # at a specific time
It would be better if this also supported repeating jobs that run on a regular schedule, e.g. daily, weekly, or at a specified interval:
notify_user(user.id, schedule=timezone.now(), repeat=<daily/weekly/timedelta/etc.>)
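A repeat option could be modeled as "after a successful run, push run_at forward by the repeat interval". A toy sketch of that bookkeeping (the aliases and the next_run helper are hypothetical, not an existing API):

```python
from datetime import datetime, timedelta

# Hypothetical aliases a repeat= parameter might accept.
REPEAT_ALIASES = {
    "daily": timedelta(days=1),
    "weekly": timedelta(weeks=1),
}

def next_run(last_run, repeat):
    """After a successful run, compute the next run_at from the interval."""
    interval = REPEAT_ALIASES.get(repeat, repeat)  # alias or raw timedelta
    return last_run + interval

ran_at = datetime(2016, 1, 1, 9, 0)
next_daily = next_run(ran_at, "daily")              # 2016-01-02 09:00
next_custom = next_run(ran_at, timedelta(hours=6))  # 2016-01-01 15:00
```

The worker would then re-save the task with the new run_at instead of moving it to completed tasks.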
1.0.13 ImportError: No module named 'myapp.apps.MyappsConfig'; 'myapps.apps' is not a package
That's due to Django's move from plain string entries in INSTALLED_APPS to AppConfig paths resolved via apps.get_app_configs(). Changing autodiscover() in tasks.py as below would fix this:
def autodiscover():
    '''autodiscover tasks.py files in much the same way as admin app'''
    import imp
    # from django.conf import settings
    from django.apps import apps
    # for app in settings.INSTALLED_APPS:
    for app_config in apps.get_app_configs():
Do not run more than X tasks in threads at the same time.
Please cherrypick from @martinogden and @ahmadfaizalbh fork for Django 1.10 support.
Example:
Travis config: 1.8.* https://travis-ci.org/arteria/django-background-tasks/jobs/172598618/config
Actually installed: 1.10.3 https://travis-ci.org/arteria/django-background-tasks/jobs/172598618#L158
mail_admins
Opbeat
Sentry
https://github.com/arteria/django-background-tasks/blob/master/background_task/tasks.py#L190
This could be done easily using Django's logging framework. See #16 as well.
When I develop my site, I use python manage.py runserver and everything works fine.
When I deploy my site, I use Apache 2.4.6 and mod_wsgi. I found the tasks are registered, but not processed at all. In Admin, Home > Background_Task > Tasks lists all my tasks, but Home > Background_Task > Completed tasks shows no task was completed. I checked the log and found the tasks had not been processed.