celery
Here are 904 public repositories matching this topic...
1. Set up the environment. Make sure Docker and docker-compose are already installed:
[root@localhost]# cat /etc/redhat-release
CentOS Linux release 7.6.1810 (Core)
[root@localhost]# docker --version
Docker version 19.03.1, build 74b1e89
[root@localhost]# docker-compose -version
docker-compose version 1.24.0, build 0aa59064
2. Download the OpsManage zip package:
https://github.com/welliamcao/OpsManage
3. Upload the zip package to any directory on the Linux system, e.g. /data/pkg.
4. Edit aut
In commit celery/kombu@5780f1e, support for queues already defined in Amazon SQS was added.
However, after following the instructions at https://docs.celeryproject.org/en/latest/getting-started/brokers/sqs.html and the Getting Started page, I am still not able to process any messages (it's not an issue with credentials or
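For context, the feature that commit adds is configured through the broker transport options; a minimal sketch (queue name, account ID, and credentials here are placeholders, not values from the report):

```python
# Sketch of kombu's predefined_queues transport option for SQS.
# All names, the account ID, and the credentials below are placeholders.
broker_transport_options = {
    "predefined_queues": {
        "my-queue": {
            "url": "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue",
            "access_key_id": "xxx",
            "secret_access_key": "xxx",
        }
    }
}
```

With this option set, kombu uses the given queue URLs directly instead of creating or looking up queues at runtime.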
AttributeError: 'DatabaseFeatures' object has no attribute 'autocommits_when_autocommit_is_off'
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/djcelery/schedulers.py", line 169, in setup_schedule
celery-beat_1 |     self.install_default_entries(self.schedule)
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/djcelery/schedulers.py", line 263, in schedule
celery-beat_1 |     self.sync()
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/djce
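The traceback points at djcelery (django-celery), which predates recent Django releases; one commonly suggested direction, offered here only as a hedged sketch and not as a fix confirmed by the report, is switching the beat scheduler to django-celery-beat:

```python
# settings.py sketch: replace djcelery's scheduler with django-celery-beat's.
# Assumes django-celery-beat is installed; this is a migration direction,
# not a verified drop-in fix for the AttributeError above.
INSTALLED_APPS = [
    # ... the project's other apps ...
    "django_celery_beat",
]
CELERY_BEAT_SCHEDULER = "django_celery_beat.schedulers:DatabaseScheduler"
```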
Summary:
No module named 'celery.utils.time'
- Celery Version: 3.3.1
- Celery-Beat Version: 2.0.0
Exact steps to reproduce the issue:
- create a virtualenv and pip install the following packages:
django==3.0.5
six==1.13.0
celery==3.1.25
django-celery>=3.3.1
django-celery-beat==2.0.0
- start Django with the manage.py command
Detailed information:
File "/home/maeth/
When adding a new Celery shared_task to redbeat, unless the queue name is set via the options kwargs, the wrong queue is used in celery.app.base.Celery.send_task.
Task definition:
@celery.shared_task(bind=True, base=ApplicationTask, queue="my_queue")
def sometask(
    self,
):
    print('RUNNING')
Adding a task:
entry = redbeat.RedBeatSchedulerEntry(
    name='my_
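The behaviour described above can be sketched with a toy model: beat dispatches by task name via send_task, so only the schedule entry's options reach the call, and the queue= on the task decorator never does (this is an illustration of the reported symptom, not redbeat's or Celery's actual code):

```python
# Toy model of the queue resolution the report describes: for beat entries,
# only the entry's options dict is consulted; "celery" stands in for the
# configured default queue.
def resolve_queue(entry_options, default_queue="celery"):
    """Pick the queue the way send_task effectively does for schedule entries."""
    return entry_options.get("queue", default_queue)
```

This is why the workaround from the report is to repeat the queue name in the entry's options kwargs rather than relying on the decorator.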
I am trying to figure out how to set up a basic minimal sink connector with Apache2 so that it makes jasmin happy.
The idea is to have a page that replies correctly to the jasmin MO messages submitted to it and ultimately throws the messages away.
The reason for having a dummy (/dev/null) connector is that MO messages are not routed unless there is at least one destination, even thou
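A /dev/null sink doesn't strictly need Apache2; a minimal sketch with only the Python standard library is below. It assumes Jasmin's HTTP thrower POSTs the MO message and treats a 200 response with body "ACK/Jasmin" as a successful delivery, per Jasmin's HTTP API documentation; host and port are arbitrary:

```python
# Minimal /dev/null MO sink: read the POSTed message, discard it, and reply
# with the acknowledgement body Jasmin's HTTP thrower expects.
from http.server import BaseHTTPRequestHandler, HTTPServer

class NullMoSink(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        self.rfile.read(length)          # read the MO message, then drop it
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ACK/Jasmin")  # ack so jasmin considers it delivered

    def log_message(self, *args):        # keep the sink silent
        pass

def make_sink(host="0.0.0.0", port=8000):
    """Build the server; call .serve_forever() on the result to run it."""
    return HTTPServer((host, port), NullMoSink)
```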
Overview
CeleryEmailBackend ignores the fail_silently option when enqueuing messages in send_messages(). When the Celery message broker is down and DEBUG is False, this causes huge delays in requests where a server error occurs, which allows a DoS attack.
Steps to reproduce:
- Ensure Celery is configured to retry failed publishing of tasks (on by default).
- Set `DEBUG` to `Fal
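The contract the report says is being ignored can be sketched with a toy model (this is the fail_silently semantics of Django's email backend API in miniature, not django-celery-email's actual code; enqueue here is a stand-in for publishing to the broker):

```python
# Toy model of the fail_silently contract: broker/enqueue failures must be
# swallowed when fail_silently=True, and propagate otherwise.
def send_messages(messages, enqueue, fail_silently=False):
    try:
        return enqueue(messages)
    except Exception:
        if not fail_silently:
            raise
        return 0  # silently report nothing sent
```

Per the report, the enqueue step raises (and retries) regardless of fail_silently, which is what stalls requests when the broker is down.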
- show how many commands ran
- show pretty pictures on how many hosts/ports/paths are in the db
- show the actual db output for ports/paths/hosts/etc
https://billiard.readthedocs.io/en/latest/index.html just looks like a copy of https://docs.python.org/3/library/multiprocessing.html. I get that billiard is a fork of multiprocessing, but this doesn't seem right: I literally can't find the string 'billiard' in the documentation.
I have set up Django with django_celery_results and everything works, but the task status doesn't get updated to STARTED when the task begins processing. Is there extra configuration that needs to be set for this to happen?
My current configuration has the following variables in the Django settings:
# Celery
# ---------------------------------------------------------------------
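One setting worth checking (an assumption about the cause, since the report doesn't show the full settings) is Celery's task_track_started, which defaults to False, so tasks never report a STARTED state:

```python
# settings.py sketch: enable the STARTED state. The CELERY_ prefix assumes
# the Celery app is configured with namespace="CELERY".
CELERY_TASK_TRACK_STARTED = True
```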
In the web UI, clicking one of the worker names does not take you to the respective page; instead the page is unresponsive.
Worker Name    | Status | Active | Processed | Failed | Succeeded | Retried | Load Average
celery@abc.com | online | 0      | 2         | 0      | 2         | 0       | 0.06, 0.1, 0.09
celery@bcd.com | online | 0      | 3         | 0