- Jan 30, 2019
-
-
David Douard authored
so it uses a proper connection pool to access the database.
-
David Douard authored
so it uses a proper connection pool to access the database.
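A minimal sketch of the pooling pattern this refers to, using psycopg2's ThreadedConnectionPool; the pool bounds and DSN below are illustrative assumptions, not the project's actual settings:

    # Sketch only: pool bounds and DSN are made up for illustration.
    import psycopg2.pool

    pool = psycopg2.pool.ThreadedConnectionPool(
        minconn=1, maxconn=10, dsn="dbname=softwareheritage-scheduler"
    )

    db = pool.getconn()          # borrow a connection from the pool
    try:
        with db.cursor() as cur:
            cur.execute("select count(*) from task")
            print(cur.fetchone())
    finally:
        pool.putconn(db)         # hand it back instead of closing it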
-
David Douard authored
-
David Douard authored
only report the number of scheduled tasks at info level, if any, and lower the 'Run ready tasks' log message to debug level.
-
David Douard authored
when respawning several tasks at once.
-
David Douard authored
Not sure whether we should be concerned by the fact that this sometimes occurs (under heavy load in the docker env).
-
David Douard authored
especially the infamous psycopg2 one.
-
- Jan 29, 2019
-
-
vlorentz authored
It's going very fast now, fast enough to empty its queue between two scheduler-runner runs.
-
- Jan 28, 2019
-
-
Antoine R. Dumont authored
This will work around the current Debian package build failure. Related: T1498
-
- Jan 18, 2019
-
-
Nicolas Dandrimont authored
Works around https://github.com/ClearcodeHQ/pytest-postgresql/issues/16
-
- Jan 17, 2019
-
-
David Douard authored
instead of replicating the same logging code everywhere.
-
- Jan 16, 2019
-
-
David Douard authored
This reverts commit f267f454. Pushed by mistake.
-
- Jan 15, 2019
-
-
David Douard authored
-
David Douard authored
See https://github.com/pytest-dev/pytest/issues/4641. It should be possible to drop this constraint when Celery 4.3 is out.
-
David Douard authored
also remove the now-useless celery_testing.py and scheduler_testing.py files.
-
David Douard authored
this uses the pytest-postgresql package to manage the database life cycle.
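A sketch of what the pytest-postgresql factories provide; the fixture names and the trivial test below are local choices for illustration, not swh-scheduler's actual fixtures:

    # conftest.py-style sketch, assuming pytest-postgresql's factories API.
    from pytest_postgresql import factories

    # spawns (and tears down) a throwaway PostgreSQL server for the session
    postgresql_proc = factories.postgresql_proc()
    # gives each test a connection to a clean database on that server
    postgresql = factories.postgresql("postgresql_proc")

    def test_can_query(postgresql):
        with postgresql.cursor() as cur:
            cur.execute("select 1")
            assert cur.fetchone() == (1,)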
-
David Douard authored
the result backend must be configured to be able to save (celery) group results (as is now the case for lister tasks, for example).
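A sketch of why the backend matters for group results; the broker/backend URLs and the dummy task are illustrative assumptions:

    # Sketch: collecting a celery group's results requires a result backend.
    from celery import Celery, group

    app = Celery("example", broker="amqp://", backend="rpc://")

    @app.task
    def ping(i):
        return i

    # Without a configured result backend, saving and fetching the group
    # result below would fail.
    result = group(ping.s(i) for i in range(3)).apply_async()
    print(result.get(timeout=10))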
-
David Douard authored
-
David Douard authored
that allows respawning any task immediately (or at any later date).
-
David Douard authored
so this method can be used to respawn a task immediately.
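A heavily hedged sketch of the idea; the method name, status value and argument names below are assumptions about the backend API, shown only to illustrate "respawn = reset status + set next_run to now":

    # Assumed API: set_status_tasks() and the status value are guesses,
    # not a documented interface.
    from datetime import datetime, timezone

    def respawn_now(scheduler, task_ids):
        """Reset tasks so the runner picks them up again immediately."""
        scheduler.set_status_tasks(
            task_ids,
            status="next_run_not_scheduled",      # assumed status value
            next_run=datetime.now(timezone.utc),  # "now" => respawn right away
        )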
-
David Douard authored
and not only pending tasks.
-
David Douard authored
to also display the status and priority fields.
-
David Douard authored
-
David Douard authored
-
David Douard authored
Not all keys are mandatory, so do not expect all the possible keys of the task_type table to be given when calling create_task_type(). Note: no validation is made as to whether the given set of keys fulfills the table constraints.
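A sketch of such a partial call; the column names and values below are assumptions about the task_type table, shown only to illustrate that omitted keys are simply left out:

    from datetime import timedelta

    def register_git_loader(scheduler):
        # "scheduler" is assumed to be a scheduler backend instance; the keys
        # below are a guess at a subset of task_type columns.
        scheduler.create_task_type({
            "type": "load-git",                     # hypothetical task type name
            "description": "Load a git repository",
            "backend_name": "swh.loader.git.tasks.UpdateGitRepository",
            "default_interval": timedelta(days=64),
            # retry/backoff/queue-length columns are simply left out
        })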
-
- Jan 14, 2019
-
-
David Douard authored
-
David Douard authored
-
David Douard authored
instead of float numbers.
-
David Douard authored
-
- Jan 10, 2019
-
-
David Douard authored
and add some kind of unit tests.
-
David Douard authored
in favor of the 'swh-scheduler api-server' command.
-
David Douard authored
also log any exception instead of crashing the runner service.
-
David Douard authored
use functions instead of methods. This is required to be able to use the celery pytest fixtures, so one can really test celery tasks (especially when a task spawns subtasks). One of the three methods (get_queue_length) has been added as a (monkeypatched) method on the Celery class for the sake of backward compatibility, but it should also be removed as soon as possible (it seems to be used only in swh-archiver).
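A sketch of the kind of test this enables, using the celery_app/celery_worker fixtures from celery.contrib.pytest; the task itself is made up:

    # Sketch: the add() task is invented for illustration.
    pytest_plugins = ("celery.contrib.pytest",)

    def test_task_roundtrip(celery_app, celery_worker):
        @celery_app.task
        def add(x, y):
            return x + y

        celery_worker.reload()  # make the freshly registered task visible
        assert add.delay(2, 3).get(timeout=10) == 5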
-
David Douard authored
so that logging level config can be consistently set for all swh-scheduler commands.
-
David Douard authored
since it's actually the responsibility of each subcommand to decide whether it can run without a properly configured scheduler instance. This is also required so the user can run 'swh-scheduler subcommand --help' even with an improperly configured scheduler.
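A sketch of the pattern with click; the options, subcommand and error message are illustrative, not the actual swh-scheduler cli:

    # Illustrative only: no connection is made at group level, so --help
    # always works; only subcommands that need a scheduler check the config.
    import click

    @click.group()
    @click.option("--database", default=None)
    @click.pass_context
    def cli(ctx, database):
        ctx.ensure_object(dict)
        ctx.obj["database"] = database   # remember the config, do not connect yet

    @cli.command()
    @click.pass_context
    def runner(ctx):
        if ctx.obj["database"] is None:  # this subcommand does need a scheduler
            raise click.ClickException("a configured scheduler instance is required")
        # ... instantiate the scheduler and run ...

    if __name__ == "__main__":
        cli()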
-
David Douard authored
These commands do what they say, i.e. start a runner, listener or API server process. Note that the processes are not daemonized and run in the foreground. Typical usage:

    swh-scheduler --cls local --database postgresql:///?service=swh api-server --host 127.0.0.1 --port 5008
    swh-scheduler --cls remote --url http://127.0.0.1:5008 runner --period 10
    swh-scheduler --cls remote --url http://127.0.0.1:5008 listener
-
David Douard authored
This is needed to be able to add more context objects (see following revisions).
-
David Douard authored
It is meant to be used to declare swh tasks via the task decorator instead of subclassing the (now deprecated) Task class. It is typically used like:

    from swh.scheduler.celery_backend.config import app
    from swh.scheduler.tasks import SWHTask

    @app.task(base=SWHTask)
    def ping():
        return 'pong'
-
- Jan 08, 2019
-
-
David Douard authored
the class-based router is a leftover from Celery 3.
-
David Douard authored
- use a dedicated logger instead of the root logger,
- add a couple of logging statements (in the perform_action methods),
- replace the --verbose cli option with --log-level
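A minimal illustration of the first point (module contents and message are made up):

    import logging

    logger = logging.getLogger(__name__)   # dedicated, named logger

    def perform_action():
        logger.debug("performing action")  # emitted under this module's logger name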
-