Celery, Flask and Docker - Issues with the logs and running tasks
I’m building a project using Flask, Celery and Docker.
The idea is to run time-consuming processes from REST calls using Celery, and most of them involve calls to external APIs.
First of all, the problem I have is that when I start the containers, tasks don't run at all and I don't see anything in the logs except for:
[INFO/MainProcess] Connected to redis://redis:6379/0
[INFO/MainProcess] mingle: searching for neighbors
[INFO/MainProcess] mingle: all alone
[INFO/MainProcess] celery@34569b50965e ready.
I’m using a Docker container for the Flask application, another for the Celery worker, and another for Redis as a broker (which is used by Celery and by flask-socketio):
- the Flask app container has the Celery definition and instance on it
- the Celery container uses the Flask app image, but runs this command after activating the virtualenv:
celery worker -A app.controller.celery -l info
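For context, the docker-compose setup looks roughly like this (a minimal sketch; service and image names are illustrative, not my actual file):
version: '3'
services:
  flaskapp:
    image: flaskapp            # image with the Flask app plus the Celery definition and instance
    ports:
      - "5000:5000"
    depends_on:
      - redis
  celery:
    image: flaskapp            # same image; only the command differs
    # in my case the command also activates the virtualenv first
    command: celery worker -A app.controller.celery -l info
    depends_on:
      - redis
  redis:
    image: redis               # broker for Celery and message queue for flask-socketio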
Then I follow the Celery container's log in order to monitor that the tasks are running:
docker logs server_celery_1 -f
Then I open Postman and make a request to the REST service in the Flask app, which delegates the task to Celery... but nothing happens.
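For reference, the request looks roughly like this (host, port, token and payload are illustrative):
curl -X POST http://localhost:5000/module/run \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <my-jwt>" \
  -d '{"module": "my_module", "input": {"some": "value"}}'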
Here is the code involved:
def make_celery(_app):
    celery = Celery(
        _app.import_name,
        backend='redis://redis:6379/1',  # Redis DB 1 as result backend
        broker='redis://redis:6379/0'    # Redis DB 0 as broker
    )
    celery.conf.update(_app.config)

    class ContextTask(celery.Task):
        def __call__(self, *args, **kwargs):
            # run every task inside the Flask application context
            with _app.app_context():
                return self.run(*args, **kwargs)

    celery.Task = ContextTask
    return celery

celery = make_celery(app)

@app.route('/module/run', methods=['POST'])
@jwt_required
def run_module():
    req = request.get_json()
    module = req.get('module')
    json_input = req.get('input')
    logger.info('Running module: ' + module)
    res = do.delay(module, json_input)
    return JSONEncoder().encode({'status': 'Task: ' + str(res) + ' submitted.'})

@celery.task()
def do(module_name, json_input):
    logger.info('____ Running ____')
    modules.run(module_name, json_input)
BUT: if I open the Celery events command-line app to monitor the tasks (which, as I understand it, isn't that relevant when using Redis... or is it?):
celery -A app.controller.engine.celery events
and run some tasks... (still nothing happens)... then when I exit the events window by pressing CTRL+C twice, logs suddenly start appearing in the Celery container's log feed and the tasks start to run.
What am I missing?
Thanks a lot!
python docker flask celery
Are you sure the broker URL is correct? I would have expected something like redis://redis:6379/0
– Bjoern Stiel
Nov 11 at 11:56
It uses 0 by default if not specified, as far as I know. But I've added it now: I set up 0 as the broker, 1 for the backend and 2 for Socket.IO. Anyway, same thing: all tasks are pending and don't run unless something happens (like running and exiting the events GUI)... still cannot understand why.
– magnoz
Nov 11 at 13:38
I've realized that sometimes it works right after I start the containers, and sometimes it doesn't. No errors logged. Very weird.
– magnoz
Nov 11 at 14:00
I also see that, right after starting the containers, submitted tasks don't start until I enter the Celery container and run any celery command, e.g.
celery -A app.controller.engine.celery inspect stats
Then the thing "unblocks" and starts working... any ideas?
– magnoz
Nov 11 at 14:11
edited Nov 11 at 13:40
asked Nov 11 at 8:37
magnoz
1 Answer
I kind of fixed this rare issue by starting the Celery worker using the eventlet pool:
celery worker -A app.controller.engine.celery -l info --concurrency=2 --pool eventlet
I don't fully understand why it doesn't work with the default (prefork) pool, though.
I'd appreciate it if anyone can shed some light on this.
Thanks anyway.
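Note for anyone trying this: eventlet is a separate package, so it has to be installed in the worker's virtualenv before the --pool eventlet option will work (a minimal sketch, assuming pip inside the activated virtualenv):
pip install eventlet
celery worker -A app.controller.engine.celery -l info --concurrency=2 --pool eventlet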
answered Nov 12 at 16:08
magnoz