Celery, Flask and Docker - issues with the logs and running tasks











I’m building a project using Flask, Celery and Docker.
The idea is to run time-consuming processes from REST calls using Celery, and most of them involve calls to external APIs.



First of all, the problem I have is that when I start the containers, tasks don't run at all, and I don’t see anything in the logs except for:



[INFO/MainProcess] Connected to redis://redis:6379/0
[INFO/MainProcess] mingle: searching for neighbors
[INFO/MainProcess] mingle: all alone
[INFO/MainProcess] celery@34569b50965e ready.


I’m using one Docker container for the Flask application, another for the Celery worker, and another for Redis as a broker (used both by Celery and by flask-socketio).




  • the Flask app container contains the Celery definition and instance

  • the Celery container uses the same Flask-app image, but runs this command after activating the virtualenv: celery worker -A app.controller.celery -l info
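
For context, the three-container layout described above would correspond to a docker-compose file roughly like the following sketch (the service names, image name and virtualenv path are assumptions, not taken from the question):

```yaml
version: "3"
services:
  redis:
    image: redis:alpine

  flaskapp:
    image: flaskapp            # hypothetical image name
    ports:
      - "5000:5000"
    depends_on:
      - redis

  celery:
    image: flaskapp            # same image as the Flask app
    # activate the virtualenv, then start the worker as described above
    command: sh -c ". venv/bin/activate && celery worker -A app.controller.celery -l info"
    depends_on:
      - redis
```

Note that depends_on only orders container start-up; it does not wait until Redis is actually ready to accept connections, which can matter for intermittent start-up problems like the one described here.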


Then I follow the Celery container’s log (docker logs server_celery_1 -f) in order to monitor that the tasks are running.



Then I open Postman and make a request to the REST service in the Flask app, which delegates the task to Celery... but nothing happens.



Here is the code involved:



from celery import Celery
from flask import request


def make_celery(_app):
    celery = Celery(
        _app.import_name,
        backend=_app.config['redis://redis:6379/1'],
        broker=_app.config['redis://redis:6379/0']
    )
    celery.conf.update(_app.config)

    class ContextTask(celery.Task):
        def __call__(self, *args, **kwargs):
            with _app.app_context():
                return self.run(*args, **kwargs)

    celery.Task = ContextTask
    return celery


celery = make_celery(app)


@app.route('/module/run', methods=['POST'])
@jwt_required
def run_module():
    req = request.get_json()
    module = req.get('module')
    json_input = req.get('input')
    logger.info('Running module: ' + module)
    res = do.delay(module, json_input)
    return JSONEncoder().encode({'status': 'Task: ' + str(res) + ' submitted.'})


@celery.task()
def do(module_name, json_input):
    logger.info('____ Running ____')
    modules.run(module_name, json_input)


**BUT** if I open the Celery events command-line app to monitor the tasks (which looks like it may not even be relevant when using Redis... or is it?):



celery -A app.controller.engine.celery events


and run some tasks... (and nothing happens)... then, when I exit the celery events window by pressing CTRL+C twice, logs suddenly start to appear in the Celery container's log feed and the tasks start to run.



What am I missing?



Thanks a lot!










  • Are you sure the broker URL is correct? I would have expected something like redis://redis:6379/0
    – Bjoern Stiel
    Nov 11 at 11:56










  • It uses 0 by default if not specified, as far as I know. But I added it now: I set 0 as the broker, 1 for the backend and 2 for socket.io. Anyway, same thing; all tasks stay pending and don't run unless something happens (like running and exiting the events GUI)... I still cannot understand why.
    – magnoz
    Nov 11 at 13:38












  • I've realized that sometimes it works right after I start the containers, and sometimes it doesn't. No errors logged. Very weird.
    – magnoz
    Nov 11 at 14:00










  • I also see that, right after starting the containers, submitted tasks don't start until I enter the celery container and run any celery command, e.g. celery -A app.controller.engine.celery inspect stats; then the thing "unblocks" and starts working... any ideas?
    – magnoz
    Nov 11 at 14:11

















python docker flask celery






asked Nov 11 at 8:37 by magnoz, edited Nov 11 at 13:40







1 Answer
I kind of fixed this rare issue by starting the Celery worker using eventlet:



celery worker -A app.controller.engine.celery -l info --concurrency=2 --pool eventlet



I don't fully understand why it doesn't work with the default (prefork) pool, though.



I'd appreciate it if anyone can shed some light on this.



Thanks anyway.
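
For reference, wiring this into the worker image could look like the following Dockerfile fragment (the base image and paths are assumptions, not taken from the answer); note that eventlet is a separate package and must be installed explicitly before the worker can use that pool:

```dockerfile
# hypothetical fragment of the worker image's Dockerfile
RUN pip install eventlet

# start the worker with the eventlet pool instead of the default prefork pool
CMD celery worker -A app.controller.engine.celery -l info --concurrency=2 --pool eventlet
```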






answered Nov 12 at 16:08 by magnoz