Understanding Concurrency/Parallelism in the Async Module - Node.js



























I have a web server with a Node process running on it, managed by pm2. The server has 16 available cores.



The Node process manages a queue of tasks using the async module. Depending on the number of events coming in, this queue can grow to over 10,000 tasks in the event of a disconnection between the Node process and the service it interacts with.



The queue holds on to these tasks during a disconnection, and when the connection is re-established the queued tasks are executed. The setup looks roughly like the sketch below.
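
For reference, a minimal sketch of that setup (the service call and event source are stand-ins for the real connection; the drain callback is async v2 style):

    const async = require('async');
    const EventEmitter = require('events');

    // Stand-in for the real network call to the downstream service.
    function sendToService(task, callback) {
      setTimeout(() => callback(null), 50);
    }

    // Worker: processes one task per invocation; callback signals completion.
    const q = async.queue((task, callback) => sendToService(task, callback), 1);

    // Incoming events keep being pushed, even while the service is disconnected.
    const events = new EventEmitter();
    events.on('event', (evt) => q.push(evt));

    q.drain = () => console.log('queue drained'); // async v2 style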



The docs say this...




Tasks added to the queue are processed in parallel (up to the
concurrency limit). If all workers are in progress, the task is queued
until one becomes available. Once a worker completes a task, that
task's callback is called.




My questions are these...




  1. If the queue had 10,000 tasks in it, and I set the concurrency level to 1, am I right in thinking that these will essentially be executed one at a time? If so, I guess this means that if new tasks keep being added, this queue could potentially never be fully drained?

  2. If I were to set the concurrency value to 16, would it essentially run 16 tasks in parallel, making use of each CPU core? Or is the concurrency managed by something else?

  3. How are tasks run in parallel? My understanding is that a single Node process can only utilise one core at a time because it's single-threaded.

  4. Am I totally missing the point as to how the async module manages parallel tasks?!



Thanks in advance, you clever bunch!










      Tags: javascript, node.js, concurrency, async.js






      asked Nov 15 '18 at 19:41 by an0nc0d3r
























          1 Answer






































          1) Yes, they will be executed in series. If you don't set a timeout for a task, you may never drain the queue (a sketch of one way to add a timeout follows).
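
          A minimal sketch of that idea using the async library's async.timeout wrapper (the worker and the 5-second limit are illustrative, not a recommendation):

              const async = require('async');

              // Stand-in for the real work; must call back exactly once when finished.
              function processTask(task, callback) {
                setTimeout(() => callback(null), 50); // pretend network call
              }

              // async.timeout makes the wrapped worker call back with an ETIMEDOUT
              // error if it hasn't finished within 5s, so one stuck task can't hold
              // the single worker slot forever.
              const q = async.queue(async.timeout(processTask, 5000), 1); // concurrency 1

              q.push({ id: 1 }, (err) => {
                if (err) console.error('task failed or timed out', err);
              });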



          2) No. All tasks will be executed on the same core your Node app runs on. Spread the tasks across worker processes and you get a multi-process queue (take a look at child_process and IPC; see the sketch below).
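
          A rough sketch of that multi-process direction, using Node's built-in child_process fork and IPC channel. The worker.js script, doWork, and the round-robin dispatch are hypothetical; this is one possible shape, not a drop-in solution:

              // parent.js: fork one worker process per core and hand tasks out over IPC.
              const { fork } = require('child_process');
              const os = require('os');

              const workers = os.cpus().map(() => fork('./worker.js')); // hypothetical worker script

              let next = 0;
              function dispatch(task) {
                workers[next].send(task);            // IPC: parent -> child message
                next = (next + 1) % workers.length;  // simple round-robin
              }

              dispatch({ id: 1 });

              // worker.js (hypothetical) would contain roughly:
              //   process.on('message', (task) => {
              //     doWork(task, (err, result) => process.send({ id: task.id, err, result }));
              //   });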



          3) If the tasks are blocking code, nothing will run in parallel. But if you write to disk or the network, you're doing async work, and while waiting for the response the engine continues with the rest of the code (illustrated below).
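
          A small illustration of that point: the "work" here is non-blocking (a timer standing in for a network call), so with a concurrency of 16 up to 16 tasks are in flight at once even though everything runs on one thread. The task count and delays are arbitrary:

              const async = require('async');

              let inFlight = 0;

              const q = async.queue((task, callback) => {
                inFlight++;
                console.log(`start task ${task.id}, in flight: ${inFlight}`);
                // setTimeout stands in for a non-blocking disk/network operation.
                setTimeout(() => {
                  inFlight--;
                  callback();
                }, 100);
              }, 16); // up to 16 tasks waiting on their I/O at the same time

              for (let i = 1; i <= 50; i++) q.push({ id: i });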



          4) Just a bit. But you're on the way to nailing it. ;)






          – Mazki516, answered Nov 15 '18 at 19:48, edited Dec 1 '18 at 14:11


























          • Thanks @Mazki516. Most of the tasks are network-related, so I guess the majority of the tasks are being done async. I'll take a look at child process, but what is IPC? Without specifically writing code that creates workers, what effect will setting the concurrency level to 16 have? Will it just do nothing? I've changed it to 16 and the queues seem much healthier, so I'm still a little confused. More learning to do, methinks!!

            – an0nc0d3r
            Nov 15 '18 at 20:45











          • Can you please specify what the task is doing exactly? It depends on the circumstances and hardware limits (CPU, memory, network bandwidth). It also depends on whether the other side you're calling allows that much traffic (most APIs enforce a concurrency limit).

            – Mazki516
            Nov 15 '18 at 21:03











          • Sorry for the late reply. IPC is inter-process communication, and it's how you communicate between different processes. In Node, any child process you open can communicate directly with the parent (and vice versa); the communication is message-based. There are also ways to open an IPC channel between individual processes (using the file system, tmpfs, etc.).

            – Mazki516
            Nov 15 '18 at 21:54










