How to reject request from client using Jetty config in kairosdb
I am using the latest version of KairosDB and have enabled the Jetty thread pool. My expectation was that once the queue is full, all subsequent requests would be rejected immediately.
But the request is still served after some time, even though I see


 java.util.concurrent.RejectedExecutionException


The client request should be rejected if the queue is full. How can I achieve that?



For testing, I added these parameters:



kairosdb.jetty.threads.queue_size=2 #queue
kairosdb.jetty.threads.min=2 # minThread
kairosdb.jetty.threads.max=4 #maxThread
kairosdb.jetty.threads.keep_alive_ms=1000


The corresponding Jetty thread pool construction:



new ExecutorThreadPool(minThreads, maxThreads, keepAliveMs, TimeUnit.MILLISECONDS, queue);
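For context, Jetty 8's ExecutorThreadPool constructs a standard java.util.concurrent.ThreadPoolExecutor from these arguments, whose default rejection policy (AbortPolicy) throws RejectedExecutionException once every thread is busy and the queue is full. A JDK-only sketch using the same numbers as the configuration above (the class and method names are illustrative, not KairosDB code):

```java
import java.util.concurrent.*;

class BoundedPoolDemo {
    // Mirrors the config above: min=2, max=4, keep_alive=1000 ms, queue_size=2.
    static boolean seventhTaskRejected() {
        BlockingQueue<Runnable> queue = new ArrayBlockingQueue<>(2);
        ThreadPoolExecutor pool =
                new ThreadPoolExecutor(2, 4, 1000, TimeUnit.MILLISECONDS, queue);

        CountDownLatch release = new CountDownLatch(1);
        // 4 tasks occupy every thread, 2 more fill the queue: 6 accepted in total.
        for (int i = 0; i < 6; i++) {
            pool.execute(() -> {
                try { release.await(); } catch (InterruptedException ignored) { }
            });
        }

        boolean rejected = false;
        try {
            pool.execute(() -> { });    // 7th task: threads and queue are saturated
        } catch (RejectedExecutionException e) {
            rejected = true;            // default AbortPolicy throws immediately
        }
        release.countDown();
        pool.shutdown();
        return rejected;
    }

    public static void main(String[] args) {
        System.out.println(seventhTaskRejected() ? "rejected" : "accepted");
    }
}
```

Note that the exception is thrown inside the server when the pool is saturated; as the answer below explains, this does not by itself translate into an immediate rejection on the client side.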



The Jetty version used in KairosDB is 8.1.16.
Tags: jetty kairosdb end-of-life
      edited Nov 12 at 4:44
asked Nov 11 at 19:26 by Knight71
1 Answer (accepted)

Jetty 8.1.16 was released in September 2014 and is now EOL (End of Life). Consider using a version of Jetty that is up to date, stable, and supported (such as Jetty 9.4.12.v20180830).



The fact that you get a java.util.concurrent.RejectedExecutionException screams that your thread pool configuration is insufficient.



The thread pool configuration you have is EXTREMELY small.



That would only be suitable for a single-core, single-CPU, single-thread hardware configuration. Why? Because your CPU/core/thread hardware configuration determines your NIO behavior, and dictates the minimum demands on your thread pool.



          On a MacOS laptop from 2009 (nearly 10 years ago!) you would need a minimum of 9 threads just to support a single connection making a single blocking REST request on that hardware.



          On a modern Ryzen Threadripper system you would often need a minimum thread count of 69 threads just to support a single connection making a single blocking REST request on this hardware.



          On the other hand, your configuration is quite suitable on a Raspberry Pi Zero, and could support about 3 connections, with 1 request active per connection.



With that configuration you would only be able to handle simple requests in serial, with your application not using any async processing or async I/O behaviors. Why? Because even a typical modern web page will require a minimum thread count of around 40, due to how browsers utilize your server.



The ExecutorThreadPool is also a terrible choice for your situation (it is only suitable for highly concurrent environments: think 24+ CPUs/cores, with minimum thread configurations above 500, often in the thousands).



You would be better off using the standard QueuedThreadPool; it is much more performant at the low end, and is capable of growing to handle demand (and scaling back over time to lower resource utilization as demand subsides).



          The QueuedThreadPool (in Jetty 9.4.x) also has protections against bad configurations and will warn you if the configuration is insufficient for either your hardware configuration, your chosen set of features in Jetty, or your specific configuration within Jetty.



If you want to reject connections when resources are low, then consider using DoSFilter (or, if you want to be more gentle, QoSFilter).
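As an illustration of the QoSFilter approach, a sketch of a web.xml fragment (the parameter values are examples; init-param names come from the Jetty servlets documentation):

```xml
<filter>
  <filter-name>QoSFilter</filter-name>
  <filter-class>org.eclipse.jetty.servlets.QoSFilter</filter-class>
  <init-param>
    <!-- maximum number of requests processed concurrently -->
    <param-name>maxRequests</param-name>
    <param-value>10</param-value>
  </init-param>
  <init-param>
    <!-- how long a request may wait for a slot before being suspended -->
    <param-name>waitMs</param-name>
    <param-value>50</param-value>
  </init-param>
  <init-param>
    <!-- how long an over-limit request stays suspended (-1 = container default) -->
    <param-name>suspendMs</param-name>
    <param-value>-1</param-value>
  </init-param>
</filter>
<filter-mapping>
  <filter-name>QoSFilter</filter-name>
  <url-pattern>/*</url-pattern>
</filter-mapping>
```

In an embedded-Jetty setup (as in KairosDB), the equivalent would be registering the filter programmatically on the servlet context.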



Attempting to limit usage via the thread pool will never work: in order to reject a connection, one thread is needed to accept it (the acceptor thread, one per server connector), another to process the NIO events (the selector thread, a shared resource handling multiple connections), and another to handle the request (to return HTTP status code 503).



          If you want an implementation in your own code (not Jetty), you could probably just write a Filter that counts active exchanges (request and response) and forces a 503 response status if the count is above some configurable number.
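A minimal JDK-only sketch of the counting part (the class name and limit are illustrative; a real servlet Filter would call tryBegin() in doFilter(), send a 503 when it returns false, and call end() in a finally block):

```java
import java.util.concurrent.atomic.AtomicInteger;

// Illustrative helper a servlet Filter could delegate to (not KairosDB/Jetty API).
class ActiveExchangeLimiter {
    private final int maxActive;
    private final AtomicInteger active = new AtomicInteger();

    ActiveExchangeLimiter(int maxActive) {
        this.maxActive = maxActive;
    }

    /** Claim a slot for one request/response exchange; false means "answer 503". */
    boolean tryBegin() {
        while (true) {
            int current = active.get();
            if (current >= maxActive) {
                return false;                         // over the configurable limit
            }
            if (active.compareAndSet(current, current + 1)) {
                return true;                          // slot claimed atomically
            }
        }
    }

    /** Release the slot; call from a finally block once the exchange completes. */
    void end() {
        active.decrementAndGet();
    }
}
```

The compare-and-set loop keeps the count exact under concurrent requests without locking.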



But if you do that, you should probably force all responses to close, i.e. send a Connection: close response header and not allow persistent connections.
• The values used above are just for testing, to understand how to limit the number of requests. Regarding the Jetty upgrade, it will take some effort and time, since I need to upgrade it in the KairosDB component and test. Is there a way to restrict client requests in Jetty 8.x?
  – Knight71
  Nov 12 at 7:03










• QoSFilter is working with minimal code changes, as it only has to be added to embedded Jetty.
  – Knight71
  Nov 16 at 8:38
answered Nov 11 at 20:00 by Joakim Erdfelt (edited Nov 13 at 16:36)