Use a custom Log4j appender when running Spark on AWS EMR





I'm trying to run spark-submit on AWS EMR to execute a simple project that uses a custom Log4j appender that I wrote.

I'm able to pass my Log4j properties by providing the following configuration in the cluster's software settings:



[{
  "classification": "spark-log4j",
  "properties": {
    "log4j.appender.S": "CustomLog4JAppender",
    "log4j.rootLogger": "DEBUG,S"
  }
}]



But when the cluster step runs, I get the following error in the cluster's stderr:

log4j:ERROR Could not instantiate class [CustomLog4JAppender]. java.lang.ClassNotFoundException: CustomLog4JAppender



The jar that I'm executing is located in S3 and contains the Main class, my appender class, and all the dependencies.



I'm running the step using command-runner.jar and executing the following command:

spark-submit --deploy-mode client --class Main s3://{path_to_jar}.jar
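
For reference, a variant that also puts the jar explicitly on the driver and executor classpaths might look roughly like this (the local path is only a placeholder, and the jar would have to already be present on the nodes for the extraClassPath settings to pick it up):

spark-submit --deploy-mode client \
  --conf spark.driver.extraClassPath=/home/hadoop/my-app.jar \
  --conf spark.executor.extraClassPath=/home/hadoop/my-app.jar \
  --class Main s3://{path_to_jar}.jar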



So, a few questions:

1. Which component in the cluster loads the Log4j logger and properties? Does it happen on the master node? On the core nodes?

2. What can I do to solve this issue? How should I execute it differently so that it recognizes my custom appender class?


Thanks!










      amazon-web-services apache-spark log4j amazon-emr appender






asked Nov 16 '18 at 16:23 by Boris_P
























          1 Answer































I also developed a custom Log4j appender class and used it as follows in my log4j.properties file with no problem:

log4j.rootLogger=ERROR, defaultLog
log4j.appender.defaultLog=com.my.package.CustomLog4jFileAppender

So my guess is that "log4j.appender.S": "CustomLog4JAppender" is not enough to locate your custom appender; you probably need to give the fully qualified name of your custom appender class. Try this:

"log4j.appender.S": "com.yourPackage.CustomLog4JAppender",





answered Nov 19 '18 at 15:08 by sticky_elbows
Actually, in my case the appender is in the same package as the main program, so my properties are correct. I was able to solve the issue by adding a bootstrap action that copies the jar containing the appender to /usr/lib/spark/jars/. What I don't understand is why it didn't see the appender when it was part of the application jar.

– Boris_P, Nov 20 '18 at 5:50
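
For reference, a bootstrap action along those lines might look roughly like this (the bucket and jar names are placeholders; the mkdir is only a guard in case the Spark jars directory does not exist yet when the bootstrap action runs):

#!/bin/bash
# Copy the application jar (which contains the custom appender) onto each node
# so that it ends up on Spark's classpath under /usr/lib/spark/jars/.
sudo mkdir -p /usr/lib/spark/jars/
sudo aws s3 cp s3://my-bucket/my-app-with-appender.jar /usr/lib/spark/jars/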











