Increase Spark memory when using local[*]

How do I increase Spark memory when using local[*]?

I tried setting the memory like this:

  val conf = new SparkConf()
    .set("spark.executor.memory", "1g")
    .set("spark.driver.memory", "4g")
    .setMaster("local[*]")
    .setAppName("MyApp")

But I still get:

  MemoryStore: MemoryStore started with capacity 524.1 MB

Does this have something to do with .setMaster("local[*]")?

scala apache-spark

asked Sep 21 '15 at 7:49 by BAR (score 7)

  • Have you tried increasing spark.executor.memory? – Marius Soutier, Sep 21 '15 at 7:55

  • .setMaster("local[*]") just tells Spark to use the cores available on the local machine for the processing. – Ajay Gupta, Sep 21 '15 at 8:00

8 Answers

Answer (score 6), answered Sep 21 '15 at 8:25 by BAR

I was able to solve this by running SBT with:

  sbt -mem 4096

However, the MemoryStore ends up about half that size. Still looking into where this fraction comes from.
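
A quick sanity check of the numbers, as a sketch under the legacy (pre-Spark-1.6) storage-memory model described in the comment below; the fraction defaults come from that model, and the variable names here are illustrative:

  // capacity ≈ maxHeap * spark.storage.memoryFraction * spark.storage.safetyFraction
  val maxHeapMb = 4096.0                  // what `sbt -mem 4096` requests
  val capacityMb = maxHeapMb * 0.6 * 0.9  // ≈ 2212 MB, i.e. roughly half the heap
  // Working backwards, the originally observed 524.1 MB implies a max heap of
  // about 524.1 / (0.6 * 0.9) ≈ 971 MB, so the JVM had still been running with
  // roughly its ~1 GB default.
  println(f"expected MemoryStore capacity: $capacityMb%.0f MB")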

  • Total storage memory is calculated by spark.storage.memoryFraction * spark.storage.safetyFraction - which are 0.6 and 0.9 by default. – Gillespie, Sep 21 '15 at 8:26

  • @Gillespie Thank you. – BAR, Sep 21 '15 at 8:27

  • This answer is not correct. This is not how you increase memory when you are running your app in a standalone local cluster. Nevertheless the comments of @Gillespie are very useful + his answer is the way to do it! – eliasah, Sep 21 '15 at 8:40

  • @eliasah Read my comments. – BAR, Sep 21 '15 at 8:41

Answer (score 5), answered Sep 21 '15 at 8:02 by Gillespie

Assuming that you are using the spark-shell: setting spark.driver.memory in your application isn't working, because the driver process has already started with its default memory by the time your code runs.

You can either launch your spark-shell using:

  ./bin/spark-shell --driver-memory 4g

or you can set it in spark-defaults.conf:

  spark.driver.memory 4g

If you are launching an application using spark-submit, you must specify the driver memory as an argument:

  ./bin/spark-submit --driver-memory 4g --class main.class yourApp.jar

  • I am not using the shell. – BAR, Sep 21 '15 at 8:03

  • Then you need to provide the driver memory as an argument when launching your application. The point is that by the time your SparkConf is read in your application, it's too late. – Gillespie, Sep 21 '15 at 8:04

  • So it should go in SBT? – BAR, Sep 21 '15 at 8:05

  • Are you launching your application as in the last paragraph of my answer? – Gillespie, Sep 21 '15 at 8:07

  • No. ./bin/spark-submit is for the standalone mode, executing on the server/slave. When local[*] is used, the program creates its own context; there is no need for the server/slave setup, so there is no spark-submit. – BAR, Sep 21 '15 at 8:09
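
Tying that comment thread together, a minimal sketch, assuming the app is launched with "sbt run" (fork and javaOptions are standard sbt keys; the 4g figure is illustrative):

  // build.sbt: fork `run` into its own JVM and raise that JVM's heap.
  // With local[*] the driver is this very JVM, so -Xmx (like `sbt -mem`)
  // is what actually bounds the MemoryStore, not a spark.driver.memory
  // value set after the JVM has already started.
  fork := true
  javaOptions += "-Xmx4g"
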
Answer (score 2), answered Nov 16 '18 at 9:18 by fansy1990

In Spark 2.x you can use SparkSession, which looks like:

  import org.apache.spark.sql.SparkSession

  // SparkSession is created via builder(); master/appName replace
  // SparkConf's setMaster/setAppName.
  val spark = SparkSession.builder()
    .master("local[*]")
    .appName("MyApp")
    .config("spark.executor.memory", "1g")
    .config("spark.driver.memory", "4g")
    .getOrCreate()
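
(The caveat from the earlier discussion still applies: under local[*] the driver JVM is already running when this code executes, so a spark.driver.memory value supplied here generally cannot enlarge an already-started heap.)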

Answer (score 1), answered Sep 21 '15 at 8:22 by Glennie Helles Sindholt

The fraction of the heap used for Spark's memory cache is by default 0.6, so if you need more than 524.1 MB, you should increase the spark.executor.memory setting :)

Technically you could also increase the fraction used for Spark's memory cache, but I believe this is discouraged or at least requires you to do some additional configuration. See https://spark.apache.org/docs/1.0.2/configuration.html for more details.
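
For completeness, a sketch of the knob this answer mentions, using the legacy spark.storage.memoryFraction setting from the linked 1.0.x docs (the 0.8 value is purely illustrative; raising the heap itself is usually the better fix):

  import org.apache.spark.SparkConf

  // Discouraged per the answer above; shown only to illustrate the setting.
  val conf = new SparkConf()
    .setMaster("local[*]")
    .setAppName("MyApp")
    .set("spark.storage.memoryFraction", "0.8")  // default is 0.6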

Answer (score 1), answered Sep 1 '16 at 0:44 by bshyamkumar (edited Sep 1 '16 at 0:55 by Guillaume Racicot)

Tried --driver-memory 4g and --executor-memory 4g; neither worked to increase working memory. However, I noticed that bin/spark-submit was picking up _JAVA_OPTIONS, and setting that to -Xmx4g resolved it. I use JDK 7.
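
A quick way to check which setting actually reached the JVM, assuming you can add a line to the driver program (Runtime.maxMemory is standard JVM API; the variable name is illustrative):

  // Print the heap the driver JVM actually got. If this shows ~1 GB even though
  // --driver-memory 4g was passed, the flag never reached the JVM, and an
  // environment-level setting such as _JAVA_OPTIONS=-Xmx4g may be what applies.
  val maxHeapMb = Runtime.getRuntime.maxMemory / (1024L * 1024L)
  println(s"Driver JVM max heap: $maxHeapMb MB")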

Answer (score 1), answered Nov 16 '18 at 13:33 by hamza tuna

You can't change driver memory after application start (link).

Answer (score -1), answered Aug 31 '18 at 10:08 by Rajiv Singh

To assign memory to Spark on the command shell:

  /usr/lib/spark/bin/spark-shell --driver-memory=16G --num-executors=100 --executor-cores=8 --executor-memory=16G

  • you can edit this answer and include your other answer – RaisingAgent, Aug 31 '18 at 10:39

Answer (score -2), answered Aug 31 '18 at 10:31 by Rajiv Singh (edited Aug 31 '18 at 11:23 by Zoe)

  /usr/lib/spark/bin/spark-shell --driver-memory=16G --num-executors=100 --executor-cores=8 --executor-memory=16G --conf spark.driver.maxResultSize=2G