Increase Spark memory when using local[*]
How do I increase Spark memory when using local[*]?
I tried setting the memory like this:
val conf = new SparkConf()
.set("spark.executor.memory", "1g")
.set("spark.driver.memory", "4g")
.setMaster("local[*]")
.setAppName("MyApp")
But I still get:
MemoryStore: MemoryStore started with capacity 524.1 MB
Does this have something to do with:
.setMaster("local[*]")
scala apache-spark
asked Sep 21 '15 at 7:49 by BAR
Have you tried increasing spark.executor.memory?
– Marius Soutier, Sep 21 '15 at 7:55
.setMaster("local[*]") just tells Spark to use all available cores on the local machine for the processing.
– Ajay Gupta, Sep 21 '15 at 8:00
8 Answers
I was able to solve this by running SBT with:
sbt -mem 4096
However, the MemoryStore capacity is only about half that size. Still looking into where this fraction comes from.
answered Sep 21 '15 at 8:25 by BAR
Total storage memory is calculated as spark.storage.memoryFraction * spark.storage.safetyFraction, which are 0.6 and 0.9 by default.
– Gillespie, Sep 21 '15 at 8:26
@Gillespie Thank you
– BAR, Sep 21 '15 at 8:27
This answer is not correct. This is not how you increase memory when you are running your app in a standalone local cluster. Nevertheless, the comments of @Gillespie are very useful, and his answer is the way to do it!
– eliasah, Sep 21 '15 at 8:40
@eliasah Read my comments.
– BAR, Sep 21 '15 at 8:41
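Gillespie's formula explains both numbers in this thread; a quick sketch of the arithmetic (approximate, since the JVM reports slightly less heap than the -Xmx value):

// Spark 1.x legacy storage capacity: heap * memoryFraction * safetyFraction
val heapMb = 4096.0                 // from `sbt -mem 4096`
val storageMb = heapMb * 0.6 * 0.9  // the 0.6 and 0.9 defaults
println(f"storage capacity ≈ $storageMb%.1f MB") // ~2211.8 MB, about half the heap
// The question's 524.1 MB is the same formula on the default heap:
// 524.1 / 0.54 ≈ 970 MB, roughly what the JVM reports for a ~1 GB max heap.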
Assuming that you are using the spark-shell, setting spark.driver.memory in your application isn't working because your driver process has already started with the default amount of memory.
You can either launch your spark-shell with:
./bin/spark-shell --driver-memory 4g
or set it in spark-defaults.conf:
spark.driver.memory 4g
If you are launching an application using spark-submit, you must specify the driver memory as an argument:
./bin/spark-submit --driver-memory 4g --class main.class yourApp.jar
answered Sep 21 '15 at 8:02 by Gillespie
I am not using the shell.
– BAR, Sep 21 '15 at 8:03
Then you need to provide the driver memory as an argument when launching your application. The point is that by the time your SparkConf is read in your application, it's too late.
– Gillespie, Sep 21 '15 at 8:04
So it should go in SBT?
– BAR, Sep 21 '15 at 8:05
Are you launching your application as in the last paragraph of my answer?
– Gillespie, Sep 21 '15 at 8:07
No. ./bin/spark-submit is for standalone mode, executing on the server/slave. When local[*] is used the program creates its own context; there is no server/slave setup, so there is no spark-submit.
– BAR, Sep 21 '15 at 8:09
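Since BAR is launching from SBT in local[*] mode, one hedged way to apply Gillespie's "before the JVM starts" point without spark-submit is to fork the run with a bigger heap in build.sbt (a sketch, not from the thread):

// build.sbt (sketch): fork `sbt run` so the forked JVM -- which is the
// Spark driver in local[*] mode -- starts with a 4 GB heap before any
// SparkConf code executes.
fork in run := true
javaOptions in run += "-Xmx4g"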
In Spark 2.x you can set these through the SparkSession builder:
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("MyApp")
  .config("spark.executor.memory", "1g")
  .config("spark.driver.memory", "4g")
  .getOrCreate()
answered Nov 16 '18 at 9:18 by fansy1990
The fraction of the heap used for Spark's memory cache is 0.6 by default, so if you need more than 524.1 MB, you should increase the spark.executor.memory setting :)
Technically you could also increase the fraction used for Spark's memory cache, but I believe this is discouraged, or at least requires some additional configuration. See https://spark.apache.org/docs/1.0.2/configuration.html for more details.
answered Sep 21 '15 at 8:22 by Glennie Helles Sindholt
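For completeness, the Spark 1.x knob this answer alludes to is spark.storage.memoryFraction; a hedged sketch of raising it (a legacy setting, superseded by spark.memory.fraction from Spark 1.6 on):

import org.apache.spark.SparkConf

// Spark 1.x legacy setting: let the storage cache use 80% of the heap
// instead of the default 60%. Discouraged unless the job's shuffle and
// execution memory needs are known to be small.
val conf = new SparkConf()
  .set("spark.storage.memoryFraction", "0.8")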
Tried --driver-memory 4g and --executor-memory 4g; neither worked to increase working memory. However, I noticed that bin/spark-submit was picking up _JAVA_OPTIONS, and setting that to -Xmx4g resolved it. I use JDK 7.
answered Sep 1 '16 at 0:44 by bshyamkumar
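For concreteness, a minimal sketch of that workaround; _JAVA_OPTIONS is a standard HotSpot environment variable picked up by every JVM the launcher spawns (the class and jar names below are the question's placeholders):

export _JAVA_OPTIONS="-Xmx4g"
./bin/spark-submit --class main.class yourApp.jar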
You can't change driver memory after the application has started.
answered Nov 16 '18 at 13:33 by hamza tuna
To assign memory to Spark from the command shell:
/usr/lib/spark/bin/spark-shell --driver-memory=16G --num-executors=100 --executor-cores=8 --executor-memory=16G
answered Aug 31 '18 at 10:08 by Rajiv Singh
You can edit this answer and include your other answer.
– RaisingAgent, Aug 31 '18 at 10:39
/usr/lib/spark/bin/spark-shell --driver-memory=16G --num-executors=100 --executor-cores=8 --executor-memory=16G --conf spark.driver.maxResultSize=2G
answered Aug 31 '18 at 10:31 by Rajiv Singh