
I am running a Spark application using spark-submit with the following JVM parameters. With this set of parameters I get a Java heap space error:

EXTRA_JVM_FLAGS="-server -XX:+UseG1GC
                         -XX:ReservedCodeCacheSize=384m
                         -XX:MaxDirectMemorySize=2G
                         -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005"

     --master "local[4]"
     --driver-memory 2G
     --driver-java-options "${EXTRA_JVM_FLAGS}"

I tried to increase the driver memory, but that caused a JVM crash. I also tried to increase the max direct memory size, which did not help in any way. What options should I change to fix the heap space error?

1 Answer


You should try the most basic option, -Xmx, which sets the maximum heap size. Code cache and direct memory are native memory areas and don't affect the size of the heap. By default, the JVM allocates up to 1/4 of the RAM available on the box as the max heap size. If the machine is dedicated to this one JVM process, you can increase that fairly safely.
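A quick way to see both the default and the effect of an explicit -Xmx is to have the JVM print its final flag values (the exact default fraction can vary by JVM version and available RAM):

```shell
# Print the max heap the JVM would pick by default on this machine,
# then with an explicit -Xmx override (values are shown in bytes).
java -XX:+PrintFlagsFinal -version | grep -w MaxHeapSize
java -Xmx2g -XX:+PrintFlagsFinal -version | grep -w MaxHeapSize
```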


1 Comment

As far as I know, it works a bit differently for Spark. Specifically, -Xmx is set through the --driver-memory option.
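Following that comment, a minimal sketch of the submit command with a larger driver heap (the 4G value and the app.jar name are placeholders, not taken from the question):

```shell
# Sketch: raise the driver heap via --driver-memory, which Spark
# translates into the driver JVM's -Xmx. Keep -Xmx itself out of
# --driver-java-options, since Spark derives it from --driver-memory.
EXTRA_JVM_FLAGS="-server -XX:+UseG1GC \
  -XX:ReservedCodeCacheSize=384m \
  -XX:MaxDirectMemorySize=2G"

spark-submit \
  --master "local[4]" \
  --driver-memory 4G \
  --driver-java-options "${EXTRA_JVM_FLAGS}" \
  app.jar
```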
