Spark Properties and spark-defaults.conf Properties File

Spark properties are the means of tuning the execution environment for your Spark applications.

The default Spark properties file is $SPARK_HOME/conf/spark-defaults.conf, which can be overridden with the --properties-file command-line option of spark-submit.
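For illustration, a minimal spark-defaults.conf could look like the following (one property name and value per line, separated by whitespace; the values and paths below are placeholders for the example):

spark.master          yarn
spark.local.dir       /mnt/fast-disk/spark-tmp
spark.executor.memory 4g

And to point spark-submit at a different properties file (MyApp and the paths are placeholders, too):

spark-submit --properties-file /path/to/my-defaults.conf --class MyApp my-app.jar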

Table 1. Environment Variables

Environment Variable | Default Value | Description
SPARK_CONF_DIR | ${SPARK_HOME}/conf | Spark’s configuration directory (with spark-defaults.conf)

Tip
Read the official documentation of Apache Spark on Spark Configuration.
Table 2. Spark Application’s Properties

Property Name | Default | Description
spark.local.dir | /tmp | Comma-separated list of directories used as temporary "scratch" space, including map output files and RDDs that get stored on disk. This should be on a fast, local disk in your system; it can span multiple directories on different disks.
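Like any Spark property, spark.local.dir can also be set programmatically through SparkConf before the SparkContext is created. A minimal sketch (the application name and directories are placeholders):

import org.apache.spark.{SparkConf, SparkContext}

// Placeholder values: substitute directories on fast local disks.
val conf = new SparkConf()
  .setAppName("scratch-space-demo")
  .set("spark.local.dir", "/mnt/disk1/tmp,/mnt/disk2/tmp")

val sc = new SparkContext(conf)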

spark-defaults.conf — Default Spark Properties File

spark-defaults.conf (under SPARK_CONF_DIR or $SPARK_HOME/conf) is the default properties file that holds the Spark properties of your Spark applications.

Note
spark-defaults.conf is loaded by AbstractCommandBuilder’s loadPropertiesFile internal method.
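To make the loading step concrete, here is an illustrative sketch (not Spark’s actual code) of reading a spark-defaults.conf-style file into a key-value map using java.util.Properties:

import java.io.{File, FileInputStream, InputStreamReader}
import java.nio.charset.StandardCharsets
import java.util.Properties
import scala.collection.JavaConverters._

// Illustrative only: load property-name/value pairs from a properties file.
def loadPropertiesFile(file: File): Map[String, String] = {
  val props = new Properties()
  val reader = new InputStreamReader(new FileInputStream(file), StandardCharsets.UTF_8)
  try props.load(reader) finally reader.close()
  props.stringPropertyNames.asScala.map(k => (k, props.getProperty(k).trim)).toMap
}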

Calculating Path of Default Spark Properties — Utils.getDefaultPropertiesFile method

getDefaultPropertiesFile(env: Map[String, String] = sys.env): String

getDefaultPropertiesFile calculates the absolute path to the spark-defaults.conf properties file, which can be either in the directory specified by the SPARK_CONF_DIR environment variable or in the $SPARK_HOME/conf directory.
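The lookup can be sketched as follows (a simplified rendering of the logic, not necessarily the exact source):

import java.io.File

// Prefer SPARK_CONF_DIR; fall back to $SPARK_HOME/conf. Return the absolute
// path of spark-defaults.conf if the file exists, null otherwise.
def getDefaultPropertiesFile(env: Map[String, String] = sys.env): String = {
  env.get("SPARK_CONF_DIR")
    .orElse(env.get("SPARK_HOME").map(home => s"$home${File.separator}conf"))
    .map(dir => new File(s"$dir${File.separator}spark-defaults.conf"))
    .filter(_.isFile)
    .map(_.getAbsolutePath)
    .orNull
}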

Note
getDefaultPropertiesFile is part of private[spark] org.apache.spark.util.Utils object.