Commit 186c975

committed
updating configuration doc with SPARK_CONF_DIR
1 parent d2d1543 commit 186c975

File tree

1 file changed: +13 -0 lines changed

docs/configuration.md

Lines changed: 13 additions & 0 deletions
@@ -839,3 +839,16 @@ compute `SPARK_LOCAL_IP` by looking up the IP of a specific network interface.
Spark uses [log4j](http://logging.apache.org/log4j/) for logging. You can configure it by adding a
`log4j.properties` file in the `conf` directory. One way to start is to copy the existing
`log4j.properties.template` located there.
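The copy step described above can be sketched as follows. The directory and template contents here are illustrative stand-ins (a real installation would use `$SPARK_HOME/conf` and the shipped template):

```shell
# Illustrative stand-in for $SPARK_HOME/conf; the template contents are
# made up for this sketch, not taken from the real Spark distribution.
mkdir -p /tmp/spark-home/conf
echo 'log4j.rootCategory=INFO, console' > /tmp/spark-home/conf/log4j.properties.template

# The actual step from the docs: copy the template to activate
# a custom log4j configuration.
cp /tmp/spark-home/conf/log4j.properties.template /tmp/spark-home/conf/log4j.properties
```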
+
+# Overriding configuration
+
+In some cases you might want to provide all configuration from a location other than the default `SPARK_HOME/conf` directory.
+For example, if you are using a prepackaged version of Spark, or if you are building it yourself but want to keep it
+independent of your cluster configuration (managed by an automation tool).
+
+In that scenario you can set the `SPARK_CONF_DIR` environment variable to point to an alternate directory containing your configuration.
+Spark will then use it for the following configurations:
+
+* `spark-defaults.conf` and `spark-env.sh` will be loaded only from `SPARK_CONF_DIR`.
+* `log4j.properties`, `fairscheduler.xml` and `metrics.properties` will be loaded from `SPARK_CONF_DIR` if present,
+  but if missing, the ones from `SPARK_HOME/conf` will be used.
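A minimal end-to-end sketch of the override described above. Only `SPARK_CONF_DIR` and the configuration file names come from the docs; the directory path and the sample property are illustrative assumptions:

```shell
# Create an alternate config directory, independent of SPARK_HOME/conf
# (illustrative path; any directory works).
mkdir -p /tmp/my-spark-conf

# spark-defaults.conf and spark-env.sh are loaded only from here once
# SPARK_CONF_DIR is set; the property below is just a sample value.
printf 'spark.master local[2]\n' > /tmp/my-spark-conf/spark-defaults.conf

# Export the variable so subsequent Spark commands (spark-shell,
# spark-submit, ...) pick up this directory instead of SPARK_HOME/conf.
export SPARK_CONF_DIR=/tmp/my-spark-conf
```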
