Spark uses [log4j](http://logging.apache.org/log4j/) for logging. You can configure it by adding a
`log4j.properties` file in the `conf` directory. One way to start is to copy the existing
`log4j.properties.template` located there.
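For example, the template can be copied into place like this (a minimal sketch, assuming `SPARK_HOME` points at your Spark installation):

```shell
# Copy the shipped template into an active log4j.properties
cd "$SPARK_HOME/conf"
cp log4j.properties.template log4j.properties

# Then edit log4j.properties to adjust log levels as needed,
# e.g. change log4j.rootCategory=INFO, console to WARN, console
```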
# Overriding configuration
In some cases you might want to provide all configuration from a location other than the default `SPARK_HOME/conf` directory.
For example, you might be using a prepackaged version of Spark, or building it yourself but want to stay
independent of your cluster configuration (managed by an automation tool).
In that scenario you can set the `SPARK_CONF_DIR` environment variable to point to an alternate directory containing your configuration.
Spark will then use it for the following configuration files:
* `spark-defaults.conf` and `spark-env.sh` will be loaded only from `SPARK_CONF_DIR`.
* `log4j.properties`, `fairscheduler.xml` and `metrics.properties` will be loaded from `SPARK_CONF_DIR` if present;
  if missing, the ones from `SPARK_HOME/conf` will be used.
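As a sketch, pointing Spark at an alternate configuration directory might look like the following (the `/etc/spark/conf` path is purely illustrative, and `SPARK_HOME` is assumed to be set):

```shell
# Create an alternate configuration directory (illustrative path)
mkdir -p /etc/spark/conf

# Provide the files Spark reads exclusively from SPARK_CONF_DIR
cp "$SPARK_HOME/conf/spark-defaults.conf.template" /etc/spark/conf/spark-defaults.conf
cp "$SPARK_HOME/conf/spark-env.sh.template" /etc/spark/conf/spark-env.sh

# Point Spark at it; subsequent launches pick up this directory
export SPARK_CONF_DIR=/etc/spark/conf
"$SPARK_HOME/bin/spark-shell"
```

Because `SPARK_CONF_DIR` is an environment variable, it can be set per invocation, per user, or system-wide by your automation tool without touching the Spark installation itself.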