docs/configuration.md
11 additions & 6 deletions
@@ -16,10 +16,10 @@ Spark provides three locations to configure the system:
 # Spark Properties
 
 Spark properties control most application settings and are configured separately for each
-application. The preferred way is to set them through
-[SparkConf](api/scala/index.html#org.apache.spark.SparkConf) and passing it as an argument to your
-SparkContext. SparkConf allows you to configure most of the common properties to initialize a
-cluster (e.g. master URL and application name), as well as arbitrary key-value pairs through the
+application. These properties can be set directly on a
+[SparkConf](api/scala/index.html#org.apache.spark.SparkConf) and passed as an argument to your
+SparkContext. SparkConf allows you to configure some of the common properties
+(e.g. master URL and application name), as well as arbitrary key-value pairs through the
 `set()` method. For example, we could initialize an application as follows:
 
 {% highlight scala %}
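
The initialization example that this paragraph introduces sits between the two hunks and is not shown in the diff. As a rough sketch of the kind of SparkConf initialization the surrounding docs describe (the master URL, application name, and memory setting below are illustrative assumptions, not the file's actual example):

{% highlight scala %}
import org.apache.spark.{SparkConf, SparkContext}

// Illustrative sketch only: a SparkConf that sets a couple of common
// properties plus an arbitrary key-value pair via set(). The specific
// values (local master, app name, executor memory) are assumptions.
val conf = new SparkConf()
  .setMaster("local[2]")
  .setAppName("My application")
  .set("spark.executor.memory", "1g")
val sc = new SparkContext(conf)
{% endhighlight %}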
@@ -32,8 +32,13 @@ val sc = new SparkContext(conf)
 
 ## Dynamically Loading Spark Properties
 In some cases, you may want to avoid hard-coding certain configurations in a `SparkConf`. For
-instance, if you'd like to run the same applicaiton with different masters or different
-amounts of memory.
+instance, if you'd like to run the same application with different masters or different
+amounts of memory. Spark allows you to omit this in your code:
+
+{% highlight scala %}
+val conf = new SparkConf().setAppName("myApp")
+{% endhighlight %}
+
 
 The Spark shell and [`spark-submit`](cluster-overview.html#launching-applications-with-spark-submit) tool support two ways to load configurations dynamically.
 When a SparkConf is created, it will read configuration options from `conf/spark-defaults.conf`,
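
For context, a minimal sketch of the pattern the new paragraph describes: the application hard-codes only its name, and the master URL and memory settings are left to be supplied at launch time (e.g. from `conf/spark-defaults.conf` or `spark-submit` options). The app name matches the one added in the diff; the rest is assumed illustration, not part of the change.

{% highlight scala %}
import org.apache.spark.{SparkConf, SparkContext}

// Only the application name is set in code; spark.master and any memory
// settings are expected to be filled in dynamically (by spark-submit or
// conf/spark-defaults.conf) when the application is launched.
val conf = new SparkConf().setAppName("myApp")
val sc = new SparkContext(conf)
{% endhighlight %}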