Commit 592e94a

Stash
1 parent 29b5446 commit 592e94a

File tree

1 file changed: +11 -6 lines changed

docs/configuration.md

Lines changed: 11 additions & 6 deletions
@@ -16,10 +16,10 @@ Spark provides three locations to configure the system:
 # Spark Properties
 
 Spark properties control most application settings and are configured separately for each
-application. The preferred way is to set them through
-[SparkConf](api/scala/index.html#org.apache.spark.SparkConf) and passing it as an argument to your
-SparkContext. SparkConf allows you to configure most of the common properties to initialize a
-cluster (e.g. master URL and application name), as well as arbitrary key-value pairs through the
+application. These properties can be set directly on a
+[SparkConf](api/scala/index.html#org.apache.spark.SparkConf) and passed as an argument to your
+SparkContext. SparkConf allows you to configure some of the common properties
+(e.g. master URL and application name), as well as arbitrary key-value pairs through the
 `set()` method. For example, we could initialize an application as follows:
 
 {% highlight scala %}
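
For context, the initialization the surrounding doc text describes looks roughly like the sketch below; the master URL, application name, and memory value are illustrative placeholders, not part of this commit.

{% highlight scala %}
import org.apache.spark.{SparkConf, SparkContext}

// Configure common properties, plus an arbitrary key-value pair via set().
// All values here are placeholders.
val conf = new SparkConf()
  .setMaster("local[2]")               // assumed master URL
  .setAppName("myApp")                 // assumed application name
  .set("spark.executor.memory", "1g")  // arbitrary key-value pair
val sc = new SparkContext(conf)
{% endhighlight %}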
@@ -32,8 +32,13 @@ val sc = new SparkContext(conf)
 
 ## Dynamically Loading Spark Properties
 In some cases, you may want to avoid hard-coding certain configurations in a `SparkConf`. For
-instance, if you'd like to run the same applicaiton with different masters or different
-amounts of memory.
+instance, if you'd like to run the same application with different masters or different
+amounts of memory. Spark allows you to omit this in your code:
+
+{% highlight scala %}
+val conf = new SparkConf().setAppName("myApp")
+{% endhighlight %}
+
 
 The Spark shell and [`spark-submit`](cluster-overview.html#launching-applications-with-spark-submit) tool support two ways to load configurations dynamically.
 When a SparkConf is created, it will read configuration options from `conf/spark-defaults.conf`,
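
As a rough sketch of the dynamic loading this hunk refers to, a conf built only with setAppName can pick up its master and memory at launch time; the flag values and jar name below are illustrative.

{% highlight bash %}
# Supply the omitted settings when launching, rather than in code.
./bin/spark-submit \
  --master local[4] \
  --executor-memory 1g \
  myApp.jar

# Alternatively, place whitespace-separated key-value defaults in
# conf/spark-defaults.conf, which a newly created SparkConf reads:
#   spark.master            local[4]
#   spark.executor.memory   1g
{% endhighlight %}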
