-SparkR by default uses Apache Spark 1.1.0. You can switch to a different Spark
+SparkR by default uses Apache Spark 1.3.0. You can switch to a different Spark
 version by setting the environment variable `SPARK_VERSION`. For example, to
 use Apache Spark 1.2.0, you can run
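The version switch described above can be sketched as a short shell snippet. This is illustrative only: `SPARK_VERSION` is the variable named in the README, while the echoed message and the idea of exporting it before a build are assumptions for demonstration.

```shell
# Select the Spark version SparkR builds against via the documented
# SPARK_VERSION environment variable (illustrative sketch).
export SPARK_VERSION=1.2.0
echo "SparkR will build against Apache Spark ${SPARK_VERSION}"
```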
@@ -97,7 +91,7 @@ To run one of them, use `./sparkR <filename> <args>`. For example:

     ./sparkR examples/pi.R local[2]

-You can also run the unit-tests for SparkR by running
+You can also run the unit-tests for SparkR by running (you need to install the [testthat](http://cran.r-project.org/web/packages/testthat/index.html) package first):

     ./run-tests.sh

@@ -110,7 +104,7 @@ Instructions for running SparkR on EC2 can be found in the
 Currently, SparkR supports running on YARN with the `yarn-client` mode. These steps show how to build SparkR with YARN support and run SparkR programs on a YARN cluster:

 ```
-# assumes Java, R, rJava, yarn, spark etc. are installed on the whole cluster.
+# assumes Java, R, yarn, spark etc. are installed on the whole cluster.
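# Hypothetical continuation (an assumption, not part of this diff): a
# YARN-enabled SparkR build is typically driven by environment variables
# along these lines before invoking the package's build script:
#   USE_YARN=1 SPARK_YARN_VERSION=2.4.0 SPARK_HADOOP_VERSION=2.4.0 ./install-dev.sh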