Commit 749e2d0

Updated README
1 parent 0981dff

File tree: 1 file changed (+4, -10 lines)

README.md

Lines changed: 4 additions & 10 deletions
````diff
@@ -10,15 +10,9 @@ R.
 
 ### Requirements
 SparkR requires Scala 2.10 and Spark version >= 0.9.0. Current build by default uses
-Apache Spark 1.1.0. You can also build SparkR against a
+Apache Spark 1.3.0. You can also build SparkR against a
 different Spark version (>= 0.9.0) by modifying `pkg/src/build.sbt`.
 
-SparkR also requires the R package `rJava` to be installed. To install `rJava`,
-you can run the following command in R:
-
-install.packages("rJava")
-
-
 ### Package installation
 To develop SparkR, you can build the scala package and the R package using
 
````
````diff
@@ -29,7 +23,7 @@ If you wish to try out the package directly from github, you can use [`install_g
 library(devtools)
 install_github("amplab-extras/SparkR-pkg", subdir="pkg")
 
-SparkR by default uses Apache Spark 1.1.0. You can switch to a different Spark
+SparkR by default uses Apache Spark 1.3.0. You can switch to a different Spark
 version by setting the environment variable `SPARK_VERSION`. For example, to
 use Apache Spark 1.2.0, you can run
 
````
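The hunk above documents `SPARK_VERSION` as the knob for picking a Spark release at build time. A minimal shell sketch of that pattern, assuming the build script reads the variable and falls back to 1.3.0 when it is unset (the fallback logic here is illustrative, not taken from `install-dev.sh` itself):

```shell
# Illustrative only: mimic how a build script might honor SPARK_VERSION.
# The 1.3.0 default matches this commit; the fallback logic is an assumption.
SPARK_VERSION="${SPARK_VERSION:-1.3.0}"
echo "Building SparkR against Apache Spark ${SPARK_VERSION}"
# To pick a different release, the README's approach would be e.g.:
#   SPARK_VERSION=1.2.0 ./install-dev.sh
```

With the variable unset this prints the 1.3.0 default; exporting `SPARK_VERSION` before invoking the build overrides it.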
````diff
@@ -97,7 +91,7 @@ To run one of them, use `./sparkR <filename> <args>`. For example:
 
 ./sparkR examples/pi.R local[2]
 
-You can also run the unit-tests for SparkR by running
+You can also run the unit-tests for SparkR by running. You need to install the [testthat](http://cran.r-project.org/web/packages/testthat/index.html) package first.
 
 ./run-tests.sh
 
````
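The hunk above adds a note that the `testthat` package must be installed before `./run-tests.sh` will work. A hedged sketch of doing that non-interactively, assuming `Rscript` is on the PATH (the CRAN mirror URL is an assumption, not part of the commit):

```shell
# Assumes Rscript is installed; the repos URL is one common CRAN mirror.
Rscript -e 'install.packages("testthat", repos = "https://cran.r-project.org")'
./run-tests.sh
```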
````diff
@@ -110,7 +104,7 @@ Instructions for running SparkR on EC2 can be found in the
 Currently, SparkR supports running on YARN with the `yarn-client` mode. These steps show how to build SparkR with YARN support and run SparkR programs on a YARN cluster:
 
 ```
-# assumes Java, R, rJava, yarn, spark etc. are installed on the whole cluster.
+# assumes Java, R, yarn, spark etc. are installed on the whole cluster.
 cd SparkR-pkg/
 USE_YARN=1 SPARK_YARN_VERSION=2.4.0 SPARK_HADOOP_VERSION=2.4.0 ./install-dev.sh
 ```
````
