
Commit c11f24a

jsnowacki authored and HyukjinKwon committed
[SPARK-18136] Fix SPARK_JARS_DIR for Python pip install on Windows
## What changes were proposed in this pull request?

Fix the setup of `SPARK_JARS_DIR` on Windows: the script looks for the `%SPARK_HOME%\RELEASE` file when it should check for the `%SPARK_HOME%\jars` directory. The RELEASE file is not included in the `pip` build of PySpark, so a pip-installed PySpark falls through to the wrong jars path.

## How was this patch tested?

Local install of PySpark on Anaconda 4.4.0 (Python 3.6.1).

Author: Jakub Nowacki <[email protected]>

Closes #19310 from jsnowacki/master.
1 parent f180b65 commit c11f24a

File tree

1 file changed: +1 −1 lines changed


bin/spark-class2.cmd

Lines changed: 1 addition & 1 deletion
```diff
@@ -29,7 +29,7 @@ if "x%1"=="x" (
 )

 rem Find Spark jars.
-if exist "%SPARK_HOME%\RELEASE" (
+if exist "%SPARK_HOME%\jars" (
   set SPARK_JARS_DIR="%SPARK_HOME%\jars"
 ) else (
   set SPARK_JARS_DIR="%SPARK_HOME%\assembly\target\scala-%SPARK_SCALA_VERSION%\jars"
```
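The corrected logic simply tests for the jars directory itself rather than for a RELEASE marker file that pip builds never ship. As a minimal POSIX-shell sketch of the same check (`SPARK_HOME` is simulated here with a temporary directory; this is an illustration of the patched logic, not the actual `bin/spark-class` script):

```shell
#!/bin/sh
# Simulate a pip-style PySpark layout: jars/ exists, but no RELEASE file.
SPARK_HOME="$(mktemp -d)"
mkdir -p "$SPARK_HOME/jars"

# Prefer the jars directory directly, matching the fix in spark-class2.cmd;
# otherwise fall back to the source-build assembly location.
if [ -d "$SPARK_HOME/jars" ]; then
  SPARK_JARS_DIR="$SPARK_HOME/jars"
else
  SPARK_JARS_DIR="$SPARK_HOME/assembly/target/scala-$SPARK_SCALA_VERSION/jars"
fi

echo "$SPARK_JARS_DIR"
rm -rf "$SPARK_HOME"
```

Before the fix, the equivalent Windows check keyed on `%SPARK_HOME%\RELEASE`, so a pip install (which has `jars\` but no RELEASE file) took the `else` branch and pointed at a nonexistent assembly path.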

0 commit comments
