
Commit 3537876

HyukjinKwon authored and srowen committed
[SPARK-20343][BUILD] Avoid Unidoc build only if Hadoop 2.6 is explicitly set in SBT build
## What changes were proposed in this pull request?

This PR proposes two things, as below.

**Avoid the Unidoc build only if Hadoop 2.6 is explicitly set in the SBT build**

Due to a difference in dependency resolution between SBT and Unidoc, for an unknown reason, the documentation build fails on a specific machine and environment in Jenkins but could not be reproduced elsewhere. So, this PR simply checks the environment variable `AMPLAB_JENKINS_BUILD_PROFILE`, which is set for the Hadoop 2.6 SBT builds against branches on Jenkins, and disables the Unidoc build there. **Note that the PR builder will still build it with Hadoop 2.6 & SBT.**

```
========================================================================
Building Unidoc API Documentation
========================================================================
[info] Building Spark unidoc (w/Hive 1.2.1) using SBT with these arguments:  -Phadoop-2.6 -Pmesos -Pkinesis-asl -Pyarn -Phive-thriftserver -Phive unidoc
Using /usr/java/jdk1.8.0_60 as default JAVA_HOME.
...
```

I checked the environment variables from the logs (first bit of each) as below:

- **spark-master-test-sbt-hadoop-2.6** (the failing one) - https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.6/lastBuild/consoleFull

```
JAVA_HOME=/usr/java/jdk1.8.0_60
JAVA_7_HOME=/usr/java/jdk1.7.0_79
SPARK_BRANCH=master
AMPLAB_JENKINS_BUILD_PROFILE=hadoop2.6   <- I use this variable
AMPLAB_JENKINS="true"
```

- spark-master-test-sbt-hadoop-2.7 - https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.7/lastBuild/consoleFull

```
JAVA_HOME=/usr/java/jdk1.8.0_60
JAVA_7_HOME=/usr/java/jdk1.7.0_79
SPARK_BRANCH=master
AMPLAB_JENKINS_BUILD_PROFILE=hadoop2.7
AMPLAB_JENKINS="true"
```

- spark-master-test-maven-hadoop-2.6 - https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-maven-hadoop-2.6/lastBuild/consoleFull

```
JAVA_HOME=/usr/java/jdk1.8.0_60
JAVA_7_HOME=/usr/java/jdk1.7.0_79
HADOOP_PROFILE=hadoop-2.6
HADOOP_VERSION=
SPARK_BRANCH=master
AMPLAB_JENKINS="true"
```

- spark-master-test-maven-hadoop-2.7 - https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-maven-hadoop-2.7/lastBuild/consoleFull

```
JAVA_HOME=/usr/java/jdk1.8.0_60
JAVA_7_HOME=/usr/java/jdk1.7.0_79
HADOOP_PROFILE=hadoop-2.7
HADOOP_VERSION=
SPARK_BRANCH=master
AMPLAB_JENKINS="true"
```

- PR builder - https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/75843/consoleFull

```
JENKINS_MASTER_HOSTNAME=amp-jenkins-master
JAVA_HOME=/usr/java/jdk1.8.0_60
JAVA_7_HOME=/usr/java/jdk1.7.0_79
```

Judging from other logs, branch-2.1 follows the same convention:

- SBT & Hadoop 2.6 against branch-2.1 - https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-branch-2.1-test-sbt-hadoop-2.6/lastBuild/consoleFull

```
JAVA_HOME=/usr/java/jdk1.8.0_60
JAVA_7_HOME=/usr/java/jdk1.7.0_79
SPARK_BRANCH=branch-2.1
AMPLAB_JENKINS_BUILD_PROFILE=hadoop2.6
AMPLAB_JENKINS="true"
```

- Maven & Hadoop 2.6 against branch-2.1 - https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-branch-2.1-test-maven-hadoop-2.6/lastBuild/consoleFull

```
JAVA_HOME=/usr/java/jdk1.8.0_60
JAVA_7_HOME=/usr/java/jdk1.7.0_79
HADOOP_PROFILE=hadoop-2.6
HADOOP_VERSION=
SPARK_BRANCH=branch-2.1
AMPLAB_JENKINS="true"
```

We have been using the same convention for these variables; they are actually consumed by the `run-tests.py` script - see https://github.com/apache/spark/blob/master/dev/run-tests.py#L519-L520 (a sketch of this selection logic follows this commit message).

**Revert the previous attempt**

After #17651, it seems the build still fails on the SBT Hadoop 2.6 master job. I was unable to reproduce this - see #17477 (comment) - and neither was the reviewer. That change was merged anyway, because merging it looked like the only way to verify the fix (as no one seemed able to reproduce the failure); this PR reverts it.

## How was this patch tested?

I only checked that `is_hadoop_version_2_6 = os.environ.get("AMPLAB_JENKINS_BUILD_PROFILE") == "hadoop2.6"` works as expected, as below:

```python
>>> import collections
>>> os = collections.namedtuple('os', 'environ')(environ={"AMPLAB_JENKINS_BUILD_PROFILE": "hadoop2.6"})
>>> print(not os.environ.get("AMPLAB_JENKINS_BUILD_PROFILE") == "hadoop2.6")
False
>>> os = collections.namedtuple('os', 'environ')(environ={"AMPLAB_JENKINS_BUILD_PROFILE": "hadoop2.7"})
>>> print(not os.environ.get("AMPLAB_JENKINS_BUILD_PROFILE") == "hadoop2.6")
True
>>> os = collections.namedtuple('os', 'environ')(environ={})
>>> print(not os.environ.get("AMPLAB_JENKINS_BUILD_PROFILE") == "hadoop2.6")
True
```

I tried many ways but was unable to reproduce this locally; Sean tried the same and could not reproduce it either. Please refer to the comments in #17477 (comment).

Author: hyukjinkwon <[email protected]>

Closes #17669 from HyukjinKwon/revert-SPARK-20343.
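The Jenkins variables listed in the commit message are what `dev/run-tests.py` reads to pick the build tool and Hadoop profile. The following is a minimal sketch of that selection logic; the `detect_build_profile` helper and its default values are illustrative assumptions, not the actual script.

```python
import os

def detect_build_profile():
    """A minimal sketch of mapping Jenkins environment variables to a
    (build_tool, hadoop_version) pair. The helper name and the defaults
    are assumptions for illustration, not the code in dev/run-tests.py."""
    if "AMPLAB_JENKINS" in os.environ:
        # Branch jobs on Jenkins export AMPLAB_JENKINS_BUILD_PROFILE
        # (e.g. "hadoop2.6" or "hadoop2.7"), as shown in the logs above.
        build_tool = os.environ.get("AMPLAB_JENKINS_BUILD_TOOL", "sbt")
        hadoop_version = os.environ.get("AMPLAB_JENKINS_BUILD_PROFILE", "hadoop2.6")
    else:
        # The PR builder sets neither variable, so fall back to defaults.
        build_tool = "sbt"
        hadoop_version = "hadoop2.6"
    return build_tool, hadoop_version

if __name__ == "__main__":
    print(detect_build_profile())
```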
1 parent 773754b · commit 3537876

3 files changed: +12 −15 lines

dev/run-tests.py

Lines changed: 10 additions & 2 deletions
```diff
@@ -365,8 +365,16 @@ def build_spark_assembly_sbt(hadoop_version):
     print("[info] Building Spark assembly (w/Hive 1.2.1) using SBT with these arguments: ",
           " ".join(profiles_and_goals))
     exec_sbt(profiles_and_goals)
-    # Make sure that Java and Scala API documentation can be generated
-    build_spark_unidoc_sbt(hadoop_version)
+
+    # Note that we skip Unidoc build only if Hadoop 2.6 is explicitly set in this SBT build.
+    # Due to a different dependency resolution in SBT & Unidoc by an unknown reason, the
+    # documentation build fails on a specific machine & environment in Jenkins but it was unable
+    # to reproduce. Please see SPARK-20343. This is a band-aid fix that should be removed in
+    # the future.
+    is_hadoop_version_2_6 = os.environ.get("AMPLAB_JENKINS_BUILD_PROFILE") == "hadoop2.6"
+    if not is_hadoop_version_2_6:
+        # Make sure that Java and Scala API documentation can be generated
+        build_spark_unidoc_sbt(hadoop_version)
 
 
 def build_apache_spark(build_tool, hadoop_version):
```
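The new guard is small enough to verify standalone. Below is a minimal, self-contained check: the `should_build_unidoc` wrapper is hypothetical, but the expression inside it is the predicate added by this patch, and the three cases mirror the ones in the testing section above.

```python
import os

def should_build_unidoc(environ=None):
    """Hypothetical wrapper around the predicate added in this patch:
    Unidoc is skipped only when the Hadoop 2.6 profile is explicitly set."""
    environ = os.environ if environ is None else environ
    is_hadoop_version_2_6 = environ.get("AMPLAB_JENKINS_BUILD_PROFILE") == "hadoop2.6"
    return not is_hadoop_version_2_6

# The three cases from the "How was this patch tested?" section above.
assert should_build_unidoc({"AMPLAB_JENKINS_BUILD_PROFILE": "hadoop2.6"}) is False
assert should_build_unidoc({"AMPLAB_JENKINS_BUILD_PROFILE": "hadoop2.7"}) is True
assert should_build_unidoc({}) is True
print("Unidoc guard behaves as expected")
```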

pom.xml

Lines changed: 0 additions & 1 deletion
```diff
@@ -142,7 +142,6 @@
     <ivy.version>2.4.0</ivy.version>
     <oro.version>2.0.8</oro.version>
     <codahale.metrics.version>3.1.2</codahale.metrics.version>
-    <!-- Keep consistent with Avro vesion in SBT build for SPARK-20343 -->
     <avro.version>1.7.7</avro.version>
     <avro.mapred.classifier>hadoop2</avro.mapred.classifier>
     <jets3t.version>0.9.3</jets3t.version>
```
project/SparkBuild.scala

Lines changed: 2 additions & 12 deletions
```diff
@@ -318,8 +318,8 @@ object SparkBuild extends PomBuild {
     enable(MimaBuild.mimaSettings(sparkHome, x))(x)
   }
 
-  /* Generate and pick the spark build info from extra-resources and override a dependency */
-  enable(Core.settings ++ CoreDependencyOverrides.settings)(core)
+  /* Generate and pick the spark build info from extra-resources */
+  enable(Core.settings)(core)
 
   /* Unsafe settings */
   enable(Unsafe.settings)(unsafe)
@@ -443,16 +443,6 @@ object DockerIntegrationTests {
   )
 }
 
-/**
- * Overrides to work around sbt's dependency resolution being different from Maven's in Unidoc.
- *
- * Note that, this is a hack that should be removed in the future. See SPARK-20343
- */
-object CoreDependencyOverrides {
-  lazy val settings = Seq(
-    dependencyOverrides += "org.apache.avro" % "avro" % "1.7.7")
-}
-
 /**
  * Overrides to work around sbt's dependency resolution being different from Maven's.
  */
```