
[ZEPPELIN-1336] hadoop library for spark interpreter is not match. #1335

Closed
astroshim wants to merge 3 commits into apache:master from astroshim:ZEPPELIN-1336

Conversation

@astroshim (Contributor) commented Aug 16, 2016

What is this PR for?

This PR downloads the Hadoop libraries needed to run the Spark interpreter on a YARN cluster.
This issue comes from #1318.

What type of PR is it?

Bug Fix

What is the Jira issue?

https://issues.apache.org/jira/browse/ZEPPELIN-1336

How should this be tested?

1. Build Zeppelin for Spark 2.0 and Hadoop 2.7:

mvn clean package -Pspark-2.0 -Phadoop-2.7 -Dhadoop.version=2.7.2 -Pyarn -Ppyspark -Pscala-2.11 -DskipTests 

2. Verify the Hadoop library versions bundled for Spark:

ls -al $ZEPPELIN_HOME/spark/target/lib/
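The ls in step 2 leaves the version check to the eye; as a sketch, the same check can be scripted. The directory and jar names below are stand-ins for real build output, not taken from an actual Zeppelin build:

```shell
# Throwaway directory standing in for $ZEPPELIN_HOME/spark/target/lib/
# (the jar names are illustrative, not from a real build).
LIBDIR=$(mktemp -d)
touch "$LIBDIR/hadoop-client-2.7.2.jar" "$LIBDIR/hadoop-yarn-api-2.7.2.jar"

# The build matches when every bundled hadoop-* jar carries the version
# passed via -Dhadoop.version (2.7.2 here): count the jars that do not.
MISMATCHED=$(ls "$LIBDIR" | grep '^hadoop-' | grep -cv '2\.7\.2')
echo "mismatched hadoop jars: $MISMATCHED"

rm -rf "$LIBDIR"
```

A count of 0 means the bundled Hadoop jars agree with the requested version.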

Questions:

  • Do the license files need an update? No.
  • Are there breaking changes for older versions? No.
  • Does this need documentation? No.

spark/pom.xml (Outdated)

  <groupId>org.apache.zeppelin</groupId>
- <artifactId>zeppelin-spark_2.10</artifactId>
+ <artifactId>zeppelin-spark_2.11</artifactId>
Member

Currently Zeppelin is built with Scala 2.10 by default, which is why the artifact has the 2.10 suffix. If you changed this line because of a build error, I recommend running dev/change_scala_version.sh 2.11 prior to building with mvn.
FYI, there is a JIRA ticket for making Scala 2.11 the default, so this change is better handled there.
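For illustration, the effect of dev/change_scala_version.sh on a single artifactId line can be mimicked with a one-line sed; the real script rewrites the Scala suffixes and related properties across all of the project's pom.xml files, so this is only a sketch of the idea:

```shell
# Sample pom.xml line; the real script edits the pom.xml files in place.
POM_LINE='<artifactId>zeppelin-spark_2.10</artifactId>'

# Swap the Scala binary-version suffix 2.10 -> 2.11, as the script would.
echo "$POM_LINE" | sed 's/_2\.10/_2.11/'
# prints <artifactId>zeppelin-spark_2.11</artifactId>
```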

Contributor (Author)

Updating to 2.11 was my mistake; it was only for testing, so I'll fix it.
Thank you.

@jongyoul (Member)

I think spark-dependencies is the proper place for this change. What do you think?

@astroshim (Contributor, Author)

@jongyoul I will close this issue since removing the yarn profile works. Please handle it in #1301 as @zjffdu mentioned.

@astroshim astroshim closed this Aug 17, 2016