@zjffdu @felixcheung Please review it. I'll move it into apache/zeppelin and then merge it.
### What is this PR for?
Fixes some CI failures by changing the `hadoop-client` version of `spark-core` and the `guava` version of `zeppelin-python`. We will remove the `zeppelin-python` dependency from the `zeppelin-spark` interpreter in the near future.

### What type of PR is it?
[Hot Fix]

### Todos
* [x] - Fix dependency problem between hadoop-client and guava

### What is the Jira issue?
N/A

### How should this be tested?
* It should pass CI

### Screenshots (if appropriate)

### Questions:
* Do the license files need an update? No
* Are there breaking changes for older versions? No
* Does this need documentation? No

Author: Jongyoul Lee <[email protected]>

Closes apache#3010 from jongyoul/hotfix/zeppelin-2221-dependency-issue and squashes the following commits:

d1b89dd [Jongyoul Lee] Change hadoop-client version to avoid guava conflict
16fa70e [Jongyoul Lee] Exclude guava from zeppelin-python dependency in spark/interpreter
ede117b [Jongyoul Lee] Add a guava version of `19.0` to avoid guava version mismatch
b0873e3 [Jongyoul Lee] Change the version of the dependency of `hadoop-common` to `2.6.5`
542b188 [Jongyoul Lee] Revert the scope of hadoop-common to `provided`
8bc67e6 [Jongyoul Lee] Add a dependency of `hadoop-common` as `test` scope
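A transitive-guava exclusion of the kind this PR describes typically looks like the following `pom.xml` sketch. This is illustrative of the pattern only, not the exact diff; the version placeholder is an assumption.

```
<!-- Sketch: exclude guava pulled in transitively via zeppelin-python,
     so the guava version pinned by hadoop-client wins the conflict.
     Coordinates/version shown here are illustrative. -->
<dependency>
  <groupId>org.apache.zeppelin</groupId>
  <artifactId>zeppelin-python</artifactId>
  <version>${project.version}</version>
  <exclusions>
    <exclusion>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

With the exclusion in place, `mvn dependency:tree` on the spark interpreter module should show only one guava version on the classpath.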
### What is this PR for?
Someone complained that they could not get the appId; this PR just verifies that the appId returned by the Livy REST API is not null.

### What type of PR is it?
[Improvement]

### Todos
* [ ] - Task

### What is the Jira issue?
* No JIRA created

### How should this be tested?
* CI pass

### Screenshots (if appropriate)

### Questions:
* Do the license files need an update? No
* Are there breaking changes for older versions? No
* Does this need documentation? No

Author: Jeff Zhang <[email protected]>

Closes apache#2999 from zjffdu/minor_livy and squashes the following commits:

eae0cf5 [Jeff Zhang] [MINOR] Verify appId is not null in LivyInterpreterIT
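The check described above might look something like the following minimal sketch. The class and method names here are hypothetical, not the actual `LivyInterpreterIT` code, and the stubbed appId value is invented for illustration.

```java
// Illustrative sketch only: verify that an appId returned by a REST call
// is neither null nor empty before the test proceeds.
public class AppIdCheck {

    // Hypothetical stand-in for the Livy REST call that returns the
    // YARN application id of the started session.
    static String fetchAppId() {
        return "application_1528000000000_0001"; // invented value
    }

    // Fail fast with a clear message instead of an NPE later in the test.
    static String verifiedAppId() {
        String appId = fetchAppId();
        if (appId == null || appId.trim().isEmpty()) {
            throw new AssertionError("appId must not be null or empty");
        }
        return appId;
    }

    public static void main(String[] args) {
        System.out.println(verifiedAppId());
    }
}
```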
### What is this PR for?
Just removes setupConfForPySpark in NewSparkInterpreter, as it is not necessary and will cause an NPE in yarn-cluster mode when the node that launches the Spark interpreter does not have Spark installed.

### What type of PR is it?
[Bug Fix]

### Todos
* [ ] - Task

### What is the Jira issue?
* https://issues.apache.org/jira/browse/ZEPPELIN-3531

### How should this be tested?
* CI pass & manually tested in a 3-node cluster

### Screenshots (if appropriate)

### Questions:
* Do the license files need an update? No
* Are there breaking changes for older versions? No
* Does this need documentation? No

Author: Jeff Zhang <[email protected]>

Closes apache#3008 from zjffdu/ZEPPELIN-3531 and squashes the following commits:

e8b2969 [Jeff Zhang] ZEPPELIN-3531. Don't look for py4j in NewSparkInterpreter
Currently, Zeppelin only supports roles for AuthorizationFilter, but there can be a condition, as described in https://issues.apache.org/jira/browse/ZEPPELIN-2913, where a Zeppelin user does not belong to any group/role and the administrator wants to grant access by user only.

[Feature]

* [x] - Add documentation

* https://issues.apache.org/jira/browse/ZEPPELIN-2913

Add the following in shiro.ini:

```
[main]
...
anyofroles = org.apache.zeppelin.utils.AnyOfRolesUserAuthorizationFilter

[urls]
...
/api/interpreter/** = authc, anyofroles[admin, user1]
/api/configurations/** = authc, roles[admin]
/api/credential/** = authc, roles[admin]
```

With the above config, both the user `user1` and users that belong to the role `admin` will have access to the interpreter settings page.

Author: Prabhjyot Singh <[email protected]>

Closes apache#3004 from prabhjyotsingh/ZEPPELIN-2913 and squashes the following commits:

e05d72a [Prabhjyot Singh] rename AnyOfRolesAuthorizationFilter to AnyOfRolesUserAuthorizationFilter
724192f [Prabhjyot Singh] add doc
53c0c03 [Prabhjyot Singh] [ZEPPELIN-2913] support for both user and role

Change-Id: I63cdebf66d76a67cfca0054283c7d1c65a9b5805
Zeppelin auth mechanisms (LDAP or password based) should be mutually exclusive

Problem: When any external authentication (like LDAP/AD) is enabled for Zeppelin, the default password-based authentication could still be configured in addition to it. This leaves a backdoor in Zeppelin, where a user can still get in using a local username/password.

Proposed Solution: Zeppelin shouldn't allow a [users] section in shiro.ini when it is configured to authenticate with LDAP/AD.

[Bug Fix | Feature]

* [x] - Add documentation

* [ZEPPELIN-3526](https://issues.apache.org/jira/browse/ZEPPELIN-3526)

If both a [users] section and a [main] section with, for example, an activeDirectoryRealm are enabled in shiro.ini, the Zeppelin server should not start.

Author: Prabhjyot Singh <[email protected]>
Author: Prabhjyot <[email protected]>

Closes apache#3003 from prabhjyotsingh/ZEPPELIN-3526 and squashes the following commits:

edc4323 [Prabhjyot] Merge branch 'master' into ZEPPELIN-3526
05c9e14 [Prabhjyot Singh] add doc
529ab3e [Prabhjyot Singh] ZEPPELIN-3526: Zeppelin auth mechanisms (LDAP or password based) should be mutually exclusive

Change-Id: I0608cdc64ae7952eeec22bfe939810a6b24f357a
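For example, a shiro.ini like the following sketch mixes both mechanisms and would be rejected at startup under this change. The realm class name and LDAP URL are illustrative assumptions, not taken from the PR diff:

```
# Disallowed combination: local accounts AND an external auth realm.
[users]
# local password-based account -- the backdoor this PR closes
admin = password1, admin

[main]
# external AD/LDAP authentication (example realm and URL)
activeDirectoryRealm = org.apache.zeppelin.realm.ActiveDirectoryGroupRealm
activeDirectoryRealm.url = ldaps://ad.example.com:636
```

Removing the [users] block (or the external realm) makes the configuration valid again, since only one authentication mechanism remains.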
### What is this PR for?
`zeppelin.pyspark.python` should be removed, as it is a Zeppelin-specific property that only affects the driver, not the executors. We should use the Spark property instead.

### What type of PR is it?
[Bug Fix]

### Todos
* [ ] - Task

### What is the Jira issue?
* https://issues.apache.org/jira/browse/ZEPPELIN-3517

### How should this be tested?
* CI pass

### Screenshots (if appropriate)

### Questions:
* Do the license files need an update? No
* Are there breaking changes for older versions? No
* Does this need documentation? No

Author: Jeff Zhang <[email protected]>

Closes apache#2993 from zjffdu/ZEPPELIN-3517 and squashes the following commits:

24dafa1 [Jeff Zhang] ZEPPELIN-3517. Remove zeppelin.pyspark.python in PySparkInterpreter
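The Spark-side replacement is the standard `spark.pyspark.python` configuration property, which can be set in the interpreter setting or in `spark-defaults.conf`. A sketch (the interpreter path shown is an example):

```
# Instead of the Zeppelin-specific property (removed by this PR), which
# only affected the driver:
#   zeppelin.pyspark.python = /usr/bin/python3
# use the Spark property, which applies to driver and executors alike:
spark.pyspark.python = /usr/bin/python3

# The driver's interpreter can still be overridden separately if needed:
spark.pyspark.driver.python = /usr/bin/python3
```

Since these are plain Spark properties, they work the same whether Zeppelin, spark-submit, or any other client launches the PySpark job.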
This is a side effect of ZEPPELIN-3526 that occurs when there is no shiro.ini on the classpath.

[Hot Fix]

* CI should be green

* Do the license files need an update? n/a
* Are there breaking changes for older versions? n/a
* Does this need documentation? n/a

Author: Prabhjyot Singh <[email protected]>

Closes apache#3011 from prabhjyotsingh/hotfix/ZEPPELIN-3526 and squashes the following commits:

ac6565c [Prabhjyot Singh] fix ZEPPELIN-3526 when no shiro.ini exists

Change-Id: I5016e293eeec17e44be29dbf7f2668ec542a8dfa
(cherry picked from commit 40f1c77)
The Spark 2.3.1 vote passed; it would have the matching py4j 0.10.7.
BTW, @zjffdu told me this was not all of them; we might need to add an authentication method, so I closed it at first.