Reliable dependency loading mechanism #319
Conversation
It's ready. Could someone please review this PR?
It works in a local environment; let me test more in a cluster environment. Great job!! Comments and questions:
`z.load("org.apache.james:apache-mime4j:0.7.2")` loads with build.sbt,
it was because of
For the 2) restart issue, I found it was related to:
* Make recursive default
* Exclude by pattern
Made some improvements. Here's the updated API:
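A sketch of what the updated API might look like in a `%dep` paragraph. This is an assumption based on the improvements listed above (recursive loading by default, exclude by pattern); the method name `excludes` and the artifact coordinates are illustrative, not confirmed by this thread:

```scala
%dep
// Transitive (recursive) dependency resolution is now the default,
// so a single load pulls in the library and its dependencies.
// excludes() filters out artifacts matching a group:artifact pattern.
z.load("org.apache.james:apache-mime4j:0.7.2").excludes("org.slf4j:slf4j-api")
```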
OMG, AWESOME job! I've raised one discussion point.
Updated to make 'dist' the default. 'dist()' is removed from the API and 'local()' is added, for the case where one does not want to add an artifact to the Spark cluster. Here's the updated API:
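The stripped API block presumably contrasted the default behavior with `local()`, along the lines of this sketch (the artifact coordinates are hypothetical placeholders):

```scala
%dep
// Default ('dist' behavior): the artifact is resolved and also
// shipped to the Spark cluster's classpath.
z.load("org.apache.james:apache-mime4j:0.7.2")

// local(): load the artifact only into the local interpreter process,
// without distributing it to the Spark cluster.
z.load("com.example:driver-only-lib:1.0.0").local()
```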
LGTM! On Wed, Feb 11, 2015 at 08:48, Lee moon soo [email protected] wrote:
Reliable dependency loading mechanism
From ZEPL/zeppelin#388. Update the description of the dependency loader to reflect ZEPL/zeppelin#319. To do this, the document structure is changed:

* docs/zeppelincontext -> removed
* interpreter/spark -> added (includes descriptions of zeppelincontext and dependencyloader)

Ready to merge.

Author: Lee moon soo <[email protected]>

Closes #7 from Leemoonsoo/gh-pages_update_changes and squashes the following commits:

a3894cf [Lee moon soo] Add interpreter/spark.md instead of docs/zeppelincontext.md; update description about dependency loader
#308 implements/fixes runtime dependency library loading, but the feature is unreliable: some libraries load correctly and some do not.
Since a reliable solution for runtime library loading does not look easy to find, this PR tries to do the library loading before SparkIMain is created, so libraries do not need to be loaded dynamically at runtime but are simply included on the classpath.
To do this, this PR adds a new interpreter, "DepInterpreter".
It provides a separate Scala interpreter and an API to load dependencies. It fetches the necessary libraries from a Maven repository and keeps the resulting file list. Then, when SparkInterpreter is initializing, that file list is passed to SparkInterpreter, which adds the files to its classpath instead of trying to load them at runtime.
Usage: DepInterpreter can be used with %dep. It exposes an instance of com.nflabs.zeppelin.spark.dep.DependencyContext as the variable z. Here's the API:
Example of use
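The example that followed was stripped during extraction; it likely resembled a pair of notebook paragraphs like this sketch. `z` is the DependencyContext instance exposed by DepInterpreter as described above; the artifact coordinates, the `excludes` method, and the import are illustrative assumptions:

```scala
%dep
// Must run before the Spark interpreter starts, because the jar list
// is handed to SparkInterpreter at initialization time.
z.load("org.apache.james:apache-mime4j:0.7.2").excludes("org.slf4j:slf4j-api")
```

```scala
%spark
// In a later paragraph, SparkInterpreter has started with the resolved
// jars already on its classpath, so the library is usable directly.
import org.apache.james.mime4j.dom.Message
```

Note the ordering constraint: if the Spark interpreter has already started, it must be restarted for newly loaded dependencies to appear on its classpath.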
