[ZEPPELIN-935] Adding more configurations to livy interpreter#944
mfelgamal wants to merge 8 commits into apache:master
Conversation
update from original
@mfelgamal thank you for the contribution, I have tested this with both https://github.com/cloudera/livy and https://github.com/cloudera/hue/tree/master/apps/spark/java. LGTM.

@prabhjyotsingh Yes, sure, I have tested these configurations on both Livy distributions many times.

Thanks @mfelgamal for the contribution.

@mfelgamal could you please close the PR #903?
conf.put("spark.driver.cores", property.getProperty("spark.driver.cores"));
conf.put("spark.executor.cores", property.getProperty("spark.executor.cores"));
conf.put("spark.driver.memory", property.getProperty("spark.driver.memory"));
conf.put("spark.executor.memory", property.getProperty("spark.executor.memory"));
Are these the only ones that can be set/changed? Should we just pass along all spark.* properties?

@felixcheung Yes, sure, we should pass all spark.* properties, so I did that and made the Livy interpreter accept any configuration that belongs to Spark.
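A minimal sketch of the change discussed above (class and method names here are hypothetical, not the PR's actual code): instead of hard-coding `conf.put(...)` for a handful of known keys, forward every interpreter property whose key starts with `spark.` to the session configuration.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

public class SparkConfFilter {
  // Collect every property whose key starts with "spark." so any Spark
  // setting can be forwarded to Livy, not just four hard-coded ones.
  static Map<String, String> sparkProperties(Properties props) {
    Map<String, String> conf = new HashMap<>();
    for (String key : props.stringPropertyNames()) {
      if (key.startsWith("spark.")) {
        conf.put(key, props.getProperty(key));
      }
    }
    return conf;
  }

  public static void main(String[] args) {
    Properties props = new Properties();
    props.setProperty("spark.driver.cores", "2");
    props.setProperty("spark.executor.memory", "4g");
    props.setProperty("zeppelin.livy.url", "http://localhost:8998"); // not a spark.* key, not forwarded
    System.out.println(sparkProperties(props));
  }
}
```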
docs/interpreter/livy.md (Outdated)

</tr>
<tr>
<td>zeppelin.livy.master</td>
<td>spark.master</td>
@prabhjyotsingh what do you think about replacing "zeppelin.livy.master"?

Yes, agreed. @mfelgamal can you prefix all properties with livy.* to maintain project styling with respect to the rest of the interpreters?

@prabhjyotsingh I changed all Spark properties to start with livy.spark.*
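The renaming agreed on here can be sketched as follows (again with hypothetical names, assuming the `livy.` prefix is stripped before the values are handed to Livy as plain `spark.*` keys):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

public class LivyPrefixMapper {
  // Properties are exposed in Zeppelin as "livy.spark.*" to match the
  // interpreter naming convention; strip the "livy." prefix so Livy
  // receives ordinary "spark.*" keys.
  static Map<String, String> toSparkConf(Properties props) {
    Map<String, String> conf = new HashMap<>();
    for (String key : props.stringPropertyNames()) {
      if (key.startsWith("livy.spark.")) {
        conf.put(key.substring("livy.".length()), props.getProperty(key));
      }
    }
    return conf;
  }

  public static void main(String[] args) {
    Properties props = new Properties();
    props.setProperty("livy.spark.executor.cores", "1");
    System.out.println(toSparkConf(props)); // prints {spark.executor.cores=1}
  }
}
```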
docs/interpreter/livy.md (Outdated)

* Livy server.

### Configuration
We added some common configurations for spark, and you can set any configuration you want which should start with `livy.spark.`
Could you add a link to the Spark programming guide as to what can be set here? Also, please clarify that instead of starting with spark., it should be replaced with livy.spark., or similar.

@felixcheung Thanks for your review, I've made the required changes.
<td>livy.spark.executor.cores</td>
<td>1</td>
<td>Max number of SparkSQL result to display.</td>
</tr>
I think you should put these in the doc as well:
livy.spark.executor.memory
livy.spark.dynamicAllocation.enabled
livy.spark.dynamicAllocation.cachedExecutorIdleTimeout
livy.spark.dynamicAllocation.minExecutors
livy.spark.dynamicAllocation.initialExecutors
livy.spark.dynamicAllocation.maxExecutors
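As a sketch of how these might appear in the interpreter settings, assuming the livy.* prefix convention agreed on above (the values here are hypothetical defaults, not taken from the PR):

```properties
# Any key starting with livy.spark. is forwarded to Livy as spark.*
livy.spark.executor.memory=4g
livy.spark.dynamicAllocation.enabled=true
livy.spark.dynamicAllocation.cachedExecutorIdleTimeout=120s
livy.spark.dynamicAllocation.minExecutors=1
livy.spark.dynamicAllocation.initialExecutors=1
livy.spark.dynamicAllocation.maxExecutors=10
```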
Except for the doc thing, the rest is all good.
@prabhjyotsingh Thanks for your review, I modified the doc in the recent commit.

LGTM

Merging this if no more discussion.
What is this PR for?
Extending the Livy interpreter to allow manipulating Spark configurations from the Zeppelin web UI.
What type of PR is it?
Improvement
Todos
What is the Jira issue?
How should this be tested?
Screenshots (if appropriate)
Questions: