
Commit 5647d37: Rebase and solve conflicts

2 parents: 09bacd8 + 5754670

50 files changed: +1349 / -597 lines


docs/development/howtocontributewebsite.md

Lines changed: 1 addition & 1 deletion

@@ -62,7 +62,7 @@ When you are ready, just make a pull-request.
 
 ## Alternative way
 
-You can directly edit `.md` files in `/docs/` directory at the web interface of github and make pull-request immediatly.
+You can directly edit `.md` files in `/docs/` directory at the web interface of github and make pull-request immediately.
 
 ## Stay involved
 Contributors should join the Zeppelin mailing lists.

docs/install/install.md

Lines changed: 1 addition & 1 deletion

@@ -188,7 +188,7 @@ Congratulations, you have successfully installed Apache Zeppelin! Here are two n
 * If you need more configuration for Apache Zeppelin, jump to the next section: [Apache Zeppelin Configuration](#apache-zeppelin-configuration).
 
 #### If you need more information about Spark or JDBC interpreter settings...
-* Apache Zeppelin provides deep integration with [Apache Spark](http://spark.apache.org/). For more informtation, see [Spark Interpreter for Apache Zeppelin](../interpreter/spark.html).
+* Apache Zeppelin provides deep integration with [Apache Spark](http://spark.apache.org/). For more information, see [Spark Interpreter for Apache Zeppelin](../interpreter/spark.html).
 * You can also use generic JDBC connections in Apache Zeppelin. Go to [Generic JDBC Interpreter for Apache Zeppelin](../interpreter/jdbc.html).
 
 #### If you are in a multi-user environment...

docs/install/virtual_machine.md

Lines changed: 1 addition & 1 deletion

@@ -75,7 +75,7 @@ into a directory on your host machine, or directly in your virtual machine.
 
 Cloning Zeppelin into the `/scripts/vagrant/zeppelin-dev` directory from the host, will allow the directory to be shared between your host and the guest machine.
 
-Cloning the project again may seem counter intuitive, since this script likley originated from the project repository. Consider copying just the vagrant/zeppelin-dev script from the Zeppelin project as a stand alone directory, then once again clone the specific branch you wish to build.
+Cloning the project again may seem counter intuitive, since this script likely originated from the project repository. Consider copying just the vagrant/zeppelin-dev script from the Zeppelin project as a stand alone directory, then once again clone the specific branch you wish to build.
 
 Synced folders enable Vagrant to sync a folder on the host machine to the guest machine, allowing you to continue working on your project's files on your host machine, but use the resources in the guest machine to compile or run your project. _[(1) Synced Folder Description from Vagrant Up](https://docs.vagrantup.com/v2/synced-folders/index.html)_
 

docs/manual/interpreterinstallation.md

Lines changed: 1 addition & 1 deletion

@@ -85,7 +85,7 @@ If you install one of these interpreters only with `--name` option, installer wi
 ```
 
 #### Install Spark interpreter built with Scala 2.10
-Spark distribution package has been built with Scala 2.10 until 1.6.2. If you have `SPARK_HOME` set pointing to Spark version ealier than 2.0.0, you need to download Spark interpreter packaged with Scala 2.10. To do so, use follow command:
+Spark distribution package has been built with Scala 2.10 until 1.6.2. If you have `SPARK_HOME` set pointing to Spark version earlier than 2.0.0, you need to download Spark interpreter packaged with Scala 2.10. To do so, use follow command:
 
 ```
 rm -rf ./interpreter/spark

docs/manual/interpreters.md

Lines changed: 1 addition & 1 deletion

@@ -79,7 +79,7 @@ interpreter.start()
 
 ```
 
-The above code will start interpreter thread inside your process. Once the interpreter is started you can configure zeppelin to connect to RemoteInterpreter by checking **Connect to existing process** checkbox and then provide **Host** and **Port** on which interpreter porocess is listening as shown in the image below:
+The above code will start interpreter thread inside your process. Once the interpreter is started you can configure zeppelin to connect to RemoteInterpreter by checking **Connect to existing process** checkbox and then provide **Host** and **Port** on which interpreter process is listening as shown in the image below:
 
 <img src="../assets/themes/zeppelin/img/screenshots/existing_interpreter.png" width="450px">
 

docs/manual/notebookashomepage.md

Lines changed: 5 additions & 5 deletions

@@ -59,17 +59,17 @@ or ```zeppelin.notebook.homescreen.hide``` property to hide the new notebook fro
 Restart your Zeppelin server
 
 ```
-./bin/zeppelin-deamon stop
-./bin/zeppelin-deamon start
+./bin/zeppelin-daemon stop
+./bin/zeppelin-daemon start
 ```
 That's it! Open your browser and navigate to Apache Zeppelin and see your customized homepage.
 
 <br />
-## Show notebooks list in your custom homepage
-If you want to display the list of notebooks on your custom Apache Zeppelin homepage all
+## Show notes list in your custom homepage
+If you want to display the list of notes on your custom Apache Zeppelin homepage all
 you need to do is use our %angular support.
 
-Add the following code to a paragraph in you home page and run it... walla! you have your notebooks list.
+Add the following code to a paragraph in you home page and run it... Voila! You have your notes list.
 
 ```javascript
 println(

docs/quickstart/install_with_flink_and_spark_cluster.md

Lines changed: 6 additions & 0 deletions

@@ -18,6 +18,12 @@ See the License for the specific language governing permissions and
 limitations under the License.
 -->
 
+{% include JB/setup %}
+
+# Install with flink and spark cluster
+
+<div id="toc"></div>
+
 This tutorial is extremely entry-level. It assumes no prior knowledge of Linux, git, or other tools. If you carefully type what I tell you when I tell you, you should be able to get Zeppelin running.
 
 ## Installing Zeppelin with Flink and Spark in cluster mode

elasticsearch/src/main/java/org/apache/zeppelin/elasticsearch/ElasticsearchInterpreter.java

Lines changed: 32 additions & 7 deletions

@@ -22,6 +22,7 @@
 import java.util.ArrayList;
 import java.util.Arrays;
 import java.util.HashMap;
+import java.util.HashSet;
 import java.util.Iterator;
 import java.util.LinkedList;
 import java.util.List;
@@ -35,7 +36,6 @@
 import org.apache.commons.lang.StringUtils;
 import org.apache.zeppelin.interpreter.Interpreter;
 import org.apache.zeppelin.interpreter.InterpreterContext;
-import org.apache.zeppelin.interpreter.InterpreterPropertyBuilder;
 import org.apache.zeppelin.interpreter.InterpreterResult;
 import org.apache.zeppelin.interpreter.thrift.InterpreterCompletion;
 import org.elasticsearch.action.delete.DeleteResponse;
@@ -48,6 +48,8 @@
 import org.elasticsearch.client.transport.TransportClient;
 import org.elasticsearch.common.settings.Settings;
 import org.elasticsearch.common.transport.InetSocketTransportAddress;
+import org.elasticsearch.common.xcontent.XContentBuilder;
+import org.elasticsearch.common.xcontent.XContentFactory;
 import org.elasticsearch.common.xcontent.XContentHelper;
 import org.elasticsearch.index.query.QueryBuilders;
 import org.elasticsearch.search.SearchHit;
@@ -437,14 +439,37 @@ else if (agg instanceof InternalSingleBucketAggregation) {
       resMsg = XContentHelper.toString((InternalSingleBucketAggregation) agg).toString();
     }
     else if (agg instanceof InternalMultiBucketAggregation) {
-      final StringBuffer buffer = new StringBuffer("key\tdoc_count");
-
+      final Set<String> headerKeys = new HashSet<>();
+      final List<Map<String, Object>> buckets = new LinkedList<>();
       final InternalMultiBucketAggregation multiBucketAgg = (InternalMultiBucketAggregation) agg;
+
       for (MultiBucketsAggregation.Bucket bucket : multiBucketAgg.getBuckets()) {
-        buffer.append("\n")
-          .append(bucket.getKeyAsString())
-          .append("\t")
-          .append(bucket.getDocCount());
+        try {
+          final XContentBuilder builder = XContentFactory.jsonBuilder();
+          bucket.toXContent(builder, null);
+          final Map<String, Object> bucketMap = JsonFlattener.flattenAsMap(builder.string());
+          headerKeys.addAll(bucketMap.keySet());
+          buckets.add(bucketMap);
+        }
+        catch (IOException e) {
+          logger.error("Processing bucket: " + e.getMessage(), e);
+        }
+      }
+
+      final StringBuffer buffer = new StringBuffer();
+      final String[] keys = headerKeys.toArray(new String[0]);
+      for (String key: keys) {
+        buffer.append("\t" + key);
+      }
+      buffer.deleteCharAt(0);
+
+      for (Map<String, Object> bucket : buckets) {
+        buffer.append("\n");
+
+        for (String key: keys) {
+          buffer.append(bucket.get(key)).append("\t");
+        }
+        buffer.deleteCharAt(buffer.length() - 1);
       }
 
       resType = InterpreterResult.Type.TABLE;
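The multi-bucket hunk above replaces a fixed `key\tdoc_count` table with a generic one: each bucket is flattened into a map with dotted keys, the table header becomes the union of all bucket keys, and each row is the tab-separated values for those keys. A minimal, self-contained sketch of that technique follows; here `flatten` is a hypothetical stand-in for the `JsonFlattener.flattenAsMap` call used in the patch, operating on nested maps instead of a JSON string.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class BucketTableSketch {

  // Hypothetical stand-in for JsonFlattener.flattenAsMap: nested maps become
  // dotted keys, e.g. {"sum_length": {"value": 1234.0}} -> "sum_length.value".
  @SuppressWarnings("unchecked")
  static void flatten(String prefix, Map<String, Object> node, Map<String, Object> out) {
    for (Map.Entry<String, Object> e : node.entrySet()) {
      String key = prefix.isEmpty() ? e.getKey() : prefix + "." + e.getKey();
      if (e.getValue() instanceof Map) {
        flatten(key, (Map<String, Object>) e.getValue(), out);
      } else {
        out.put(key, e.getValue());
      }
    }
  }

  // Mirrors the patch: header is the union of all flattened bucket keys,
  // rows are tab-separated values (keys a bucket lacks print as "null").
  static String toTable(List<Map<String, Object>> buckets) {
    // The patch uses HashSet, so real column order is unspecified;
    // LinkedHashSet is used here only to keep the example deterministic.
    Set<String> headerKeys = new LinkedHashSet<>();
    List<Map<String, Object>> flat = new ArrayList<>();
    for (Map<String, Object> bucket : buckets) {
      Map<String, Object> flatBucket = new LinkedHashMap<>();
      flatten("", bucket, flatBucket);
      headerKeys.addAll(flatBucket.keySet());
      flat.add(flatBucket);
    }

    StringBuilder buffer = new StringBuilder();
    for (String key : headerKeys) {
      buffer.append("\t").append(key);
    }
    buffer.deleteCharAt(0);  // drop leading tab, as in the patch

    for (Map<String, Object> bucket : flat) {
      buffer.append("\n");
      for (String key : headerKeys) {
        buffer.append(bucket.get(key)).append("\t");
      }
      buffer.deleteCharAt(buffer.length() - 1);  // drop trailing tab
    }
    return buffer.toString();
  }

  public static void main(String[] args) {
    // One bucket as a terms aggregation with a sum sub-aggregation might return it.
    Map<String, Object> b1 = new LinkedHashMap<>();
    b1.put("key", "200");
    b1.put("doc_count", 7);
    b1.put("sum_length", Map.of("value", 1234.0));
    // Prints a two-line tab-separated table: header row, then the value row.
    System.out.println(toTable(List.of(b1)));
  }
}
```

This is why the test added below exercises a nested sub-aggregation: the old code could only render `key` and `doc_count`, while flattened keys carry sub-aggregation results into their own columns.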

elasticsearch/src/test/java/org/apache/zeppelin/elasticsearch/ElasticsearchInterpreterTest.java

Lines changed: 11 additions & 1 deletion

@@ -21,7 +21,12 @@
 import static org.junit.Assert.assertEquals;
 
 import java.io.IOException;
-import java.util.*;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Date;
+import java.util.List;
+import java.util.Properties;
+import java.util.UUID;
 
 import org.apache.commons.lang.math.RandomUtils;
 import org.apache.zeppelin.interpreter.InterpreterResult;
@@ -178,6 +183,11 @@ public void testAgg() {
     res = interpreter.interpret("search /logs { \"aggs\" : { \"status_count\" : " +
       " { \"terms\" : { \"field\" : \"status\" } } } }", null);
     assertEquals(Code.SUCCESS, res.code());
+
+    res = interpreter.interpret("search /logs { \"aggs\" : { " +
+      " \"length\" : { \"terms\": { \"field\": \"status\" }, " +
+      " \"aggs\" : { \"sum_length\" : { \"sum\" : { \"field\" : \"content_length\" } }, \"sum_status\" : { \"sum\" : { \"field\" : \"status\" } } } } } }", null);
+    assertEquals(Code.SUCCESS, res.code());
   }
 
   @Test

python/src/main/java/org/apache/zeppelin/python/PythonInterpreter.java

Lines changed: 4 additions & 6 deletions

@@ -68,7 +68,7 @@ public PythonInterpreter(Properties property) {
 
   @Override
   public void open() {
-    LOG.info("Starting Python interpreter .....");
+    LOG.info("Starting Python interpreter ---->");
     LOG.info("Python path is set to:" + property.getProperty(ZEPPELIN_PYTHON));
 
     maxResult = Integer.valueOf(getProperty(MAX_RESULT));
@@ -111,7 +111,7 @@ public void open() {
 
   @Override
   public void close() {
-    LOG.info("closing Python interpreter .....");
+    LOG.info("closing Python interpreter <----");
     try {
       if (process != null) {
         process.close();
@@ -134,11 +134,9 @@ public InterpreterResult interpret(String cmd, InterpreterContext contextInterpr
 
     InterpreterResult result;
     if (pythonErrorIn(output)) {
-      result = new InterpreterResult(Code.ERROR, output);
+      result = new InterpreterResult(Code.ERROR, output.replaceAll("\\.\\.\\.", ""));
     } else {
-      // TODO(zjffdu), we should not do string replacement operation in the result, as it is
-      // possible that the output contains the kind of pattern itself, e.g. print("...")
-      result = new InterpreterResult(Code.SUCCESS, output.replaceAll("\\.\\.\\.", ""));
+      result = new InterpreterResult(Code.SUCCESS, output);
     }
     return result;
   }
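The last hunk moves the `"\\.\\.\\."` stripping from successful output to error output, consistent with the removed TODO: a blanket `replaceAll` on success output deletes any literal `...` the user's own code printed, not just interpreter continuation-prompt noise. A small sketch of that failure mode, where `strip` is a hypothetical helper reproducing the old behaviour:

```java
public class EllipsisStripSketch {

  // Hypothetical helper reproducing the old success-path behaviour:
  // drop every literal "..." from the captured interpreter output.
  static String strip(String output) {
    return output.replaceAll("\\.\\.\\.", "");
  }

  public static void main(String[] args) {
    // Continuation-prompt noise is removed, as intended:
    System.out.println(strip("... done"));        // prints " done"
    // ...but legitimate user output is mangled too, e.g. from print("..."):
    System.out.println(strip("output was ..."));  // prints "output was "
  }
}
```

Restricting the stripping to the error path trades one imperfection for a smaller one: error text may still lose literal `...`, but normal results are now passed through untouched.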
