Details
- Type: Bug
- Status: Reopened
- Priority: Major
- Resolution: Unresolved
- Affects Version/s: 2.2.0, 3.0.0
- Fix Version/s: None
- Component/s: None
Description
Currently, when there are Javadoc breaks, sbt unidoc seems to print warnings as errors.
For example, the actual errors in https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77070/consoleFull were as below:
[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/core/target/java/org/apache/spark/scheduler/HighlyCompressedMapStatus.java:4: error: reference not found
[error] * than both {@link config.SHUFFLE_ACCURATE_BLOCK_THRESHOLD} and
[error] ^
[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/core/target/java/org/apache/spark/scheduler/HighlyCompressedMapStatus.java:5: error: reference not found
[error] * {@link config.SHUFFLE_ACCURATE_BLOCK_THRESHOLD_BY_TIMES_AVERAGE} * averageSize. It stores the
[error] ^
but it also prints many errors from the generated Java code, as below:
[info] Constructing Javadoc information...
[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/core/target/java/org/apache/spark/scheduler/BlacklistTracker.java:117: error: ExecutorAllocationClient is not public in org.apache.spark; cannot be accessed from outside package
[error] public BlacklistTracker (org.apache.spark.scheduler.LiveListenerBus listenerBus, org.apache.spark.SparkConf conf, scala.Option<org.apache.spark.ExecutorAllocationClient> allocationClient, org.apache.spark.util.Clock clock) { throw new RuntimeException(); }
[error] ^
[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/core/target/java/org/apache/spark/scheduler/BlacklistTracker.java:118: error: ExecutorAllocationClient is not public in org.apache.spark; cannot be accessed from outside package
[error] public BlacklistTracker (org.apache.spark.SparkContext sc, scala.Option<org.apache.spark.ExecutorAllocationClient> allocationClient) { throw new RuntimeException(); }
[error] ^
[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/core/target/java/org/apache/spark/SparkConf.java:133: error: ConfigReader is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[error] private org.apache.spark.internal.config.ConfigReader reader () { throw new RuntimeException(); }
[error] ^
[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/core/target/java/org/apache/spark/SparkConf.java:138: error: ConfigEntry is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[error] <T extends java.lang.Object> org.apache.spark.SparkConf set (org.apache.spark.internal.config.ConfigEntry<T> entry, T value) { throw new RuntimeException(); }
[error] ^
[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/core/target/java/org/apache/spark/SparkConf.java:139: error: OptionalConfigEntry is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[error] <T extends java.lang.Object> org.apache.spark.SparkConf set (org.apache.spark.internal.config.OptionalConfigEntry<T> entry, T value) { throw new RuntimeException(); }
[error] ^
[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/core/target/java/org/apache/spark/SparkConf.java:187: error: ConfigEntry is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[error] <T extends java.lang.Object> org.apache.spark.SparkConf setIfMissing (org.apache.spark.internal.config.ConfigEntry<T> entry, T value) { throw new RuntimeException(); }
[error] ^
[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/core/target/java/org/apache/spark/SparkConf.java:188: error: OptionalConfigEntry is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[error] <T extends java.lang.Object> org.apache.spark.SparkConf setIfMissing (org.apache.spark.internal.config.OptionalConfigEntry<T> entry, T value) { throw new RuntimeException(); }
[error] ^
[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/core/target/java/org/apache/spark/SparkConf.java:208: error: ConfigEntry is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[error] org.apache.spark.SparkConf remove (org.apache.spark.internal.config.ConfigEntry<?> entry) { throw new RuntimeException(); }
[error] ...
These same diagnostics are printed as warnings in a successful build without Javadoc breaks, as below (https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-sbt-hadoop-2.7/2908/consoleFull):
[info] Constructing Javadoc information...
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/core/target/java/org/apache/spark/scheduler/BlacklistTracker.java:117: error: ExecutorAllocationClient is not public in org.apache.spark; cannot be accessed from outside package
[warn] public BlacklistTracker (org.apache.spark.scheduler.LiveListenerBus listenerBus, org.apache.spark.SparkConf conf, scala.Option<org.apache.spark.ExecutorAllocationClient> allocationClient, org.apache.spark.util.Clock clock) { throw new RuntimeException(); }
[warn] ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/core/target/java/org/apache/spark/scheduler/BlacklistTracker.java:118: error: ExecutorAllocationClient is not public in org.apache.spark; cannot be accessed from outside package
[warn] public BlacklistTracker (org.apache.spark.SparkContext sc, scala.Option<org.apache.spark.ExecutorAllocationClient> allocationClient) { throw new RuntimeException(); }
[warn] ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/core/target/java/org/apache/spark/SparkConf.java:133: error: ConfigReader is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[warn] private org.apache.spark.internal.config.ConfigReader reader () { throw new RuntimeException(); }
[warn] ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/core/target/java/org/apache/spark/SparkConf.java:138: error: ConfigEntry is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[warn] <T extends java.lang.Object> org.apache.spark.SparkConf set (org.apache.spark.internal.config.ConfigEntry<T> entry, T value) { throw new RuntimeException(); }
[warn] ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/core/target/java/org/apache/spark/SparkConf.java:139: error: OptionalConfigEntry is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[warn] <T extends java.lang.Object> org.apache.spark.SparkConf set (org.apache.spark.internal.config.OptionalConfigEntry<T> entry, T value) { throw new RuntimeException(); }
[warn] ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/core/target/java/org/apache/spark/SparkConf.java:187: error: ConfigEntry is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[warn] <T extends java.lang.Object> org.apache.spark.SparkConf setIfMissing (org.apache.spark.internal.config.ConfigEntry<T> entry, T value) { throw new RuntimeException(); }
[warn] ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/core/target/java/org/apache/spark/SparkConf.java:188: error: OptionalConfigEntry is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[warn] <T extends java.lang.Object> org.apache.spark.SparkConf setIfMissing (org.apache.spark.internal.config.OptionalConfigEntry<T> entry, T value) { throw new RuntimeException(); }
[warn] ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/core/target/java/org/apache/spark/SparkConf.java:208: error: ConfigEntry is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[warn] org.apache.spark.SparkConf remove (org.apache.spark.internal.config.ConfigEntry<?> entry) { throw new RuntimeException(); }
[warn] ...
These are warnings, not errors, as far as javadoc itself is concerned; however, once we introduce a Javadoc break, sbt appears to report the other warnings as errors too when generating the javadoc.
For example, with the Java file A.java below:
/**
 * Hi
 */
public class A extends B {
}
if we run javadoc:
javadoc A.java
it produces a warning because it cannot find the symbol B, but it still seems to generate the documentation fine:
Loading source file A.java...
Constructing Javadoc information...
A.java:4: error: cannot find symbol
public class A extends B {
                       ^
  symbol: class B
Standard Doclet version 1.8.0_45
Building tree for all the packages and classes...
Generating ./A.html...
Generating ./package-frame.html...
Generating ./package-summary.html...
Generating ./package-tree.html...
Generating ./constant-values.html...
Building index for all the packages and classes...
Generating ./overview-tree.html...
Generating ./index-all.html...
Generating ./deprecated-list.html...
Building index for all classes...
Generating ./allclasses-frame.html...
Generating ./allclasses-noframe.html...
Generating ./index.html...
Generating ./help-doc.html...
1 warning
However, if we have a Javadoc break in the comments, as below:
/**
 * Hi
 * @see B
 */
public class A extends B {
}
this produces both an error and a warning:
Loading source file A.java...
Constructing Javadoc information...
A.java:5: error: cannot find symbol
public class A extends B {
                       ^
  symbol: class B
Standard Doclet version 1.8.0_45
Building tree for all the packages and classes...
Generating ./A.html...
A.java:3: error: reference not found
 * @see B
        ^
Generating ./package-frame.html...
Generating ./package-summary.html...
Generating ./package-tree.html...
Generating ./constant-values.html...
Building index for all the packages and classes...
Generating ./overview-tree.html...
Generating ./index-all.html...
Generating ./deprecated-list.html...
Building index for all classes...
Generating ./allclasses-frame.html...
Generating ./allclasses-noframe.html...
Generating ./index.html...
Generating ./help-doc.html...
1 error
1 warning
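As a sanity check, the two runs above can also be reproduced programmatically through the standard javax.tools.DocumentationTool API. The sketch below is mine (the "docs" output directory is arbitrary), and the expected exit codes are inferred from the two logs above, not verified inside sbt:

import javax.tools.ToolProvider

object JavadocExitCode {
  def main(args: Array[String]): Unit = {
    val javadoc = ToolProvider.getSystemDocumentationTool()
    // Passing null for the streams defaults to System.in/out/err.
    val exitCode = javadoc.run(null, null, null, "-d", "docs", "A.java")
    // Expectation from the logs above: 0 for the first A.java
    // ("1 warning" only), non-zero for the second ("1 error 1 warning")
    // once the @see B reference cannot be resolved.
    println(s"javadoc exit code: $exitCode")
  }
}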
It seems sbt unidoc reports both errors and warnings as [error] when there are Javadoc breaks (related context appears to be described in https://github.com/sbt/sbt/issues/875#issuecomment-24315400); see the sketch below.
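To illustrate the conflation (a simplified sketch, not sbt's actual implementation): javadoc writes all of its diagnostics to stderr with no machine-readable severity per line, so a forwarder that keys the log level off the overall outcome of the process ends up tagging every line uniformly, which matches the two Jenkins logs above ([warn] everywhere in the clean build, [error] everywhere once a single break makes the run fail):

object LogForwarderSketch {
  // Not sbt's real code: pick one tag for the whole stream based on
  // the javadoc exit code, since the per-line text gives no severity.
  def forward(stderrLines: Seq[String], exitCode: Int): Seq[String] = {
    val tag = if (exitCode != 0) "[error]" else "[warn]"
    stderrLines.map(line => s"$tag $line")
  }
}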
Given my observations so far, it is generally okay to just fix the errors counted at the bottom ([info] # errors), which are usually produced in the HTML-generating "Building tree for all the packages and classes..." phase; a sketch of reading that count follows.
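For example, a hypothetical check along these lines would trust only the summary javadoc prints at the end and ignore the per-line [error] tags (the helper name and regex are mine, for illustration):

object JavadocSummary {
  // Matches "1 error", "2 errors", ... from the trailing summary.
  // Diagnostic lines such as "A.java:5: error: ..." do not match
  // because a colon, not whitespace, follows the digits there.
  private val summaryErrors = """(\d+)\s+errors?\b""".r

  def realErrorCount(javadocOutput: String): Int =
    summaryErrors.findFirstMatchIn(javadocOutput).map(_.group(1).toInt).getOrElse(0)
}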
Essentially, this looks like two bugs: one in GenJavaDoc, which generates invalid Java code, and one in sbt, which fails to distinguish warnings from errors in this case.
As a result, the messages shown in Jenkins are confusing.
Issue Links
- is duplicated by SPARK-21185 "Spurious errors in unidoc causing PRs to fail" (Resolved)