Commit 7178ccf

[SPARK-21936][SQL] backward compatibility test framework for HiveExternalCatalog
`HiveExternalCatalog` is a semi-public interface. When creating tables, `HiveExternalCatalog` converts the table metadata to the Hive table format and saves it into the Hive metastore. It is very important to guarantee backward compatibility here, i.e., tables created by previous Spark versions should still be readable in newer Spark versions.

Previously we found backward compatibility issues manually, which makes it easy to miss bugs. This PR introduces a test framework that automatically tests `HiveExternalCatalog` backward compatibility by downloading Spark binaries of different versions, creating tables with those Spark versions, and reading the tables back with the current Spark version.

Test-only change.

Author: Wenchen Fan <[email protected]>

Closes #19148 from cloud-fan/test.
1 parent: 781a1f8
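
The approach described in the commit message can be pictured with a small sketch. The Scala code below is only an illustration of the idea, not the suite added by this PR: the downloaded Spark locations, the create_tables.py script, the warehouse path, and the table names are all hypothetical.

    import java.io.File

    import scala.sys.process._

    import org.apache.spark.sql.SparkSession

    // Sketch only: older Spark versions whose catalog metadata should stay readable.
    object BackwardCompatibilitySketch {
      val oldVersions = Seq("2.0.2", "2.1.1", "2.2.0")

      def main(args: Array[String]): Unit = {
        val warehouse = new File("target/compat-warehouse").getAbsolutePath

        // 1. With each old version's spark-submit (assumed to be downloaded and
        //    unpacked under target/spark-<version>), run a small script that
        //    creates a test table in a shared warehouse.
        oldVersions.foreach { version =>
          val sparkHome = s"target/spark-$version"
          Seq(
            s"$sparkHome/bin/spark-submit",
            "--conf", s"spark.sql.warehouse.dir=$warehouse",
            "create_tables.py", version).!
        }

        // 2. With the current Spark build, read back every table written by the
        //    old versions to make sure their metadata is still understood.
        val spark = SparkSession.builder()
          .master("local[2]")
          .config("spark.sql.warehouse.dir", warehouse)
          .enableHiveSupport()
          .getOrCreate()

        oldVersions.foreach { version =>
          val table = s"tbl_created_by_${version.replace('.', '_')}"
          assert(spark.table(table).count() > 0)
        }
      }
    }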

File tree

6 files changed: +301 -366 lines

sql/hive/pom.xml

Lines changed: 4 additions & 0 deletions
@@ -162,6 +162,10 @@
       <groupId>org.apache.thrift</groupId>
       <artifactId>libfb303</artifactId>
     </dependency>
+    <dependency>
+      <groupId>org.apache.derby</groupId>
+      <artifactId>derby</artifactId>
+    </dependency>
     <dependency>
       <groupId>org.scalacheck</groupId>
       <artifactId>scalacheck_${scala.binary.version}</artifactId>
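
A plausible reason for pulling in Derby here is that the new test needs an embedded, file-backed Hive metastore that several Spark processes (the downloaded old versions and the current build) can point at in turn. As a rough illustration only, with hypothetical paths and not taken from this PR, such a metastore could be configured like this:

    import org.apache.spark.sql.SparkSession

    // Illustration only: a Derby-backed embedded metastore at a fixed local path,
    // so tables registered by one Spark process remain visible to a later one.
    val spark = SparkSession.builder()
      .master("local[2]")
      .config("spark.sql.warehouse.dir", "target/compat-warehouse")
      .config("spark.hadoop.javax.jdo.option.ConnectionURL",
        "jdbc:derby:;databaseName=target/compat-metastore_db;create=true")
      .enableHiveSupport()
      .getOrCreate()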

sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveExternalCatalogBackwardCompatibilitySuite.scala

Lines changed: 0 additions & 264 deletions
This file was deleted.
