fix: Mark cast from float/double to decimal as incompatible #1372
andygrove merged 9 commits into apache:main
Conversation
I don't understand the following test failure:
```scala
val table = "t1"
val table = s"final_decimal_avg_$dictionaryEnabled"
```
These changes ended up not being strictly necessary, but the test had a mix of hard-coded `t1` references and uses of the `$tableName` variable in its SQL, so I made them consistent.
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

```
@@            Coverage Diff             @@
##             main    #1372       +/-  ##
=============================================
- Coverage     56.12%   39.17%   -16.96%
- Complexity      976     2065     +1089
=============================================
  Files           119      262      +143
  Lines         11743    60327    +48584
  Branches       2251    12836    +10585
=============================================
+ Hits           6591    23631    +17040
- Misses         4012    32223    +28211
- Partials       1140     4473     +3333
```

☔ View full report in Codecov by Sentry.
```scala
withTable(tableName) {
  val table = spark.read.parquet(filename).coalesce(1)
  table.createOrReplaceTempView(tableName)
  checkSparkAnswer(s"SELECT c1, avg(c7) FROM $tableName GROUP BY c1 ORDER BY c1")
}
```
Would you mind adding `// https://github.com/apache/datafusion-comet/issues/1371` here, and mentioning that this should switch to `checkSparkAnswerAndNumOfAggregates` once that issue is resolved?
Which issue does this PR close?
Closes #1354
Follow on issue: #1371
Rationale for this change
Fixes a correctness issue with casting from float/double to decimal.
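For background on why such casts are a correctness hazard (a hedged illustration, not the actual Comet implementation): binary floating-point values cannot represent most decimal fractions exactly, so two engines can legitimately disagree on which decimal a given float should become. A minimal JVM sketch:

```java
import java.math.BigDecimal;

public class FloatDecimalDemo {
    public static void main(String[] args) {
        float f = 0.1f;
        // BigDecimal(double) exposes the exact binary value stored in the float:
        System.out.println(new BigDecimal(f));
        // prints 0.100000001490116119384765625
        // Converting via the decimal string form gives the "intuitive" answer:
        System.out.println(new BigDecimal(Float.toString(f)));
        // prints 0.1
    }
}
```

Depending on which of these two interpretations (exact binary value vs. shortest decimal string) an engine uses before rounding to the target decimal scale, the resulting decimal values can differ, which is why marking the cast as incompatible until behavior matches Spark is the safe choice.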
What changes are included in this PR?
How are these changes tested?
New test + existing tests