[SPARK-33549][SQL] Remove configuration spark.sql.legacy.allowCastNumericToTimestamp #30493
Conversation
Test build #131731 has finished for PR 30493 at commit
> we don't need a separate configuration spark.sql.legacy.allowCastNumericToTimestamp for disallowing the conversion. Users just need to set spark.sql.ansi.enabled for the behavior.

Sorry, but users cannot enable spark.sql.ansi.enabled because it enforces many other things as a side effect. I'm not sure about removing this. Is it really okay?
cc @gatorsmile
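For context, a minimal spark-shell sketch (not from the original thread, assuming Spark 3.x ANSI semantics) of how spark.sql.ansi.enabled changes behavior beyond this one cast rule:

```scala
// Illustrative sketch only: enabling ANSI mode affects more than the
// numeric-to-timestamp cast, e.g. integer overflow handling.
spark.conf.set("spark.sql.ansi.enabled", "false")
spark.sql("SELECT 2147483647 + 1").show()   // wraps around silently

spark.conf.set("spark.sql.ansi.enabled", "true")
spark.sql("SELECT 2147483647 + 1").show()   // throws ArithmeticException at runtime
```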
@dongjoon-hyun I can understand your concern. But if we are going to have this configuration, then it is reasonable to add new feature flags for each disallowed conversion in ANSI mode (#30260).
According to the report in SPARK-31710, it was a correctness issue. This should be taken seriously. Are you really using ANSI mode in production?

I just tried the following in Spark 3.0.0 and there is no exception, as reported in https://issues.apache.org/jira/browse/SPARK-31710. (The Affects Version in the JIRA is 2.4.5.)
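(The snippet referred to by "the following" is not preserved in this excerpt; below is a hypothetical reconstruction of the kind of cast SPARK-31710 describes, assuming a Spark 3.0.0 spark-shell with default settings.)

```scala
// With the default (non-ANSI) settings, the cast succeeds and the number is
// interpreted as seconds since the Unix epoch rather than raising an error.
spark.sql("SELECT CAST(1586318188 AS TIMESTAMP)").show(false)
```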
It is possible when it is ready.

Got it, @gengliangwang.
Test build #131732 has finished for PR 30493 at commit
dongjoon-hyun left a comment:
+1 for the removal according to the discussion.
HyukjinKwon left a comment:
I am okay with this too.
| test("SPARK-31710: fail casting from numeric to timestamp if it is forbidden") { |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
> Users just need to set spark.sql.ansi.enabled for the behavior.

Just in case, should we keep the test and check these cases under the config above?
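A rough, hypothetical sketch of what keeping the check under that config could look like; it assumes a suite with Spark's SQL test helpers (sql, withSQLConf) in scope:

```scala
import org.apache.spark.sql.AnalysisException
import org.apache.spark.sql.internal.SQLConf

test("SPARK-31710: casting numeric to timestamp fails under ANSI mode") {
  withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
    // Each numeric type should be rejected at analysis time when ANSI mode is on.
    Seq("TINYINT", "SMALLINT", "INT", "BIGINT", "FLOAT", "DOUBLE").foreach { t =>
      intercept[AnalysisException] {
        sql(s"SELECT CAST(CAST(1 AS $t) AS TIMESTAMP)").collect()
      }
    }
  }
}
```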
GA passed, merging to master, thanks!
Test build #131742 has finished for PR 30493 at commit
What changes were proposed in this pull request?
Remove SQL configuration spark.sql.legacy.allowCastNumericToTimestamp
Why are the changes needed?
In the current master branch, there is a new configuration, spark.sql.legacy.allowCastNumericToTimestamp, which controls whether casting from numeric types to Timestamp is allowed. The default value is true.

After #30260, the type conversion between the Timestamp type and numeric types is disallowed in ANSI mode, so we don't need a separate configuration spark.sql.legacy.allowCastNumericToTimestamp to disallow the conversion. Users just need to set spark.sql.ansi.enabled for that behavior.

As the configuration has not shipped in any release yet, we should remove it to keep things simpler.
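For illustration (a sketch, not taken from the PR itself), the behavior this refers to on a build that includes #30260:

```scala
// Default (non-ANSI) behavior: the cast is allowed and the value is read as
// seconds since the Unix epoch.
spark.sql("SELECT CAST(0 AS TIMESTAMP)").show()

// With ANSI mode on, the same cast is rejected at analysis time, so no
// separate legacy flag is needed to disallow it.
spark.conf.set("spark.sql.ansi.enabled", "true")
spark.sql("SELECT CAST(0 AS TIMESTAMP)").show()   // throws AnalysisException
```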
Does this PR introduce any user-facing change?
No, since the configuration is not released yet.
How was this patch tested?
Existing test cases.