Description
Add a config `spark.sql.ansi.allowCastBetweenDatetimeAndNumeric` to allow casting between datetime and numeric types. The configuration defaults to `false`.
Also, casting double/float types to timestamp should raise an exception if the result overflows or the input is NaN/infinite.
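A minimal sketch of the intended behavior with the proposed config (result values and errors are illustrative and assume a UTC session time zone):

```sql
-- Illustrative only: exact values and error messages depend on the session
-- time zone and on how the proposal is finally implemented.
SET spark.sql.ansi.enabled = true;
SET spark.sql.ansi.allowCastBetweenDatetimeAndNumeric = true;

-- Casts between datetime and numeric are allowed again under ANSI mode.
SELECT CAST(TIMESTAMP '2022-01-01 00:00:00' AS BIGINT);  -- epoch seconds
SELECT CAST(1640995200 AS TIMESTAMP);                    -- 2022-01-01 00:00:00 (UTC)

-- Casting double/float to timestamp fails on overflow or NaN/infinite input.
SELECT CAST(CAST('NaN' AS DOUBLE) AS TIMESTAMP);         -- raises an exception
SELECT CAST(1e20 AS TIMESTAMP);                          -- raises an exception (overflow)
```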
This is for better adoption of ANSI SQL mode:
- From our data analysis, we found that many Spark SQL users actually use `Cast(Timestamp as Numeric)` and `Cast(Numeric as Timestamp)`. There are also some usages of `Cast(Date as Numeric)`.
- The Spark SQL connector for Tableau uses this feature for DateTime math, e.g.
`CAST(FROM_UNIXTIME(CAST(CAST(%1 AS BIGINT) + (%2 * 86400) AS BIGINT)) AS TIMESTAMP)`
So, having this new configuration gives users more flexibility when turning on ANSI mode.