Description
When ANSI mode is off, `ArrayType(DoubleType, containsNull = false)` cannot be cast to `ArrayType(IntegerType, containsNull = false)`, since an overflow can produce null results and break the non-null constraint.
When ANSI mode is on, Spark SQL currently shows the same behavior. However, this is not correct: under ANSI mode the non-null constraint cannot be broken, so Spark SQL can simply execute the cast and throw a runtime error on overflow, just as it does when casting `DoubleType` to `IntegerType`.
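For illustration, here is a minimal spark-shell sketch of the intended ANSI behavior (the session setup is illustrative, and the exact error class and message on overflow depend on the Spark version):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[1]").getOrCreate()
spark.conf.set("spark.sql.ansi.enabled", "true")

// array(1.0D, 2.0D) has type array<double> with containsNull = false.
// Casting it to array<int> should be allowed and succeed, since no element overflows.
spark.sql("SELECT CAST(array(1.0D, 2.0D) AS array<int>)").show()

// With an overflowing element, the cast should still be allowed at analysis time
// but fail with a runtime error, just like CAST(3.0E10D AS INT) under ANSI mode.
// spark.sql("SELECT CAST(array(3.0E10D) AS array<int>)").show()
```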
This applies to `MapType` and `StructType` as well.