Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Affects Version/s: 3.1.2, 3.2.1, 3.3.0
- Labels: None
Description
While working on SPARK-38204, I found that the parameters are swapped, which led to a failure in a new test (I disabled the schema check for now in the PR for SPARK-38204).
The swap allows a nullable column to be stored into a non-nullable column, which should be prohibited. This is less likely to cause runtime problems, since the state schema is conceptual and a row can be stored even if it does not respect it.
The worse problem is in the opposite direction: storing a non-nullable column into a nullable column is disallowed, even though it should be allowed. Spark fails the query in this case.
We should fix this so that the latter case is allowed properly.
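To illustrate the effect, here is a minimal, hypothetical sketch in Python (not Spark's actual Scala code; the names `store_allowed`, `value_nullable`, and `slot_nullable` are invented for illustration). Swapping the arguments of the nullability check inverts both outcomes described above:

```python
def store_allowed(value_nullable: bool, slot_nullable: bool) -> bool:
    """Hypothetical nullability check: a value may be stored into a schema
    slot unless the value is nullable and the slot is non-nullable."""
    return slot_nullable or not value_nullable

def check_swapped(value_nullable: bool, slot_nullable: bool) -> bool:
    # BUG: arguments passed in the wrong order.
    return store_allowed(slot_nullable, value_nullable)

# Correct argument order:
assert store_allowed(False, True)      # non-nullable -> nullable: allowed
assert not store_allowed(True, False)  # nullable -> non-nullable: prohibited

# With the arguments swapped, both outcomes invert:
assert check_swapped(True, False)      # wrongly allows nullable -> non-nullable
assert not check_swapped(False, True)  # wrongly rejects non-nullable -> nullable
```

The second wrong outcome is the one that actually fails queries: a legitimate non-nullable-to-nullable store is rejected by the compatibility check.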