Description
Filtering a DataFrame read back from Parquet returns a row that should not match the predicate:

scala> val df = Seq(("abc", 1), (null, 3)).toDF("col1", "col2")
df: org.apache.spark.sql.DataFrame = [col1: string, col2: int]

scala> df.write.mode("overwrite").parquet("/tmp/test1")

scala> val df2 = spark.read.parquet("/tmp/test1")
df2: org.apache.spark.sql.DataFrame = [col1: string, col2: int]

scala> df2.filter("col1 = 'abc' OR (col1 != 'abc' AND col2 == 3)").show()
+----+----+
|col1|col2|
+----+----+
| abc|   1|
|null|   3|
+----+----+

Under SQL three-valued logic, both `col1 = 'abc'` and `col1 != 'abc'` evaluate to NULL when col1 is NULL, so the whole predicate is NULL for the row (null, 3) and that row should be excluded; the expected result is the single row (abc, 1). Since the repro round-trips through Parquet, the incorrect row suggests the predicate is being mishandled somewhere on the Parquet read path (e.g. during filter pushdown).
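To make the expected semantics concrete, here is a minimal sketch (plain Scala, no Spark required) that models SQL three-valued logic with `Option[Boolean]`, where `None` stands for NULL. All names here (`ThreeValuedLogic`, `and`, `or`, `eq`, `neq`) are illustrative helpers, not Spark APIs:

```scala
// Sketch of SQL three-valued logic: None models NULL.
object ThreeValuedLogic {
  type TriBool = Option[Boolean]

  // SQL AND: FALSE dominates, otherwise NULL propagates
  def and(a: TriBool, b: TriBool): TriBool = (a, b) match {
    case (Some(false), _) | (_, Some(false)) => Some(false)
    case (Some(true), Some(true))            => Some(true)
    case _                                   => None
  }

  // SQL OR: TRUE dominates, otherwise NULL propagates
  def or(a: TriBool, b: TriBool): TriBool = (a, b) match {
    case (Some(true), _) | (_, Some(true)) => Some(true)
    case (Some(false), Some(false))        => Some(false)
    case _                                 => None
  }

  // Comparing a NULL operand yields NULL
  def eq(a: Option[String], v: String): TriBool  = a.map(_ == v)
  def neq(a: Option[String], v: String): TriBool = a.map(_ != v)

  def main(args: Array[String]): Unit = {
    // Row (null, 3) against: col1 = 'abc' OR (col1 != 'abc' AND col2 == 3)
    val col1: Option[String] = None
    val col2 = 3
    val pred = or(eq(col1, "abc"), and(neq(col1, "abc"), Some(col2 == 3)))
    println(pred) // prints None: a WHERE clause keeps a row only when the predicate is TRUE
  }
}
```

Evaluating the filter this way shows the predicate is NULL (not TRUE) for the (null, 3) row, so a correct scan must drop it.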