Description
Queries that produce interval-typed columns shouldn't fail in PySpark, i.e. the following
from pyspark.sql.functions import current_timestamp
spark.range(1).select(current_timestamp() - current_timestamp())
should result in
DataFrame[(current_timestamp() - current_timestamp()): interval]
instead of raising a ValueError.
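For reference, a minimal self-contained sketch of the reproduction (the SparkSession setup here is illustrative and not part of the original report):

from pyspark.sql import SparkSession
from pyspark.sql.functions import current_timestamp

# Illustrative local session; in a pyspark shell, `spark` already exists.
spark = SparkSession.builder.master("local[1]").getOrCreate()

# Subtracting two timestamps yields an interval-typed column; inspecting the
# resulting DataFrame should print its schema rather than raise ValueError.
df = spark.range(1).select(current_timestamp() - current_timestamp())
print(df)  # expected: DataFrame[(current_timestamp() - current_timestamp()): interval]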