Description
In Spark Connect, `DataFrame.join` raises `NotImplementedError` for the join type `left_outer`, even though the error message itself lists `left_outer` among the supported join types:
```
>>> df = spark.range(1)
>>> df2 = spark.range(2)
>>> df.join(df2, how="left_outer")
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/xinrong.meng/spark/python/pyspark/sql/connect/dataframe.py", line 438, in join
plan.Join(left=self._plan, right=other._plan, on=on, how=how),
File "/Users/xinrong.meng/spark/python/pyspark/sql/connect/plan.py", line 730, in __init__
raise NotImplementedError(
NotImplementedError:
Unsupported join type: left_outer. Supported join types include:
"inner", "outer", "full", "fullouter", "full_outer",
"leftouter", "left", "left_outer", "rightouter",
"right", "right_outer", "leftsemi", "left_semi",
"semi", "leftanti", "left_anti", "anti", "cross",
```
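This looks like a mismatch between the set of join-type strings the check accepts and the aliases the error message advertises. One way to avoid this class of bug is to normalize the user-supplied string (lowercase, strip underscores) before looking it up, so that `left_outer`, `leftouter`, and `LEFT_OUTER` all resolve to the same canonical type. The sketch below is hypothetical and not the actual Spark Connect implementation; the `normalize_join_type` helper and `SUPPORTED_JOIN_TYPES` mapping are illustrative names:

```python
# Hypothetical sketch: map every advertised alias (with underscores and
# case removed) to one canonical join type, so the accepted set can never
# drift out of sync with the aliases listed in the error message.
SUPPORTED_JOIN_TYPES = {
    "inner": "inner",
    "outer": "full_outer",
    "full": "full_outer",
    "fullouter": "full_outer",
    "left": "left_outer",
    "leftouter": "left_outer",
    "right": "right_outer",
    "rightouter": "right_outer",
    "semi": "left_semi",
    "leftsemi": "left_semi",
    "anti": "left_anti",
    "leftanti": "left_anti",
    "cross": "cross",
}


def normalize_join_type(how: str) -> str:
    # Lowercase and drop underscores so "left_outer" and "leftouter"
    # hit the same dictionary key.
    key = how.lower().replace("_", "")
    if key not in SUPPORTED_JOIN_TYPES:
        raise NotImplementedError(
            f"Unsupported join type: {how}. Supported join types include: "
            + ", ".join(sorted(set(SUPPORTED_JOIN_TYPES)))
        )
    return SUPPORTED_JOIN_TYPES[key]
```

With this normalization, any alias listed in the error message is accepted by construction, because both come from the same mapping.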