Details
- Type: Bug
- Status: Open
- Priority: Critical
- Resolution: Unresolved
Description
For Spark 3.2.0+ the connector requires spark.hadoopRDD.ignoreEmptySplits=false to work correctly.
This crucial piece of information is documented neither in the README nor in the main HBase connector section.
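A minimal sketch of how the property can be set, assuming a standard SparkSession-based application (the app name is hypothetical). Per this issue, on Spark 3.2.0+ the property must be false before the connector builds its Hadoop RDDs:

```scala
import org.apache.spark.sql.SparkSession

// Sketch: disable ignoreEmptySplits so the HBase connector
// returns rows on Spark 3.2.0+ (see this issue's description).
// The application name below is a placeholder.
val spark = SparkSession.builder()
  .appName("hbase-connector-example")
  .config("spark.hadoopRDD.ignoreEmptySplits", "false")
  .getOrCreate()
```

Alternatively, the same setting can be supplied without code changes via `--conf spark.hadoopRDD.ignoreEmptySplits=false` on spark-submit, or in spark-defaults.conf.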
Issue Links
- is related to: SPARK-37660 "Spark-3.2.0 Fetch Hbase Data not working" (Open)