Details
- Type: Task
- Status: Closed
- Priority: Major
- Resolution: Duplicate
Description
CTAS fails with an UnsupportedOperationException on Spark 3.2 (master): DataSourceUtils.mayBeOverwriteParquetWriteLegacyFormatProp tries to modify an immutable options map.
Reproduction:

val s = """
  create table catalog_sales USING HUDI
  options (
    type = 'cow',
    primaryKey = 'cs_item_sk,cs_order_number'
  )
  LOCATION 'file:///tmp/catalog_sales_hudi'
  PARTITIONED BY (cs_sold_date_sk)
  AS SELECT * FROM catalog_sales_ext2
"""
Stack trace:

java.lang.UnsupportedOperationException
  at java.util.Collections$UnmodifiableMap.put(Collections.java:1459)
  at org.apache.hudi.DataSourceUtils.mayBeOverwriteParquetWriteLegacyFormatProp(DataSourceUtils.java:323)
  at org.apache.hudi.spark3.internal.DefaultSource.getTable(DefaultSource.java:59)
  at org.apache.spark.sql.execution.datasources.v2.DataSourceV2Utils$.getTableFromProvider(DataSourceV2Utils.scala:83)
  at org.apache.spark.sql.DataFrameWriter.getTable$1(DataFrameWriter.scala:280)
  at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:296)
  at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:247)
  at org.apache.hudi.HoodieSparkSqlWriter$.bulkInsertAsRow(HoodieSparkSqlWriter.scala:478)
  at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:159)
  at org.apache.spark.sql.hudi.command.InsertIntoHoodieTableCommand$.run(InsertIntoHoodieTableCommand.scala:109)
  at org.apache.spark.sql.hudi.command.CreateHoodieTableAsSelectCommand.run(CreateHoodieTableAsSelectCommand.scala:91)
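The root cause is generic Java behavior: java.util.Collections$UnmodifiableMap.put always throws UnsupportedOperationException, so any code that receives such a view and calls put on it fails exactly as in the trace above. A minimal sketch of the failure and of the usual copy-before-mutate fix (withLegacyFormatProp and the property key are hypothetical stand-ins, not Hudi's actual method or config key):

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

public class ImmutableMapDemo {
    // Hypothetical stand-in for a method like mayBeOverwriteParquetWriteLegacyFormatProp:
    // copies the options into a mutable map before mutating, so it is safe even
    // when the caller passes a Collections.unmodifiableMap view.
    static Map<String, String> withLegacyFormatProp(Map<String, String> options) {
        Map<String, String> copy = new HashMap<>(options);
        copy.put("parquet.writeLegacyFormat", "true"); // illustrative key only
        return copy;
    }

    public static void main(String[] args) {
        Map<String, String> immutable = Collections.unmodifiableMap(new HashMap<>());

        // Mutating the unmodifiable view directly throws, as in the reported stack trace.
        boolean threw = false;
        try {
            immutable.put("parquet.writeLegacyFormat", "true");
        } catch (UnsupportedOperationException e) {
            threw = true;
        }
        System.out.println("direct put threw: " + threw);

        // The copy-first approach succeeds and leaves the original map untouched.
        Map<String, String> fixed = withLegacyFormatProp(immutable);
        System.out.println("copy-first value: " + fixed.get("parquet.writeLegacyFormat"));
    }
}
```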
Issue Links
- duplicates HUDI-3140 "Fix bulk_insert failure on Spark 3.2.0" (Closed)