Details
- Type: Improvement
- Status: Open
- Priority: Critical
- Resolution: Unresolved
- Affects Version/s: 1.10.0, 1.9.1, 1.11.0
- Fix Version/s: None
- Component/s: None
Description
When using the Avro schema below to write a Parquet (1.8.1) file and then reading it back with Parquet 1.10.1 without passing any schema, the read throws an exception "XXX is not a group". Reading with Parquet 1.8.1 works fine.
{
  "name": "phones",
  "type": [
    "null",
    {
      "type": "array",
      "items": {
        "type": "record",
        "name": "phones_items",
        "fields": [
          { "name": "phone_number", "type": [ "null", "string" ], "default": null }
        ]
      }
    }
  ],
  "default": null
}
The code used to read the file back is:
val reader = AvroParquetReader.builder[SomeRecordType](parquetPath).withConf(new Configuration).build()
reader.read()
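For reference, a minimal write-side sketch is below. It is only an illustration: it assumes parquet-avro 1.8.1, and fullSchemaJson, outputPath, and record are hypothetical placeholders (a record schema containing the "phones" field above, an output location, and a matching GenericRecord), not taken from the original report.

import org.apache.avro.Schema
import org.apache.avro.generic.GenericRecord
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path
import org.apache.parquet.avro.AvroParquetWriter

// fullSchemaJson: a record schema whose fields include the "phones" field shown above (placeholder)
val schema = new Schema.Parser().parse(fullSchemaJson)
val writer = AvroParquetWriter.builder[GenericRecord](new Path(outputPath))
  .withSchema(schema)
  .withConf(new Configuration)
  .build()
writer.write(record) // record: a GenericRecord conforming to the schema (placeholder)
writer.close()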
PARQUET-651 changed the method isElementType() to rely on Avro's checkReaderWriterCompatibility() for the compatibility check. However, checkReaderWriterCompatibility() considers the Parquet schema and the Avro schema (converted from the file schema) incompatible, because the record name in the Avro schema is 'phones_items' while in the Parquet schema it is 'array'. isElementType() therefore returns false, which causes the 'phone_number' field in the schema above to be treated as a group type, which it is not, and the exception is thrown from .asGroupType().
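To illustrate the name check that trips this up, here is a small standalone sketch; the two schema strings are illustrative stand-ins that differ only in record name, not the actual schemas produced inside parquet-avro.

import org.apache.avro.{Schema, SchemaCompatibility}

// Two record schemas that differ only in their record name, mirroring the
// 'phones_items' vs 'array' mismatch described above.
val readerSchema = new Schema.Parser().parse(
  """{"type":"record","name":"phones_items","fields":[{"name":"phone_number","type":["null","string"],"default":null}]}""")
val writerSchema = new Schema.Parser().parse(
  """{"type":"record","name":"array","fields":[{"name":"phone_number","type":["null","string"],"default":null}]}""")

// Avro requires matching record names, so this pair is reported as incompatible,
// which is what makes isElementType() return false.
val result = SchemaCompatibility.checkReaderWriterCompatibility(readerSchema, writerSchema)
println(result.getType)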
I haven't tried whether writing with Parquet 1.10.1 reproduces the same problem, but it might, because the translation from Avro schema to Parquet schema has not changed (not verified yet).
I hesitate to revert PARQUET-651 because it solved several problems. I would like to hear the community's thoughts on it.
Attachments
Issue Links
- causes: PARQUET-651 Parquet-avro fails to decode array of record with a single field name "element" correctly (Resolved)