Details
Type: Bug
Status: Resolved
Priority: P1
Resolution: Fixed
Fix Version/s: 2.36.0
Description
When using the BigQuery Storage Write API through the Beam Java SDK (both the latest release, 2.35.0, and 2.36.0-SNAPSHOT), converting a TableRow to a Storage API proto fails when a column is named 'f'.
Reproduction steps: with the schema [1] and the input row [2] below, a field named 'f' cannot be written to BigQuery; the write fails with the error quoted at the end of this report.
[1]
{
  "name": "item3",
  "type": "RECORD",
  "mode": "NULLABLE",
  "fields": [
    {
      "name": "data",
      "mode": "NULLABLE",
      "type": "RECORD",
      "fields": [
        { "mode": "NULLABLE", "name": "a", "type": "FLOAT" },
        { "mode": "NULLABLE", "name": "b", "type": "FLOAT" },
        { "mode": "NULLABLE", "name": "c", "type": "FLOAT" },
        { "mode": "NULLABLE", "name": "d", "type": "FLOAT" },
        { "mode": "NULLABLE", "name": "e", "type": "FLOAT" },
        { "mode": "NULLABLE", "name": "f", "type": "FLOAT" }
      ]
    }
  ]
}
[2]
{
  ...
  "item3": {
    "data": { "a": 1.627424812511E12, "b": 3.0, "c": 3.0, "d": 530.0, "e": 675.0 }
  },
  ...
}
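For context, the write was configured to use the Storage Write API. A minimal sketch of such a configuration is below; the PCollection, table name, and surrounding pipeline wiring are assumptions for illustration, not taken from this report:

```java
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.values.PCollection;
import com.google.api.services.bigquery.model.TableRow;

// Hypothetical wiring: "rows" stands in for the template's PCollection<TableRow>.
PCollection<TableRow> rows = ...;
rows.apply("WriteToBigQuery",
    BigQueryIO.writeTableRows()
        .to("my-project:my_dataset.my_table")                     // placeholder table
        .withMethod(BigQueryIO.Write.Method.STORAGE_WRITE_API));  // fails on column "f"
        // .withMethod(BigQueryIO.Write.Method.STREAMING_INSERTS) // same row succeeds
```

Switching the single `withMethod` line between the two enum values is enough to toggle the failure described below.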
The following error occurs:

Exception in thread "main" org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.lang.IllegalArgumentException: Can not set java.util.List field com.google.api.services.bigquery.model.TableRow.f to java.lang.Double
	at org.apache.beam.runners.direct.DirectRunner$DirectPipelineResult.waitUntilFinish(DirectRunner.java:373)
	at org.apache.beam.runners.direct.DirectRunner$DirectPipelineResult.waitUntilFinish(DirectRunner.java:341)
	at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:218)
	at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:67)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:323)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
	at com.google.cloud.teleport.templates.PubSubToBigQuery.run(PubSubToBigQuery.java:342)
	at com.google.cloud.teleport.templates.PubSubToBigQuery.main(PubSubToBigQuery.java:223)
Caused by: java.lang.IllegalArgumentException: Can not set java.util.List field com.google.api.services.bigquery.model.TableRow.f to java.lang.Double
	at sun.reflect.UnsafeFieldAccessorImpl.throwSetIllegalArgumentException(UnsafeFieldAccessorImpl.java:167)
	at sun.reflect.UnsafeFieldAccessorImpl.throwSetIllegalArgumentException(UnsafeFieldAccessorImpl.java:171)
	at sun.reflect.UnsafeObjectFieldAccessorImpl.set(UnsafeObjectFieldAccessorImpl.java:81)
	at java.lang.reflect.Field.set(Field.java:764)
	at com.google.api.client.util.FieldInfo.setFieldValue(FieldInfo.java:275)
	at com.google.api.client.util.FieldInfo.setValue(FieldInfo.java:231)
	at com.google.api.client.util.GenericData.set(GenericData.java:118)
	at com.google.api.client.json.GenericJson.set(GenericJson.java:91)
	at com.google.api.services.bigquery.model.TableRow.set(TableRow.java:64)
	at com.google.api.services.bigquery.model.TableRow.set(TableRow.java:29)
	at com.google.api.client.util.GenericData.putAll(GenericData.java:131)
	at org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto.toProtoValue(TableRowToStorageApiProto.java:206)
	at org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto.messageValueFromFieldValue(TableRowToStorageApiProto.java:175)
	at org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto.messageFromTableRow(TableRowToStorageApiProto.java:103)
	at org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto.toProtoValue(TableRowToStorageApiProto.java:207)
	at org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto.messageValueFromFieldValue(TableRowToStorageApiProto.java:175)
	at org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto.messageFromTableRow(TableRowToStorageApiProto.java:103)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinationsTableRow$1.toMessage(StorageApiDynamicDestinationsTableRow.java:95)
	at org.apache.beam.sdk.io.gcp.bigquery.StorageApiConvertMessages$ConvertMessagesDoFn.processElement(StorageApiConvertMessages.java:106)

This error does not show up if I leave the write method to use Streaming Inserts.
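The stack trace suggests why only the column name 'f' is affected: com.google.api.services.bigquery.model.TableRow extends GenericData, whose set() assigns to a declared Java field when one matches the key, and TableRow declares a List-typed field named f. The sketch below does not use the real library; it mimics that reflective dispatch with a hypothetical stand-in class (all names in it are invented for illustration) to show the mechanism:

```java
import java.lang.reflect.Field;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical stand-in for TableRow. The real class extends GenericData,
// whose set() prefers a declared Java field over the backing map -- and,
// per the stack trace, TableRow declares a java.util.List field named "f".
public class TableRowFieldClash {

    public static class FakeTableRow {
        public List<Object> f;  // mirrors TableRow's internal List field "f"
        private final Map<String, Object> map = new HashMap<>();

        public void set(String key, Object value) {
            try {
                // Mimic GenericData.set: if a declared field matches the key,
                // assign it reflectively instead of storing a map entry.
                Field field = getClass().getDeclaredField(key);
                field.set(this, value);  // a Double into a List field -> throws
            } catch (NoSuchFieldException e) {
                map.put(key, value);     // ordinary column: stored in the map
            } catch (IllegalAccessException e) {
                throw new RuntimeException(e);
            }
        }
    }

    public static void main(String[] args) {
        FakeTableRow row = new FakeTableRow();
        row.set("e", 675.0);  // works: no declared field named "e"
        try {
            row.set("f", 675.0);  // clashes with the List-typed field "f"
        } catch (IllegalArgumentException e) {
            System.out.println("Column \"f\" failed: " + e.getMessage());
        }
    }
}
```

Under this reading, any TableRow key that collides with one of TableRow's own declared fields would take the reflective path, which is why renaming the column avoids the error while Streaming Inserts (a different conversion path) is unaffected.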