Description
https://stackoverflow.com/questions/69863008/call-the-bigquery-stored-procedure-in-dataflow-pipeline
I have written a stored procedure in BigQuery and I am trying to call it from within a Dataflow pipeline. The pipeline works for plain SELECT queries, but not for the stored procedure:
pipeLine = beam.Pipeline(options=options)
rawdata = (
    pipeLine
    | beam.io.ReadFromBigQuery(
        query="CALL my_dataset.create_customer()",
        use_standard_sql=True)
)
pipeLine.run().wait_until_finish()
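For comparison, here is a minimal sketch of the variant that does work for me, passing the SELECT from the procedure body directly as the query (options is the same PipelineOptions object as above):

import apache_beam as beam

pipeLine = beam.Pipeline(options=options)
rawdata = (
    pipeLine
    | beam.io.ReadFromBigQuery(
        # Same SELECT that the procedure wraps, passed as a plain query
        query="""SELECT *
                 FROM `project_name.my_dataset.my_table`
                 WHERE customer_name LIKE "%John%"
                 ORDER BY created_time
                 LIMIT 5""",
        use_standard_sql=True)
)
pipeLine.run().wait_until_finish()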
Stored procedure:
CREATE OR REPLACE PROCEDURE my_dataset.create_customer()
BEGIN
  SELECT *
  FROM `project_name.my_dataset.my_table`
  WHERE customer_name LIKE "%John%"
  ORDER BY created_time
  LIMIT 5;
END;
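Invoking the procedure directly in the BigQuery console succeeds, using the standard CALL syntax:

CALL my_dataset.create_customer();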
I am able to create the stored procedure and call it within the BigQuery console, as shown above. But in the Dataflow pipeline it throws this error:
"code": 400,
"message": "configuration.query.destinationEncryptionConfiguration cannot be set for scripts","message": "configuration.query.destinationEncryptionConfiguration cannot be set for scripts", "domain": "global",
"reason": "invalid""status": "INVALID_ARGUMENT"