Is there a way to set the maximum number of bad records when writing with BigQueryIO? It seems to keep the default at 0.
At this time, unfortunately, we don't provide a way to directly set the value of configuration.load.maxBadRecords in relation to BigQueryIO in Cloud Dataflow.
As a workaround, you should be able to apply a custom ParDo transform that filters out "bad records" before they are passed to BigQueryIO.Write. As a result, BigQuery shouldn't get any "bad records"; see the sketch below. Hopefully, this helps.
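For illustration, here is a minimal sketch of that filtering ParDo using the Apache Beam Java SDK (which the Dataflow SDK merged into). The "missing id" check, the class names, and the table spec are all placeholder assumptions; substitute whatever defines a "bad record" for your schema:

    import java.util.Arrays;

    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;

    public class FilterBadRecordsExample {

      // Emits only rows that pass a validity check, so the BigQuery load
      // never sees records that would have counted against maxBadRecords.
      static class FilterBadRecordsFn extends DoFn<TableRow, TableRow> {
        @ProcessElement
        public void processElement(ProcessContext c) {
          TableRow row = c.element();
          // Placeholder check: treat rows missing a required "id" as bad.
          if (row.get("id") != null) {
            c.output(row);
          }
        }
      }

      public static void main(String[] args) {
        Pipeline p = Pipeline.create(
            PipelineOptionsFactory.fromArgs(args).create());

        p.apply(Create.of(Arrays.asList(
                new TableRow().set("id", 1).set("name", "good"),
                new TableRow().set("name", "bad: missing id")))
            .withCoder(TableRowJsonCoder.of()))
         .apply("FilterBadRecords", ParDo.of(new FilterBadRecordsFn()))
         .apply(BigQueryIO.writeTableRows()
             .to("my-project:my_dataset.my_table")  // placeholder table spec
             // CREATE_NEVER avoids needing a schema; the table must exist.
             .withCreateDisposition(
                 BigQueryIO.Write.CreateDisposition.CREATE_NEVER));

        p.run().waitUntilFinish();
      }
    }

You could also route the failing rows to a second output (a side output / additional PCollection) instead of dropping them, if you want to inspect or re-process them later.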
If the ability to control configuration.load.maxBadRecords is important to you, you are welcome to file a feature request in the issue tracker of our GitHub repository.