There is a table in BigQuery into which aggregated user data is uploaded. Over time this data changes and needs to be updated, but DML statements in BigQuery are heavily limited. Is it possible to upload data in batches so that existing rows are updated by some primary key instead of being written again?

  • If you believe the documentation (cloud.google.com/bigquery/docs/reference/standard-sql/ ...), both INSERT and UPDATE are available. Upload the fresh data that needs updating into one staging table and the data that needs inserting into another; then run one query to perform the update and a second to perform the insert, and drop the staging tables once they are no longer needed. Having the "old" and "new" states of the data, it is not hard to prepare the two uploads (see the sketch after these comments). - Akina
  • Thank you. I thought maybe there was a more convenient, "direct" way. - Constantine
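
A minimal sketch of the two queries Akina describes, assuming a target table `mydataset.users_agg` keyed by `user_id` and two hypothetical staging tables `staging_updates` and `staging_inserts` filled by a regular batch upload (all names and columns are illustrative, not from the thread):

```sql
-- Query 1: update existing rows from the staging table, joined on the key.
UPDATE mydataset.users_agg AS t
SET t.score = s.score,
    t.updated_at = s.updated_at
FROM mydataset.staging_updates AS s
WHERE t.user_id = s.user_id;

-- Query 2: insert the genuinely new rows from the other staging table.
INSERT INTO mydataset.users_agg (user_id, score, updated_at)
SELECT user_id, score, updated_at
FROM mydataset.staging_inserts;

-- Finally, drop the staging tables that are no longer needed.
DROP TABLE mydataset.staging_updates;
DROP TABLE mydataset.staging_inserts;
```

Splitting the upload into "rows to update" and "rows to insert" ahead of time is what makes this work with only two DML statements per batch, which keeps the approach within BigQuery's DML quotas.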
