CAPT-1962: Sync batch database operations with BigQuery #3378
https://dfedigital.atlassian.net/browse/CAPT-1962
We use dfe-analytics to sync database operations with the data warehouse held in Google BigQuery. The library uses model callbacks to enable this, so in some instances where we use `insert_all`/`delete_all`/`update_all`, those operations bypass the callbacks and are missed. This change should rectify all of these sync issues.
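For illustration, here is a minimal sketch of the difference (the attribute names are hypothetical, not taken from this codebase): `insert_all` writes rows in a single SQL statement without instantiating models, so no callbacks fire and the sync never sees the records, whereas creating records one by one does fire them.

```ruby
# Hypothetical attributes, for illustration only.
rows = [{ reference: "ABC123" }, { reference: "DEF456" }]

# Single INSERT statement: fast, but no Active Record objects are built,
# so after_create callbacks (and therefore the dfe-analytics sync) never run.
Claim.insert_all(rows)

# Instantiates and saves each record, firing callbacks so the library can
# enqueue its BigQuery sync events. Slower, but nothing is missed.
rows.each { |attrs| Claim.create!(attrs) }
```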
In some cases I felt it was better, on balance, to trigger the model callbacks and sacrifice a little performance, because syncing large tables such as `claims` or `tasks` can take a very significant amount of time in the background jobs.

In other instances it is less expensive to simply force a full import of the table. I have done this by invoking the supplied rake task rather than re-implementing it, which would risk drifting out of sync or becoming incompatible with changes to the upstream library. Unfortunately there doesn't appear to be a convenient way to do this other than invoking the rake task.
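As a rough sketch of that approach (the task name is an assumption about what dfe-analytics supplies, and the job wrapper is hypothetical), a rake task can be invoked programmatically from a background job rather than duplicating its logic:

```ruby
require "rake"

# Hypothetical job; the task name is an assumption and may differ
# from what the dfe-analytics gem actually provides.
class ImportAnalyticsEntityJob < ApplicationJob
  queue_as :default

  def perform(entity_name)
    # Rake tasks are not loaded in worker processes by default.
    Rails.application.load_tasks if Rake::Task.tasks.empty?

    task = Rake::Task["dfe:analytics:import_entity"]
    # A rake task only runs once per process by default; re-enable it so
    # the job can be performed repeatedly by the same worker.
    task.reenable
    task.invoke(entity_name)
  end
end
```

Invoking the task this way keeps the import logic inside the gem itself, so future changes to dfe-analytics don't need to be mirrored here.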