feat(data-warehouse): use materialized view in hogql #25187
Conversation
📸 UI snapshots have been updated: 1 snapshot change in total (0 added, 1 modified, 0 deleted). Triggered by this commit.
…osthog into dw-materialized-use-new-tables
See my one comment about using DataWarehouseTable. I reckon the pipelines should operate the same for external sources and materialization: both create a DataWarehouseTable and set it on their equivalent model (Schema vs SavedQuery).
@@ -63,6 +68,7 @@ class Status(models.TextChoices):
         null=True,
         help_text="The timestamp of this SavedQuery's last run (if any).",
     )
+    credential = models.ForeignKey(DataWarehouseCredential, on_delete=models.CASCADE, null=True, blank=True)
Instead of adding a credential to the saved query, let's add a foreign key to DataWarehouseTable instead. This means:
- We only have one type of "Table": whether it's an external data sourced table or a materialized view, both are synonymous as far as the system is concerned
- The URL of the table is stored in the DB and not in code, making it easy to change how we store the data with backward compatibility (this has been a godsend for all the DLT pipeline changes we've made)
- It'll support different S3 formats (Parquet, Delta, CSV, etc.). This will be helpful when we eventually move to Iceberg, and again, since the format is stored in the DB, table migrations don't have to be one big migration
Good point, let me give it a try.
…osthog into dw-materialized-use-new-tables
TODO:
Suspect Issues: This pull request was deployed and Sentry observed the following issues:
Problem
Changes
Does this work well for both Cloud and self-hosted?
How did you test this code?