Handling data redundancy for legacy data with the event bus #319
Comments
If we want to rely on the event bus as the only method of data transfer, I could see "rebuilding" events from the old/existing data working safely when there is a single consumer. What happens the next time we need this same event data synced to a different service? Can we run that process again, or would it risk unforeseen side effects in the first service?
@zacharis278: Based on the issues that you are raising, I don't think re-sending all events to the event bus is the right solution, at least for the topics we have created.
Creating synthetic old events is tricky in that it will explicitly be sending events out of order, which puts the onus of figuring out ordering and idempotency on every consumer of that stream (some of which may be out of the operator's control). IMO bootstrapping new consumers with all of the existing event data is a perfect use case for the event bus, but once data is flowing, manufacturing old events is fraught with peril.
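To illustrate the burden that shifts onto consumers, here is a minimal sketch of an order-tolerant, idempotent handler. It assumes each event carries a unique producer-assigned event id and the source row's timestamp; the `EnrollmentEvent` and `EnrollmentProjection` names are invented for the example and are not part of any actual topic or library API.

```python
# Hypothetical sketch: what "ordering and idempotency on every consumer" can
# look like in practice. Duplicates are dropped by event id, and stale
# (out-of-order or replayed) events are dropped by comparing source timestamps.
from dataclasses import dataclass
from datetime import datetime


@dataclass
class EnrollmentEvent:
    event_id: str          # unique id assigned by the producer
    user_id: int
    course_key: str
    is_active: bool
    source_time: datetime  # when the change happened in the source database


class EnrollmentProjection:
    """Local read model kept in sync from the event stream."""

    def __init__(self):
        self._seen_event_ids: set[str] = set()
        self._rows: dict[tuple[int, str], EnrollmentEvent] = {}

    def handle(self, event: EnrollmentEvent) -> None:
        # Idempotency: a replayed duplicate must be a no-op.
        if event.event_id in self._seen_event_ids:
            return
        self._seen_event_ids.add(event.event_id)

        # Ordering: a manufactured "old" event arriving after a newer real one
        # must not clobber newer state, so last-write-wins by source timestamp.
        key = (event.user_id, event.course_key)
        current = self._rows.get(key)
        if current is None or event.source_time >= current.source_time:
            self._rows[key] = event
```

Every consumer of the stream would need some version of this logic (plus durable storage for the dedup set), which is the cost of replaying old events into an already-flowing topic.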
@bmtcril: Can you help explain the following?
Given that data on the event bus is sent to old, new, and future consumers, I'm just not clear on what perfect use case you are referring to. It feels like "once data is flowing" will always be the case, unless you mean quite literally when a topic is first being produced to? And if so, I'm not clear on whether you are saying the event bus should be used for this, or how you think this would work.
It's sounding like the best option right now is to load data separately, and it's up to the particular implementation to work out a seamless transition from initially loaded data to incoming events. Luckily, I don't think this is problematic for our team's use case. As a general solution, I could see this being a bit harder to deal with if we have high-frequency events that do more than just create a DB row.
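For the "load separately, then hand off to incoming events" approach, here is a rough sketch of one possible cutover. It assumes a hypothetical bulk export and events that carry the source row's modification time; neither is an existing openedx-events API, and the snapshot date is arbitrary.

```python
# Sketch of "load data separately, then transition to incoming events":
# seed the local table from a one-time export, then have the live consumer
# skip anything already covered by that snapshot. All names are hypothetical.
from datetime import datetime, timezone

# Time the bulk export was taken (the consumer should already be subscribed
# by this point so no changes fall into the gap).
SNAPSHOT_TIME = datetime(2024, 1, 1, tzinfo=timezone.utc)

local_table: dict[str, dict] = {}


def bulk_load(snapshot_rows: list[dict]) -> None:
    """One-time seed from an export/import or a paged API dump."""
    for row in snapshot_rows:
        local_table[row["id"]] = row


def on_event(event: dict) -> None:
    """Live consumer: apply only events newer than the snapshot."""
    if event["source_time"] <= SNAPSHOT_TIME:
        # This change is already reflected in the bulk load; applying it
        # again could overwrite newer snapshot state, so skip it.
        return
    local_table[event["id"]] = event["payload"]
```

A per-row timestamp comparison (rather than a single global cutover time) would be more robust if events can arrive while the export is still being taken.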
@robrap our comments crossed, but I think we were saying the same things. You just did it better. :)
How do we handle the case where services want to use the event bus for data redundancy across services, but there is legacy data that predates the existence of the event?
It's unclear if this ticket would result in any tooling, or just documentation to acknowledge and provide some guidance around this potentially common situation.
It is possible that each situation will need its own way to copy the old data (API, export/import, etc.) that doesn't conflict with the ongoing events, or the events could be rerun from a certain time.
How we handle db timestamps for each event may come into play.
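As one illustration of how the timestamps could come into play, here is a sketch of a timestamp-guarded upsert that both the legacy-data copy and the event consumer could share, so whichever path runs last cannot clobber newer data. The table and column names are hypothetical, not drawn from any existing service.

```python
# Hypothetical "last-modified wins" upsert shared by the bulk copy of legacy
# data and the ongoing event consumer, so the two paths cannot conflict.
# Requires SQLite >= 3.24 for the ON CONFLICT ... DO UPDATE syntax.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE enrollment (id TEXT PRIMARY KEY, payload TEXT, modified TEXT)"
)


def upsert_if_newer(row_id: str, payload: str, modified_iso: str) -> None:
    """Apply a row only if it is newer than what we already have.

    `modified_iso` is an ISO-8601 timestamp string, which compares correctly
    as text. Both the legacy-data copy job and the event handler call this.
    """
    conn.execute(
        """
        INSERT INTO enrollment (id, payload, modified)
        VALUES (?, ?, ?)
        ON CONFLICT(id) DO UPDATE SET
            payload = excluded.payload,
            modified = excluded.modified
        WHERE excluded.modified > enrollment.modified
        """,
        (row_id, payload, modified_iso),
    )
    conn.commit()
```

With a guard like this, the order in which the copy job and the event stream touch a given row stops mattering, which sidesteps most of the conflict concerns raised above.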