
Feat/5.0 changes #1008

Merged: 7 commits merged into master from feat/5.0-changes on Nov 30, 2023
Conversation

@stheppi (Contributor) commented Nov 29, 2023

Brings 5.0 branch changes to master.

The S3 consumer offset sink will follow in a separate PR.

stheppi added 2 commits November 29, 2023 22:52
…ding to an exception raised, since the Connect struct for the envelope does not have the "value" field when it is null. The code change ensures the transformer does not populate the struct unless the field is present.
…class extending SinkData is introduced. For text storage the safeValue function is overridden to return the Java BigDecimal. The JsonConverter is changed to store decimals as numbers rather than base64 (the default). While a breaking change, this should not be a cause for concern: BigDecimal was not handled before, so such data would have produced an error anyway.

Unit test and integration tests were added to ensure the storage can handle BigDecimal.
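For context, a minimal sketch of the converter behaviour the commit describes, assuming Kafka's stock JsonConverter and its decimal.format option (KIP-481). The schema and values here are illustrative, not the PR's own test fixtures:

```scala
import org.apache.kafka.connect.data.{Decimal, SchemaBuilder, Struct}
import org.apache.kafka.connect.json.JsonConverter
import scala.jdk.CollectionConverters._

// Configure the stock JsonConverter to serialise Connect Decimal values
// as plain JSON numbers instead of the base64 default.
val converter = new JsonConverter()
converter.configure(
  Map(
    "schemas.enable" -> "false",
    "decimal.format" -> "NUMERIC" // default is "BASE64"
  ).asJava,
  false // isKey = false: configure for record values
)

// A struct with a Decimal(scale = 18) field, mirroring the shape under test.
val schema = SchemaBuilder.struct()
  .field("salary", Decimal.schema(18))
  .build()
val struct = new Struct(schema)
  .put("salary", new java.math.BigDecimal("100.43").setScale(18))

// With NUMERIC, "salary" appears as 100.430000000000000000 in the JSON
// payload rather than as an opaque base64-encoded unscaled value.
val bytes = converter.fromConnectData("topic", schema, struct)
println(new String(bytes))
```

With NUMERIC, downstream JSON consumers see a readable number, which is the behaviour this PR opts into.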
The review comment below refers to this excerpt from the PR diff:

```scala
  .put("title", "mr")
  .put(
    "salary",
    BigDecimal(100.43).setScale(18).bigDecimal,
```
Collaborator commented:

It's advisable to get into the habit of constructing with BigDecimal(String) instead of relying on floating-point literals. Converting a floating-point literal such as 100.43 to BigDecimal and then calling setScale(18) may introduce rounding errors due to the binary representation of the literal. To ensure accurate representation and avoid precision loss, consider using BigDecimal("100.43").setScale(18) instead.

Additionally, it's worth evaluating whether a scale of 18 is necessary for the specific use case, as an excessively high scale might impact performance and memory usage without providing significant benefits.

(This comment applies to various places in the PR.)
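A small sketch of the pitfall the reviewer describes, using java.math.BigDecimal directly to show the double-constructor hazard (the values are illustrative):

```scala
import java.math.{BigDecimal => JBigDecimal, RoundingMode}

// Constructed from a double: inherits the binary representation error.
val fromDouble = new JBigDecimal(100.43)
// Constructed from a string: the exact decimal value.
val fromString = new JBigDecimal("100.43")

println(fromDouble) // 100.42999999999999971578... (not exactly 100.43)
println(fromString) // 100.43

// Scaling bakes the representation error into the stored value:
println(fromDouble.setScale(18, RoundingMode.HALF_UP)) // 100.429999999999999716
println(fromString.setScale(18))                       // 100.430000000000000000
```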

Contributor Author (stheppi) replied:

Thanks for raising that. The scale of 18 comes from the schema definition; the value's scale and the schema's must align or Avro serialisation will fail.
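A minimal sketch of why the scales must match, using Kafka Connect's own Decimal logical type (the field name is illustrative; the Avro path enforces the same invariant, since the decimal logical type there also fixes the scale in the schema):

```scala
import org.apache.kafka.connect.data.Decimal

// The Connect Decimal logical type pins the scale in the schema itself.
val salarySchema = Decimal.schema(18)

// Scale matches the schema: serialises to the unscaled bytes as expected.
Decimal.fromLogical(salarySchema, new java.math.BigDecimal("100.43").setScale(18))

// "100.43" alone has scale 2; Decimal.fromLogical throws a DataException on
// the mismatch, which is why the test values are built with setScale(18).
// Decimal.fromLogical(salarySchema, new java.math.BigDecimal("100.43"))
```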

@stheppi enabled auto-merge (squash) November 30, 2023 15:45
@stheppi merged commit 5ed9e7a into master Nov 30, 2023
131 checks passed
@stheppi deleted the feat/5.0-changes branch November 30, 2023 15:59