When attempting to write records to Confluent Cloud using the CLI, we sometimes get errors related to JSON-encoded Avro records that use union types. According to the JSON Encoding spec from Avro, when using a union type:
> if its type is null, then it is encoded as a JSON null;
> otherwise it is encoded as a JSON object with one name/value pair whose name is the type's name and whose value is the recursively encoded value. For Avro's named types (record, fixed or enum) the user-specified name is used, for other types the type name is used.
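To make the rule above concrete, here is a small sketch (not taken from the issue; the field names and the `com.example.Address` record are hypothetical) of how a union-typed field must be wrapped under Avro's JSON Encoding:

```python
import json

def encode_union(value, branch):
    """Encode `value` for an Avro union per the JSON Encoding spec.

    `branch` is the name of the chosen union branch: "null", a primitive
    type name such as "string", or the full name of a named type
    (record, fixed, or enum).
    """
    if branch == "null":
        # A null branch is encoded as a bare JSON null.
        return None
    # Any other branch is wrapped in a single-key object keyed by the
    # branch's type name.
    return {branch: value}

record = {
    "id": 1,
    # field declared as ["null", "string"]
    "note": encode_union("hello", "string"),
    # field declared as ["null", "com.example.Address"] (hypothetical named record)
    "address": encode_union({"city": "Austin"}, "com.example.Address"),
    # field declared as ["null", "long"], currently null
    "deleted_at": encode_union(None, "null"),
}

print(json.dumps(record))
```

Sending the bare value (`"note": "hello"`) instead of the wrapped form (`"note": {"string": "hello"}`) is a common source of serialization errors with union fields.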
We can do this locally using the included `/usr/bin/kafka-avro-console-producer` inside the CP Schema Registry Docker container, but not using the CLI. Instead, we get:
Below is the full schema (with some sensitive bits removed) and the sample record we are using:
Let me know what other data we can provide!