Error deserializing Avro message "Error deserializing key/value for partition " #441

Open
yunhappy opened this issue Jun 28, 2018 · 8 comments · May be fixed by #576

Comments

@yunhappy

version: 4.1.1

curl -X GET -H "Accept: application/vnd.kafka.avro.v2+json" \
http://localhost:8082/consumers/testgroup2/instances/ym_test/records
[2018-06-28 20:29:15,439] INFO 0:0:0:0:0:0:0:1 - - [28/Jun/2018:20:28:04 +0800] "GET /consumers/testgroup2/instances/ym_test/records HTTP/1.1" 500 187  70513 (io.confluent.rest-utils.requests:77)
[2018-06-28 20:29:15,446] ERROR Unexpected exception in consumer read task id=io.confluent.kafkarest.v2.KafkaConsumerReadTask@5506eafe  (io.confluent.kafkarest.v2.KafkaConsumerReadTask:154)
org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition tools_example0627-0 at offset 20. If needed, please seek past the record to continue consumption.
Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id -1
Caused by: org.apache.kafka.common.errors.SerializationException: Unknown magic byte!
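
For context on the last error: the Confluent Avro serializer frames every message as a magic byte 0x00, followed by a 4-byte schema ID, followed by the Avro payload. "Unknown magic byte!" means the bytes being deserialized do not start with that header, i.e. they were not produced with KafkaAvroSerializer — often because the key was written as a plain string while only the value is Avro, as later comments describe. Below is a minimal illustrative sketch (plain Java, not part of kafka-rest) of that framing check; the class and method names are made up for illustration.

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Minimal sketch: check whether a raw Kafka key/value follows the Confluent
// wire format (magic byte 0x00, then a 4-byte schema ID, then the Avro payload).
// Bytes produced without KafkaAvroSerializer (e.g. a plain String key) fail
// this check, which KafkaAvroDeserializer reports as "Unknown magic byte!".
public final class WireFormatCheck {

    private static final byte MAGIC_BYTE = 0x0;

    static boolean looksLikeConfluentAvro(byte[] payload) {
        if (payload == null || payload.length < 5) {
            return false; // too short to hold magic byte + schema ID
        }
        ByteBuffer buffer = ByteBuffer.wrap(payload);
        if (buffer.get() != MAGIC_BYTE) {
            return false; // this is what surfaces as "Unknown magic byte!"
        }
        int schemaId = buffer.getInt(); // ID of the schema in Schema Registry
        return schemaId >= 0;
    }

    public static void main(String[] args) {
        byte[] plainStringKey = "some-key".getBytes(StandardCharsets.UTF_8);
        System.out.println(looksLikeConfluentAvro(plainStringKey)); // prints false
    }
}
```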
@PavanGurram-DevOps

Is there any update on this issue? I'm also facing the same error. Any input is much appreciated.

@yunhappy
Author

Change the deserializer, so that when there is a data error the consumer receives something like this: org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id -1
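
For illustration only (this is not the actual change in the linked pull request): one way to "change the deserializer" is to wrap the delegate so that a record that cannot be decoded comes back as null instead of aborting the whole poll with the partition-level error above. The wrapper class name here is made up.

```java
import java.util.Map;

import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.serialization.Deserializer;

// Hedged sketch: wrap a delegate deserializer so one undecodable record yields
// null instead of failing the read with
// "Error deserializing key/value for partition ... at offset ...".
public class LenientDeserializer<T> implements Deserializer<T> {

    private final Deserializer<T> delegate;

    public LenientDeserializer(Deserializer<T> delegate) {
        this.delegate = delegate;
    }

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        delegate.configure(configs, isKey);
    }

    @Override
    public T deserialize(String topic, byte[] data) {
        try {
            return delegate.deserialize(topic, data);
        } catch (SerializationException e) {
            // e.g. "Error deserializing Avro message for id -1" / "Unknown magic byte!"
            return null; // the caller decides how to surface the bad record
        }
    }

    @Override
    public void close() {
        delegate.close();
    }
}
```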

@Ritaja

Ritaja commented May 10, 2020

Any progress on this? In my case a Kafka topic with a string key and Avro data has this problem. It could be fixed by the pull requests from @yunhappy.

@xeeaax

xeeaax commented Nov 12, 2020

Has any workaround been found for this issue?

@xeeaax

xeeaax commented Nov 12, 2020

Why is this issue not marked as a bug?

@AbdulRahmanAlHamali

I had something similar. It turns out that in my case the consumer was assuming both the key and the value of the message to be in Avro format, and it was failing because the key was not.

To solve this, I believe that when creating the consumer, users should have the option either to specify format (as is the case right now), in which case the key and the value share the same format, or to specify keyFormat and valueFormat separately.
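
As a point of comparison (a hedged sketch, not something the REST Proxy supports today): a plain Java consumer can already mix formats by configuring different key and value deserializers, which is the behaviour a separate keyFormat/valueFormat option would expose. The broker and Schema Registry addresses below are placeholders; the topic name is taken from the log above.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import io.confluent.kafka.serializers.KafkaAvroDeserializer;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

// Hedged workaround sketch: String key, Avro value — the combination the
// REST Proxy consumer cannot currently express.
public class MixedFormatConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");       // placeholder
        props.put("group.id", "testgroup2");
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", KafkaAvroDeserializer.class.getName());
        props.put("schema.registry.url", "http://localhost:8081"); // placeholder

        try (KafkaConsumer<String, Object> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("tools_example0627"));
            ConsumerRecords<String, Object> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, Object> record : records) {
                System.out.printf("key=%s value=%s%n", record.key(), record.value());
            }
        }
    }
}
```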

@paulolimarb

Is there any update on this issue? I have the same problem.

My message has two different schema types:

{
  "key": { "type": "JSON", "data": "test" },
  "value": { "type": "PROTOBUF", "data": { "myfield": "myvalue" } }
}

But the consumer doesn't deserialize two different schema types for the key and the value.

It is mandatory that the message use a single schema type, the one given in the Accept header:
Accept: application/vnd.kafka.protobuf.v2+json

Thanks in advance for any updates.
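
A hedged sketch for this specific case (JSON key, Protobuf value), again outside the REST Proxy: read the key as a plain string and parse the JSON in application code, and let the Confluent Protobuf deserializer handle the value. The addresses, group id, and topic name below are placeholders.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import io.confluent.kafka.serializers.protobuf.KafkaProtobufDeserializer;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

// Hedged sketch: JSON key read as a raw string (parse it yourself),
// Protobuf value handled by the Confluent Protobuf deserializer.
public class JsonKeyProtobufValueConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");       // placeholder
        props.put("group.id", "mixed-format-group");             // placeholder
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", KafkaProtobufDeserializer.class.getName());
        props.put("schema.registry.url", "http://localhost:8081"); // placeholder

        try (KafkaConsumer<String, Object> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("my_topic"));              // placeholder topic
            for (ConsumerRecord<String, Object> record : consumer.poll(Duration.ofSeconds(5))) {
                System.out.printf("key(JSON)=%s value(Protobuf)=%s%n", record.key(), record.value());
            }
        }
    }
}
```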

@Eqta

Eqta commented Sep 16, 2024

Hello, I am facing the same exception. Is there a workaround for the "Unknown magic byte" error?
