This repository has been archived by the owner on Jun 30, 2023. It is now read-only.
Hi,

I'm using protobuf-converter with the S3 sink connector to ingest protobuf messages from Kafka into S3. My sink config looks like this:
```java
Map.entry("connector.class", "io.confluent.connect.s3.S3SinkConnector"),
Map.entry("topics", "topicName"),
Map.entry("tasks.max", 1),
Map.entry("flush.size", 100),
Map.entry("rotate.schedule.interval.ms", 10000),
Map.entry("s3.region", s3Region),
Map.entry("s3.bucket.name", s3Bucket),
Map.entry("s3.ssea.name", s3Ssea),
Map.entry("s3.part.size", 26214400),
Map.entry("storage.class", "io.confluent.connect.s3.storage.S3Storage"),
Map.entry("key.converter", "com.blueapron.connect.protobuf.ProtobufConverter"),
Map.entry("value.converter", "com.blueapron.connect.protobuf.ProtobufConverter"),
Map.entry("value.converter.protoClassName", "com.protobuf.hawaii.Event"),
Map.entry("format.class", "io.confluent.connect.s3.format.json.JsonFormat"),
Map.entry("schemas.enable", "true"),
Map.entry("schema.compatibility", "NONE"),
Map.entry("partitioner.class", "io.confluent.connect.storage.partitioner.TimeBasedPartitioner"),
Map.entry("timestamp.extractor", "Wallclock"),
Map.entry("partition.duration.ms", 3600000),
Map.entry("topics.dir", String.format("%s/connect_topics", s3Root)),
Map.entry("path.format", "'year'=YYYY/'month'=MM/'day'=dd/'hour'=HH"),
Map.entry("locale", "en-US"),
Map.entry("timezone", "UTC"));
```
I'm getting this exception when I try to run it:

```
StackOverflowError
	at internalGetFieldAccessorTable (com.google.protobuf.DescriptorProtos$FieldOptions.java:27472)
	at getAllFieldsMutable (com.google.protobuf.GeneratedMessageV3.java:155)
	at access$800 (com.google.protobuf.GeneratedMessageV3.java:79)
	at getAllFields (com.google.protobuf.GeneratedMessageV3$ExtendableMessage.java:1232)
	at getConnectFieldName (com.blueapron.connect.protobuf.ProtobufData.java:72)
	at toConnectSchema (com.blueapron.connect.protobuf.ProtobufData.java:194)
	at toConnectSchema (com.blueapron.connect.protobuf.ProtobufData.java:194)
	at toConnectSchema (com.blueapron.connect.protobuf.ProtobufData.java:194)
	at toConnectSchema (com.blueapron.connect.protobuf.ProtobufData.java:194)
	...
```
Edit: I think the issue occurs when the .proto schema contains a field of type google.protobuf.Struct; that is what triggers the exception above. Could you look into this and add support for google.protobuf.Struct fields?
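For context on why a Struct field would blow the stack: google.protobuf.Struct is defined recursively (Struct holds a map of Value, and Value can itself hold a Struct or a ListValue of Values), so a depth-first schema conversion that follows message-typed fields without a cycle check never terminates, matching the repeating toConnectSchema frames in the trace. Below is a self-contained toy sketch (the type graph is hand-written here, not the real descriptor API) showing the cycle and a visited-set guard that would break it:

```java
import java.util.*;

public class SchemaCycleDemo {
    // Toy model of the well-known types: each message type maps to
    // the message types of its fields. This mirrors the real cycle:
    // Struct -> Value -> {Struct, ListValue}, ListValue -> Value.
    static Map<String, List<String>> types = Map.of(
        "google.protobuf.Struct", List.of("google.protobuf.Value"),
        "google.protobuf.Value", List.of("google.protobuf.Struct", "google.protobuf.ListValue"),
        "google.protobuf.ListValue", List.of("google.protobuf.Value"));

    // A naive conversion would recurse forever on Struct; this version
    // tracks types currently being converted and cuts off on a cycle.
    static int toConnectSchema(String type, Deque<String> inProgress) {
        if (inProgress.contains(type)) return 0; // cycle detected: stop descending
        inProgress.push(type);
        int depth = 1;
        for (String child : types.getOrDefault(type, List.of()))
            depth = Math.max(depth, 1 + toConnectSchema(child, inProgress));
        inProgress.pop();
        return depth;
    }

    public static void main(String[] args) {
        // Terminates at finite depth instead of a StackOverflowError.
        System.out.println(toConnectSchema("google.protobuf.Struct", new ArrayDeque<>())); // prints 3
    }
}
```

A converter fix along these lines would need to decide what Connect schema to emit at the cut-off point (for example, mapping Struct to a map/string schema), but the guard is what prevents the StackOverflowError.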
akanungoz changed the title from "Error while using my own protoClassName" to "StackOverflow Exception while using my own protoClassName having field type google.protobuf.Struct" on Sep 9, 2019.