This repository has been archived by the owner on Jan 9, 2024. It is now read-only.
Describe the bug

Hi Team,

I've been trying out Kogito Serverless Workflow in a local setup for a while now. I've defined workflows with states that make AsyncAPI calls over Kafka. This works fine locally, using a local Kafka setup with no auth mechanism.
When I tried to promote this further and ran the same setup against a remote Kafka cluster, I hit an issue: authentication appears to succeed, but the service does not poll for events to trigger the workflow. This happens with a Kafka cluster that uses the SASL_SSL security protocol. With another remote Kafka cluster that uses certificate-based authentication, it works fine. I'm trying to understand whether there are any restrictions on the Kafka auth mechanisms supported.
I concluded that authentication works as expected because an incorrect username/password in the "kafka.sasl.jaas.config" value produces a proper auth failure error, whereas the correct credentials produce no error and print the logs shown at the bottom. However, even though the logs say the consumer has started, no consumer group is registered for the topic in scope, so something is going wrong there. There is no issue with the Kafka broker itself: I can connect and consume/produce messages with the same properties, and it works fine.
Could you please take a look and share your comments?
Properties used for local Kafka connection in workflow service:
kafka.bootstrap.servers=127.0.0.1:9092
Properties used for remote Kafka connection where it is not working:
kafka.bootstrap.servers=<brokers>
kafka.security.protocol=SASL_SSL
kafka.sasl.mechanism=SCRAM-SHA-512
kafka.sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="<username>" password="<password>";
# Tried with and without the two properties below; same result in both cases
kafka.ssl.truststore.location=<truststorepath>
kafka.ssl.truststore.password=<truststorepassword>
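For cross-checking against the broker with the stock Kafka client tools (outside the workflow service), the same settings can be expressed as a standalone client properties file. Note that the `kafka.` prefix used by the workflow service configuration is dropped for plain clients. A sketch, keeping the same placeholder values:

```properties
# client.properties - equivalent settings for the stock Kafka CLI tools,
# without the 'kafka.' prefix used by the workflow service configuration.
# All bracketed values are placeholders.
bootstrap.servers=<brokers>
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="<username>" password="<password>";
ssl.truststore.location=<truststorepath>
ssl.truststore.password=<truststorepassword>
```

This is the file that can be passed via `--command-config` to tools such as kafka-console-consumer.sh or kafka-consumer-groups.sh.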
Logs:
17:03:26 INFO traceId=, parentId=, spanId=, sampled= [io.sm.re.me.kafka] (Quarkus Main Thread) SRMSG18229: Configured topics for channel '<channelName>': [<topicName>]
17:03:27 INFO traceId=, parentId=, spanId=, sampled= [io.sm.re.me.kafka] (smallrye-kafka-consumer-thread-4) SRMSG18257: Kafka consumer kafka-consumer-<channelName/topicName>, connected to Kafka brokers '<brokersList>', belongs to the 'sw-xyz-in8' consumer group and is configured to poll records from [<topicName>]
17:03:27 INFO traceId=, parentId=, spanId=, sampled= [or.ki.ko.ev.im.AbstractMessageConsumer] (Quarkus Main Thread) Consumer for <channelName/topicName> started
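To double-check the observation that no consumer group is registered, the group named in the startup logs can be described directly against the broker. A sketch, assuming kafka-consumer-groups.sh is on the PATH and a client.properties file carries the same SASL_SSL settings as above (the broker list is a placeholder):

```shell
# Describe the consumer group from the startup logs ('sw-xyz-in8').
# If the consumer really joined, this lists its members and assignments;
# if the group does not exist, the tool reports that instead.
kafka-consumer-groups.sh \
  --bootstrap-server <brokers> \
  --command-config client.properties \
  --describe --group sw-xyz-in8
```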
Expected behavior
No response
Actual behavior
No response
How to Reproduce?
No response
Output of uname -a or ver
No response
Output of java -version
No response
GraalVM version (if different from Java)
No response
Kogito version or git rev (or at least Quarkus version if you are using Kogito via Quarkus platform BOM)
No response
Build tool (ie. output of mvnw --version or gradlew --version)
No response
Additional information
No response