
Unable to find kafka_connect_connector_metrics_* metrics in the kafka connect application #206

ZealPatel15 opened this issue Feb 15, 2024 · 3 comments

@ZealPatel15

Hi,
I am trying to export the Kafka Connect connector metrics (https://docs.confluent.io/platform/current/connect/monitoring.html#connector-metrics) and scrape them via Prometheus. I followed the Prometheus naming convention to add them to the system as kafka_connect_connector_metrics_connector_class, but the metric doesn't show up.
Can somebody help out here, or is there some other naming convention for these specific metrics? The other worker metrics do get exported.

@hifly81
Contributor

hifly81 commented Feb 16, 2024


Hi, you have an example in this repo:
https://github.com/confluentinc/jmx-monitoring-stacks/blob/main/shared-assets/jmx-exporter/kafka_connect.yml

It definitely depends on the regular expression you defined. Also check that you have not listed these metrics in the blacklistObjectNames section.
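For instance, these are the two things to look at (a rough sketch only; the regex, the blacklist entries, and the connector/class names here are illustrative, the actual rules are the ones in kafka_connect.yml):

```yaml
# MBeans listed in blacklistObjectNames are dropped before any rule is evaluated,
# so connector-metrics must not appear here:
blacklistObjectNames:
  - "kafka.connect:type=kafka-metrics-count,*"    # example entry
  # - "kafka.connect:type=connector-metrics,*"    # <- this would hide the metrics you want

# And one of the rules must have a pattern that matches the MBean. For a
# hypothetical connector named "my-sink", the exporter matches patterns against
# strings of the form:
#   kafka.connect<type=connector-metrics, connector=my-sink><>connector-class: com.example.MySinkConnector
rules:
  - pattern: 'kafka.connect<type=connector-metrics, connector=(.+)><>([a-z-]+): (.+)'
    name: kafka_connect_connector_metrics
    value: 1              # connector-class etc. are strings, so export a constant gauge
    labels:
      connector: "$1"
      $2: "$3"            # the attribute name/value become a label, e.g. connector_class="..."
```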

hifly81 added the question label Feb 16, 2024
@ZealPatel15
Author

Thank you @hifly81 for the information. I have a follow-up question, if you could share some insights on that as well. I am trying to understand this concept of JMX exporting and scraping the metrics with Prometheus.

In the pattern for task-metrics defined in that file:

[image: the task-metrics rule from kafka_connect.yml]

$1 would be replaced by the term source, sink, or connector. For example, we are interested in the type sink-task-metrics, of which there are many per the Confluent list, so $4 would be replaced by one of the sink-task metrics, say sink-record-read-rate. The metric name per the Prometheus naming conventions would then be kafka_connect_sink_task_metrics_sink_record_read_rate, as in the sketch below.
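To check my understanding, this is roughly how I picture that rule working (my own sketch; the exact regex in kafka_connect.yml may differ, and the connector name "my-sink" is just an example):

```yaml
rules:
  # $1 = source / sink / connector, $4 = the attribute name
  - pattern: 'kafka.connect<type=(.+)-task-metrics, connector=(.+), task=(\d+)><>([a-z-]+)'
    name: kafka_connect_$1_task_metrics_$4
    labels:
      connector: "$2"
      task: "$3"
# For the attribute sink-record-read-rate of the MBean
#   kafka.connect:type=sink-task-metrics,connector=my-sink,task=0
# $1 = "sink" and $4 = "sink-record-read-rate"; after the hyphens are sanitized
# to underscores, the exported series is
#   kafka_connect_sink_task_metrics_sink_record_read_rate{connector="my-sink",task="0"}
```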

However, in my case I am interested in the connector-class metric of type connector-metrics, but the name defined in the pattern:

[image: the connector-metrics rule from kafka_connect.yml]

does not have any variables or placeholders, so the final Prometheus metric name would just be kafka_connect_connector_metrics and not kafka_connect_connector_metrics_connector_class, if I am reading the rule correctly (see the sketch below).
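To spell out what I mean, this is how I read the connector-metrics rule (again my own sketch; the exact pattern may differ, and the connector name and values are made up):

```yaml
rules:
  - pattern: 'kafka.connect<type=connector-metrics, connector=(.+)><>([a-z-]+): (.+)'
    name: kafka_connect_connector_metrics      # fixed name, no capture group for the attribute
    value: 1
    labels:
      connector: "$1"
      $2: "$3"                                 # connector-class etc. survive only as labels
# If the rule is shaped like this, a scrape returns one sample per attribute,
# all under the same metric name, e.g.:
#   kafka_connect_connector_metrics{connector="my-sink",connector_class="com.example.MySinkConnector"} 1
#   kafka_connect_connector_metrics{connector="my-sink",status="running"} 1
# and never a series named kafka_connect_connector_metrics_connector_class.
```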

I appreciate your help! Thank you!

@hifly81
Contributor

hifly81 commented Feb 20, 2024


Hi @ZealPatel15,
your observation is correct, and we could definitely improve the pattern you indicated. If you want, feel free to open an issue.
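Just to illustrate the kind of change I mean (a sketch only, not the exact rule we would commit), the attribute name could be captured into the metric name:

```yaml
rules:
  - pattern: 'kafka.connect<type=connector-metrics, connector=(.+)><>(connector-class|connector-type|connector-version|status): (.+)'
    name: kafka_connect_connector_metrics_$2   # hyphens become underscores, giving e.g.
                                               # kafka_connect_connector_metrics_connector_class
    value: 1                                   # the attributes are strings, so export a constant
    labels:
      connector: "$1"
      value: "$3"                              # keep the string attribute value as a label
```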
