Component(s)

connector/datadog, exporter/datadog

What happened?

Description

In versions 0.95.0 through 0.113.0, the service catalog is not populated with the correct "env" tag.

Steps to Reproduce

Update the collector to version 0.95.0 or later.
Different combinations of compute_stats_by_span_kind and compute_top_level_by_span_kind do not seem to have any effect.

Expected Result

The service catalog should contain services and data filtered by the "env" tag.

Actual Result

The service catalog only shows data under "env:none".

Collector version

v0.95.0 to v0.113.0

Environment information

Environment

OS: Ubuntu 20.04
OpenTelemetry Collector configuration

apiVersion: helm.toolkit.fluxcd.io/v2beta1
kind: HelmRelease
metadata:
  name: opentelemetry-collector-np1a
spec:
  chart:
    spec:
      version: "0.109.0"
  values:
    resources:
      requests:
        cpu: 30m
        memory: 500Mi
      limits:
        cpu: 256m
        memory: 2048Mi
    image:
      repository: "otel/opentelemetry-collector-contrib"
      tag: 0.113.0
    config:
      processors:
        attributes:
          actions:
            - key: env
              value: np-test
              action: insert
      connectors:
        datadog/connector:
          traces:
            compute_stats_by_span_kind: true
            compute_top_level_by_span_kind: true
      exporters:
        datadog/exporter:
          api:
            site: datadoghq.eu
            key: ${env:DATADOG_API_KEY}
          traces:
            trace_buffer: 1000
          host_metadata:
            enabled: true
      service:
        pipelines:
          traces:
            receivers:
              - otlp
            processors:
              - batch
              - attributes
            exporters:
              - datadog/connector
              - datadog/exporter
          metrics:
            receivers:
              - otlp
              - datadog/connector
            processors:
              - batch
              - attributes
            exporters:
              - datadog/exporter
          logs: null
Log output

2024-11-08T10:42:59.165Z info [email protected]/service.go:166 Setting up own telemetry...
2024-11-08T10:42:59.165Z warn [email protected]/service.go:221 service::telemetry::metrics::address is being deprecated in favor of service::telemetry::metrics::readers
2024-11-08T10:42:59.165Z info telemetry/metrics.go:70 Serving metrics {"address": "10.160.58.246:8888", "metrics level": "Normal"}
2024-11-08T10:42:59.166Z info builders/builders.go:26 Development component. May change in the future. {"kind": "exporter", "data_type": "logs", "name": "debug"}
2024-11-08T10:42:59.281Z info provider/provider.go:71 Resolved source {"kind": "exporter", "data_type": "metrics", "name": "datadog/exporter", "provider": "system", "source": {"Kind":"host","Identifier":"opentelemetry-collector-np1a-78f9dccf65-66lcd"}}
2024-11-08T10:42:59.282Z info clientutil/api.go:41 Validating API key. {"kind": "exporter", "data_type": "metrics", "name": "datadog/exporter"}
2024-11-08T10:42:59.282Z info [email protected]/factory.go:62 Datadog connector using the native OTel API to ingest OTel spans and produce APM stats. To revert to the legacy processing pipeline, disable the feature gate {"kind": "connector", "name": "datadog/connector", "exporter_in_pipeline": "traces", "receiver_in_pipeline": "metrics", "feature gate": "connector.datadogconnector.NativeIngest"}
2024-11-08T10:42:59.283Z info [email protected]/connector_native.go:62 Building datadog connector for traces to metrics {"kind": "connector", "name": "datadog/connector", "exporter_in_pipeline": "traces", "receiver_in_pipeline": "metrics"}
2024-11-08T10:42:59.283Z info [email protected]/connector.go:117 traces::compute_top_level_by_span_kind needs to be enabled in both the Datadog connector and Datadog exporter configs if both components are being used {"kind": "connector", "name": "datadog/connector", "exporter_in_pipeline": "traces", "receiver_in_pipeline": "metrics"}
2024-11-08T10:42:59.283Z info [email protected]/factory.go:359 Trace metrics are now disabled in the Datadog Exporter by default. To continue receiving Trace Metrics, configure the Datadog Connector or disable the feature gate. {"kind": "exporter", "data_type": "traces", "name": "datadog/exporter", "documentation": "https://docs.datadoghq.com/opentelemetry/guide/migration/", "feature gate ID": "exporter.datadogexporter.DisableAPMStats"}
2024-11-08T10:42:59.283Z info [email protected]/zaplogger.go:38 Starting Agent with processor trace buffer of size 1000 {"kind": "exporter", "data_type": "traces", "name": "datadog/exporter"}
2024-11-08T10:42:59.283Z info [email protected]/zaplogger.go:38 Receiver configured with 4 decoders and a timeout of 0ms {"kind": "exporter", "data_type": "traces", "name": "datadog/exporter"}
2024-11-08T10:42:59.285Z info [email protected]/zaplogger.go:38 Trace writer initialized (climit=5 qsize=1 compression=gzip) {"kind": "exporter", "data_type": "traces", "name": "datadog/exporter"}
2024-11-08T10:42:59.286Z info [email protected]/zaplogger.go:38 Processing Pipeline configured with 8 workers {"kind": "exporter", "data_type": "traces", "name": "datadog/exporter"}
2024-11-08T10:42:59.286Z info clientutil/api.go:41 Validating API key. {"kind": "exporter", "data_type": "traces", "name": "datadog/exporter"}
2024-11-08T10:42:59.287Z info [email protected]/memorylimiter.go:151 Using percentage memory limiter {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "total_memory_mib": 2048, "limit_percentage": 80, "spike_limit_percentage": 25}
2024-11-08T10:42:59.287Z info [email protected]/memorylimiter.go:75 Memory limiter configured {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "limit_mib": 1638, "spike_limit_mib": 512, "check_interval": 5}
2024-11-08T10:42:59.370Z info [email protected]/service.go:238 Starting otelcol-contrib... {"Version": "0.113.0", "NumCPU": 8}
2024-11-08T10:42:59.370Z info extensions/extensions.go:39 Starting extensions...
2024-11-08T10:42:59.371Z info extensions/extensions.go:42 Extension is starting... {"kind": "extension", "name": "health_check"}
2024-11-08T10:42:59.371Z info [email protected]/healthcheckextension.go:33 Starting health_check extension {"kind": "extension", "name": "health_check", "config": {"Endpoint":"10.160.58.246:13133","TLSSetting":null,"CORS":null,"Auth":null,"MaxRequestBodySize":0,"IncludeMetadata":false,"ResponseHeaders":null,"CompressionAlgorithms":null,"ReadTimeout":0,"ReadHeaderTimeout":0,"WriteTimeout":0,"IdleTimeout":0,"Path":"/","ResponseBody":null,"CheckCollectorPipeline":{"Enabled":false,"Interval":"5m","ExporterFailureThreshold":5}}}
2024-11-08T10:42:59.371Z info extensions/extensions.go:59 Extension started. {"kind": "extension", "name": "health_check"}
2024-11-08T10:42:59.372Z info [email protected]/otlp.go:112 Starting GRPC server {"kind": "receiver", "name": "otlp", "data_type": "metrics", "endpoint": "10.160.58.246:4317"}
2024-11-08T10:42:59.372Z info [email protected]/otlp.go:169 Starting HTTP server {"kind": "receiver", "name": "otlp", "data_type": "metrics", "endpoint": "10.160.58.246:4318"}
2024-11-08T10:42:59.372Z info [email protected]/connector_native.go:91 Starting datadogconnector {"kind": "connector", "name": "datadog/connector", "exporter_in_pipeline": "traces", "receiver_in_pipeline": "metrics"}
2024-11-08T10:42:59.379Z info healthcheck/handler.go:132 Health Check state change {"kind": "extension", "name": "health_check", "status": "ready"}
2024-11-08T10:42:59.379Z info [email protected]/service.go:261 Everything is ready. Begin running and processing data.
2024-11-08T10:42:59.549Z info clientutil/api.go:45 API key validation successful. {"kind": "exporter", "data_type": "metrics", "name": "datadog/exporter"}
2024-11-08T10:42:59.549Z info clientutil/api.go:45 API key validation successful. {"kind": "exporter", "data_type": "traces", "name": "datadog/exporter"}
1731062585519773101 [Debug] Error fetching info for pid 1: %!w(*fs.PathError=&{open /etc/passwd 2})
2024-11-08T10:43:08.869Z info Logs {"kind": "exporter", "data_type": "logs", "name": "debug", "resource logs": 1, "log records": 12}
2024-11-08T10:43:10.154Z info Logs {"kind": "exporter", "data_type": "logs", "name": "debug", "resource logs": 1, "log records": 4}
2024-11-08T10:43:11.761Z info Logs {"kind": "exporter", "data_type": "logs", "name": "debug", "resource logs": 1, "log records": 9}
2024-11-08T10:43:13.173Z info Logs {"kind": "exporter", "data_type": "logs", "name": "debug", "resource logs": 1, "log records": 12}
2024-11-08T10:43:20.276Z info Logs {"kind": "exporter", "data_type": "logs", "name": "debug", "resource logs": 1, "log records": 4}
2024-11-08T10:43:21.880Z info Logs {"kind": "exporter", "data_type": "logs", "name": "debug", "resource logs": 1, "log records": 18}
2024-11-08T10:43:23.086Z info Logs {"kind": "exporter", "data_type": "logs", "name": "debug", "resource logs": 1, "log records": 12}
2024-11-08T10:43:30.110Z info Logs {"kind": "exporter", "data_type": "logs", "name": "debug", "resource logs": 1, "log records": 8}
2024-11-08T10:43:31.914Z info Logs {"kind": "exporter", "data_type": "logs", "name": "debug", "resource logs": 1, "log records": 9}
Additional context

No response
@rubencc there are 2 problems with your config:
...
processors:
  attributes:
    actions:
      - key: env
        value: np-test
        action: insert
...

The attributes processor only inserts env as a span attribute, but the Datadog exporter and connector read the environment from the resource attribute deployment.environment.name, so this env value is never picked up and the service catalog falls back to env:none.
So you should change the lines above to:
...
processors:
  resource:
    attributes:
      - key: deployment.environment.name
        value: np-test
        action: insert
...
and replace the attributes reference with resource in your pipelines.
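For reference, here is a minimal sketch of how the relevant processors and service::pipelines sections could look after both changes. It is based on the HelmRelease above and only shows the keys that change; everything else stays as in the original config.

processors:
  batch: {}
  resource:
    attributes:
      - key: deployment.environment.name
        value: np-test
        action: insert
service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch, resource]    # resource replaces attributes
      exporters: [datadog/connector, datadog/exporter]
    metrics:
      receivers: [otlp, datadog/connector]
      processors: [batch, resource]    # here as well
      exporters: [datadog/exporter]

With the environment carried as a resource attribute, the Datadog components should map it to the env tag and the service catalog entries should no longer land under env:none.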
Hi @songy23
I tried the changes you suggested, and they worked. Thanks a lot.