
Datadog logs exporter doesn't adhere to standard http proxy variables #36292

Closed
fredjn opened this issue Nov 11, 2024 · 5 comments
Labels
bug (Something isn't working), exporter/datadog (Datadog components)

Comments


fredjn commented Nov 11, 2024

Component(s)

exporter/datadog

What happened?

Description

When running the Collector in a network environment where a proxy is required to reach the Datadog key verification APIs, the logs exporter fails to connect (and thus key verification fails). This happens even if the standard proxy environment variables are set. Key verification for the Metrics and Traces exporters works fine.

Details

I traced the problem back to the release where the Datadog agent was introduced to handle the logs communication (v0.108.0?); for unclear reasons it does not pick up the HTTP_PROXY, HTTPS_PROXY and NO_PROXY variables from the environment.
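
For context, the standard Go HTTP transport is what normally makes a client honour these variables, and a client built around its own bare Transport does not. The sketch below is not the exporter's actual code, just a minimal standard-library illustration of that difference; api.datadoghq.eu is used only as a placeholder request target:

package main

import (
	"fmt"
	"net/http"
)

func main() {
	// http.DefaultTransport is built with Proxy: http.ProxyFromEnvironment,
	// so any client that uses it honours HTTP_PROXY, HTTPS_PROXY and NO_PROXY.
	req, err := http.NewRequest(http.MethodGet, "https://api.datadoghq.eu", nil)
	if err != nil {
		panic(err)
	}
	proxyURL, err := http.ProxyFromEnvironment(req)
	if err != nil {
		panic(err)
	}
	// proxyURL is nil when no proxy applies (variables unset or a NO_PROXY match).
	fmt.Println("proxy resolved from environment:", proxyURL)

	// A client wired to a bare Transport has a nil Proxy function, so it dials
	// the target directly and silently ignores the environment variables,
	// which matches the behaviour reported here for the logs exporter.
	direct := &http.Client{Transport: &http.Transport{}}
	if resp, err := direct.Get("https://api.datadoghq.eu"); err == nil {
		resp.Body.Close()
	}
}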

Steps to Reproduce

  1. Fire up the otel/opentelemetry-collector-contrib:0.112.0 container on a network where a proxy is needed to reach the Datadog key verification APIs.
  2. Make sure to inject the HTTP_PROXY and/or HTTPS_PROXY environment variables when starting the container (see the sketch after this list).
  3. Check the logs for a key verification failure message for the logs exporter.
  4. Check the logs for key verification success messages for the trace and metrics exporters.
  5. Send signals to Datadog; only Traces and Metrics show up on the Datadog servers.
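
For reference, a minimal docker-compose sketch of steps 1 and 2. The proxy address, config file name and mounted path below are assumptions rather than values from this issue; substitute your own proxy and configuration:

services:
  otel-collector:
    image: otel/opentelemetry-collector-contrib:0.112.0
    environment:
      HTTP_PROXY: http://proxy.internal:3128     # hypothetical proxy address
      HTTPS_PROXY: http://proxy.internal:3128
      NO_PROXY: localhost,127.0.0.1
    volumes:
      # assumes the contrib image's default config location
      - ./collector-config.yaml:/etc/otelcol-contrib/config.yaml
    ports:
      - "4317:4317"
      - "4318:4318"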

Alternative method

  1. Fire up the otel/opentelemetry-collector-contrib:0.112.0 container on a network where no proxy is needed to reach the Datadog key verification APIs.
  2. Make sure to inject a faulty and/or unreachable proxy address via the HTTP_PROXY and/or HTTPS_PROXY environment variables (see the sketch after this list).
  3. Check the logs for key verification failure messages for the Metrics and Trace signal exporters.
  4. Check the logs for the absence of a key verification failure message for the logs signal exporter.
  5. Send signals to Datadog; only logs show up on the Datadog servers.
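
The same compose sketch can be pointed at a deliberately unreachable proxy for step 2 of this variant; 203.0.113.10 is a TEST-NET documentation address, so nothing should answer there:

services:
  otel-collector:
    image: otel/opentelemetry-collector-contrib:0.112.0
    environment:
      HTTP_PROXY: http://203.0.113.10:3128      # deliberately unreachable
      HTTPS_PROXY: http://203.0.113.10:3128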

Expected Result

All signal exporters (Trace, Metrics and Logs) should adhere to the HTTP_PROXY, HTTPS_PROXY and NO_PROXY environment variables.

Actual Result

Only Trace and Metrics signal exporters adhere to the HTTP_PROXY, HTTPS_PROXY and NO_PROXY environment variables.

Collector version

v0.112.0

Environment information

Environment

Docker container provided by the community via Docker Hub (otel/opentelemetry-collector-contrib:0.112.0), running on a network where a proxy is required to reach the Datadog key verification APIs.

OpenTelemetry Collector configuration

receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318
exporters:
  datadog:
    hostname: some.host
    api:
      key: xxx
      site: datadoghq.eu
    host_metadata:
      tags:
      - owner:myown
      - team:tagteam
  otlp:
    endpoint: some.other.host
    tls:
      insecure: true
  debug:
    verbosity: detailed
connectors:
  datadog/connector:
processors:
  transform:
    metric_statements:
    - context: resource
      statements:
      - set(attributes["team"], "tagteam")
      - set(attributes["env"], "experiment")
    trace_statements:
    - context: resource
      statements:
      - set(attributes["team"], "tagteam")
      - set(attributes["env"], "experiment")
    log_statements:
    - context: resource
      statements:
      - set(attributes["team"], "tagteam")
      - set(attributes["env"], "experiment")
  batch:
    send_batch_max_size: 1000
    send_batch_size: 100
    timeout: 10s
service:
  telemetry:
    logs:
      level: debug
  pipelines:
    logs:
      receivers:
      - otlp
      processors:
      - transform
      - batch
      exporters:
      - datadog
      - debug
    traces:
      receivers:
      - otlp
      processors:
      - transform
      - batch
      exporters:
      - datadog/connector
      - datadog
      - debug
    metrics:
      receivers:
      - datadog/connector
      - otlp
      processors:
      - transform
      - batch
      exporters:
      - datadog

Log output

2024-11-11T13:55:09.326+0100	warn	datadogexporter/zaplogger.go:43	Error while validating API key	{"kind": "exporter", "data_type": "logs", "name": "datadog"}
2024-11-11T13:56:36.660+0100	info	clientutil/api.go:40	Validating API key.	{"kind": "exporter", "data_type": "traces", "name": "datadog"}
2024-11-11T13:56:36.661+0100	info	clientutil/api.go:40	Validating API key.	{"kind": "exporter", "data_type": "metrics", "name": "datadog"}
2024-11-11T13:56:36.751+0100	info	clientutil/api.go:44	API key validation successful.	{"kind": "exporter", "data_type": "metrics", "name": "datadog"}
2024-11-11T13:56:36.765+0100	info	clientutil/api.go:44	API key validation successful.	{"kind": "exporter", "data_type": "traces", "name": "datadog"}

Additional context

No response

@fredjn fredjn added bug Something isn't working needs triage New item requiring triage labels Nov 11, 2024
@github-actions github-actions bot added the exporter/datadog Datadog components label Nov 11, 2024
github-actions bot commented Nov 11, 2024

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.


songy23 commented Nov 11, 2024

@fredjn This is a known limitation of our upstream dependency; for now you will want to set this config to your proxy URL:
https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/main/pkg/datadog/config/logs.go#L12

datadog:
  logs:
    endpoint: xyz

Or, you can disable the feature gate exporter.datadogexporter.UseLogsAgentExporter
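
For reference, disabling that gate when running the contrib image looks roughly like the sketch below; the minus prefix in --feature-gates disables a gate, and the config path assumes the image default:

services:
  otel-collector:
    image: otel/opentelemetry-collector-contrib:0.112.0
    command:
      - "--config=/etc/otelcol-contrib/config.yaml"
      - "--feature-gates=-exporter.datadogexporter.UseLogsAgentExporter"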

If you want support for the proxy env vars, please file a ticket on https://www.datadoghq.com/support

@songy23 songy23 closed this as completed Nov 11, 2024
@songy23 songy23 removed the needs triage New item requiring triage label Nov 11, 2024

songy23 commented Nov 11, 2024

cc @liustanley


fredjn commented Nov 11, 2024

@songy23 your suggestion about setting the datadog/logs/endpoint to my proxy URL does not seem to work. Or did I misunderstand your previous comment?

I added it under exporters like so:

exporters:
  datadog:
    logs:
      endpoint: <my proxy url>


songy23 commented Nov 11, 2024

@fredjn please try the alternative

Or, you can disable the feature gate exporter.datadogexporter.UseLogsAgentExporter

And file a ticket on https://www.datadoghq.com/support so that our team can triage accordingly.
