
Span service.name attribute ignored during NativeIngest APM stats metric aggregation #35481

Closed
tinyjacky opened this issue Sep 28, 2024 · 4 comments


@tinyjacky

Component(s)

connector/datadog

What happened?

Description

With the connector.datadogconnector.NativeIngest feature gate enabled, APM stats metric aggregation uses only the resource attribute service.name as the service name; the span attribute service.name is ignored.

This is believed to be a bug because, in the Datadog backend, the service value on the span itself is still taken from the span attribute service.name, while the service value on the APM metrics comes from the resource attribute service.name.

For these kinds of spans, the APM metrics UX is broken because of the inconsistency.

(Screenshots attached.)
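
For context, collector feature gates can be toggled on the command line. A minimal sketch, assuming the contrib distribution and that the gate is not already enabled by default in your version (the binary and config file names are placeholders):

otelcol-contrib --config config.yaml --feature-gates=connector.datadogconnector.NativeIngest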

Steps to Reproduce

Send a span whose span attribute service.name is set to sname and whose resource attribute service.name is set to rname.
For example,

{
  "resourceSpans": [
    {
      "resource": {
        "attributes": [
          {
            "key": "service.name",
            "value": {
              "stringValue": "rname"
            }
          },
          {
            "key": "deployment.environment",
            "value": {
              "stringValue": "local"
            }
          }
        ]
      },
      "scopeSpans": [
        {
          "scope": {
            "name": "io.opentelemetry.http-url-connection"
          },
          "spans": [
            {
              "traceId": "71699b6fe85982c7c8995ea3d9c95df2",
              "spanId": "3c191d03fa8be065",
              "name": "spanitron",
              "kind": 3,
              "startTimeUnixNano": "1716776846359907000",
              "endTimeUnixNano": "1716776846360086000",
              "droppedAttributesCount": 0,
              "events": [],
              "droppedEventsCount": 0,
              "status": {
                "code": 1
              },
              "attributes": [
                {
                  "key": "service.name",
                  "value" : {
                    "stringValue": "sname"
                  }
                }
              ]
            }
          ]
        }
      ]
    }
  ]
}
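
For reference, a payload like the one above can be posted to the collector's OTLP/HTTP receiver. The /v1/traces path is the standard OTLP/HTTP traces endpoint; the address and the file name span.json are placeholders:

curl -X POST http://localhost:4318/v1/traces \
  -H "Content-Type: application/json" \
  -d @span.json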

Expected Result

APM metrics with attr service:sname.

Actual Result

APM metrics with attr service:rname.

Collector version

v0.110.0

Environment information

Environment

OS: macOS Sonoma 14.6.1, arm64

OpenTelemetry Collector configuration

connectors:
  datadog/connector:
    traces:
      compute_stats_by_span_kind: true
      compute_top_level_by_span_kind: true
      peer_tags_aggregation: true
exporters:
  datadog:
    api:
      key: ${env:DD_API_KEY}
      site: datadoghq.com
    host_metadata:
      enabled: false
    traces:
      compute_stats_by_span_kind: true
      compute_top_level_by_span_kind: true
      peer_tags_aggregation: true
  debug:
    verbosity: detailed
    sampling_initial: 5
    sampling_thereafter: 200
extensions:
  health_check:
    endpoint: ${env:MY_POD_IP}:13133
processors:
  batch: {}
  memory_limiter:
    check_interval: 5s
    limit_mib: 45
    spike_limit_percentage: 25
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: ${env:MY_POD_IP}:4317
      http:
        endpoint: ${env:MY_POD_IP}:4318
service:
  extensions:
  - health_check
  pipelines:
    metrics:
      receivers:
      - datadog/connector
      exporters:
      - datadog
    traces:
      exporters:
      - datadog/connector
      - datadog
      processors:
      - memory_limiter
      receivers:
      - otlp
  telemetry:
    metrics:
      address: ${env:MY_POD_IP}:8888

Log output

No response

Additional context

No response

@tinyjacky tinyjacky added bug Something isn't working needs triage New item requiring triage labels Sep 28, 2024

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.


songy23 commented Sep 30, 2024

@tinyjacky if you remove service.name from the span attributes and leave only the one in the resource attributes, that should resolve the issue.

According to the OTel spec (https://opentelemetry.io/docs/specs/semconv/resource/#service), service.name is a resource attribute rather than a span attribute. Datadog's semantic mapping doc suggests the same: https://docs.datadoghq.com/getting_started/tagging/unified_service_tagging/?tab=kubernetes#opentelemetry

When using OpenTelemetry, map the following resource attributes to their corresponding Datadog conventions

We plan to eventually remove the mappings that do not conform to OTel specs following some deprecation period, such as getting resource attributes from span attributes. /cc @IbraheemA @dineshg13
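
As a possible workaround along those lines, a transform processor could strip the span-level attribute before the spans reach the connector. A minimal sketch, assuming the transform processor is available in the collector distribution (the name drop_span_service_name is only an example):

processors:
  transform/drop_span_service_name:
    trace_statements:
      - context: span
        statements:
          - delete_key(attributes, "service.name")

The processor would also need to be added ahead of the connector in the traces pipeline, e.g. processors: [memory_limiter, transform/drop_span_service_name].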

@songy23 songy23 added waiting for author and removed bug Something isn't working needs triage New item requiring triage labels Sep 30, 2024

github-actions bot commented Dec 2, 2024

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@github-actions github-actions bot added the Stale label Dec 2, 2024

mx-psi commented Dec 2, 2024

Closing as wontfix since we do not have plans to further support setting resource semantic conventions as span attributes.

@dineshg13 @IbraheemA to file/link a separate issue for tracking the removal of support for existing resource semantic conventions set as span attributes.
