receiver_creator doesn't work with kafkametrics receiver in 0.123.0 #39313

Closed
dmitryax opened this issue Apr 11, 2025 · 3 comments · Fixed by #39395
Labels: bug, help wanted, receiver/kafkametrics, receiver/receivercreator

Comments


dmitryax commented Apr 11, 2025

Component(s)

receiver/receivercreator

What happened?

Description

receiver_creator doesn't work with kafkametrics receiver in 0.123.0

Steps to Reproduce

  1. Run a Kafka container. For example, https://github.com/apache/kafka/blob/3.7.0/docker/examples/jvm/single-node/plaintext/docker-compose.yml can be used.

  2. Run the collector with receiver_creator and the kafkametrics receiver, using the following config:

extensions:
  docker_observer:
receivers:
  receiver_creator:
    watch_observers: [docker_observer]
    receivers:
      kafkametrics:
        rule: type == "container" and any([name, image, command], {# matches "(?i)(.*)kafka(.*)"})
        config:
          client_id: "otel-test"
          protocol_version: "2.0.0"
exporters:
  debug:
    verbosity: detailed
service:
  extensions: [docker_observer]
  telemetry:
    logs:
      level: debug
  pipelines:
    metrics:
      receivers: [receiver_creator]
      exporters: [debug]

Expected Result

The kafkametrics receiver starts automatically and scrapes Kafka metrics.

Actual Result

The Kafka metrics receiver fails to start with the following logs:

2025-04-10T16:42:06.658-0700	debug	endpointswatcher/endpointswatcher.go:187	added endpoints	{"otelcol.component.id": "docker_observer", "otelcol.component.kind": "Extension", "notify": "receiver_creator", "b49d4280e3babd7c86868a44149c4c8b0dc57014f252547c3f04c3bf77311a9d:65216": "{\"alternate_port\":65216,\"command\":\"/bin/ryuk\",\"container_id\":\"b49d4280e3babd7c86868a44149c4c8b0dc57014f252547c3f04c3bf77311a9d\",\"endpoint\":\"172.17.0.3:8080\",\"host\":\"172.17.0.3\",\"id\":\"b49d4280e3babd7c86868a44149c4c8b0dc57014f252547c3f04c3bf77311a9d:65216\",\"image\":\"testcontainers/ryuk\",\"labels\":{\"org.testcontainers\":\"true\",\"org.testcontainers.lang\":\"go\",\"org.testcontainers.reaper\":\"true\",\"org.testcontainers.ryuk\":\"true\",\"org.testcontainers.sessionId\":\"11de74282ce2a6e23e5562c1634747cf4964455899354715718e0a3ab84e98fa\",\"org.testcontainers.version\":\"0.35.0\"},\"name\":\"reaper_11de74282ce2a6e23e5562c1634747cf4964455899354715718e0a3ab84e98fa\",\"port\":8080,\"tag\":\"0.11.0\",\"transport\":\"TCP\",\"type\":\"container\"}", "a919c89347366257d7adbf9b50a51c065063fea13c335945deb7359e52a49190:9092": "{\"alternate_port\":9092,\"command\":\"/etc/kafka/docker/run\",\"container_id\":\"a919c89347366257d7adbf9b50a51c065063fea13c335945deb7359e52a49190\",\"endpoint\":\"172.18.0.2:9092\",\"host\":\"172.18.0.2\",\"id\":\"a919c89347366257d7adbf9b50a51c065063fea13c335945deb7359e52a49190:9092\",\"image\":\"apache/kafka\",\"labels\":{\"com.docker.compose.config-hash\":\"c267e5fc0ac6c368ecec1410c5d5e76725582be9dcc7bfbda912e624e307de02\",\"com.docker.compose.container-number\":\"1\",\"com.docker.compose.depends_on\":\"\",\"com.docker.compose.image\":\"sha256:41c20d65b0a2180b9deca056d47540771752e69fe9225d5748dd65a25a9675d7\",\"com.docker.compose.oneoff\":\"False\",\"com.docker.compose.project\":\"docker\",\"com.docker.compose.project.config_files\":\"/Users/danoshin/Projects/splunk-otel-collector/docker/docker-compose.yml\",\"com.docker.compose.project.working_dir\":\"/Users/danoshin/Projects/splunk-otel-collector/docker\",\"com.docker.compose.service\":\"kafka-kraft-single\",\"com.docker.compose.version\":\"2.34.0\",\"maintainer\":\"Apache Kafka\",\"org.label-schema.build-date\":\"2024-02-09\",\"org.label-schema.description\":\"Apache Kafka\",\"org.label-schema.name\":\"kafka\",\"org.label-schema.vcs-url\":\"https://github.com/apache/kafka\"},\"name\":\"broker\",\"port\":9092,\"tag\":\"3.7.0\",\"transport\":\"TCP\",\"type\":\"container\"}"}
2025-04-10T16:42:06.659-0700	info	[email protected]/observerhandler.go:201	starting receiver	{"otelcol.component.id": "receiver_creator", "otelcol.component.kind": "Receiver", "otelcol.signal": "metrics", "name": "kafkametrics", "endpoint": "172.18.0.2:9092", "endpoint_id": "a919c89347366257d7adbf9b50a51c065063fea13c335945deb7359e52a49190:9092", "config": {"brockers":["`endpoint`"],"client_id":"otel-integration-test","protocol_version":"2.0.0","scrapers":["brokers"]}}
2025-04-10T16:42:06.660-0700	error	[email protected]/observerhandler.go:217	failed to start receiver	{"otelcol.component.id": "receiver_creator", "otelcol.component.kind": "Receiver", "otelcol.signal": "metrics", "receiver": "kafkametrics", "error": "failed to load \"kafkametrics\" template config: decoding failed due to the following error(s):\n\nerror decoding '': decoding failed due to the following error(s):\n\n'' has invalid keys: brockers, endpoint"}
github.com/open-telemetry/opentelemetry-collector-contrib/receiver/receivercreator.(*observerHandler).startReceiver
	github.com/open-telemetry/opentelemetry-collector-contrib/receiver/[email protected]/observerhandler.go:217
github.com/open-telemetry/opentelemetry-collector-contrib/receiver/receivercreator.(*observerHandler).OnAdd
	github.com/open-telemetry/opentelemetry-collector-contrib/receiver/[email protected]/observerhandler.go:105
github.com/open-telemetry/opentelemetry-collector-contrib/extension/observer/endpointswatcher.(*EndpointsWatcher).updateAndNotifyOfEndpoints
	github.com/open-telemetry/opentelemetry-collector-contrib/extension/[email protected]/endpointswatcher/endpointswatcher.go:114
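The "has invalid keys" message comes from mapstructure's unused-key check, which the collector's confmap package enables by default: any key that the target config struct doesn't declare fails the whole decode. A minimal sketch of the mechanism, using a hypothetical trimmed-down config struct rather than the receiver's real one:

```go
package main

import (
	"fmt"

	"go.opentelemetry.io/collector/confmap"
)

// trimmedConfig is a hypothetical stand-in for the kafkametrics receiver
// config; the real struct declares many more fields.
type trimmedConfig struct {
	Brokers         []string `mapstructure:"brokers"`
	ClientID        string   `mapstructure:"client_id"`
	ProtocolVersion string   `mapstructure:"protocol_version"`
}

func main() {
	// receiver_creator merges the discovered endpoint into the templated
	// user config before starting the receiver, yielding a map like this.
	merged := confmap.NewFromStringMap(map[string]any{
		"client_id":        "otel-test",
		"protocol_version": "2.0.0",
		"endpoint":         "172.18.0.2:9092", // injected key, not declared on the struct
	})

	var cfg trimmedConfig
	// confmap decodes with mapstructure's ErrorUnused behavior, so the
	// undeclared "endpoint" key fails the decode.
	if err := merged.Unmarshal(&cfg); err != nil {
		fmt.Println(err) // reports: '' has invalid keys: endpoint
	}
}
```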

Collector version

0.123.0

Additional context

I've traced the regression to #38634, but haven't investigated further yet.
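For context on why a change like #38634 could trigger this: receiver_creator only injects the discovered `endpoint` key into the templated config when the target receiver appears to accept it. A rough sketch of that idea, with a hypothetical helper rather than the actual receiver_creator code:

```go
package sketch

import (
	"go.opentelemetry.io/collector/confmap"
	"go.opentelemetry.io/collector/receiver"
)

// endpointSupported is a hypothetical probe, not the actual implementation:
// trial-decode a lone "endpoint" key against the receiver's default config
// and inject the key only when the decode succeeds. A custom Unmarshal
// method on the receiver's config changes how this decode behaves, which
// is the kind of interaction a change like #38634 can break.
func endpointSupported(factory receiver.Factory) bool {
	cfg := factory.CreateDefaultConfig()
	trial := confmap.NewFromStringMap(map[string]any{"endpoint": ""})
	return trial.Unmarshal(cfg) == nil
}
```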

dmitryax added the bug and needs triage labels Apr 11, 2025

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

dmitryax added the receiver/kafkametrics and help wanted labels and removed the needs triage label Apr 11, 2025

Pinging code owners for receiver/kafkametrics: @dmitryax. See Adding Labels via Comments if you do not have permissions to add labels yourself. For example, comment '/label priority:p2 -needs-triaged' to set the priority and remove the needs-triaged label.

@muskan2622

/assign

dmitryax added a commit that referenced this issue Apr 15, 2025
#### Description
Fix dynamic start of kafka metrics receiver

#### Link to tracking issue
Fixes #39313.
The issue is caused by the custom unmarshaler added to the kafka
metrics receiver, specifically [this
line](https://github.com/open-telemetry/opentelemetry-collector-contrib/pull/38634/files#diff-ac118f5e10dc8be3556f9c7b192ca2a98161c1289a5a0bc9ddaa812e397acfaaR65).
The extra `Unmarshal` call seems to be redundant; however, it's an
established practice in other receivers, so I don't want to change that
yet. The change in `receiver_creator` also resolves the problem and
doesn't have any side effects.

#### Testing
Follow the reproduction steps from the issue.
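For reference, the established practice mentioned above is the custom unmarshaler pattern many receivers use; a minimal sketch with a placeholder Config (the receiver's actual method does more):

```go
package sketch

import "go.opentelemetry.io/collector/confmap"

// Config is a placeholder for the receiver's config struct.
type Config struct {
	Brokers []string `mapstructure:"brokers"`
}

// Unmarshal implements confmap.Unmarshaler. confmap skips this method on
// the top-level target to avoid infinite recursion, so the inner
// conf.Unmarshal(c) performs the plain struct decode; receivers use this
// hook to apply defaults or extra validation after decoding.
func (c *Config) Unmarshal(conf *confmap.Conf) error {
	if err := conf.Unmarshal(c); err != nil {
		return err
	}
	// post-decode defaulting/validation would go here
	return nil
}
```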