Deprecate `topic` and `encoding`, and introduce signal-specific equivalents:

- `logs::topic`, `metrics::topic`, and `traces::topic`
- `logs::encoding`, `metrics::encoding`, and `traces::encoding`

This enables users to explicitly define a configuration equivalent to the default configuration, or some variation thereof. It also allows a different encoding to be specified for each signal type, which matters because some encodings support only a subset of signals.

Closes open-telemetry#35432
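
For illustration, a configuration using the new signal-specific options might look like the following sketch; the broker address and the non-default traces topic name are assumptions, not part of this change:

```yaml
exporters:
  kafka:
    brokers:
      - localhost:9092          # assumed broker address, for illustration only
    # Signal-specific options replacing the deprecated top-level
    # `topic` and `encoding`:
    logs:
      topic: otlp_logs          # same as the previous default
      encoding: otlp_proto
    metrics:
      topic: otlp_metrics
      encoding: otlp_proto
    traces:
      topic: custom_spans_topic # assumed non-default topic name
      encoding: otlp_proto
```
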
The following settings can be optionally configured:

- `resolve_canonical_bootstrap_servers_only` (default = false): Whether to resolve then reverse-lookup broker IPs during startup.
- `client_id` (default = "otel-collector"): The client ID to configure the Kafka client with. The client ID will be used for all produce requests.
- `logs`
  - `topic` (default = otlp_logs): The name of the Kafka topic to which logs will be exported.
  - `encoding` (default = otlp_proto): The encoding for logs. See [Supported encodings](#supported-encodings).
- `metrics`
  - `topic` (default = otlp_metrics): The name of the Kafka topic to which metrics will be exported.
  - `encoding` (default = otlp_proto): The encoding for metrics. See [Supported encodings](#supported-encodings).
- `traces`
  - `topic` (default = otlp_spans): The name of the Kafka topic to which traces will be exported.
  - `encoding` (default = otlp_proto): The encoding for traces. See [Supported encodings](#supported-encodings).
- `topic` (Deprecated in v0.124.0: use `logs::topic`, `metrics::topic`, and `traces::topic`): If specified, this is used as the default topic, but will be overridden by signal-specific configuration. See [Destination Topic](#destination-topic) below for more details.
- `topic_from_attribute` (default = ""): Specify the resource attribute whose value should be used as the message's topic. See [Destination Topic](#destination-topic) below for more details.
- `encoding` (Deprecated in v0.124.0: use `logs::encoding`, `metrics::encoding`, and `traces::encoding`): If specified, this is used as the default encoding, but will be overridden by signal-specific configuration. See [Supported encodings](#supported-encodings) below for more details.
- `partition_traces_by_id` (default = false): configures the exporter to include the trace ID as the message key in trace messages sent to Kafka. *Please note:* this setting does not have any effect on Jaeger encoding exporters since Jaeger exporters include trace ID as the message key by default.
- `partition_metrics_by_resource_attributes` (default = false): configures the exporter to include the hash of sorted resource attributes as the message partitioning key in metric messages sent to Kafka.
- `partition_logs_by_resource_attributes` (default = false): configures the exporter to include the hash of sorted resource attributes as the message partitioning key in log messages sent to Kafka.
- `compression` (default = 'none'): the compression used when producing messages to Kafka. The options are `none`, `gzip`, `snappy`, `lz4`, and `zstd`. See the [producer `compression.type` documentation](https://docs.confluent.io/platform/current/installation/configuration/producer-configs.html#compression-type) for details.
- `flush_max_messages` (default = 0): The maximum number of messages the producer will send in a single broker request.
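
As an illustration of the partitioning and compression settings above, a configuration enabling them might look like the following sketch (the broker address is an assumption, not taken from this document):

```yaml
exporters:
  kafka:
    brokers:
      - localhost:9092                          # assumed broker address
    partition_traces_by_id: true                # key trace messages by trace ID
    partition_metrics_by_resource_attributes: true
    partition_logs_by_resource_attributes: true
    compression: zstd                           # one of none, gzip, snappy, lz4, zstd
```

Keying messages by trace ID or by the resource-attribute hash keeps related records on the same partition, which generally preserves their relative ordering within that partition.
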
### Supported encodings

The Kafka exporter supports encoding extensions, as well as the following built-in encodings.

Available for all signals:

- `otlp_proto`: data is encoded as OTLP Protobuf (`ExportTraceServiceRequest` for traces, `ExportMetricsServiceRequest` for metrics, or `ExportLogsServiceRequest` for logs).
- `otlp_json`: data is encoded as OTLP JSON, using the same request messages as `otlp_proto`.

Available only for traces:

- `jaeger_proto`: the payload is serialized to a single Jaeger proto `Span`, and keyed by TraceID.
- `jaeger_json`: the payload is serialized to a single Jaeger JSON Span using `jsonpb`, and keyed by TraceID.
- `zipkin_proto`: the payload is serialized to Zipkin v2 proto Span.
- `zipkin_json`: the payload is serialized to Zipkin v2 JSON Span.

Available only for logs:

- `raw`: if the log record body is a byte array, it is sent as is. Otherwise, it is serialized to JSON. Resource and record attributes are discarded.
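
Because the Jaeger and Zipkin encodings apply only to traces and `raw` only to logs, the signal-specific options allow mixing encodings, as in the following sketch (the broker address and topic names are assumptions; whether such a combination fits your pipeline depends on the downstream consumers):

```yaml
exporters:
  kafka:
    brokers:
      - localhost:9092        # assumed broker address
    traces:
      topic: zipkin_spans     # assumed topic name
      encoding: zipkin_json   # traces-only encoding
    logs:
      topic: raw_logs         # assumed topic name
      encoding: raw           # logs-only encoding; attributes are discarded
    metrics:
      encoding: otlp_proto    # metrics keep the default OTLP Protobuf encoding
```
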
### Example configuration
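
A minimal configuration might look like the following sketch; the broker address and `protocol_version` value are assumptions rather than requirements stated in this document:

```yaml
exporters:
  kafka:
    brokers:
      - localhost:9092      # assumed broker address
    protocol_version: 2.0.0 # assumed Kafka protocol version
    traces:
      topic: otlp_spans     # the default traces topic, stated explicitly
      encoding: otlp_proto
```
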
## Destination Topic

The destination topic can be defined in a few different ways and takes priority in the following order:

1. When `topic_from_attribute` is configured, and the corresponding attribute is found on the ingested data, the value of this attribute is used.
2. If a prior component in the collector pipeline sets the topic on the context via the `topic.WithTopic` function (from the `github.com/open-telemetry/opentelemetry-collector-contrib/pkg/kafka/topic` package), the value set in the context is used.
3. Finally, the `<signal>::topic` configuration is used for the signal-specific destination topic. If this is not explicitly configured, the `topic` configuration (deprecated in v0.124.0) is used as a fallback for all signals.
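
As a sketch of how this priority plays out, consider the following configuration (the attribute name, topic name, and broker address are assumptions):

```yaml
exporters:
  kafka:
    brokers:
      - localhost:9092                # assumed broker address
    topic_from_attribute: kafka_topic # assumed resource attribute name
    traces:
      topic: otlp_spans
```

Traces whose resource carries the `kafka_topic` attribute would be routed to that attribute's value; all other traces would fall back to `traces::topic` (here `otlp_spans`), unless an upstream component has already set a topic on the context.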