We’re having trouble getting the Generic Kafka Producer to start. Our Generic Kafka Consumer works fine using the same broker address and same SASL/PLAIN credentials, but the Producer fails during startup with:
<code>Confluent.Kafka.KafkaException: Local: Broker transport failure</code> (thrown during <code>AdminClient.GetMetadata()</code>)
In Cube, we also configured an SNMP Manager (Apps → System Center → SNMP Forwarding) and selected that forwarder when creating the Producer element — but the Kafka connection still fails. We’re unclear what that SNMP Manager setting is actually used for in the context of the Kafka Producer.


Hi,
This could be related to firewall/DNS issues between DataMiner and the Kafka broker(s). Please ensure that the broker listener on port 9096 (listeners/advertised.listeners, including the listener-to-security-protocol mapping) matches the client settings configured in DataMiner. Also confirm that the broker provider/network allows connections from the DataMiner server's public IP(s)/network to the Kafka brokers on port 9096, and that the brokers advertise endpoints that are reachable from DataMiner (not only the bootstrap address). After the initial bootstrap, the client reconnects to the advertised addresses returned in the metadata, so an unreachable advertised listener produces exactly this "Broker transport failure" during metadata retrieval.
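As an illustration only, a broker-side listener configuration consistent with this setup could look like the fragment below. The listener name, hostname, and the SASL_SSL mapping are assumptions for the sketch, not values taken from your environment:

```properties
# Hypothetical server.properties fragment -- names/hosts are placeholders.
# Clients first contact the bootstrap address, then reconnect to the
# advertised.listeners addresses returned in the metadata, so these must be
# resolvable and reachable from the DataMiner server.
listeners=SASL_EXTERNAL://0.0.0.0:9096
advertised.listeners=SASL_EXTERNAL://kafka-1.example.com:9096
listener.security.protocol.map=SASL_EXTERNAL:SASL_SSL
sasl.enabled.mechanisms=PLAIN
```

If the Consumer works from the same server with the same credentials, comparing the security settings (security.protocol, sasl.mechanism) of both elements side by side is usually the quickest check.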
The standard SNMP Forwarding functionality is used to filter the alarms. At least one SNMP Manager must be created on each DataMiner Agent.
In the SNMP Manager, you can:
- Define which alarm information should be forwarded (using the custom binding OIDs)
- Define which alarms should be forwarded (via filtering rules)
The SNMP Manager then sends the selected alarm information as SNMP Inform messages over the loopback interface to the Kafka element. In other words, the SNMP Manager selection only controls which alarms the Producer element receives; it has no influence on the Kafka broker connection itself.
Regards,