Generic Kafka Producer fails with “Broker transport failure” while Consumer works

247 views · 23rd February 2026 · kafka, kafka prod

A B M Siddique · asked 17th February 2026 · 0 Comments

We’re having trouble getting the Generic Kafka Producer to start. Our Generic Kafka Consumer works fine using the same broker address and same SASL/PLAIN credentials, but the Producer fails during startup with:

Confluent.Kafka.KafkaException: Local: Broker transport failure
(occurs during AdminClient.GetMetadata())

In Cube, we also configured an SNMP Manager (Apps → System Center → SNMP Forwarding) and selected that forwarder when creating the Producer element — but the Kafka connection still fails. We’re unclear what that SNMP Manager setting is actually used for in the context of the Kafka Producer.


Tiago Pina [SLC] [DevOps Member] Answered question 23rd February 2026

1 Answer

Tiago Pina [SLC] [DevOps Member] · Posted 23rd February 2026 · 2 Comments

Hi

It could be related to firewall/DNS between DataMiner and the Kafka broker(s). Please ensure that the broker listener on port 9096 (listeners/advertised.listeners, including the protocol mapping) matches the DataMiner client settings. Also confirm that the broker provider/network allows connections from your DataMiner server's public IP(s)/network to the Kafka brokers on port 9096, and that the broker advertises endpoints that are reachable from DataMiner (not only the bootstrap address).
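For reference, a broker-side listener setup matching these client settings would look something like the sketch below in server.properties (the listener name, hostname, and port are illustrative placeholders, and whether the listener uses SASL_SSL or SASL_PLAINTEXT depends on your deployment):

# server.properties (illustrative placeholders)
# The advertised host must be resolvable and reachable from the DataMiner server.
listeners=EXTERNAL://0.0.0.0:9096
advertised.listeners=EXTERNAL://kafka-broker.example.com:9096
listener.security.protocol.map=EXTERNAL:SASL_SSL
sasl.enabled.mechanisms=PLAIN

If advertised.listeners points to an internal hostname or IP, a client can still bootstrap successfully and then fail on the metadata-driven connections that follow.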

The standard SNMP Forwarding functionality is used to filter the alarms to be forwarded. On each DataMiner Agent, at least one SNMP Manager must be created.

In the SNMP Manager, you can:

  • Define which alarm information should be forwarded (using the custom binding OIDs)
  • Define which alarms should be forwarded (via filtering rules)

The SNMP Manager then sends the selected alarm information as SNMP Inform messages over the loopback interface to the Kafka element.

Regards

Tiago Pina [SLC] [DevOps Member] Posted new comment 2 days ago
A B M Siddique commented 2nd March 2026

Thank you, Tiago, for your response. We have checked port 9096 and we are able to reach the broker with the consumer protocol, but we cannot connect to the same broker with the producer protocol.

We configured an SNMP forwarder over loopback on the same DMA where the Kafka element is hosted, using the default OID bindings. Are there any other steps we could take to stop the connection from failing? Thank you in advance.

Tiago Pina [SLC] [DevOps Member] commented 2 days ago

The consumer working with the same bootstrap server and SASL/PLAIN credentials confirms that the initial connection/authentication path to at least one broker is valid, but it does not rule out a broker reachability issue later on. Kafka uses bootstrap.servers to establish the initial connection and discover the cluster; clients then connect to the broker addresses returned in metadata, which come from advertised.listeners. Since the Kafka producer explicitly calls AdminClient.GetMetadata() at startup, it can fail at that point if one or more of those broker addresses is not reachable or resolvable from the connector host.

The main thing to verify is that the advertised.listeners are reachable from the DataMiner server and match the listener/security configuration used by the producer.
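As a concrete check, you can ask the cluster for its metadata and then attempt a raw TCP connection to every advertised endpoint from the DataMiner server. The sketch below uses a placeholder broker address and credentials, and assumes a SASL_SSL listener; adjust both to your environment:

using System;
using System.Net.Sockets;
using Confluent.Kafka;

// Connect to the bootstrap address, fetch cluster metadata, and probe
// every broker endpoint the cluster advertises (advertised.listeners).
var config = new AdminClientConfig
{
    BootstrapServers = "kafka-broker.example.com:9096", // placeholder
    SecurityProtocol = SecurityProtocol.SaslSsl,        // match your listener
    SaslMechanism = SaslMechanism.Plain,
    SaslUsername = "user",                              // placeholder
    SaslPassword = "********"
};

using var admin = new AdminClientBuilder(config).Build();
var metadata = admin.GetMetadata(TimeSpan.FromSeconds(10));

foreach (var broker in metadata.Brokers)
{
    // DNS failures and firewall drops show up here, before Kafka is involved.
    try
    {
        using var tcp = new TcpClient();
        if (!tcp.ConnectAsync(broker.Host, broker.Port).Wait(TimeSpan.FromSeconds(5)))
            throw new TimeoutException("connect timed out");
        Console.WriteLine($"OK      {broker.Host}:{broker.Port}");
    }
    catch (Exception ex)
    {
        Console.WriteLine($"FAILED  {broker.Host}:{broker.Port} ({ex.Message})");
    }
}

Every endpoint printed here must be reachable from the DataMiner server; any FAILED line points at an address that can make GetMetadata() (and therefore the Producer element startup) fail.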

As a side note, the Kafka Producer works together with the SNMP forwarding module, acting as a trap receiver that formats DataMiner alarm traps into JSON and pushes them to a Kafka topic. This means that messages are forwarded to Kafka only after a successful connection.
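For illustration only, once the connection is up, the publish step amounts to something like the sketch below (the topic name and JSON payload are made up for this example; they are not the connector's actual schema):

using System;
using Confluent.Kafka;

var config = new ProducerConfig
{
    BootstrapServers = "kafka-broker.example.com:9096", // placeholder
    SecurityProtocol = SecurityProtocol.SaslSsl,        // match your listener
    SaslMechanism = SaslMechanism.Plain,
    SaslUsername = "user",                              // placeholder
    SaslPassword = "********"
};

using var producer = new ProducerBuilder<Null, string>(config).Build();

// A hypothetical alarm payload, as if formatted from a received SNMP inform.
string alarmJson = "{\"element\":\"MyElement\",\"severity\":\"Critical\",\"time\":\"2026-03-02T10:00:00Z\"}";

// "dataminer-alarms" is a hypothetical topic name.
var result = await producer.ProduceAsync(
    "dataminer-alarms",
    new Message<Null, string> { Value = alarmJson });

Console.WriteLine($"Delivered to {result.TopicPartitionOffset}");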
Regards
