DataMiner DoJo
Big Data scenario: What is possible with the Generic Kafka Producer?

Solved · 447 views · 21st June 2024 · adl2099 · Tags: kafka, Producer, SNMP Manager
2
Alberto De Luca [DevOps Enabler] · 4.58K · 20th June 2024 · 0 Comments

When using the "Generic Kafka Producer", what is possible with the driver as-is?
Is any customization script required, or can the driver publish to topics on an associated broker directly?

I can see the driver help also mentions the SNMP Manager. Is this going to use the same OID as when relaying information directly via SNMP forwarding? What's the difference?

>> Copied from the help of the driver:

SNMP MANAGER CONFIGURATION
DataMiner receives the alarm information in the incoming SNMP inform messages/traps. Messages may be forwarded by any DataMiner Agent or by third-party software. If a DataMiner Agent forwards them, the SNMP Manager settings (Apps > System Center > SNMP Forwarding) should be configured as follows:

  • SNMP version: SNMPv2 or SNMPv3.

  • Notification OID: Needs to match the Custom Bindings Object ID displayed when you click the More Configurations page button on the Alarms page. Otherwise, received inform messages/traps will not be processed.

  • The custom bindings table should be filled in.

Alberto De Luca [DevOps Enabler] Selected answer as best 21st June 2024

1 Answer

2
Tiago Pina [SLC] [DevOps Advocate] · 394 · Posted 21st June 2024 · 1 Comment

Hi Alberto,

With the Generic Kafka Producer, DataMiner acts as a producer and can publish information about alarms and parameter values (standalone parameters and/or table parameters) to a specific topic.

After configuring and setting up the Kafka cluster, there is nothing else you need to do besides deploying the connector.
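Once the connector is deployed and the cluster is reachable, a quick sanity check is to tail the configured topic with Kafka's standard console consumer. This is just a sketch: the broker address and topic name below are placeholders, not the connector's actual defaults.

```shell
# Placeholder broker address and topic name; substitute your own configuration.
kafka-console-consumer.sh \
  --bootstrap-server localhost:9092 \
  --topic dataminer-alarms \
  --from-beginning
```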

Regarding the SNMP Manager configuration, the standard SNMP forwarding functionality is used to filter the alarms and define the information to forward. On each DataMiner Agent, at least one SNMP Manager needs to be created. The SNMP Manager then sends the information as inform messages over the loopback interface to the Kafka Producer element.

In the SNMP Manager, it will be possible to:

  • Define which alarm information should be forwarded. To achieve this, custom binding OIDs are used.

  • Define which alarms should be forwarded. To achieve this, alarm filters can be defined.

Each alarm will be processed, and the information will be sent in JSON format using key/value pair-based messaging. [The original post illustrated this with screenshots: an example alarm, the inform message it is sent as, and the JSON message it is changed to.]

The alarms table on the configuration page is where it is possible to define each OID and the key that will replace it [screenshot in the original post].
Please let me know if I can further assist you.

Regards,

Alberto De Luca [DevOps Enabler] commented 21st June 2024

Thank you so much for this thorough description, Tiago, much appreciated.

I was afraid that the connector for the Producer would require some additional code to be written in order to work, but now that I have this step-by-step guide, I’ll check if we can set up a POC.

Marking this as solved – thanks again!

© 2025 Skyline Communications. All rights reserved.
