DataMiner Dojo


Generic Kafka Producer: Do we need Keystore/CA fields when using SASL (user/pass)?

144 views · 21 hours ago · tagged: kafka

0 votes

A B M Siddique (70) · 23rd February 2026 · 0 Comments

Hi all, a quick question on the Generic Kafka Producer connector. We are configuring it with SASL (username/password), and the broker connection fails with

Confluent.Kafka.KafkaException: Local: Broker transport failure

while a Kafka consumer using the same broker and credentials works. In the Producer element UI we also see fields such as Keystore Location, Keystore Password, and CA Certificate Location. Do these need to be filled in when using SASL, or only when the broker requires TLS (e.g. SASL_SSL)? If TLS is required, what is the expected format/location for the certificate/keystore on the DMA, and which of those fields are mandatory?

Edson Alfaro [SLC] [DevOps Advocate] Answered question 21 hours ago

1 Answer

0 votes

Edson Alfaro [SLC] [DevOps Advocate] (1.62K) · Posted 21 hours ago · 0 Comments

Hi!

No, the Keystore and CA certificate fields are not required when using plain SASL (SASL_PLAINTEXT). They are only required when the Kafka broker uses TLS encryption, such as SASL_SSL or SSL.
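To make the two cases concrete, here is how they map to librdkafka-style settings (the same configuration keys the Confluent.Kafka client uses under the hood). The broker address, credentials, and certificate path below are placeholders, not values from the connector:

```python
# Plain SASL (SASL_PLAINTEXT): no keystore or CA certificate needed.
sasl_plain_config = {
    "bootstrap.servers": "broker1:9092",       # placeholder broker address
    "security.protocol": "SASL_PLAINTEXT",
    "sasl.mechanism": "PLAIN",
    "sasl.username": "my-user",                # placeholder credentials
    "sasl.password": "my-password",
}

# SASL over TLS (SASL_SSL): same credentials, but the broker's CA
# certificate must additionally be configured (a PEM file that the
# DMA process can read; mutual TLS would also need a client cert/key).
sasl_ssl_config = {
    **sasl_plain_config,
    "security.protocol": "SASL_SSL",
    "ssl.ca.location": "/path/to/ca.pem",      # placeholder path on the DMA
}
```

Note that switching only `security.protocol` without providing the CA certificate (or vice versa) is exactly the kind of mismatch that produces the transport-failure error.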

The error Local: Broker transport failure usually indicates one of these issues:

  • Wrong security.protocol
  • TLS required but not configured
  • Firewall blocking the broker port (in my experience, Confluent was blocked by a firewall at the application layer)
  • Broker expects SSL but client uses PLAINTEXT
  • advertised.listeners mismatch
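To rule out the firewall case specifically, a quick sanity check from the DMA host is a plain TCP connect to the broker port. This is only a reachability sketch using the Python standard library (host and port are placeholders); it does not validate TLS or SASL, just that the port is open:

```python
import socket

def broker_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (placeholder address):
# broker_reachable("broker1.example.com", 9092)
```

If this returns False from the DMA but True from the machine where the consumer runs, a firewall or routing difference is the likely culprit rather than the connector configuration.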

Since your consumer works, compare these settings carefully:

security.protocol
sasl.mechanism
sasl.username
sasl.password
bootstrap.servers
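If you can dump both configurations, a tiny diff helper makes any mismatch obvious. This is a sketch with hypothetical values; the key names are the standard librdkafka ones listed above:

```python
def diff_configs(producer_cfg: dict, consumer_cfg: dict, keys: list) -> dict:
    """Return the keys whose values differ between the two configs."""
    return {
        k: (producer_cfg.get(k), consumer_cfg.get(k))
        for k in keys
        if producer_cfg.get(k) != consumer_cfg.get(k)
    }

checked = ["security.protocol", "sasl.mechanism", "sasl.username",
           "sasl.password", "bootstrap.servers"]

# Hypothetical example: producer and consumer disagree on the protocol.
producer_cfg = {"security.protocol": "SASL_PLAINTEXT", "sasl.mechanism": "PLAIN"}
consumer_cfg = {"security.protocol": "SASL_SSL", "sasl.mechanism": "PLAIN"}

print(diff_configs(producer_cfg, consumer_cfg, checked))
# → {'security.protocol': ('SASL_PLAINTEXT', 'SASL_SSL')}
```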

Edson Alfaro [SLC] [DevOps Advocate] Answered question 21 hours ago


© 2026 Skyline Communications. All rights reserved.
