There's interest in the two Kafka connectors in the Catalog. Both have an initial version of 1.0.0.1 but no documentation yet.
How do the drivers work, and is there anything needed in addition to the connectors?
Hi Jeroen,
The information is not yet in the Catalog because it has not been approved internally yet. I'll chase this up to make sure it's available as soon as possible.
To give you an idea of what exactly they represent:
Producers are client applications that publish (write) events to Kafka, and consumers are those that subscribe to (read and process) these events. Producers and consumers are fully decoupled and agnostic of each other.
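The decoupling described above can be illustrated with a small toy sketch. This is not real Kafka client code, and the class and topic names are invented for illustration: the point is that producer and consumer only share the broker and a topic name, never a direct reference to each other.

```python
from collections import defaultdict


class Topic:
    """A topic as an append-only event log (a simplification of Kafka's model)."""
    def __init__(self):
        self.log = []


class Broker:
    def __init__(self):
        self.topics = defaultdict(Topic)

    def publish(self, topic, event):
        """Producer side: append (write) an event to a topic."""
        self.topics[topic].log.append(event)

    def poll(self, topic, offset):
        """Consumer side: read all events from a given offset onward."""
        return self.topics[topic].log[offset:]


# Producer and consumer never see each other; both talk only to the broker.
broker = Broker()
broker.publish("alarms", {"severity": "critical", "element": "router-1"})
events = broker.poll("alarms", offset=0)
```

Because consumers track their own offset, a consumer can join later and still replay everything published before it subscribed.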
The Generic KAFKA Producer is a generic solution for sending alarm and parameter values to a specific topic on the broker, with DataMiner acting as the producer.
The Generic KAFKA Consumer subscribes to one or more topics via one or more brokers that can be defined within the protocol. The retrieved data is offloaded to a JSON file per topic, where it can be ingested and used in other workflows by a protocol, an Automation script, etc.
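The "one JSON file per topic" offload can be sketched as follows. This is an assumption-laden illustration, not the connector's actual implementation: the file naming scheme and message shapes are invented here.

```python
import json
import pathlib
import tempfile


def offload(messages_by_topic, out_dir):
    """Write one JSON file per topic (hypothetical naming: <topic>.json)."""
    out_dir = pathlib.Path(out_dir)
    paths = {}
    for topic, messages in messages_by_topic.items():
        path = out_dir / f"{topic}.json"
        path.write_text(json.dumps(messages, indent=2))
        paths[topic] = path
    return paths


# Example messages per topic (shapes are illustrative only).
messages = {
    "alarms": [{"id": 1, "severity": "major"}],
    "parameters": [{"name": "temperature", "value": 42.0}],
}
out = offload(messages, tempfile.mkdtemp())
```

A downstream protocol or Automation script could then watch the output folder and parse each topic's file independently.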
Hi,
We use DataMiner version 10.2.0.0-11517. Is there now a Kafka connector available? If not, in order to retrieve (“consume”) data, does it make sense to install an HTTP bridge such as the “Strimzi Kafka Bridge”?
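For context on the bridge option: the Strimzi Kafka Bridge exposes Kafka consumption over REST. The sketch below only builds the requests a consuming client would typically send; the host, port, group, and topic names are placeholders, so please verify the endpoint details against the Strimzi Kafka Bridge documentation before relying on them.

```python
# Rough shape of the Strimzi bridge consume flow (placeholders throughout):
BASE = "http://bridge-host:8080"          # hypothetical bridge address
GROUP, NAME = "dm-group", "dm-consumer"   # hypothetical group/instance names

steps = [
    # 1. Create a named consumer instance inside a consumer group.
    ("POST", f"{BASE}/consumers/{GROUP}",
     {"name": NAME, "format": "json"}),
    # 2. Subscribe that instance to one or more topics.
    ("POST", f"{BASE}/consumers/{GROUP}/instances/{NAME}/subscription",
     {"topics": ["alarms"]}),
    # 3. Poll records (sent with Accept: application/vnd.kafka.json.v2+json).
    ("GET", f"{BASE}/consumers/{GROUP}/instances/{NAME}/records", None),
]
```

A generic HTTP-capable DataMiner connector could in principle drive these calls, but whether that beats waiting for the dedicated Kafka connector depends on your use case.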