We are looking for options to get the data that is currently sent to our Elasticsearch cluster into Google BigQuery. Other teams in our company are using OTel collectors to stream this data over the public internet to receivers in Google Cloud. What would be the easiest method to accomplish this with DataMiner? Apparently Elastic does support OpenTelemetry natively, but I'm not certain it acts as an exporter as well. Any help you can give would be appreciated.
Hi Richard, we have seen this use case before, where DataMiner needs to offload data to systems other than the default available central databases. (Other examples were Graylog, Kinesis Firehose, custom database schemas, ...)
How to tackle it: full details can be found here.
In short:
- Make sure DataMiner generates the offload files, a feature available from DataMiner 10.2 onwards.
- Then create a connector that processes these files and pushes them into Google BigQuery.
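The connector step above boils down to: parse each offload file (CSV) into JSON-serializable rows and stream them into BigQuery. A minimal sketch, assuming the `google-cloud-bigquery` client library and application-default credentials; the column names and the table ID are hypothetical placeholders, so check the headers of your own offload files:

```python
import csv
import io

def offload_rows(csv_text):
    """Parse a DataMiner offload file (CSV text) into dicts that are
    ready for BigQuery's streaming-insert API. The column names used
    below are illustrative only."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [dict(row) for row in reader]

def push_to_bigquery(rows, table_id):
    """Stream parsed rows into BigQuery.
    table_id like 'my-project.dataminer.alarms' is a hypothetical example.
    Requires: pip install google-cloud-bigquery + valid credentials."""
    from google.cloud import bigquery
    client = bigquery.Client()
    errors = client.insert_rows_json(table_id, rows)
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")

# Example with a made-up two-column offload file:
sample = "AlarmID,Severity\n1001,Critical\n"
rows = offload_rows(sample)
print(rows)  # [{'AlarmID': '1001', 'Severity': 'Critical'}]
```

In practice you would wrap this in the connector's file-watch logic so each new offload file is parsed and pushed once, then archived.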
I also refer to the answer on this question, where Ben explains some other techniques that could optionally be used.
The alarm properties are offloaded in a separate file, as this is a one-to-many relation towards alarms. (This originates from the relational structure of the MySQL tables.)
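Because of that one-to-many split, re-creating the enriched alarm documents that Elastic stores means joining the properties file back onto the alarms file before pushing anywhere. A sketch of that join, where the column names (`AlarmID`, `Name`, `Value`) are assumptions to be checked against your actual offload headers:

```python
import csv
import io
from collections import defaultdict

def enrich_alarms(alarm_csv, props_csv):
    """Re-attach the one-to-many alarm properties to each alarm row.
    Column names are illustrative; verify them in your offload files."""
    # Group the property rows by alarm ID first.
    props = defaultdict(dict)
    for row in csv.DictReader(io.StringIO(props_csv)):
        props[row["AlarmID"]][row["Name"]] = row["Value"]
    # Then attach each alarm's property map (empty if it has none).
    enriched = []
    for alarm in csv.DictReader(io.StringIO(alarm_csv)):
        alarm["Properties"] = props.get(alarm["AlarmID"], {})
        enriched.append(alarm)
    return enriched

alarms = "AlarmID,Severity\n1001,Critical\n1002,Minor\n"
properties = "AlarmID,Name,Value\n1001,Region,EMEA\n1001,Owner,NOC\n"
merged = enrich_alarms(alarms, properties)
print(merged[0]["Properties"])  # {'Region': 'EMEA', 'Owner': 'NOC'}
```

The nested `Properties` map can then go into a BigQuery `RECORD`/`JSON` column, giving you roughly the same document shape Elastic holds.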
Thank you, Jan. I’ve taken a look at the offload files provided by our DMS. The trend data seems pretty straightforward, and I think we can send that along to BigQuery easily.
The info and alarm offloads, however, don’t seem to have the same level of enrichment as the logs in Elastic. The custom properties inherited by each alarm seem to be missing. Is there a way to send the same structured alarm and log data that Elastic holds to a third party?