Hi, I have created a number of ad-hoc GQI queries. As the next step of my project, I would like to run these queries at a fixed interval (e.g. once per hour) and keep track of how the results evolve over time.
Is there any existing solution for this? Perhaps a connector that can run the queries and trend the results?
Thanks for the help Joachim and Toon!
I ended up converting my C# code to Python and using a scripted connector. This uses the data aggregation in the background, but it also allowed me to get the same data into a connector quickly, with all the well-known connector features such as alarming and trending, and without the need for an intermediate CSV file. The scripting was also quicker than creating a GQI ad-hoc data source.
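For readers wondering what such a port looks like, below is a purely illustrative Python sketch of the kind of row-aggregation logic that translates directly from C# (e.g. a LINQ GroupBy). The column names and sample rows are invented, and the scripted-connector plumbing itself is not shown.

```python
# Purely illustrative: aggregate query rows per group, the sort of logic
# that ports easily from C# to Python. Rows and column names are invented.
from collections import defaultdict
from statistics import mean

rows = [
    {"service": "A", "bitrate": 5.2},
    {"service": "A", "bitrate": 4.8},
    {"service": "B", "bitrate": 9.1},
]

grouped = defaultdict(list)
for row in rows:
    grouped[row["service"]].append(row["bitrate"])

# One aggregated value per group, ready to be pushed into connector
# parameters for alarming and trending.
averages = {service: mean(values) for service, values in grouped.items()}
print(averages)  # {'A': 5.0, 'B': 9.1}
```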
Hi Michiel,
I think this is a good use case for the Data Aggregator. With this DxM, you can convert your GQI queries into queries that the Data Aggregator can execute. Next, you can schedule how often they run (for example every hour) via so-called cron expressions. The results can be written to a CSV file, so every hour you get a new CSV with a clear date and time. Then, with an ad-hoc data source reading those CSVs, you can visualize how the data changes over time in a Dashboard (or you can read them and calculate statistics in your own virtual connector, of course).
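If you go the "read them and calculate statistics" route, a minimal sketch of what reading those hourly CSVs back could look like is shown below. The output folder, file-name pattern and the "value" column are assumptions for illustration only, not the actual Data Aggregator output format.

```python
# Minimal sketch (Python 3.9+): load one CSV per aggregator run and see how
# the results evolve per run. Folder, file pattern and columns are assumed.
import glob
import pandas as pd

frames = []
for path in sorted(glob.glob("aggregator_output/results_*.csv")):
    df = pd.read_csv(path)
    # Derive the run timestamp from the file name (assumed format
    # "results_YYYY-MM-DDTHH-MM.csv").
    stamp = path.split("results_")[-1].removesuffix(".csv")
    df["run_time"] = pd.to_datetime(stamp, format="%Y-%m-%dT%H-%M")
    frames.append(df)

history = pd.concat(frames, ignore_index=True)

# Example statistics: row count and the mean of an assumed numeric column per run.
evolution = history.groupby("run_time").agg(
    row_count=("run_time", "size"),
    avg_value=("value", "mean"),  # "value" is a placeholder column name
)
print(evolution)
```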
An extra benefit of the Data Aggregator is that it comes out of the box with the Data Aggregator Monitor, which already gives you alarming and trending on the number of generated rows, the duration of each query, and whether it succeeded or not.
Kind regards,
Joachim
A more low-level solution I tend to use is to schedule a script run that writes your data to a CSV (or whatever storage you prefer for your table). You can then use a generic CSV ad-hoc data source to extract the data for your LCA or DB.
Of course, converting this to a connector or using the Data Aggregator is probably the cleaner solution.
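To make the low-level approach concrete, here is a minimal Python sketch of the "script writes a timestamped CSV" step. The run_query() stub and the folder and file names are placeholders; you would replace the stub with whatever actually executes your query in your environment.

```python
# Minimal sketch: write one timestamped CSV per scheduled run.
import csv
from datetime import datetime, timezone
from pathlib import Path


def run_query():
    # Placeholder: replace with the call that actually executes your query
    # and returns its rows as dictionaries.
    return [{"name": "example", "value": 42}]


def export_snapshot(output_dir="query_snapshots"):
    rows = run_query()
    if not rows:
        return
    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H-%M-%SZ")
    out_dir = Path(output_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    out_file = out_dir / f"snapshot_{stamp}.csv"
    with out_file.open("w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)


if __name__ == "__main__":
    # Schedule this externally (e.g. hourly); each run produces one file.
    export_snapshot()
```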