Hi Dojo,
I have a query that lists alarms filtered by time and a column value. The result is around 1,000 alarms (checked from the Cube Alarm Console), but without the column-value filter there are around 160,000 alarms. I understand that when I try to export the query result as CSV, the system needs to check all 160,000 alarms and then filter them by the column value in a single session.
This ends in a timeout error. My question is: is there any way to extend the timeout limit on the system?
Thanks!
Hi Daniel,
I am not sure if there is already a setting you can use to extend the timeout, but for this type of use case, a possible option is the Data Aggregator DxM. This DxM allows you to run GQI queries periodically at fixed intervals (there is also an option to execute them manually). The results of these GQI queries (run as jobs) are offloaded to CSV files. The Data Aggregator's settings also allow you to run GQI queries over longer periods.
Hope it helps.
Hi Miguel,
In the end, we built a different solution using other DataMiner features.
Thanks!