We have a monitoring app that needs near-real-time updates whenever a parameter value changes.
1) We use pagination to iterate over services and elements to get the data, but the LCA seems to wait until it has all the data. Instead, it should fetch one page of data and return it to the LCA without waiting for the rest (like Facebook's lazy loading).
2) If we use a trigger, I suspect it will run the whole GQI script again, and then the pagination will be wasted.
3) How do we handle multiple users at the same time so that DataMiner doesn't slow down when retrieving the information?
Hi Apurva,
To answer your points:
- Depending on the visualization, the LCA framework determines how many rows it needs. For a table component, the number of rows is limited and lazy loaded as you describe, but there is always a minimum initial load.
For example, if the LCA needs at least 100 rows, it will fetch the next page of the query until it has at least that many rows.
So if your ad hoc data source returns 200 rows per page, the GetNextPage method would be called once for the initial load. If it returns 10 rows per page, GetNextPage will be called 10 times.
You can verify this behavior by adding some logging in the data source; let us know if the observed behavior differs.
- Yes, if you retrigger e.g. a table using a trigger component, the query is executed from scratch again, regardless of how many pages were previously loaded.
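To make the paging behavior from the first point concrete, here is a minimal Python sketch of the initial-load loop. All names here (`initial_load`, `FakeDataSource`, `get_next_page`) are hypothetical stand-ins for illustration only; the real GQI ad hoc data source API is C#-based and its signatures differ:

```python
def initial_load(data_source, min_rows=100):
    """Model of the LCA initial load: keep fetching pages until
    at least min_rows rows are available (hypothetical helper)."""
    rows = []
    while len(rows) < min_rows and data_source.has_next_page():
        rows.extend(data_source.get_next_page())
    return rows

class FakeDataSource:
    """Stand-in for an ad hoc data source with a fixed page size."""
    def __init__(self, total_rows, page_size):
        self.total_rows = total_rows
        self.page_size = page_size
        self.cursor = 0
        self.calls = 0  # counts how often get_next_page ran

    def has_next_page(self):
        return self.cursor < self.total_rows

    def get_next_page(self):
        self.calls += 1
        end = min(self.cursor + self.page_size, self.total_rows)
        page = list(range(self.cursor, end))
        self.cursor = end
        return page

big_pages = FakeDataSource(total_rows=1000, page_size=200)
initial_load(big_pages)    # 200 rows per page: one call covers 100 rows
small_pages = FakeDataSource(total_rows=1000, page_size=10)
initial_load(small_pages)  # 10 rows per page: ten calls needed
print(big_pages.calls, small_pages.calls)  # prints: 1 10
```

This is only a model of the fetching loop, but it shows why a larger page size reduces the number of GetNextPage calls for the same minimum initial load.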
- This is a very general question without a definitive answer that works for all queries and data sources, but here are some tips:
- Measure timings to discover where performance bottlenecks occur
- Cache data whenever possible
- Perform expensive computations/aggregations asynchronously (either using code triggered by the ad hoc data source in GQI or an external script/process)
For example, in combination with cached data: when a query is executed, you can immediately return the cached data while also triggering a background task that updates the cache if it is older than, say, 30 seconds.
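As an illustration of that cache-plus-background-refresh pattern, here is a hedged Python sketch. The class and method names are hypothetical, and the 30-second threshold matches the example above; in a real deployment the refresh work would live in the GQI script or an external process:

```python
import threading
import time

CACHE_MAX_AGE = 30  # seconds before a background refresh is triggered

class CachedSource:
    """Serve cached rows immediately; refresh stale data in the background."""
    def __init__(self, fetch):
        self._fetch = fetch            # expensive function producing fresh rows
        self._lock = threading.Lock()
        self._rows = fetch()           # warm the cache once at startup
        self._updated = time.monotonic()
        self._refreshing = False

    def query(self):
        with self._lock:
            stale = time.monotonic() - self._updated > CACHE_MAX_AGE
            if stale and not self._refreshing:
                self._refreshing = True
                threading.Thread(target=self._refresh, daemon=True).start()
            return self._rows          # always answer immediately from cache

    def _refresh(self):
        rows = self._fetch()           # expensive work off the request path
        with self._lock:
            self._rows = rows
            self._updated = time.monotonic()
            self._refreshing = False

source = CachedSource(lambda: ["row1", "row2"])
print(source.query())  # served from the cache without blocking
```

The key design choice is that `query` never waits on the expensive fetch: callers always get the last known data, and staleness only costs one background refresh rather than slowing down every concurrent user.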