In a low-code app, the page times out while loading because fetching the data takes too long.
We have some code like this:
public GQIPage GetNextPage(GetNextPageInputArgs args)
{
    const int batchSize = 20;
    var rows = new List<GQIRow>();

    var serviceList = GetElements();
    _logger.Information("SERVICE LIST -- " + serviceList.Count);

    foreach (var service in serviceList)
    {
        _logger.Information("ServiceName - " + service.Name);
        foreach (var childInfo in service.Children)
        {
            CreateElementRow(ref rows, childInfo, service.Name);
        }
    }

    _logger.Information("ROWS Length -- " + rows.Count);

    var batchedRows = new List<GQIRow>();
    for (int i = 0; i < rows.Count; i += batchSize)
    {
        var batch = rows.Skip(i).Take(batchSize).ToArray();
        batchedRows.AddRange(batch);
    }

    return new GQIPage(batchedRows.ToArray());
}
We tried to push the rows in batches, but when there is a lot of data the page still times out while loading. Any idea how we can avoid the timeout?
A good general tip when dealing with a lot of data is to split it up into multiple pages. This only works, though, if you know where the bottleneck is. If GetElements takes up most of the time, for instance, it should also fetch the services in a paged manner. Splitting the already-built rows into batches, as in the snippet above, won't help much and might even make things a tiny bit slower.
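As an illustration, below is a minimal paging sketch. GetElementsPage and _currentOffset are hypothetical; the point is to build the rows for only one page per GetNextPage call and to tell GQI whether more data is available through GQIPage.HasNextPage.

private int _currentOffset;

public GQIPage GetNextPage(GetNextPageInputArgs args)
{
    const int pageSize = 100;
    var rows = new List<GQIRow>();

    // Hypothetical helper that fetches only 'pageSize' services starting at '_currentOffset'.
    var services = GetElementsPage(_currentOffset, pageSize);

    foreach (var service in services)
    {
        foreach (var childInfo in service.Children)
        {
            CreateElementRow(ref rows, childInfo, service.Name);
        }
    }

    _currentOffset += services.Count;

    return new GQIPage(rows.ToArray())
    {
        // As long as a full page came back, assume there may be more data.
        HasNextPage = services.Count == pageSize,
    };
}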
Depending on which visualization you use to show the data, paging can have a big impact on the initial load time. The table visualization, for instance, will only fetch the first page and lazy load the next page when needed.
Another thing to consider is caching the data that takes a lot of time to fetch, for instance by storing it in a static property (this comes with a security risk if not properly managed in the ad hoc data source).
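A rough sketch of such a cache, assuming GetElements returns a List<Service> (the Service type and the five-minute refresh interval are assumptions), could look like this:

// The static fields are shared across all sessions of the data source,
// so only cache data that every user is allowed to see.
private static readonly object _cacheLock = new object();
private static List<Service> _cachedServices;
private static DateTime _cacheTimestamp;

private List<Service> GetElementsCached()
{
    lock (_cacheLock)
    {
        // Refresh the cache every 5 minutes (arbitrary interval for this sketch).
        if (_cachedServices == null || DateTime.UtcNow - _cacheTimestamp > TimeSpan.FromMinutes(5))
        {
            _cachedServices = GetElements();
            _cacheTimestamp = DateTime.UtcNow;
        }

        return _cachedServices;
    }
}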
The first step to take is to pinpoint what exactly is taking up most of the time by extending the logging. Then you know where to add optimizations.
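For example, wrapping the main steps in a Stopwatch (reusing the _logger from the snippet above) quickly shows which part dominates:

// Requires: using System.Diagnostics;
public GQIPage GetNextPage(GetNextPageInputArgs args)
{
    var stopwatch = Stopwatch.StartNew();

    var serviceList = GetElements();
    _logger.Information("GetElements took " + stopwatch.ElapsedMilliseconds + " ms");

    stopwatch.Restart();
    var rows = new List<GQIRow>();
    foreach (var service in serviceList)
    {
        foreach (var childInfo in service.Children)
        {
            CreateElementRow(ref rows, childInfo, service.Name);
        }
    }
    _logger.Information("Building " + rows.Count + " rows took " + stopwatch.ElapsedMilliseconds + " ms");

    return new GQIPage(rows.ToArray());
}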