What is the advised method to avoid a massive load / memory usage when logic needs to retrieve DomInstances and the result could vary from 1 to 1 million instances (depending on the filter)?
Option 1 (basic):
This will simply read all the DomInstances that match the filter, so I'm expecting a big impact (timeout?) in case we are reading a lot of data.
public static List<DomInstance> GetAllDomInstanceByFieldValue(this DomHelper domHelper, FieldDescriptor fieldDescriptor, string value)
{
    var filter = DomInstanceExposers.FieldValues.Field(fieldDescriptor.ID.Id.ToString()).Equal(value);
    return domHelper.DomInstances.Read(filter);
}
Option 2 (paging):
This will get a set of DomInstances per call, spreading the load somewhat.
public static List<DomInstance> GetAllDomInstanceByFieldValueByPages(this DomHelper domHelper, FieldDescriptor fieldDescriptor, string value)
{
    var filter = DomInstanceExposers.FieldValues.Field(fieldDescriptor.ID.Id.ToString()).Equal(value);
    var allDomInstances = new List<DomInstance>();

    var pagingHelper = domHelper.DomInstances.PreparePaging(filter);
    while (pagingHelper.MoveToNextPage())
    {
        var currentPage = pagingHelper.GetCurrentPage();
        allDomInstances.AddRange(currentPage);
    }

    return allDomInstances;
}
How many items are retrieved per page?
Option 3 (Skyline.DataMiner.Net.Tools):
I found this, but I don't know what it does or when it should be used:
public static List<DomInstance> GetDomInstances(this DomHelper domHelper, List<Guid> instanceIds)
{
    var instances = Tools.RetrieveBigOrFilter(
        instanceIds,
        id => DomInstanceExposers.Id.Equal(id),
        filter => domHelper.DomInstances.Read(filter));

    return instances;
}
Hi Mieke,
Option 1 is the most common way to do it if you do not expect too many instances.
Option 3 is the same as option 1, but instead of filtering on a field value it is based on a list of specific DOM instance IDs. The Tools method is a converter from a list of GUIDs to one big OR filter (database query).
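To picture what that conversion produces, here is a hand-rolled sketch. It is an illustration only, not the actual implementation of Tools.RetrieveBigOrFilter, and it assumes the ORFilterElement<T> class from the SLDataGateway filter classes plus the same usings as your snippets above.

public static List<DomInstance> GetDomInstancesByIdsSketch(this DomHelper domHelper, List<Guid> instanceIds)
{
    // Build one Id filter per GUID and combine them into a single big OR filter.
    var idFilters = instanceIds
        .Select(id => (FilterElement<DomInstance>)DomInstanceExposers.Id.Equal(id))
        .ToArray();
    var bigOrFilter = new ORFilterElement<DomInstance>(idFilters);

    // One read that matches any of the requested instances.
    return domHelper.DomInstances.Read(bigOrFilter);
}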
Option 2 is recommended if you want to handle instances in chunks.
By default, a page has a size of 500 objects. So if your result is 1300 instances, you'll get 3 pages.
You can then build a sort of subscription system to handle these chunks, for instance to fill up a table. However, with your current example the behaviour will not change, because you iterate over all pages in the same thread and return everything in one list anyway.
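To make that concrete, here is a sketch of how you could hand each page to a callback instead of collecting everything in one list. It only reuses the PreparePaging/MoveToNextPage/GetCurrentPage calls from your option 2; the processPage delegate is a hypothetical placeholder for whatever you do with each chunk (filling a table, sending an update, ...).

public static void ProcessDomInstancesByFieldValueInChunks(this DomHelper domHelper, FieldDescriptor fieldDescriptor, string value, Action<IEnumerable<DomInstance>> processPage)
{
    var filter = DomInstanceExposers.FieldValues.Field(fieldDescriptor.ID.Id.ToString()).Equal(value);

    var pagingHelper = domHelper.DomInstances.PreparePaging(filter);
    while (pagingHelper.MoveToNextPage())
    {
        // Each page holds at most the page size (500 by default),
        // so memory usage stays bounded regardless of the total result count.
        processPage(pagingHelper.GetCurrentPage());
    }
}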
Hope this will help you further.