A customer has been creating service objects with many child elements. Many of these children are actually the same physical element, included multiple times with different indexes and aliases. Recently they started getting this warning notification, and they are indeed beginning to experience some slowness when loading the services in Cube.
If there is no set limit, does anyone know what number triggers this warning? Do we have any baseline recommendations for limiting the number of elements or parameters in a service?
Thanks for any input!
Cube has a setting to limit the number of open cards (default: 16). The same value is used for the warning in the screenshot: if a service contains more than 16 elements, opening it can potentially load the same amount of data as opening 16 individual element cards.
Aside from this limit, there is indeed a performance impact in Cube, in the side panel and in the edit view of the service card (these lists are not virtualized). It becomes very noticeable with, for example, 1000 elements included in the same service. This is an issue for which you can create a task.
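To illustrate why a non-virtualized list slows down with 1000 elements, here is a minimal, purely conceptual sketch of list virtualization. It is not Cube's actual UI code, and the row height and viewport size are assumed values; it only shows that a non-virtualized list pays a cost proportional to the total element count, while a virtualized one only pays for what is on screen.

```python
# Illustrative toy model of list virtualization, not Cube's implementation.
ROW_HEIGHT = 24          # assumed pixel height per row (hypothetical)
VIEWPORT_HEIGHT = 600    # assumed visible area in pixels (hypothetical)

def non_virtualized_rows(items):
    # A row is materialized for every item, even though only ~25 fit on screen,
    # so the cost grows linearly with the service's element count.
    return [f"row for {item}" for item in items]

def virtualized_rows(items, scroll_offset_px):
    # Only the slice of items that intersects the viewport is materialized.
    first = scroll_offset_px // ROW_HEIGHT
    visible_count = VIEWPORT_HEIGHT // ROW_HEIGHT + 1
    return [f"row for {item}" for item in items[first:first + visible_count]]

elements = [f"element {i}" for i in range(1000)]
print(len(non_virtualized_rows(elements)))   # 1000 rows created
print(len(virtualized_rows(elements, 0)))    # ~26 rows created
```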
If all data comes from the same element, it is recommended to include it as a single service child with multiple filters (right-click > Duplicate column parameter with a new filter), unless you need it for functionality like dynamic inclusion/exclusion, alarm capping or conditional monitoring.
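Purely as an illustration of what that recommendation changes, the sketch below contrasts the two shapes. The dictionaries are hypothetical stand-ins, not the real service configuration format; the point is only that the number of service children drops from one per index to one in total.

```python
# Hypothetical structures for illustration only, not DataMiner's actual format.
indexes = [f"index {i}" for i in range(100)]

# One service child per table row: 100 children, all pointing at the same element.
one_child_per_index = [
    {"element": "MyElement", "parameter": "Table column", "filter": idx}
    for idx in indexes
]

# One child carrying all filters (the "duplicate column parameter with a new
# filter" approach): a single child with 100 filters.
one_child_many_filters = {
    "element": "MyElement",
    "parameter": "Table column",
    "filters": indexes,
}

print(len(one_child_per_index))                 # 100 children
print(len(one_child_many_filters["filters"]))   # 100 filters on a single child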
I believe the server-side impact should be rather small in this case. The opposite scenario is more problematic: if you include the same element in 1000 services, an alarm on that element will have a service impact of 1000, and can cause a large amount of (service) property data to be present on the alarm.
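A rough back-of-envelope sketch of that fan-out: the bytes-per-entry figure is an assumption for illustration, not a measured DataMiner value, but it shows how the property data attached to a single alarm grows linearly with the number of including services.

```python
# Illustration of the fan-out; bytes_per_service_entry is an assumed value.
def alarm_property_overhead(service_count, bytes_per_service_entry=200):
    # One alarm on the shared element is linked to every including service,
    # so the (service) property data on that alarm grows linearly with the count.
    return service_count * bytes_per_service_entry

# Same element included in 1000 services: every alarm on it carries roughly
# 1000 service references worth of property data.
print(alarm_property_overhead(1000))   # 200000 bytes per alarm (assumed sizes)
```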