Hi,
I am dealing with a scenario where I need to know whether there are any limits on the use of API calls in DataMiner.
Please consider the following scenario:
- Number of CMTSs: 300
- Number of DataMiner Agents: 6
- Maximum number of CMTSs per agent: 60
The following API calls are executed per agent (assume some of them run in parallel):
- GetElementsForProtocol (Arris E6000)
- GetTableForParameter (9000)
- GetTableForParameter (20200)
- GetTableForParameter (20201)
Assuming that I will need to:
- place 5000 API calls per day per agent
- place 1000 API calls per hour per agent
- place 10 to 20 API calls per minute per agent
Could this volume of calls result in any issues affecting the normal operation of the DMS?
Do we have any guidelines on this kind of API usage? A sort of "best practices for the Web Services API"?
Thank you for your help.
There are no hardcoded limits on this; it all depends on how many resources are still available on your DataMiner Agents, and on how much data these tables contain. GetElementsForProtocol does not require many resources, as the data for this is practically directly available. GetTableForParameter, however, requests the data deeper in DataMiner (SLElement); that data has to be serialized, passed through both SLNet and the Web API service, and then serialized again to JSON or SOAP XML. Requesting big tables, or requesting a lot of tables, will therefore put extra load on SLElement, SLNet, and IIS. If you're requesting the content of 10 to 20 tables every minute, this will definitely put a significant amount of load on the agent.
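For reference, a minimal sketch of what one such polling round could look like against the Web Services API, in Python. This assumes the v1 JSON endpoint at /API/v1/json.asmx, an initial ConnectApp call, and the ASMX convention of wrapping responses in a "d" field; all parameter and field names below are assumptions for illustration, not the documented contract, so check them against the Web Services documentation:

```python
# Sketch only: endpoint path, method payloads, and response field names
# ("d", "DataMinerID", "ElementID") are assumptions, not the verified API.
import requests

DMA = "http://dma-hostname/API/v1/json.asmx"

def call(method, payload):
    """POST one Web API method and return the decoded JSON body."""
    resp = requests.post(f"{DMA}/{method}", json=payload, timeout=30)
    resp.raise_for_status()
    return resp.json()["d"]  # assumed ASMX-style response wrapper

# Connect once and reuse the connection string for all subsequent calls,
# rather than reconnecting on every request.
connection = call("ConnectApp", {
    "host": "dma-hostname",
    "login": "user",
    "password": "pass",
})

# Cheap call: element metadata is practically directly available.
elements = call("GetElementsForProtocol", {
    "connection": connection,
    "protocolName": "Arris E6000",
    "protocolVersion": "Production",
})

# Expensive calls: table data travels from SLElement over SLNet to IIS,
# where it is serialized to JSON, so keep table sizes and call rates low.
for element in elements:
    for parameter_id in (9000, 20200, 20201):
        table = call("GetTableForParameter", {
            "connection": connection,
            "dmaID": element["DataMinerID"],        # assumed field name
            "elementID": element["ElementID"],      # assumed field name
            "parameterID": parameter_id,
        })
```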
I believe that's a good idea: start with a low number of calls, and increase it while monitoring the load impact. Don't go too far, so that DataMiner still has enough resources available to do its other tasks as well.
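To make that ramp-up concrete, a simple client-side throttle along these lines lets you start low and raise the per-minute budget step by step while watching agent load. This is a plain Python sketch; the fetch_tables callable is a placeholder for one round of GetTableForParameter requests like the ones above:

```python
import time

def run_throttled(fetch_tables, calls_per_minute=10):
    """Issue at most calls_per_minute invocations, spread evenly over time."""
    interval = 60.0 / calls_per_minute
    while True:
        started = time.monotonic()
        fetch_tables()  # one API round trip (placeholder callable)
        # Sleep off whatever remains of this call's time slot.
        elapsed = time.monotonic() - started
        time.sleep(max(0.0, interval - elapsed))

# Start at e.g. 10 calls per minute, and only raise the budget after
# confirming that CPU and memory on the agent stay within healthy limits.
```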
Thank you for your feedback, Wim.
I assume the best approach would then be to run some tests with a small number of calls and scale up from there.
Do you agree with this, or do you have any other suggestions?