I want to display a rate using the Lost Packages counter that my device provides.
(e.g. Lost Packages/s)
I was planning to use the Class Library RateCalculator class from the common namespace.
The CalculateRate method requires a delta and minDelta argument to be provided.
What would be the best way to get a delta and what would be a good value for minDelta?
Note: the protocol uses HTTP sessions to communicate with the device, so I can't use the GetSnmpGroupExecutionDelta method.
Additionally, do I need to make my polling timer fixed?
Edit: Please look at the answers from David, Simon and Miguel!
They all contain valid things to look out for during development.
The best way to get a delta is to create a new column holding a timestamp. This timestamp would initially be 0, and all rates and counters should be zeroed as well. Then:

delta = (Current_Sample_TimeStamp - Previous_Sample_TimeStamp)

Ideally, the timestamp should come from the product's HTTP reply. If not, try using the group timestamps (not sure if those are retrievable), or if all else fails use DateTime.Now (with the risk of some imprecision).

Regarding minDelta, which is a value that defines whether or not the calculation should be executed by comparing it to delta: best practice is to use the frequency at which the row is updated. Anything under that value should be disregarded, as the sample can be considered to have been collected too fast. A big minDelta prevents the calculations from happening over and over, at the cost of a less precise rate/s. A small minDelta allows many calculations to happen, improving the precision of the rate/s at the cost of the performance of actually doing those calculations over and over.

In an implementation I've done, I set minDelta to 1 ms, since the protocol was already using timer-based polling, so it was safe to assume the protocol flow would never try to calculate rates more often than it should.
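To make that concrete, here is a minimal sketch of the flow. The static fields stand in for the saved parameters or table columns a real protocol would keep per row, and the inline division stands in for handing delta and minDelta over to RateCalculator.CalculateRate (whose exact signature I'm not quoting here):

```csharp
using System;

public static class LostPackagesRateSketch
{
    // Stand-ins for a saved parameter/column pair; a real protocol keeps
    // these per row, initialized to 0 as described above.
    private static double previousCounter;
    private static DateTime previousTimestamp; // DateTime.MinValue until the first poll

    public static double? Calculate(double currentCounter, DateTime currentTimestamp, TimeSpan minDelta)
    {
        // First sample: store it, no rate can be calculated yet.
        if (previousTimestamp == DateTime.MinValue)
        {
            previousCounter = currentCounter;
            previousTimestamp = currentTimestamp;
            return null;
        }

        TimeSpan delta = currentTimestamp - previousTimestamp;

        // Sample collected too fast: disregard it, keep the previous sample.
        if (delta < minDelta)
        {
            return null;
        }

        double rate = (currentCounter - previousCounter) / delta.TotalSeconds;

        previousCounter = currentCounter;
        previousTimestamp = currentTimestamp;
        return rate;
    }
}
```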
David's answer regarding the delta calculation is correct, but let me add my two cents regarding making the polling timer fixed:
You have two options there; the easy one is indeed to make that timer fixed.
The other option is to leave it configurable, but in that case you'll need to retrieve the value of the [Timer base] DataMiner internal parameter of your element (parameter id="65017") and take it into account when defining your minDelta.
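For example, a sketch of reading that internal parameter from a QAction. I'm assuming here that the value comes back in milliseconds; verify the unit on your DataMiner version:

```csharp
using System;
using Skyline.DataMiner.Scripting;

public static class MinDeltaHelper
{
    // Derives minDelta from the element's [Timer base] internal parameter
    // (PID 65017) so a user-configurable timer is still taken into account.
    public static TimeSpan GetMinDelta(SLProtocol protocol)
    {
        // GetParameter returns the raw value as an object; the unit is
        // assumed to be milliseconds here -- double-check on your system.
        double timerBaseMs = Convert.ToDouble(protocol.GetParameter(65017));

        // Anything polled faster than the timer base counts as too fast.
        return TimeSpan.FromMilliseconds(timerBaseMs);
    }
}
```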
Note that another (even better) approach is to let the end user define the minDelta, as this might differ from one device to another. It might even differ from one interface to another on the same device.
In that case, you can buffer the delta and counter values and calculate the rate by using the current value against the 'latest value older than (currentTime - minDelta)'.
This is a better approach because, if we poll counter values and calculate rates faster than the device itself updates its counters, the results can be misleading.
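A sketch of that buffering idea, under the assumption that samples are simply kept in memory as (timestamp, counter) pairs:

```csharp
using System;
using System.Collections.Generic;

public sealed class BufferedRateCalculator
{
    private readonly List<(DateTime Time, double Counter)> samples = new List<(DateTime, double)>();

    // Buffers the new sample and calculates the rate against the latest
    // buffered sample older than (currentTime - minDelta).
    // Returns null while no buffered sample is old enough.
    public double? AddAndCalculate(DateTime currentTime, double currentCounter, TimeSpan minDelta)
    {
        int i = samples.FindLastIndex(s => s.Time <= currentTime - minDelta);

        samples.Add((currentTime, currentCounter));

        if (i < 0)
        {
            return null;
        }

        double rate = (currentCounter - samples[i].Counter) / (currentTime - samples[i].Time).TotalSeconds;

        // Samples older than the reference sample can never be used again.
        samples.RemoveRange(0, i);

        return rate;
    }
}
```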
Note that a task has been created to update the Class Library and protocol SDF according to this last approach.
Hi Thomas,
I believe a good starting point is to get the following details about the device:
- Counter size: are we dealing with a 32-bit or a 64-bit counter? The RateCalculator library uses a different approach for 32-bit and 64-bit counters (see the wraparound sketch after this list).
- How often does the device update the counters? This defines the polling frequency of the counters and helps you set a proper minDelta value.
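On the counter size point: a 32-bit counter wraps around at 2^32, so an apparent decrease usually means one wrap occurred. The RateCalculator handles this internally; this sketch only illustrates why the size matters:

```csharp
using System;

public static class CounterDelta
{
    // Wrap-aware delta for a 32-bit counter: if the current reading is
    // smaller than the previous one, assume a single wrap at 2^32.
    public static ulong Delta32(uint previous, uint current)
    {
        return current >= previous
            ? (ulong)(current - previous)
            : (ulong)current + ((ulong)uint.MaxValue + 1) - previous;
    }
}
```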
For the delta calculation, you could use one of the alternatives David already described. However, if the device provides a timestamp of when the counters were updated, it is better to use that timestamp for your delta calculation.
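For instance, assuming a hypothetical "lastUpdated" field in the HTTP reply (the field name and format here are made up; use whatever your device actually returns):

```csharp
using System;
using System.Globalization;

// Hypothetical: the HTTP reply carries a "lastUpdated" field such as
// "2024-05-01T10:15:30.000Z".
string raw = "2024-05-01T10:15:30.000Z";

DateTime deviceTimestamp = DateTime.Parse(
    raw, CultureInfo.InvariantCulture, DateTimeStyles.AdjustToUniversal);

// Feed deviceTimestamp (instead of DateTime.Now) into the delta calculation,
// so the delta reflects when the device actually updated its counters.
```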
The main challenge when calculating rates is making sure that the driver polls the device when the counters have been updated.