I want to display a rate based on the Lost Packages counter that my device provides (e.g. Lost Packages/s).
I was planning to use the Class Library RateCalculator class from the common namespace.
The CalculateRate method requires a delta and a minDelta argument to be provided.
What would be the best way to get a delta and what would be a good value for minDelta?
Note: the protocol uses HTTP sessions to communicate with the device, so I can't use the GetSnmpGroupExecutionDelta method.
Additionally, do I need to make my polling timer fixed?
Edit: Please look at the answers from David, Simon and Miguel!
They all contain valid points to look out for during development.
The best way to get a delta is to create a new column holding a time stamp. This time stamp would initially be 0, and all rates and counters should be zeroed as well. The delta is then:

delta = (Current_Sample_TimeStamp - Previous_Sample_TimeStamp)

Ideally the time stamp should come from the product's HTTP reply. If that's not possible, try using the group timestamps (not sure if retrievable), and if all else fails use the actual DateTime.Now (with the risk of some level of imprecision).

Regarding minDelta, which is a value that defines whether or not the calculation should be executed by comparing it to delta: best practice is to use the frequency at which the row is updated. Any delta under that value should be disregarded, as the sample should be considered to have been collected too fast. A big minDelta prevents calculations from happening over and over, while increasing the imprecision of the rate/s. A small minDelta allows many calculations to happen, improving the precision of the rate/s at the cost of the performance of actually doing those calculations over and over.

In an implementation I've done, I set it to 1 ms, since the protocol was already using timer-based polling, so it is safe to assume that the protocol flow would never try calculating rates more often than it should.
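To make the flow above concrete, here is a minimal, self-contained C# sketch of the same logic: keep the previous sample's time stamp and counter, prefer a time stamp parsed from the HTTP reply, fall back to DateTime.UtcNow, and skip the calculation when the delta is below minDelta. The class LostPackagesRateTracker, its Update method, and the manual counterDelta/delta computation are assumptions of mine for illustration only; in a real driver the resulting delta and minDelta would be handed to the Class Library's RateCalculator.CalculateRate, whose exact signature isn't reproduced here.

```csharp
using System;
using System.Globalization;

// Hypothetical helper illustrating the answer above: it stores the previous
// sample's time stamp and counter value, derives the delta, and only computes
// a rate when the delta is at least minDelta. In practice the delta/minDelta
// values computed here would be passed to the Class Library's
// RateCalculator.CalculateRate instead of the manual division below.
public class LostPackagesRateTracker
{
    private readonly TimeSpan minDelta;
    private DateTime previousTimeStamp;   // "0" initially (default = DateTime.MinValue)
    private double previousCounter;

    public LostPackagesRateTracker(TimeSpan minDelta)
    {
        this.minDelta = minDelta;
    }

    // Returns the rate in lost packages per second, or null when no rate could
    // be calculated (first sample, counter reset, or delta below minDelta).
    public double? Update(double currentCounter, string httpReplyTimeStamp = null)
    {
        // Prefer the time stamp from the HTTP reply; fall back to DateTime.UtcNow
        // (accepting some imprecision), as described in the answer.
        DateTime currentTimeStamp;
        if (!DateTime.TryParse(httpReplyTimeStamp, CultureInfo.InvariantCulture,
                DateTimeStyles.AssumeUniversal | DateTimeStyles.AdjustToUniversal,
                out currentTimeStamp))
        {
            currentTimeStamp = DateTime.UtcNow;
        }

        double? rate = null;
        if (previousTimeStamp != DateTime.MinValue)
        {
            TimeSpan delta = currentTimeStamp - previousTimeStamp;
            double counterDelta = currentCounter - previousCounter;

            // Skip the calculation when the sample was collected too fast
            // (delta < minDelta) or when the counter wrapped or was reset.
            if (delta >= minDelta && counterDelta >= 0)
            {
                rate = counterDelta / delta.TotalSeconds;
            }
        }

        previousTimeStamp = currentTimeStamp;
        previousCounter = currentCounter;
        return rate;
    }
}
```

A polling cycle would then call something like var rate = tracker.Update(lostPackages, replyTimeStamp); and only write the result to the rate column when it isn't null.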