Hello Everyone,
I am working on integrating my first DataMiner driver, which uses HTTPS communication. I am still new to DataMiner connectors and need some guidance.
I have an API that returns a JSON array, where each entry contains 40 parameters ('Audio 1', 'Audio 2', etc.). This is the only API available, but I cannot store all 40 parameters in a single DataMiner table.
What is the best approach to split and distribute these parameters across multiple tables? Can QActions be used to process and insert the data correctly? If so, how should I structure the QActions to handle this?
Alternatively, is there another way to efficiently separate and store this many parameters?
Any advice or examples would be greatly appreciated.
Hi Maya,
You could definitely use a QAction for this kind of logic. Upon receiving the HTTP response from your HTTP session, you can trigger a QAction to process the JSON.
With a library like Newtonsoft.Json you can then deserialize the data into classes and decide where each piece of data should go, whether that is all in the same table or each part in its own table or set of standalone parameters.
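As a rough sketch of what such a QAction could look like: the example below assumes the raw HTTP response is stored in a parameter with PID 900 and that the data is split over two tables with PIDs 1000 and 2000 (all placeholder IDs), and it assumes a hypothetical JSON shape with fields like "name", "audio1", "audio2". The DeviceEntry class, the field names, and the column layout are all assumptions you would adapt to your actual API and protocol definition.

```csharp
using System;
using System.Collections.Generic;

using Newtonsoft.Json;

using Skyline.DataMiner.Scripting;

/// <summary>
/// QAction triggered on the parameter that holds the HTTP response.
/// </summary>
public static class QAction
{
    // Hypothetical class matching one entry of the JSON array.
    // Adjust the properties to the actual field names returned by your API.
    private class DeviceEntry
    {
        [JsonProperty("name")]
        public string Name { get; set; }

        [JsonProperty("audio1")]
        public string Audio1 { get; set; }

        [JsonProperty("audio2")]
        public string Audio2 { get; set; }

        // ... remaining fields ...
    }

    public static void Run(SLProtocol protocol)
    {
        try
        {
            // 900 = placeholder PID of the parameter holding the raw HTTP response.
            string json = Convert.ToString(protocol.GetParameter(900));
            if (String.IsNullOrWhiteSpace(json))
            {
                return;
            }

            List<DeviceEntry> entries = JsonConvert.DeserializeObject<List<DeviceEntry>>(json);

            // Distribute the fields over two tables (placeholder table PIDs 1000 and 2000).
            var audioRows = new List<object[]>();
            var otherRows = new List<object[]>();

            foreach (DeviceEntry entry in entries)
            {
                // First column is the primary key, followed by the column values.
                audioRows.Add(new object[] { entry.Name, entry.Audio1, entry.Audio2 });
                otherRows.Add(new object[] { entry.Name /*, other columns */ });
            }

            protocol.FillArray(1000, audioRows, NotifyProtocol.SaveOption.Full);
            protocol.FillArray(2000, otherRows, NotifyProtocol.SaveOption.Full);
        }
        catch (Exception ex)
        {
            protocol.Log("QA" + protocol.QActionID + "|Run|Exception: " + ex, LogType.Error, LogLevel.NoLogging);
        }
    }
}
```

You would typically trigger this QAction on the parameter that receives the HTTP response, so it runs every time a new response comes in.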
If you need help with the specifics of integrating an HTTP API in a DataMiner connector, we do have this nice course that could assist you: DataMiner Connector Integration: HTTP Basics - DataMiner Dojo, especially the section Connector Integration: HTTP basics.
There is an alternative that would do this automatically, not via a DataMiner connector but via the Data API and Scripted Connectors, a new feature that is currently being developed. Do note, however, that this is still in Soft-Launch, and as such, it is not recommended for production environments.
You can see an example of how it can be done here: Kata #18: Creating a scripted connector - DataMiner Dojo