Hi Dojo,
A while back I attended Kata #18: Creating a scripted connector to learn about scripted connectors. It was a great way for me to get hands-on and see this new feature in action!
One of the items mentioned is that, "today", this API can only be leveraged from within the DataMiner system itself. It also seems Skyline has already listed an extension of this feature to become available further down the line (date to be defined):
Enabling Data API to accept requests from external systems, providing the capability to host your Scripted Connectors in your own environment and, for example, leverage them to push data from the ground to the cloud, from docker containers, from one operational domain to another, etc.
I have a use case where I already want to push content from an external system into my cloud-connected DataMiner system using this Data API.
As a workaround, I'm combining this Data API feature with custom User-Defined APIs (thank you, Kata #2: User-Defined APIs).
Please have a look at the DataAPI-Proxy script (GitHub) that I created. This proxy script can be used out of the box to transfer messages from external systems to the local Data API "scripted connectors".
I have added an extensive README to the repo to guide you through the steps to set up this proxy on any DataMiner system.
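To give you an idea of how an external system can talk to the proxy, here is a rough sketch of a client call. The route ("dataapi-proxy"), the query parameter names ("identifier", "type") and the token handling are illustrative only; please check the README in the repo for the exact values your setup expects.

```csharp
// Rough sketch of an external client pushing a payload to the proxy's
// User-Defined API endpoint. Route, query parameter names and token handling
// are illustrative only; see the repo's README for the actual values.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class ProxyClient
{
    static async Task Main()
    {
        using var client = new HttpClient { BaseAddress = new Uri("https://my-dma.example.com") };

        // User-Defined APIs are secured with an API token; replace with your own.
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", "YOUR-API-TOKEN");

        // Identifier and type travel as URL-encoded query parameters;
        // the actual data goes in the request body.
        var url = "/api/custom/dataapi-proxy?identifier=my-connector-instance&type=my-data-type";
        var body = new StringContent("{ \"value\": 42 }", Encoding.UTF8, "application/json");

        var response = await client.PostAsync(url, body);
        Console.WriteLine($"{(int)response.StatusCode}: {await response.Content.ReadAsStringAsync()}");
    }
}
```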
Note:
Every data push will generate a script run. Also, taking into account that the Data API feature is still in soft launch, it's advised to test this solely on staging environments for now, with a controlled number of requests.
My questions to the community:
- Is there visibility on an ETA for when external systems will be allowed to push data into the Data API?
- Is there any feedback I should consider for my current workaround? (Taking into account that I'm keeping the number of data pushes at a stable update rate of 1 request/min.)
Thank you!
Hi Thijs,
I'm afraid there's no visibility on an ETA for when external systems will be able to push data into the Data API. Although your approach is probably not the best way to handle loads from production systems, it does serve as a good demonstrator (and validator) of this feature. So, thank you for contributing.
Besides taking the Data API limitations into account, you should also bear in mind that clients leveraging the proxy script will need to be changed, e.g. the identifier and type are expected as HTTP header fields by the Data API, not as query parameters.
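To illustrate the difference, a client talking to the Data API directly would look something along these lines (the endpoint URL and header names below are placeholders, not the actual Data API contract):

```csharp
// Sketch of the client-side change once the Data API accepts external requests
// directly: identifier and type move from the query string into HTTP headers.
// The endpoint URL and header names are placeholders, not the actual contract.
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class DirectDataApiClient
{
    static async Task Main()
    {
        using var client = new HttpClient();

        var request = new HttpRequestMessage(HttpMethod.Post, "https://data-api.example.com/push")
        {
            Content = new StringContent("{ \"value\": 42 }", Encoding.UTF8, "application/json"),
        };

        // With the proxy these are query parameters; with direct Data API access
        // they are expected as HTTP header fields instead.
        request.Headers.Add("identifier", "my-connector-instance");
        request.Headers.Add("type", "my-data-type");

        var response = await client.SendAsync(request);
        Console.WriteLine((int)response.StatusCode);
    }
}
```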
I'd also like to share that the Data API now offers API calls to configure units and precision (for decimals). See RN DataAPI 1.1.3 - New configuration endpoint for units and decimals [ID_39016]. In case you want to further extend the features of the proxy script, that would be my first pick :).
Thanks for the feedback, JK. So far I haven't seen how to fetch custom incoming headers in the User-Defined API handler; that's why I opted to pass them as URL-encoded parameters to the UD-API proxy. Thanks for pointing out the newly available features of the Data API. I'll have a look at whether we can leverage the customizations for our use case, and also make them available via the proxy script to support the "config" part of it. Thanks!