I'm currently building a reporting application which connects to the DataMiner northbound v1 API to retrieve a considerable number of real-time parameter values. In order to speed up the process of retrieving this information, I'm thinking about launching these requests in parallel (multithreaded) rather than in a serial fashion. Would this indeed be faster and more efficient compared to the serial approach, and if so, is there a limit to the number of parallel requests I can execute at the same time?
Yes, this is possible (even with the same connection string). In fact, multithreading is recommended, so that you can optimally use all the available capacity of multi-core servers. IIS, the web API's interface, DataMiner, and any databases are all able to process requests simultaneously in parallel. As far as I know there's no hard-coded limit, so it depends on the capacity of the server(s) and on how many resources your requests consume. Also keep in mind to leave some capacity for other tasks that DataMiner might be handling.
A good approach here would be to create a pool of around 10 threads and cycle through this thread pool when executing the stack of pending method requests. Launching all 500 requests in parallel would very likely put a high load on the DMA and could therefore affect other processing that DataMiner Agent is handling at that time.
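As a minimal sketch of that bounded-pool approach in Python: the `fetch_parameter` function and the parameter IDs below are hypothetical stand-ins for your actual v1 API calls (not real DataMiner method names), but the pooling pattern itself carries over to any client language.

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_parameter(param_id):
    # Placeholder for a real northbound v1 API call; in your
    # application this is where the HTTP/SOAP request would go.
    return (param_id, f"value-{param_id}")

# The full stack of requests to execute (hypothetical IDs).
param_ids = range(500)

# A pool of 10 worker threads: at most 10 requests are in flight
# at any moment, so the DMA never sees all 500 at once.
with ThreadPoolExecutor(max_workers=10) as pool:
    results = dict(pool.map(fetch_parameter, param_ids))

print(len(results))  # 500
```

The executor queues the remaining work internally, so you get the throughput benefit of parallelism while keeping the load on the agent bounded by `max_workers`.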