I would like to create a log file that is available via Cube (for example, via the Documents of a specific element).
This log file should contain info from Automation scripts that can be run on any agent in my cluster.
These scripts can run at the same time.
How do I make this happen?
Hi Mieke,
I believe that storing log files in the Documents folder is not recommended. Since these files are constantly being updated, this would cause constant sync requests across all the DMAs in the cluster. There is an option to save a file only on the DMA that is hosting the element (see DataMiner Docs), but this implies that you will need to share that folder so the other DMAs can access the file.
Another option is to store these log files in a shared folder. A nice example is the logging feature implemented in SRM (see SRM Logging). This feature allows you to define a path to store the log files, and the Booking Manager provides links to open them.
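For illustration, here is a minimal sketch of what shared-folder logging could look like from a C# Automation script. The UNC path (`\\MyFileServer\AutomationLogs`) and the `SharedLogger` helper are hypothetical examples, not part of any DataMiner API; the only assumption is a share that every agent in the cluster can reach.

```csharp
using System;
using System.IO;
using System.Text;
using System.Threading;

// Illustrative helper; not part of the DataMiner Automation API.
public static class SharedLogger
{
    // Assumed UNC path to a folder that every DMA in the cluster can reach.
    private const string LogFolder = @"\\MyFileServer\AutomationLogs";

    public static void Append(string scriptName, string message)
    {
        string path = Path.Combine(LogFolder, scriptName + ".log");

        // Prefix each entry with a timestamp and the agent that wrote it,
        // so interleaved entries from concurrent runs stay traceable.
        string line = string.Format(
            "{0:yyyy-MM-dd HH:mm:ss.fff}|{1}|{2}{3}",
            DateTime.UtcNow, Environment.MachineName, message, Environment.NewLine);
        byte[] bytes = Encoding.UTF8.GetBytes(line);

        // Scripts may run at the same time on different agents, so open the
        // file for exclusive append and retry briefly when another writer
        // currently holds it.
        for (int attempt = 0; attempt < 5; attempt++)
        {
            try
            {
                using (var stream = new FileStream(path, FileMode.Append, FileAccess.Write, FileShare.Read))
                {
                    stream.Write(bytes, 0, bytes.Length);
                }

                return;
            }
            catch (IOException)
            {
                Thread.Sleep(100); // File busy; back off and try again.
            }
        }
    }
}
```

A script could then call, for example, `SharedLogger.Append("MyScript", "Processing started");` from its Run method. Because each entry is written as a single append, lines from scripts running in parallel end up interleaved but intact.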
Hope it helps!

Hi Mieke,
The Automation-Orchestration team is currently working on making a log file for each Automation script available in Cube. Would this help for your use case? For more info, check out task 264901 (the current server-side task) and 264916 (the upcoming Cube task).
OK, is there a way to see the code without installing SRM?
+ Does this mean there is no way to create my own logging that can be seen through Cube? For example, this could be useful on DaaS systems.