Hello,
I wrote a scripted connector and I don't know what this error message means.
[2024-10-02 16:15:42.099 ERR] ScriptExecutor#201027357 failed to run script 'Data Sources\Scripted Connectors\8413128f-ef3a-4141-a989-129235c8415d.py' A task was canceled. [DataAggregator.Application.Services.Scheduler.Jobs.Executors.ScriptExecutor]

This is my Python script: 8413128f-ef3a-4141-a989-129235c8415d.py
It appears a lot in SLDataAggregator_042.txt.
Locally, my script works fine without any issues.
The output is a JSON.
At the end of the code, I execute these commands:
import requests  # imported at the top of the script

session = requests.Session()
header_params = {
    "identifier": 'Link Coverage',
    "type": 'APi Call'
}
# json_string holds the JSON output built earlier in the script
session.put("http://localhost:34567/api/data/parameters", json=json_string, headers=header_params)
Thanks in advance.
Hi,
Could you check that log file for an entry close to that one indicating that the Scripted Connector has started?
If you find one, please let us know if the time between the two is roughly 1 minute.
If that is the case, you are likely running into one of the limitations mentioned here: Limitations | DataMiner Docs. Scripted Connectors run at a fixed frequency of 1 minute, and to prevent multiple instances from doing the same work, the running instance gets canceled after 1 minute.
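In case it helps, here is a minimal sketch that scans the DataAggregator log for entries mentioning the script ID and prints the time between consecutive ones, so you can spot whether the gap from "started" to "canceled" is roughly 1 minute. The log path and the timestamp format at the start of each line are assumptions based on the error quoted above; adjust them to your system.

import re
from datetime import datetime

LOG_PATH = r"SLDataAggregator_042.txt"  # assumed location; point this at your log file
SCRIPT_ID = "8413128f-ef3a-4141-a989-129235c8415d"

# Collect the timestamp of every log line that mentions the script.
entries = []
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if SCRIPT_ID in line:
            match = re.match(r"\[(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3})", line)
            if match:
                entries.append((datetime.strptime(match.group(1), "%Y-%m-%d %H:%M:%S.%f"), line.strip()))

# Print consecutive entries with the elapsed time between them to spot ~60 s gaps.
for (prev_ts, prev_line), (ts, line) in zip(entries, entries[1:]):
    print(f"{(ts - prev_ts).total_seconds():6.1f} s between:\n  {prev_line}\n  {line}")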
Hi, it is indeed that file; the entry should be near where you find those error logs.
Alternatively, you could check how long the script takes to run locally to compare and see if it is likely to exceed the 1-minute threshold.
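For reference, a quick sketch for measuring that locally, assuming the script can be invoked as a standalone Python file (the file name below is just the one from the error message):

import subprocess
import time

# Run the script once and report how long it took.
start = time.perf_counter()
subprocess.run(["python", "8413128f-ef3a-4141-a989-129235c8415d.py"], check=True)
print(f"Script finished in {time.perf_counter() - start:.1f} s")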
Okay, thanks, I will check my script.
The script runs locally in around 25–30 seconds, 40 seconds max.
Could it be that the JSON I send is too long?
There is a limitation where the Data API rejects requests with payloads exceeding 1 MB; however, that should return an HTTP error code rather than a message indicating cancellation.
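If you want to rule that out, a rough sketch to check the serialized payload size against that 1 MB limit before sending; the payload variable below is just a placeholder for whatever your script builds:

import json

payload = {"identifier": "Link Coverage", "rows": []}  # replace with the data your script builds
payload_bytes = len(json.dumps(payload).encode("utf-8"))
print(f"Payload size: {payload_bytes / 1024:.1f} KiB")
if payload_bytes > 1_000_000:
    print("Payload exceeds ~1 MB and would be rejected by the Data API with an HTTP error.")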
Hi, thanks for your answer.
Where can I find the log file?
Is SLDataAggregator_042.txt the log file you describe?