At Debug level we want to see the exact JSON contents that came in. Sometimes that is much more than 5000 characters.
Previously this was no problem at all.
Now we have installed 10.2, and there lines of more than 5000 characters are only partially written to the logfile, which makes it unusable for debugging.
How can we change this unwanted behaviour in 10.2?
Hi Cristel,
The 5K limit for loglines has been added for very good reasons, including process crashes on huge loglines and other loglines becoming impossible to find in between very large ones. It is a general protection against rogue logging. The 5K limit is a good compromise that protects the system while still allowing relatively large messages to be logged when needed.
Some alternative approaches could include:
- Writing into a separate file aside from the protocol.Log method, e.g. using File.AppendAllText (a rough sketch follows below this list)
- Creating a debug textbox parameter in the driver in which the data is dumped, so you can inspect it afterwards
- Writing chunks in a loop (as previously suggested), if the logging really needs to end up in the element logfile
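For the first option, a minimal sketch of what that could look like inside a QAction. The class name, method name and file path below are purely illustrative assumptions, not part of any DataMiner API:

using System;
using System.IO;

public static class DebugDump
{
    // Hypothetical helper: appends the full payload to a side file, so it is
    // not subject to the 5K-per-line limit of protocol.Log.
    // The target path is an assumption; pick any location the DMA can write to.
    public static void DumpJson(string json)
    {
        string path = @"C:\Skyline DataMiner\Documents\MyDriver_debug.txt";
        string entry = DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss.fff") + " | " + json + Environment.NewLine;
        File.AppendAllText(path, entry);
    }
}

Keep in mind that such a side file is not managed or cleaned up by DataMiner, so you may want to truncate or delete it once debugging is done.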
Hi,
I have run into the same problem when trying to write out larger content for debugging purposes.
That fixed limit was introduced by the RN that can be found here.
Unfortunately there is no way to disable or change that limit. In my case, when I know that larger debug content is expected, I include a wrapper method in the QAction to split the content into smaller parts and write those out one by one.
Something like:
for (int i = 0; i < allLog.Length; i += 5000)
{
    // Take at most 5000 characters per iteration so each logline stays under the limit.
    int remainingChars = allLog.Length - i;
    int charCount = remainingChars > 5000 ? 5000 : remainingChars;

    protocol.Log(8, 5, allLog.Substring(i, charCount));
}
Afterwards it is copy-and-paste work in the logging to combine the parts and get the original message back.
I know it is not ideal, but it is the only workaround I could come up with to still have some debug information when requested.
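If the chunking is needed in several places, the loop can also be wrapped in a small helper along these lines. The class and method names are illustrative assumptions; the log type/level values simply mirror the snippet above:

using System;
using Skyline.DataMiner.Scripting;

public static class LogHelper
{
    // Hypothetical wrapper around the chunking loop shown above:
    // splits the message into pieces of at most chunkSize characters
    // and logs each piece on its own line.
    public static void LogInChunks(SLProtocol protocol, string allLog, int chunkSize = 5000)
    {
        for (int i = 0; i < allLog.Length; i += chunkSize)
        {
            int charCount = Math.Min(chunkSize, allLog.Length - i);
            protocol.Log(8, 5, allLog.Substring(i, charCount));
        }
    }
}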
Regards,
Hi Cristel,
Have you tried using C:\Skyline DataMiner\Tools\DataMiner Restart DataMiner And SLNet.bat instead? It would at least save you from having to kill all SL processes manually. I know it's not a solution to your issues, but it may at least help a bit.
Upgrades can indeed, on those rare occasions, throw you a curve ball, but the advice will always be to keep your system updated.
With DataMiner today on 10.4 there is a high chance that the issues you are facing have already been addressed and you are just an update away from returning to stability. Do reach out to our support team for further assistance with any peculiar issues with your DMS.
Tried using C:\Skyline DataMiner\Tools\DataMiner Restart DataMiner And SLNet.bat instead: yes, but if the restart button does not work, the stop and start .bat files have to be used separately, and in between the remaining DMA processes have to be killed manually.
10.4?
It will already be a huge semi-refactoring effort to move to 10.3, so 10.4…
It is a pity that functionalities just disappear after an upgrade. Skyline keeps pushing us to upgrade, but at the same time every upgrade brings a downgrade in usability.
Since 10.2 is installed on my localhost, I get 2 to 3 blue screens a day.
Since 10.2, installation of DMA on our validation systems is much more fragile and never succeeds on the first attempt.
At this moment more than 75% of our 10.2 DMA installations are not OK, while all our 10.1 DMA installations work perfectly.
Since 10.2 is installed on my localhost, a simple restart does not work anymore: I have to manually stop the Skyline processes and then restart manually. In 10.1 it was just a matter of pushing a button and 30 seconds later DMA was running again; now it takes 5 minutes.