Profile

Rene De Posada

User info

First name Rene
Last name De Posada

DevOps Program

Acquired rank
Enabler
Points progress
2968 DevOps Professional Points

Achievements

Questions asked

Answers given

A custom command that returns the most important health KPIs of my DMS at any given moment: Tracking the health of a DataMiner System | DataMiner Docs

View Question
36 Votes

Hi Adama, Not sure which item you are trying to recover but here is some information regarding the restoration of certain DataMiner components. Recycle bin | DataMiner Docs Backup - Using the recycle...

View Question
8 Votes Selected

It is possible to execute a script from a driver. To do so, call the ExecuteScript method as specified in the connector development user guide (Link).
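For reference, here is a minimal sketch of what such a call could look like inside a QAction. It assumes the ExecuteScript method is exposed on the SLProtocol object and accepts a single script configuration string, and "MyScript" is a placeholder Automation script name; check the connector development user guide for the exact signature and option format.

using Skyline.DataMiner.Scripting;

public static class QAction
{
    // QAction entry point.
    public static void Run(SLProtocol protocol)
    {
        // Assumed configuration string; the exact format is documented in
        // the connector development user guide.
        protocol.ExecuteScript("Script:MyScript");
    }
}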

View Question

Rebecca, While the number of columns is very flexible, there are limitations linked to the maximum number of parameters (see Limits) that a single DMA can host. In the case of tables, each cell counts...

View Question
6 Votes Selected

Bing, The reason you are not able to see the DataMiner IDP Connectivity element in your alarm filters (see Link) is that this element has been hidden by default since a few IDP releases ago. To show this...

View Question
6 Votes Selected

Hi David, Below is an example of how to achieve the results you are looking for. Using a Microsoft Platform element, configure a GQI query using the "Get alarms" data source and filter by Element name...

View Question
6 Votes Selected

Hi Jeff, Please see some answers below. See answers here: (https://docs.dataminer.services/user-guide/Advanced_Functionality/DataMiner_Agents/Installing_a_DMA/Installing_DM_using_the_DM_Installer.html#custom-dataminer-installation)...

View Question
5 Votes Selected

Hi Joe, Could you try adjusting the custom bindings following the specifications for the element properties (Syntax of OIDs referring to properties)? The OIDs shown above appear to be missing parts of...

View Question
5 Votes Selected

Philip, DataMiner 10.0.13 was the last feature release before the 10.1.0 main release became generally available. Please also see DataMiner Main Release vs. Feature Release | DataMiner Docs. With the...

View Question
4 Votes Selected

Roger, This is a perfect use case for DataMiner, and the architecture is rather flexible. Please see below for a common approach. DataMiner receives a trap or generates a native event/alarm. DataMiner...

View Question
4 Votes Selected

Hi Jens, Expanding on Edson's response, for as long as the connectivity script has access to the DCF configuration (we have used flat files in the past, but if access to DOM is possible then that would...

View Question
4 Votes

Paul, DataMiner Cube is only supported on Windows. From macOS or any other operating system, you can access the DataMiner functionality using a standard browser via the DataMiner web applications....

View Question
4 Votes Selected

Hi Stefan, An alternative could be to create a dashboard and use the native query filter to search across all elements running a specific protocol. Step 1: Create a query that fetches the target...

View Question
3 Votes Selected

Bruno, Time to live is all about data retention. Depending on the DataMiner version, these settings may be found in different locations. I will try to explain using the latest feature release as reference...

View Question
3 Votes Selected

Javier, If I understand your question correctly, you are looking to perform the local offload every 20 minutes but want the actual forwarding to happen every 30 minutes. For this, you'd need to specify...

View Question
3 Votes Selected

Hi, Yes, it is possible to display the list of alarms associated to a group (e.g., under a view) of DataMiner entities in the web apps. Using the alarm component (Alarm table | DataMiner Docs), it...

View Question
2 Votes Selected

Hi Dennis, The inter-DMA synchronization is handled by the SLDMS process. You can find an overview of DataMiner processes here (Link). For more information on synchronization, I recommend the following...

View Question
2 Votes Selected

Hi Tser, DataMiner already integrates with devices (IoT) using MQTT. You can use readily available NuGet packages such as MQTTnet to write a DataMiner connector (protocol, driver) to bring data...
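As an illustration, below is a minimal standalone sketch that connects and subscribes with the MQTTnet package, assuming the v4-style API (method names differ between MQTTnet versions); the broker address and topic filter are placeholders, and in a real connector the payload would be parsed in a QAction instead of printed.

using System;
using System.Text;
using System.Threading.Tasks;
using MQTTnet;
using MQTTnet.Client;

class MqttSketch
{
    static async Task Main()
    {
        IMqttClient client = new MqttFactory().CreateMqttClient();

        // Placeholder broker endpoint (assumption for the sketch).
        MqttClientOptions options = new MqttClientOptionsBuilder()
            .WithTcpServer("broker.example.com", 1883)
            .Build();

        // Handle incoming messages; a connector would push these into parameters.
        client.ApplicationMessageReceivedAsync += e =>
        {
            string payload = Encoding.UTF8.GetString(e.ApplicationMessage.Payload);
            Console.WriteLine($"{e.ApplicationMessage.Topic}: {payload}");
            return Task.CompletedTask;
        };

        await client.ConnectAsync(options);
        await client.SubscribeAsync("sensors/#"); // placeholder topic filter
    }
}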

View Question
2 Votes Selected

Hi Steve, This is possible by passing the information in the dashboard URL. Please see [Specifying data input in a dashboard URL | DataMiner Docs]. Thanks,

View Question
2 Votes Selected

Hi Steve, The cloud team is working diligently to bring relevant information associated with packages directly to the Catalog. However, at the moment, this information can be found in multiple...

View Question
2 Votes Selected

Jesus, A DCF connection is just a link between two known interfaces, in your case between devices. Therefore, having both devices confirming the relations is always the preferred approach. Having said...

View Question
2 Votes

Cristel, Could it be that the element is actually hidden and not deleted? Hidden elements | DataMiner Docs If for some reason, the element went into a rogue state, you could try using the client tool...

View Question
2 Votes

Dario, View tables are normally used to bring data from actual tables into a unified view. This means that the alarming should be applied against the actual parameters holding the data. If you enable...

View Question
2 Votes Selected

Yvan, For security reasons, we always recommend using HTTPS when accessing DataMiner via a web browser, and if the correct settings have been applied (Setting up HTTPS on a DMA | DataMiner Docs), you should...

View Question
2 Votes

Depending on how you are implementing the connection (standard or via QAction), I'd say you can either disable timeouts for the specific connections or use group conditions to not even execute the sessions...

View Question
2 Votes Selected

Sebastian - Things you could try: use a service protocol on your services and have a parameter with the name of the service's parent view; name your services in a way that you can identify them by...

View Question
2 Votes

Jeff, This has the potential to create problems even for Cassandra. At such a high frequency, we have seen that a large number of tombstones (deletion markers) start accumulating in Cassandra, which...

View Question
2 Votes Selected

Wale, It does not seem possible to do this for all elements in the system using GQI. Using Get Parameter By ID gives you access to the Created property, but only for a single element. However, you could...

View Question
2 Votes Selected

Alberto, The CSV export contains limited information, and it is not intended (nor secure) to handle credentials via this operation. For your use case targeting SNMPv3, I would recommend using the...

View Question
2 Votes Selected

For this use case, the Pivot table would be more than enough. You can use the parameter feed and pass the parameters you need to the table. You could also add avg, min, and max to the results. Showing...

View Question
2 Votes Selected

Hi Paul, There are a few considerations when setting up the anomaly detection, which might be causing your configuration not to render the expected results. Below are some of them: Ensure that the...

View Question
1 Vote

Hi Shawn, I just tested against 10.3.12.0 and see the same behavior. This seems to me to be a design choice, as the masked alarm is still linked to the view. Moving the element out of the view resets the latch...

View Question
1 Vote Selected

Hi Samson, If you are using an indexing engine (Elastic, etc.), this post explains how to go about estimating the memory space that alarms and information events could take on a system...

View Question
1 Vote Selected

Alberto, You can configure Cube to only use polling (see Eventing or polling | DataMiner Docs). This is less performant but should align with what you are trying to achieve, if I understand correctly.

View Question

Hi Stacey, The procedure for migrating to Cassandra can be found here (Migrating the general database to Cassandra | DataMiner Docs). To install Cassandra (Linux is now the recommended OS): Installing Cassandra...

View Question
1 Vote

Tiago, If you are looking to load a large number of alarms into the dashboard, I'd recommend using GQI (Link) and the table component (Table | DataMiner Docs). This combination allows you to make use of...

View Question
1 Vote Selected

Tobias, Most, if not all, of the tasks mentioned above (correlation rules are tricky; see https://community.dataminer.services/question/mechanism-to-export-correlation-rules/) should be possible via automation....

View Question
1 Vote Selected

Naveendran, Block storage is used by instances/VMs for consistent data storage, and this is what should be used for the standard DataMiner, Cassandra, and Elastic installations. File/object storage such...

View Question
1 Vote Selected

Jarno, I don't see any reason to deviate from the general recommendation of 30-50 ms (https://community.dataminer.services/dataminer-compute-requirements/). Thank you,

View Question
1 Vote Selected

Bernard, An option would be to keep the rows in the table for a certain period of time after they are no longer being returned, use a column to mark the rows accordingly (e.g., Present or Removed), and then...
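To illustrate the idea, here is a hedged QAction helper sketch; the table PID (1000), the status column position (5), and the availability of the keys returned by the latest poll are all assumptions made for the example.

using System.Collections.Generic;
using Skyline.DataMiner.Scripting;

public static class RowLifecycle
{
    // Hypothetical table PID and status column position (assumptions).
    private const int TablePid = 1000;
    private const int StatusColumnIdx = 5;

    public static void MarkRemovedRows(SLProtocol protocol, HashSet<string> polledKeys)
    {
        // Keys currently present in the DataMiner table.
        string[] existingKeys = protocol.GetKeys(TablePid);

        foreach (string key in existingKeys)
        {
            // Keep the row, but flag whether the latest poll still returned it.
            string status = polledKeys.Contains(key) ? "Present" : "Removed";
            protocol.SetParameterIndexByKey(TablePid, key, StatusColumnIdx, status);
        }
    }
}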

View Question
1 Vote Selected

Another option could be to enable data offload via file export for the specific parameters. This will already allow you to get the offloaded data in CSV files following the specified data resolution....

View Question
1 Vote

Christhiam, Looking at the numbers as presented above, the node calculation seems reliable and it should cover your numbers safely. The multi-cluster approach is the recommended architecture, and indeed,...

View Question
1 Vote Selected

Alberto, The situation you explain is very common for connectors that target a specific product line. The bottom line is that as vendors evolve their offerings, their APIs, MIBs, etc. also evolve to accommodate...

View Question
1 Vote

Flavio, This should be possible by implementing the appropriate logic within the service definition connector, since there are already calls that could be used to retrieve the properties associated...

View Question
1 Vote Selected

Jochen, It is possible to integrate with Java using .NET, and we have done this in the past by converting the native JAR libraries into DLLs that can be used by DataMiner connectors. The tool we have used...

View Question
1 Vote Selected

Robin, As specified in the connector development user guide (Link), you can have access to all service parameters using the Engine methods. A snippet of code would look like this: Thank you,

View Question
1 Vote Selected

This use case can be achieved using the Dashboards app GQI or Pivot table component. Using the Pivot table: configure the component to report on one or more parameters and enable trend statistics. A CSV...

View Question
1 Vote Selected

Hi Dennis, As explained in this tutorial (Automatically detect anomalies with DataMiner), it is recommended that at least one week of data is present, and that both real time and average trending are...

View Question
0 Votes Selected

Hi Henri, As stated in the documentation [Link], the user settings are stored in the DataMiner server, so if you have admin access to the system and if your IT policies allow it, you could try copying...

View Question
0 Votes Selected

Hi Philip, Based on your description where you are currently running Cassandra and DataMiner on the same machine without indexing, you could move into any configuration described here (Separate Cassandra...

View Question
0 Votes Selected

Hi Ciprian, The indexing engine (Configuring an indexing database | DataMiner Docs) is necessary if specific features (DOM, SRM, UDAs, specific GQI data sources) will be used in combination with LCAs....

View Question
0 Votes Selected

Just updating the post with the results of our investigation. As it turns out, the alarms mentioned in the original post had already been generated prior to the introduction of the RCA links, which mean...

View Question
0 Votes Selected

Hi Kristopher, With the pivot table (Pivot table | DataMiner Docs) in dashboards, you can easily achieve this behavior. Below you see an example where the Avg, Min and Max are shown for the Total Processor...

View Question
0 Votes

Michiel, It is hard to tell where the issue with the history sets may be, but below are some possibilities that we have seen impacting history sets: could it be that some data points coincide with the exception...

View Question
0 Votes Selected

Bruno - the storage architecture for distributed systems continues to be the same (Recommended setup: DataMiner, Cassandra, and Elasticsearch hosted on dedicated machines, with a minimum of three Elasticsearch...

View Question
0 Votes

Randy - Could you try applying the conditions to the Configure Indices section instead of the Configure Elements and see if it works that way? 

View Question
0 Votes

Paul, Multithreading could also be a viable option for the use case above since the tasks executed by the QActions are logically decoupled. Multi-threading | DataMiner Docs
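As a generic .NET illustration (not DataMiner-specific), logically decoupled jobs can be run in parallel as sketched below; the workloads are placeholders, and any parameter sets are assumed to happen afterwards on the thread that owns the SLProtocol instance.

using System.Threading.Tasks;

public static class DecoupledWork
{
    // Run two logically independent jobs in parallel and collect the results.
    public static async Task<int[]> RunAllAsync()
    {
        Task<int> jobA = Task.Run(() => PollDeviceA());
        Task<int> jobB = Task.Run(() => PollDeviceB());

        // Await both before touching any shared DataMiner objects.
        return await Task.WhenAll(jobA, jobB);
    }

    // Placeholder workloads (assumptions for the sketch).
    private static int PollDeviceA() => 1;
    private static int PollDeviceB() => 2;
}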

View Question
0 Votes

Wil, Redundant connections are leveraged as described here: Redundant polling | DataMiner Docs Other useful sources are: PortSettings element | DataMiner Docs Type element | DataMiner Docs Session...

View Question
0 Votes Selected

Bernard - This is not currently possible in the way you present it here. However, you can make use of custom properties to play with a custom parameter description.

View Question
0 Votes Selected

Ryan, While the time zone where the servers are located does not matter, it is recommended that all servers in the cluster (DataMiner, general database, and indexing engine) use a synchronized time...

View Question
0 Votes Selected

Daniel, Q1.A: it does not seem possible to show the legend anywhere other than at the bottom of the graph. Note that you could decide to collapse the legend by default. Q2.A: it does not seem...

View Question
0 Votes Selected

We were recently asked to provide something similar, and a workable solution would be to use information events. Every time a user logs in/out of DataMiner, an associated information event is logged....

View Question
0 Votes

Stijn, If SLNET was not running, DataMiner was not operational. If this happened during the upgrade attempt, the upgrade shouldn't have been successful either. To determine what could have caused the...

View Question
0 Votes Selected

Javier, As specified in the DataMiner User Guide (Configuration of DataMiner Processes), it is possible to configure how many of these processes run within each DataMiner Agent. However, please keep in...

View Question
0 Votes Selected

This use case could be achieved via the Dashboards Alarm component (see DataMiner Dashboards for more on this app), which could be set up to include only timeout events and further refined to any specific...

View Question
0 Votes