Unit testing introduced

In a weekly blog post over the coming months, Pedro and I will introduce you to several tutorial videos on the basics of unit testing in protocols.

When we talk about quality, we also talk about testing. And when we talk about testing, your first port of call should always be unit testing. Unit testing focuses on testing your classes and their public methods. Such tests are quick to write, easy to maintain, and provide immediate benefits.

You are testing the building blocks of what can eventually become a grand castle, making sure that the classes you have written work well and continue to work well in the future. In other words, unit tests facilitate refactoring by ensuring that everything still works afterwards.

If you spend a little time writing your tests during development, you’ll not only save a lot of time in the future, but you’ll also be much more confident in what you’ve written and delivered.

And now with the DIS Extension and the CI/CD Pipeline, you can truly benefit from these tests within Protocols. Using DIS, you can now write protocols as a solution and add unit test projects to validate your QActions.

Unlike integration and system tests, which focus on the interaction between different modules, with unit tests you do not need or even want access to a DataMiner or a real device. The focus lies on making sure your code does what it should, without possible impact from e.g. different DataMiner versions or device firmware versions. You are cutting out all possible noise from outside sources. This guarantees that your part of the code works as intended in isolation.

Wait, you are going too fast! What are all these types of tests you are talking about?

  • Unit tests: these are the smallest tests, usually written by the developer before or during development. The focus is on isolating your class and testing only that class and its public methods. You are pretending to be ‘something’ that uses the class.
    • Within System Development, these are generally written using MSTestv2 in Visual Studio itself, with Moq as a mocking/isolation framework to help isolate your code.
  • Integration tests: the intention here is to specifically test different processes working together. For example, you will use a real device, actually send a call, and check whether you receive the response you expected. Or you will use an actual DataMiner to test a piece of code that should, for example, create an element.
    • Within System Development, these are generally written using MSTestv2 in Visual Studio itself, against an actual DataMiner installed on a staging server somewhere or actual devices available over the network.
  • System tests: you are using the full system. In our case, that means an actual DataMiner, actual devices, a real database, etc.
    • Within System Development, these are generally written as automation scripts on a DataMiner with the support of the QA Chapter. The scripts will take control of and manipulate a whole system.
  • UI tests: you are testing a full system visually, by actually using the GUI as a user would see it (e.g. using tools such as Ranorex Studio or Cypress), using image recognition to find and click specific buttons and checking whether they send the right data.
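To make that first layer concrete, here is a minimal sketch of what a unit test looks like in MSTestv2, following the classic arrange/act/assert pattern. The `ResponseParser` class and its response format are hypothetical examples, not part of any real protocol — substitute your own QAction code:

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Hypothetical class under test: parses a raw device response into a value.
public class ResponseParser
{
    public int ParseTemperature(string rawResponse)
    {
        // e.g. "TEMP:42" -> 42
        return int.Parse(rawResponse.Split(':')[1]);
    }
}

[TestClass]
public class ResponseParserTests
{
    [TestMethod]
    public void ParseTemperature_ValidResponse_ReturnsValue()
    {
        // Arrange: a hardcoded example response, no real device involved.
        var parser = new ResponseParser();

        // Act
        int result = parser.ParseTemperature("TEMP:42");

        // Assert
        Assert.AreEqual(42, result);
    }
}
```

Note that the test never talks to a device or a DataMiner: the raw response is simply hardcoded in the test itself.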

I have heard someone talk about regression tests. What are those then?

Regression tests can be any of the above. The term just means that the tests are re-run often to avoid breaking existing code while you are working on it. You are avoiding a regression in the quality of your code over time.

Every type of test has its own worth, but the higher up the testing pyramid you go, the more complex and difficult the tests become to create and maintain. This means you always want to try to cover as many situations as possible in the lower layers of tests.

Be careful not to fall into the trap of writing a lot of large end-to-end UI tests and ignoring the other tests. Running those large tests can take a long time. They can also quickly become very difficult to maintain and can end up breaking because of their external dependencies. Debugging then requires you to check all those dependencies.

Unit tests are quick to write and maintain, so as a developer you generally want to start with these and cover as much as possible with them. This avoids having to cover those scenarios with more complicated tests like integration, system, and UI tests.

For example:

  • You took half an hour to write three or four unit tests that make sure a specific expected response from a device is parsed correctly and that all the right calls are performed to set parameters on an element. You didn’t actually use a device; you just provided a hardcoded example response.
  • When you then consider writing an integration test using a real device, all you want to test is that you receive the right response when you send a specific command. You no longer care about how the response is parsed or how values are set.
  • Imagine you had started with this integration test and had no unit tests. You would end up writing a much more complicated test that could take you an hour or more, because you would also want to verify that it parses the response the way you want. You’d have a very large test that covers a lot, but when it fails it would be hard to know what broke, and such tests need to be refactored often or they become useless over time (as firmware changes and the dependent software changes).
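The first bullet above could be sketched as follows, with Moq standing in for the element so that no DataMiner or device is needed. The `IParameterWriter` interface, the response format, and the parameter IDs are all hypothetical illustrations, not a real DataMiner API:

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Moq;

// Hypothetical abstraction over 'setting parameters on an element'.
public interface IParameterWriter
{
    void SetParameter(int parameterId, string value);
}

// Hypothetical class under test: parses a response and writes the values.
public class DeviceResponseHandler
{
    private readonly IParameterWriter writer;

    public DeviceResponseHandler(IParameterWriter writer) => this.writer = writer;

    public void Handle(string response)
    {
        // e.g. "NAME=Decoder;STATE=Active"
        foreach (var pair in response.Split(';'))
        {
            var parts = pair.Split('=');
            int parameterId = parts[0] == "NAME" ? 100 : 101;
            writer.SetParameter(parameterId, parts[1]);
        }
    }
}

[TestClass]
public class DeviceResponseHandlerTests
{
    [TestMethod]
    public void Handle_ExampleResponse_SetsExpectedParameters()
    {
        // Arrange: hardcoded example response, mocked writer instead of a real element.
        var writerMock = new Mock<IParameterWriter>();
        var handler = new DeviceResponseHandler(writerMock.Object);

        // Act
        handler.Handle("NAME=Decoder;STATE=Active");

        // Assert: the right calls were performed, with the right values.
        writerMock.Verify(w => w.SetParameter(100, "Decoder"), Times.Once);
        writerMock.Verify(w => w.SetParameter(101, "Active"), Times.Once);
    }
}
```

Because the mock records every call it receives, the test can verify the outgoing parameter sets without ever touching a real system.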

So, convinced yet about unit tests? Well, if not, I urge you to give them a try anyway, and I am sure that will do the trick!

Next week, we will release the first blog post alongside a video to get you started.

3 thoughts on “Unit testing introduced”

  1. Simon Raine

    Great introduction to leveraging a powerful testing tool in the developer’s toolbox! Unit testing is often overlooked and it should be the first port of call in your development/testing cycle, and ideally the basis of test driven development.

  2. Ben Vandenberghe

    Indeed, great introduction! And really looking forward to the next episode and to learning how this will be applied in practice for the development of interface drivers integrating new data sources into DataMiner.

  3. Emma Saldi

    Well explained! As a non-technical person it’s also great to see the bigger picture. Looking forward to the next episode.
