Elena

Posted on • Updated on • Originally published at smartpuffin.com

Unit testing: best practices

This post was originally published on my blog, smartpuffin.com.

I have been unit testing my code for many years.

While building a GIS system, we really cared about our product quality. Our users' needs demanded that the app work properly. I had all critical and/or complex parts of the code 100% test-covered, with multiple paths and corner cases. It was such a pleasure to find a bug, fix it, write a couple of tests for the surprise scenario, and be sure it wouldn't break again. Ah, good times.

Here is a collection of best practices that I've built up over the years. Please note: I'm going to assume you're already familiar with the concept of unit testing.

What to test

Of course, you have limited resources. Of course, testing absolutely everything would require an enormous amount of time. And, of course, you need to explain unit testing and its benefits to your project manager and negotiate with them. That's why you need to prioritize.

The best way to prioritize is to look at the business needs.

This may sound surprising to some developers. You might think: tests are code, they don't have anything to do with business.

In fact, all code has to do with business. Ultimately, you write code to solve a problem your users have, and that's a business need.

But I digress. Say, you don't have any tests but you really want to start. Brilliant!

Pick the most critical part of the system. Is it your customers' purchase experience? Is it the tools your users work with? Is it the data flow? Is it a complex math calculation?

Identify this place and start from there. Since it's the most important part of the system, which absolutely has to be working properly, you can justify spending time on unit testing.

Why unit testing is needed

Unit testing helps to make sure the code works correctly. Unit tests are a great way to do regression testing: simply run them every time you change something and make sure nothing breaks. Spotted a bug? Fix it and write a unit test to make sure it doesn't happen again.

For the important parts of code you have identified, you need to write unit tests to make sure the code works:

In base cases

When the users and the other code do what's expected of them.

These tests are the easiest to write - after all, we all aim for success :).

  • Pass proper arguments to your function and make the test expect the proper return value.
In corner cases

You want to make sure the code works in corner and edge cases - when a rare scenario happens.

  • Pass 0 to your math function. Pass -1. Pass INT_MAX. Pass an empty string.
In case of failure

You also need to verify that the code breaks properly. For example, if a financial operation fails, we have to be sure that the money doesn't disappear altogether.

  • Pass a NULL object reference. Do something that generates exceptions.
  • You think this is not possible because the code calling your function doesn't do that? Maybe that's the case right now, but code changes. If the frontend form validates all input today, but tomorrow someone refactors it, you don't want your backend to start failing unexpectedly.
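The three categories above can be sketched in a few lines. Here is a minimal illustration using Python's built-in unittest module (the `safe_divide` function and its behavior are hypothetical, invented just for this example; the idea applies to any language and framework):

```python
import unittest

def safe_divide(numerator, denominator):
    """Hypothetical function under test: divide, rejecting a zero denominator."""
    if denominator == 0:
        raise ValueError("denominator must be non-zero")
    return numerator / denominator

class SafeDivideTests(unittest.TestCase):
    # Base case: proper arguments, expect the proper return value.
    def test_divides_two_positive_numbers(self):
        self.assertEqual(safe_divide(10, 2), 5)

    # Corner cases: zero and negative inputs.
    def test_zero_numerator_returns_zero(self):
        self.assertEqual(safe_divide(0, 5), 0)

    def test_negative_numerator(self):
        self.assertEqual(safe_divide(-6, 3), -2)

    # Failure case: the code must break properly, not silently.
    def test_zero_denominator_raises_value_error(self):
        with self.assertRaises(ValueError):
            safe_divide(1, 0)
```

Run the file with `python -m unittest` to execute all four tests.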

 

After covering the most important code with comprehensive unit tests, you can work your way towards less critical code, with fewer unit tests. Prioritization!

Modularization and dependencies

Sometimes the first step is to prepare your piece of code for unit testing. This is where modularization comes in handy! The more modular the code is, the easier it is to test.

Once you have identified the dependencies and split them out properly, the rest is easy. For a unit test, you have to mock the dependencies to make sure you're only testing your current module's behavior. Otherwise you will also be testing the behavior of the things it depends on - but those deserve their own tests.
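As a sketch of what mocking a dependency looks like, here is a hypothetical `ReportService` that receives its repository through the constructor, tested with Python's standard-library unittest.mock (all names here are invented for illustration):

```python
from unittest.mock import Mock

class ReportService:
    """Hypothetical module under test. The repository is injected,
    so a test can replace it with a mock."""
    def __init__(self, repository):
        self.repository = repository

    def total_sales(self, month):
        # The business logic we want to test in isolation.
        return sum(sale["amount"] for sale in self.repository.sales_for(month))

# In the unit test, the real repository (database, network, ...) is replaced
# with a mock, so only ReportService's own behavior is exercised.
repo = Mock()
repo.sales_for.return_value = [{"amount": 10}, {"amount": 15}]

service = ReportService(repo)
assert service.total_sales("2024-01") == 25
# We can also verify the interaction with the dependency itself.
repo.sales_for.assert_called_once_with("2024-01")
```

If this assertion block runs without errors, the service's logic is verified without ever touching a real data source.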

You can also decide to test the module together with its dependencies. While some might say this is not unit testing anymore (but rather integration testing), it is still testing, and it is still useful.

Test structure

Test naming

There are several approaches to naming test methods. In my opinion, the most important thing is that the name be descriptive. As descriptive as possible. Let it be 200 characters long - the clearer, the better.

Small tests

Since all tests have very descriptive names, you can add as many small tests as you need - for all the nooks and crannies of your corner cases. Make sure each test only tests one scenario.
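Both points - descriptive names and one scenario per test - can be shown in one small sketch (the discount function and its rules are hypothetical, invented for this example):

```python
import unittest

def calculate_discount(amounts):
    """Hypothetical function under test: 10% discount on totals over 100."""
    total = sum(amounts)
    return total / 10 if total > 100 else 0

class DiscountCalculatorTests(unittest.TestCase):
    # Each test covers exactly one scenario, and its name states the scenario
    # and the expected outcome - a failure report then reads like a sentence.
    def test_discount_is_zero_for_empty_cart(self):
        self.assertEqual(calculate_discount([]), 0)

    def test_discount_is_zero_at_exactly_one_hundred(self):
        self.assertEqual(calculate_discount([100]), 0)

    def test_ten_percent_discount_applies_to_totals_over_one_hundred(self):
        self.assertEqual(calculate_discount([150]), 15)
```

When `test_discount_is_zero_at_exactly_one_hundred` fails, you know precisely which boundary broke without reading the test body.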

Folder structure

For me, it turned out to be most convenient when the test project structure mirrors the main project structure. This way it is easy to find the tests for a module.

Independent tests

Make sure your tests don't depend on each other. Perform a cleanup step at the beginning and at the end of each test if you need it - unit testing frameworks usually provide special methods for that.
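In Python's unittest, those special methods are `setUp` and `tearDown`. A minimal sketch (the file-store scenario is hypothetical) showing how each test starts from a clean slate, so the tests pass in any order:

```python
import os
import tempfile
import unittest

class FileStoreTests(unittest.TestCase):
    # setUp runs before *each* test: every test gets a fresh directory,
    # so no test depends on what a previous one left behind.
    def setUp(self):
        self.tmp = tempfile.TemporaryDirectory()
        self.path = os.path.join(self.tmp.name, "data.txt")

    # tearDown runs after each test, even when the test fails.
    def tearDown(self):
        self.tmp.cleanup()

    def test_written_content_is_read_back(self):
        with open(self.path, "w") as f:
            f.write("hello")
        with open(self.path) as f:
            self.assertEqual(f.read(), "hello")

    def test_file_does_not_exist_before_writing(self):
        # Passes in any test order, because setUp created a clean directory.
        self.assertFalse(os.path.exists(self.path))
```

Other frameworks have equivalents: JUnit's `@BeforeEach`/`@AfterEach`, NUnit's `[SetUp]`/`[TearDown]`.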

Change the code - run tests

Sometimes I hear someone asking: how do I know which tests to run when I edit the code?

Run them all!

Since the folder structure for tests mirrors the one for the main code, it is easy to find the test package related to your change.

Because you performed the modularization step and kept the tests small, you have already made sure they run fast. So you can afford to run a package of tests quite often - for example, while you develop and before pushing.

Add the code - write tests

Changing an important piece of code which, according to your business needs, must work? Write a test.

Fixed a bug? Write a test. Make it a habit! Tests are an important part of the code, so make an effort to keep them up to date, and they will save you many times in return.

It's okay if tests outnumber the code

In fact, that's expected!

It's great if you have many tests for one piece of code - that means you check a lot of cases.

It's okay if you spend time on writing tests - sometimes, even more than on writing code. Tests are a safety net for your critical business parts, remember?

When unit testing is not needed

In my project, we didn't cover all our code with unit tests. For some of it, we made a conscious decision not to. So, when is unit testing not needed?

When your code is not critical and your business can afford some mistakes in that part of the system.

When the effort to make the code testable and write tests is too large, and testing manually requires less effort. You need to estimate the effort cumulatively, over the period of time you plan to maintain the project.

When there is a dependency which you cannot abstract away: you moved it out of all other modules into just one, but you cannot remove it from there. In that case you can unit test all the other modules, but this one will have to live as is. A popular example is a module using the current timestamp.
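The timestamp case can often be shrunk to a single seam by injecting the clock instead of reading it deep inside the logic. A hypothetical sketch (function and names invented for illustration): production code uses the default real clock, while tests pass a fixed one:

```python
import datetime

def make_greeting(name, now=datetime.datetime.now):
    """Hypothetical example: the current-time dependency is injected as a
    parameter, so tests can pin it while production uses the real clock."""
    period = "morning" if now().hour < 12 else "afternoon or evening"
    return f"Good {period}, {name}!"

# Production call uses the real clock:
#   make_greeting("Elena")

# A test pins the clock to a known instant, making the result deterministic:
fixed_morning = lambda: datetime.datetime(2020, 1, 1, 9, 0)
assert make_greeting("Elena", now=fixed_morning) == "Good morning, Elena!"
```

Only the thin default (`datetime.datetime.now` itself) remains untested, which is usually acceptable.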

When it's a prototype, a proof-of-concept, or experimental code, and you need to develop it as fast as possible, and you're probably going to throw it away anyway.

And last but not least - when you are sure your code is the best :)

Top comments (20)

Rafal Pienkowski

Hi Elena.

First of all, great article. I would like to add my own observation about unit tests.

In discussions with the business about the necessity of writing unit tests, I used to use the argument that unit tests help us discover bugs earlier and, at the end of the day, are cheaper than manual testing. Costs always play a role in discussions with the business. 😉

About modularization and dependencies: as you said, unit tests improve the architecture of our code. It is easier to follow SOLID principles when we're writing unit tests. In my opinion, we can treat unit tests as a guard of SOLID principles in our code. If something is painful to test, it smells like it's not SOLID enough.

About code which doesn't require unit tests: I totally agree that we should be aware of the effort needed to cover a particular piece of code with a test, and when the costs are higher than the benefits, we should forgo writing it. BUT it can be dangerous in some cases. We can end up with code which is hard to test due to its dependencies, and which is fragile and not maintainable. We can end up in a vicious circle: on the one hand, we want and need to write a unit test; on the other hand, writing it is very, very time-consuming. I think that sometimes we should pay the cost. The longer we wait, the higher it rises. If we are talking about some legacy code which doesn't change at all, I think we can deal with this. But when we need to maintain the legacy code, we should pay and write unit tests.

To sum up, great article. I've read it with real pleasure. Cheers.

Elena

Hey Rafal, thank you!
The importance of testing depends on business priorities. After all, tests consume time, and sometimes the business decides to stay on the side of development speed, rather than quality.

Sometimes, when this happens, the code becomes untestable and also harder to change, but at this time refactoring and tests cost even more. They are a long-term investment. This is a good time to make the statement about the future costs you were talking about, and start to introduce testing from the most critical parts.

Rafal Pienkowski

Exactly, this was my thought ;)

edA‑qa mort‑ora‑y

I disagree with the need for mocking, and I think people spend way too much time setting up mocks. The rest of your modules should also work, since you've tested them as well, so it's totally fine to rely on them working when writing new tests.

The goal is, as you said, to test business needs. There isn't a strong objective to have modules tested independently.

This isn't to say that mocks aren't useful tools, just that they are being too widely used in situations when they just aren't necessary.

Elena

I think I understand your point of view - you are talking about integration testing.

But I have an objection. When your code is complex enough and has several nested dependencies, it becomes very hard to test all combinations of all code paths with integration testing.
Imagine this:
{code}
class A {
    void methodA(B b) {
        if (condition) {
            b.methodB1();
        } else {
            b.methodB2();
        }
    }
}

class B {
    void methodB1() {
        if (condition) {
            doSomething();
        } else {
            doSomethingElse();
        }
    }

    void methodB2() {
        if (condition) {
            doSomething2();
        } else {
            doSomething3();
        }
    }
}
{code}
If we don't mock B while testing class A, we end up with a dependency on a particular implementation of B. How many tests for A do we need to write now?

If in the code we use different B's, say, different subclasses, do we need to test these combinations as well? This is a lot of tests for A.

Now, if there is a problem in B's implementation, all tests for both B and A will break, and it will be harder to understand where the problem is.

I think there should be a healthy combination of pure unit-tests and integration tests. In case of doubt, and especially if the logic is complex, I would prefer to have unit-tests as well.

edA‑qa mort‑ora‑y

If you're testing A, you still limit your testing to A, on the assumption that B is working. That is, just because B branches on a condition doesn't mean you test it in A's tests.

What you're essentially doing is just using a current implementation of B as the mock object. You know how it works and rely on that for the testing of A. This is sufficient to demonstrate that A is working correctly.

If B breaks, then yes, tests in A will also break, but so will tests in B. Unless you have circular dependencies you still have a tree of changes to work through in order.

Keep in mind the primary value of unit tests is to ensure things are working. Debugging is a secondary value. Getting more coverage by reusing code is of more immediate value than getting clean mocks. I also start with the breadth of coverage, and if, and only if, debugging is proving problematic, do I start writing focused mock objects.

Nested Software • Edited

I agree with edA-qa on this point. Creating multitudes of mocks just to keep unit tests 'pure' seems like a great deal of effort to create and maintain additional code, and the benefit of doing so seems unclear to me. As edA-qa said, the real code can usually do double-duty as a test object.

There are cases for using mocks, but my opinion is that they should be introduced when depending on the real application logic in a test causes a problem. That can be for performance reasons; possibly to reduce the amount of setup/configuration needed just to run unit tests; perhaps because the real code may not be available when tests run; often mocks are used because time affects the results of the real code - in fact I imagine that things like the current system time and calls to random are some of the most often mocked pieces of logic.

Note: For this comment, by 'mock' I mean any kind of stand-in used for a test that replaces the actual application code. That includes 'mocks', 'stubs', 'fakes'...

Sandor Dargo

Thank you, Elena, for this really interesting article!

I have a couple of questions. You wrote that the tests' directory structure repeats the production code's directory structure. Do your tests completely repeat the production code's structure? Do you change your tests, when you refactor? In other words, are your tests contra-variant with the structure of your production code?

You also wrote that we have to explain the importance of unit testing to the business. What scenario do you mean? When you take over some legacy code and you have to turn it into something maintainable or when you start working on some legacy project?

Thanks again for your article; I think you emphasize some very important practices that so many of us lack.

Elena

Hi Sandor, happy you liked it:)

Yes, for us the tests repeated the code structure in terms of classes - a class called, e.g., ToolX had a complementary class called ToolXTests.

Regarding refactoring: if while refactoring I change the logic or the API of a module, then I have to change the tests and make sure they pass.

About the business. Sometimes I hear people starting a new project, or already in the middle of a project, and struggling: do we need tests? When do I start with tests? How to explain unit-tests to the management? Business needs are a good way to prioritize and begin.
For a legacy project which you have to support, it is also considered good practice to start with tests. This way you're supposed to understand better what the code does, make it more maintainable (through modularization and decoupling), and provide yourself a safety net for code changes. I haven't used this approach with legacy code myself yet, but I do see the pros.

Sandor Dargo

Thanks for your answer, Elena.

A few months ago, one of our architects commented on my pull request in which I refactored some classes and I was copying the new structure to the tests. He told me that I was basically coupling production and test code and he also shared with me this article about test contra-variance. You might find it interesting.

I think it's always a good practice to start with tests. Unless you are really, I mean really sure that it's just a bit of throw-away code, most probably a prototype.

As I try to follow TDD as much I can, I don't even feel the urge to explain unit testing to the business. It's just the way I write code. Unit tests are part of the code, their cost is part of the estimation. I don't explain to them if statements or polymorphism either, but more the relevant decisions about the business logic we have to make.

Elena

I have a bit of a different point of view than that article. Let me try to explain.
For me tests are the integral part of the code, just like comments or naming. So I don't really see a problem with changing tests while refactoring.

The most important thing about the code, for me, is how easy it is to read and understand. I do my best to keep the code as plain as possible because I am probably not smart enough to understand and remember complex code. If the tests are not reflecting the code structure at all, like they propose in the article, it becomes very hard for me to understand how tests and code are connected, where they are, and which parts of code they are for.

If I refactor by moving a piece of code to a separate public class or method, I do want a test for it! They propose to only have integration tests for the generic API. But I want to have a unit-test for this new piece too, for several reasons:

  • it is a public API now, can be used in many places independently, so doesn't make sense to only test inside the old code;
  • if we consider tests "documentation" for the code, we need a piece of it for the new code as well;
  • again, I am not smart enough to properly and comprehensively test a complex piece of code which is the generic API. It is easier for me to test in pieces. Modularization and decoupling - also of tests.

Regarding the TDD, I find it an interesting concept, but I never managed to actually start with tests. I don't start with API, I always need some time to crystallize it based on implementation.

Thank you for a very interesting discussion!:)

Sandor Dargo

I must agree, the most important thing about the code is to make it easy to understand. Then if we understand it, we can correct it, if needed.

Thank you too, it's always interesting to learn about other ways to see the things!

Hila Berger

Hi Elena, great article! I really enjoyed reading it :)
Which unit testing/mocking frameworks do you use?

Elena

The project was mostly in C#. We had NUnit, MSTest, and Rhino Mocks.
For the part that was in Java, I think we had JUnit and Mockito, but here I'm not so sure - it was more than 2 years ago:)

Hila Berger

Were you satisfied with these frameworks?
Have you heard about Typemock by any chance?

Elena • Edited

I liked the C# ones - they were rather easy to use and rich in features at the same time. E.g., you can "expect" how many times a certain function is called, or make your mock return different values on different calls (although the syntax for that wasn't very easy).

I don't think I've tried Typemock, so I can't say anything about it, unfortunately.

Hila Berger

Thanks!
I'm asking because my team and I are working with Typemock, and your article made me realize how powerful it is.

Elena

Oh how nice:)) happy I helped!

Ismail Mayat

Great article. It covers adding tests to an existing code base; however, this leads back to the confusion between tests and test-driven development, where you write the tests first to drive the development of the system architecture, and code coverage is an added benefit thrown in.

I highly recommend Ian Cooper's talk "TDD, Where Did It All Go Wrong" (youtube.com/watch?v=EZ05e7EMOLM) and, of course, the classic Kent Beck TDD book and the three laws of TDD.

Idan Arye