Best Practices for Writing Tests


When I say the word test, what's the first thing that comes to mind? It's probably the boring tests you took in school: multiple choice, short written answers, or showing your work in math. When I talk about tests, I mean automated processes that execute against software to improve its quality as part of the quality assurance process.

There are several approaches to testing, such as automated testing, manual testing, and smoke testing, along with various subtypes such as unit testing and regression testing.

Why is testing important?

Testing is vital in software development. It has been shown many times over that automated testing helps reduce bugs. Nobody wants to spend time learning How to Be a More Efficient Debugger if they don't have to. In the book Clean Code, Robert Martin (Uncle Bob) discusses the importance of testing at length.

Automated tests are usually more efficient than manual tests and can be executed frequently to ensure quality. Software developers should always write tests for everything that is absolutely fundamental. These tests are usually called regression tests and should be added whenever functionality or features change. In systems where documentation is sparse, the tests themselves can serve as very good documentation, especially if they are executed as part of the build process.

Best Practices for Writing Tests

In this blog post, I’m going to explain what I consider are some best practices for writing tests. A lot of these tips and best practices have been acquired over years of development experience and from reading and reviewing a lot of code.

Properly designed tests have a binary result – it’s either a pass or a fail. There’s no ambiguity and the response should be immediately obvious because there’s an assert to ensure the response is correct.
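As a sketch of that idea in Python's built-in unittest framework (the `add` function here is purely illustrative, standing in for real code under test):

```python
import unittest


def add(a, b):
    """A stand-in for the real unit of code under test."""
    return a + b


class TestAdd(unittest.TestCase):
    def test_add_returns_sum(self):
        # The assert makes the outcome binary: pass or fail, nothing in between.
        self.assertEqual(add(2, 3), 5)


if __name__ == "__main__":
    unittest.main()
```

When the assert holds, the test passes; when it doesn't, the framework reports a failure with the expected and actual values, so the result is never ambiguous.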

Know the Difference between Unit Tests and Integration Tests

Unit Tests and Integration Tests are very different, but both serve a valuable purpose. A well-designed codebase will contain a combination of both to make finding bugs simpler and to help reduce the chance of regressions.

Unit Tests test the behaviour of a single unit of code; in some cases this might be as small as a function or as large as a series of classes. Unit Tests are generally tightly coupled to the code they are testing and have an intimate knowledge of its behaviour. Usually, they do a lot of mocking, test expected return values, and test what happens on an exception. Unit tests should not contain knowledge of other parts of the code; that means they should only fail when you've broken something in that particular unit of code.
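A minimal sketch of that style using Python's `unittest.mock` (the `fetch_greeting` function and its `client` collaborator are hypothetical):

```python
from unittest import mock


def fetch_greeting(client, user_id):
    """Hypothetical unit under test: builds a greeting from a lookup."""
    name = client.get_name(user_id)
    return f"Hello, {name}!"


def test_fetch_greeting_returns_expected_value():
    # The external client is mocked, so the test exercises only this unit.
    fake_client = mock.Mock()
    fake_client.get_name.return_value = "Ada"
    assert fetch_greeting(fake_client, 42) == "Hello, Ada!"
    fake_client.get_name.assert_called_once_with(42)


def test_fetch_greeting_propagates_exceptions():
    # Unit tests also cover what happens on an exception.
    fake_client = mock.Mock()
    fake_client.get_name.side_effect = KeyError(42)
    try:
        fetch_greeting(fake_client, 42)
        assert False, "expected KeyError"
    except KeyError:
        pass
```

Because the mock replaces the real collaborator, only a change inside `fetch_greeting` itself can make these tests fail.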

Integration Tests are usually bigger tests that don't require knowledge of how the codebase is structured. They use only the mocks that are strictly necessary, pass in parameters, and inspect the return value. Generally, integration tests are relatively easy to maintain and won't break as the code is refactored or changed. Integration Tests are a great way of understanding how the system actually functions at a high level.
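To contrast with the unit-test style, here is a sketch of an integration-style test over a hypothetical two-layer example. The test only calls the public entry point and checks the result, with no knowledge of the internal helpers:

```python
def _normalize(text):
    """Internal helper: the test below knows nothing about this."""
    return text.strip().lower()


def _lookup(word, inventory):
    """Internal helper: also invisible to the test."""
    return inventory.get(word, 0)


def stock_level(raw_word, inventory):
    """Public entry point wiring the internal pieces together."""
    return _lookup(_normalize(raw_word), inventory)


def test_stock_level_end_to_end():
    inventory = {"widget": 7}
    # Pass in parameters, inspect the return value. Refactoring the
    # _normalize or _lookup helpers won't break this test as long as
    # the overall behaviour stays the same.
    assert stock_level("  Widget ", inventory) == 7
```

Because the test couples only to the public behaviour, it survives internal refactoring that would break a mock-heavy unit test.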

Keep tests short

Tests need to be as short as possible, so they execute very quickly. A large application can easily contain a large test suite that has hundreds if not thousands of tests.

Tests should Test Only one Thing

Tests should be short, so they are easy to develop and easy to understand. Each test should only test one thing. For example, you may have a test that verifies an invalid parameter returns a 404 Not Found error rather than an unhandled 500 error.
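As a sketch of that 404 example (the `get_item` handler is hypothetical; in a real application it would live inside your web framework):

```python
def get_item(item_id, items):
    """Hypothetical request handler returning an HTTP status code."""
    if item_id not in items:
        # Return 404 Not Found rather than letting a KeyError surface
        # as an unhandled 500 error.
        return 404
    return 200


def test_invalid_id_returns_404():
    # One behaviour per test: just the missing-item case.
    assert get_item("nope", {"abc": "widget"}) == 404


def test_valid_id_returns_200():
    # The happy path gets its own separate test.
    assert get_item("abc", {"abc": "widget"}) == 200
```

Splitting the two cases into separate tests means a failure immediately tells you which behaviour broke.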

One Assert Per Test Method

If you use one assert, it's easier to tell exactly where things failed. A properly designed test should fail for only one reason, and that reason should be the assert.
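One way to sketch this is to split a test that would otherwise need two asserts into two named tests (the `parse_price` function is hypothetical):

```python
def parse_price(text):
    """Hypothetical parser returning a (currency, amount) pair."""
    return text[0], float(text[1:])


# One assert per test: when a test fails, its name alone says what broke.
def test_parse_price_extracts_currency():
    assert parse_price("$9.99")[0] == "$"


def test_parse_price_extracts_amount():
    assert parse_price("$9.99")[1] == 9.99
```

If the amount parsing regresses, only the second test fails, and the failure report points straight at the broken behaviour instead of stopping at the first of several asserts.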

Writing Tests Isn’t About Code Coverage

A significant portion of my development experience has been on the Force.com platform which basically requires 75% code coverage to deploy. I say basically because triggers only need to have 1% code coverage – but code shouldn’t really be in apex triggers. (See my blog post Apex Trigger Best Practices for details on that.)

You should track code coverage, but you shouldn’t aim for 100% code coverage or demand some arbitrarily high percentage either. The reason to track code coverage is to make sure that it’s never decreasing and always improving slightly with each change. I think asking for 50% code coverage for most classes is perfectly acceptable.

Mock all external dependencies

Mocks are very important, but it often makes more sense to design stubs instead. A stub is designed to stand in for a method or class. Sometimes it makes sense to stub an entire class, especially when doing integration tests.
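A minimal sketch of a hand-written stub standing in for an entire class (the payment gateway and `checkout` function are hypothetical):

```python
class StubPaymentGateway:
    """Hand-written stub standing in for a hypothetical external gateway."""

    def charge(self, amount):
        # Always approves; no network calls, no credentials, no side effects.
        return {"status": "approved", "amount": amount}


def checkout(gateway, amount):
    """Code under test: depends only on the gateway's interface."""
    receipt = gateway.charge(amount)
    return receipt["status"] == "approved"


def test_checkout_with_stubbed_gateway():
    assert checkout(StubPaymentGateway(), 25.00) is True
```

Unlike an auto-generated mock, the stub is an ordinary class, so it's easy to read, reuse across tests, and extend with canned failure cases.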

Tests should have their own setup and teardown

Tests should have their own setup and teardown, so they can potentially execute in parallel. The order in which tests execute is usually not guaranteed by testing frameworks, so you cannot depend on consistent side effects.
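In Python's unittest framework this looks like the following sketch, where each test builds and destroys its own scratch file rather than sharing state:

```python
import os
import tempfile
import unittest


class TestWithOwnFixture(unittest.TestCase):
    def setUp(self):
        # Each test creates its own private temp file...
        fd, self.path = tempfile.mkstemp()
        os.close(fd)

    def tearDown(self):
        # ...and removes it, so tests can run in any order, or in parallel.
        os.remove(self.path)

    def test_write_and_read(self):
        with open(self.path, "w") as f:
            f.write("hello")
        with open(self.path) as f:
            self.assertEqual(f.read(), "hello")
```

Because nothing persists between tests, no test can pass or fail based on what another test happened to leave behind.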

Add tests to the build process

Ideally, everything in the build process should be automated, as it results in more predictable builds. Continuous Integration and Continuous Delivery systems are excellent at automating this process and don't necessarily have to cost a lot of money. Jenkins is open source and can be set up fairly cheaply, running on an AWS EC2 instance with few problems.

As part of this constant build process, tests should be executed before anything is deployed to each environment. Most modern languages and systems provide testing frameworks or testing environments that can be used to do this.

Tests Should Execute Quickly and Track Failures

Tests must execute quickly so that developers can use them for fast feedback on changes. As a developer, I always need to know if my changes are going to cause a problem somewhere else. It's also important to track failures, so you can understand whether the team needs training, whether there's a problem with the environment, or whether there are flaky tests.


Brian is a software architect and technology leader living in Niagara Falls with 13+ years of development experience. He is passionate about automation, business process re-engineering, and building a better tomorrow.

Brian is a proud father of four, two boys and two girls, and has been happily married to Crystal for more than ten years. From time to time, Brian may post about his faith, his family, and definitely about technology.