Test-Driven Development with integration and unit tests: a pragmatic approach


Test-Driven Development (TDD) in its most dogmatic form (red-green-refactor in micro-iterations) can be tedious. It does not have to be this way! This guide shows a pragmatic approach with integration and unit tests that works in practice and improves productivity.

Advantages

  • No added effort: tests need to be written anyway.
  • Test titles serve as todo lists. You'll always know what is finished and what is left to do.
  • Big tasks are broken down into smaller tasks that can be processed one by one.
  • You will not forget a test.
  • You will not over-test stuff.

The flow of TDD

  1. Start by listing all relevant scenarios. Just the titles.
  2. Fill the scenarios with your expectations & implement them. Do so in whatever order works for you.
  3. When writing code that is not covered by an integration test, add unit tests and first list all relevant examples. Just the titles.
  4. Fill the examples with your expectations & implement them. Do so in whatever order works for you.
  5. Whenever you notice a new requirement, add it as a new scenario or example.

While following this process, you're constantly switching between three levels of abstraction: quite abstract "integration level", your code, and the detailed "unit level".

Hierarchical approach

TDD with integration and unit tests is a process with three abstraction levels:

  • Integration level
  • Code
  • Unit level

Integration tests give the code its rough form, whereas unit tests pinpoint specific behavior.

During development, you are frequently changing levels. It may look something like this:

Integration _                    ___
Code         \____    _____   __/   \__ ...
Unit test         \__/     \_/

Integration level

Start writing a feature file. List all usage scenarios you can think of:

Feature: <Concise description of the feature>

  <If it is a complex/questionable/very specific feature, add a motivation description or give a good overview.>

  Scenario: <Basic usage example, "happy path">

  # Add a scenario for each use case. Examples are: error behavior, access rights, contexts, closer looks at parts of the feature
  Scenario: ...

(This example uses Cucumber, but the approach works with any integration testing framework.)

If required, add more feature files in the same manner. From these, the reader should understand everything there is to know about the new feature you're building.

When you're done sketching the feature, start writing out the scenarios and implementing them. You may write the scenarios first and then implement, or jump right into the code and write the scenarios later. Whatever works for you – the scenario titles keep you on track.
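A written-out scenario might look like the following sketch. The password-reset feature and the step wording are made up for illustration; your scenarios will use whatever steps your project defines:

```gherkin
Feature: Password reset

  Scenario: User requests a password reset link
    Given a user with the email "alice@example.com"
    When she requests a password reset
    Then she receives an email with a reset link
```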

Code level

Here you write the code that the tests demand. Make sure to never write untested code! Long stretches of code may be covered by a single integration test. However, introducing a new method or class, or a new if statement, is an indicator that you might need to add unit tests.
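To illustrate, here is a minimal sketch. The PriceCalculator class and its discount cap are hypothetical; the point is that the new if branch introduces behavior an integration test is unlikely to exercise in every variation, which is the signal to drop down to the unit level:

```ruby
# Hypothetical example of code written while making an integration test green.
class PriceCalculator
  MAX_DISCOUNT = 0.5

  def initialize(base_price)
    @base_price = base_price
  end

  # Returns the price after applying a discount rate (e.g. 0.2 for 20% off).
  def discounted(rate)
    # This new `if` (capping the discount) is exactly the kind of branch
    # that calls for dedicated unit tests.
    rate = MAX_DISCOUNT if rate > MAX_DISCOUNT
    (@base_price * (1 - rate)).round(2)
  end
end
```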

Unit level

Unit tests make sure some isolated code works as expected. For each method you're testing, start by listing all usage examples you can think of:

describe ... do  
  it '<basic usage example>'
  it '<other usage example>'
  it '...'
end

From these, the reader should understand everything there is to know about that method.

When you're done listing examples, start writing them out and implementing them. As with the integration tests, the order of test-writing and implementation is up to you.
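Written out, the examples might look like this. The Slug class and its behavior are hypothetical, and the sketch uses Minitest's spec DSL (bundled with Ruby) so it runs standalone; the structure is the same with RSpec:

```ruby
require 'minitest/autorun'

# Hypothetical code under test: turns a title into a URL slug.
class Slug
  def self.generate(title)
    title.downcase.strip.gsub(/[^a-z0-9]+/, '-').gsub(/\A-|-\z/, '')
  end
end

describe Slug do
  it 'joins words with dashes' do
    _(Slug.generate('Hello World')).must_equal 'hello-world'
  end

  it 'strips characters that are not allowed in URLs' do
    _(Slug.generate('  50% off!  ')).must_equal '50-off'
  end
end
```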

When you're done, return to writing the code. The specced thing now exists, so you can continue making that integration test green.

License
Source code in this card is licensed under the MIT License.
Posted by Dominik Schöler to makandra dev (2020-04-23 14:43)