Based on a successful Assessment, we identified high value in adopting Clean Architecture, Test Automation, and Clean Code for backend applications.
Why coaching? Learning just by reading books or watching courses is often ineffective, because knowledge is acquired on simple examples and we don't know how to apply it to complex real-life codebases. This is where coaching helps: it provides on-the-job learning and feedback.
The following is our transformation process for backend applications, whether they are monoliths or microservices. The guide below is a very rough outline. In general, we take a "slicing" approach: instead of applying each of the transformations below across the entire codebase, we may apply it to a single module, to illustrate the full breadth of the transformation to the team.
Delivery Pipeline
Having a delivery pipeline is a "Level 0" requirement for reaching quality. The minimal requirements for a pipeline are an automated build, automated tests, and automated deployment. It is also recommended to set up a linter and SonarQube, because they help streamline code reviews. Furthermore, it is recommended to set up code coverage, which helps us identify code that is not covered by tests at all (and later add Mutation Testing to detect code where the tests execute the code but do not verify it - assertions are missing or inadequate).
Clean Architecture
In enterprise applications, the largest complexity and the highest maintenance cost lie in the business logic itself. We cannot refactor the business logic if there are no corresponding unit tests (fast-running, isolated, in-memory tests), and we cannot write unit tests if the application architecture is not testable. The "minimal" testable architecture is Hexagonal Architecture - isolating the business logic from infrastructure so that we can test the business logic in isolation from the REST API, the DB, and any other external concerns: Core, Driver Ports, Driver Adapters, Driven Ports, Driven Adapters. We introduce layers based on Clean Architecture: Presentation (REST API Controllers), Use Cases (Use Case Handlers), Domain (Entities, Repository Interfaces, External Service Interfaces, Clock Interface), and Infrastructure (Repository Implementations, External Service Implementations, Clock Implementation). Effectively, the Core is now "testable": it is ready to be unit-tested in isolation from any I/O concerns.
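The dependency direction above can be sketched in a few lines. This is a minimal illustration in Python, assuming a hypothetical order-placement feature - the names (`Order`, `OrderRepository`, `PlaceOrder`) are ours, not from any specific codebase; the point is that the use case depends only on the repository interface (a driven port), never on a concrete adapter.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

# Domain layer: an entity and a driven port (repository interface) - no I/O here.
@dataclass
class Order:
    order_id: str
    total: float

class OrderRepository(ABC):
    @abstractmethod
    def save(self, order: Order) -> None: ...

# Use Case layer: business logic depends only on the interface, not the DB.
class PlaceOrder:
    def __init__(self, repository: OrderRepository):
        self._repository = repository

    def execute(self, order_id: str, total: float) -> Order:
        if total <= 0:
            raise ValueError("Order total must be positive")
        order = Order(order_id, total)
        self._repository.save(order)
        return order

# Infrastructure layer: a driven adapter. A real DB-backed implementation
# would live here; an in-memory one is enough for unit testing the Core.
class InMemoryOrderRepository(OrderRepository):
    def __init__(self):
        self.orders = {}

    def save(self, order: Order) -> None:
        self.orders[order.order_id] = order
```

Because `PlaceOrder` knows only the `OrderRepository` interface, the Core can be exercised entirely in memory, which is exactly what makes it unit-testable.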
Test Automation
Having introduced Hexagonal Architecture into the monolith or microservice, we are now ready to introduce the relevant tests (retroactive tests - Test-Last Development). We may also use Code Coverage and Mutation Testing to help us write the missing tests (fill in the "gaps" in the test suite):
Unit Tests - targeting the Use Cases (this will constitute the majority of tests)
Integration tests - targeting the Presentation and Infrastructure (this constitutes the minority of tests)
System tests - spanning Presentation, Use Cases, Domain, and Infrastructure (very few tests)
Our approach regarding Unit Tests is based on the approach of the Classicist TDD school, aka Chicago style (Uncle Bob, Kent Beck, Martin Fowler), known as "sociable unit tests" where "mocking" is applied only to the architectural boundary (mocking out I/O sources and sources of non-determinism). More specifically, within Clean Architecture, the Unit Tests target Use Cases, as was illustrated by Uncle Bob.
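A sociable unit test in this style might look as follows. This is a minimal sketch under our own assumptions (a hypothetical `CreateInvoice` use case with a 30-day due date): only the architectural boundary - the repository and the clock, i.e. I/O and non-determinism - is replaced with in-memory fakes, while the business logic itself runs for real.

```python
from abc import ABC, abstractmethod
from datetime import date, timedelta

# Driven ports at the architectural boundary.
class Clock(ABC):
    @abstractmethod
    def today(self) -> date: ...

class InvoiceRepository(ABC):
    @abstractmethod
    def save(self, invoice: dict) -> None: ...

# The use case under test - real business logic, no I/O.
class CreateInvoice:
    def __init__(self, repository: InvoiceRepository, clock: Clock):
        self._repository = repository
        self._clock = clock

    def execute(self, amount: float) -> dict:
        invoice = {"amount": amount,
                   "due": self._clock.today() + timedelta(days=30)}
        self._repository.save(invoice)
        return invoice

# Test doubles only at the boundary (sources of I/O and non-determinism).
class FixedClock(Clock):
    def today(self) -> date:
        return date(2024, 1, 1)

class InMemoryInvoiceRepository(InvoiceRepository):
    def __init__(self):
        self.saved = []

    def save(self, invoice: dict) -> None:
        self.saved.append(invoice)

# The unit test itself: fast, isolated, in-memory.
repo, clock = InMemoryInvoiceRepository(), FixedClock()
invoice = CreateInvoice(repo, clock).execute(100.0)
assert invoice["due"] == date(2024, 1, 31)
assert repo.saved == [invoice]
```

Note that nothing inside the Core is mocked: if `CreateInvoice` delegated to Entities, those would run for real too, which is what makes the test "sociable".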
Clean Code
Since we have introduced automated tests above, we can safely refactor our code towards clean code; the tests serve as a safety net against regression bugs. Most of our refactoring focuses on the Use Cases and Entities, with a secondary focus on Infrastructure and Presentation. The sequence in which we refactor is typically as follows:
Compiler Warnings, SonarLint, Linter, SonarQube
Large classes and large methods, applying SOLID principles
Duplication across classes and methods, applying DRY principles
We move from simpler to more complex refactorings. After the foundations above, we can also move to more advanced refactorings, so that we get cleaner and cleaner code.
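To make the second step concrete, here is a small before/after sketch of extracting a large method into intention-revealing functions. The pricing example is hypothetical; the key property is that behavior is unchanged, which the existing tests confirm.

```python
# Before: one method mixing validation, calculation, and discounting.
def checkout_before(items, vip):
    if not items:
        raise ValueError("empty cart")
    total = 0.0
    for price, qty in items:
        total += price * qty
    if vip:
        total = total * 0.9
    return round(total, 2)

# After: each step extracted into a small, named function
# (single responsibility), composed by a thin top-level method.
def validate(items):
    if not items:
        raise ValueError("empty cart")

def subtotal(items):
    return sum(price * qty for price, qty in items)

def apply_discount(total, vip):
    return total * 0.9 if vip else total

def checkout_after(items, vip):
    validate(items)
    return round(apply_discount(subtotal(items), vip), 2)
```

The refactored version reads like a table of contents, and each extracted function is now a candidate for reuse, removing duplication elsewhere.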
Test Driven Development
In the steps above, we wrote tests last. Now we switch the sequence: test first, code second. The TDD cycle is Red-Green-Refactor: we start with a failing test, write code to make the test pass, then refactor the code and ensure the test still passes. We apply TDD for new development:
For new User Stories, we read the acceptance criteria and formulate the expectations in our mind. For each expectation, we write it in executable form as a test - an executable specification. We then write some code to satisfy the test, check that the test passes, and are then free to tidy up the code while ensuring the test still passes.
For new Bugs, we read the bug description and write a test which specifies the desired behavior, then verify that the test fails (this means we have reproduced the bug). We then fix the code and verify that the test passes, which means we have successfully implemented the bug fix. We can then also tidy up the code.
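The cycle for a new User Story can be sketched in miniature. The bulk-discount rule below is a hypothetical acceptance criterion, invented for illustration; the order of the steps is what matters.

```python
# Step 1 (Red): the executable specification, written before any code.
# Running this first fails with a NameError - price_for does not exist yet.
def test_bulk_discount():
    assert price_for(units=10) == 90.0   # 10% off at 10 or more units
    assert price_for(units=5) == 50.0    # no discount below 10 units

# Step 2 (Green): the simplest code that satisfies the specification.
def price_for(units, unit_price=10.0):
    total = units * unit_price
    return total * 0.9 if units >= 10 else total

# Step 3 (Refactor): with the test passing, the code can be tidied safely.
test_bulk_discount()
```

For a bug, the only difference is Step 1: the failing test reproduces the reported misbehavior instead of specifying a new feature.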
Domain Driven Design
Most likely, the existing project has an anemic domain, whereby the Entities are just data structures with getters and setters, without behavior. Quite often, these entities are ORM-entities. We refactor code by moving behavior from Use Cases to Entities where appropriate. The result is thin Use Cases and rich Entities. We can do this in a safe way because we have the protection of the Unit Tests targeting Use Cases, which continue to remain stable during our transition to a rich domain.
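The move from anemic to rich can be sketched as follows. The account example is hypothetical; the point is that the invariant ("no overdraft") migrates from the use case into the Entity, which then guards its own state.

```python
# Anemic: the entity is a bag of data; the use case manipulates it directly.
class AnemicAccount:
    def __init__(self, balance=0):
        self.balance = balance

def withdraw_use_case(account: AnemicAccount, amount: int) -> None:
    # Business rule lives in the use case, far from the data it protects.
    if amount > account.balance:
        raise ValueError("insufficient funds")
    account.balance -= amount

# Rich: the behavior and its invariant live inside the entity.
class Account:
    def __init__(self, balance=0):
        self._balance = balance

    @property
    def balance(self) -> int:
        return self._balance

    def withdraw(self, amount: int) -> None:
        if amount > self._balance:
            raise ValueError("insufficient funds")
        self._balance -= amount

# The use case thins out to orchestration: it delegates to the entity.
def withdraw(account: Account, amount: int) -> None:
    account.withdraw(amount)
```

Because the Unit Tests target the use case, both versions satisfy the same tests, which is what lets us perform this migration safely.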
The outcomes of the transformation are reduced maintenance costs and fewer bugs in the real product:
Transformed from CRUD to Clean Architecture - separating business logic from I/O delivery mechanisms
Transformed from Manual Debugging to Test Automation - provides a safety net against regression bugs, reduces the need for debugging and manual testing
Transformed from Unmaintainable Code to Clean Code - reduces the cost of reading existing code, reduces the cost of making a change
Transformed from Anemic Domain to Rich Domain - improves the communication between domain experts and developers through domain modeling
Note: Our goal through coaching is to guide the team through the above on their real codebase. We adapt the speed of the transformation to the actual team. We thus cannot guarantee how much of the above will get done, as it varies on a project-by-project and team-by-team basis. Improvement is expected, but the extent of improvement is beyond our control. The minimum we strive for is that the team introduces Clean Architecture at least in its minimal form so that Unit Tests can be written, in order for us to demonstrate Refactoring to Clean Code. This is our #1 focus and we try to demonstrate it on multiple examples. As for adopting TDD and DDD, for most teams this is not achievable within months 1-3 due to the extent of the existing legacy codebase, so quite often we postpone them to months 4-6.