What does this mean?

Let’s have a look at the Scrum Guide. It states that in order to provide value, the Increment must be usable, and that the Definition of Done is a formal description of the state of the Increment when it meets the quality measures required for the product to be released.

This essentially confirms two statements from the Agile Manifesto: working product over comprehensive documentation (a value) and working product as the primary measure of progress (a principle).

For that reason, the only thing that can be demonstrated in the Sprint Review is working product. Nothing else! (That is why I don’t allow PowerPoint slides in my Sprint Reviews.)


Coding:

- Everything is under version control
- Clean Code (coding standards, design patterns, passes SonarQube)
- Merged into Trunk

Testing:

- All PBI acceptance criteria are met
- Unit Tests for all private functions with 95% coverage (automated)
- Unit Tests for all public functions with 100% coverage (automated)
- BDD for important user scenarios (automated)
- Integration tests with Mock objects successful
- Integration tests with real systems successful
- Accepted by Product Owner before Sprint Review

Documentation:

- API
- Architecture (at most 10 pages in total for the whole product)
- Scenario Use Cases (see BDD in Testing)
- User Manual

NFR (non-functional requirements):

- Regulatory compliance work
- Works for color blind people
- American English
- Works on 1024×768 display

Figure 1: Example Definition of Done
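To make the Testing items in Figure 1 a bit more concrete, here is a minimal sketch in Python of the kind of automated check they describe. Everything in it (convert_amount, the rate_client dependency) is invented for illustration; the article does not prescribe a language or toolset.

    # Minimal illustrative sketch; names are invented, not taken from the article.
    from unittest import TestCase, main
    from unittest.mock import Mock


    def convert_amount(amount, source, target, rate_client):
        """Public function under test; the rate lookup is an external system."""
        return amount * rate_client.get_rate(source, target)


    class ConvertAmountTest(TestCase):
        def test_converts_using_current_rate(self):
            # "Integration tests with Mock objects": the real exchange-rate
            # service is replaced by a Mock, so the test is fast and repeatable.
            rate_client = Mock()
            rate_client.get_rate.return_value = 1.10

            self.assertAlmostEqual(
                convert_amount(100.0, "EUR", "USD", rate_client), 110.0
            )
            rate_client.get_rate.assert_called_once_with("EUR", "USD")


    if __name__ == "__main__":
        main()

Coverage items such as “Unit Tests for all public functions with 100% coverage (automated)” would then be enforced by the build rather than by any single test, for example with a coverage gate in CI (pytest-cov’s --cov-fail-under option is one way to do that).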

This Definition of Done will most likely deliver a product of high technical quality with every release.

The challenge is that most organizations or teams don’t possess the engineering capabilities to achieve that.
What’s the quick fix? Lower the standard of your Definition of Done and pick only the elements you can fairly easily achieve.

Let’s say company Rabbit decides to keep its current output per Sprint constant (say, 10 Product Backlog items) and therefore ‘adjusts’ the Definition of Done to match its current capabilities. The Scrum Team also promises to raise the Definition of Done as its engineering practices mature (which usually never really happens, but that is another story).


Coding:

- Coded
- Everything is under version control
- Clean Code (coding standards, design patterns, passes SonarQube)
- Merged into Trunk

Testing:

- All PBI acceptance criteria are met
- Unit Tests for all public functions with 50% coverage
- ~~Unit Tests for all private functions with 95% coverage (automated)~~
- ~~Unit Tests for all public functions with 100% coverage (automated)~~
- BDD for important user scenarios (automated)
- Integration tests with real systems if possible
- Integration tests with Mock objects successful
- ~~Integration tests with real systems successful~~
- Accepted by Product Owner before Sprint Review

Documentation:

- API
- Architecture (at most 10 pages in total for the whole product)
- Scenario Use Cases (see BDD in Testing)
- User Manual

NFR (non-functional requirements):

- Regulatory compliance work
- Works for color blind people
- American English
- Works on 1024×768 display

Figure 2: Adjusted, weak Definition of Done
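To see what the weakened Testing items mean in practice, here is another small, invented Python sketch (nothing in it comes from the article): with a 50% coverage bar, a single happy-path test satisfies the gate while the error-handling branches ship without ever having been executed.

    # Invented example: a public function with one happy path and two error branches.
    def parse_quantity(raw):
        if raw is None or raw.strip() == "":
            raise ValueError("quantity is required")           # never executed by the test
        value = int(raw)                                        # raises on non-numeric input
        if value < 0:
            raise ValueError("quantity must not be negative")   # never executed either
        return value


    def test_parse_quantity_happy_path():
        # Covers only the happy path; both error branches stay unexecuted,
        # yet a 50% coverage gate passes comfortably.
        assert parse_quantity("3") == 3

Those unexercised branches are exactly the kind of code that tends to fail first in production.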

Company Rabbit keeps completing the same amount of product every Sprint as before. Everyone is happy until the next release: the freshly released system keeps crashing and data gets corrupted. Immediate hotfixes only partially mitigate the situation. Important key accounts complain and threaten to cancel their contracts. The next release, even though it was delayed for manual testing, is even worse. Often, a vicious cycle commences.

Company Tortoise has principles about quality and does not compromise. They are honestly shocked when they discover that their engineering capabilities are not as strong as they had thought. Since quality is non-negotiable for them, they decide to reduce the amount of work per Sprint until they can reliably meet the Definition of Done.

Their throughput drops from 10 Product Backlog items to 2 for the next three Sprints. However, as they deliberately practice good engineering, they learn and find ways to improve. After five Sprints they can complete 4 PBIs, after eight Sprints they are at 6 PBIs, and after 12 Sprints they are back to 10 PBIs per Sprint.

Because of their improved engineering capabilities they doubled their release frequency after three Sprints and could react to customer wishes even faster. Now they are at 3 releases per Sprint and complete on average 14 PBIs per Sprint.

In short: company Rabbit focuses on resource efficiency and output, whereas company Tortoise focuses on customer outcome and flow efficiency.

Again, your Definition of Done is a vertical: a thin vertical slice, a little piece of product, all the way through.
