Sunday 8 December 2013

ATDD - Acceptance Test Driven Development

ATDD (Acceptance Test Driven Development) is an advanced XP practice that bases software development on testing against the user acceptance criteria.
Our Agile team used ATDD and experienced up to an 80% reduction in defects and rework time (less waste, higher productivity). It also helped us establish better trust and a stronger relationship between IT and business stakeholders.
According to Amir Kolsky in Dean Leffingwell's "Agile Software Requirements", the savings can be explained using simple maths.
If T = time to write the test, C = time to write the code, and H = time to hook the test into the code, then:
Time to complete a story if you write the test first = T + C + H
and
Time to complete a story if you don't = C + T + H + R
where R is the rework time needed to make the code pass the test once the test is finally understood and available.
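As a quick illustration (the hour values below are hypothetical, chosen only to make the point, not taken from the book), the difference between the two expressions is exactly the rework time R:

$$
\begin{aligned}
\text{Test first:} \quad & T + C + H = 2 + 4 + 1 = 7 \text{ hours}\\
\text{Test after:} \quad & C + T + H + R = 4 + 2 + 1 + 3 = 10 \text{ hours}\\
\text{Saving:} \quad & (C + T + H + R) - (T + C + H) = R = 3 \text{ hours}
\end{aligned}
$$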

How to make ATDD work:
Customer Perspective:
Identify the customers/users. Use mind mapping to understand their influence and network.
Use personas to understand their personalities and plan your communication strategy accordingly.
Maintain ongoing dialogue and engagement with the stakeholders.

Process Perspective:
Plan and schedule the requirements workshops and notify the right stakeholders well in advance.
Send out the reference material and questionnaire, and convey the expectations/outcomes for the session.
When stories are captured in the workshops, define the acceptance criteria in detail. Ask for examples and scenarios; these will also help you build the test data.
Translate the acceptance criteria into test scenarios. The scenarios should cover the system behaviour encountered in actual use, from both a usability and a functionality perspective.
Organise another workshop to review the test scenarios in detail. If possible, build a prototype so that the customer gets the look and feel of the product well in advance.
Once the scenarios are finalised, start writing the automated test suite (a small sketch of such a test follows this list). The design and build can proceed in parallel if pair test-programming is used.
Book sessions with the customers and demonstrate the functionality progressively, as and when the build for a test scenario is ready and green in the automated test suite.
Consistently demo the working software and gather feedback. Plan for at least three review sessions, depending on the number of scenarios.
Organise the acceptance test session and run through all the test scenarios. In most cases the UAT runs smoothly and sign-off can be obtained immediately.
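To make the "automated test suite" step concrete, here is a minimal sketch in Python using pytest. The story, acceptance criteria, the Account class and its transfer_to method are all invented for illustration; in a real ATDD flow the tests would drive your actual production code rather than a stub.

import pytest

# Hypothetical domain stub standing in for the real system under test.
class Account:
    def __init__(self, balance):
        self.balance = balance

    def transfer_to(self, other, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        other.balance += amount


# Scenario 1 (agreed with the customer): a transfer within the available
# balance moves the money between the two accounts.
def test_transfer_within_balance_moves_money():
    source, target = Account(100), Account(0)
    source.transfer_to(target, 40)
    assert source.balance == 60
    assert target.balance == 40


# Scenario 2: a transfer above the available balance is rejected and
# both balances stay unchanged.
def test_transfer_above_balance_is_rejected():
    source, target = Account(100), Account(0)
    with pytest.raises(ValueError):
        source.transfer_to(target, 150)
    assert source.balance == 100
    assert target.balance == 0

Each test maps one-to-one to a scenario agreed with the customer, so a green suite is a direct statement that the acceptance criteria are met.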

Challenges to watch out for:
Requirements/scenarios may change more frequently because you are constantly engaged with the customers.
Black-box testing may take a back seat, as technical tests are not visible to the business users.
The process requires a change in team mindset and culture.
The practice is better suited to applications that interact with users directly than to software that does not.