Case Studies
Acceptance Test-Driven Development / Behavior Driven Development can increase the effectiveness of teams. This has been demonstrated numerous times. Here are some case studies.
Critical Success Factor
At a financial firm where I taught and coached ATDD and TDD, here are the results as reported by one team:
One of the critical success factors for our project was our adoption of ATDD and TDD, including frequent test collaboration, engaging the business in writing acceptance test criteria, and robust test automation. Collaborating on writing tests ensured that the entire team understood all stories, which facilitated swarming and cross-skill development. We got questions answered by the business in real time, rather than in the one-off conversations that used to cause system test failure rework. We saw dramatically reduced failures during exploratory tests, which further stabilized velocity. We added zero manual test cases to our regression bed and had high quality in production.
ATDD Saves a Release
Here is a report from a team at a leading financial company:
Our project team was formed to deliver a business-critical function with a fixed delivery date. Many of the team members were new to the line of business that we were asked to support. Early in the project, Acceptance Test-Driven Development, and more specifically Cucumber, was a useful tool for capturing and documenting the expectations of the system. The business team was able to communicate requirements in familiar terms, even if Gherkin was a new syntax. The IT team was able to understand table-driven examples while learning the business terminology. The two collaborating sides of our project were never completely out of their comfort zones.
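To give a feel for what the team is describing, here is a minimal sketch of a table-driven Gherkin scenario. The feature, steps, and figures are hypothetical stand-ins, not drawn from the team's actual suite:

    Feature: Wire transfer fees

      Scenario Outline: Fee depends on the transfer amount
        Given a customer with a standard checking account
        When the customer wires <amount> dollars to another bank
        Then the transfer fee is <fee> dollars

        Examples:
          | amount  | fee   |
          | 100.00  | 25.00 |
          | 5000.00 | 45.00 |

Business readers can review the Examples table in their own terms, while developers hook each step to automation in Cucumber.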
Having an automated test suite based on Cucumber gives both our IT team and business team the confidence to release our software to our production environment more frequently than other project teams, and without the long, manual regression test cycles typical of a company of our size. Our project team knows immediately when a software change unintentionally impacts another part of the system because our automated test suite is integrated into our continuous build environment. This point was clearly on display during a Sprint Review meeting with senior leadership, when a series of tests failed during a demo of our application, caused by a pre-demo check-in from a project team member. In response, we modified the demo to show the failing tests and how the software would be corrected to prevent defects from leaking into our production baseline.
It's worth calling out one specific example of a Cucumber success story for our project. An entire software release was saved because we were able to debug a troublesome piece of business logic through the creation of a comprehensive, production-like integration test using Cucumber. In this case, we had a series of JUnit tests that were passing, but an integration scenario was failing. The JUnit tests, with their use of mock objects and data, were not comprehensive enough to uncover a hidden bug in the logic. Through the creation and debugging of a real-world example using Cucumber, we were able to find the flaw in our logic, correct the defect, and elevate the release on schedule.
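As an illustration of the kind of production-like Cucumber scenario the team credits (the domain and numbers here are hypothetical, not the team's actual logic), an end-to-end test drives the real collaborating components rather than mocks:

    Feature: End-of-day position report

      Scenario: Same-day buys and sells are combined into one position
        Given account "A-1001" holds 200 shares of "XYZ"
        And a buy of 100 shares of "XYZ" settles today
        And a sell of 50 shares of "XYZ" settles today
        When the end-of-day position report is generated
        Then the report shows 250 shares of "XYZ" for account "A-1001"

Because the steps exercise the actual services and data flow, a flaw in how the pieces work together can surface here even when each component's mock-based unit tests pass.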
Totally Worth It
Here's a report from one team:
The product owner was originally skeptical going into training, but she is clearly a fan now, stating that it is "totally worth it" to do ATDD. Her thoughts were mostly about the effects of the workflow changes; automated testing was more of an afterthought. She could clearly see and articulate the benefits of simply implementing the workflow change associated with ATDD. She said that one day of ATDD-specific training, plus a half day of coaching, plus a committed team is all you need to get going. Benefits she cited included:
- Team "happiness factor" increased
- Specifically, the lead developer and tester are much happier
- Less stress on testers, more distributed testing effort across the sprint
- Helped to create/enhance the "we are a team" feeling
- Fewer production defects
- Fewer test environment defects
What ATDD/BDD Solves
I've been teaching Acceptance Test-Driven Development (ATDD) / Behavior Driven Development (BDD) for many years. ATDD/BDD involves the triad (customer, developer, and tester) collaborating on defining the details of a requirement or user story in the form of tests. These tests are written prior to coding, and the developer uses them to check that the code behaves as specified. The tests may be run manually or they may be automated. (A sketch of what such tests can look like follows the list below.) At the start of every course, I ask the attendees for issues they have with their development processes. After they have experienced the ATDD/BDD process, including the creation of acceptance tests, I review their issues to see whether they feel that ATDD/BDD will help, hurt, or be neutral with respect to those issues. Here's a sampling of the issues in the attendees' own words. According to them, ATDD will help in solving:
- Unclear requirements
- Missed requirements
- No detailed requirements
- Development without requirements
- Unclear business rules
- Poorly defined acceptance criteria
- Huge user stories
- Smaller stories leave gaps
- Not enough detail in stories
- Not enough testers
- Missing test cases
- Not enough time for testing
- Stories not verifiable until end of the sprint
- Test data not available
- Issues with integration tests
- Coding starts before tests are written
- Development team not understanding business process
- Acceptance tests written from a coding perspective, not a testing perspective
- Communication challenges - IT is technical, business is not technical
If you have any of these issues, there's a great chance that adopting ATDD will help in solving them.
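As a concrete, and purely hypothetical, illustration of what the triad produces, a small user story might carry acceptance tests like these, agreed on before any code is written:

    Feature: Password reset
      # Story: As a registered user, I want to reset my password
      # so that I can regain access to my account.

      Scenario: Reset link is sent for a registered email address
        Given a registered user with the email "pat@example.com"
        When a password reset is requested for "pat@example.com"
        Then a reset link is emailed to "pat@example.com"

      Scenario: Unregistered email address receives no link
        Given no registered user with the email "nobody@example.com"
        When a password reset is requested for "nobody@example.com"
        Then no email is sent

Tests like these give the story crisp completion criteria: the story is done when the agreed scenarios pass, whether they are executed manually or through automation.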
Benefits of ATDD/BDD
In my book, Lean-Agile Acceptance Test-Driven Development: Better Software through Collaboration, I have reports from many people on how ATDD has benefited them. Here's a summary of those benefits:
- Rework Down from 60% to 20%
- Little Room for Miscommunication
- Workflows Working First Time
- Getting Business Rules Right
- Crisp Visible Story Completion Criteria
- Tighter Cross-Functional Team Integration
- Game Changing
- Saving Time
- Automation Yields Reduced Testing Time