The problem with reporting is that we have two "masters" for our reports:
- Product - to ensure the product has been tested for each feature or deployment.
- Management - to show how the organization is performing and to provide a measure of how well individuals are working.
The management reports are usually pretty easy. Most groups use some kind of test management system, and pulling test cases and test runs by tester is something any good test management system can give you. In an agile environment, we sometimes substitute tickets completed and sub-bugs opened, but whatever the method, we are able to have a consistent metric that all test engineers use in their scrums.
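For teams pulling this metric from ticket data rather than a test management tool, the per-tester rollup is a simple aggregation. A minimal sketch, assuming a hypothetical list of ticket records with `assignee`, `status`, and `type` fields (none of these names come from an actual tool):

```python
from collections import Counter

def per_tester_metrics(tickets):
    """Count completed tickets and opened sub-bugs per tester.

    `tickets` is a list of dicts with hypothetical keys:
    assignee, status ("done" when completed), type ("sub-bug" or other).
    """
    completed = Counter()
    sub_bugs = Counter()
    for t in tickets:
        if t["status"] == "done":
            completed[t["assignee"]] += 1
        if t["type"] == "sub-bug":
            sub_bugs[t["assignee"]] += 1
    return completed, sub_bugs

# Illustrative data only.
tickets = [
    {"assignee": "alice", "status": "done", "type": "story"},
    {"assignee": "alice", "status": "done", "type": "sub-bug"},
    {"assignee": "bob", "status": "open", "type": "sub-bug"},
]
done, bugs = per_tester_metrics(tickets)
print(done["alice"], bugs["bob"])  # 2 1
```

Whatever the source system, the point is the same: a consistent, mechanical count that every tester's scrum update can share.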
The product reports are a little harder, depending on how you work. For a big bang project, the daily/interval reports are easy, but for agile and feature deployments it has been a struggle for us. The issue we ran into is that we are now doing small features that don't really lend themselves to the full pass/fail test case execution reports we had in a big bang project.
A sample of one of our big bang project reports looked something like this:
Testing Summary

| Testing Type | # Planned | # Actual | Passed | Blocked | Retest | Failed | N/A | No Run |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Ads/Meetrics | 264 | 264 | 211 | 0 | 8 | 8 | 37 | 0 |
| Alerts | 716 | 716 | 652 | 0 | 0 | 48 | 16 | 0 |
| Articles/TV/Hurricane/Tornado | 228 | 228 | 214 | 0 | 0 | 2 | 12 | 0 |
| Commuter Forecast | 371 | 371 | 342 | 0 | 0 | 0 | 29 | 0 |
| Haircast/Farmer/Fishing/Flu | 2806 | 2806 | 2661 | 0 | 0 | 0 | 145 | 0 |
| Header/Footer/Navigation/Search/Location | 1290 | 1290 | 1232 | 1 | 4 | 6 | 47 | 0 |
| Homepage | 1850 | 1850 | 1482 | 0 | 9 | 9 | 350 | 0 |
| Maps | 1480 | 1480 | 1436 | 0 | 1 | 6 | 37 | 0 |
| Social Share | 574 | 574 | 531 | 0 | 0 | 1 | 42 | 0 |
| Titan | 652 | 652 | 652 | 0 | 0 | 0 | 0 | 0 |
| User Profile/UGC | 1480 | 1480 | 1377 | 0 | 8 | 37 | 58 | 0 |
| Video | 2384 | 2384 | 1586 | 0 | 0 | 106 | 692 | 0 |
| Today/Hourly/Weekend/Day | 2006 | 2006 | 2006 | 0 | 0 | 0 | 0 | 0 |
| Pages | 596 | 596 | 596 | 0 | 0 | 0 | 0 | 0 |
| Video - 10/22 AMP Build | 816 | 816 | 753 | 0 | 0 | 1 | 62 | 0 |
| Automated Test | 2399 | 2399 | 2378 | 0 | 0 | 21 | 0 | 0 |
| Total | 19912 | 19912 | 18109 | 1 | 30 | 245 | 1527 | 0 |
| % Complete | | 100.0% | 90.9% | 0.0% | 0.2% | 1.2% | 7.7% | 0.0% |

Failure rate goal is <5%.
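The percentage row is just each status count divided by the planned total. A quick sketch of that arithmetic, using the totals from the report above (the function itself is illustrative, not our actual reporting code):

```python
def summary_percentages(counts, planned):
    """Return each status count as a percentage of planned tests."""
    return {status: round(100.0 * n / planned, 1) for status, n in counts.items()}

# Totals from the report's bottom row.
totals = {"Actual": 19912, "Passed": 18109, "Blocked": 1,
          "Retest": 30, "Failed": 245, "N/A": 1527, "No Run": 0}
pct = summary_percentages(totals, planned=19912)
print(pct["Passed"], pct["Failed"])  # 90.9 1.2
failure_rate_met = pct["Failed"] < 5.0  # failure rate goal is <5%
```

Running the same division for every column reproduces the percentage row, and the last line is the <5% failure-rate goal as an explicit check.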
This is a really good report for management. They get a feel for the number of tests being run and what the failure rate is. The tough part is that this format doesn't lend itself to agile.
We have started using milestone reports for our major features in our scrums. The milestone reports are based on the milestone feature in TestRail.
This is similar to the testing summary report above, but a bit easier to generate because it comes straight from our test management tool. My folks are not big fans of having to write test cases and document execution during the sprint, but they see the value of it when product or management recognizes the work and appreciates the status.
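For anyone curious how milestone numbers like these can be pulled programmatically, TestRail exposes per-run status counts through its REST API. A minimal sketch, assuming the v2 `get_runs` endpoint with a `milestone_id` filter and the `passed_count`-style fields it returns on each run; check your TestRail version's API documentation before relying on exact field names or pagination behavior:

```python
import json
import urllib.request

STATUS_FIELDS = ("passed_count", "blocked_count", "retest_count",
                 "failed_count", "untested_count")

def summarize_runs(runs):
    """Roll per-run status counts up into one milestone summary."""
    summary = {field: 0 for field in STATUS_FIELDS}
    for run in runs:
        for field in STATUS_FIELDS:
            summary[field] += run.get(field, 0)
    return summary

def fetch_milestone_runs(base_url, project_id, milestone_id, auth_header):
    """Fetch runs for a milestone from TestRail's REST API (v2).

    base_url like "https://example.testrail.io" -- a hypothetical host.
    Newer TestRail versions paginate and wrap the list in a "runs" key.
    """
    url = (f"{base_url}/index.php?/api/v2/get_runs/{project_id}"
           f"&milestone_id={milestone_id}")
    req = urllib.request.Request(url, headers={
        "Content-Type": "application/json",
        "Authorization": auth_header,  # basic auth with user:API-key
    })
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Offline example: two runs' counts rolled into one summary.
runs = [
    {"passed_count": 40, "failed_count": 2, "untested_count": 3},
    {"passed_count": 25, "failed_count": 0, "untested_count": 0},
]
print(summarize_runs(runs))
```

The rolled-up counts map directly onto the columns of the testing summary table, which is why the milestone report feels familiar to anyone who has seen the big bang version.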
One big issue is that the tool doesn't show the automated test results. We run automated tests on our test and production environments at least once a day and display the results on a dashboard that shows our current status on one of the TVs in our area.
At first the dashboard was kind of hard to read, but more people are reviewing the results and seeing the value in keeping our failures under 1% prior to release. Combined with a milestone report, product owners can see whether their features are ready to go.
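The "under 1% before release" rule is simple enough to encode as an explicit gate. A sketch, assuming pass/fail counts from the daily automated runs; the function names are illustrative, and the threshold mirrors the goal mentioned above:

```python
def failure_rate(failed, total):
    """Fraction of automated tests that failed; 0.0 when nothing ran."""
    return failed / total if total else 0.0

def ready_for_release(failed, total, threshold=0.01):
    """True when the automated failure rate is under the 1% goal."""
    return failure_rate(failed, total) < threshold

# Example: 5 failures out of 2399 automated tests (~0.2%) passes the gate.
print(ready_for_release(5, 2399))  # True
```

Making the gate a function rather than a judgment call is part of what helps product owners trust the dashboard: the same number everyone sees on the TV is the number that blocks or clears a release.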
Reporting results and work is still a struggle throughout our organization, but we are trying to do better and provide people with useful information. I am interested to know what other people are using for their reports. If you have a report that works well within your agile scrums, please let me know. I'm always looking for a better way of doing something.