Friday, April 10, 2015

More on test metrics

Last post I wrote a bit about test metrics for our scrums.  It was pretty light, but it started the conversation.  Since then, we have met a couple of times to continue that conversation and narrow down what we want to report.

The problem with reporting is that we have two "masters" for our reports:

  • Product - to ensure the product has been tested for each feature or deployment.
  • Management - to show how the organization is working and to provide a measure of how well individuals are performing.

The management reports are usually pretty easy.  Most groups use some kind of test management system, and pulling test cases and test runs by tester is something any good test management system can give you.  In an agile environment, we sometimes substitute tickets completed and sub-bugs opened, but whatever the method, we have a consistent metric that all test engineers use in their scrums.
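As a sketch of what that per-tester pull looks like once the records are out of the tool: TestRail's `get_results_for_run` endpoint, for example, returns a list of result dicts that include a `created_by` user ID.  The field names and sample data below are illustrative, not pulled from our actual system:

```python
# Sketch: tally logged test results per tester from records already pulled
# out of a test management tool (e.g. TestRail's get_results_for_run
# endpoint returns result dicts with "created_by" and "status_id" fields).
# The sample data below is made up for illustration.
from collections import Counter

def results_by_tester(results):
    """Count how many results each tester ID logged."""
    return Counter(r["created_by"] for r in results)

sample = [
    {"created_by": 7, "status_id": 1},   # tester 7, passed
    {"created_by": 7, "status_id": 5},   # tester 7, failed
    {"created_by": 12, "status_id": 1},  # tester 12, passed
]
print(results_by_tester(sample))  # Counter({7: 2, 12: 1})
```

The same tally works whether the unit of work is test results, tickets completed, or sub-bugs opened; only the field you count by changes.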

The product reports are a little harder, depending on how you work.  For a big bang project, the daily/interval reports are easy, but for agile and feature deployment, it was a struggle for us.  The issue we ran into is that we are now delivering small features that don't really lend themselves to the full pass/fail test case execution reports we had in a big bang project.

A sample of one of our big bang project reports looked something like this:

Testing Summary
| Testing Type | # Planned | # Actual | Passed | Blocked | Retest | Failed | N/A | No Run |
|---|---|---|---|---|---|---|---|---|
| Ads/Meetrics | 264 | 264 | 211 | 0 | 8 | 8 | 37 | 0 |
| Alerts | 716 | 716 | 652 | 0 | 0 | 48 | 16 | 0 |
| Articles/TV/Hurricane/Tornado | 228 | 228 | 214 | 0 | 0 | 2 | 12 | 0 |
| Commuter Forecast | 371 | 371 | 342 | 0 | 0 | 0 | 29 | 0 |
| Haircast/Farmer/Fishing/Flu | 2806 | 2806 | 2661 | 0 | 0 | 0 | 145 | 0 |
| Header/Footer/Navigation/Search/Location | 1290 | 1290 | 1232 | 1 | 4 | 6 | 47 | 0 |
| Homepage | 1850 | 1850 | 1482 | 0 | 9 | 9 | 350 | 0 |
| Maps | 1480 | 1480 | 1436 | 0 | 1 | 6 | 37 | 0 |
| Social Share | 574 | 574 | 531 | 0 | 0 | 1 | 42 | 0 |
| Titan | 652 | 652 | 652 | 0 | 0 | 0 | 0 | 0 |
| User Profile/UGC | 1480 | 1480 | 1377 | 0 | 8 | 37 | 58 | 0 |
| Video | 2384 | 2384 | 1586 | 0 | 0 | 106 | 692 | 0 |
| Today/Hourly/Weekend/Day | 2006 | 2006 | 2006 | 0 | 0 | 0 | 0 | 0 |
| Pages | 596 | 596 | 596 | 0 | 0 | 0 | 0 | 0 |
| Video - 10/22 AMP Build | 816 | 816 | 753 | 0 | 0 | 1 | 62 | 0 |
| Automated Test | 2399 | 2399 | 2378 | 0 | 0 | 21 | 0 | 0 |
| Total | 19912 | 19912 | 18109 | 1 | 30 | 245 | 1527 | 0 |
| % Complete |  | 100.0% | 90.9% | 0.0% | 0.2% | 1.2% | 7.7% | 0.0% |
Failure rate goal is <5%
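The percentage row is just each result count divided by the planned total.  A quick sketch of the arithmetic, using the Total row from the report above:

```python
# Recompute the summary percentages from the report's Total row.
planned = 19912
totals = {"passed": 18109, "blocked": 1, "retest": 30,
          "failed": 245, "n/a": 1527, "no_run": 0}

# Each percentage is the count over the planned total, to one decimal.
pct = {k: round(100 * v / planned, 1) for k, v in totals.items()}
print(pct["passed"])  # 90.9
print(pct["failed"], pct["failed"] < 5.0)  # 1.2 True -- under the 5% goal
```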

This is a really good report for management.  They get a feel for the number of tests being run and what the failure rate is.  The tough part is that this doesn't lend itself to agile.

We have started using milestone reports for our major features in our scrums.  The milestone reports are based on the milestone feature in TestRail.



This is similar to the testing summary report above, but a bit easier to generate because it comes from our test management tool.  My folks are not big fans of having to write test cases and document execution during the sprint, but they see the value of it when product or management recognizes the work and appreciates the status.

One big issue is that the tool doesn't show the automated test results.  We run automated tests on our test and production environments at least once a day and display the results on a dashboard that shows our current status on one of the TVs in our area.


At first it was kind of hard to read, but more people are reviewing the results and see the value in keeping our failures under 1% prior to release.  Combined with a milestone report, product owners can see if their features are ready to go.
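The check implied by that threshold is simple enough to sketch.  The function below is hypothetical, not anything we actually run, but the under-1% automated-failure bar is the one described above:

```python
# Hypothetical release gate: automated failure rate must stay under 1%.
def ready_for_release(failed, total, threshold=0.01):
    """True if the automated failure rate is below the threshold."""
    return total > 0 and failed / total < threshold

print(ready_for_release(3, 500))  # True  (0.6% failures)
print(ready_for_release(8, 500))  # False (1.6% failures)
```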

Reporting results and work is still a struggle throughout our organization, but we are trying to do better and provide people with useful information.  I am interested to know what other people are using for their reports.  If you have a report that works well within your agile scrums, please let me know.  I'm always looking for a better way of doing something.
