Every now and again, I'll start reading something expecting one thing, only to realize partway through that the author is describing a related topic, but one different enough to make me wonder whether it's me or the author who is mixing up the terminology.
The latest example I found is in the August issue of Healthcare Informatics, a monthly magazine about the healthcare information systems industry targeted at CIOs. In the Clinical Update section, the Editor-in-Chief (kind of a lofty title), Mark Hagland, wrote about a Leapfrog Group CPOE (computerized physician order entry) study with the subtitle "Leapfrog Leaders Discuss CPOE Performance-Testing Results". Reading that subtitle, I immediately expected an article about average transaction times against plan, or other performance-testing SLA reporting. Instead, the article discussed the study's findings that medication orders were not triggering appropriate warnings or alerts, which is more of an integration test than a performance test. I understand that in the IT and software world, testing terminology is not consistent and terms are used interchangeably to mean various things (e.g., Quality Assurance vs. Quality Control or testing), but in my opinion, mixing up a common term like performance testing with an obvious integration test brings the whole study into question.
I can't really blame Mr. Hagland, because in the article, the Leapfrog CEO, Leah Binder, is paraphrased as saying "... every hospital implementing a CPOE system needs to test and retest its system's performance, in order to ensure that it is actually averting medication and other orders", so Mr. Hagland is just reiterating what the Leapfrog Group is selling. But you would think a publication with "Informatics" in its title would understand common testing terminology a bit better.
With that in mind, one of the first things we did when rolling out our testing methodology was to put a standard glossary in place, so we all used the same terms. We had 78 different testing terms and came up with a common definition for each of them. It only took us about 4 weeks (yikes, that was a painful month), but it was worth it, because by the time we got to describing which tests to run when, we all knew the difference between a performance test (testing to confirm a system meets performance goals such as response time, load, or volume) and an integration test (a test of a combination or sub-assembly of selected components within an overall system).
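To make the distinction concrete, here's a minimal sketch in Python. Everything in it is made up for illustration (the toy order-entry functions, the drug names, and the 0.5-second budget are not from the article or the Leapfrog study): the integration test checks that the components work together correctly, while the performance test checks a timing goal.

```python
import time

# Hypothetical toy "order entry" system with a drug-interaction checker,
# used only to contrast the two kinds of tests.
KNOWN_INTERACTIONS = {frozenset({"warfarin", "aspirin"})}

def check_order(patient_meds, new_drug):
    """Return a list of warnings triggered by adding new_drug."""
    warnings = []
    for med in patient_meds:
        if frozenset({med, new_drug}) in KNOWN_INTERACTIONS:
            warnings.append(f"interaction: {med} + {new_drug}")
    return warnings

# Integration test: do the order-entry and alerting components work
# together? That is, does a risky order actually trigger a warning?
# (This is what the Leapfrog study was really checking.)
def test_integration_interaction_alert():
    warnings = check_order(["warfarin"], "aspirin")
    assert warnings, "expected an interaction warning, got none"

# Performance test: does the system meet a response-time goal under load?
# (The 0.5-second budget is an arbitrary example SLA.)
def test_performance_response_time():
    start = time.perf_counter()
    for _ in range(10_000):
        check_order(["warfarin", "metformin"], "aspirin")
    elapsed = time.perf_counter() - start
    assert elapsed < 0.5, f"took {elapsed:.3f}s, budget was 0.5s"

test_integration_interaction_alert()
test_performance_response_time()
```

Both tests exercise the same code, but they answer different questions: the first asks "does it behave correctly?", the second asks "is it fast enough?" Conflating the two is exactly the mix-up the article made.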