Thursday, September 23, 2010

Security Development Lifecycle

I was reviewing my notes for a presentation I gave last year outlining a Security Development Lifecycle (SDL).  An SDL is more than just security scanning or dynamic/static analysis tools (being a tool guy, those are usually all I care about), so I found it interesting to try to lay out how best to incorporate security into the Software Development Lifecycle (SDLC).

An SDL means something different to each organization.  For a company like Microsoft, the SDL is a very comprehensive framework with many checks and gates, and for good reason: their software has the largest footprint, is used by the most people, and is constantly under attack.  For others, it doesn't need to be that extensive and can just incorporate best practices and some checks.  We fall somewhere in between, so I decided to come up with the basic framework of an SDL, look at the different methodologies and then apply a maturity rating to our applications to determine how extensively we would perform the tasks in the model.

Basic Framework


The basic framework of an SDL is composed of seven distinct processes that run in parallel with the SDLC:

  1. Training
  2. Requirements
  3. Design
  4. Implementation
  5. Verification
  6. Release
  7. Response
Each process has a security-specific component to it, even though the processes are consistent with other activities in the SDLC.
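To make that a little more concrete, here is a rough sketch of how the security-specific piece of each process could be tracked alongside the regular SDLC work; the example activities are my own illustrations, not a prescribed checklist:

```python
# Illustrative only: hypothetical security activities attached to each
# of the seven SDL processes.
SDL_PROCESSES = {
    "Training":       ["secure coding training", "threat awareness refresher"],
    "Requirements":   ["define security requirements", "set security gates"],
    "Design":         ["threat modeling", "attack surface review"],
    "Implementation": ["static analysis", "approved tools and libraries"],
    "Verification":   ["dynamic analysis", "penetration testing"],
    "Release":        ["final security review", "incident response plan"],
    "Response":       ["vulnerability monitoring", "patching and disclosure"],
}

def security_tasks(process):
    """Return the security activities attached to an SDLC process."""
    return SDL_PROCESSES.get(process, [])

if __name__ == "__main__":
    for process, tasks in SDL_PROCESSES.items():
        print(f"{process}: {', '.join(tasks)}")
```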


Methodologies

While there may be other methodologies, I really only looked at a couple:

  1. Open SAMM - this was probably the best of the ones I looked at because it is an open framework to help you put security into software development, without being overly prescriptive.
  2. Microsoft SDL - this is the most comprehensive, but it is still a framework and can be adapted depending on your needs.  The tough part is that once you adapt it to fit your work, you keep going back and trying to add the other processes, which eventually gets it back to the original.
There are varying degrees of what you might do under either of those models, so to determine which tasks we would perform for a particular application, we do a maturity evaluation of the application.

Maturity

We came up with a questionnaire to assess an application and give it a maturity level, based on answers about exposure, patient data, architecture and other criteria.  I kept the maturity levels to four (a rough scoring sketch follows the list):
  1. Low risk - internally developed or internal use only, no personal or company information, not web-enabled
  2. Medium risk - web-enabled internal use, with some private information (Intranet)
  3. High risk - web-enabled external use with private but not sensitive information (Quality Center)
  4. Critical - web-enabled external use with sensitive information (customer portal)
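To give an idea of how the questionnaire feeds the rating, here is a minimal sketch; the specific questions, combinations and cut-offs below are simplified stand-ins for the real criteria (exposure, patient data, architecture, etc.):

```python
def maturity_level(web_enabled, external_facing, private_data, sensitive_data):
    """Map a few simplified questionnaire answers onto the four levels.

    Illustrative only -- the real questionnaire covers exposure, patient
    data, architecture and other criteria.
    """
    if not web_enabled and not private_data and not sensitive_data:
        return "Low risk"
    if web_enabled and external_facing and sensitive_data:
        return "Critical"
    if web_enabled and external_facing:
        return "High risk"
    return "Medium risk"

# Example: a web-enabled, internal-only application with some private data
print(maturity_level(web_enabled=True, external_facing=False,
                     private_data=True, sensitive_data=False))  # Medium risk
```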
In another post, I'll write about how to put the people, process and technology into the SDL.

Wednesday, September 15, 2010

Outlook 2010

I have an MSDN subscription and decided to check out Office 2010 to see if there are any neat features in it.  So far I haven't been too impressed, but I did find a couple of improvements in Outlook that are pretty cool.

1.  I use the Unread Mail view in Outlook to keep up with new messages, and when I opened it up this morning, I was surprised to see the messages grouped by folder.  This is pretty cool, because I have a bunch of rules set up to move messages to different folders, trying to keep my inbox as clean as possible (currently at 6, but I'll take care of that this morning).  I don't remember this in 2007 (and I'm sure someone will tell me I could have changed my view... okay, I didn't think about it), but it is definitely helpful and a better way of arranging my new messages.


2.  The coolest thing I saw so far in Outlook is you get a mini calendar view whenever you open an appointment request.  This one I actually showed to people in the office, so it did impress me.


I was always going back and forth between my appointment request and my calendar to see conflicts or what else was scheduled around that time, so this is really helpful.

Another feature I did see, but haven't played with yet, is that you can group messages by conversation.  I keep a pretty clean mailbox and use Google Desktop search if I'm looking for archived conversations, so I don't really see a need, but I'll check it out to see if it adds any value.  I think they stole this from Google, because I use it quite a bit with my Gmail account, but my home e-mail is not as clean as my work.

Of course, as with every Microsoft update, they did let a couple of escapes (defects found after GA) out the door.  The most annoying to me is that I lost the 'online status next to the name' feature.  There is actually an option for this in the options menu, but it is checked and grayed out.  So even though it is checked, I don't see the Office Communicator online status next to the name in my messages.  Not a big deal, but it was something I liked and, with it gone, it annoys me. <9/16 - I applied the June and August cumulative updates for Office 2010 and the latest OC 2007 R2 patches, and this morning my OC status showed in my address list again.  They must have fixed something.>

The other thing is that when I upgraded, it changed the toolbar order and I had to manually move my tool groups (btw... I'm not a big fan of the 2007/2010 toolbars... I think they are way too big and clunky).  Once again, not a big deal, but annoying.

Overall, I haven't seen the big splash I would expect from a major release, but I'll keep plunking away.  I think for Outlook they are trying to integrate it more with social media, but I only use Outlook for work and stick to Gmail for my personal stuff, so I don't see the need.  Hopefully I'll find something that makes me go Wow!, but for now it is a nice-to-have, nothing really special.

Monday, September 13, 2010

Testing Terminology

Every now and again, I'll read something expecting one thing and, after going through the article, determine that the author is describing a related topic, but one different enough to make me question whether it is me or the author mixing up terminology.

The latest example I found is in the August issue of Healthcare Informatics, a monthly magazine about the healthcare information systems industry targeted at CIOs.  In the Clinical Update section, the Editor-in-Chief (kind of a lofty title), Mark Hagland, wrote about a Leapfrog Group CPOE (computerized patient order-entry) study with a subtitle of "Leapfrog Leaders Discuss CPOE Performance-Testing Results".  Reading that subtitle, I immediately expected to read about average transaction times against plan or other performance-testing SLA reporting.  Instead, the article discussed their findings of medication orders not triggering appropriate warnings or alerts, which is more of an integration test than a performance test.  I understand that in the IT or software world, testing terminology is not consistent and terms are used interchangeably to mean various things (e.g., Quality Assurance vs. Quality Control or testing), but in my opinion, mixing up a common term like performance testing with an obvious integration test brings the whole study into question.

I can't really blame Mr. Hagland, because in the article the Leapfrog CEO, Leah Binder, is paraphrased as saying "... every hospital implementing a CPOE system needs to test and retest its system's performance, in order to ensure that it is actually averting medication and other orders", so Mr. Hagland is just reiterating what the Leapfrog group is selling.  But you would think a publication with "Informatics" in its title would understand common testing terminology a bit better.

With that in mind, one of the first things we did when rolling out our testing methodology was to get a standard glossary in place, so we all used the same terms.  We had 78 different testing terms and came up with a common definition for all of them.  It only took us about 4 weeks (yikes, that was a painful month), but it was worth it, because when we got to the point of describing what tests to run when, we all knew the difference between a performance test (testing to confirm a system meets performance goals such as response time, load or volume) and an integration test (a test of a combination or sub-assembly of selected components in an overall system).
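To put the two definitions side by side in code rather than prose, here is a toy example; the order-entry function and its alert behavior are hypothetical stand-ins, not taken from the Leapfrog study or our glossary:

```python
import time
import unittest

def place_order(order):
    """Hypothetical stand-in for a CPOE order-entry call."""
    time.sleep(0.01)  # pretend the system does some work
    return {"status": "accepted", "alerts": ["drug-drug interaction"]}

class IntegrationTest(unittest.TestCase):
    """Integration test: do the combined components behave correctly together?"""
    def test_order_triggers_interaction_alert(self):
        result = place_order({"drug": "warfarin", "current_meds": ["aspirin"]})
        self.assertIn("drug-drug interaction", result["alerts"])

class PerformanceTest(unittest.TestCase):
    """Performance test: does the system meet a response-time goal?"""
    def test_order_completes_within_sla(self):
        start = time.time()
        place_order({"drug": "warfarin", "current_meds": ["aspirin"]})
        self.assertLess(time.time() - start, 2.0)  # e.g., a 2-second SLA

if __name__ == "__main__":
    unittest.main()
```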