Wednesday, October 22, 2008

Lessons for Teamwork in Software Quality

In my role as a volunteer youth leader in my Church, I have the opportunity to help put together a monthly activity. My group was responsible for the activity this month, and we chose to offer five team building exercises (all straight from http://www.wilderdom.com/games/InitiativeGames.html). We did four exercises in a rotation, and then all met together to perform the fifth. The exercises were:

  • Minefield: members of each group paired up. We set up a maze using chairs and tables, and each pair blindfolded one partner. The second person then had to 'lead' the blindfolded partner through the maze. The challenge in this game is that all pairs are talking at the same time. The purpose of the exercise is to emphasize the need for communication, as well as the ability to pick out the voice you're listening to.
  • Toxic Waste: this is a group exercise. Participants encounter a bucket in the middle of a 10' circle and are told the circle is full of toxins. They need to move the bucket out of the circle, and their resources are a bungee cord and ten thin ropes. The challenge is ingenuity, teamwork, and (again) communication.
  • All Aboard! In this exercise, the team all stands on a tarp. The challenge is to fold up the tarp as small as possible, fitting the entire team on it, WHILE the team stands on it. Teamwork, spatial relations, etc.
  • Helium Stick: I conducted this one. The team lines up in two parallel rows and sticks their hands out, index fingers extended. A small stick (I used a 1/4" wood dowel) is placed on their extended fingertips, and they are challenged to lower the dowel. In fact, it goes up.
  • Egg toss: each team is given 25 straws, 20 cotton balls, a 5' piece of tape, and an egg (unboiled). The mission is to build a crate/ship/container so the egg can be dropped without breaking.

What is interesting to me is the way teams interacted to get the problem solved. As I said, I conducted the helium stick challenge. As teams started, they were astonished that their stick ROSE instead of sinking. In a flash, right on the heels of that recognition, came frustration. We were split up into groups of boys and girls--oddly enough, while both groups expressed frustration, only the boys started yelling at each other. I kid you not, they were really ripping on one another. They quickly moved into the blame game. Each group (all four) needed to be stopped multiple times.

After frustration and blame, the groups moved into 'try harder' mode. Finally, I would stop each group and point out that trying harder wasn't working - maybe there was a better way. At that point it was again quite funny: no single group picked an organized way to brainstorm. It was 'herd brainstorming' - they all just started shouting ideas.

Eventually someone 'won' out over all the shouting. In one group it was an adult leader, who steered everyone right back into trying harder. In a group of 12-year-old boys, one of the boys got everyone's attention and literally talked the stick to the ground. You need to understand the magnitude of that feat - you've got eight people all pushing up on a stick (pushing up because they have to keep their fingers on the stick at all times, and in doing so they force the stick up). Somehow this boy talked everyone, step by step, through the process of lowering the stick.

The girls were pretty creative--they caught on quickly that they needed to coordinate, and that touch was the best way to do so. One group of older girls interlocked pinky fingers and lowered together. Another group split into two smaller groups, one at each end, and simply kept their hands touching, next to one another.

For me, the takeaways were:

  • If something isn't working, the first response is to try harder. But if an approach doesn't work, it's not going to work any better just because you try harder.
  • When something isn't working, teams gravitate toward frustration and even blame. It's so destructive! No one was intentionally pushing the stick up, but they kept yelling at each other about it.
  • Someone has to step up and coordinate the discussion. Brainstorm ideas and then try one of them -- any one of them. There was no single right idea, and the teams that just tried something generally succeeded, as long as that something was other than trying harder.

How can this apply to software development, software testing and software quality? Well, software is built by teams - even the smallest unit of engineering is a feature team made of two or more people. Communication is important, and it's critical that 1) no one resorts to blame and 2) each person is allowed to share their point of view. "Writing unit tests just so we can hit a goal of coverage seems to be distracting us from the real goal--our output is actually lower quality right now than it used to be" needs to be answered with "Why do you think that way?" and not "Uh huh - not true. Besides, you're not helping at all with all these bugs you're bothering me with!"

Good communication includes illustrating the current status, as in "Hey wait, everyone seems to be pushing up," followed by a group discussion of what's causing that: "We're pushing up because we all want to keep our finger on the stick." Only then can the group move on and start thinking about the solution. In an engineering organization, that might be: "Why are we getting a flood of bugs? Wait - the testers are off doing something else (building a battleship test automation system) rather than focusing on testing daily builds." Only once the situation is recognized can the team react with an appropriate response.

As quality assurance/software testing teams work with and communicate with development counterparts, quality becomes a natural by-product. As teams discuss what practices are producing current results, they can move forward and improve how they approach the challenge of writing quality software. A little communication can move our engineering teams from try harder to work smarter, and output increases in both quantity and quality.

Thursday, October 9, 2008

What's the Best Software Testing/QA Tool?

Frequently people will post a question like "What's the best tool?" It drives me nuts sometimes! Software testing and quality assurance are definitely challenging professions, but it's not fair to depend on others to solve your challenges for you! I've answered questions like this in the agile-testing@yahoo.com group and on the MSDN forum, and I thought it was time to bring my answers together into one blog posting. That way, I can just refer people back to it.

MSDN Answer

Lifted from my post on MSDN's software testing forum:

An American car manufacturer once tried to make a car for all things - it was a 5-6 passenger car in three different models, each with four-wheel drive AND good fuel economy. It ended up mediocre at everything. You need to beware; resist the urge to pick one monolithic tool solution for such a wide variety of testing needs. The vendors (well, NOT Microsoft, of course) would love to have you believe their tool can do it all. And in some aspects, their software testing tools probably can. However, will that one tool solution be effective and efficient in everything you do? No.

I'm not advocating a pantheon of tools, but you need to think like an engineer more than like a manager or a customer: find the right tool for the job at hand. Over time, you'll settle down to a group of 3-4 tools and stick with them.

I've never used <some tool the asker referenced> - someone else will need to comment on that product's ability to do everything you're looking for. My take? It'd be too good to be true if it actually could. And I tell my three sons all the time 'If it's too good to be true, it's probably not true'.

Be wary.

Here's how you [should] approach this as an engineer:

  • List out the applications you'll be testing 6-12 months from now.
  • Think about the test scenarios, especially those you'll want to automate.
  • Ask yourself how many releases of that application/project your company will perform--will quality assurance be a repeated process, or are you testing this software just once?
  • Based on that, how much automation is 'worth the investment'? Microsoft Office 2007 *might* still be running automation I wrote in 1997--probably not, but I know 2003 runs it AND that automation is still being run, every time a patch or SP is released.
  • Now that you know how much investment to make, look for best-of-breed solutions for each project. Don't focus on all-in-one solutions; just look for the best tool for the job. Use demos, read reviews, ask questions. Don't ask "Is this the right tool?" but rather "Which tool do you recommend?" or "I'm considering this tool for this job - does anyone have experience using this tool to do this?"
  • Once you have a short list of tools, look and see if there is commonality/overlap. You'll see patterns. It's possible Rational or another all-in-one tool will appear in the list; it's equally possible none of those tools will.
  • If there's good overlap, ask yourself what the pain will be if you force a tool into a job it wasn't designed for. If you can live with the pain, go for it... If not, keep looking, or open yourself up to a larger set of tools.

Hope that helps (in spite of not answering your question),

John O.

Commentary

OK - I searched through a number of documents and can't find my other posting on this subject, so I'll just write it once more.

  1. To all testers: look before you ask. Really! If you are wondering what tool you can use for a given test type, use Live Search and look for information! Performing a minimal amount of research shows respect to the audience you're turning to for help, and can actually prevent a question now and again. QA is all about searching for product defects; apply that searching capability to answering your questions.
  2. You're more likely to get help on a specific question than a broad one. For instance, asking "What's the right tool?" won't get you much. However, "I've evaluated Selenium, Selenium RC, and Watij--given my situation (describe it), what tool do you recommend?" will.
  3. Talk about the problem you're trying to solve. So "We realized we'll be running tests over and over and over again, but our UI changes frequently. What strategies can we take...?" is a question that raises the problem and lets people know what help you're looking for.
  4. Asking "What's the best tool?" is like asking "What's the best car?" If you live in Germany and drive the autobahn frequently, the best car for you is far different from the best car for someone who lives in Bombay, battles thick traffic, and drives notoriously rough Indian roads. Software testing tools are the same way - a screaming "BMW" of a test tool will fall apart on a Bombay road, and a right-hand-drive Jaguar would be totally out of place on an American highway. A performance test tool isn't the right way to automate repeated quality assurance tests. The right tool depends on the project, the skill sets on the QA team, the timeline, and several other factors. Be specific, give details, and look for recommendations.
  5. Solve your own problems. The Internet is an incredible tool and offers us all a ton of opportunity to avoid reinventing the wheel. But don't ask other people to do your work for you! Ask for advice. If you want someone to solve your problem, bring one of us in to consult (for pay). We'll be glad to get the job done for you!
  6. Give back: as you grow and learn, give back... Don't post a question, get your answer, and disappear. Remain an active participant in the community and 'pay it forward'.

Summary? Be specific, research before you ask, solve problems, and give back. That's how to get answers online--in a sustainable fashion.

Friday, October 3, 2008

What Makes a Good Automation System (automated QA/Quality Assurance/Software Testing)

So in my new job, one of my first tasks is to put together an automation system--by this I mean a harness and a framework. The process has had me thinking (and talking) a lot about what makes a good system in general.

The automation harness is the system used to schedule, distribute, and run tests, and to record results. In the open source world, some tools used here include NUnit, JUnit, and TestNG (my personal favorite). These tools all work in a one-off situation - they are run locally out of the dev environment or via the command line. In software testing at Microsoft, though, a one-by-one approach to automation is useful for 1) developer unit testing, 2) tester unit testing/test creation, and 3) failure investigation. For the 24/7 test environment we're building, however, this isn't sufficient. We need a centralized scheduling tool that allows us to push tests out to multiple clients (to simulate load and to run tests in parallel rather than serially). So we're working internally at Microsoft, evaluating the existing automation harnesses available and trying to find the one that works best.
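
To make the "one-off" point concrete, here's roughly what a local TestNG test looks like. This is a minimal sketch - the class and test names are hypothetical, not from any real product:

```java
import org.testng.Assert;
import org.testng.annotations.Test;

// A throwaway local test. Nothing schedules or distributes this run;
// I kick it off myself, on my own machine, and read the results myself.
public class BuildSmokeTest {

    @Test
    public void serviceAnswersPing() {
        // Stand-in for a real check against the application under test.
        boolean serviceIsUp = true;
        Assert.assertTrue(serviceIsUp, "service should answer a ping");
    }
}
```

You'd run that from the IDE or with something like java org.testng.TestNG -testclass BuildSmokeTest - which is exactly the point: it's a local, one-off run, with no central scheduler pushing it to client machines or recording the results.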

A big factor for me in this selection is finding a harness which is configurable. At Microsoft, we are pushing the envelope in testing in a variety of ways: code coverage analysis, failure analysis, and several similar activities which allow us to streamline our testing, reduce test overhead, and automate many testing tasks. This means our harness MUST be extensible - we have to be able to plug in new testing activities.
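
To show what I mean by "plug in new testing activities," here's a hypothetical sketch in Java. None of these types come from any real harness - they only illustrate the shape of the extensibility I'm after:

```java
// Placeholder types, defined only so the sketch is self-contained.
class TestRunContext { /* run id, target machines, binaries under test... */ }
class TestRunResult  { /* pass/fail counts, logs, crash dumps... */ }

// A hypothetical plug-in contract: the harness core stays untouched,
// and new activities hook into the run lifecycle.
interface TestActivity {
    String name();

    // Called before tests are dispatched to client machines,
    // e.g. to instrument binaries for code coverage analysis.
    void beforeRun(TestRunContext context);

    // Called after results come back, e.g. to bucket failures
    // automatically before a tester ever looks at them.
    void afterRun(TestRunContext context, TestRunResult result);
}
```

With a contract like that, code coverage and failure analysis become drop-in activities rather than changes to the harness itself.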

A second element of the automation system is the framework. This is the abstraction layer which separates our automated tests from the application under test, and it is critical to good automation. If, for instance, you are automating a web application, you will probably experience a lot of churn in the application layout. You do not want your automated tests hard-coded to look for certain controls in certain locations (i.e., in the DHTML structure)--by abstracting this logic, your test can call myPage.LoginButton.Click(), and your abstraction layer can 'translate' this into clicking a button located in a specific div. In some organizations, this framework is purchased. At the LDS Church, we leveraged both Selenium RC and Watij to build this framework, developing most of it ourselves internally (kudos to Brandon Nicholls and Jeremy Stowell for the work they did in this capacity).
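
Here's a minimal sketch of that abstraction layer in Java, in the spirit of the Selenium RC-based framework we built at the Church. The page and locator names are hypothetical:

```java
import com.thoughtworks.selenium.Selenium;

// A hypothetical page object for a login page. Tests talk to these
// methods; only this class knows the actual locators, so when the
// DHTML layout churns we fix one locator here instead of every test.
public class LoginPage {
    private final Selenium selenium;

    public LoginPage(Selenium selenium) {
        this.selenium = selenium;
    }

    public void typeUserName(String userName) {
        selenium.type("id=userName", userName);
    }

    public void typePassword(String password) {
        selenium.type("id=password", password);
    }

    public void clickLogin() {
        selenium.click("id=loginButton");   // the only place this locator lives
        selenium.waitForPageToLoad("30000");
    }
}
```

A test then reads loginPage.clickLogin() instead of digging through divs; when the button moves, one locator changes and the tests don't.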

The challenge felt by most test organizations is two-fold: 1) finding the engineering talent to build these systems and 2) making the investment in innovation. Ironically, the very thing which can free up resources for other tasks (automated testing) is the thing most managers don't want to spend time on! This makes sense, sort of... managers don't like to invest in activities which (in their mind) don't contribute directly to the bottom line. But in all but the smallest of projects, this makes no sense--test automation isn't a sellable product, but if automated tests can free up a day or two of test time, that's a day or two available for other activities, each and every time the automation is run.

Recruiting top talent is also a challenge. In both IT organizations where I worked, there was a culture among developers that testers weren't engineers--they were click-testers. Testers couldn't give input on 'extremely technical concepts' like architecture, potential bug causes, or the like--they were there to pick up code as developers released it, and then to find bugs. It's no wonder it's so challenging to hire engineers into testing - when they're treated like that, they're either going to leave or move to development!

So the keys to a great automation system are: 1) a solid, extensible, and flexible harness, 2) a robust framework, generally customized to your test activities, 3) management commitment to invest in innovation and automation, and 4) top engineering talent and a culture that rewards them for their contribution.

Am I missing anything?

Wednesday, October 1, 2008

What A Feeling!

So I've done a relatively good job of being calm and collected in my first two days at Microsoft. Your first day starts in a long line, filling in I-9 forms. Then you enter your address and other contact info into the Microsoft system via an internal web form. Finally, you sit for another 7 hours in a room while they teach you about benefits and the MS culture. At the start of that first day you're officially a Microsoft employee, but you don't get your card key or anything.

Day two starts in the Big Room again, with discussions about corporate ethics and legal issues, and a conversation with a couple of recent Microsoft hires. Finally you get your card key and are sent to find your manager. Oh, and by the way - the room has anywhere from 100 to 150 people in it. Yup - Microsoft starts that many people each and every week (well, maybe not the week of Christmas or New Year's).

So at about noon I launched off on my own. Unlike most new hires, having worked here for 11 years before, I know where the buildings are, where parking is, etc. So I zipped straight to the building that will serve as my temporary office whenever I'm here in Redmond. I swiped my badge and got access to the building. A little smile crept up on my face.

About an hour later, after picking up my laptop and getting it set up, I went to lunch with another new hire for our Utah group. As I swiped my card and stepped into the cafeteria, I had a completely involuntary reaction: I jumped, threw both hands high in the air, and shouted "Yes!! I'm back at Microsoft!" Later during lunch, Tim told me how cool it was to spend time with someone who is so excited about working at the company. I have a bounce in my step that has been missing for many, many years.

I can't describe it. I was in software testing/quality assurance at Microsoft for 11 years. There were great days and there were really challenging days. I left two years ago to lead QA "for the world's largest retail IT project" at Circuit City. What an experience that project was. Quality assurance, to team leads (mostly IBM project managers), meant proving the happy path and avoiding negative testing. It was a culture shock, to say the least. Testing software at the LDS Church was somewhat better. The people, for the most part, were great (surprisingly, there were exceptions - people who behaved less Christ-like than even at Microsoft!). The QA team suffered from a total lack of respect from development, however. And I found that in that IT organization everyone has a special little niche. There are enterprise architects, application architects, security 'specialists' (people who know about security policy, but don't know much about penetration testing), developers, and quality assurance engineers. If you dared to stray outside your niche, well, that meant you were stepping on someone else's toes.

Being back here at Microsoft means I have an equal seat at the table as an engineer. It means I'll have to work with other engineers to tackle really, really challenging problems. First challenge: building an automation harness using existing technologies at Microsoft, then building our own framework (abstraction layer) within which our tests run. Additionally, we need to take the mandate to "build virtualization management technologies" and turn that into released software: ideation, product planning, product specification, development, and release testing.

We also have the challenge of hiring incredible C++ developers and testers (engineers) in the Salt Lake Valley. Finding developers is pretty easy, but finding developers who respect QA and understand engineering excellence? A challenge. Finding a software testing professional who has the guts to take an equal seat at the table? A challenge!

But that's what I love about being back. I feel like, after two long years, I can finally do my best work and reach my full potential. I can bring all of my 14 years of engineering experience to bear on a software challenge. If I have an idea, I can run with it. I can provide input to the user scenarios, to application architecture, and to how we push quality upstream. I can prevent software defects, rather than find them!

I know everything won't be perfect. I left Microsoft for reasons, and those reasons haven't all changed (although judging by much of what I heard in New Employee Orientation, the last two years have been a time of growth and improvement for the company). There will still be bad days, and there'll still be the struggle for a good work/life balance (now called 'blend'). But I have incredible health benefits for my family, stocks and bonuses again, and the chance to be challenged every day. And I'll be working with super-smart people every single day. THAT is a cool thing!