Thursday, December 27, 2007

How Can I Become a Better Tester? Part II: Read

OK - so you are paying more attention to quality now, right? Starting to ask more about what makes a Mercedes so much better than, say, an Opel or a Dodge. What's another step you can take to become a better tester? Simple - expand your horizons by reading. That's what this blog entry is about: what can you read?

  • The first place to start comes for free: read off the Internet. Read blogs--there are many great ones out there. Also, a site I'm affiliated with is a great resource: there is a performance test expert who frequently contributes, as well as several experienced test managers.
  • Related to blogs, start reading test forums. I have found that the Yahoo Agile Testing Group is a great source of information. Another good source is MSDN's forum on software testing.
  • The next place is to read books on testing. Cem Kaner, James Bach, and James Whittaker are all great authors with significant experience. Whittaker has a series of "How to Break..." books that are rock-solid with good day-to-day steps for your testing.
  • The next great source of reading is books on the topic of engineering. These books talk about best practices in the engineering discipline and always lead me to think about how my team is approaching a project or a problem. Some of my best test strategies have come as I've read development manuals such as:
    • Test Driven Development with Microsoft .NET (Newkirk and Vorontsov)
    • The Object-Oriented Thought Process (Weisfeld)
    • Software Project Survival Guide (McConnell)
    • Writing Secure Code (Howard)
  • Conferences and seminars: let's face it, a lot of conferences are actually just about marketing something. But even those conferences offer the opportunity to sit with experienced testers and learn from them - either in presentations or during the 'chit-chat' portion of the conventions. A couple of thousand dollars is a lot of money to spend on a conference, but sometimes it'll pay back in spades, just with the new network of associates you build who can help you through a challenge you may be facing. I'm speaking in May 2008 at PSQT's Vegas conference, if you're interested in meeting me face-to-face.
  • Certifications: OK - to be clear, certification is NOT a substitute for experience. It alone means nothing, but a certification can often lead you in the right direction in improving your skills. It can build a framework for you to 'flesh out' with experience over time. PSQT offers certification and there are a few other courses available. Again, I stress: certification in and of itself can only teach you the process and steps recognized as historically good testing. It's not everything you need to be a great tester, but it is a good framework.
  • Practice: the best teacher is practice, and that's what you really need. Be really weird - start a testing group which meets after hours and just tackles weird testing problems (meet at a local cafe and talk about how you'd test the espresso machine or the retail point of sale system).

These are just a few ideas. There are many, many ways to learn. The key to learning, however, is wanting to learn - and for the right reasons. If you want to be a better tester so you can move up the food chain and become a better manager, well, you'd better think about a career as a project manager instead. If you want to be a better tester and that's it, trust me - you'll learn. And the growth opportunities will present themselves, too.

Next subject: going beyond the requirements.

Saturday, December 15, 2007

How Can I Become a Better Tester?

In all my blogging and participation, I have met many testers. One recently asked me "How can I become a better tester?" As I answered, I thought "Hmm - might be a good blog series!" So here we go. Most of this is aimed at beginning testers (1-3 years) although I have to admit I would benefit from some touch-up work in each of these areas!

I gave my friend four steps to take. I'm sure I'll come up with more, but let's look at these four briefly, and then I'll spend a blog or two on each as time goes by (hopefully I'll finish the last one sometime before the Rose Bowl).

  1. Become aware of quality - what is it, what does it look like, what does it feel like?
  2. Read - read everything you can get your hands on about quality. Start off with books about testing. I really don't even care about what testing methodology (agile, monolithic beastly projects, etc.), just read.
  3. Related somewhat to (1), recognize that quality means going beyond functional and business requirements. You need to think of yourself as the gateway to quality or, in an agile world, as the trustee for quality. Teach people about quality and manage up for quality.
  4. Find a great test lead, test manager, or test mentor to work with. If you can be employed on the same team, that's great. If you have to be in a less formal mentoring relationship, that's great. Learn everything you can from this person.

First: become more conscious about quality. I actually gave my friend a homework assignment - as this person lives in India, I asked them to compare two vastly different automobiles: the Ambassador and the Skoda. Ironically, the Skoda is the butt of many a joke in Eastern Europe, but it's actually been taken over by VW/Audi and, internationally, is an incredible automobile. It competes with (and beats) Honda in India as the ultimate status car within reach of the rising middle class. The Ambassador? Well, it's truly the Indianized car--it's one of two cars mass-produced throughout India's history. When you compare the quality of the two, well, it's like comparing a Chevette to a Lexus in the US. Or, I suppose, a Trabbi to a Mercedes in Europe. No offense intended towards the Ambassador--it's just reality.

So I like to think about quality and compare products and services. If you live in the US, try this - go to Mervyns or Kohls to buy a suit. Then go to Nordstrom. The difference is beyond belief! If you don't think consciously about it, though, it's hard to put into words what the difference is. A good business analysis would miss the point entirely - consider the business requirements:

  • Stock on hand: variety of sizes, multiples of the most common sizes. Mervyns: check. Nordstrom: check.
  • Merchandise is on the rack, readily visible to the customer: Mervyns: check. Nordstrom: check.
  • Sales reps available to answer questions: Mervyns: check (you do have to seek them out). Nordstrom: check.
  • Someone who can ring up the transaction: Mervyns: check. Nordstrom: check.
  • Merchandise is priced right: Mervyns: check. Nordstrom: check.

A tester who focuses on only the business requirements here would say both companies are of equal quality. But think deeper - look at the experience in both situations. The Nordstrom experience is full of stuff that could be measured, but it's not necessarily the first thing you think of. How friendly and helpful is the service? What is the atmosphere like? What's the merchandise like? Will it feel cool during summer and warm in winter? Is it scratchy? Is it comfortable overall?

So the purpose of the homework is to think deeper about quality. Is quality just meeting the requirements, or is there more? Is it about understanding the customer and knowing there are often unwritten requirements?

How would this apply in the real world? Well, I've already blogged about a huge data corruption issue that arose at a previous employer because the consulting firm running an implementation there insisted negative testing wasn't necessary. They focused (barely) on proving the software met requirements--since none of the requirements discussed issues like "the data needs to have integrity" or "the client must respond to and handle error messages from the server", no testing was done in that area. Thirteen million corrupt rows later, four workstreams came to a halt while the data team narrowed down the issue and implemented a fix - oh, and fixed the corrupt rows.
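
To make the point concrete, here's a minimal sketch of the kind of negative test that was missing. The `save_row` function and its error behavior are invented for illustration, not taken from that project:

```python
# Hypothetical sketch: a negative test for the requirement nobody wrote down
# ("the client must respond to and handle error messages from the server").

def save_row(db, key, value, server_ok=True):
    """Write a value, surfacing server errors instead of swallowing them."""
    if not server_ok:
        raise ConnectionError("server rejected the write")
    db[key] = value

def test_client_handles_server_error():
    db = {}
    # Positive case: the write lands.
    save_row(db, "order-1", "widget")
    assert db["order-1"] == "widget"
    # Negative case: a failed write must raise, not silently continue.
    try:
        save_row(db, "order-2", "gadget", server_ok=False)
        assert False, "expected an error"
    except ConnectionError:
        pass
    assert "order-2" not in db  # no partial/corrupt row left behind

test_client_handles_server_error()
print("negative tests passed")
```

A requirements-only test plan covers the first half of that test and never writes the second half - which is exactly how thirteen million corrupt rows slip through.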

Becoming aware of quality is the first step to contributing to quality. It's what helps us lift our eyes beyond requirements and to be stewards of quality.

Wednesday, December 12, 2007

What's the Right Tool?

I am pretty active on several testing forums (Microsoft's MSDN forum, and a Yahoo! group for agile testing are the two I frequent the most). I cannot count the number of times someone sent an e-mail asking which big, monolithic tool is the right one for them.

My friend (and manager) has a great response to this. At the LDS Church, we are frequently asked by new testers "When are we going to standardize on an <insert category here - performance, automation, etc.> tool?" His response: you're engineers. Look at the tools in use in the organization today, look at the tools available in the industry, and make the best decision for the organization. Sometimes you might give up a little functionality or ease of use in exchange for a tool which is already used in-house. You get a built-in support forum, and sometimes you can even leverage existing tests. Other times, you'll probably pick a best-of-breed tool.

Our performance testing is a perfect example of this. When I came on board, we were using JMeter, an open-source, Java-based tool. The benefit of JMeter? It replays Apache logs. There's no better mimic of production than replaying production logs! The problem? We have been seeing weird things with JMeter - for instance, if a connection times out, JMeter doesn't drop it and move on. Instead, it opens another connection request but leaves the first request hanging. In some ad-hoc experimentation today, an engineer on my team started with just 5 connections and ended his test at 10 connections. Scale that to the 400, 500, 1000 connections we were using to load up, and you end up with incredibly unrealistic test scenarios! I never felt we could trust JMeter.
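
The leak pattern is easy to model. This toy sketch (not JMeter code - the `Pool` and `request` names are mine) shows how a client that retries without closing a timed-out connection doubles its footprint, matching the 5-to-10 jump my engineer saw:

```python
# Toy model of the connection-leak behavior described above (not JMeter itself).

class Pool:
    def __init__(self):
        self.open_connections = 0
    def connect(self):
        self.open_connections += 1
    def close(self):
        self.open_connections -= 1

def request(pool, times_out, leak_on_timeout):
    pool.connect()
    if times_out:
        if not leak_on_timeout:
            pool.close()        # well-behaved client drops the dead connection
        pool.connect()          # ...then retries on a fresh connection
    # (response handling elided)

leaky, clean = Pool(), Pool()
for _ in range(5):
    request(leaky, times_out=True, leak_on_timeout=True)
    request(clean, times_out=True, leak_on_timeout=False)

print(leaky.open_connections)  # 10: every timeout doubled the count
print(clean.open_connections)  # 5: retries stay within budget
```

At 1000 simulated users with timeouts, the leaky behavior means you're no longer measuring the load you think you're driving.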

So I started looking around, and I have settled on WAPT (Web Application Performance Test tool). It's a commercial product, but only costs $350 per license. There's a lot I love about it - first of all, connection management seems to be so much more reliable. Secondly, the built-in reporting is incredible. Third, there are actual support personnel if there's an issue. Oh, and fourth, I can bring a $250,000 enterprise server to its knees with a dual-proc Dell running WAPT, whereas I needed 10 machines to do the same with JMeter. I'm pleased as punch with it. However, we've had to abandon a year's worth of test scenarios and rebuild them. I've had to justify my decision to change tools. And I'm the only person in the org running WAPT (right now).

The bottom line is, there's a better way to get advice about tools. Don't ask "Which monolithic tool is the best?". Describe your test environment and the project you're working on, and ask "What are some tools people have used for this type of testing? What are pros/cons/strengths/weaknesses of these tools?" Gather information and make your own decision with advice from others.

You're the engineer on the spot and you need to make the final decision. If you're a test lead in Gurgaon, don't let me decide for you what tool to use.


BUT: don't hesitate to ask! If there's one thing most testers have in common, it's an eagerness to share their (opinionated) opinion about something. And I personally love to give advice and help out. This isn't a censure or a rant about asking questions - it's about people needing to step up and be engineers.



Wednesday, December 5, 2007

WWII and Test Management

I've been on a WWII kick again, reading a lot about the war. Eisenhower's job was to decide on the strategy - do we attack Germany in one concentrated punch (Montgomery's plan) or do we attack along the entire front (Eisenhower's and Patton's plan)? Patton's job was to figure out how to implement--he had to sit and watch as Montgomery's punch plan failed in places like Arnhem. Then he got to decide strategy for crossing the Rhine and driving for the win - Berlin or the Eagle's Nest? Once these decisions were reached, they were carried down the chain of command - there was a big marshmallow layer of bureaucrats who procured supplies, managed replacements, etc. and then - and then there were the lieutenants.

The senior leaders (generals and majors and such) are the test managers. High-level strategy, supply management, personnel. They're negotiating at a high level with PM or dev on things like engineering process and such. They may be fighting over budget and headcount, too. The lieutenants? These were the boots-on-the-ground leaders who took initiative and got things done. They trained for D-Day for over two years, learning to climb cliffs and take out bunkers. The day after D-Day, they encountered hedgerows and had to make up a new strategy on the fly - and good thing, too, that they did it--they were there, in the heat of battle, and they knew what worked and didn't. They called in support from the Navy or Air Force. These are your test leads, the folks rallying the troops. They're the folks who should decide to implement MBT or PICT or the likes. They're the ones who need a great working relationship with dev leads (something I'm not doing great at right now myself - need to work on that). They're making sure the privates are moving along, they're training new leaders, and they're calling for sacrifice at appropriate times.

Black Box Testing - time to go?

Regarding black box testers... Having lived in both product software AND IT engineering worlds, I'm beginning to understand the root cause of the argument about the value of blackbox testing. Developing user product software requires a ton of testing; at MS, we did this against functional specifications rather than business requirements. I'm convinced software applications (even apps like Oracle Retail, which suffer significantly from the lack of blackbox testing) need serious blackbox testing because the target audience and the way the product will be used are so varied. Integrating business applications in an IT setting, however, really doesn't need as much and yes - I believe this could be covered by business personnel rather than by QA engineers.

But here's the conundrum... Until the IT industry accepts testers as engineers and allows them to work with dev as code is developed (and this on a day-to-day norm rather than an exception to the rule), it'll be difficult to convince engineers to go into testing and to stay there. In my role as TM for Circuit City's MST project, I interviewed 30-40 people for open roles and found practically no one who could code. None of my 30-member team in Richmond could code beyond writing XML (well, one could--he was my best test lead, too). Very few of the personnel provided by IBM, the engagement partner, could code either. One tester they provided, who had a rather high bill rate, was justified b/c she had 'retail experience'. She worked in a clothing store in NY during college!

So there's a chicken/egg thing which has to be solved one company at a time. Convincing management to 'up the bar' for test engineers, convincing development to be more agile, etc. are all necessary steps and I think it's still going to take a while. I for one am totally convinced that you get more for your buck with whitebox and gray box testing, and that you get way more for your buck hiring technical testers. But convincing a CS grad that she should consider test when all the glamour is associated with development? That's a challenge.

All that having been said, don't scoff at blackbox testing. A great test engineer (very rare commodity) is a good programmer who excels at white box testing, who can black box test as well. Those finishing touches, esp in product development, are critical to project success.

Tuesday, October 16, 2007

Shameless Plug

Looking for two-way interactive information on software quality? Head on over and check it out. There are articles, webcasts, and even an 'ask the experts' column. Yeah, I am an expert there... Come on by and ask a question!


Tuesday, October 9, 2007

Testing Patterns and Practices

I was chatting with a good friend the other day, and he commented that he thought it'd be logical to have testing patterns and practices. Well, that opened up a whole line of thinking, and I decided to post a blog on it and see what comes up!

So sound off - what patterns and practices do you think are logical for testing? I can think of a ton--both programmatic and just sheer test cases. For instance:

Test Cases
  • Boundary conditions (int): lower boundary, lower - 1, lower + 1; upper boundary, upper +1, upper -1
  • Text/string: alpha, numeric, 'extended' (takes you back a few years, doesn't it?), double-byte, multi-byte, UTF-8
  • Input field: valid input type, valid input length, invalid input type, invalid length, XSS/SQL injection

Programmatic
  • Web service input, output
  • Accessing fields on a form in an application
  • Web request/response (threaded)
  • ...
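
The int-boundary pattern at the top of the list can be sketched as a tiny parameterized check; `validate_age` and its 0-120 range are made up for illustration:

```python
# Boundary-condition pattern: boundary, boundary - 1, boundary + 1 at each end.
# validate_age is a hypothetical function accepting 0..120 inclusive.

LOWER, UPPER = 0, 120

def validate_age(n):
    return LOWER <= n <= UPPER

cases = [
    (LOWER,     True),  (LOWER - 1, False), (LOWER + 1, True),
    (UPPER,     True),  (UPPER + 1, False), (UPPER - 1, True),
]
for value, expected in cases:
    assert validate_age(value) == expected, value
print("boundary cases passed")
```

The same table-of-cases shape works for the string and input-field patterns too - swap the int values for the alpha/numeric/injection inputs.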

This is a short list just to get people thinking. Time to get those creative juices flowing, folks!

Wednesday, October 3, 2007

Where've I Been?

So I know I've been somewhat inconsistent in posting lately. It's one of those things - every job has had a push time, and this is one for us. The LDS Church holds a world-wide general conference every 6 months - leaders from the Church meet and give sermons twice a day, two hours per session, Saturday and Sunday. For us, this is the web equivalent to Christmas and Easter combined.

Coming into the October conference (this weekend), we've been working through a series of stability issues. That means pulling some all-nighters (literally - I came home yesterday as my son was getting up for school) to drive load when the site isn't utilized.

Test has been involved primarily in helping generate test load against the servers. It's an interesting challenge - who ultimately owns 'quality' on a web site? Test? Operations? Test is responsible for the applications coded, whereas Ops is responsible for the configuration and maintenance of the servers. The good news is, at the Church we all pitch in and work together to get through this kind of thing.

So that's where I've been. Hope to get back to my posting about quality - what is it, how does it happen - very soon. Meanwhile, catch some rest for me too, OK?


I was catching up on some Google alerts and came across an article on ZDNet:

40 IT failures caused by software bugs by ZDNet's Michael Krigsman -- Rick Hower, who runs the Software QA Test Resource Center, has compiled a lengthy listing of “major computer system failures caused by software bugs.” Here are several entries from that list: A September 2006 news report indicated problems with software utilized in a state government’s primary election, resulting in periodic unexpected rebooting of voter check-in machines, [...]

This article is a great source of justification. I've had my own negative experiences caused by inadequate testing (I keep harping on it, but 13 million corrupted rows in a production database is a bad thing - before my time, so I don't bear responsibility, but I failed in several related arguments about the level of testing needed).

This goes back to my post about levels of risk. If I'm an online web site, I need to devote more resources to my online transactions and account privacy than I do to my site UI. Yes, my site is my 'best foot forward' but all the benefit of a well-designed, well-implemented, well-tested site goes out the window the instant I'm hacked.

Business leaders just don't seem to get it... Testing is NOT about proving functionality. That's what IBM and Circuit City management kept pushing us to do. I had a 2-hour argument with an IBM dev lead my first week at Circuit over this very topic - I was trying to do too much negative testing (one case was too much, in his eyes).

Finding that leadership to influence from 'below' is a challenge. It takes patience and consistency--constantly reinforcing the value of testing. Articles like this one from ZDNet help, too. Unfortunately, many businesses refuse to learn from others' experience. Wisdom or Experience - we can learn from one or the other. Experience is truly an expensive teacher though!


Wednesday, September 26, 2007

How much is enough testing?

Amazing article! Today hacker Robert Moore spoke up about how he pulled off his crimes (hacking into tons of VoIP servers and reselling communications services). His way in? So simple - he just used the default password for common communications devices (Cisco routers, etc.). Once in, he took control and routed traffic as he desired.

This blog isn't about the hack. It's elegant, but in the end common petty thievery and nothing worth a bit of praise. This blog is about the quote from page two:
Kenneth van Wyk, principal consultant with KRvW Associates, said leaving default passwords up is a widespread and dangerous problem. "It's a huge problem, but it's a problem the IT industry has known about for at least two decades and we haven't made much progress in fixing it," said van Wyk. "People focus on functionality when they're setting up a system. Does the thing work? Yes. Fine, move on. They don't spend the time doing the housework and cleaning things up."

How many times have I been told in the past year "Just run through the test cases" and "just test the positive cases"? I was literally told by one employer (not my current) that SQL injection and other user-security cases were unimportant. This was from an employer going through multiple rounds of lay-offs and terrible morale.

Testing is about proving things work but it is about so much more. If a web page is served up, does it mean it 'works'? What if it takes 3 minutes to serve up the page? Is it OK then? If I can update information about an entry in my database, but I'm not monitoring for errors, can I say it works? What if, each time an update is sent, the update is 'written' but an error is thrown? If all I'm looking for is a row with the updated information, and I'm not running a negative test case to ensure the old information is gone, can I say it worked?
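
Here's a sketch of that update check. The positive case alone would pass even if the old value lingered somewhere, so the negative assertion is what earns its keep (`update_email` and the data are invented for illustration):

```python
# Sketch: verify an update both ways - new value present AND old value gone.

def update_email(db, user, new_email):
    """Hypothetical client call that updates a user's record in place."""
    db[user] = new_email
    return db

db = {"alice": "old@example.com"}
update_email(db, "alice", "new@example.com")

# Positive case: the new value is there.
assert db["alice"] == "new@example.com"
# Negative case: the old value must no longer appear anywhere.
assert "old@example.com" not in db.values()
print("update verified")
```

If the update path logs an error while still "writing" something, only the negative case catches it - looking for the new row alone says "it works".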

Now that the recruiting series is behind me, I'm going to spend some time investigating this concept. When is testing REALLY finished? (And I mean that as the verb 'testing', not the noun "Testing".)

Recruiting for Intellect

One other skill I am looking for during an interview is intellectual horsepower. This is probably the most controversial element of interviewing – for a kick, read How Would You Move Mount Fuji to learn more about the subject (and to pick up a ton of great interview questions)!

The basic argument against the puzzle question is that it’s generally an ah-hah question and doesn’t show whether a candidate can perform the work they’re being interviewed for. OK – I buy that… I could bring in a Nobel-winning primatologist and they’d probably crush my interview question but they’d be a terrible developer. However, when I’m hiring I’m not just interviewing for someone who can slap together a few DHTML objects… I’m looking for an engineer who can code, for sure, but who can pick apart problems and drive creative solutions to them. I’m not just looking for someone who can come in and code in Java today – I may need a low-level C++ driver written tomorrow or I may need someone who can build a replacement to a multi-million dollar line-of-business application. So you know, I agree—no problem question will allow a candidate to prove their coding ability. But I also need to see their problem-solving ability.

That having been said, there’s more than one way to evaluate what my former General Manager calls IHP. As a matter of fact, I quite often couch my question in a coding question. For instance, I may throw a quick and dirty pointer question at a candidate (to see the depth of their CS skills—and you’d be amazed at how few candidates can code with pointers… it’s a shame), and then I’ll follow it up with a question like “Design a tool which takes in two string arrays – one is a list of words, the other is a crossword puzzle—and evaluates whether all the words are in the puzzle.” That’s an engineering challenge – seems straightforward, but it’s actually a puzzle.

I am not afraid to throw the puzzle question at people either. Yes, most of them are ah-hah questions where there’s only one answer and you either get it or you don’t. I try to stay away from them, although I don’t mind them too much. Very much like my coding or testing question, I’m less concerned with candidates arriving at the answer and more concerned about them showing critical thinking abilities, the ability to step back and re-evaluate a solution, and a little bit of thinking outside of the box.

What are some sample questions? How Would You Move Mount Fuji is a great source for these. Some I like are the bridge over the chasm (four people need to get across the bridge, it’s dark, and you only have one flashlight—how fast can you get them all across?) or the miner stealing an ounce of gold for every pound. There are thousands of these questions though. And anyone who thinks they aren’t fair should know that I actually run them past my Boy Scouts and they can solve them (sometimes with a little hand-holding, I’ll admit – but these are 12-year old boys!).
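
If you want to check your answer to the bridge puzzle, a brute-force search works. This sketch assumes four distinct crossing speeds, at most two people per trip, and a pair moving at the slower person's pace:

```python
from itertools import combinations

# Exhaustive search of the bridge-crossing puzzle: minimize total time to get
# everyone across, with one flashlight and at most two crossers per trip.

def best_time(speeds):
    everyone = frozenset(speeds)  # assumes distinct speeds

    def solve(near, torch_near):
        if not near:
            return 0
        if torch_near:
            if len(near) == 1:
                return min(near)  # last person walks across
            # send a pair across; the slower one sets the pace
            return min(max(a, b) + solve(near - {a, b}, False)
                       for a, b in combinations(near, 2))
        # someone on the far side brings the flashlight back
        return min(solo + solve(near | {solo}, True)
                   for solo in everyone - near)

    return solve(everyone, True)

print(best_time({1, 2, 5, 10}))  # 17 - beats the "fastest person ferries" 19
```

The non-obvious move the search finds is sending the two slowest across together (1&2 cross, 1 back, 5&10 cross, 2 back, 1&2 cross = 17), which is the 'ah-hah' most candidates miss.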

In evaluating, be sure to avoid the emotional evaluation. Don’t be overjoyed or think the candidate is a hire just because she got the answer. Stop and think about HOW they got there. Did the candidate just stumble on the answer? Did they guess? Did they get there so fast that it’s obvious they knew the question in advance?

Look for critical skills. Can they break the problem down? Do they arrive at an initial answer, but then continue to probe looking for a more elegant solution? Are they organized in their approach? Are they cool under the pressure? (That’s a lot of it for me – I am looking for candidates who can tackle a tough problem—with a senior manager in the room.)

Test scores and higher-ed grades are also a good indicator of performance. Keep an eye out, though, for the brainiac with no common sense. You probably don’t want them unless you have some really deep experimental research type of effort (cryptography?).

In the end, I’m looking for a balanced candidate. A super-smart person with poor coding skills isn’t as helpful to me. A hard-working coder who can’t think his way through a problem might help me short-term with the challenge I’m facing at the moment, but I’m not convinced they’re a good hire. They might make a great contractor, but I’m only looking to hire bright people who will grow with my challenges and needs.

Wednesday, September 5, 2007

Recruiting for Potential

Potential. OK I admit it, I left potential out as one of the hard characteristics to probe for. What is potential, anyhow? Well, one definition is that it’s the ability for a candidate to progress to higher levels of contribution within your organization. If hired in as an entry-level test engineer, could the candidate progress to a technical lead? Can she become a test lead, test manager, or even an engineering director?

So how do you gauge this? It’s not as straightforward as measuring, say, technical skills or the ability to tear apart an application into test questions. But there are some guidelines I like to use in my recruiting which might help you.

First of all, I’m looking for a level of maturity about career opportunity. I don’t expect a college candidate to be as realistic or mature about where they want their career to go as I would, say, a five-year veteran. However, I’m still looking for a candidate who wants to progress and who wants to make a difference. I’m looking for a candidate who’s evaluating my opportunity on the basis of where she can go next and what skills she can gain. I’m also looking for a candidate who’s thinking about how they can contribute. A campus hire isn’t necessarily going to change the way my department does everything, but they may well come in with new coding skills or a process improvement they picked up during an internship.

Next thing to think about is what have they demonstrated in the past in the area of improving their abilities? I will ask things like “What technologies have you learned recently?” or “What have you had to come up to speed on in your workplace?” For me, I’m always needing to learn something. After 11 years at Microsoft , I am facing a huge wall of IT skills now that I’m in an Oracle and IBM heavy environment (more on that someday…). So I’m learning all about open source test tools, getting up to speed on Oracle SQL and I’ll probably have to jump into Java programming (sigh). But I do what it takes to keep on top of the technologies behind my projects—I’m a professional.

There’s something next which is really tough to describe, let alone measure in a candidate. Put simply, does the candidate ‘get it’? Do they understand the role of IT in the org they’re interviewing for? Do they understand issues or challenges they may have faced (or that they face) in previous positions? Are they all blame and no responsibility? Are they open and introspective about their failures, or do they continue to point the finger at outside influences? Do they show the mental maturity to make decisions on a team or group-wide level?

As not all candidates want management to be in their future, you also need to probe for deeper technical skills. Do they have any experience in distributed programming environments? Do they understand that their projects have broad impact? Can they see, for example, how an automation framework they worked on in a previous team could have been used across their entire organization?

It’s a challenge to measure a candidate’s ability to grow, but hopefully each candidate is going to be with your group for a very long time, so you want to be sure you’re hiring people who can grow over the long term and won’t just consume oxygen.

Recruiting for Testing Skills

I’m a test manager, so ultimately my interviewing is to find candidates who can test – who can find bugs, prevent bugs, and write tools to help in those processes. Testers have a special mentality—seriously, I’ve interviewed (and worked with) plenty of solid development engineers who couldn’t test their way out of a paper bag. It’s a special person who can code AND test. The good news is, finding a person with a ‘break it’ mentality is pretty easy. Probably the easiest of all the characteristics I interview for.

The best tool I’ve found for this is to simply throw the candidate a test question. It can be something theoretical, or it can have a real-world application. One of my favorites is to test a function which receives three integers representing the three sides of a triangle, and returns whether the triangle is equilateral, isosceles, scalene, or invalid. This question works really well; it presents the candidate with a real world situation (who remembers what an isosceles triangle is anymore?), it has a ton of effective cases, and it can be easily moved into additional problems such as becoming a web service or the likes.
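
As a sketch of what the candidate is testing against, here's one plausible implementation of the triangle function plus a few of the equivalence classes I'd hope to hear (this is my illustration, not the one "official" answer):

```python
# One plausible implementation of the triangle-classification question,
# with a sample of equivalence-class test cases a candidate might enumerate.

def classify(a, b, c):
    sides = sorted((a, b, c))
    if sides[0] <= 0 or sides[0] + sides[1] <= sides[2]:
        return "invalid"          # non-positive side, or triangle inequality fails
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

cases = [
    ((3, 3, 3), "equilateral"),
    ((3, 3, 5), "isosceles"),
    ((3, 4, 5), "scalene"),
    ((1, 2, 3), "invalid"),   # degenerate: 1 + 2 == 3
    ((0, 4, 5), "invalid"),   # zero side
    ((-3, 4, 5), "invalid"),  # negative side
]
for sides, expected in cases:
    assert classify(*sides) == expected, sides
print("triangle cases passed")
```

A strong candidate keeps going well past this list - permutations of side order, maximum ints, non-integer input once it becomes a web service, and so on.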

What’s the right answer? Well obviously candidates are going to have to catch the positive cases of each type. But from there the sky’s the limit—I’m not going to list all the cases here but suffice it to say that I was once interviewed by a test manager and given this question. We went on for a good 30 minutes and could have continued. And that’s one of the key points I’m looking for – does the candidate throw out a series of cases over a few minutes and give up, or does she keep going? A real tester should be able to run on and on and on with new cases, if your question lends itself to this. That’s why the question is so important.

So I’m looking for my candidates to just keep generating test cases. I’m looking to make sure they cover the basics – positive cases, boundary cases, etc. – and I’m going to keep an eye out for candidates who aren’t equivalence-classing their cases. A candidate who tells me 1,1,1; 2,2,2; 3,3,3; and 5,5,5 isn’t getting anywhere; there’s no meaningful difference between any of these cases (well, maybe 1,1,1 versus the rest). So rambling along with repetitive cases is no help. But going on and on and on with distinct cases is definitely key.
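To make the equivalence-class idea concrete, here's a minimal sketch of the triangle function plus one representative case per class. This is a Python illustration of the classic variant of the question; the function name and return strings are my own, not anyone's interview answer:

```python
def classify_triangle(a, b, c):
    """Classify a triangle given three integer side lengths."""
    if a <= 0 or b <= 0 or c <= 0:
        return "invalid"
    # Triangle inequality: each side must be shorter than the sum of the others.
    if a + b <= c or a + c <= b or b + c <= a:
        return "invalid"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

# One representative per equivalence class beats a pile of near-duplicates:
cases = [
    ((3, 3, 3), "equilateral"),   # all sides equal
    ((3, 3, 5), "isosceles"),     # exactly two sides equal
    ((3, 4, 5), "scalene"),       # all sides different
    ((1, 2, 3), "invalid"),       # degenerate: a + b == c (boundary)
    ((1, 2, 10), "invalid"),      # clearly violates triangle inequality
    ((0, 4, 5), "invalid"),       # zero side
    ((-3, 4, 5), "invalid"),      # negative side
]
for sides, expected in cases:
    assert classify_triangle(*sides) == expected
```

Note the degenerate case where one side exactly equals the sum of the other two: it's a distinct class from the obviously-invalid inputs, and it's the boundary that's easiest to forget.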

Another thing to look for – is the candidate organized? I’ve interviewed alongside a number of leads and managers who claim no candidate makes the cut unless they’re organized in their approach. I disagree with this somewhat—a candidate who, through free association of thought, jumps from performance to security to positive test cases isn’t necessarily incompetent. As long as candidates 1) cover all those case types and 2) make sure they get the required coverage, that’s all I’m looking for. I happen to be rather random in my own test case generation. I may use an outline, but I’m plugging cases in all over the outline rather than working top to bottom. It’s simply how I think. Don’t mistake the trigger of association for disorganization.

Another thing I’m looking for is the candidate’s ability to think like a user. In testing a problem like the triangle evaluation, are they coming up with good use cases and showing customer empathy? I consider myself a technical guy, and I’m always looking for good perf bugs or bad data manipulation or the like. But the most recent bug I wrote up (just yesterday) had to do with how difficult it is to perform a certain workflow procedure in our line-of-business application. Nah—it’s not engineering. It’s about solving the problem the right way for the customer. In our organization, the IT department has made huge progress but still has room to go in winning over customers. Many groups are still building their own solutions rather than using our expertise. Fixing weird usage-scenario bugs like this has an amazingly positive effect on the department’s reputation for being client-centric.

So don’t just focus on whether the candidate got the obvious cases. Look deeper and evaluate for thoroughness, for customer empathy, and for the ability to derive great cases out of what seems like a straightforward and obvious tool. Your department will benefit greatly from this kind of test resource!

Tuesday, September 4, 2007

Recruiting for Skills

One of the easiest characteristics to recruit for is hard skills. At the end of the day, I need a person to code or test—since that’s what they’ll be doing all day, there’s no better way to probe than to have candidates actually code. I was a bit surprised during my recruiting in India to see how many companies administer a coding question in the form of a seated, written test. There’s probably some value in that, because it does allow the candidate to prove their skills, but a written test is such a sterile environment. When I probe for coding, I’m looking for several things at once: can the candidate break the problem down appropriately, can they get the logic right, are they looking to refactor or improve their code, and are they thinking about performance and scalability? In a written test, the candidate does all that thought in a vacuum – all I get is the final code. So a written test can be used as an initial weeding out, but don’t rely on it as the sole measuring stick.

In an interview, I’m dealing with limited time—usually 30 to 60 minutes. Because I don’t have all day to get through a question, I need to pick something that will let candidates demonstrate as many skills as possible. I have, therefore, developed a few standard requirements. First, if I’m interviewing for C or C++ developers, my questions do not allow candidates to use outside libraries. I once had a candidate yell at me – literally – about that restriction, but I don’t care. Anyone can consume a library – but can they write clean, elegant code on their own? I often ask them to implement a library function—for instance, one question I like to ask is reversing the characters in a string (I usually throw in a twist, such as reversing the characters in each word while preserving word order). There are some very simple bars to measure by: does the candidate use pointers or temp arrays? Memory isn’t constrained any longer, but a clean coder is still concerned. How complex are their logic statements? Are they nesting double and triple negatives together? Do they over-complicate things? Are they getting lost in their own algorithms? How confident do they appear? Do they tackle the problem without thinking, or do they put together an approach in their mind first?
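As a sketch of what a clean answer to the word-reversal twist might look like (in Python rather than C, but keeping the spirit of the "no outside libraries" rule by using explicit index swaps instead of split/reversed helpers; the function name is my own):

```python
def reverse_each_word(s):
    """Reverse the characters of each word in s, preserving word order.

    Uses explicit index arithmetic and an in-place two-pointer swap,
    mirroring how the C/C++ interview version would be written.
    """
    chars = list(s)
    n = len(chars)
    start = 0                       # index where the current word begins
    for i in range(n + 1):
        # A word ends at a space or at the end of the string.
        if i == n or chars[i] == " ":
            lo, hi = start, i - 1
            while lo < hi:          # swap inward from both ends of the word
                chars[lo], chars[hi] = chars[hi], chars[lo]
                lo += 1
                hi -= 1
            start = i + 1
    return "".join(chars)

print(reverse_each_word("hello world"))  # olleh dlrow
```

Even in a sketch this small there are bars to measure by: does the candidate handle the empty string, a single word, and the word that ends at end-of-string rather than at a space?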

I really don’t care about the solution they ultimately arrive at (for the most part). I’m mostly concerned that they write clean code, they solve the problem, and they do it with quality. I’ll take a conscientious coder who returns to his algorithm once, twice, even three times during an interview over the cocky applicant who codes quickly and sloppily and just abandons the code when it’s ‘finished’.

Saturday, September 1, 2007

Recruiting for Passion

As I mentioned, in my recruiting I’m looking for six key characteristics: passion for quality, skills, break-it mentality, potential, fit, and intellectual horsepower. How can you tell if a candidate has these traits, for sure?

I think that of the six attributes, passion and fit are the toughest—they’re soft skills, and soft skills are challenging to measure. I can gauge a candidate’s coding skills by having them write code. I can probe for intellectual horsepower with a few challenging puzzle questions. I can count the test cases a candidate comes up with, and I can see how far a candidate will go in their career. But knowing a candidate’s true passion and measuring how well they’ll fit into my organization are much more difficult. Still, there are a few keys I use in my interviews, and I feel I’ve been right many, many more times than I’ve been wrong.

When I look for passion, I’m looking for a candidate who’s thrilled about technology. As my current CIO pointed out recently, a passion for quality isn’t necessarily a good thing in itself – what matters is putting the passion in the right place. Engineers are geeks – we get into weird things. We like to program to relax; need I say more? We’ll spend an hour arguing over the purest object-oriented method to serialize an object! That’s passion, but it doesn’t necessarily make for a good tester.
The passion I’m looking for is applied technology. Great candidates are just as excited about pure object-oriented code or simple implementations, but they want to see them put to use in a setting where technology is solving a problem. That’s the key to me – can the candidate recognize the value of technology in a given situation? I once had an acquaintance who took time from his family to head into the mountains and write haikus—then burn them. Poetry that’s never shared is a waste, and technology that has no purpose is a poor use of a department’s time and money.

Questions I like to use to probe for this passion give candidates an opportunity to talk about a problem they see and how technology can solve it. Some questions I use to get candidates talking include “What’s your favorite software product?” or “If I gave you a million dollars (or a blank check) to form a team and build a software product, what would it be?”. I’m going to pay particular attention to the problem they set out to solve, and less about how they’ll solve it.

For instance, my favorite application is Microsoft Visual Studio 2005. Look, I’m not the greatest coder in the world—I’ll admit it. But VS 2005 gets even someone like me moving quickly. The IDE helps me get everything done fast – I get squigglies when I mess up, I get F1 help on pretty much any code snippet I highlight. And finally (finally!!!) I get help lining up my curly braces. The productivity contained within that application is amazing—it is to development what Word was to word processing. My productivity increases a hundredfold (or close to it). Pure coders will scoff at me, and that’s fine. There are a ton of engineers in this world who are highly productive with a bare-bones coding environment – more power to them! But when it comes to getting the masses to get their work done, nothing can compete with VS.

Now, you can take issue with my passion for Visual Studio. But hopefully it’s a great example for passion for technology. The problem? It takes a long time to code up applications. The solution? Visual Studio, .NET frameworks, and all the other productivity tools that come with the application. There’s a clear problem and a clear solution.

What if someone offered me a blank check and told me to build any product I wanted to? It’d be a dream! The biggest opportunity (which is a key word for problem) I see for technology is small business. Enterprises have a ton of great software out there, but small business owners suffer terribly. Most solutions for their businesses are garage-built, one-off applications which have been distributed widely—procedural spaghetti code built on proprietary technologies, with little or no ability to adapt to modern platforms such as the Internet or handheld devices. The company that comes up with a method to apply common code and technologies to various small-biz verticals will have really solved a major problem. (BTW: if you want to talk further about this opportunity, contact me via e-mail and let’s chat – I have some answers…).

Hopefully I’m making my point clear. A candidate who goes on and on about the latest extreme programming examples or who talks about the purest object-oriented code might be a brilliant developer, but can they actually contribute in the real world? There’s no telling. But a candidate who can see how technology can be applied to day-to-day business problems is someone I want on my team.

I’ll give you a couple of real-world examples. I hired a tester in India for an engineering team I ran (I don’t even remember the problem I presented him with). This guy is a super-geek! I once asked him if he had any hobbies, things he does outside of work… His answer was “Yeah, I like to read up on coding on the Internet.” To be frank, I think he can code circles around most of the developers on our team, and I’m sure glad he chose test, because he single-handedly took us from 15% to 90% automation on our test cases in a matter of weeks. He had a code answer for every problem we faced in testing. He’s passionate about applying his coding skills.

Another example is my SDET lead on Education Products. She is a brainiac—one of those people you’re afraid of because she’s so smart. I could hit her with a problem we faced in testing and she’d literally build a solution in a matter of hours. It doesn’t hurt that her code looks like poetry in C#, either. She is methodical and organized in her coding and produces consistent, reliable solutions.

So the best candidates out there combine hard skills with real-world application. Finding candidates like this to join your team makes a world of difference because they’re using their skills to solve problems. Whether you’re out to make money in product software or to cut costs and increase efficiency in a business, you’ll only benefit from applied technology—and from hiring people with a vision for applying technology to solve real problems.

Friday, August 17, 2007

Recruiting & Hiring

You know, I was thinking... In my new role in QA at the LDS Church, I'm really beating the bushes looking to hire. We have an extra pair of challenges - the position is in Salt Lake City (non-negotiable) AND candidates must be members of the LDS Church, in good standing. That adds a factor of difficulty for sure!

In my time at Microsoft, I think I probably did 300 to 400 interviews, in the form of campus screens or full-length interviews. At Circuit City, I was surprised at the caliber (or lack thereof) of many of our contract test candidates, and frankly some of our full-time hires--they were nowhere near the bar I had set before. I eventually had to settle when hiring contractors, because Circuit City was well down the project path before I got there and there was no time to find the best candidates. But in each organization, there were also people who really stood out. What made them so special? Is there a way to find people of that caliber in the interview process?

As test managers, what are we looking for in hires? There are about six characteristics I look for during a phone screen or an interview (and, by the way, I usually get a feel for these in the first five minutes--see the book Blink for a discussion of the split-second decision):
  1. Passion: do they have a passion for technology, and do they have a passion for testing? I'm not necessarily looking for candidates who are total geeks and know everything there is to know about DIVX or the latest XBOX game. That's good, but what I want is someone who's passionate about the technology they work with. Do they see where technology can make a company more efficient? Do they see how it can change someone's life? And I don't want a tester who's in test simply because they didn't meet the bar for development. Chances are, if that's the case, they aren't going to meet my bar either. So I need to weed through all these candidates to find the folks who are passionate about technology and about driving quality into tech projects.
  2. Skills: a successful candidate has to have something really special about their skills. At Microsoft, by the end of my eleven years the company was only hiring 'developers' into test roles. The argument was that a team full of automators would be more efficient than a team of interactive or UI testers. Eh... not sold, personally. If there is any absolute in technology, it's that there are no absolutes! So I'm looking for someone who's going to have something great about them - maybe they are bug machines, focusing on interactive testing but simply tearing up the application they're working on. Or they might be great coders, able to solve big tech challenges. Whichever - my requirement is that they have something they are great at. Oh - and it has to match my team's needs... right now, I have an incredible interactive tester who can script-automate much of his testing. I'm really hurting for that incredible developer who can test as well.
  3. Break-it mentality: no bones about it - the candidate has to prove to me in an interview that she can take a sample application and test the snot out of it. What's a break-it mentality? Well, I'll give you an example... I would present candidates with various test questions when I was recruiting in India. The candidates who I felt would make (at least) good testers were the ones who went on and on and on generating test cases. I really didn't care what the sample question was, and it never really matters. As long as the question is sufficiently complex that it has more than 20 or so cases, and as long as the candidate just spews out cases non-stop, you'll know. For the record, I interviewed 175 engineers in India and hired 25 (dev + test). Finding *great* engineers is a challenge, no matter where you go.
  4. Potential: I won't make a hire if I don't think the candidate is going anywhere. I'll probe for things like career growth or even challenge and goal setting. I may hire a person who flat out tells me he never wants to be a test manager--that is, if he demonstrates to me that he's been growing in his career to-date, and he's going to continue to grow. NOTE that this matters less when I'm staffing a contractor. I'm hiring contractors to tackle and finish an immediate job...
  5. Fit: finally, the candidate has to fit. If he or she doesn't fit on my team, well, it's a waste of everyone's time. A candidate in a poor fit sucks up management's time, peers' time (in the form of gossip and complaining) and their own time in terms of effectiveness. Co-workers are less willing to collaborate, and the square peg is left to do everything on her own.
  6. Intellectual horsepower, problem solving, and other skills: the final area I look at is a big lump of 'soft skills'. These are things like problem solving, communications, and sheer intelligence. If you've read How Would You Move Mt. Fuji, you've been exposed to the argument that looking for these skills is a Bad Thing, that you miss good candidates that way. Phooey! If I throw a puzzle question at a candidate, believe me arriving at the answer is about the last thing I'm looking for. In a good problem-solving question, I'm giving the candidate an opportunity to show project management skills, demonstrate the ability to make trade-offs, prove they can think on their feet and that they can keep their wits about them at the same time. Critical for an entry-level QA engineer? YES!! First of all, in my teams I expect that engineer to be as outspoken as my automation lead with five years' experience. Secondly, this also shows the candidate's long-term potential. If they can't think on their feet or solve problems, they might make it through their first year, but when growth is expected they're actually going to fall flat!

Next few posts will be how I probe for each of these areas.

Monday, August 13, 2007

What's the cost of quality?

What is the cost of quality? Well, it's definitely an investment and, like all investments, it can have a varying rate of return. I like to look at cost in three 'investment' sizes:
  • Minimal investment: like the 'day trader' who is only willing to trade with $25, or $100, there's the minimal investment philosophy. The thought here is that quality investments are high-risk or low-return investments. The less squandered, the better. This is the organization that says test should only prove the positive case--if a business requirement exists for a given project, that requirement and ONLY that requirement should be tested.
  • Limited investment: this is an individual with limited resources who invests wisely but clearly can't invest across the board in a broad, distributed portfolio. This kind of organization scrimps in areas which may cost down the road, but those 'savings' are targeted and based on some sort of strategy. Safe limited investments might include short-cutting the authoring of test cases, risking a lack of portability or re-usability. The cases are used to ensure the initial and subsequent pre-release test passes are completed, but are not expected to be used post-release. A new web feature which is expected to remain static going forward would be a safe limited investment, for instance.
  • Full investment: this is a wealthy individual with the luxury of spreading her wealth across multiple portfolios and sectors. It takes money to get money and this investor is deeply rewarded. Not many people have the options she does, though. This is the organization which invests heavily in the QA process: in-depth test case development, lots of automation, plenty of lower-priority fit-n-finish test cases, and multiple rounds of testing.

So which one is right? Well, it all depends (don't you hate that about testers? We're like lawyers--we seldom speak in absolutes!). I have a hard time thinking of a time when the minimal investment is appropriate -- perhaps when the project is an internal-only proof of concept which will never see the light of day and NEVER be the foundation (base code) for the actual product. But when millions of dollars and hundreds of thousands of subsequent man-hours of development are riding on your project, this is just a stupid approach. It's short-term thinking; it emphasizes savings over every other consideration. It's the penny-pinching fool who buys a 4,000 SF house decorated with paper trim. It'll never last!

So the real choice is between two and three - limited and full investment. At Microsoft, I participated in projects which were years in duration (some went years beyond their original ship date - I know, I know...). These projects saw repeated upgrades on top of existing code. Millions of users bought and used the products we produced, and therefore we invested heavily. It was the right thing to do! I can't think of a single release I was ever involved in where I said "We over-tested that...". I can sure think of a number of releases where I wished I'd had more time - even if it was just to automate, to benefit future releases!

At the same time, some organizations are working on web components which (for the most part) will release once and may have one or two maintenance releases. They aren't foundational code; each compiled applet is relatively standalone. On projects which have a very small chance of being revisited, is it safe to only write brief test case descriptions? If the functions are tested thoroughly, but little investment is made in repeatability or portability, is it OK? Is it the right business decision? I think so! If a team is building foundational blocks like a content management system or the business object layer, well, they'd better be sure to spend much more time QA'ing it. But if the work is for one or two releases, I think it's OK--actually, it makes the most business sense--to cut short on the test design/documentation and focus on execution.

How about you? What do you think is the right balance, the right investment? Can't wait to hear your responses!

Another blog on quality

Why do we need another blog on quality? Well, because I still don't think managers get it. Quality matters! I've worked at four different organizations now (I know - not much variety, but bear with me) and eleven different products/teams. I've seen teams where quality mattered (Microsoft Education Products Group - India), and I've seen companies where consultants convinced management that quality could be achieved on the cheap (IBM @ Circuit City). So I'm adding my voice into the fray, and I hope the arguments and discussions here will help change the tide a bit.

What are my strengths/interest points?
  • The business case for quality and thorough testing of IT software projects
  • Recruiting and hiring
  • Outsourcing and offshoring
  • The test process--getting consistent

Your comments are welcome.