Friday, May 30, 2008

Advice to Developers from a Test Manager

A good friend pointed out that a blog post offering advice to developers from an experienced test manager would be helpful. With 13 years of tech experience now, I have a good idea of what works and what doesn't.

Most Successful Projects

I'm basing my advice on my experiences in the most successful projects I have worked on. These projects have been released on time, have had the highest possible quality, have thrilled customers, and have done so with minimal pain to the entire engineering team. These projects themselves are:

  • Microsoft Server ActiveSync: synchronization layer between Microsoft's mobile devices (Pocket PC and Smartphone) and Exchange Server.
  • Microsoft's Learning Essentials for Microsoft Office, a free enhancement to Office to help tune it more for the needs of students and teachers.
  • The LDS Church's It's About Love adoption site (unreleased, sorry!)

So let's jump in, shall we?

Collaboration

Successful projects all share the element of collaboration between development, test, and program management/product design. There's a feedback loop in this relationship which results in higher quality. Dev can provide input into design tweaks which might result in significantly lower development effort. Test can provide feedback which enhances testability and reliability.

During the engineering process, this collaboration continues both in refinement of the feature list and of features. That close relationship between dev and test also helps improve quality through quick defect remediation as well as mutual support in defect detection. When dev and test work together (as opposed to dev working in a vacuum and throwing something over the wall), invariably the result is fewer bugs, more bugs fixed 'right', and a significantly shorter bug turnaround.

I want to stress this point. It is so very critical for bugs to be detected and fixed early on. The longer a bug remains in the project, the more code there is which is written around that bug. It becomes like a sliver under the skin - the code festers, the area becomes infected, and removal and healing are hindered. This quick turnaround simply can't be called out enough. It's definitely a must-have.

An additional aspect of collaboration is seen when dev and test work together to find, investigate, and remediate defects. Frequently this takes the form of "Hey Tester - I checked in some code today, it's all passing unit tests but I'm a bit concerned about it." Just a heads up generally points a tester down a different path than they may have planned to take.

Paired investigation can be so critical. In It's About Love, we have found a series of strange performance issues. I've got the time, experience, and tools to find the bug whereas my developer has the experience to tweak his code to find potential solutions. It's a symbiotic relationship. Without this paired work, he'd be blindly trying to fix something, throw it over the fence, and get it back the next day as a failure.

I list collaboration first because it is the key. Everything else you do to improve your product and code will stem off of collaboration, so you had better learn this skill if you want to be an effective, efficient developer.

Trust Your Tester

Ah, trust... that word of words, that concept of concepts. I climb mountains, or at least I did until my kids got older. Now we hike, and someday we'll climb again. It's taught me a lot about trust. When you are scaling an ice field, your life depends on your skill and strength. It also lies in the hands of the person at the other end of the rope. You learn to trust your partner implicitly--you have to!

Some developers look down their nose at testers. Maybe it's because the developer feels they have more experience writing code. Maybe it's because the developer has a degree in computer science and the tester does not. Maybe it's because the developer feels he or she 'won the lottery' and the tester did not. It may even be that the developer feels like they are in a master/servant position. Believe me, these feelings will not help your project, not in the least.

Hear me on a few things:

  • There's no master/servant relationship. The tester does not exist to beat quality into your rushed, careless code. If this is your attitude, the first time you run into a testing professional, you are in for a rough time. There's nothing sadder, in my opinion, than a dev who rushes through their code, cannot be bothered by incoming defects, and then throws the whole mess over the fence to a tester to root out the issues. That's pure laziness, it's pride, and it's a lack of professionalism. If this is your approach to engineering, you seriously need to rethink your value to your organization. You also need to step down off your high horse and start to think about the value in a team approach. You need to learn to take pride in your results - not just that you threw together a bunch of web pages or produced an application thousands will use. Will they enjoy their experience? Is it the best thing you could have possibly built? If it isn't, wouldn't you rather learn to make it that way? Western society is built on the concept of pride in outcome, of making the highest-quality products. Don't push together a bunch of crap, throw it over the fence, and expect QA to work it out. Or worse, don't get annoyed when you produce a lousy piece of software and testing has the audacity to find a bunch of bugs in it! Just because it compiles and passes your unit tests doesn't mean it's worth the cost and effort spent on it!
  • There's no lottery. You may be performing your dream job, and you may think being a tester would be the worst thing that could happen to you (equivalent to, say, working a cash register at McDonald's). Believe me, there are a lot of 'testers' who would rather be developers. But most testers have chosen this career path, and they are as passionate and excited about being a tester as you might be about being a developer. Think about where the US Space Shuttle would be without QA, or the Lexus automobile... QA professionals love their chosen career. I, for one, would rather drive a city bus than be a developer. Don't get me wrong - I love to code, it's fun. But to sit in front of a blank screen day in, day out, doing the same thing over and over and over... Yuck! I don't know how you do it! I wouldn't trade places with you for anything. So you may be happy in your role, but believe me - there's no need to pity a tester, nor to look down your nose at one.
  • Your degree doesn't mean much. I once interviewed a candidate for a development position. He had a Bachelor's in CS from a prestigious, nationally recognized school and had won multiple national programming competitions. Unfortunately for him, he had no clue how to write elegant code. I have probably interviewed a thousand candidates, from PhDs to Bachelor's in CS, and I've hired maybe 50 or 60 total. Your degree is evidence of a lot of hard work (well, maybe...) and definitely of perseverance, but it meant nothing the day you left school and started working for pay. I have a degree in German Literature and International Relations, but I can roll up my sleeves, dig into code, and push quality into a project. It takes so much more than a degree in CS to make a good programmer, and some of the best programmers I've known don't even have CS degrees, or earned them as an afterthought. Sure, you gained some experience when you took your CS classes, and you got some exposure to programming theory. But what counts in software is experience, intelligence, and diligence. And your tester can have all of that and never have received a degree, or could have a PhD in physical therapy. Learn to trust the experience. I once had a developer rip a test plan up one side and down the other because it had holes in it--that's great. But her excuse for not wanting to discuss the plan was "I have five years of experience - believe me, this is wrong." Well, that's just great, but the point is, she lost me there. At the time, I had 12 years of experience. I may not have known the ins and outs of the code (I was new to the project), but I knew quality, and I knew we weren't approaching it right.

So when you're thinking about your project and questioning whether you should let that pesky tester into another meeting, don't listen to that voice that says you're better off without her. She's going to bring experience, passion, and perspective you simply don't have.

Stick to the Basics

I've been learning a universal truth in life: things generally don't happen all at once, but step by step. It truly is the little things that matter - it's the basics that count. You know, there's a reason why so many teams are jazzed about Agile, XP, and Scrum. In many instances, these disciplined changes introduce quality and drive projects to a faster, more successful completion.

A lot of developers have glommed on to Agile like batter on a bowl. They love the 'freedom' of no rules, no documentation, no meetings - just pure coding. And who wouldn't? Unfortunately for them and for their projects, they don't understand that there are rules in Agile. Agile is based on XP, and there are fundamental rules in XP: you don't move ahead without stabilizing what you've got, you don't cut corners, you work in pairs, you don't write code you don't need. These are basics that, when they are ignored, spell doom for your project.

As a test manager, my job--my very reason for being--is to make sure the project completes with the highest quality. I'm going to be a stickler for the basics. I once worked two concurrent projects. They both started at about the same time. They were both green-field projects (white-monitor projects?). They were both staffed with some of the best people in the organization. One project was a short-term, single-iteration effort. The other was a three-cycle, year-long effort. Both projects held frequent scrums where they went around the table and talked about progress. But that's where the similarities end.

The difference in approach was astounding, in spite of all the similarities. The team on the short project cut corners. When they encountered a roadblock on a given user story, rather than spiking in and working through it, they moved on to another story. Soon, 95% of the stories in the project had been started, and none had been completed. No master build was produced; devs were checking in code after self-hosting it. No unit tests were developed ("we're too busy coding to write tests" was the excuse). Critical infrastructure tasks such as URL rewriting or SSL were excluded from the development phase because they were 'operations' tasks. Test was uninvolved because 1) test was understaffed and 2) there were no builds to use.

The other project was difficult to start. Testing was involved from day one, and we insisted on adherence to standards. Ironically, we discovered that, while .NET web services have strong typing out of the box, Apache CXF does not. We had to figure that out. We had performance issues. But we dug in deep and worked hard - development put in an incredible showing. Test did as well, in spite of being understaffed.

Can you guess the results? The first project released, barely. In the final four weeks of the project, official builds were produced and literally 350 bugs were opened. The code churn was unimaginable. Regressions were high--we were at about 2:1 (for every two bugs regressed, either one was reopened or a new bug was found). The product was released, but came in a month late and still ended up in a severely restricted beta. Developers worked insanely long hours, rather than cutting features, because no one story was less than 75% or 80% completed; there was nothing obvious to cut. The program manager got beat up over the project. I'd call it anything but a success.

The year-long project? Cycle 1 of 3 was a solid foundation. Cycle 2 forged ahead - there were some challenges integrating, but two weeks of extended time worked through all that. There are performance issues yet to address, but Cycle 3 is going strong, the project looks great, and the team has really banded together.

Don't ignore the basics. You're not immortal, omnipotent, or above the basic rules of engineering.

Take Pride in Your Work

Related to focusing on the small things is the concept of taking pride in your work. I have to laugh at how many times, in one of my web application projects, I have entered a bug against the application which needed to be fixed for Firefox or IE. The developer would quickly code up the fix and throw it back to test. The first thing I would do is regress the bug in the browser in which it was found. That generally worked like a charm. The second thing I'd do? Test it in the other browser. 5 times out of 10, can you guess what happened? It was broken.

No developer worth their salt should be beyond testing their fixes in multiple browsers. In the open source, Java/Oracle-heavy environments I've worked in for the past two years, I've seen a lot of Firefox fans in engineering organizations. That's cool - I like it too. But generally only about 30% of your clients are using Firefox - that means about 60% of them are using IE 6 or IE 7. If you fix a bug in one browser, you still have to look at it in the other two. It is astounding to me that, after 5 or 10 of these bugs have been bounced back, developers still don't get it.

If you are going to spend time fixing a bug, take pride in your work - fix it all the way, and make sure it's fixed before checking in your change!
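The habit is mechanical enough to sketch in code. Here is a toy harness (the render functions below are hypothetical stand-ins, invented for illustration; in practice you would drive real Firefox and IE sessions with an automation tool): the point is simply that a fix gets verified in every supported browser before check-in, not just the one the bug was filed against.

```python
# Hypothetical stand-ins for real browsers -- for illustration only.
# In practice each entry would drive an actual Firefox or IE session.
def render_in_firefox(page_html):
    return page_html  # pretend Firefox renders the markup as-is

def render_in_ie6(page_html):
    # pretend IE6 drops an unsupported hook, as old IE often did
    return page_html.replace('class="fancy"', "")

SUPPORTED_BROWSERS = {
    "Firefox": render_in_firefox,
    "IE6": render_in_ie6,
}

def regress_everywhere(page_html, must_contain):
    """Check the fixed page in *every* supported browser.
    Returns the list of browsers where the fix does not hold."""
    return [name for name, render in SUPPORTED_BROWSERS.items()
            if must_contain not in render(page_html)]
```

With this toy model, `regress_everywhere('<div class="fancy">menu</div>', 'class="fancy"')` comes back `["IE6"]`: the fix that "worked like a charm" in the browser it was filed against is still broken in the other one.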

Value Your Tester

OK - so you're doing XP and you write a lot of unit tests. That's great! Congratulations, you've taken the first step to becoming a great developer. But now it's time to recognize that there are still more important things to do. You need to realize that quality doesn't end when you finally pass your unit tests--just as, when building a house, you're not finished framing when you cut the two-by-four.

Testers bring a unique perspective because they generally think in holistic, systemic terms. As a developer, you're thinking (or should be thinking) in terms of methods and functions, services, etc. A tester is thinking in terms of code and database, interface layers, and user interaction. You tie methods and functions together. A tester makes sure entire systems work together.

Want an example of this? Your unit tests stub out data coming in and out of a database, to isolate the code layer. You prove your function works, but you don't prove it works with your database. Testers deal with real-world data, and they make sure your code works when that data really does come from the database.
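A minimal sketch of that gap, in Python (the `UserStore` class and `get_contact_line` function are hypothetical, invented for this example): the unit test passes with the database stubbed out entirely, which proves the formatting logic and nothing about production data.

```python
from unittest import mock

class UserStore:
    """Hypothetical data-access layer; in production it talks to a database."""
    def fetch_email(self, user_id):
        raise NotImplementedError("hits the real database in production")

def get_contact_line(store, user_id):
    """Code under test: formats a contact line from stored data."""
    email = store.fetch_email(user_id)
    return "Contact: %s" % email.strip().lower()

# Unit test: the database is stubbed out, so only the code layer is proven.
def test_formats_email():
    store = mock.Mock(spec=UserStore)
    store.fetch_email.return_value = "  Jane@Example.COM "
    assert get_contact_line(store, 42) == "Contact: jane@example.com"

test_formats_email()  # passes -- but no real database was ever touched
```

The green test says nothing about NULLs, odd encodings, or whatever else the real `fetch_email` returns in production - that is exactly where a tester working with real-world data comes in.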

Need another example? Your job as a developer is to achieve something - to create something of value which fulfills a requirement (a user story). You can prove you are finished because you can demonstrate the functionality of your user story. You start at a broad point (there are any number of ways to accomplish your programming task) and drive narrower and narrower until you accomplish your work. Testing is just the opposite - it starts narrow (with your completed user story) and goes broader and broader, because there is an infinite number of test approaches. Testing can go on forever.

Another way to look at it is this: your work culminates in one completed user story. A tester's job can potentially never end, because there could be an infinite number of bugs in your code (it seems like it sometimes!). It is the old turn of phrase: "You can count how many seeds there are in an apple, but you can't count how many apples there are in a seed."

Testers are trained in this. They are used to it, and a good tester is comfortable in it. This mindset is the opposite of how you work - testers are different. If you embrace and welcome that difference, you will be more successful. Value that difference, and watch what can happen as you collaborate on architecture, on infrastructure, even on code design.