The tester x developer relationship

May 11, 2010 10:26

Lately I've been thinking over and over about the hate relationship between developers and testers in a software solutions company. Testers usually think they're the world's last hope and that without them every release is destined for self-destruction. Developers, on the other hand, tend to think that testers are, in simple words, a pain in the ass: always complaining about stuff that isn't working because they're just too stupid to understand how the thing really works. Since I work both as a tester and a developer, let's just say I get pissed off whenever either side says something bad about the other.

In my humble opinion, developers are obviously the essential part of software development. Testers usually don't have enough knowledge to take part in the coding process, whereas developers not only know how to program and fix errors, but can also test their own code. I'm not saying that testers are useless. In fact, as a tester, I'd say they are important for the simple fact that their work makes the developers' work flawless (or at least that's what should happen as a final result). They may not be so essential in terms of coding and stuff, but they have one important role in software development: they see what CLIENTS see, not what developers see. They're not, say, "addicted" to the code, so they can find errors that the programmer usually wouldn't see and point them out.

So, technically, programmers and testers should walk side by side, since their roles are directly connected, right? Why does this never happen? For the simple fact that most testers are just too arrogant about their own role in the software development process. I'm also a developer, so I do know how irritating it is to have someone on your heels ceaselessly pointing out mistakes in your awesome code without ever saying HOW to reproduce the error or HOW to fix it. Most testers don't even bother to take a screenshot of what's going on. Some of them saw the error once; sometimes it's not even a problem in your code, it's just the way their machines are configured, but they complain about it nevertheless and get furious when you say the problem is not yours.

On the other hand, playing the tester role, I've noticed that many of the defects submitted by developers are almost like an inside joke: they submit defects that frequently only they know how to test. Okay, I understand that defects like "Line 57 declares an int variable and as a result the method getSQLValuesTable() may be invalid, for it should be returning a double[]" are just not testable, and there is no way to reproduce that error unless you're using the software and suddenly get a beautiful "SQL ERROR" message. Still, how the hell will you test something like "display 654 is not correct" when you get no testable description?? Testers don't always know the whole scope of the project, so they don't know how to get to this error or what its correct behavior should be. And believe me, I see a LOT of these. Whenever I face defects like those, I do what everyone would do: send an e-mail to the developer asking "how the hell do I test this?". Yeah, it's the right decision to take, but the thing is: that shouldn't be what I do. It ruins the purpose of having a process. Sure, it's great for developing love/hate bonds amongst co-workers, but unless you want more details about a specific defect, I think it's just wrong to keep contacting the developers over and over to ask how the hell you test that.

I try to work as neatly as I can. I'm not saying I'm the voice of righteousness, but playing both the developer and tester roles, I can say that a couple of measures I take while working help a lot in keeping a healthy tester x developer relationship.
As a tester, I try to talk about the issues with developers and managers as much as possible. First of all, to make sure that it really is a defect, and not a recently required system behavior. Also, to make sure I don't file duplicate defects. I know how irritating it is to spend more than 15 minutes running through a giant list of defects and closing most of them with "this defect is a duplicate of #547270, which is currently being solved, therefore I'm closing this one".
I also try to attach the log of the defect, or detailed steps to reproduce it, or screenshots. Most of the time I ask the developers if there's something else they need, some debugging that is within my reach or anything of that sort. Whatever helps them (I just don't buy them coffee, LOL).
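All that "whatever helps them" boils down to a checklist. As a rough sketch (the field names and rules below are my own invention, not taken from any real bug tracker), a defect report could be checked for the information a developer will inevitably ask for before it's even submitted:

```python
from dataclasses import dataclass, field

# Hypothetical defect report structure; the fields are illustrative,
# not any real bug tracker's schema.
@dataclass
class DefectReport:
    title: str
    steps_to_reproduce: list = field(default_factory=list)
    expected_behavior: str = ""
    actual_behavior: str = ""
    log_excerpt: str = ""
    screenshot_path: str = ""

def missing_info(report):
    """Return the pieces of information a developer would still have to ask for."""
    gaps = []
    if not report.steps_to_reproduce:
        gaps.append("steps to reproduce")
    if not report.expected_behavior:
        gaps.append("expected behavior")
    if not report.actual_behavior:
        gaps.append("actual behavior")
    if not (report.log_excerpt or report.screenshot_path):
        gaps.append("log or screenshot")
    return gaps

# A vague report like "display 654 is not correct" fails every check:
vague = DefectReport(title="display 654 is not correct")
print(missing_info(vague))

# A reproducible report leaves nothing to ask about:
good = DefectReport(
    title="display 654 shows a stale total after refresh",
    steps_to_reproduce=["open report 654", "press refresh"],
    expected_behavior="total is recalculated",
    actual_behavior="total keeps the old value",
    log_excerpt="WARN cache hit for report 654",
)
print(missing_info(good))  # an empty list
```

Nothing fancy: the point is just that "steps, expected vs. actual, and some evidence" is a mechanical checklist, so there's no excuse for a defect that fails it.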

As a developer, I just try to be patient with testers. Knowing that my beautiful, clean, commented code has defects always feels like being stabbed in the heart, but I'd rather have my coding defects pointed out by someone who can explain how they happened than be scolded by a furious client who will probably only say that my software is crap and doesn't work. If a defect is not clear, I ask for detailed steps to reproduce the error. I ask for logs, screenshots, more information. In 85% of the conversations I've had with testers, they gladly provided all the information I needed.

What's really missing for a healthy tester x developer relationship is a little more patience. Testers: give the developers all the data they need to solve the issue. Give them TIME to fix it. TALK to them. Generally, developers are quite understanding. Developers: have some more patience with the testers. They see the software through the final user's eyes. Ask them for details of the problems if necessary. And describe your defects in a way that someone else can test them.

As for me, I'm doing my part of the work. :)

software development, work, software quality
