I will never forget the time a release went to production and a business user thought he had found a serious problem. He told everyone about the bug, which made the developers and testers look bad for missing something so serious. Long story short… the business user had made a mistake and there was no bug. But the damage was done, and of course no one went around telling everyone there was no serious bug after all. Since then I have been rather sensitive about using the word “bug”. I prefer to say that we have a problem that needs to be researched; it may or may not be a bug.
I originally wrote the article “Guidelines on a Meaningful Problem Report” for the November 2011 edition of Testing Circus. Click here to read the original publication. I have revised that article here to include the perspective “Is it really a bug?”
Providing the development team with correct information about a problem discovered during testing is an important part of the tester’s role. To understand the functionality’s expected behavior, the tester may talk to a business user or a business analyst. Through researching a problem we may discover that it is a bug, that it is intended functionality we misunderstood, or that it is really an enhancement request.
General Guidelines to Research a Problem
Below are guidelines for researching and writing a problem report. I value communication with the development team over simply sending them problem reports, but sometimes developers are busy and cannot stop coding to review a problem. In those cases, and when submitting to a bug tracking system, it is important to write a concise, accurate report.
Whether in a written problem report or in a verbal discussion with a developer, it is important to provide the correct steps to reproduce the problem. It is best to submit a report only once the problem has been verified against the expected behavior; false or incorrect problem reports can harm a tester’s reputation and relationship with developers. Make sure you can reproduce the problem. Reproducing it helps you determine whether:
• you can reproduce it upon demand;
• the failure is related to specific test paths;
• the failure is associated with specific data;
• certain environments cause the problem (e.g., a particular client or browser).
If you encounter a sporadic problem, work with the developer to determine what information will help him find the solution. I document as much information as I can capture. Depending upon the type of problem, I might note what other programs are running at the same time. Providing a failure percentage can be helpful, such as: I ran test “x” 10 times, it failed 60% of the time, and here is how it failed. Sporadic problems can be difficult to fix; therefore, it is important to provide as much information as you can gather. Sometimes I like to review the information I am gathering with a developer before I complete my tests, in case he can identify additional details that will be helpful.
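The failure-percentage idea above can be sketched as a small harness. This is only an illustration; the function name `failure_rate` and its arguments are my own, not something from a particular test framework:

```python
def failure_rate(test_fn, runs=10):
    """Run a test repeatedly and return the percentage of runs that failed.

    test_fn is any callable that raises AssertionError on failure,
    which is how most Python test assertions signal a problem.
    """
    failures = 0
    for _ in range(runs):
        try:
            test_fn()
        except AssertionError:
            failures += 1
    return 100.0 * failures / runs


def always_fails():
    assert False, "illustrative flaky step"

print(failure_rate(always_fails))  # prints 100.0
```

A report line such as “test x failed 6 of 10 runs (60%)” is far more useful to a developer than “test x sometimes fails.”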
When appropriate, have another tester reproduce the problem to help identify additional details. If the problem cannot be reproduced on a different machine, document any differences between the two computers.
Reproducing a problem can also reveal tester error, and it is best to catch that before submitting a report for a problem that does not exist.
Some problems will produce an error log with additional information that is valuable to the developer. Understand what additional technical information is valuable and be sure to capture that information.
I only provide screen shots if the problem is difficult to describe. In those situations, I prefer to discuss the problem with the developer first. If that is not possible, or more documentation is needed, I will consider adding screen shots. However, I crop them to show only the important details, to reduce the noise in the report. I use them as supplemental information; my written report should contain all the information without the reader needing to review the screen shots.
Document the Steps
Remember that someone else needs to reproduce this problem on his machine, and sometimes problems are fixed months after they are submitted. Therefore, take time documenting the steps, and review them by re-testing to ensure all critical steps are captured. Identifying the exact point at which the problem occurs is important; for example, whether the problem occurs when clicking the Create button or the Save button is valuable information to the developer.
Is it a Known Problem?
Before submitting a new problem, review the bug tracking system to ensure it has not already been logged. If it has, add any additional details that provide new insight. If a similar problem has been entered, work with the person who can help determine whether a new problem should be submitted. You do not want to add too much noise to a bug tracking system. I have seen the results of a bug tracking system where everyone adds new issues without considering what was previously entered: it becomes difficult to find information because there are duplicate bugs, and issues are entered that are not bugs. I always believe it is a good idea to check your bug tracking system before you get too far into writing a problem report.
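A duplicate check often amounts to a keyword search over existing summaries. As a minimal sketch, assuming reports are available as a list of dicts with a "summary" field (the data shape and function name here are illustrative, not any particular tracker’s API):

```python
def find_similar_reports(reports, keywords):
    """Return logged reports whose summary mentions any of the keywords."""
    terms = [k.lower() for k in keywords]
    return [r for r in reports
            if any(t in r["summary"].lower() for t in terms)]


logged = [
    {"summary": "Save button crashes on empty customer form"},
    {"summary": "Typo on login page"},
]
print(find_similar_reports(logged, ["save", "crash"]))
```

Searching a few likely keywords (feature name, button label, error text) before writing a new report is usually enough to surface duplicates.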
Discussions with the Developer
If there is an opportunity, review the problem with the developer to gather more information to add to the report. For example, a developer may ask for additional tests to be performed and those results can be part of the report. This conversation may help the tester understand if the problem has been logged or if there is a difference in opinion on how the functionality should perform. Any differences should be resolved with the business user before submitting the problem.
Discussions with the Business User
If the tester is unsure whether it is a problem, or does not understand the intended behavior, review those questions with the business user. I prefer that the tester first review the questions with another tester who may understand the business functionality. That way we tap into the business community to seek out new information for the department, rather than asking for repeated training on the same areas. I use the same process with the development team. Before submitting the report, make sure it is a problem and that the correct expected behavior is defined.
How Much Information to Document
How much information should you document in the test report? The details can depend upon:
• the company’s standards;
• how difficult it is to reproduce;
• the complexity of the problem;
• how much information the development and testing team requires.
Writing the Report
When writing the report, I prefer to keep the sentences concise, providing critical information at the beginning of the report. Additional information that may or may not be helpful to the developer should be placed at the end of the document under a descriptive heading. This ensures the developer does not have to read through the whole document to find the important details. Re-read your report to ensure that no steps are missing, that sentences are clear, that unnecessary information and words are excluded, and that the report follows your company’s standards.
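To make that ordering concrete, a report skeleton might look like the following. The headings and fields are my own illustration, not a prescribed standard; your company’s template takes precedence:

```
Summary:            Save button returns an error on the new customer form
Steps to Reproduce: 1. ...
                    2. ...
Expected Result:    ...
Actual Result:      ...
Environment:        client/browser, build, test data used

--- Supplemental Information ---
Error log excerpt, cropped screen shot, results of additional tests
```

The critical facts sit at the top; everything a developer may or may not need is grouped under the supplemental heading at the end.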
Submitting the Report
The tester will submit his report through the bug tracking system adopted by the company. Sometimes a developer may question a tester’s report, especially when the developer is closely attached to the feature. It is best to handle these conversations with the facts: reproduce the problem with the developer if that is helpful, and do not get into an emotional discussion. If an agreement is not reached, be sure to involve the business user who is responsible for the functionality for the final decision. It is best to minimize these situations by gathering sufficient information up front and having conversations with the developer and business user (or Test Lead, Test Manager) prior to submitting the problem. Submitting a false or incorrect problem report can hurt the tester’s reputation and the relationship between developers and testers. Refer to my posting on Crucial Conversations for a communication model that can help bridge conversation gaps.