Common Software Testing Pitfalls


by – Wing Lam; Source: http://ebiz.u-aizu.ac.jp/~paikic/lecture/2007-2/adv-internet/papers/TestE-Commerce.pdf

Poor estimation. Developers underestimate the effort and resources required for testing. Consequently, they miss deadlines or deliver a partially tested system to the client.

Untestable requirements. Describing requirements ambiguously renders them impossible or difficult to test.

Insufficient test coverage. An insufficient number of test cases cannot test the full functionality of the system.

Inadequate test data. The test data fails to cover the full range of possible values; in particular, it omits boundary values (see the boundary-value sketch at the end of this list).

False assumptions. Developers sometimes make claims about the system based on assumptions about the underlying hardware or software. Watch out for statements that begin “the system should” rather than “the system [actually] does.”

Testing too late. Testing too late during the development process leaves little time to manoeuvre when tests find major defects.

“Stress-easy” testing. When testing does not place the system under sufficiently high levels of stress, it fails to investigate system breakpoints, which therefore remain unknown.

Environmental mismatch. Testing the system in an environment that differs from the one where it will be installed tells you little about how it will behave in the real world. Such mismatched tests make little or no attempt to replicate the mix of peripherals, hardware, and applications present in the installation environment.

Ignoring exceptions. Developers sometimes erroneously slant testing toward normal or regular cases. Such testing often ignores system exceptions, leading to a system that works most of the time, but not all the time.

Configuration mismanagement. In some cases, a software release contains components that have not been tested at all or were not tested with the released versions of the other components. Developers can’t ensure that the component will work as intended.

Testing overkill. Over-testing relatively risk-free areas of the system diverts precious resources from the system’s more high-risk (and sometimes difficult to test) areas.

No contingency planning. There is no contingency in the test plan to deal with significant defects discovered during testing.

Non-independent testing. When the development team carries out testing, it can lack the objectivity of an independent testing team.
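
To make the “inadequate test data” pitfall concrete, here is a minimal sketch in Python with pytest. The field, its 10-to-100 range, and the validate_quantity function are hypothetical stand-ins; the point is simply that test data should sit on, just inside, and just outside each boundary.

import pytest

def validate_quantity(value):
    # Hypothetical rule: the field accepts whole numbers from 10 to 100 inclusive.
    return isinstance(value, int) and 10 <= value <= 100

# Boundary values: just below, on, and just above each limit, plus a typical value.
@pytest.mark.parametrize("value, expected", [
    (9, False),   # just below the lower limit
    (10, True),   # the lower limit itself
    (11, True),   # just above the lower limit
    (55, True),   # a typical mid-range value
    (100, True),  # the upper limit itself
    (101, False), # just above the upper limit
])
def test_quantity_boundaries(value, expected):
    assert validate_quantity(value) == expected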


Requirement Specification Document Review Guidelines and Checklists


To prepare effective test cases, testers and QA engineers should review the software specification documents carefully and raise as many queries as they can.
The purpose of a Software Requirement Specification review is to uncover problems hidden within the specification document. This is a part of defect prevention. Left undetected, these problems lead to incorrect implementation of the software. The following guidelines for a detailed specification review are therefore suggested:
1. Always review the specification document with the entire testing team. Discuss each point with team members.
2. While reviewing the specification document, look carefully for vague/fuzzy terms like “ordinarily, most, mostly, some, sometimes, often, and usually” and ask for clarification.
3. It often happens that list values are given but not completed. Look for terms like “etc., and so forth, and so on, such as,” and be sure all the items/list values are understood.
4. When you are doing the spec review, make sure stated ranges don’t contain unstated/implicit assumptions. For example: “The range of the Number field is from 10 to 100.” But is the field decimal or integer only? Ask for clarification.
5. Also take care with vague/fuzzy terms like “skipped, eliminated, handled, rejected, processed”. These terms can be interpreted in many ways.
6. Watch out for unclear pronouns, for example: “The ABC module communicates with the XYZ module and its value is changed to 1.” Whose value is changed, the ABC module’s or the XYZ module’s?
7. Whenever a scenario/condition is defined in a paragraph, draw a picture of it in order to understand it and work out the expected result. If the paragraph is too long, break it into multiple steps; it will be easier to understand.
8. If a scenario described in the specification involves calculations, work through the calculations with at least two examples (see the short sketch after this list).
9. If any point of the specification is not clear, get your queries resolved by the Business Analyst or Product Manager as soon as possible.
10. If a described scenario is complex, try to break it into separate points.
11. If there are open issues (under discussion, sometimes to be resolved by the client) in the specification, keep track of them.
12. Always go through the revision history carefully.
13. After the specification is signed off and finalized, if any change comes in, identify the impacted areas.
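
As an illustration of point 8, suppose the specification said: “Orders of $50 or more ship free; smaller orders pay 10% of the order value for shipping.” The rule and the calculate_shipping function below are invented for this example; working the arithmetic with at least two concrete values (including the boundary) quickly exposes any misunderstanding.

def calculate_shipping(order_total):
    # Hypothetical rule taken from the example wording above.
    return 0.0 if order_total >= 50 else round(order_total * 0.10, 2)

# Two worked examples, one on each side of the $50 boundary.
assert calculate_shipping(40.00) == 4.00   # below the threshold: 10% shipping charge
assert calculate_shipping(50.00) == 0.00   # at the threshold: free shipping
print("Both worked examples match the specification as understood")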


Art of Test Case Writing


Objective and Importance of a Test Case
– The basic objective of writing test cases is to ensure complete test coverage of the application.

  • The most extensive effort in preparing to test software is writing test cases.
  • Gives better reliability in estimating the test effort
  • Improves productivity during test execution by reducing the “understanding” time during execution
  • Writing effective test cases is a skill that can be achieved through experience and in-depth study of the application for which the test cases are being written.
  • Documenting the test cases prior to test execution ensures that the tester does the ‘homework’ and is prepared for the ‘attack’ on the Application Under Test
  • Breaking down the Test Requirements into Test Scenarios and Test Cases helps testers avoid missing certain test conditions

What is a Test Case?

  • It is the smallest unit of Testing
  • A test case is a detailed procedure that fully tests a feature or an aspect of a feature. Whereas the test plan describes what to test, a test case describes how to perform a particular test.
  • A test case has components that describe an input, an action or event, and an expected response, to determine whether a feature of an application is working correctly.
  • Test cases must be written by a team member who thoroughly understands the function being tested.

Elements of a Test Case
Every test case must have the following details:

Anatomy of a Test Case
Test Case ID:
Requirement # / Section:
Objective: [What is to be verified?]
Assumptions & Prerequisites:
Steps to be executed:
Test data (if any): [Variables and their values]
Expected result:
Status: [Pass or Fail, with details of the Defect ID and proofs (output files, screenshots) if failed]
Comments:
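
A filled-in example may help; the login screen, field names, and values below are purely illustrative:

Test Case ID: TC_Login_001
Requirement # / Section: REQ-4.2 / User Authentication
Objective: Verify that a registered user can log in with valid credentials.
Assumptions & Prerequisites: The user account “testuser1” exists and is active.
Steps to be executed: 1. Navigate to the Login page. 2. Enter the username and password. 3. Click on the Login button.
Test data (if any): username = testuser1, password = Valid@123
Expected result: The application displays the Account Summary page.
Status: Pass
Comments: None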

Any CMMi company would have defined templates and standards to be adhered to while writing test cases.

Language to be used in Test Cases:
1. Use Simple and Easy-to-Understand language.

2. Use active voice while writing test cases. For example:
– Click on OK button
– Enter the data in screen1
– Choose the option1
– Navigate to the account Summary page.
 

3. Use words like “Verify”/“Validate” to start sentences in the test case description (especially for checking the GUI). For example:
– Validate the fields available in the _________ screen/tab.


4. Use words like “is/are” and use the present tense for expected results. For example:
– The application displays the account information screen
– An error message is displayed on entering special characters


Why Start Testing Early?


Introduction:
You have probably heard and read in blogs that “testing should start early in the development life cycle”. In this chapter, we will discuss very practically why to start testing early.

Fact One
Let’s start with the regular software development life cycle:

When the project is planned:
  • First we’ve got a planning phase: needs are expressed, people are contacted, meetings are booked. Then the decision is made: we are going to do this project.
  • After that analysis will be done, followed by code build.
  • Now it’s your turn: you can start testing.

Do you think this is what is going to happen? Dream on.

This is what’s going to happen when the project actually executes:

  • Planning, analysis and code build will take more time than planned.
  • That would not be a problem if the total project time were prolonged accordingly. Forget it; most likely you will have to deal with the fact that you have only a few days left to perform the tests.
  • The deadline is not going to be moved at all: promises have been made to customers, and project managers are going to lose their bonuses if they deliver past the deadline.

Fact Two
The earlier you find a bug, the cheaper it is to fix it.

Price of Buggy Code

If you are able to find a bug during requirements determination, it is going to be 50 times cheaper (!!) than when you find the same bug in testing.
It will even be 100 times cheaper (!!) than when you find the bug after going live.

This is easy to understand: if you find the bug in the requirements definitions, all you have to do is change the text of the requirements. If you find the same bug in final testing, analysis and code build have already taken place, and much more effort has gone into building something that nobody wanted. For illustration: a defect that costs $100 to correct at the requirements stage would cost roughly $5,000 to correct in testing and $10,000 after go-live.

Conclusion: start testing early!
This is what you should do:

Testing should be planned for each phase
  • Make testing part of each Phase in the software life cycle
  • Start test planning the moment the project starts
  • Start finding bugs the moment the requirements are defined
  • Keep on doing that during the analysis and design phases
  • Make sure testing becomes part of the development process
  • And make sure all test preparation is done before you start final testing. If you only start preparing then, your testing is going to be crap!

Want to know how to do this?
Go to the Functional testing step by step page. (will be added later)


Golden Rules for Software Testing


OK folks, the original article was written by Ray Claridge here – (http://www.testertroubles.com/2009/05/golden-rules-for-software-testing.html). In this article we are adding a few more tips.
Introduction

Read these simple golden rules for software testing. They are based on years of practical testing experience and solid theory.

It’s all about finding bugs as early as possible:
Start the software testing process as soon as you get the requirement specification document. Review the specification document carefully and get your queries resolved. This way you can find bugs in the requirement document itself (otherwise the development team might develop the software with the wrong functionality), and it often happens that the requirement document gets changed when the test team raises queries.

After requirement doc review, prepare scenarios and test cases.

Make sure you have at least these 3 software testing levels:
1. Integration testing or Unit testing (performed by dev team or separate white box testing team)
2. System testing (performed by professional testers)
3. Acceptance testing (performed by end users, sometimes Business Analyst and test leads assist end users)

Don’t try to become popular by completing tasks early while losing quality. Some testers do this and get appreciation from managers in early project cycles. But you should stick to quality and perform quality testing. If you have really tested the application fully, then you can go with the numbers (count of bugs reported, test cases prepared, etc.). Definitely your project partners will appreciate the great job you’re doing!

Regression Testing is a MUST:
Once the development team is done with the bug fixes and gives a release to the testing team, the testing team should perform regression testing in addition to verifying the bug fixes. In early test cycles, regression testing of the entire application is required. In late test cycles, when the application is near UAT, discuss the impact of the bug fixes with the dev team and test the affected functionality accordingly.
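
A lightweight way to keep such a regression suite repeatable is to tag the relevant tests and run only that tag on every release. The sketch below assumes pytest; the marker name, the login stub, and the credentials are illustrative, not something prescribed by the original article.

import pytest

def login(username, password):
    # Stand-in for the real application call; replace with your actual login entry point.
    return username == "testuser1" and password == "Valid@123"

@pytest.mark.regression
def test_login_still_works_after_bug_fix():
    # Re-run the scenario that failed before the fix, plus the core flow around it.
    assert login("testuser1", "Valid@123") is True

# Run only the regression suite with:  pytest -m regression
# (Register the "regression" marker in pytest.ini to avoid an unknown-marker warning.)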

Don’t expect too much of automated testing
Automated testing can be extremely useful and can be a real time saver. But it can also turn out to be a very expensive solution that delivers little value. Always consider the ROI.

Test With Real Data:
Apart from invalid data entry, testers must test the application with realistic data; for this, help can be taken from the Business Analyst and the client.
You can take help from sites like http://www.fakenamegenerator.com/. But when it comes to the finance domain, request sample data from the client, because there can be data like $10.87 million, etc.
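
If you would rather script the realistic-looking data than copy it from a website, one option (our suggestion, not mentioned above) is the Python Faker library; for domain-specific values such as large financial amounts, still request real samples from the client.

from faker import Faker   # pip install Faker

fake = Faker()

# Generate a small batch of realistic-looking test records.
for _ in range(5):
    print(fake.name(), "|", fake.address().replace("\n", ", "), "|", fake.email())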

Keep track of change requests:
Sometimes, in later test cycles, everyone on the project becomes so busy that nobody gets time to document the change requests. In this situation, I suggest that testers (test leads) keep track of change requests (which often arrive through email communication) in a separate Excel document.
Also give each change request a priority status:

  • Show stopper (must have, no work around)
  • Major (must have, work around possible)
  • Minor (not business critical, but wanted)
  • Nice to have

Actively use the above statuses for reporting and follow-up!

Note – CMMi or other process-oriented companies already have change request management (configuration management) systems in place.
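
If you do end up keeping your own log, even a plain CSV next to the test plan is enough; the columns and entries below are only an illustration of the priority statuses listed above.

import csv

# Illustrative change-request log; the entries are invented.
change_requests = [
    {"id": "CR-001", "description": "Block login after 3 failed attempts", "priority": "Show stopper", "status": "Open"},
    {"id": "CR-002", "description": "Add export-to-PDF on the reports page", "priority": "Nice to have", "status": "Deferred"},
]

with open("change_requests.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["id", "description", "priority", "status"])
    writer.writeheader()
    writer.writerows(change_requests)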

Don’t be the Quality Police:
Let the Business Analyst and technical managers decide which bugs need to be fixed. Testers can certainly give them input on why a fix is required.

‘Impact’ and ‘Chance’ are the keys to deciding on risk and priority
Keep a helicopter view of your project. For each part of your application, you have to define the ‘impact’ and the ‘chance’ of anything going wrong.

  • ‘Impact’ is what happens if a certain situation occurs – what’s the impact of an airplane crashing?
  • ‘Chance’ is the likelihood that something happens – what’s the chance of an airplane crashing?
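
One simple way to turn ‘impact’ and ‘chance’ into test priorities is to score each area on both (say 1 to 5) and rank by the product; the areas and scores below are invented purely for illustration.

# Score each area 1 (low) to 5 (high) for impact and chance, then rank by impact * chance.
areas = [
    {"area": "Payment processing", "impact": 5, "chance": 3},
    {"area": "Report formatting",  "impact": 2, "chance": 4},
    {"area": "User preferences",   "impact": 1, "chance": 2},
]

for a in areas:
    a["risk"] = a["impact"] * a["chance"]

for a in sorted(areas, key=lambda x: x["risk"], reverse=True):
    print(f"{a['area']}: risk score {a['risk']}")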

Delivery to the client:
Once the final testing cycle is completed, or when the application is going for UAT, the test lead should report how many bugs still persist (with their priorities) and let the Technical manager (dev team), Product manager and Business Analyst decide whether the application should be delivered or not. Testers can certainly give them input on the OPEN bugs.

Focus on the software testing process, not on the tools
Test management and other testing tools make our tasks easier, but these tools cannot perform the testing for us. So instead of focusing on tools, focus on core software testing. You can be very successful using basic tools like MS Excel.