To achieve a goal, it is important to estimate the effort and resources required for the tasks involved in reaching it. When the estimated figures end up close to the actual ones, teams are far more likely to deliver the intended results on the defined schedule.

Organizations today spend more time on effort estimation than ever before, because accurate estimates are a key to success. Like any other activity in the software development process, effort estimation plays an important part in delivering quality products. Test estimations provide approximate timelines for how long testing tasks should take to complete.

Many test estimation techniques have been developed. Some of the most widely used include:

  • Work breakdown structure (WBS)
  • Three-point estimate
  • Function/Test point analysis method
  • Use case points technique
  • Wideband Delphi
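
As an illustration of one technique on this list, the three-point estimate combines optimistic (O), most likely (M), and pessimistic (P) figures into a single expected value, commonly with the PERT weighting E = (O + 4M + P) / 6. A minimal sketch (the function name and sample figures are ours, not from the article):

```python
def three_point_estimate(optimistic, most_likely, pessimistic):
    """PERT-weighted expected effort: E = (O + 4M + P) / 6."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

# Example: a test task estimated at 2h best case, 4h likely, 12h worst case.
effort = three_point_estimate(2, 4, 12)
print(effort)  # 5.0
```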

All the techniques mentioned above provide good estimates. However, each technique is missing some important factors/aspects that should also be considered while estimating efforts for testing. What’s needed is a strategy/technique by which efforts for both manual and automation testing can be calculated. The best way to get closer to actual efforts is to identify the testing scenarios and assign them a complexity—high, medium or low. In general, the following key parameters should be considered when defining testing complexity:

  • Number of operating systems/browsers supported by the application
  • Number of environments on which application needs to be tested
  • Type of testing like functional, performance, security, etc.
  • Testing methodology like manual or automated
  • Type of validation like UI/database, etc.
  • Whether APIs used by the application need to be tested
  • Third party integration like payment channels/authentications
  • Type of application like web/mobile/native
  • Domain expertise of testing team
  • Test lab setup by the testing team before testing (if required)
  • Data setup (if required)
  • Number of steps/validation in the test scenario
  • Availability of supporting documents like the SRS, use cases, user stories, etc.
  • Availability of application while writing test scenarios
  • Current state of the application (stable/unstable)
  • Test review and rework
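
One way to operationalize the parameter list above is as a simple scoring rubric: each factor contributes points, and the total maps to a high/medium/low bucket. The sketch below is illustrative only; the weights, factor names, and thresholds are our assumptions, not from the article:

```python
# Illustrative rubric: each factor contributes points toward a score;
# thresholds map the score to a complexity bucket. All weights are assumptions.
FACTOR_WEIGHTS = {
    "browsers_supported": 1,       # per additional browser/OS beyond the first
    "environments": 1,             # per additional test environment
    "api_testing": 3,              # APIs need to be tested
    "third_party_integration": 3,  # payment channels, authentication, etc.
    "data_setup": 2,               # test data preparation required
    "unstable_application": 2,     # current state of the application
}

def scenario_complexity(factors):
    """Map a scenario's factor dict to 'high', 'medium', or 'low'."""
    score = sum(FACTOR_WEIGHTS[name] * value for name, value in factors.items())
    if score >= 8:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

# A scenario covering 3 extra browsers, API checks, and data setup:
print(scenario_complexity({"browsers_supported": 3, "api_testing": 1, "data_setup": 1}))  # high
```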

For manual testing, the following historical data should also be collected and factored into the estimate:

  • Defect logging time
  • Percentage of defects found
  • Number of releases/iterations to test
  • Re-validation of defects
  • Percentage changes in application across the builds

For automation testing efforts, the following additional parameters should also be considered:

  • Continuous integration (CI) requirement with the automation framework
  • Use of automated tool for test management
  • Distributed test execution

Once the complexity of each requirement/test scenario is set, the number of high-, medium-, and low-complexity scenarios is counted. Based on historical data, past experience, and the parameters mentioned above, the time needed to test a scenario of each complexity is derived. The following table can then be used to calculate the total effort required for the testing activity.
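
The calculation that table drives can be sketched as: multiply the count of scenarios at each complexity level by the derived per-scenario time, then sum. The function name and sample figures below are illustrative assumptions, not numbers from the article:

```python
def total_test_effort(scenario_counts, hours_per_scenario):
    """Total effort = sum over complexity levels of (count x derived hours)."""
    return sum(scenario_counts[level] * hours_per_scenario[level]
               for level in scenario_counts)

# Illustrative figures: scenario counts per complexity and the
# per-scenario testing time derived from historical data.
counts = {"high": 10, "medium": 25, "low": 40}
hours = {"high": 4.0, "medium": 2.0, "low": 0.5}
print(total_test_effort(counts, hours))  # 110.0 person-hours
```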

In addition to the above parameters, application knowledge transfer, framework customization/creation, and contingency efforts should also be considered when estimating test efforts.

Over the years Pyramid Consulting has developed its own estimation technique that not only provides testing estimates that are very close to actual figures but also gives an idea about the ROI that an organization can get by leveraging automated tools for testing.

Vikas Shukla

About the author

Vikas Shukla

An agile and customer focused professional, Vikas has been Pyramid Consulting's Director of Quality Assurance since 2009. Vikas manages a team of 90+ QA professionals and leads our QA Center of Excellence.
