Automated Regression Test Script Sprawl Can Cause Maintenance Migraines

Dan Cagen

When test automation goes too far, engineering teams suffer, spending too much time maintaining old tests

Automated regression testing is required if you’re looking to incorporate Agile and DevOps practices into your development lifecycle. But if you create too many test scripts or have extremely complex test code, you might be causing yourself significant long-term maintenance challenges.

If you write test cases too loosely, they pass too easily and miss real defects. But if you write them too rigidly, they'll need to be updated and rewritten frequently as the codebase changes, which consumes engineering time and talent that is usually in short supply. For example, an automated UI test that asserts the exact text of a success message will break every time a product manager decides to reword that message.
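The trade-off can be sketched in a few lines. This is a hypothetical illustration, not any particular framework's API: the banner text and both check functions are invented stand-ins for what a real UI test (Selenium, Playwright, etc.) would do.

```python
def get_banner_text():
    """Stand-in for reading the success banner from a rendered page."""
    return "Profile saved successfully!"  # the copy after a PM reworded it

def rigid_check():
    # Asserts the exact original wording; breaks the moment the copy changes.
    return get_banner_text() == "Your profile was saved."

def resilient_check():
    # Keys on a stable signal (the word "saved") rather than exact copy.
    return "saved" in get_banner_text().lower()

print(rigid_check())      # False: the wording changed, so the test now fails
print(resilient_check())  # True: still passes after the wording tweak
```

The resilient version survives copy edits, but note it is also the "looser" test the paragraph warns about: push this too far and it will pass even when something is genuinely broken.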

Many teams underestimate the full costs of changing and testing peripheral aspects of a development project such as error messages, log output and user interface text. These costs include time, money and resources. Instead of focusing on new builds and products, the engineers are forced to constantly look back at older regression test scripts and ensure they are still accurate. Some organizations will make multiple full-time hires to handle these tasks. The costs can add up and prevent organizations from increasing development velocity.


Ultimately, it’s about prioritizing what should or needs to be automated. There’s no magic number of “just enough” or “too many” test scripts. Rather, your team needs to take all of these variables into consideration:

  • How fast do you need to go?
  • At what point does your ROI in automated testing and maintaining automated tests start to flatten out?
  • Where does it start to become more costly and time-intensive to build more automation scripts?
  • How long does it take to run the automation? Is that quickly enough to incorporate changes and fixes into the development process without slowing you down?

The answers are often based on your specific assets and needs — every organization automates a different number of test scripts.

Additional cost considerations should include:

  • Development platform and frameworks: Does additional testing mean additional design, maintenance, updates and enhancements?
  • Infrastructure, including integrations with CI/CD, deployment, hardware, networks and devices: Do you need to consider purchasing and maintaining devices to run automation on or to test against?
  • Development team: Do you have the staff and expertise to handle a growing codebase of tests while also moving your product codebase forward?

Many teams focus on the hardware and software sides of this equation, but that last point is just as important — maybe more important — to consider. Small and midsize companies need to also manage staff turnover. In non-enterprise companies, there’s often only one regression test automation expert. And every time a new expert is hired, they’ll spend time getting up to speed with the existing test regimen — or writing their own from scratch.

Automation Doesn’t Always Save Time

Once automated tests are in place, it doesn’t mean that the work is over and you can simply let the tests run and do the job. On the contrary: running the tests is just the beginning.

There are failed tests, which must be analyzed one by one. Some tests fail because of real bugs in the software (which is why we write tests, right?). However, many failures are not “real” — they can be due to poorly designed tests.

Analysis of the failures takes time, especially when you have lots of tests. If you have 1,000 test cases with a failure rate of 5%, that's 50 failing tests to review. If you take 10 minutes — a very optimistic assumption — to debug and verify each test, that would be 500 minutes. You're talking over eight hours! This doesn't even take into consideration the time to rewrite the poorly written tests.
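The arithmetic above is worth making explicit, since it scales linearly with suite size and failure rate. A quick back-of-the-envelope calculation using the article's figures:

```python
# Triage cost estimate from the figures in the text.
total_tests = 1_000
failure_rate = 0.05          # 5% of tests fail on a given run
minutes_per_failure = 10     # optimistic debug/verify time per failure

failing = round(total_tests * failure_rate)      # 50 failing tests
triage_minutes = failing * minutes_per_failure   # 500 minutes
triage_hours = triage_minutes / 60               # just over 8 hours

print(failing, triage_minutes, round(triage_hours, 1))  # 50 500 8.3
```

That is a full working day of triage per run, before any time spent actually rewriting the poorly designed tests.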


An alternative is to simply disable some tests. The pressure from management to release software, combined with a lack of resources to analyze and rewrite the tests, sometimes leads teams to take the easiest path: disable the "unimportant" tests so they're not a distraction. This is a recipe for disaster. You think you have tests in place when, in fact, you have nothing protecting against unexpected failures. You've already invested the engineering team's time to write these tests, and now you're throwing them away.
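The false confidence is easy to see in miniature. The sketch below is entirely hypothetical: the `skip` decorator mimics the behavior of real test-framework skip markers (such as pytest's), and the discount bug and its test are invented for illustration.

```python
import functools

def skip(reason):
    """Marks a test as disabled: it reports a skip without ever running."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper():
            return "SKIPPED: " + reason   # the assertions below never execute
        return wrapper
    return decorator

def apply_discount(price, pct):
    # Bug: returns the discount amount instead of the discounted price.
    return price * pct / 100

@skip("flaky, revisit after release")
def test_discount():
    assert apply_discount(100, 10) == 90  # would catch the bug if it ran

print(test_discount())  # prints "SKIPPED: flaky, revisit after release"
```

The suite reports green, the "revisit after release" note is forgotten, and the regression ships. A disabled test costs the same to write as a live one but protects nothing.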

Make Sure You Have a Balance

With modern development practices, automated regression testing is a must to deliver quality software quickly. But it's important to balance how much automated testing is needed in order to optimize resources and avoid unnecessary maintenance efforts. Automated tests should be written in a way that ensures effective and complete testing of the software without excessively burdening the team.

Once you start spending too much time maintaining and revising test scripts for regression testing, it might be time to ask yourself if you’re really getting the ROI you need from test automation — and whether incorporating some manual testing to complement test automation is a better way to spend your resources.
