How to Involve QA in Design, Test Automation

Dan Cagen

Here are two ways to leverage QA testing teams’ expertise

As organizations shift testing to the left, they are starting to leverage QA in new ways. Two of those ways are getting QA more involved in product design and helping to build a test automation strategy.

These tasks are traditionally under the purview of product and engineering teams, respectively, but there is obvious value to including QA teams in these processes, given QA’s testing acumen and familiarity with how the product should work in the real world.

In this blog post, we’ll look at how organizations can incorporate QA into design and test automation to fully leverage QA’s skills.

QA Is Proactive in Reviewing Design

Particularly in a DevOps organization, QA testers work closely with the product team (or whoever is creating the customer requirements). This gives the QA tester an opportunity to add business value by reviewing design elements upfront.

Think about it: the QA tester should know how the application works from end to end, and how it works in concert with related applications. They can review the app’s design upfront and identify any shortcomings. By finding defects or missing requirements before the application is coded, the business saves both time and expense.

In a DevOps methodology, the QA team must expand its role beyond testing only Dev changes. Now, the QA team should proactively find and verify defects.

Develop a system where the whole DevOps team defines the deployment process. As you build the process, consider these questions:

  • What are we verifying during the deployment?
  • Are we manually verifying production functionality?
  • Is the team verifying deployed code in production using integrated automated test scripts?

For example, many applications include integrated code-based checks that send a notification when an error occurs. Most of those errors involve API connection failures, backend data-processing failures or database disconnects. The team also needs to figure out a way to run a smoke-level test of the main customer workflows in production while limiting risk and not creating junk production data.
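One way to structure those production checks is a small runner that executes each verification and collects failures instead of stopping at the first one. This is a minimal sketch: the check names are hypothetical stand-ins for your application’s real health checks (an API ping, a database connection test, a read-only walk through a main customer workflow that creates no junk data).

```python
# Minimal sketch of a production smoke-check runner. The checks below are
# hypothetical stubs; substitute your application's real verifications.

def run_smoke_checks(checks):
    """Run each named check, collecting failures instead of stopping early."""
    failures = {}
    for name, check in checks.items():
        try:
            check()
        except Exception as exc:  # a real runner would log and alert here
            failures[name] = str(exc)
    return failures

def api_reachable():
    pass  # e.g. assert that GET /health returns 200

def db_connected():
    pass  # e.g. open and close a pooled database connection

def checkout_workflow_readonly():
    # Read-only workflow check; failure simulated for illustration.
    raise RuntimeError("price lookup timed out")

failures = run_smoke_checks({
    "api": api_reachable,
    "database": db_connected,
    "checkout (read-only)": checkout_workflow_readonly,
})
print(failures)  # only true failures surface; passing checks stay silent
```

Keeping the checks read-only where possible is what limits risk and avoids polluting production data.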

Building out this process is easier said than done. Take the time as a team to plan, discuss and figure out what works best for your specific application.

QA Builds the Test Automation Strategy

The QA tester needs to analyze the application design and create the overall test strategy. Granted, QA testers at some organizations won’t actually code the automated tests themselves, but they should define what each test case covers: which requirements it tests and the expected result that proves they work.
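In practice, that hand-off from QA to the automation engineer can be as simple as a structured test-case spec. The sketch below shows one possible shape; the field names and IDs are illustrative, not a standard schema.

```python
# Hypothetical test-case spec a QA tester might hand to the person coding
# the automated test: the requirement covered, the steps, and the expected
# result that proves it works. Field names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class TestCaseSpec:
    case_id: str
    requirement: str
    steps: list = field(default_factory=list)
    expected_result: str = ""

spec = TestCaseSpec(
    case_id="TC-101",
    requirement="REQ-42: a registered user can reset their password",
    steps=["request reset link", "open link", "submit new password"],
    expected_result="login succeeds with the new password",
)
print(spec.case_id, "-", spec.expected_result)
```

The point is traceability: every automated test maps back to a requirement and a verifiable expected result.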

Reviewing the design for test conditions is essential to releasing a quality application, and the QA tester is the one to do it. Developers know the function they are coding, but often don’t understand how the whole system functions as a customer workflow.


To be successful, the QA tester and the person coding the automated tests should collaborate to determine what needs to be tested and the priority order. Initially, focus on automating only critical items, such as ensuring the backend connections and processes are functioning. Then, build your suite to also cover the highest-priority functions for the main customer workflows.
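That build-out order can be made explicit by tagging each automated test with a priority tier and running only the tiers you have stabilized so far. This sketch uses a hand-rolled registry purely for illustration; a real suite would use its framework’s markers (e.g. pytest marks), and the test names are made up.

```python
# Sketch of priority-tiered automation: register tests with a priority tag,
# then run only the tiers built out so far. Tiers and tests are hypothetical.

REGISTRY = []

def automated(priority):
    def wrap(fn):
        REGISTRY.append((priority, fn))
        return fn
    return wrap

@automated("critical")   # backend connections and processes come first
def test_backend_connections():
    return "ok"

@automated("high")       # then the main customer workflows
def test_checkout_workflow():
    return "ok"

@automated("low")        # deferred until the critical suite is stable
def test_profile_avatar_upload():
    return "ok"

def run(tiers):
    """Return the names of tests that would run for the selected tiers."""
    return [fn.__name__ for priority, fn in REGISTRY if priority in tiers]

print(run({"critical"}))
print(run({"critical", "high"}))
```

Starting with only the critical tier keeps early automation effort focused where a failure hurts most.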

QA teams can also get more involved with setting priorities for risk-based testing, which ranks what to test based on the risk of failure. For areas where a failure would impact the business, input should come from product owners; for the more complex areas of the application that are likely to contain more coding issues, input should come from development. QA can then apply formal methods to manage risk-based testing and report it as a metric.
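One common formalization, sketched here under stated assumptions, scores each area as failure likelihood times business impact on a simple 1–5 scale (likelihood informed by development, impact by product owners, as above). The areas and ratings are invented for illustration.

```python
# Hypothetical risk-based prioritization: score = likelihood x impact,
# each rated 1-5. Areas and ratings below are illustrative only.

areas = {
    "payment processing": {"likelihood": 3, "impact": 5},
    "search autocomplete": {"likelihood": 4, "impact": 2},
    "admin report export": {"likelihood": 2, "impact": 2},
}

def risk_score(area):
    return area["likelihood"] * area["impact"]

ranked = sorted(areas, key=lambda name: risk_score(areas[name]), reverse=True)
print(ranked)  # highest-risk areas get testing attention first
```

The ranked list doubles as the reportable metric: testing effort and coverage can be tracked against it release over release.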

If your application changes significantly with every release, your automation strategy should include a way to maintain the test scripts so they remain valid and executable. The only errors you want to see are true defects, not automated script issues. A good rule is that your automated test script should be as good as or better than the code it tests.
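One widely used way to keep UI scripts maintainable is the page-object pattern: selectors live in one place, so a UI change touches one class rather than every test. The selectors and the fake driver below are illustrative; a real suite would drive Selenium or Playwright.

```python
# Page-object sketch: if the login form changes, only the locators in
# LoginPage need updating. Selectors and FakeDriver are illustrative only.

class LoginPage:
    USERNAME = "#username"
    PASSWORD = "#password"
    SUBMIT = "button[type=submit]"

    def __init__(self, driver):
        self.driver = driver

    def log_in(self, user, password):
        self.driver.type(self.USERNAME, user)
        self.driver.type(self.PASSWORD, password)
        self.driver.click(self.SUBMIT)

class FakeDriver:
    """Stand-in that records actions; a real suite would use a browser driver."""
    def __init__(self):
        self.actions = []

    def type(self, selector, text):
        self.actions.append(("type", selector, text))

    def click(self, selector):
        self.actions.append(("click", selector))

driver = FakeDriver()
LoginPage(driver).log_in("qa_user", "secret")
print(driver.actions[-1])  # the final recorded action is the submit click
```

When a selector changes, one edit to `LoginPage` keeps every test that logs in valid, which is exactly the maintainability the paragraph above calls for.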

We’re well past the era of QA being solely responsible for testing at the end of the SDLC. Incorporating QA in more areas, such as design and automation, can pay off and make your products stronger in the long run.
