Why Creativity and Collaboration Matter in Software Testing

Ask most people what problems crowdtesting solves, and they’re likely to list common testing requirements – certain types of coverage, scale, etc. But all too often, our customers overlook that the world’s largest and most diverse crowdtesting community can provide more than just sheer expertise and numbers. That community holds powerful connections, serious creativity and problem-solving skills.

In this blog, I cover the importance of creativity and a willingness to embrace new and complicated testing scenarios. I call out some unique testing cases we’ve executed here at Applause, as well as the need for organized and collaborative testing teams. The examples I share are from real engagements with global firms.

Testers who work, live and play

When it comes right down to it, much of software testing involves testing things we all use every day. Applause has approximately 1.7 million people around the world who work, live and play, and in doing so, access the digital experiences our customers create. This means our testers have first-hand experience with many of the apps they test. They know what they need and want from an app, and that knowledge informs much of the dedication and creativity they bring to their work. So how does this real-life experience play out in some of the more unique software test requirements we’ve seen? Here are a few examples:

  • A major U.S. sports app needed Spanish-speaking, colorblind fans to test - Applause sourced testers who met these requirements from its large community, but also had to ensure that the testers had a certain level of knowledge of the game. Applause created a screening test to confirm that prospective testers understood key rules, terms and how the game progressed under certain conditions of play.

  • A payment tester in Vatican City - A global card issuer asked Applause to test a card in every country in the world, including Europe’s smallest state, Vatican City. Applause did not have a tester in Vatican City within our global community, but one of our testers knew someone who lived there. We were able to train that individual and execute the testing.

  • A leading video game platform needed a tester in Antarctica - Again, a member of the Applause global testing community had a contact who lived in Antarctica and we were able to arrange for testing as needed.

  • Voice testing for French Canadian speakers with foreign language accents - One of our customers’ voice technologies was having difficulty understanding native Italian speakers in Canada when they spoke Canadian French. Though the blend of an Italian accent and Canadian French was the initial issue, it prompted testing in many other countries where speakers brought their own accents to the dialect. In a matter of weeks, Applause recruited hundreds of people and quickly organized appointments for them to read a script, helping train the AI algorithm behind the voice application to better handle variation in Canadian French accents.

  • A global hotel chain testing room access - Various hotel locations needed testers onsite to ensure that the app that enabled room entry not only stopped working after checkout, but also disabled access to other common areas on premises: the fitness center, the pool, the garage and more.

These examples show how lab tests simply cannot address so many real-life testing scenarios. Do you need to test a wearable device with a runner in a specific mountainous region? How about at varying temperature extremes and humidity levels, with testers that sweat heavily? Or while swimming in salt water? Our testers are regularly asked to do field work that covers a wide spectrum of requirements. There’s simply no way to check certain functionality without real people in real situations doing the testing.

Collaboration, connection and creativity

Sometimes it’s not about finding or training a tester in a specific place, but about coordinating many testers around a specific project. Take fantasy sports, for instance. Testing certain elements, such as the draft phase of fantasy football, requires several testers who are interconnected in the experience. In fact, because the draft is a rotating process, nothing proceeds if one tester is missing. The key here is coordination and collaboration. In another example, it may be prohibitively expensive to run a load-based bandwidth test at a rock concert by purchasing 100 tickets and sending testers to the show. But testers can test just outside the event, or in similar crowd-based scenarios. This is something that can’t be accurately tested in a lab, and again requires coordination.

Solving testing problems for a few that become useful for many

We regularly stretch to help our clients solve problems we may not have solved before. Here are a few examples that met specific client challenges and have become common tests we now run around the world:

Testers available every day at the same time - One of our large customers needed faster feedback than a typical two-day testing cycle allows. They release every evening at the same time and need a handful of testers available daily from 4 to 5:30 pm to do pre-production testing. Developers then fix all defects entered into the system by 7 pm and send the build back to the testers for bug-fix verification from 7 to 8 pm, ensuring the company can launch at 8:15 pm – every night.

We now do this for several clients, and it has led us to build a configurable application that pulls a pre-production build from a client's Jira system into the Applause platform. We staff testers within a predetermined window for rapid feedback.
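The nightly cadence described above can be sketched as a simple phase lookup. This is an illustrative toy, not the actual platform logic; the phase names and boundary times are placeholders drawn from the schedule in the example:

```python
from datetime import time

# Hypothetical nightly release-window phases, based on the schedule above:
# pre-production testing 4:00-5:30 pm, developer fixes until 7 pm,
# bug-fix verification 7-8 pm, launch at 8:15 pm.
PHASES = [
    (time(16, 0), time(17, 30), "pre-production testing"),
    (time(17, 30), time(19, 0), "developer bug fixes"),
    (time(19, 0), time(20, 0), "bug-fix verification"),
    (time(20, 15), time(23, 59), "launch window"),
]

def phase_at(t: time) -> str:
    """Return the release-cycle phase active at wall-clock time t."""
    for start, end, name in PHASES:
        if start <= t < end:
            return name
    return "idle"
```

A scheduling layer like this is what lets tester staffing be configured per client: each engagement supplies its own phase table, and the platform notifies the right testers when their window opens.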

Sourcing international testers for a major launch in 100 countries…that may not launch when planned - A major international streaming media company required in-market launch testing immediately after release. It needed testers in 100 countries, but they would not have pre-production access. Testers had to be available at a specific time to test a wide range of OTT configurations, including Apple TV, Samsung TV, web, mobile, Roku devices and many more. They also needed to verify that payment instruments worked for daily, monthly and annual subscriptions, validate promotion codes, and test various telco subscription services. Once the launch happened, testers worked intensively for 48 hours, fed all issues back into the client’s system, and executed bug-fix verification.

A major piece of this scenario was that the streaming company needed all the testers to be ready, even though the launch was likely to be pushed out multiple times. This required tremendous flexibility, with all testers in all countries available for the initial targeted launch time and any subsequent backup launch times.

Testers in 200 countries to provide payment instrument details - One of the world’s largest software companies was experiencing fraud from people using fake credit cards on its e-commerce site. It needed an infrastructure of valid credit cards in 200 countries, and it asked Applause to recruit and manage testers who would provide their actual credit card details to the company. We did, and the model is now used with more than 150 clients around the world.

We do things we have never done before

We’ve executed many thousands of common test cases over close to 15 years. But most people miss that Applause’s crowdtesting model is built to solve problems we have never encountered before. We regularly take on testing that pushes us to do new things and solve new problems – solutions that, inevitably, benefit other customers.

It’s part of crowdtesting’s DNA. The diversity of the model, along with its flexibility and creativity, has been sharpened by years of solving problems that were new to us. And no one does it as well as Applause.

Stay tuned. In future blogs, we’ll go deeper into specific use cases and unique testing done by the Applause global community.

Paul Hoffman
Senior Content Manager