Company Size: Large

Company Location: Exeter, UK

Testing Type: Functional

App Type: Web

Testing Coverage: Global

Met Office

Expand Your Testing Coverage

With more than 1,800 employees around the globe, the Met Office is among the world’s largest providers of weather-related services. In late 2009, as the company prepared to expand its web presence, Senior Tester Bob Doubell needed to find a sensible way to expand testing coverage within an agile framework.

"With an agile cycle, there isn't a lot of time to test," said Doubell, who manages a small team of three in-house testers. "Since there's no predictable window, we needed a solution that could make our testing efforts more flexible. It used to be a tough nut to crack, but that's where uTest (now known as Applause) has helped the most."

Although familiar with standard outsourcing procedures, Doubell was new to crowdsourcing, so while he was eager to see it in action, he was not without his doubts.

"We were skeptical, cautious and somewhat cynical about what crowdsourcing and uTest (Applause) could offer," he said. "But those feelings were eased after a few pilot test cycles."

This case study will illustrate how the Met Office leveraged the Applause community to greatly expand its testing coverage ahead of several major launches. Along the way, we’ll cover the process of getting started, selecting testers, managing the Applause platform and more.

Getting Started: Defining Goals

Improving time-to-market is the number one reason why companies sign on as Applause customers – and the Met Office was no exception. With testers around the globe, Applause was able to provide Doubell with the expertise he needed, precisely when he needed it.

"With a typical outsourcing firm, you can have 50 contracted testers sitting around until it’s time for you to test. uTest (Applause) has testers that are available 24/7, 365 days a year. Better yet, they’re available on practically no notice at all, which allows us to essentially compress 50 days of testing into a 24-hour period."
-- Bob Doubell, Senior Tester, The Met Office

But the fact that testers were available did not mean they were qualified for his project. Doubell emphasized his need for skilled, proven testers, so he made good use of Applause's advanced tester rating system.

"I start off each test cycle by inviting the ‘gold’ testers," said Doubell. "I'm looking for the top-ranked testers because their skills and enthusiasm have been proven."

Test Cycle Execution

Like many Applause customers, Doubell divides his test cycles into separate components. His first step, as mentioned, involves inviting the gold testers. He then selects testers based on factors such as geography and time zone.

"If it's midnight in the UK, then obviously I will exclude them from the test cycle and open it up to countries where they are more alert and active, since we don't have time to mess around."

Doubell ends his test cycles by inviting new members of the Applause community, an exploratory approach that helps his team get a fresh perspective.

"They might not be able to write a perfect bug report," he said of the novice testers. "But they will give me a sniff of where the real defect is. I like the ability to pick new people for projects and give them a chance to improve as a software tester."

In terms of compensation, Doubell notes, "If I don't treat them well, they are unlikely to come back and work for me again, so I always try to show my appreciation in that regard."

Test Cycle Results

On a typical Applause project, Doubell expects to see around 150 reported defects within a 24-hour period. Of that number, he said, he usually approves around 100.

"I'm looking for defects that are going to add value to our business; things that I think need to be fixed and things that need to be sent to developers in a short amount of time," he said.

To find these bugs, Doubell sets up a series of recurring functional and exploratory test cycles, in which the Met Office applications are tested across browsers, operating systems and other criteria (see below for details).

Conclusion

Few companies have a user base as large and diverse as that of the Met Office. But thanks to Applause, it was able to compress its testing activities into a much smaller timeframe – without sacrificing quality.

As one of the earliest adopters of crowdsourced testing, the Met Office dealt with some of Applause's growing pains, but now says that it plans to leverage the global community of skilled testers with every major iteration, including usability and load testing.

"I really believe uTest (Applause) has a great product," said Doubell. "They provide a solution to a very big problem, and that's why we're so happy to be using them."

"It's really like a three-way street," Doubell observed. "We're on one corner, the testers are on another and uTest (Applause) is on the third. If things go as expected, everybody wins. That's the magic of it."