How to Onboard a Crowdtesting Partner Effectively

Working with a crowdtesting provider differs from most typical technology partnerships — and often, software development organizations aren’t sure how to negotiate those differences. In a recent webinar, Applause’s CTO, Rob Mason, outlined the process for getting started with a crowdtesting partner and developing an effective relationship that delivers value. Rob walked through how to:

  • Determine the problems you want to solve. Do you want to reduce the number of bugs reaching production? Speed your release cadence? Offload repetitive, time-consuming tasks from your internal teams so they can focus on more strategic priorities? Get insight into what your customers think and how they use your products in the real world?

  • Choose a pilot project. An ideal project has a narrow focus, clear scope, and reasonably short timeline — enough so you can get a sense of whether or not the crowdtesting provider can deliver what you need.

  • Define success criteria. How are you going to assess the project? Is it speed? Number of bugs reported overall? Device coverage? Increasing your test automation? Decide how you want to measure the partner and let them know what KPIs they need to hit.

  • Communicate with your crowdtesting partner to get the best possible results. If you want to see only functional bugs but not UI issues, say so up front. Only want to see critical issues? Want bugs prioritized in a certain way? Share that information to make sure you get what you need.

  • Review and use what you’ve learned from the pilot to strengthen an extended relationship – or inform your search for another provider. Treat the conclusion of the pilot like a sprint retrospective to guide what you do next.

Access the on-demand webinar or read the transcript below.

This transcript has been edited for length and clarity.

Jennifer Waltner (JW): Welcome, everyone. Thank you for joining today's webinar, Onboarding a Crowdtesting Partner. We're going to start talking about how to get the most value out of crowdtesting. And really, it works best if you start with the end in mind. So we're going to talk about how you can come up with goals for what you want your crowdtesting engagement to look like. Then we'll go through setting the scope of your initial pilot project, how to get the best possible outcome out of that pilot engagement, and then some guidelines for measuring success.

RM: Thanks, Jenn. So, you know, the most important thing when you're starting any kind of project, whether it's crowdtesting or a development project or what have you, is to start off with what problems you're trying to solve. Clearly define what they are, where your challenges are, the problems you've had in the past and what you're trying to accomplish with the overall project.

Identify the problems you want to solve

RM: For example, are you trying to improve your overall quality? Are you trying to get more efficient? Are you trying to take a load off of your resources? You know, a lot of times our internal teams are overloaded and you need to take a load off of them so they can focus on other things.

Sometimes you don't have the scale, you don't have the number of people or devices. You need more coverage or just more people in general to hit the system with more load testing or different environments. Sometimes you need to know how things are in the real world, once it gets out in the field. So, after your app or site is already launched, you need people to go in and keep an eye on things, in terms of is it still responding correctly, are the experiences correct, all that sort of stuff.

And then, sometimes your developers are actually held up. Often development can slip into the QA cycle, which shortens QA times, but then developers can't start to work on the next feature. So you need to pull in additional resources to accelerate. That's a good way of improving developer velocity: doing QA faster, or actually closer to the development. And there are some ways of doing that that we can talk about in the future. Beta management, dogfooding, is another good example of early releases and getting them out there in the real world: real devices, real hands using the apps or sites the way they're supposed to be used.

And user acceptance testing is another great example of feedback on how the users actually feel about the product. We had the vision; our product team designed the product and built it according to our specifications. We put it out there. But how do the users actually feel about it? What's their reaction to it? Not just is it working, but is it working with minimal clicks, with a smooth and intuitive interface? You want to collect that kind of information, too. And then, ultimately it's all about the customer. So the closer you can get to people like the ones you're trying to target with your product, the better.

So, you can have one or many of these problems that you're trying to solve. It's important to prioritize them. So what's the most important when you approach this problem? Is it quality? Is it speed, velocity? Is it relieving internal teams? You really want to come into the plan and tell your partner, here are my problems that I want to solve. Here's the order of importance for them, and I need you to help me work on solutions to solve these issues. Make sure that's well framed up from the start, because ultimately you're going to come back to this problem later on and measure the partner according to this. So at the end you're going to come back and say, did they improve my quality or did they improve my speed or what have you? And you want to have good clear measurements and a feedback loop with the partners as you go through that.

Some of the benefits of crowdtesting over other QA approaches

So crowdtesting is, you know, fairly new… it's been around for over a decade now, but it's still fairly new to a lot of people in terms of thinking about it. And people are very familiar with in-house testing; that's been around for decades. Offshoring has been popular for decades at this point, but crowdtesting is a little bit different.

So there are some key attributes it has. It's generally faster to onboard with crowdtesting because you're not standing up a dedicated team in one or a few isolated locations. The crowd is fluid in that people come and go; they are tested and vetted, but they're much more flexible and dynamic. And because the crowd is much more geographically dispersed, you can get people at any time: they're in all different time zones, all different geographies. You can really onboard people quickly and you can test nights and weekends or holidays or whatever you need to test. They're available, which is very unusual versus in-house or offshoring, where you're going to run into local holidays, local vacation times, all that sort of stuff. With crowdtesting, you don't run into that. You can have people literally around the clock or at some specific time.

A lot of our customers, for example, like to test over weekends. So they'll do all their work during the week, send it off to a crowd for testing for the weekend, and on Monday morning they've got all the answers back and no time was wasted. So that's a great way of extending productivity and not wasting time around QA. And you can get rapid feedback: you can stand up these teams, onboard them quickly and ask for feedback within hours versus days or weeks.

That's just tremendous for getting feedback closer to the development cycle, which makes your developers more efficient because they haven't switched contexts, they haven't moved on to new projects. They're getting the feedback on their code or whatever they built as soon as possible, which is super helpful in crowdtesting. There are more people involved, so you have more access to devices. It can be millions of devices that you have access to. Depending on what your target audience and target devices are, you can get access to things outside of labs or isolated locations. With in-house testing, you're very limited in what you can do or simulate in your labs; with offshoring, you're often limited to just where that partner is.

With crowdtesting, it can be anywhere. You can have people test in Starbucks, or in their homes, or in shopping malls, or wherever you want. That can add interesting challenges around the testing, like network connectivity and flaky networks. A lot of development is done with perfect networking and perfectly set-up devices, but when you start using real devices in the real world, where networks come and go and the devices have a lot of things running in the background, it gets a lot more interesting and you get much better feedback. Again, with that diverse group of crowdtesters, you get exposure to all the differences.

So you can have testers from many different countries participating and giving you feedback on how their languages are either translated or rendered, including right-to-left and left-to-right languages. And there are also differences in payment systems and instruments, differences in banks and credit cards, credit card transactions, debits, cryptocurrency that come into play for you. There's a lot of flexibility.

Basically what you're seeing in this description is that the crowd is a lot more flexible. There are more of them, they're more available, and they have more access to devices, networks, locations, languages, all that sort of stuff. And they'll test in the real world. So it's not lab testing, even though it can be very structured. You can run test cases with the crowdtesters. You can also run exploratory or unstructured testing and get feedback on just… use the app, use the product, give me feedback on how it's working and tell me when things go wrong. So you get a lot more flexibility. That doesn't mean you have to give up on the quality of what you get back.

We'll talk more about that. But you know, it's important with all these benefits you get that you don't give up anything as well. So you should expect bugs that are as well-written as anything that could be done in-house or offshore. You should expect test cases to be as well-executed as they would be in-house or offshore. So just keep in mind that you should not have to give up anything when you choose to go with the crowd. And there are a number of benefits for going with the crowd.

Narrow your focus

So when you're again coming back to the beginning, where you were outlining what you need help on, you've got certain goals, whether it's improving quality or improving velocity. There are still more things you have to consider, which are the areas that you want to focus on. You might just want to do functional testing… can you add things to a shopping cart and check out, that sort of straight-line functional testing. But you also might want to add in things like usability testing. So I want to know how usable the product is: is it minimal clicks to checkout? Is it intuitive, all that sort of stuff. So there are a number of attributes that you could be looking to test in that area.

Not everyone understands omnichannel, but it's very common these days with solutions that our customers are putting into the market. You might start ordering a coffee online at your desk from a web browser, then drive to the store, use your app to pay for it when you get there, pick up the coffee, and then maybe provide feedback on the overall experience. So omnichannel is crossing multiple channels or interactions with the product or services provided by the company. And that's another good example where you need people that can order from a web browser, then drive to a store, pick up the coffee, pay for it, and give you that whole experience.

It's not good enough to have an app where you can just pay for the coffee if the whole experience is what matters. And for some customers, that whole experience and satisfaction is critical. Another area to focus on is payments and payment testing. Again, you need access to lots of real credit cards, not fake or test ones, with a lot of different transaction devices in a lot of real-world scenarios. You know, payment cards could be dirty, they could be cracked. There can be all sorts of differences there. And testing that broad range means that when you finally launch the product to the market, you'll have good success from day one. Is that important for your application, for your solution? Do you have payments in your app?

Do you need test automation? We know it's important to automate as much as you can in QA and testing so you want to think about what's your strategy around automation. A lot of customers do that themselves. A lot of people choose to do that in-house. But automation can be sent out to some of the crowdtest providers that are out there. Do you need help with that? Do you need additional tests written? Do you need test maintenance? Do you need test monitoring? There are options there, even from crowd testing partners.

Accessibility is another huge testing area, very hot these days, lots coming in terms of legislation around requirements to be accessible. Are you designing for accessibility? Have you built that in? Do you need advice and consulting on that? Do you need help testing accessibility? Is that accessibility testing just lab testing or people simulating disabilities? You should really test with people with real disabilities and people who have real accessibility challenges that can provide feedback on how you're hitting the mark or not hitting the mark.

Accessibility is not just about checking the box, it's about actually making your product accessible. And to know you're doing that, you want real people with real needs in that area so that you can make sure that you're serving them and serving them well. There's a large audience of people that have accessibility needs and it's a shame to miss out on them.


From a product perspective, voice is another hot and growing technology. A lot of people are building it into their products today: spoken commands and chatbots, whether spoken or text. Is that part of your product? If so, you need to start thinking about utterances and dialects and languages and accents and all the things that go along with that. It's a very complex area. So if you're training a voice system, there's a lot that goes into that. Crowdtesting is a great way of doing it, because you can get a widely dispersed view of people with all those different accents and geographical differences. So it's actually a great fit for crowdtesting.

And finally, the last area: security. Various crowdtesting partners offer security solutions where they can come in and run various off-the-shelf tools, sometimes homegrown tools, to explore the security perimeter of your product. With something like that, you want to clearly set up the parameters with your partner ahead of time and make sure they know what they can and cannot do. If you don't want them running denial-of-service type attacks, tell them not to do that; or maybe you only want them accessing certain systems or staying within certain boundaries. You want to very specifically define what they can and can't do as they're doing security testing for you.

Overall, this is not the whole list of solutions you can get from a good crowdtesting partner; it's just a set of them. So you really want to understand, again coming back to your needs: what are you trying to do? What are the offerings that your crowdtesting partner has, and which ones do you want to engage on? I would always encourage people to start with one solution first and then expand from there.

Choose your pilot project

Whether it's manual functional testing or usability testing, start with something that is well defined and at the top of your needs list. Go through a pilot with them, go through a trial, get to know that partner. As that succeeds, expand into other solutions. Oftentimes we run into customers that have not seen or thought about some of the areas, maybe accessibility, even though it's top of mind for many businesses these days. Maybe they're late in the game or they're trying to get quickly to market and they haven't thought about accessibility. So expanding into that area is a great way to get a jumpstart. You can have an assessment done, get a report on where you stand, and get a prioritized list of things to attack there.

So again, start with one solution with a crowdtesting partner. Take them for a test drive and provide feedback, and then expand from there. Don't start with all their solutions at the same time; that'll drive you crazy and it's not going to set you up well for success. So that leads us into the next area, which is choosing that pilot project. That's a critical part of the whole process, so you want to make sure that what you're doing is super well defined.

Again, you're not writing a detailed specification. You know, you can just have conversations with your crowdtesting partner and outline what you're trying to do. Is it a whole new application? A major piece of functionality? Are you just nailing a bunch of bugs in a release? Make sure they understand what the mission is and keep it as tight as possible.

If it's something like bug testing, give them a list of bugs and say, I want you to go through and verify all of these. That's a good test for them: measuring whether the partner knows how to navigate around your application, how to get to the different areas, and, if they didn't write the bugs, how to interpret those and recreate the scenarios. So that's a good example. But make sure that, again, the project is well defined. Don't make it super long either. Make the project fit within a few weeks so that it's not unbounded, and you're not exposed for too long with a partner in terms of too much of a time sink or not enough time between feedback.

Define success criteria

In terms of success criteria, you want to make sure that you've got the coverage you want. Maybe in the beginning you don't need a lot of device coverage, but you do want to think about what that next phase would be. So on the pilot project: okay, I want you to test the application, maybe verify some bugs on your basic iOS and Android devices, but then for the next quick follow-on, I want you to expand and cover the last three major versions on the ten most popular devices in each of those operating systems. So you want to think about how that pilot project might progress and what your goals are. In the end, what are you expecting the partner to provide?
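To make a coverage request like that concrete, it can help to spell the ask out as an explicit matrix before handing it to the partner. Here's a minimal sketch; the device names and OS versions are illustrative placeholders, not recommendations, and the real list should come from your own usage analytics:

```python
from itertools import product

# Sketch: expand a coverage request into an explicit device/OS-version matrix.
# Device names and version numbers are made-up examples for illustration.
devices = {
    "ios": ["iPhone 15", "iPhone 14", "iPhone SE"],
    "android": ["Pixel 8", "Galaxy S24", "Galaxy A54"],
}
os_versions = {
    "ios": ["17", "16", "15"],        # last three major versions
    "android": ["14", "13", "12"],
}

# One row per (device, OS version) combination the partner should cover.
coverage = [
    (device, f"{platform} {version}")
    for platform in devices
    for device, version in product(devices[platform], os_versions[platform])
]

print(len(coverage))  # 3 devices x 3 versions x 2 platforms = 18 combinations
```

Even a tiny matrix like this makes the follow-on scope unambiguous: both sides can count exactly how many combinations were asked for and how many were actually exercised.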

For me, one of the biggest quality factors is, if I'm asking for things that would result in bugs, whether it's functional testing or structured testing, I want to know how well that partner writes bugs. Are they consistent? Because again, with the crowd, you've got a lot of people involved. Are they consistently giving me great quality bugs that are well defined? Do they have screenshots, and videos when necessary? Are things annotated where they may not be clear from the screenshot or video? Are the steps to reproduce the bugs extremely clear, so that I know exactly how to understand the results that came back from them?

You want to look both at what you ask them to do as well as what you expected them to do without specifically asking — you shouldn't have to tell a crowd testing partner to write good bugs and to make sure they're reproducible. They should do that for you automatically, but you should measure that and understand the quality of those. It should be at least as good as what your teams do in terms of the quality of the output.

Also, how fast do they move through the testing? If you're doing structured testing, they're getting through 100 test cases or however many you've given them for the pilot project. How fast did they get through that? How accurate are they? What kind of bugs came out of that? Another good area is signal-to-noise ratio: how many valuable bugs are they giving you versus small formatting issues or things that are more subjective, like “I don't like this” versus “this thing doesn't work.” You want to look for a partner that is giving you valuable, actionable bugs that you want to fix.
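Signal-to-noise is easy to track with a simple tally over the pilot's bug reports. A minimal sketch, assuming a hypothetical `severity` field and a team-chosen definition of "actionable" (map both to whatever your own tracker actually uses):

```python
# Sketch: measure a pilot's signal-to-noise ratio from its bug reports.
# The severity labels and the ACTIONABLE set are assumptions for illustration.
ACTIONABLE = {"critical", "high", "medium"}  # severities you would actually fix

def signal_to_noise(bugs):
    """Return (actionable_count, noise_count, ratio) for a list of bug dicts."""
    actionable = sum(1 for b in bugs if b["severity"] in ACTIONABLE)
    noise = len(bugs) - actionable
    ratio = actionable / len(bugs) if bugs else 0.0
    return actionable, noise, ratio

pilot_bugs = [
    {"id": 1, "severity": "critical"},    # checkout fails on a real device
    {"id": 2, "severity": "low"},         # minor formatting issue
    {"id": 3, "severity": "high"},        # login loop on a flaky network
    {"id": 4, "severity": "subjective"},  # "I don't like this color"
]

actionable, noise, ratio = signal_to_noise(pilot_bugs)
print(f"{actionable} actionable, {noise} noise, ratio {ratio:.0%}")
```

Tracking the ratio across the pilot, rather than just the raw bug count, keeps a partner from looking productive by flooding you with low-value reports.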

Pilot projects are good for things that you're very familiar with. So if you haven't tested the area yourself or are not super familiar with it, it's not a great project to give to a crowdtesting partner. If you're putting them to the test, you want to give them something that you understand well, so you know what their challenges will be. And then from there you can scale it up and add more products, geographies, languages, all that sort of stuff. But in the early days, you want to test them and know what that test is like from the participants' perspective, so that you can measure them and see how well they'll do. Often it's good to start with a project that's well known and that you've tested or done in-house several times in the past.

So scope, again, hitting on this a lot: what are you testing exactly? What do you want to test, and what devices do you want to test on? This is what you call coverage. You know, that can be very specific: major and minor versions, brands, Samsung versus the Google Pixels, something like that. You can be very specific, and that might come back from your own marketing demographics for who's adopting your product and who's using it. So you might want to make sure that that is well-targeted and that the crowdtesting partner has access to the devices you need. A good crowdtesting partner will have access to all the most recent devices and operating systems, and even several generations old. But make sure that you get the coverage you need based on what you're seeing back from your analytics, from your application usage.

A lot of times crowdtesting partners get pulled in fairly late in the game… often right before release, they're almost seen as a final checkpoint for real people on real devices in the real world. But that doesn't need to be the case. Crowdtesting partners, depending on their capabilities, can be pulled in as early as in-sprint, as early as when the developer is done fixing a ticket or finishing a story. The crowdtesting partner could jump in then. So depending on their capabilities, you can pull them right into the development phase of the SDLC and ask them to test as soon as tickets are completed, or you can push them out all the way towards the end, even when things are in production.


What you do want to think about is where you're engaging the crowdtesting partner in the SDLC, and access. If it's during development, is the product or service deployed in an area that the crowdtesting partner has access to? This is a bit different versus in-house teams. Your in-house teams are inside your firewall; they're familiar with VPNs and access, and they have access to your internal systems and the various logins that are needed. With a crowdtesting partner, you have to make sure they can get access. Good partners will have their own offerings in terms of VPNs and proxies that you can allow into your environment. But you need to talk to your infosec team and make sure that you're comfortable with the access being provided to outside people at whatever stage of the SDLC you're engaging in.

Developers can't always reproduce things with their own environments or devices. That is why testing in different environments is important. You want to understand where things are going to be tested. Are you asking them to test anywhere: in their homes, in labs, in stores, in different networking scenarios? What does that look like? You want to make sure that's pretty well defined, and so are your tester criteria.

You might be developing an application that targets a specific demographic. It might be a clothes shopping app that appeals to a certain group of people, maybe a younger group or an older group, or it might come down to a number of different attributes. And really, the sky's the limit here. But if you're making a product that depends on a certain type of person, you want to make sure your crowdtesters are in the demographic that you're targeting. There's no reason to be testing your application with people that don't fit the kinds of people that would use it. Ask for people that are just like your users: these are potential customers in the future.

And that's a good test for the crowdtesting partner: how many of those people can they turn up? If you're looking for a specific type of person in a specific country and specific language, ask for that and say, I need 10 of these and I need feedback in 24 hours. So you can be very specific there. Again, don't let them choose for you: set your criteria based on what you're building and who you're building it for, and then ask for that type of person to be testing. Maybe it's doctors that you need to test your application because you're making an application for doctors. Ask for doctors with different experience levels, or maybe you need nurses or lawyers. It can be very specific like that, or it can be just a shopping app: I just need people to buy stuff. You know, it's a grocery-store type app and I need people who are shopping, but I need them in the range of 20 to 60. You can pick that kind of demographic.

Then with your partner, if you don't tell them what kind of bugs you want to see, you can get a lot of noise and a lot of mess. So you do want to be very specific. You could say, I don't want more than ten bugs, so give me the best ten bugs you can find. Or, I want only bugs that are critical and higher in terms of severity. You should align with that crowdtesting partner on your definition of a critical bug. What's their definition? Make sure you both agree on that. And if you're saying, I only want to see bugs above this level, tell them, because with some partners, if you say, find all the bugs you can, they could give you thousands of bugs. And that's not very productive; it's not a good use of your time.

So what you want to do is make sure that the partner understands the range of bugs you want to see, or the level of bugs you want to see, or maybe both, and make sure you're well aligned on that call, that quality gate right there.

This comes back a bit to where you are in the SDLC, but make sure you understand the security around the partner you're choosing. How is this secure? How do people come into the community? Is there a training program? Are they trained on basic infosec policies and procedures? Do they have NDAs in place? Can you provide your own NDA for them to sign if needed? Are they rated, vetted and curated? How do they progress through the system, from the time a new crowdtester comes on board to the time they join your project? What does that look like? How does the partner deal with those people? How do they make sure that they're able to do good work for you in a secure manner?

It's especially important when you're dealing with a pre-production product that hasn't hit the market yet. You want to make sure that you're well-protected from any leaks of a product that's not yet released and hasn't yet been reviewed or launched. You want to make sure that it isn't exposed to the world, and that the partner has good processes around that and can control their crowd.

Crowdtesting isn't just a random mix of people with no experience. It should be a carefully curated, trained, tested and vetted group of people that are specialists in testing for your needs. If it's just a random set of people, that's not very helpful: you won't get better quality bugs, you'll get trash, and you'll be exposed from a security and confidentiality perspective. So you do want to make sure that the partner has a great story around that and that there's a way for you to verify it.

A good example of a test that I always like to do: if they say anyone can join their crowd, don't tell them you're going to do it. You know, go join their crowd with a personal email address or something. See if you can join and what that experience is like. That experience will tell you what that partner is like from a vetting perspective, and the quality of that onboarding experience will tell you how good they are, too. So you can put the partner to the test with formal projects and pilots, but also put them to the test by informally poking around. Go into their forums and read what their testers are saying about them. Become a tester. You don't have to participate in a lot of projects, but just go through that experience and really understand it; put them to a test. And through that you can also understand some of the security requirements, training, NDAs, and things that the partner will put those testers through. You could just ask, you know, but trust but verify is always a good thing.

So how will those testers get access to your product? Whether you're distributing builds through TestFlight, or they're downloading your apps directly, or you're providing builds to the partner and they're redistributing them, or it's a secure website and they need to access it through a proxy or VPN, make sure you understand how that access will be provided and that you've cleared it with your infosec team so that it's going to be smooth sailing.

A lot of projects can get stuck early on if you've got testing that you need to do and you haven't thought that through, because in the past only internal teams did it and you didn't have any infosec challenges. Now all of a sudden you're in this waiting state to get started. So make sure you've cleared the barriers for outside access with those teams. It doesn't have to be wide-open access, but it needs to be controlled in some way so that an outside person can get access through a secure method.

Ask the partner how they're going to monitor and measure the testers. What stops testers from spamming you with a bunch of bugs that are no good? What if you have a rogue tester that's doing the wrong thing? What if they're just crazy and use foul language or what have you? How does the partner manage that? How do they monitor things? How do they isolate you from the noise? Those are good questions to ask.

A good thing about going through the pilot is that you can choose how much you are or are not involved. So make sure you communicate with the partner on how much you want to be involved. Do you want to just receive bugs in your bug tracking system? I just want to see bugs arriving in JIRA that are critical or above. Or do you want to go in and see bugs as they come in, how they're triaged, and choose for yourself which ones get exported to your bug tracking system? You do want to sort of calibrate there on how much you want to be involved in this whole pilot. And again, that depends on your needs.
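That "critical or above" gate can be sketched as a small filter sitting in front of the export step. The severity ranking and field names below are assumptions for illustration; the real definitions should come out of the severity alignment you do with your partner:

```python
# Sketch: only forward bugs at or above an agreed severity to your tracker.
# The severity ranking is an assumption for illustration; agree on actual
# severity definitions with your crowdtesting partner first.
SEVERITY_RANK = {"low": 0, "medium": 1, "high": 2, "critical": 3}

def bugs_to_export(incoming_bugs, minimum="critical"):
    """Filter incoming bug dicts down to those meeting the severity bar."""
    bar = SEVERITY_RANK[minimum]
    return [b for b in incoming_bugs if SEVERITY_RANK[b["severity"]] >= bar]

incoming = [
    {"id": "BUG-101", "severity": "critical"},
    {"id": "BUG-102", "severity": "low"},
    {"id": "BUG-103", "severity": "high"},
]

for bug in bugs_to_export(incoming, minimum="high"):
    print(bug["id"])  # only BUG-101 and BUG-103 reach the tracker
```

Whether this gate runs on your side or on the partner's platform is exactly the kind of involvement decision discussed above; either way, writing the threshold down removes ambiguity.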

If your need is simply to hand the work off — white-glove service, with only high-value bugs going straight to your developers — you can tell them that and ask for it. If you want to be highly involved — to see everything that's found and understand the rate at which things are coming in — tell them you want to monitor the whole thing, start to finish, and make sure they have a platform that lets you do that. Make sure they provide access so you can see how they're managing their testers, how the bugs are coming in, and where communications happen.

Can you communicate directly with the testers through that system? Make sure it's a secure system with the appropriate compliance and security controls in place. Again, security matters, especially as you go outside of your building and your in-house teams.

Start the pilot

The pilot process: kick off, meet the teams, and define and set your success criteria. Make those as clear as possible, understand how the partner is going to approach the work, and then finalize the scope. Check in early — as soon as the team has really gotten going — on progress: how they're doing with access to the product, navigating around it, the bugs that are starting to come in, and what test case execution looks like. If you're getting feedback via surveys, agree on the timeline and on any corrections you need to make. If you're getting too many bugs, or not enough, provide that feedback. Again, these crowdtesting partners work for you — they're essentially contractors — so you don't have to worry about hurting their feelings. They want direct, honest feedback.

A good partner will correct and adjust quickly for you. Tell them exactly what you want and how you want it. Be very opinionated — they're doing work for you. Then, as the pilot progresses, you get a readout: how did it go? What were the tests like? You have midpoint chances to correct things, but at the end, review what was done — the bugs that were filed, the test cases that were executed, the coverage you got, and the team's recommendations. The team has experts on it who will have strong opinions about what you should do next: things that worked, things that didn't, things they would adjust. And they'll look for your feedback on that.

Definitely provide a lot of that feedback, and then talk about where the pilot goes from there. Does it expand to more coverage? More geographies, more areas of your product? Do you start looking at other testing areas like accessibility, as we talked about? Assuming the pilot was a success, how would you expand from where you were? What does that growth look like?

Make sure you understand who's who on the team when you're talking to a crowdtesting partner: who is your main point of contact? Who's doing the vetting of the bugs, who is running the programs? Make sure you've got the correct contact points and make sure they know who to contact in your own company. Make sure they know who to reach out to so that they're reaching out at the appropriate level with the right sort of feedback. So with that, I'm going to let Jenn pick up on some recent pilot projects for some examples.

Examples of good pilot projects

JW: Great. Thanks, Rob. So you were talking about how important it is to have something that's very well-scoped for a pilot; where you can understand the impact of what you would find if your team did the testing. You already know what that looks like for you. And these are some examples of recent projects that we've done that have that clear focus, very clear boundaries: This is what we're looking for. This is where we're going to focus. These are the success criteria.

So the first one is a website accessibility audit for a financial services company. Particularly with accessibility, yes, there are a ton of automated tools that will go out and scan your site and crawl it and say, “This is problematic. You don't have the right alt tags or you don't have this button labeled correctly.” But having an accessibility audit from people with disabilities who can specifically identify, “this is what is broken for a user with vision impairment, this is what doesn't work for somebody with mobility impairments, this is where your keyboard navigation breaks” and having concrete recommendations — that's very valuable. But again, it's also a very well scoped project. We're only looking at the website, we're not looking at the apps, and we're just looking at accessibility. We're not looking at all functionality. We're just focused on how we can make this more usable for people with disabilities and how we can improve our accessibility conformance.

The next example is an online shopping cart for a retailer. We did both functional and exploratory testing across five different browsers on Windows 10. There was a combination of test cases — obviously we wanted testers to complete specific transactions using those different browsers — but also exploratory work: can you break the shopping cart? We weren't trying to break the whole website or use all the different apps; we were looking at this specific website, and only at the shopping cart. Again, very clearly defined and very focused, with room to expand in the future based on what we learned from that project.

Another good example was a UX study focused on a retailer’s website with testers in Germany. This particular retailer had done some market expansion and wanted to understand how users in that German market were experiencing their website and they wanted to get feedback on what they could do to improve in this market that was relatively new to them.

The last example I'll share is payment testing for an airline using specific payment instruments and test cases. What happens when somebody uses a combination of an American Express card and some of their loyalty points? What happens if they add a checked bag? If they upgrade to first class? If they use a different payment instrument, or decide that instead of two adult travelers, they're booking two adults and three children? Covering those different use cases and how they affect the payment process, along with the different payment instruments, was really important. But again, we were just testing the payment process, not whether specific flights could be booked. Very focused on payments and what that looked like.

So these are good examples because again, it's very clear what the scope is. It's very clear what you're trying to accomplish in each of these. You're trying to understand a discrete digital property or process, not every single digital thing that your company offers or your entire end-to-end experience. These are good pilots because they are so tightly focused and it's easy to have conversations about whether the types of bugs and communications you're getting are appropriate. But again, they're all good examples because they're clear on what type of feedback you're looking for, what's covered, what market, the timeline. These were all solid starting points for pilot projects. So with that, I am going to hand it back to Rob and he will go from there.

Walk through a sample workflow

RM: Okay. I'm going to walk you through an example of functional testing and what that workflow looks like. When you're starting with a crowdtesting partner, you want to visualize the flow. It starts, of course, with the criteria you're setting: make sure the project is well defined. You set up the build scope, the coverage, all that sort of stuff. You hand it off to the crowdtesting partner and they create a test cycle — basically an instance of what they're going to run.

And then they find the right team for you, from their crowd. So they create the team based on any demographics you need or any particular skill sets you need. They find that team for you and then invite them to the project. Then those people will join. Again, these are invites. They're not employees. So if you want 10 to 15 people, you might have to invite 20 to 25 because people may not accept the invite. You need to make sure that the vendor has enough of a crowd that can hit the numbers that you're looking for.
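
The invite overbooking Rob describes is simple arithmetic. Here's a minimal sketch — the 60% acceptance rate is a hypothetical figure for illustration, not a benchmark from any vendor:

```python
import math

def invites_needed(target_testers: int, acceptance_rate: float) -> int:
    """Estimate how many invitations to send to end up with the target
    number of testers, given the fraction of invitees who typically
    accept. Rounding first guards against floating-point noise."""
    return math.ceil(round(target_testers / acceptance_rate, 9))

# If roughly 60% of invited testers accept, a team of 15
# needs about 25 invitations.
print(invites_needed(15, 0.60))  # -> 25
```

The useful takeaway is just to plan invitations against a realistic acceptance rate rather than assuming every invitee joins.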

Once [testers] accept the invites, they'll come in, start executing the test cases, look for bugs, and file those bugs inside the crowdtesting partner's platform. As they do that work, it's monitored by the crowdtesting partner — making sure you're getting the right quality of bugs, that they're well written, and that severities are set correctly. Well-trained crowdtesters won't have these problems, but it's important to have a quality gate in place before things get to you or your team. That's what the crowdtesting partner's managers are doing: managing the crowd, the testing, the execution, the flow.

From there, you want to evaluate the bugs coming out of that partner. Those are either exported directly into your bug tracking system, or you can go into the partner's platform and view them there. You can ask follow-up questions if something isn't clear. Again, good crowdtesters will file very clear bugs, so there should be little need to go back and forth, but sometimes there are nuances based on the application and things like that. You can set rules up front and have the crowdtesting partner immediately export everything into your bug tracking system based on those rules, or you might want to go in there and triage with them. Make sure you've discussed that with them.
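
The rule-based export just described could look something like this sketch. The field names, severity scale (1 = critical through 4 = trivial), and categories are hypothetical examples, not any particular platform's API:

```python
from dataclasses import dataclass

@dataclass
class Bug:
    title: str
    severity: int   # hypothetical scale: 1 = critical ... 4 = trivial
    category: str   # e.g. "functional", "ui", "performance"

def should_export(bug: Bug, max_severity: int = 2,
                  excluded_categories: tuple = ("ui",)) -> bool:
    """Apply export rules agreed with the partner: only severities at or
    above the threshold, skipping categories the team asked not to see."""
    return bug.severity <= max_severity and bug.category not in excluded_categories

bugs = [
    Bug("Checkout button unresponsive", 1, "functional"),
    Bug("Logo slightly misaligned", 3, "ui"),
    Bug("Cart total miscalculated", 2, "functional"),
]

# Only the critical/high functional bugs pass the gate to the tracker.
to_tracker = [b.title for b in bugs if should_export(b)]
print(to_tracker)  # -> ['Checkout button unresponsive', 'Cart total miscalculated']
```

In practice these rules live in the partner's platform, but writing them down this explicitly is a good way to agree on them up front.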

Then, once those bugs are in your system, your developers jump in and fix things — and you want to know that the issues that were found have actually been corrected. That's what we call bug fix verification: the testers go back in, retest the issues they found, and make sure they've been corrected. From there it becomes a continuous loop: adding new functionality, testing, verifying fixes, and validating. So that's what the flow looks like.
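
The whole functional-testing workflow can be summarized as an ordered list of stages — a descriptive sketch of the flow above, not any vendor's actual pipeline:

```python
# Hypothetical summary of the crowdtesting cycle described above.
WORKFLOW_STEPS = [
    "define scope and success criteria",
    "partner creates a test cycle",
    "partner builds and invites a matched tester team",
    "testers execute test cases and file bugs",
    "partner triages and quality-gates the bugs",
    "bugs export to your tracker (by rule or manual triage)",
    "developers fix the issues",
    "testers retest: bug fix verification",
]

# Print the cycle as a numbered checklist.
for i, step in enumerate(WORKFLOW_STEPS, 1):
    print(f"{i}. {step}")
```

Steps 4 through 8 repeat each cycle as new functionality ships, which is the loop Rob describes.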

A mini version of this for a pilot is a great way to get started. You know, pick a small project and run through this whole flow, get that experience and just think about that at scale and branching out to other solutions like accessibility or more environments or larger numbers of testers, geographies or whatever the differences are for your product. So just think about those.

Getting the most out of a crowdtesting engagement

Now I'm going to summarize some of the highlights of getting the best results out of the system. Be involved and communicate. Again, you can just throw it over the wall and say, "Hey, white-glove service, do your best." But if you set expectations and you're involved in giving feedback, especially in the early days of an engagement with a crowdtesting partner, you'll get the best results.

They work for you — give them direction. Over time, they'll understand exactly what you like and what you don't like; they'll get to know you well and be able to anticipate your needs. In the beginning, they won't know you that well, so you need to get introduced. There are very fine-grained preferences you might have around how bugs are written. Even though the crowdtesting partner will know how to write good bugs, you might have strong preferences: you might always want a video, never want a video, or always want things annotated. Just make sure you've communicated that well to the partner.

Find the right signal-to-noise ratio. I mentioned this earlier. If you only want critical bugs, tell them that. If you want every single bug that's found, tell them that. If you don't want UI bugs — only things that are functionally broken, where a shopping cart that looks bad is acceptable but a cart you can't add an item to is not — make sure that's clear.

You know, all software has bugs. So be clear on which bugs are unacceptable and must be found, and which ones you don't want to spend time fixing — you don't always have time to fix every single thing that can be found. That's just the nature of software development. Set that ratio. Don't take in more bugs than your team can fix — there's not much point in that — but make sure you're finding the critical ones.

Make sure you're challenging the vendor. If you're getting trash bugs that just waste your time, that's not good value. Make sure they're giving you the bugs that would most impact your business — ones you could actually tie back to revenue, sales, impressions, or what have you. Make sure you can tie it back to value and that what they're giving you is easy to quantify. If the goal is getting releases out faster, time them, and make sure they're getting through the test cases in the right amount of time. If it's finding all the high-severity bugs, make sure you have good feedback from your post-release systems so you can see anything that escaped QA into the field. If that happens, go back to the vendor and ask: how did this happen? How did you miss this? Tie it back to value so you can justify the engagement with the crowdtesting partner and get the most out of them — because they're working for you.

At that midpoint, ask that question about value and make sure the teams are working well together. Are they people of a like mindset? Are they professionals — an enterprise-grade testing team that's simply built a different way, crowdtesting versus in-house? Are they measuring the right things for you: velocity, quality, bugs, any of that? And provide the feedback. Always provide your feedback and your preferences; again, don't hold back in that area.

Measuring success: set [the criteria] up front, then come back at the end and measure against them. Is it speed? Coverage? The types of bugs? Come back and quantify it. Ask the partner to score themselves, like a self-review — we're getting toward the end of the year and performance-management time here — then look at their score, score them yourself, and share that with them. Again, don't worry about hurting feelings; they're working for you. Have them self-score on how they did in the areas that are important to you, score them yourself, and then have that discussion. If necessary, run another cycle and see how they adjust. Make sure they take the feedback well. They need to — they work for you.
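
Comparing the partner's self-scores against your own can be as simple as a per-KPI gap. A minimal sketch — the KPI names and the 1–5 scale are assumptions for illustration, not a standard:

```python
def score_gap(self_scores: dict, your_scores: dict) -> dict:
    """Per-KPI gap: partner self-score minus your score (1-5 scale).
    Large positive gaps flag areas where perceptions diverge
    and are worth discussing at the review."""
    return {kpi: self_scores[kpi] - your_scores[kpi] for kpi in your_scores}

self_scores = {"speed": 5, "bug_quality": 4, "device_coverage": 5}
your_scores = {"speed": 4, "bug_quality": 4, "device_coverage": 3}

gaps = score_gap(self_scores, your_scores)
print(gaps)  # -> {'speed': 1, 'bug_quality': 0, 'device_coverage': 2}
```

Here device coverage shows the widest gap, so that's the first topic for the scoring discussion.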

Again, make sure they're meeting your needs, measure how they're doing, and make sure they know this isn't a one-time thing. It's not just during the pilot that you'll do this; you're going to constantly measure them and constantly provide feedback, because that's how you get the most value out of the partner. So do continually provide feedback — not just the first time, but always. And if there are differences of opinion, make sure they understand what you need and how you need it.

This kind of comes back around to the Agile software development process, right? That feedback we were just talking about, it's the retro, right? Basically, we just completed the pilot. I'm going to tell you how we did and what we could do better. I'm going to give you that feedback before you start the project… Sprint planning essentially.

It's setting things up for success. The check-ins — midpoint, or any additional ones as needed — are your daily stand-ups, or scrums. Again, you don't have to do them daily; pick the cadence that works for you. Then there's the review of the output: that's your sprint review, where you look at the quality that came back, with the various people involved driving it all forward.

If you're coming from a development world — a shop that's all developers without enough QA involved — you can map what the crowdtesters are doing to an Agile process. It's another good framework for spotting things that might be missing: oh, I didn't have a retro, I didn't review with them how things went at the end; or, I didn't have a check-in midway, and I should have, because I should stay in touch and continue to provide feedback.

Another good lens for engaging with a crowdtesting partner: look at how well that team integrates with your team. If you have an onshore team you're using alongside the crowdtesting partner, what does that interface look like? Who does what? What's the handoff? Do they understand your SDLC? Are you waterfall, Agile, or some modification? Almost nobody is purely waterfall or purely Agile — there are usually nuances — so do they understand that? Can they map to it and work within your SDLC? Are they accessing your systems without trouble through whatever VPNs or proxies are needed? Do they have all the access they need?

Are they communicating in your language, and in clear English if that's the language you're using? Are they communicating clearly about what happened and what they're going to do next, and answering your questions as clearly as possible, so the communication paths stay good? A good partner isn't just about good bugs — it's also about good communication, because that's how you make sure you're getting the most quality out of them. So that's the recap.

We're going to move now into the Q&A process. I think Jenn has been collecting some questions as we've gone. I know we've gone a little bit long, but we'll try and address some of your questions here.

Q&A

JW: Great. So one of the questions that came up is: do you see choosing a crowdtesting partner and getting started with them as fundamentally different from working with any other sort of technology partner?

RM: Yeah, that's a great question. I actually do think it's different, because with a lot of technology partners — if you're choosing an offshore team or a software development partner — you're thinking about a very long engagement. It takes a long time to get them started, and you have to define things up front, which limits your ability to provide feedback. Once you've chosen them and started them on a project, they're a lot less flexible, because you tend to have to write explicit specifications around what you do and don't want them to do.

Crowdtesting is about agility — about reacting to your needs. While you do have to set your expectations and needs up front, you don't have to be as precise. You don't have to write a manual or a legal contract. Obviously you need an MSA in place, but you don't have to contractually specify everything they will and will not do. You can stay flexible and know that up front, which lets you rapidly iterate on what you really need. Keep in mind that that speed, and the ability to adapt to your needs over time, is available to you. It changes the way you interact with them, which is really beneficial, because the old path can be painful.

JW: Great. And one more question. What do you think is people's greatest misunderstanding about getting started with crowdtesting?

RM: I've been at Applause for five years now, and before I joined, I had the same misconception: I thought of crowdtesting as just a bunch of random people doing random things in the hope of good outcomes. It's the old question of how many monkeys you put in a room to write Shakespeare — put a million monkeys in, and eventually Shakespeare comes out. Well, that's not true for Shakespeare and monkeys, and it's not true for quality. What you really need to put in the room are seasoned, vetted testing professionals. Just because it's crowdtesting doesn't mean you should lower your expectations of the quality of the people or of the work they produce.

These are professionals, and you should expect professionals. They know how to write bugs. They know how to execute test cases. They're serious about their jobs — they're trying to make a living. They're not different from the people on your teams; it's just that the way they do the work is a bit different. Keep that in mind. I had that misconception before I came to Applause, and many people I run into still think about it that way. It's like imagining an Uber driver who doesn't know how to drive. That's not the case — Uber drivers know how to drive; it's one of the requirements. Well, our crowdtesters know how to test.

If you're with the right partner, they're training the testers, vetting them, testing them, and kicking them out [of the crowd] if they're not doing a good job — they're on top of it. You can trust that if you pick a good partner, you'll get good testers who are quality professionals and experts in what they do. [That] depends on the solution: if you're doing accessibility testing, they should be accessibility experts who engage, not just random people who can spot differences in colors. They need to be accessibility experts. Or payment instruments — if you're doing that kind of testing, they need to have the credit cards, know how to use them, and know how to find the right locations for them, all that sort of stuff.

So again, expect expertise, demand it, provide feedback.
