UX Metrics That Actually Make Sense
About This Episode
Some businesses have absolutely no clue how to measure the user experience — and it shows. Whether they’re measuring the wrong things from the start, beginning too late in the development cycle, or abandoning UX projects outright, some brands consistently fail to create a high-quality customer experience.
Inge De Bleecker, Principal UX & Conversational AI Consultant and Founder at outriderUX, has seen the best and worst of it. In this episode of the Ready, Test, Go. podcast, she talks about how to build UX awareness from the ground up. It starts with having the right voices in the room to represent the necessary perspectives, which can help organizations build out cross-functional teams and actual organizational buy-in.
She also talks about co-creating the USERIndex score to measure the user experience on a deeper level for the mobile age.
Inge De Bleecker
For three decades, Inge De Bleecker has designed, developed, managed and consulted on the user experience. Inge has spent her career focusing on the user and making the user’s life easier when interacting with digital products.
(This transcript has been edited for brevity.)
David Carty: Everyone knows the iPhone changed the world. Whether you wanted to video chat with a friend across the world or just look at cat videos on your way to work, the iPhone changed our way of interacting with the world.
As it turns out, it also changed the world of usability benchmarking. The iPhone revealed an unmet need, so Inge De Bleecker co-created the USERIndex. The letters stand for…
Inge De Bleecker: Usefulness, Satisfaction, Ease of use and Reliability.
So those are the four dimensions that we measure in the USERIndex. And that gives a bit more of a comprehensive overview of what, today, I guess we define as the user's experience with the iPhone. It wasn't just about ease of use and learnability anymore; now it was about delight, about that satisfaction. It became about more. So measuring the user experience became something that we felt needed more dimensions.
Carty: The open source USERIndex poses ten naturally worded questions to users to gauge their experiences with a digital interface.
De Bleecker: We did a number of experiments just trying to fine-tune the statements and the way we calculate the scores and then also what those scores mean because what does a 3.9 mean versus a 4.2? So we have a nice scale all the way from green to red with orange in the middle. That kind of helps give an understanding of where you truly sit. In addition, of course, we have the historical data, which by now is quite significant from eight years’ worth of studies that also help companies understand where they sit within their industry in terms of their score.
We also use it for inclusive experiences. We've tested it out on anything from websites to mobile apps to conversational experiences. So it's quite versatile as a high-level benchmark.
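[Editor's note: the general shape of a benchmark like this, averaging Likert-style responses per dimension and mapping the composite onto a green-to-red band, can be sketched in a few lines. The statement-to-dimension grouping, the equal weighting, and the band thresholds below are illustrative assumptions, not the published USERIndex formula.]

```python
from statistics import mean

# Hypothetical grouping of ten 5-point Likert statements into the four
# USER dimensions (Usefulness, Satisfaction, Ease of use, Reliability).
# The real mapping and weighting are assumptions for illustration only.
DIMENSIONS = {
    "usefulness":   [0, 1, 2],
    "satisfaction": [3, 4, 5],
    "ease_of_use":  [6, 7],
    "reliability":  [8, 9],
}

def score(responses):
    """Average 1-5 responses per dimension, then average the dimensions."""
    dim_scores = {name: mean(responses[i] for i in idx)
                  for name, idx in DIMENSIONS.items()}
    return mean(dim_scores.values()), dim_scores

def band(composite, red_below=3.0, green_above=4.0):
    """Map a composite score onto a red/orange/green band (thresholds assumed)."""
    if composite < red_below:
        return "red"
    if composite < green_above:
        return "orange"
    return "green"

composite, dims = score([5, 5, 4, 4, 4, 5, 5, 4, 4, 4])
print(round(composite, 2), band(composite))  # composite ≈ 4.375, "green"
```

This is also where the historical data mentioned above would come in: a 4.2 only becomes meaningful once it can be placed against the band thresholds and against comparable scores from the same industry.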
Carty: And the most rewarding part for Inge is hearing second-hand how it is helping brands all over the world.
De Bleecker: It's a little bit hard for us to know. We definitely do get emails, either from people who are interested or from people who say, "Oh yeah, I've been using the USERIndex and it's really been working out well." A consultant at Porsche came to us, this was about two years ago, and he said, "You know, I am evaluating our benchmarking score for Porsche. I looked at a number of different ones, and your score, the statements, they really hit the values that the brand is going for. And because of that, I've recommended your score." And that really hit something: okay, yes, these statements are relevant to the user experience, so relevant that they resonate with the brands themselves.
Carty: This is the Ready, Test, Go. podcast brought to you by Applause. I'm David Carty.
Today's guest is UX expert Inge De Bleecker. Inge is a UX designer and a UX researcher. She has built and managed UX teams and been the driving force behind UX process implementation across organizations. Inge is the founder of outriderUX and also its principal UX and conversational AI consultant. Over 30 years in the business, she's seen it all: the good, the bad and the ugly when it comes to UX strategy. And that's what we spoke about.
Inge, thank you for joining us. Let's start by setting the stage. Why do we want to capture usability metrics? This takes time, investment and effort. So what's the benefit and why do we go to the trouble in the first place?
De Bleecker: It does indeed. It takes time, investment and effort. Absolutely. Why do we want to measure it? Because we want to understand our progress. We start with a certain baseline, obviously, after having identified what the metrics are that we want to actually keep track of, very important part. And once we have that, we set a baseline. Where are we at right now? And we understand the goal. Where do we want to go? And so the way to gauge that is by doing something that is metrics- or data-driven. The other advantage of doing that is that hopefully there is improvement. Right. You are working towards your goal. Now there's a very clear way to communicate progress, to communicate let's call it success. And so if you are part of a team that is working towards improving that user experience and you need more budget, you need a tool, you need whatever you may need, now, you really do have some hard facts to go to an executive team and a manager and say, "Look, this is what we've done so far. This is good stuff. Give us some more."
Carty: We live in a data-driven world now more than ever. Data drives business decisions, hiring decisions, even personal decisions, right? So with that in mind, explain to us the importance of not only measuring usability in a comprehensive way, but also in a comprehensible way that everybody in the business can understand.
De Bleecker: Yeah, that's a big challenge. I think it's a challenge with data anyways. So [with] data, you've got numbers and then really right away it's like, “Well, okay, I've got numbers. What do they mean? What does this really represent?” The key there is to be able to provide the right nuances behind the numbers and to also look at the numbers in the right way.
I actually worked with a client earlier this year, and they were very interested in a metrics-driven approach to their product development. So we set up quite a comprehensive strategy for that, and it was great. But then, as we started measuring early in product development, we got these super-high numbers: "90%! 100%! We're there!" Clearly, of course, we were not there. We were still very early in development. What happened was, somebody was working against a script, doing QA against the script. They got 100%. Well, they'd better! That was exactly what was laid out. That is very different from measuring a user out in the wild using the product. At the end of the day, those targets we want to get to are for the user experience out in the wild.
So, it does really need to be nuanced. It needs to be explained. The first thing I said is that, look, you know, we cannot go to the executive management with these numbers. It's misrepresenting things. So, it needs to be comprehensible. It needs to be meaningful across the board. And, more often than not, you actually need some qualitative information, some user feedback or some additional background information in order to be able to understand what those numbers truly mean.
Carty: And you work with clients all over the world, right? So you must see this somewhat often, well-intentioned efforts around usability and UX, but they miss the mark. Any other examples come to mind of how you see that in the real world?
De Bleecker: Yeah, as I was alluding to earlier, it's not only about the measuring, it's also about what you measure. That is very often where it goes wrong in the first place. That really is something that requires thought.
I've run workshops with clients where we were going to talk about the metrics strategy: what metrics to focus on, how to collect them, and so on. I give them the homework of thinking about their company's KPIs, because at the end of the day, when you think about "What do I want to measure?", part of it is the user, and definitely part of it is the business as well. Then there's technology and a number of other aspects too. But if we just think about the user and the business, that is where I say, "All right, come to us with your KPIs. What is your business focused on? What does success mean for this product, for your business?" And it is amazing to me how difficult that is for teams, especially in larger corporations — how difficult it is for teams to come to the table with that, to really start building out a meaningful set of metrics. And, again, that's step zero, honestly, right? If you don't get that right…
It's really interesting because just this morning I was on LinkedIn and I saw a thread. It was about metrics, and gathering metrics, and the comments were all about how the metrics are measuring the wrong things more often than not and are not understandable. So, this does seem to be a very systemic problem throughout the industry.
Carty: Part of this is not having the right voices in the room. When you're deciding what to measure in the first place, who should be in the room to help define those metrics that the business will ultimately measure and make business decisions against?
De Bleecker: Yeah, absolutely. You need a lot of different people in the room because you need all those different voices to be heard. The user’s voice, the business voice, and at times the metrics and sort of the decisions on what is success will have to include a compromise between some of those voices. I mean, when you think about sort of IVR systems, for instance, you've got the user experience that's important, but then you also have the containment rates and those business constraints that are important as well. So you do need to come to sort of a comprehensive overview. To do that, you really do want all the different stakeholders across the board, including the engineers, including the product managers, UX, business, everybody in the room from the start.
Carty: Do you find yourself being the negotiator between two sides a lot? Is that very conflict-driven? Are you put in that position a lot?
De Bleecker: There's definitely an aspect of that. Depending on the organization, that may be easier or more difficult to navigate. But, yeah, absolutely, at least making sure that there's an awareness across the room, right? A mutual awareness of the importance of all of these different aspects, because everybody tends to think from their own silo, and they're also quite focused on their own objectives. But, at the end of the day, this is where it comes down to the customer experience, right? It's something that really has to be driven across the board, across the different parts of the organization.
Carty: Ideally, what perspectives do you like to have in the room to help give input toward this problem? Is there anybody that's commonly left out or anything like that?
De Bleecker: I think the UX team is probably commonly left out, or at least that's a voice I hear a lot. It's pretty common for the product team and the business side to come together and understand that this is an effort that needs to happen. And I think it's the other, to some extent more auxiliary, teams that are left out, or people just don't think about it. Largely, I think, that can be the case: people just don't think about the fact that more people really should be involved from the start.
That's the other part of it, right? I mean, we're really talking about defining these metrics very early on. So who's involved at that point in time? It is very often largely the product teams, the business side.
Carty: Bias is going to come into play. Institutional thinking is going to come into play, right? So what are some ways that we can reduce or eliminate those issues?
De Bleecker: Communication. As difficult as it sometimes is, and I know these things take time, one of the efforts I've suggested, and that teams have adopted, is a committee of sorts, a cross-functional committee that comes together. There are pros and cons there as well, because everybody's already very overburdened. They don't need another meeting for something that may not be directly in their daily line of sight. So, there's that, of course. And then I think, in general, HR, for instance, can play a role in fostering communication and inclusion across the board, right? Inclusion in the customer experience. Again, that's what it all comes down to at the end of the day. And you also have to have executive buy-in, a strong executive voice. That is definitely very helpful as well.
Carty: How soon in a life cycle can you begin to gather this kind of feedback, and are businesses missing an opportunity by not shifting usability testing left in the lifecycle?
De Bleecker: Usability testing should absolutely be shifted as far left as possible in the life cycle. As soon as you have a prototype, start usability testing.
In terms of metrics, it's a good practice to start gathering the data early. But, as I mentioned earlier with the example of my client, that data does need to be taken with a grain of salt in terms of the bigger picture. At least it gives you an opportunity to get those best practices in place and maybe refine your metrics as well, because you have to put these metrics together at the very beginning so that you can use them throughout. It's a big task, so it helps that you get a chance to refine as you go early on, when the stakes aren't quite so high. You won't have to go up on stage and say, "Oh, well, we don't have data for that particular time because we were still fiddling with it." So, there's an advantage and a disadvantage there.
Carty: You spoke before about getting executive buy-in. I'm sure that one of the big obstacles to process change is just that internal resistance and the kind of inertia that you have to work through. How do you go about handling that and getting people on board with the program?
De Bleecker: Again, a lot of communication, evangelizing, raising awareness, whether that's through campaigns, brown bags, any and every way possible. It very much depends on the type of organization and the way it works. What are they most receptive to? A lot of teams these days are actually very distributed, even in terms of their tools. What are they using as tools? What do they already have in place in terms of monthly meetings, knowledge sharing, things like that? And then it's really just gauging what people are open to. You've got some teams that are very open to learning new things and embracing that, that have some sense of innovation within the company.
So you look at what the company values are and what's already underway, and then you pick and choose a little bit and tack on to that. That's for sure the best way to gain traction. If you're starting something completely from scratch, it just takes longer, and there's a lot more risk of things just dying. You start something up and then, poof, it's done. Maybe you did some good, right? But maybe not.
Carty: So, really, aligning it behind shared business goals is super important there, right?
De Bleecker: Yeah, shared business goals. Just understanding the business, the employee experience, what the business focuses on, all of those.
Carty: Great. Now, it's one thing to put together a logical strategy, and it's another to execute it at scale. So what kinds of challenges do organizations face with scaling usability data collection and analysis as they launch into new markets and add to their products?
De Bleecker: Yeah, new markets are definitely a really interesting topic if we're thinking globally. Specifically because, if you're gathering usability metrics, or really gathering user feedback in a data-driven way, what we've seen time and time again is that users will rate things differently in different cultures, whether it's because they're more positively inclined, more negatively inclined, or too polite.
Carty: So not over here, right? Not in our country.
De Bleecker: The British can be very polite, you know, and some European countries will be a little more straightforward. It's really fascinating, because I've seen it especially in studies where you have the quantitative aspects, you've got the data, and you also have the qualitative user feedback, and you run that on the same product in different countries. If you have a decent sample size, you really have a nice little project there in terms of looking at what it truly means. That is what we see over and over again.
So, again, the metrics are not the end-all, be-all in a sense. They're just one tool, right? One tool that needs to be handled carefully, but if handled correctly it can be very helpful. And, yeah, looking at a global audience, for instance, is very interesting. Then there are things like: if you've previously looked at your entire product and now you're looking at just one new feature, that's different, right? So again, it's about keeping all of that in mind.
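[Editor's note: one common way to handle the cultural response bias described above is to standardise scores within each market before comparing them, so that "above average for that market" is what gets compared rather than raw numbers. A minimal sketch with made-up data, using a plain z-score as the assumed normalisation; this is a standard analysis technique, not a method prescribed in the interview.]

```python
from statistics import mean, stdev

# Made-up raw 1-5 ratings for the same product in two markets. Note the
# first market rates politely high and the second more bluntly.
ratings = {
    "market_a": [4.2, 4.4, 4.3, 4.5, 4.1],
    "market_b": [3.4, 3.9, 3.1, 3.8, 3.3],
}

def z_scores(values):
    """Standardise values so each market's mean is 0 and spread is 1."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

for market, raw in ratings.items():
    # On raw scores the two markets look very different; on z-scores,
    # within-market highs and lows become directly comparable.
    print(market, [round(z, 2) for z in z_scores(raw)])
```

The qualitative feedback mentioned above then explains *why* a given item sits high or low within its market, which the standardised numbers alone cannot.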
Then I do run into clients who say, "Well, what's the point then, at the end of the day, right? What am I truly going to learn from this?" And that, again, is where I sort of come in and say, "Okay, I understand this is not magic." Nothing is magic. So just use it, use it wisely. And that's the goal there, really.
Carty: So you have to fight that nihilistic perspective on top of everything else, "Well, then why try in the first place?"
De Bleecker: Yeah, and that's okay. I mean, I think it only helps to be able to explain how or where the value lies. So, that's all right. I don't mind those conversations.
Carty: Sure. Now, if you're consistently failing to see ROI on your usability or UX efforts, is it time to take out the wrecking ball? How should organizations rebuild their approach from the ground up?
De Bleecker: Yeah, if there are consistent issues, then there are issues, no doubt. My suggestion would be to perform an assessment across the board to really understand where the challenges lie. It can be very many things. It could be a lack of bandwidth, where people just don't have enough time to do X, Y or Z. It could be that the metrics are wrong. There are so many factors, and it's probably a combination of a number of them.
So, you know, even if the answer is to rebuild, which I think is generally more about fixing than completely rebuilding, you would want to know where exactly the challenges were so that you can avoid them in the future. So an independent assessment from an outside vendor, an outside set of eyes, is definitely a good first step.
Carty: So that you don't fail all over again.
De Bleecker: That's right.
Carty: Moving on to our final sprint questions here, our lightning round. In one sentence, what does digital quality mean to you?
De Bleecker: To me, it means that a digital product works: as a user, I don't encounter any bugs. It is easy to use, so usability is there. And it is appealing to use, something people feel they like using. Maybe that's a bit of a bonus, but if we can get there, then that, to me, is digital quality. There are other aspects too, obviously, like performance, that will have an impact on my user experience.
Carty: What will digital experiences look like five years from now?
De Bleecker: Yeah, that's always a great question, isn't it?
So, on the one hand, David, it is going to look completely different. I mean, it is going to be nothing like what we do today. We are going to be talking to everything. AI is going to be everywhere, right?
No, it's not. I mean, that's one side of it. Because the other side is that, I think, five years from now, in many ways very little will have changed. We will still be using the websites we use today to check our email or, I don't know, to submit a claim.
Carty: So a little bit in the middle there somewhere?
De Bleecker: I think so. I think you'll continue to see sort of the mainstream of what we already have.
I think submitting a claim will only become easier, in little bits, because by now we're quite mature on that side of the house, so to speak, when it comes to innovation. When it comes to AI, I think we'll continue to see a lot of different experiments, as I call them, trying to do things in a certain novel way. Some of them will land, and some of them will not.
So, you know, at the end of the day, where do we end up there five years from now? Probably some progress. But, no, the world is not going to have completely changed.
Carty: Not exactly flying cars and things like that.
De Bleecker: I know, that was exactly what I was thinking about when I was saying this. Yeah, no, not yet. Still, not yet.
Carty: We'll keep an eye out. You never know.
What is something that you are hopeful for?
De Bleecker: Something I would like to see is what I call discipline. I'm now specifically going to be talking about conversational AI and that industry, because it is a little bit newer, more nascent, less mature. When we think about digital products in general, we have the frameworks, we have the processes for the software development lifecycle, right? We've been doing this — it doesn't really matter what type of product it is, it largely can be applied, right? So, we know how to do these things. We know what the right way to do things is.
I think we lack discipline. I see that then, especially in something like conversational AI where it's still very innovative. But, at the end of the day, if we want to develop those products in a way that they are successful, they're great products to use, we're just going to have to be a little bit more disciplined and truly go through the right processes, apply the frameworks, and just do it right rather than doing something kind of haphazard, not quite right.
Carty: Inge De Bleecker, thank you so much for joining us today.
De Bleecker: Thank you so much for having me, David.
Carty: That was our conversation with Inge De Bleecker, founder of outriderUX. Really interesting stuff.
Thank you for tuning into this episode. Thanks as well to our producers Joe Stella and Samsu Sallah and graphic designer Karley Searles.
If you'd like to reach out, please contact us at firstname.lastname@example.org. That's plural, email@example.com. We'll catch you next time.