Ready, Test, Go. brought to you by Applause // Episode 16

A Look Back at Digital Quality Insights


About This Episode

Our guests on the Ready, Test, Go. podcast offer digital quality insights in this compilation episode, which spans everything from efficient testing to ethical engineering.

Special Guest

Compilation
Over the first full year of the Ready, Test, Go. podcast, we’ve been privileged to welcome expert guests across a variety of disciplines. As organizations grapple with the multi-faceted challenge of crafting a comprehensive approach to digital quality, these experts provided insights that can give dev, test, and product professionals an edge. This episode reintroduces some of our past guests to provide a high-level overview of digital quality strategies.

Transcript

DAVID CARTY: In the first episode of the year, we dove into the world of puzzles, both literal and figurative, with Kristin Jackvony, a master puzzle solver, speaker, author, and test engineering expert. Kristin’s journey from escape games to QA management offers fascinating insights into the mindset of a top-notch software tester. Listen to Kristin’s take on the essential skills for today’s software testers and some common mistakes to avoid in the industry.

KRISTIN JACKVONY: Well, the most important thing– and this is often sadly overlooked– is they need to be able to find bugs. That’s why we are software testers. We are trying to find the bugs before the software is released and the customers find the bugs. So that’s absolutely the number one thing. Another thing that is very important, especially these days, is to understand APIs and how APIs work and how to test them. So that’s very important. And then I would say the next level is test automation. You need to be able to automate. At the rate that software is being developed today, we cannot get by on just manual testing. We need test automation. And then finally, it’s very important for software testers to understand some of the adjacent software testing areas, such as security testing, accessibility testing, performance testing. These are all very important, and they are becoming more and more important each year. A lot of times in my career, I’ve discovered that even though we’re supposed to have documentation about how features work, a lot of times there’s no documentation. So it takes a lot of experimenting, a lot of playing around. And I really enjoy that. I really enjoy the exploration part of trying to figure out how a feature works, and then thinking, what are some ways that I could test this? Or what are some edge cases that we might need to explore? And then thinking, are there any ways that I could break this? I think that’s really, really fun.

DAVID CARTY: From helping launch Foster’s Beer in the US to shaping the future of payment solutions, Gary Larkin’s experiences offer a unique perspective on adapting to change and innovating for the common user. Let’s hear how Gary blends his rich background with fresh insights to stay ahead in the customer-centric world of product development.

GARY LARKIN: I think the first thing is be prepared to fail fast. First and foremost, you really have to find out if the dog is going to eat the dog food. It doesn’t matter how good for the dog it is. If the dog is not going to eat the dog food, it’s not going to sell. I have a great idea, but does everybody else think it’s a great idea? And even if they think it’s a great idea, will they adopt? You can be easily drawn into sitting in a vacuum and talking to yourself when you’re doing product development and trying to revolutionize or bring new products to market. It’s very important, I think, to get a proof of concept out into people’s hands quickly at the risk of being told it doesn’t work. And do that early enough to pivot, come back, learn from that, and keep iterating on that product, thought, and design. Don’t get too hung up in the technicalities. Don’t let perfection be the enemy of good, I think, is the important part of how to bring something into market. And to be measured in where you release it– clearly getting to the typical target and consumer quickly for input is important, but most products have interlocking circles of target customers. That center of the bull’s eye doesn’t always jump out at you. And sometimes your product’s going to fit a little better for the customer profile you didn’t anticipate, so you’ve got to be open to this. I think it is important to check yourself and make sure you’re not putting your thumb on the scale. Really, really be careful, and try to be as clinical about how you allow the customers to come to your product as possible. If you’re there helping them, that’s not a true experience.

DAVID CARTY: Matt Heusser has a unique perspective on life and a unique approach to testing. As a writer, speaker, and managing director of Excelon Development, Matt prioritizes finding real value in software testing processes. How does one navigate the complexities of testing efficiently and effectively? Let’s explore that challenge with Matt, who emphasizes practical testing strategies.

MATT HEUSSER: Most organizations view testing as a thing that has to happen so the software can go out. Thus, it should be done really, really quickly. In terms of assessing the value, that’s not really a thing they do. It’s just a blocker for release. There’s an infinite number of combinations of possible things in the whole universe that we could test. What we want to do is we want to narrow the complexities down, simplify it to something that is valuable but understandable. Maybe we have a tool to do that. And when you say oversimplify, that’s one set of test ideas that we can run in an hour that is our smoke test, and then we say the software is good. And that’s our coverage model. Or we measure the goodness of the software by counting the number of bugs. Or we say testers find bugs, good– developers have to fix bugs, bad. And we actually create an active conflict between the two where the developers have an incentive to have no bugs, and the testers have an incentive to find a whole bunch. All of those are very, very simplified, easy to understand models for software testing. And they’re incredibly broken. They actually drive the wrong behaviors.

DAVID CARTY: Now let’s venture into a galaxy far, far away to extract valuable lessons from the Star Wars franchise with Adam Shostack, a seasoned security engineering expert and author. As a longtime Star Wars fan, Adam intertwines his passion for the series with his expertise in digital security and quality. From discussing the infamous thermal exhaust port of the Death Star to exploring the complexities of modern software security, this award-winning episode delivers unique insights from the world’s favorite sci-fi saga.

ADAM SHOSTACK: When we think about security, like other things that we can think of as quality, you can’t bolt it on at the end. You can’t sprinkle it on. You’ve got to design for testing, and you’ve got to design for security. You’ve got to think about what can go wrong. As you’re making choices about how to build things, you’ve got to think about that through the whole life cycle of the software. But this question of how do we get people to work effectively is complicated because it’s complicated. We all have a different perspective on things. And balancing those and orienting people without being overbearing is a tricky subject. The first thing I think about when I hear this is diversity and inclusivity. If we have a whole set of people that we’re paying to do work, I would like them to bring their whole brains to the problems we put in front of them. And if we’re scapegoating, we’re blaming, we’re doing all of these negative behaviors, people withdraw. They deliver the minimum they feel they can deliver. And we’re just not getting their best work.

DAVID CARTY: But that’s enough about problems in another galaxy, as we have plenty of our own here on Earth. Meredith Broussard, a distinguished data journalism advocate and NYU professor, sheds light on the often hidden biases in technology and the critical role of algorithmic accountability. Her role challenges us to reexamine the intersection of technology, race, gender, and disability. Listen in for a thought-provoking discussion that calls for a profound shift in how we build and interact with the digital world.

MEREDITH BROUSSARD: In a world where algorithms are increasingly being used to make decisions on our behalf, that accountability function has to transfer onto algorithms and their makers. My journalistic outlook is that algorithms need to be held accountable, just like power needs to be held accountable. I prefer to assume that developers are going about their day and trying to write good code and do their jobs honorably. I chalk a lot of these problems up to unconscious bias. We all have unconscious bias. We’re all working on it. We’re trying to become better people every day. But we can’t see our unconscious biases, and it’s an inescapable fact that people embed their own biases in the code that they create. When you have code that’s created by a small and homogeneous group of people, like we have in Silicon Valley, then the collective unconscious biases of those folks get embedded in the code. There is definitely some willful ignorance happening, but there is also some unconscious bias. It doesn’t make any sense to say, all right, I’m going to regulate all AI everywhere throughout time because we don’t really need to regulate all AI. We need to regulate some AI. And so context here is key. And we can start by attaching a use of AI to a particular context, and then making a decision about how it gets used or what gets used in that context.

DAVID CARTY: Now let’s hear from Lauren Maffeo, a prominent figure in data governance and civic tech service design. Lauren brings a wealth of knowledge in data policy. In this episode, she discusses the critical aspects of data management in business, exploring themes of compliance, data quality, and strategic data use.

LAUREN MAFFEO: If you think about people who work in tech, and if you work in tech yourself, which I know you do, you’re familiar with this concept of technical debt. It’s the idea that you get into an environment and you are responsible not just for creating something from scratch. You’re also responsible for fixing whatever that big problem is. And as somebody who works in civic tech, which is not known for being the most forward thinking sector, in regards to technology, you go in. And you go in knowing that the problem is going to be bad. And then the degree to which it is bad still continues to shock me, in terms of the outdated systems used, the lack of practices, the lack of automation of very repetitive tasks, which takes a lot of time from humans that they could be using to put towards other strategic initiatives. One of the biggest things that is really important for any data governance council is to get the right executive sponsor. We’re past mentorship at this point. You really need a sponsor for the council who is dedicated to making sure that it succeeds, making sure that the council is working on data projects that benefit the business, and can give the resources in both time and talent and money to help that council succeed. As I described what a data governance council could do, I’m very aware that it can sound like I’m talking about death by committee. And I think left unchecked, this could go wrong that way. I think if it’s not managed well, like any council, there is the risk that it just becomes a meeting point without much action or initiative taken to follow up on whatever you discuss at your council meetings. And so it really is on organizations and on the chair of the council to ensure that these meetings not only occur regularly, but that they have key takeaways, they have notes, they have specific action items that people are expected to fulfill in between those meetings.

DAVID CARTY: What does software testing mean in the 21st century? James Mortlock, Lead UAT Test Manager at Vodafone and a leathercraft enthusiast, challenges conventional testing methods, advocating for efficiency and a customer-centric mindset in quality assurance. In this episode, James encourages us to rethink the purpose and value of testing practices, streamlining processes where possible to ultimately create a better user experience.

JAMES MORTLOCK: Let’s start at the big one for me, with testing and the way that development solutions and architecture, everything, has evolved. Testing isn’t what it used to be. It used to be that developers would produce code that, in comparison to modern-day standards now, was awful. And you would need– that testing was absolutely required to make sure that it functionally worked. And that’s now not the thing. We’re now in the state of play where people are like, unit tests? Like, do we even need unit tests? Because the software quality is so high that we’re now moving out and up, closer towards the UI end of things. So it’s completely transformed, what testing actually means. And the test engineer or just tester doesn’t really exist anymore. It’s pulling people’s safety blankets away, in terms of getting rid of unit testing. How many times have you had a bug that would have gone into production that was caught by unit tests? It may take, what? 45 seconds to run and a couple of minutes to write, but extrapolate that across the lifetime of a project and whatnot, it’s pointless. You’re the guardian of the user experience. A lot of it will be baking in quality right at the start, so making sure that you have quality assurance. Not only you have an actual person that is regarded as quality assurance, but have it there as an ethos with everybody.

DAVID CARTY: Have you ever experienced a website or app and wondered, what were they thinking? Well, you’re not alone. Irene Pereyra, a distinguished documentarian and user experience director and designer, is a fan of bold designs in a world of vanilla user experiences. Irene brings her globally recognized award-winning expertise to this podcast episode, where we explore some of the fundamental yet timeless principles of UX design.

IRENE PEREYRA: There’s a reason why a lot of UX designers, especially if they work in-house at places like BDA or Google, call themselves product designers because we now see digital services and websites as products. So a lot of designers who work on the web or work for interfaces call themselves product designers. I think the big problem with a lot of companies is that we tend to think that the products we make, whether it’s your own product or you’re working for a client, are for us. So we design things that we like with the things that we think matter, even to the point where it’s like the colors that we like or that our wives or husbands like. It’s very important that rather than designing things that we think are great that we actually try to understand what it is that the people want who will be using this product. And there are many different ways in order to make sure that you do that correctly. And I can talk about that for hours, but it’s basically as simple as just empathizing and understanding who these people are. And the easiest way to do that is to find these people and be around them and observe them and interview them and ask them questions. And really try to understand how this product will affect their life or influence their lives and what might help them, in a way, to either do it more efficiently or just easily or also maybe even something that’s totally invisible to them, that feels so effortless that they don’t think twice about it.

DAVID CARTY: Paying for things should be simple, but global payment flows often weave a much more tangled web, introducing any number of potential points of friction or defects in a digital product. Just ask my colleague, Zeb Winzenried, Director of Testing Services in the Payments and Academy Program here at Applause. Zeb specializes in navigating the intricate landscape of payment validation, and his global team plays a vital role in helping renowned companies validate these flows. Join us as Zeb shares his insights on fostering trust and high quality in payment systems.

ZEB WINZENRIED: For many organizations, it’s very slim, especially if you’re a newer company, a startup, or a small online merchant, for example. You might have a very loose relationship with a payment processor or provider that handles things. But you don’t get metrics back on cart abandonment or payment failures. You just have no way of knowing what’s happening. So you might see your numbers start to dwindle for your sales and wonder what’s going on. Is it my product? Is it something else? You don’t have those analytics. Once you start branching out of the United States, you’ll see that it’s a vast world out there with lots of different variations and lots of rules that you need to follow and lots of expectations that consumers have. So on a global scale, most countries have a lot of card security, for one. Making sure that payments are secure, that nobody can actually take your card number and complete a transaction without some sort of extra authentication, whether it be an authentication app or scanning your face for some biometric readouts or validating inside of your banking app that, yes, that transaction is you. These are all really common things outside of the United States, for especially any online purchases. So when you start branching out, an American might not think of those things when they’re setting up their payment solution. It’s very important to the world, really, to make sure that their payment data is secure.

DAVID CARTY: What’s the difference between human rights and data rights? And how intertwined are the two? In a world inundated with data, professor and author Wendy Wong examines the profound impact of datafication in our society, which she argues is as significant a development in the course of human history as the invention of the printing press. This episode explores the importance of data literacy and the evolving relationship between businesses, consumers, and the vast troves of data they generate.

WENDY WONG: One of the things I really wanted to do with this book was to open up that aperture a little bit, to make sure that people think about human rights, not just in terms of these very specific types of rights– so either one that we think is being threatened, like privacy or one that people think should arise because of AI and datafication– this idea of data rights. And so one of the things I’m hoping we’ll talk about a little bit more is why it’s important to think about all the human rights that we have and also the values that underlie the whole framework we have around human rights today. Data literacy is a way to think about the specific need to have data understanding and data skills. So that doesn’t mean everyone’s going to be a data scientist. That would make you a data expert. We just need competence. We need literacy, which means understanding the basics of what data are– so thinking beyond digital data, actually. Understanding how data are made, what kinds of choices go into the creation of data, and how those choices actually really can affect the outcomes that you get. And so understanding the relationship between the data that the algorithms are analyzing and understanding that, actually, the data shape as much of the output as the algorithmic assumptions that are built into those models.

DAVID CARTY: Anyone familiar with DevOps thought leadership has heard of renowned author, speaker, and researcher Gene Kim. This multiple-award-winning CTO and best-selling author joined us to provide findings from his extensive research on high-performing technology organizations, drawing from a range of case studies spanning public and private sectors, including NASA missions, software design at Amazon, and the iPhone launch at Apple. Gene explains the principles underpinning organizational success in this can’t-miss episode.

GENE KIM: It was the most intellectually challenging thing I’ve ever worked on in my career, but also one of the most rewarding. And I think the reason is that the goal of writing the book was really to answer the question, why do organizations work in the way they do, both in the ideal and not ideal? And answer the question of what is in common between Agile, DevOps, which has been my area of passion, studying high performing technology organizations for 24 years. I think when it comes down to it, most software teams, the job is to deliver features that users need to either grow market share, to increase profitability, to save costs, et cetera. And so I think that is a fact of life. And so often we find ourselves in a mode where we’ve just got to focus on delivering that feature, that functionality to our customers, whether it’s internal or external. But at a certain point in time, we have to invest in our own productivity and our own efficiencies. And that is often lost. I feel like the mission ahead really is how do we create the right conditions so that business leadership and technology leadership are working hand-in-hand with a sense of mutual trust that they’re working towards common objectives? Because I’ve seen so many situations where if that top leader changes in the technology organization, the transformation dies or maybe their business counterpart changes over. And suddenly, the level of support and the amount of air cover disappears. And so I’m hoping that by creating a common language of what does it take to create outstanding performance that this will help accelerate technology organizations to reach their highest potential. So that’s my personal selfish objective. And so I think that creates a roadmap of years of work to see if we can expand the reach so that we’re DevOpsing in more places and with higher level support than ever.