Ready, Test, Go. brought to you by Applause // Episode 32

Embracing Weird User Behavior

 

About This Episode

Join Gojko Adzic, author and consultant, as he discusses Lizard Optimization, a systematic way to discover unexpected use cases and unlock product growth by engaging long-tail users.

Special Guest

Gojko Adzic
Gojko Adzic is an internationally recognized software delivery consultant and author of several influential books, including “Lizard Optimization: Unlock Product Growth by Engaging Long-Tail Users.”

Transcript

DAVID CARTY: Limiting children's screen time is the primary battleground for many of today's parents, but a young Gojko Adzic was tinkering on a computer way before it was cool. His fascination with technology began as a six-year-old, and within a few years, he was building his own games.

GOJKO ADZIC: So my passion has always been building stuff. I started tinkering with computers when I was six years old. I think I built my first video game when I was 10 or something like that. It was rubbish. But, for a 10-year-old, not too bad. You drove a car, but it wasn't really a car. It was more like a blob on the screen that you steered with the keyboard keys, trying to avoid hitting things that were coming down at you. It was just one level. It was, yeah, insanely fun to build, and I don't think anybody would enjoy playing it apart from me, but sometimes you build things just for yourself. I was incredibly proud of doing that.

I was copying peek and poke codes for the Commodore 64 from a German-language book. I never learned German. I couldn't read German then. I can't read German now. But I was going through Commodore 64 listings from this German-language book I remember my father brought home, copying stuff into the Commodore to see what was going to happen, whether the computer was suddenly going to start going vroom or the screen was going to disappear and things like that.

CARTY: That instinct to build took him away from screens, too. He was fascinated to learn how things work. So he did what any child would do. He disassembled his family’s landline phone, just to see how it functioned. He had eyes on his family’s TV set as well.

ADZIC: So I think it was more about trying to figure out how things work. I used to disassemble things a lot as a kid. I remember my parents being really angry with me for opening stuff up. I disassembled the landline phone one time just to see what was inside, and things like that. I think they caught me once trying to disassemble the TV set. The old one, with the cathode ray tube.

So I was, I think, always fascinated with trying to figure out how things actually work. That's the allure of software development and programming. You get to figure out how to model things. Trying to figure out how things work and then trying to understand them is, I think, what's necessary for developing successful software: figuring out both what people actually want and how to build it, why it works the way it works, and why things don't work when they don't work. I think that's a big thread I've been pulling on since then.

CARTY: For many precocious young children today, there are resources to explore programming, but Gojko still thinks there’s a gap between some of the rudimentary build tools and the ones that power most of today’s digital experiences.

ADZIC: Yeah, I think there's lots and lots of stuff today that's really interesting, both toys and technologies for people to play with, but things have gotten a lot more complicated. One of the nice things about getting involved in this when I was a kid was the incredibly fast feedback. You type something in, you get something out, whether it was BASIC or peeks and pokes or other languages later. I think Turbo Pascal was probably the most insanely productive environment I've ever worked in. That was, what, more than 20 or 30 years ago, and you had this immediate feedback cycle to see what's going on.

And I try to teach my kids to do stuff on modern machines. But then you have to explain HTML and browsers and JavaScript, and all of this is incredibly complex. Yes, there are things like Scratch, of course, and game builders and things like that. But I think things have gotten a lot more complicated. Kids these days, I think, want to build a triple-A game immediately. They don't want to tinker with something. And I think it was easier for us, because retro games are comparatively rubbish if you compare them to what kids are playing these days, but they weren't too far off from what you could build on your own.

CARTY: Software development and delivery still summons a feeling in Gojko that reminds him of his youth. Pure joy. For him, a block of code can be like a magic spell.

ADZIC: In 2013, so, what, 12 years ago, a colleague and I started developing our own products again, not building for other people. And I remember distinctly how joyous that was. I rediscovered the joy of programming by building and controlling the whole thing. I think I've built too many systems that copy data from one database to another and then try to check if everything is correct. And that's soul-defeating.

For me, software development is the closest thing to magic we can get. We utter some spells and code words and who knows what, and then all of a sudden, something that wasn't there appears. And it's intoxicating. It's fascinating. It's amazing that we can just magic up things from our minds. I had lost that feeling, but since we started building products for ourselves, I've really regained it. And I occasionally do a bit of open source work as well, and I feel joyous when I'm doing things like that too. It's building things for the purpose of building them, not because I'm trying to optimize somebody's clicking on something somewhere halfway around the world.

CARTY: This is the Ready, Test, Go podcast brought to you by Applause. I’m David Carty. Today’s guest is engineering prodigy and software delivery expert Gojko Adzic.

Gojko is an internationally recognized software delivery consultant known for advancing agile and testing practices. He is the author of several influential books, including Lizard Optimization: Unlock Product Growth by Engaging Long-Tail Users, which is the subject of today's conversation. Lizards can be found on every continent except Antarctica, just like your product's customers. You strategize endlessly to optimize the way you anticipate users will interact with your product, only to find, often with frustration, that a segment of the user base has its own plans in mind. They might be trying to upload the wrong file types or use a feature for a different purpose. Is it frustrating? Yes. Is it an opportunity? Also yes. So don't be so cold-blooded. Capture the feedback of this user base to see potentially massive returns. That's what Gojko argues. Let's check in with him to hear firsthand how lizard optimization led to a big turnaround in one of his own products.

Gojko, your new book introduces the concept of lizard optimization. Can you explain what that means? And most importantly, what’s your favorite type of lizard?

ADZIC: My favorite type of lizard is somebody who does something totally insane with the system, with the product, but it turns out to actually be a genuine, proper new use case that I'd never thought about. For example, the product I'm working on now started as a platform for converting PowerPoint to video. We spent a bit of time making the text-to-speech audio function easy to use, so that people could just put some text into the presenter notes and it would create a video from the PowerPoint using text to speech. And I started noticing some people building blank videos and paying me for that, which was really curious, because that doesn't make any sense. Talking to these people, it turned out that we had made the text-to-speech interface so nice to use that it was worth it for them to suffer through creating a blank PowerPoint, adding things into the presenter notes, converting it into a video, waiting for the whole thing, downloading it into a video editing tool, extracting the audio track and then using it onwards. And I thought, if it's worth going through all these hoops just to use this 10% of the system, why not let people use that 10% of the system directly? About a month later, we had as many audio files being built directly on the platform as we had videos. And today, I think more than 90% is this new use case that was discovered by us observing blank videos. So we had 100 times growth or something like that, just from observing people misusing the system, or using the system in ways we didn't expect.

And lizard optimization is a systematic way of doing that. It's a systematic way of looking at how people are misusing or abusing a product to figure out underserved use cases, to figure out unexpected value that people are getting from a product, where they might be exploiting it or misusing it or struggling to use it in some other way, and then trying to figure out how to incorporate that into product management. Lots of incredibly successful products stumble their way onto success almost by serendipity, figuring out things that turned out to be unexpected. PayPal, for example, started as a way to send money between two PalmPilot mobile devices by bumping them together, and they built a website so that they could test things more easily without actually bumping devices together, because bumping devices together requires physical interaction, is difficult, and things like that. And they opened up the website to the general public as, I don't know, a stroke of genius or something they thought might be interesting. A bit later, they started noticing that people were using the website without ever downloading the PalmPilot app. The metrics they were trying to track were downloads and usage of the PalmPilot app, and the website wasn't increasing those. It was, I guess, just an operational cost, so product management actively fought against the website usage. They were threatening to sue people who were using it on eBay. They were fighting against that up until the point where somebody added up the numbers and there were 1.5 million active website users and 12,000 active PalmPilot users, and then they realized the product is not the PalmPilot app. It's the website money transfer.

And lots and lots of product managers fight against unexpected usage, or try to prevent people from using their products in unexpected ways, because it doesn't contribute directly to the objectives and key results they've set for themselves. And I think that's wrong. So lizard optimization is my attempt to codify what we've done for Narakeet, and to reframe in a different light many things that worked out for this other product we've been building since 2013, where we had wonderful serendipity a couple of times, but not systematically; we just stumbled upon it. With this product I'm building now, I'm really trying to approach it in a systematic way and detect unexpected usage, figure it out, support it if it needs to be supported, and it has led to incredible growth.

CARTY: Yeah, that PayPal example is really interesting given the deep integration between PayPal and eBay later on, but I do want to focus on the case–

ADZIC: eBay bought PayPal in the end.

CARTY: Yeah, I know, isn’t that funny how that works out? But let’s talk about the other case study that you introduced, which is your own product Narakeet, a text to speech video maker. In the book, you’re very candid about how close Narakeet came to flatlining, but ultimately focusing on the outliers and optimizing the product for those outliers led to a substantial turnaround. So tell us a little bit more about that experience and what you learned from it.

ADZIC: So I think what I've learned from that, on a meta level, is that as a builder, as somebody who's mostly involved in developing and creating things, I'm really rubbish at marketing. And growth can come from two different directions. Growth can come from getting more people to use something, or from getting people who are already using it to stay for longer or use it more, so that users accumulate.

And I wasn’t that good at marketing. I’m still not that good at marketing. And at some point, the product took off and then one of our competitors got funded by venture capitalists. They had so much money to spend on marketing and search engine optimization that they effectively stole most of our traffic or not stole. I mean, they acquired most of our traffic, and the product was like by all metrics just going down. And one of my ideas was that I’ll just put it on ice. It was profitable. It wasn’t generating a lot of money. It didn’t seem like it was worth investing too much time doing it anymore. But I wanted to make sure that it’s not causing too much support for me so that I can work on other things. And I try to polish all the parts where people were getting stuck and they were causing support and– all the parts that were causing support. And then by doing that, I actually stumbled upon a few of these things where people were getting stuck because they were using the product in an unexpected way. And for that example, with using it just for audio, not just for video is really good one because that’s– that happened around then.

And by building the product to actually satisfy these use cases, we were able to run circles around the competition. The competitors were marketing the other thing, which wasn't that important for us anymore; we were completely somewhere else, building something different that, in the end, many, many more people used than what I originally thought the product would be. So I think what really saved the product is discovering these completely new areas where we could compete, instead of trying to market into an area that was already saturated.

CARTY: Talk about serendipity as you mentioned before.

ADZIC: But serendipity, I think, is not pure luck. It's not just about waiting for something to happen. It's being there, being able to observe what's going on, being ready to figure out how to act, and knowing what to do when something like that happens. Lucky breaks happen to everybody. The question is, can you spot when lucky breaks happen to you, and can you take the right action when they do?

CARTY: Yeah, absolutely. You have to be opportunistic there. So the lizard in your book refers to the four key steps in the lizard technique: learn how people are misusing your product, zero in on one behavior change, remove obstacles to user success, and detect unintended impacts. Now, there's a lot to each of these individual concepts, but how did you settle on those four key areas, and what do they ultimately reveal for a product?

ADZIC: I wanted to use lizards, and I wanted to use lizards because of Scott Alexander's blog post on lizard men. I think it was called lizard men and Muslims from Mars, or something like that. I'll find the blog post so you can give the link to your readers. Scott Alexander has this wonderful blog called Slate Star Codex, and he wrote about a demographic research study where they'd taken some demographic data and then tried to do some psychological research and compare the two. And they were incredibly surprised by people just giving ridiculous demographic answers. There were a number of people selecting or entering American as their gender, or something like that. There were people who were selecting Martian as their nationality, or weird things like that.

And then he goes and compares that to a bunch of other things and comes up with this 4% Lizardman's Constant, which says that basically 4% of the people in any large population are just going to do things that are not reasonable to you. They will be reasonable to them, but they're not reasonable to you; they have some other logic. And he compares that with another survey where somebody concluded that 4% of Americans believe that lizard men are running America, or something like that. So this number, 4%, keeps coming up in different places, and he just chose to call it the Lizardman's Constant. When I read that, it was mind-blowing for me, while I was working on this other thing, because I realized I was approaching it from the perspective of: these people are crazy, or these people have no idea what they're doing, or they're incapable, or they're malicious. But they're not. Of course, there will be some percentage of malicious people trying to abuse the system, but mostly people are struggling because there's a mismatch between their intents, their capabilities and what the system provides.

And it's up to us to understand the lizard logic. It's not my logic, it's not your logic, it's their logic. By figuring out what they are actually trying to do and what they want to achieve, we can then figure out: do we have a good way of supporting that? Should we even be supporting that?

So the four steps, in the end, are: first, learn how people are misusing your system. That's having operational awareness of what people are doing to your system that's maybe not what you intended. The ways I've found useful to do that generate a lot of noise, so it's akin to the tools developed for the DevOps community in this whole cloud observability space, where you might be running an application on hundreds of machines and one of those machines might be doing something unexpected. You don't really know, so you have to find out about the unexpected stuff. As an industry, we've developed tools to track unexpected stuff and report on anomalies, report on issues, report on problems. And I started applying that to users and to product usage. Once you start doing that, there's so much noise that you really need to zoom in and figure out one thing you want to do at a time.
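
To make that concrete, here is a minimal sketch of treating unexpected user behavior the way DevOps observability treats misbehaving machines: emit a structured event every time the product hits a path you did not design for, so ordinary log tooling can surface trends. The helper name and event fields are made up for illustration; this is not Narakeet's actual code.

```python
# Illustrative sketch: log unexpected user actions as structured events,
# so log analytics can count them, spot trends, and flag anomalies.
import json
import logging
from datetime import datetime, timezone

logger = logging.getLogger("product.unexpected_usage")

def track_unexpected_usage(user_id: str, action: str, detail: dict) -> None:
    """Record one unexpected-usage event as a single JSON line."""
    event = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user_id": user_id,
        "action": action,   # e.g. "unsupported_upload"
        "detail": detail,   # e.g. {"extension": ".srt"}
    }
    # Downstream tooling (whatever log analytics you already run) can then
    # count events per action per day and alert on spikes.
    logger.info(json.dumps(event))

# Example: a rejected upload is worth recording, not just refusing.
track_unexpected_usage("user-123", "unsupported_upload", {"extension": ".srt"})
```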

You can't chase everything, and trying to satisfy everybody is not good product management. Trying to figure out what we should be doing is critical for focus. So that's the zoom part. The next part is to figure out what obstacles we are placing in front of people trying to do what they want to do. If somebody is trying to create an audio file and I'm making them upload a PowerPoint, then the whole process of creating and uploading a PowerPoint is an obstacle that shouldn't be there. Sometimes these obstacles are bad UX. Sometimes these obstacles are overcomplicated features. Sometimes these obstacles are incomplete features: maybe people can do 10% of what they need in your system, but then they need to go somewhere else for the other 90%. So there are obstacles there, and figuring out these obstacles and removing them is really, really important.

And then the last one is figuring out whether what happened was actually what you wanted to achieve, because again, by definition, lizards follow their own logic. They don't follow my logic or your logic. So our assumptions about what's going to happen when we introduce a change might or might not be correct. Something completely different might happen. People might be confused. They might be stuck. They might start abusing the system in a totally unexpected way from a different perspective.

So, for example, one of the things I worked on about a year and a half ago was trying to figure out how to speed up the checkout, where lots of people were getting confused. In Europe, European Union customers, especially business customers, want their tax IDs, their VAT IDs, on invoices, and we were using Stripe for payment processing. Stripe allows companies to put in a VAT number. But Stripe is an American company. They don't really understand European regulations, and they were trying to force people to put in a country prefix, and a lot of people just didn't know how to do that. A message would pop up: if you were French and your French VAT number was 123456789, you would just type in 123456789, but Stripe would refuse that, expecting FR in front of the number, and the error it reported was wrong number. People were then phoning me to say that they were using a card number, they were absolutely certain it was correct, and the dialogue was telling them wrong number. They thought the wrong number was a wrong card number, not a wrong VAT number. That structure was bad. So I thought, OK, let's try to figure this thing out. Maybe let's remove the VAT ID field from the form. Then Stripe is not going to prevent people from entering wrong numbers, and I can just take the VAT ID later on my end and speed up the process. And when we launched that live, yes, people were less confused when entering VAT numbers, because they could enter them later. But many, many more people were actually confused. They didn't want to complete the purchase because they didn't see where to put in the VAT number, and they started calling me to ask, where do I put in the VAT number? I had to explain, well, just go through the process and then put the VAT number in later. So in the end, the cure was worse than the sickness. We had to go back, rethink that, and change the whole invoicing process in order to fix that case. So pretty much every time we change a flow for people we don't necessarily fully understand, we should really double-check that we're doing the right thing.
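
As a rough illustration of the kind of fix this situation points toward, a product could normalize the VAT ID on its own side before handing it to the payment provider, instead of letting the provider reject the raw value. The helper name and the (truncated) prefix table below are assumptions for the sketch, not Narakeet's or Stripe's actual code.

```python
# Illustrative sketch: add the ISO country prefix to an EU VAT number when the
# customer leaves it off, so the payment provider's format check passes.

EU_VAT_PREFIXES = {"France": "FR", "Germany": "DE", "Italy": "IT"}  # truncated for brevity

def normalize_vat_id(raw_vat: str, billing_country: str) -> str:
    """Return the VAT ID with its country prefix, adding it if the user omitted it."""
    vat = raw_vat.replace(" ", "").upper()
    prefix = EU_VAT_PREFIXES.get(billing_country)
    if prefix and not vat.startswith(prefix):
        vat = prefix + vat
    return vat

# A French business customer typing just the digits still ends up in the expected format.
assert normalize_vat_id("123456789", "France") == "FR123456789"
```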

CARTY: Yeah, it's a great point. And to that point about experimentation, many teams are taught to focus on the largest set of users. So how can product leaders maybe shift their mindset to optimize for some of these outliers, and what sorts of trade-offs should they anticipate?

ADZIC: I think following the largest set of users is of course OK, if you can still optimize something. But any kind of reasonably mature product is already optimized for the larger set of users, and then the only growth you can really get is bringing more people in. By discovering unexpected usages, by discovering unexpected use cases, a product can work on retention instead, which brings me back to what I said: I was really rubbish at marketing.

There are two ways of growing a product. One is to bring more people in, or bring people in faster. The other is to keep the people who are already using your product using it more, so that the growth compounds. And usually, for people who are building products, we can influence the second part much more than we can influence the first one, because we can influence the product, and the product can influence how long people use it and how well it satisfies the people who are there.

And when somebody comes to your product and tries to do something unexpected, the most difficult part of marketing and acquisition is already done. The person is there. They're interacting with you. They might be trying to get some unexpected value. They might be trying to get different value from what you thought they were getting. They might be trying to get value in a different way. But they're there, and the product can, on its own, do something to encourage them to stay, to keep them as a customer.

So what I figured is, I'm bad at marketing, but I'm really good at this other thing. And I think people building products, people who listen to your podcast, are probably more builders than marketers. People building products can influence retention much more than they can influence acquisition. And there was a really interesting article I read in Harvard Business Review a couple of years ago, by Amy Gallo. I'll send you a link to that as well so you can include it. She combined results from different research experiments and concluded that investing in retention has totally disproportionate rewards, because increasing retention by 5% can increase profit by up to 95% in some cases. Increasing acquisition is linear: you bring 10 more people in, they'll spend the average of 10 people. But increasing retention has disproportionate returns.
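
A toy model, not the figures from the article, shows why retention gains compound while acquisition pays off linearly: with a constant churn rate, expected customer lifetime is roughly one over churn, so a few points of retention move lifetime value a long way.

```python
# Toy illustration only: constant-churn model of customer lifetime value.

def lifetime_value(monthly_revenue: float, monthly_churn: float) -> float:
    """Expected revenue per customer over their lifetime, assuming constant churn."""
    return monthly_revenue / monthly_churn

base = lifetime_value(10.0, 0.20)    # 80% of customers retained each month
better = lifetime_value(10.0, 0.15)  # retention improved by five points, to 85%

print(base, better)                  # 50.0 vs ~66.7 per customer
print((better - base) / base)        # ~0.33: a 33% jump in lifetime value, versus the
                                     # strictly linear payoff of acquiring 5% more customers
```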

And of course, it's not either-or. Good products do both. And it's a tug of war, where I think lizard optimization can work well if you've done enough on your primary group and you don't really know what else you want to do, or you're looking for some new inspiration. You can do a bit of this: find a new, unserved use case, satisfy that, and then maybe go back to optimizing for that, and things like that.

And it's not the only thing people need to do, of course, to create a successful product. What I've tried to do with the book is take something that I think happens to everybody and make it systematic, so that people are not ashamed that something happened out of pure luck. It's not just random serendipity; it's actually something systematic we can do to improve products in these cases. And then we can work on it and improve the products, accepting that this is a systematic way of doing things.

CARTY: Yeah, absolutely. To your point about that tug of war, there can be a little bit of an internal conflict here too, because we should be mindful of the fact that many organizations have done a lot of due diligence in attempting to appeal to certain customer personas. Yet you've said that some of the best product decisions you made came from engaging with what you call the weird use cases, people outside of that original target persona. And I imagine that pivot can be politically tricky internally. So how do you navigate those internal–

ADZIC: Product managers very often fight that, until they realize it's a pointless battle. So I think, again, making it systematic is something that can help remove the stigma that somebody made a mistake or that we're doing something wrong. We can think about it as: OK, we've done this part well; now there's this new part we could be doing. It doesn't necessarily mean we should, but here are some ideas we might want to chase, because it might be more worthwhile to go and explore an adjacent use case, or build something that helps people do an extra bit of work with the product and keeps them around, than to just keep doing more of the same in this other area that's already saturated.

So I think politically, yes, that can be very tricky. I don't really have good political answers for something like that. But I think understanding that this is a systematic approach to research, and that it is genuine, valid research, is also useful. The way I think about this is, again, that there is a tug of war in automated control systems between feedback and feedforward. Feedforward is what you provide to the system up front, with prior knowledge. Feedback is when you start looking at how the system behaves and then adjust and course-correct.

I think if you look at something like medicine today, medicine is mostly feedforward, with a lot of feedback as well. You see the doctor, they give you a diagnosis, they say, take this pill three times a day for the next three months and it's going to get better, because as a civilization we've spent thousands of years killing people with weird roots and doing weird experiments on them to figure out what works and what doesn't. So we have that knowledge. With software products, we don't.

And we can do a bunch of research up front. But you have to stop research at some point. You can spend 10 years doing research; your competitors are going to build products and deploy them, and you'll still be doing research. So people have to do a bit of research, then do something, and then figure out whether what they've done actually makes sense or not. And a lot of the time, what I've seen working as a consultant with companies is that the feedback you look for is: are people doing what we expect them to do? Is it going down the path we want it to go down? Then we're not really benefiting from unexpected things. It might be wonderful that we've discovered something totally different through this feedback, but we're ignoring it. And I think feedback is an incredibly powerful mechanism for product development, if you can spin things quickly, if you can act on feedback quickly, if you can understand that feedback is an integral part of what we do.

And I think this whole online experimentation movement is shining a light on that and showing people how their decisions are not necessarily correct or made based on complete data. Plus, we are not in control of what other people do. People have free will. There are so many other influences around us and around our products that even if we had the best research in the world six months ago, telling us what the right product was for six months ago, that doesn't necessarily mean it's the right product for now.

CARTY: Especially with how quickly the world is changing, and talking about how quickly you can get that feedback in, right? So how does lizard optimization play out differently in enterprise environments, where budgets might be larger but the pressure to prove ROI quickly is also–

ADZIC: So I don't know. Lizard optimization is my attempt to take a couple of techniques I think are useful, which have crystallized in my work, put them out there, and then see how it's going to work out.

I think there are a couple of references I've found from larger enterprises that seem to be going in a similar direction, though not as systematically or as explicitly as that. Rachel Newman had this wonderful video; again, I'll send you the link. She was at Eventbrite when they made it, talking about how their support people were getting a ton of incredible data that product people should have been using for product ideas and product development: where people are struggling, what issues they have. There was a disconnect between product and support and customer service, and they basically bridged that by having product people occasionally sit with the support people, just to understand what patterns are going on and close that loop between customer service and product management and product ideas. Rather than discarding problems with, oh, this person is not clever enough to use the system, they actually used the fact that lots of people are struggling with something as feedback. So at Eventbrite, they benefited quite a lot from integrating that data.

I've never worked with Amazon, but I've read about this in The Amazon Way, where the author basically claims that, above a certain level, every high-level director has to spend a couple of days per year in a customer service role, just to understand what's going on, to experience these things. So I think closing that loop between the touch point with the actual customers and product management and stakeholders is really important, because in larger organizations that gets delegated and delegated and proxied and proxied and proxied. You have area product owners, product owners, then you have UX research people over there, you have business analysts over there, and everybody has bits and pieces of the puzzle, but it's very difficult to propagate that information across. And I think closing that loop is really important.

CARTY: Yeah, that's getting right into my next question, because ultimately these techniques go back to tracking real user behavior, and that's incredibly valuable data. But where do orgs struggle to gather some of that data in the first place and take action on it? And what would you recommend organizations do to approach that problem?

ADZIC: So I think one thing that has worked incredibly well for me is that any time a user sees an error on a screen, we log that information back, and then we do all sorts of weird log analytics on it to extract trends and anomalies and things like that.

So one example that's maybe interesting: I talked about this way of just creating speech from text rather than making a PowerPoint. We built this screen where there's an upload button; you can upload a Word document, a PDF, a bunch of text formats, and it creates an audio file. If you try to upload something that's not supported, like a JPEG, it's going to tell you, oh, sorry, you uploaded a JPEG, we don't support that. Most applications would stop there. Our application actually logs the fact that somebody tried to upload a JPEG and sends it back to us, so we can do analytics on that, and we get weird stuff. Like, I still get maybe 20 or 30 people per day uploading package files into a text-to-speech screen, which I can't explain at all. I don't know.
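
A small sketch of what that analysis side could look like, assuming rejection events are logged as JSON lines in the shape of the earlier example; the file layout and threshold here are invented for illustration.

```python
# Illustrative sketch: count rejected upload extensions per day and surface the ones
# that suddenly become common, so a human can ask "why are people uploading .srt files?"
import json
from collections import Counter
from pathlib import Path

def rejected_extension_counts(log_file: Path) -> Counter:
    """Tally file extensions from unsupported-upload events in a JSON-lines log."""
    counts: Counter = Counter()
    for line in log_file.read_text().splitlines():
        event = json.loads(line)
        if event.get("action") == "unsupported_upload":
            counts[event["detail"]["extension"]] += 1
    return counts

def surprising_extensions(today: Counter, baseline: Counter, factor: int = 5) -> list[str]:
    """Extensions at least `factor` times more common today than in the baseline period."""
    return [ext for ext, n in today.items() if n >= factor * max(baseline.get(ext, 0), 1)]

# Usage sketch:
# today = rejected_extension_counts(Path("logs/2025-01-10.jsonl"))
# last_week = rejected_extension_counts(Path("logs/2025-01-03.jsonl"))
# print(surprising_extensions(today, last_week))   # e.g. ['.srt']
```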

But then occasionally you spot a trend, where people start uploading something they think should be able to go through, but it's not supported. A bunch of people started uploading subtitle files. Subtitle files are files that accompany a video, where there's a timestamp, a sentence, a timestamp, a sentence, things that video players can use to show subtitles synchronized with the audio. And I thought, well, there are only a few of these people, but we were trying to optimize the screen and get more out of it, and that's an interesting use case. And it's a text file. So we just let it go through. And then the next day we had a person complaining that he was able to convert the subtitle file to audio, but it was reading out all the timestamps. I said, well, yes, your file has timestamps; it was reading out the timestamps. That's what you uploaded. And he said, no, no, I wanted it to be synchronized with the timestamps. I didn't want it to read out the timestamps. And this was one of those things, like disassembling the phone as a kid. I realized, OK, this is an interesting challenge. There are probably not a lot of people who will use this, maybe one or two people a day were trying to do it, but it was an interesting challenge, and I thought, how do I make this work? Because there are some interesting things around speeding up and slowing down audio so you don't distort the pitch too much.

There are all sorts of weird things there, and it took me a couple of hours to do, I think about three hours in total. I launched it, and it turned out to be the most profitable thing I've ever done in my life, because, again, not a lot of people need this, but the people who need it really, really, really need it. A bit later, there was somebody from a huge enterprise software company. They have 200,000 training videos for their stuff. They have subtitles for these videos. They wanted them translated into 50 different languages. And they realized that if they had a translated subtitle file, they could just load it through my system and it would give them a synchronized audio file in a matter of minutes, where doing that previously was taking them hours. They would have to either record or generate small bits and pieces, then try to put them in the right place in the video, then try to speak faster or slower or whatever. And this was magically doing it for them.

And again, this is where software is like magic: when it works, it works brilliantly. So this thing, again, is something that a very small percentage of users need. But the people who do need it really, really, really need it, and I think, yeah, it's probably the most profitable part of the system by far. And it took three hours to do.
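
For a sense of what those three hours involved, here is a minimal sketch of the two core pieces: parsing SRT cues, and working out how much each synthesized clip has to be sped up or slowed down to fit its time slot. The structure and the cap on the speed factor are assumptions for illustration, not the product's actual implementation; real pitch-preserving time-stretching would be handed to an audio library.

```python
# Illustrative sketch: parse SRT subtitle cues and compute per-cue speed adjustments
# so generated speech fits each cue's time window without drifting out of sync.
import re
from dataclasses import dataclass

TIMESTAMP = re.compile(r"(\d{2}):(\d{2}):(\d{2}),(\d{3})")

def to_seconds(ts: str) -> float:
    hours, minutes, seconds, millis = map(int, TIMESTAMP.match(ts).groups())
    return hours * 3600 + minutes * 60 + seconds + millis / 1000

@dataclass
class Cue:
    start: float
    end: float
    text: str

def parse_srt(content: str) -> list[Cue]:
    """Parse 'index / start --> end / text' blocks separated by blank lines."""
    cues = []
    for block in content.strip().split("\n\n"):
        lines = block.splitlines()
        if len(lines) < 3 or "-->" not in lines[1]:
            continue
        start, end = (to_seconds(part.strip()) for part in lines[1].split("-->"))
        cues.append(Cue(start, end, " ".join(lines[2:])))
    return cues

def speed_factor(cue: Cue, synthesized_duration: float, max_factor: float = 1.3) -> float:
    """How much to speed up (>1) or slow down (<1) a clip to fit its cue window,
    capped so the voice does not become unnatural."""
    slot = cue.end - cue.start
    return min(max(synthesized_duration / slot, 1 / max_factor), max_factor)
```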

CARTY: Say, who knew taking apart a landline phone all those years ago would lead to you solving the problems of an SRT file today? So there you go–

ADZIC: Trying to understand how things work is always interesting.

CARTY: Gojko, lightning round here. First question for you. What is the most important characteristic of a high-quality application?

ADZIC: It does what users are expecting it to do.

CARTY: What should software development organizations be doing more of?

ADZIC: Speaking to actual customers and users.

CARTY: I love that. What should software development organizations be doing less of?

ADZIC: Building things because one of the stakeholders thought it was a good idea.

CARTY: I like that too. And finally, Gojko, what is something that you are hopeful for?

ADZIC: I'm hopeful for this whole AI thing turning out to be a massive security issue, and for people really understanding that they need to take care of their data. Everybody's plugging AIs into everything these days. And I think we are on our way to some massive data catastrophe, where something will be exposed in a way people didn't expect, and people are going to start taking data much more seriously.

CARTY: Well, Gojko, thank you so much. It’s been great talking with you. Really enjoyed it and appreciate you joining the podcast.

ADZIC: Thank you.