Why So Ceremonious?
About This Episode
Agile is a great way of rethinking product development. But, if all you’re doing is slapping some meetings on the calendar and calling it Agile, you won’t see real change.
As the CEO of Coveros, a consultancy that also owns TechWell, the host of popular software engineering and testing conferences, Jeff Payne talks with experts in the field all the time. He knows their pain points and challenges — and while some organizations are making progress, others are fundamentally struggling to effect change.
On this episode of the Ready, Test, Go podcast, Payne discusses what real Agile testing looks like. It’s not a ceremonious approach to a Waterfall way of working — it’s about substantive organizational change. On top of rethinking how products are developed and tested, organizations must proactively remove obstacles and get disparate teams working together.
From technical lead to CEO of Coveros, Jeff Payne has made a 35-year career of helping solve digital quality concerns. Jeff has published more than thirty papers on software development and testing, and even testified before the U.S. Congress on digital issues.
(This transcript has been edited for brevity.)
David Carty: Jeff Payne collects vintage football cards and memorabilia. As a fellow sports card collector, I was very pleased to hear this. I collect ultra-modern baseball cards of today's current players, but Jeff's passion is a bit more expansive. He collects memorabilia from football players across a 100-plus-year span. Everything football from 1869 to 1988 -- cards, programs, ticket stubs, photos, matchbook covers, postcards -- Jeff collects it.
Jeff Payne: I collect anything vintage American football, from the inception of the game in 1869 through about 1988, both baseball and football cards. And then when my boys got into sports, they started picking up and wanting to collect things, and I just kind of went along for the ride for a while, and then just kind of caught the bug and decided to keep rolling going forward.
I have some pretty nice Jim Thorpe items. Probably my most prized possession is, I have a 1911 Jim Thorpe large cabinet photo of Thorpe from his days at Carlisle when he played in college that's autographed on the front. I've only seen one other one. It was on Antiques Roadshow, actually. So I've seen another. But it's my prized possession because he is my favorite all-time football/sports person.
Carty: Jeff got back into collecting when his children took an interest in it, and he quickly found that the hobby provides a few life lessons for young collectors.
Payne: How to negotiate. Also how to ask for things, right, the right way. So you know, when they first started, they didn't know a lot. So I used to walk around with them and I would point out who the star players were or the people that were pretty well-known. And then I would help them figure out what they should ask for in terms of a price, and do a little negotiation.
Then when they got good, they didn't really need me anymore. Ironically, that's when I actually started picking things up myself, because I was getting bored. I would take them to shows and stuff, but I was pretty much just the chauffeur. They had price guides, and they had memorized everything. They knew everything about it, a lot of times more than I did. So I was just dropping them off, more or less, but they were too small to leave there by themselves.
Carty: So why vintage when you can watch any team play any week? Why collect items from players who played decades or even a century ago?
Payne: I'm a big history buff around sports. When I was a kid, I read all the time. I read every sports book that was in the elementary school library, middle school library, high school library. And I just always liked the history of sports.
So, my collecting, even as a kid, was really just opening packs -- I grew up in the '70s -- and trading with friends and whatnot. We did find a couple of antique malls and a couple of card shops that occasionally we'd arm-twist our parents into taking us to. And they had older things.
I would see cards that I had read about these players. I'd read about Red Grange or Jim Thorpe, or on the baseball side, Mickey Mantle or Babe Ruth or Ty Cobb, or whoever it was. And here I'm seeing things from their playing days. And I just thought that was really cool, being so interested in the history of sports, that I just gravitated toward the old timers. Plus I always joke I don't have to worry about Babe Ruth getting hurt and his card value going down, right? At this point, he's pretty pristine.
Carty: To most collectors, there's something cathartic about it. It's more than just a way to spend time and money on a frivolous pursuit. It's a path to fulfillment.
Payne: I think it's similar to what I hear a lot of collectors say, which is it's a way to reconnect with your childhood, right? I mean, most look back pretty fondly on their childhood. Not everything was perfect. But collecting to me is just, it reminds me of a simpler time. The only care in the world I had was, what was going to be for dinner; on the weekends, running down the street playing with friends; opening packs, trading them; chewing gum; hitting a ball; catching a ball; whatever you were doing. I feel like it connects you with your youth.
Carty: This is the Ready, Test, Go podcast, brought to you by Applause. I'm David Carty.
Today's guest is vintage football collector and CEO, Jeff Payne. Jeff's company Coveros is a consultancy that helps clients modernize their software processes. From technical lead to CEO, Jeff Payne has made a 35-year career of building secure software and solving digital quality concerns. Jeff has published more than 30 papers on software development and testing, and has even testified before Congress on digital issues. Coveros owns TechWell, which hosts the popular STAR and Agile + DevOps conferences, including STARWEST, which was held in Anaheim earlier this month. Let's talk with Jeff.
Let's start out with your definition of what Agile testing is. What is Agile testing really, and what sorts of processes does it involve?
Payne: Yeah, great question. So I always hate when people define terms with the terms themselves. So I'm tempted to say Agile testing is testing in an Agile development process, but it actually is more than that. Certainly, any kind of testing you're doing within an Agile process you could describe as Agile testing, or at least testing for Agile. But I think if you go deeper into it, there are particular techniques and approaches to testing that are Agile that you could apply to any software development process.
A great example is exploratory testing. It's a very iterative, learn-as-you-go, plan-as-you-go testing technique that is very popular in Agile. But there's no reason you can't apply --- and it is applied to Waterfall and other types of development processes.
So I always say that Agile testing is testing that is performed in an incremental or exploratory manner that allows you to plan your testing as you go. That's my definition.
Carty: In your discussions with clients and people attending STARWEST, what sorts of challenges did you hear around Agile testing adoption? Were those challenges organizational, or were they financial, a little bit of both, something in between? What can you share from some of the conversations you had?
Payne: Yeah, just got back from STARWEST. It was this past week. Great show. Had a lot of good, engaging conversations with people, and I did give a talk on Agile testing and some of the challenges. What I heard from people is, unfortunately -- and I'm talking really here about the testing we're doing in an Agile development process -- the first challenge is we're still doing little mini Waterfalls in our sprints. Call it Scrummerfall, ScrumBut --- there are lots of names given to it. But it's where we've just tried to shrink down the Waterfall and stick it in a very short increment of time. And that doesn't work very well. We'll talk about some of the ramifications of that later. But that is still very common out there in the industry, unfortunately.
The other thing I heard a lot was challenges around test automation. I mean, we could do a whole podcast on automation of testing and its importance in an Agile process, but people are still struggling with what kinds of things should we automate, what kinds of tests should we automate, how much do we automate, and all those kinds of things. I heard a lot at STARWEST.
I think the last thing was just trying to figure out, how do I get my developers and testers to work together every day? If I'm not going to do a little mini Waterfall, what's the model look like? How do we interact every day, and how does that all work?
So, those are some of the things I heard that seem pretty prevalent at STARWEST.
Carty: So there's a lot that we could pick apart there. Let's focus on the mini Waterfall version of releasing here. So Agile testing, it involves a lot of ceremonies, many of which I think our audience would be familiar with --- daily stand up, sprint reviews, retrospectives, et cetera. These can be helpful, but these don't make you Agile, right? So what are some of the issues that can pop up if you're focusing too heavily on the ceremonies?
Payne: Yeah. So a couple of things. First of all, the entire team is supposed to be doing those ceremonies together, and a lot of times I see that people aren't. So, for instance, take the kickoff ceremony in Scrum, where you kick off your sprint or iteration. If not everybody is involved in those kickoff activities --- reviewing the stories, estimating those stories (or at least finalizing an estimate), coming up with acceptance criteria, maybe even creating some initial tests --- then you're going to have problems, because you're not doing everything collectively. One of the goals of the kickoff is to make sure everybody is on the same page about what we're doing in the sprint. And, if you don't all work together, there are going to be communication gaps, which is exactly what we're trying to avoid in Agile.
So, that's one area where I think people need to do better.
The other thing is that --- at least in Scrum, and I use Scrum as a reference because it's so popular --- if you read the Scrum literature, it calls all of the Scrum ceremonies ‘inspect and adapt’ activities. Now, ‘inspect’ doesn't mean a status meeting, but so often some of these activities turn into a status update. A stand-up becomes a daily status update. A sprint demo is just a demo update of what you're doing.
If you look up the definition of ‘inspect,’ it's not status, right? It means look at something closely. It means examine something against a standard or a criteria. It means some kind of thought process associated with it. Not just, “Here's my status.” And we don't often do that [inspection].
Then, the second thing is, we don't adapt. So a lot of these stand ups, we don't spend any time after we've inspected talking about, “Well, what should we do about this, or what should we do differently, or how do we handle these challenges that we just heard about?” And so we don't inspect and adapt. Also, whether you're following Scrum or not, I think that's a good kind of best practice for any of your ceremonies.
And, then, some people just aren't doing --- we'll run into companies all the time at Coveros that don't do retrospectives, and then they complain that they're not getting better. That's what retrospectives are for. They're to inspect and adapt what you're doing and make it better. If you're not doing that, you're not going to get better. So there needs to be more rigor around just doing the ceremonies.
Carty: These are well-intentioned ceremonies, right? But they can actually get in the way, it sounds like, of actual process improvement, or, at the very least, they can paint a little bit of an inaccurate picture as to what your organization is trying to accomplish with some of these processes here.
Payne: No doubt. Yeah, no, they can. And it results in all sorts of challenges, right? I mean, this mini Waterfall concept that I mentioned and not figuring out how to work together in your ceremonies and in your sprints leads to all sorts of problems. Usually, I see things like sprints getting elongated. So, organizations are trying to fix the problem that they don't have time to finish their testing by making the sprints longer. Well, you're kind of solving the wrong problem when you're doing that. It's usually not the duration of the sprint. It's how you're working in that sprint, right?
Or, you'll hear organizations who every so often will have a hardening sprint, they'll call it. They only call it that because they know you're not supposed to have testing sprints. So, they call it “hardening” instead. But it's just basically catch-up on all the testing we didn't get done, or fix all the bugs we still have.
And, again, solving the wrong problem, right? We need to figure out how to work better in our sprints to fix those kinds of things. Those are some red flags that we see a lot with people that we talk to and that are struggling with Agile and Agile testing.
Carty: You mentioned elongating sprints, hardening sprints. What are some other red flags that you've identified that point toward organizational challenges?
Payne: So, one red flag to me is, there are organizations out there trying to create a PMO [project management office] to make sure all the teams follow their, quote, "Agile process." Well, if you read the manifesto, the beauty of Agile is that you inspect and adapt, you retro as a team, and you decide what your process is, right? There are some guidelines, and there naturally need to be guardrails. But you can't prescriptively tell teams, "This is Agile, and this is what you need to do." We see that a lot, though. People want to codify a process, say it's the process, and then have people audit that process. That's just not the way Agile works. That shows an organization-wide misunderstanding of Agile.
The other thing is, senior execs are very quick to punt on a transformation or an improvement effort before it really has a chance to succeed. They have missed expectations. Maybe someone gave them those expectations: that transformation is going to be fast, that Agile is free. I've heard people say that. "Well, Agile is free, right? You just follow these ceremonies and magically, everything works better." Nah, there's nothing free in software, you know. Software is a hard, hard, hard thing to build and get right, and nothing's easy and nothing's free. But, yet, some organizations believe that, have heard that. Because of that, they punt too quickly and never see benefits from their improvement efforts.
Carty: Right. Agile is not a tool you buy, but it's not free, right? There's a distinction there.
Payne: Absolutely. Yeah, I think the Scrum Alliance says something like, “Scrum is amazingly simple to understand and amazingly difficult to implement correctly.” And that sums up Agile to me, right? You've got to really go into it with your eyes wide open.
Carty: Yeah. And it takes patience and it takes commitment. Absolutely.
So, some of the Agile engineering practices that make for real change --- behavior-driven development, continuous integration, things like that --- these require really changing your way of working. And that's no small feat. So how can orgs accomplish that today, particularly if they are resistant to change?
Payne: Yeah, great point. I mean, the ceremonies we're talking about are kind of what I call the process side of Agile. And they're important. We need to inspect and adapt. But, in my experience, if you're not figuring out how to get your developers, your testers, other people that are building, testing and delivering software working together in a different way with different engineering practices, then you're not going to be successful. You're going to see the problems that we mentioned earlier.
How do you address that if you've identified some of the things that you mentioned as potential engineering practices to adopt? Well, first, you've got to start small. I'm a huge fan of piloting change. So, in our process at Coveros, we help lots of people transform their Agile process. Once we have a plan in place, we always start with a pilot, where we take the improvements we feel are going to have the biggest impact, and maybe the fastest impact, because we want to show some quick successes too --- if executives don't see progress, they get nervous, right? We try to apply them to a particular product first, and measure the results, the success, and the ROI of applying those Agile and engineering techniques. Get it working one place first.

I think that [does] a couple of things. One, as mentioned, it demonstrates success, which is good up the chain. But what I've also experienced is --- and you know this --- any kind of change is cultural, it's people-related. People have to change, right? Our tools don't change, our processes don't change, unless you get people to change. And people are interesting entities, right? Interesting beings. We don't like change in general. And so we're not typically going to jump at change.
Now, there's exceptions. But, one thing that gets people to change is if you see others doing something successfully, then you might think, “Huh, wow, that kind of works and it looks a lot better than what I'm doing, maybe I should try that.” So already that switch is turned, and that's sometimes the hardest part.
So, piloting things. And I always say, when you pilot, you've got to demo like crazy. You should be giving constant demos of what you're doing and success. It gets buzz, and it gets other people actually coming to you and saying, “Can we be next? Can we try this next?” You've already won. You've fought half the battle when people come to you and ask for change, right, versus you having to track them down, drag them out of a cave, and beat them into Agile submission. So, you know, piloting is a great way to do that.
You also need to get everybody on the same page, whether it's through formal training and education types of things, or it's self-study, or brown bags and lunch and learns. You've got to get everybody up to speed on what is Agile really about, what's it mean --- how do you be Agile instead of do Agile. And that takes some education, any way you want to slice that, because you want everybody to go into it with the same expectation, including senior leadership, as mentioned. So, getting them on board and understanding what it means for them and for the org is going to be equally important.
Carty: Right, and working with a consultancy could be great because you can be the bearers of bad news instead of somebody internally trying to enforce change on a heavy-handed kind of level.
But, I did want to ask you, on that note about aligning expectations, and you mentioned getting developers and testers on the same page, so I want to ask you about that. It's a critical part of achieving effective Agile testing, right? So what are some ways that organizations can foster a little bit of better collaboration between those two groups?
Payne: I would say that, for me, the thing that has been the most successful is some form of pairing between devs and testers. I wrote an article on dev-test pairing that gave some different approaches you could use to get developers and testers working together: one-on-one pairing with a dev and a tester; the BDD Three Amigos [approach], where you've got the business, dev and test all working together on a story; or mobbing, where you have everybody working to build and test story by story. Whatever you're doing to get people to work together every day in some model is going to help a lot, right?
Make a contest of it, too. I've seen success with creating what we call a pairing board, where you take your team and say, all right, developers are on this axis, testers and other roles are on that axis, and we're going to set a goal. Depending on the size of your team, every sprint or every quarterly increment or whatever the time frame is, we're going to make sure everybody works with everybody else at least once on a story. And we're going to fill the chart in as we go --- in your stand-ups if it's a sprint activity, or at your retros or your sprint demos. Then, if we fill in the whole chart by the end of the quarter, we're going to have a party, or you're going to get a prize, or whatever it is, right? Gamify it, make it fun, make it something to track. I've seen success with that, because then it's a fun gaming activity with maybe some reward at the end, instead of a "thou shalt all work together" kind of mantra from above.
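The pairing board Payne describes is easy to picture as a small grid tracker. Here is a minimal sketch in Python; the class, the team names, and the completion rule are all illustrative assumptions, not anything specified in the episode:

```python
# Hypothetical sketch of a "pairing board": a developers x testers grid,
# filled in as each pair works a story together during the quarter.

class PairingBoard:
    def __init__(self, developers, testers):
        self.developers = list(developers)
        self.testers = list(testers)
        # Cells (dev, tester) that have been filled in so far.
        self.paired = set()

    def record_pairing(self, dev, tester):
        """Mark that this dev and tester worked a story together."""
        if dev not in self.developers or tester not in self.testers:
            raise ValueError(f"Unknown team member: {dev!r} / {tester!r}")
        self.paired.add((dev, tester))

    def missing_pairs(self):
        """Cells still empty -- review these at the retro or stand-up."""
        return [(d, t) for d in self.developers for t in self.testers
                if (d, t) not in self.paired]

    def is_complete(self):
        """True once every dev has paired with every tester at least once."""
        return len(self.paired) == len(self.developers) * len(self.testers)

board = PairingBoard(["Ana", "Ben"], ["Cho", "Dee"])
board.record_pairing("Ana", "Cho")
board.record_pairing("Ben", "Dee")
print(board.missing_pairs())  # [('Ana', 'Dee'), ('Ben', 'Cho')]
```

A team could review `missing_pairs()` at each retro and hold the party once `is_complete()` finally comes back true.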
Carty: So, if there's one takeaway today, it's that a pizza party is just as much of a motivator for adults as it is for kids.
Carty: That's what you want to leave this podcast with today.
Payne: Yeah, well, but beer helps, too. Not for kids though, but for adults. Yeah, add that into the adult motivation.
Carty: Yeah, you got my attention, for sure.
So, when we spoke before, you mentioned that there's been a little bit of an interesting debate around technical debt, which is a little bit related to this topic. The thinking had been that you pay down your debt whenever you have a chance to slow things down a little bit. But you told me that lately the conversation is changing a little bit. And how is that?
Payne: Yeah, so this is something that's popped up in the last year. And we actually had an internal consultant at Coveros do a brown bag on it because they'd heard an interesting talk on it, thought it was an interesting discussion, and we had an internal discussion about it. But now I'm starting to hear it at the events. I heard some conversation about this in one of the sessions that I attended at STARWEST.
Really, what it gets down to is the original definition of technical debt. I think it was Ward Cunningham --- one of the founders of Agile --- who came up with the term. His original point was that it was the debt you were imposing on yourself when you decided to release something early. Maybe it wasn't fully documented or fully defined, or you decided to put something out that wasn't yet fully ready but that, from a market perspective, made sense to release --- and there was going to be some debt incurred that you'd have to fix later. But over time, people have started to lump almost any kind of issue into technical debt. When you ask people what they mean by technical debt, they say everything from bugs to uncommented code, code that's not readable, lack of documentation --- it all gets lumped into this idea of technical debt. That wasn't the original intent of the concept. The point people are making is, "Hey, if you go back to the original definition, that's really where we need to focus our attention. We shouldn't just be working off technical debt as it's defined today, because a lot of that may not have as much ROI as building new features."
So the pushback now is, instead of just being a zealot and saying, “We just always have to make sure our technical debt is low,” is to evaluate that debt and make sure it is actually debt, and weigh that against the value of features that we're implementing. And we haven't, I don't feel like, done a good job of that. It's been a one-sided drive to reduce technical debt irrespective of any kind of quantifiable measure, if that makes sense.
Carty: Definitely. And it gets to be a scope issue, like anything else. I mean, as you add that debt up there, it gets to be harder to pay it off and it takes more time and resources.
Payne: Yep, absolutely.
Carty: Interesting. So, as you said in our discussion prior to recording, you're seeing more of an embrace of DevOps and continuous integration, or at least an attempt at those practices, and that's a good thing. But, what's the next step forward for some of those orgs or teams that might still be a little bit early in their maturity level?
Payne: So, from a testing perspective, I think it's about figuring out what you should and shouldn't automate. I mentioned earlier, one of the challenges is around automation. I'm a firm believer --- and there's people that disagree --- that you're going to have to do some amount of automation in your development process as you move toward an Agile and, certainly, a DevOps process, because you're trying to accelerate delivery, because you want to be able to refactor the code as you go. You have to refactor the code in an incremental model. You're going to need a regression suite at least, and you want that regression suite to be, as much as possible, automated, otherwise you've got a huge block in your process.
So, the question is figuring out, "Well, what do I spend effort and time automating for this process?" There's a nice model out there --- there are other models as well --- that asks you to look hard at both the test suites you have and the specific tests in those suites, and ask three important questions.
The first one is, how important is this test? If it fails, what happens? So, is it catching a critical issue that our customers just --- we just can't ever have them see? Those tests need to be run repeatedly, and that means they should probably be automated if at all possible. So prioritize your tests, prioritize your suites, prioritize by features, and automate the things that are most important that are automatable is point one.
Point two is, those tests have to be reliable. Nothing is worse than automated tests where you have to get in and figure out whether they passed or failed because sometimes they work, sometimes they don't --- they're flaky, or there's some manual effort involved in the process, or false positives pop up. You want these tests to be reliable. You want them to run and give you the same result every time. They need to be reliable and not require a lot of human intervention if we're going to use them in an automated process.
The last is, they should be specific. That just means they're testing one particular thing. They're not catch-all tests trying to cover a lot of territory all at once. We want them to test one thing. We want them to be independent so we can run them in different orders, right? We don't want tests to rely on other tests if at all possible, because that constrains our ability to reorder them, to automate some of them and not automate others, to parallelize them and run them faster in parallel maybe in the cloud. That's a hot topic right now. So, they need to be specific, and they need to be independent.
So, I always tell people, look at those three aspects of your tests, and pick out the ones that you think make the most sense. Start with those, and iteratively add to your suite as you can afford to.
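Payne's three criteria --- important, reliable, specific and independent --- can be illustrated with a short sketch using pytest-style test functions. The `Cart` class and its tests below are hypothetical, invented here purely to show what a specific, order-independent, deterministic automated test looks like:

```python
# A minimal class under test, invented for illustration.
class Cart:
    def __init__(self):
        self.items = {}

    def add_item(self, name, qty=1):
        if qty <= 0:
            raise ValueError("quantity must be positive")
        self.items[name] = self.items.get(name, 0) + qty

    def total_quantity(self):
        return sum(self.items.values())

# Specific: each test checks exactly one behavior, so a failure points
# straight at the broken rule rather than "something in checkout broke."
def test_add_item_increases_quantity():
    cart = Cart()          # Independent: each test builds its own fixture,
    cart.add_item("book")  # so the suite can run in any order or in parallel.
    assert cart.total_quantity() == 1

def test_add_item_rejects_nonpositive_quantity():
    cart = Cart()
    try:
        cart.add_item("book", qty=0)
    except ValueError:
        pass  # expected
    else:
        raise AssertionError("expected ValueError for qty=0")

# Reliable: no sleeps, network calls, or shared state anywhere above,
# so every run produces the same result with no human intervention.
```

Tests shaped like these are the ones worth automating first; a slow, flaky, catch-all end-to-end script fails all three of Payne's questions at once.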
Carty: OK, Jeff, final sprint here, so I have a few quick questions for you. In one sentence, what does digital quality mean to you?
Payne: So that's a term you hear a lot now, right? Digital quality, digital transformation. Digital quality, to me, is really assuring the success of a customer journey or a customer engagement. I feel like it's all about the customer, and the quality of that customer experience across an organization's comprehensive digital platform. Anything that touches customers --- how do I make sure that customer journey is successful? [That's] the way I would characterize digital quality.
Carty: What will digital experiences look like five years from now?
Payne: Well, I think the whole goal of the digital transformation, digital quality movement, to me, is getting a better understanding of our customers: understanding how they engage with our products and come through different touchpoints --- web, mobile, or whatever it is. There's a lot of data being collected, and organizations are trying to use that data to make better decisions and give consumers better results and better options, and, obviously, sell them more. I really feel like using artificial intelligence to take that data and start to really understand customer trends and customer needs is going to make the experience for customers a lot clearer and produce much better customer and purchasing recommendations --- non-AI types of analysis just aren't yet providing that level of sophistication.
So, I think that's going to drive more value. It's going to drive more customer satisfaction. It's going to increase customer engagement, and should increase customer revenue for companies that adopt that. So, I think AI is going to radically change the customer experience over time.
Carty: What's your favorite app to use in your downtime?
Payne: That's a great question. So, I code for fun. I always tell people I got into software because it was what I like to do. It was a hobby when I was a teenager. And I figured if you're going to get paid to do something, why not get paid to do something you like to do?
I don't code in my job anymore. I haven't written a piece of code in one of my companies in so long it's embarrassing to mention. But, I code on the weekends. So, I'd say a good Python environment's probably my favorite app because I do code in Python for fun.
Carty: What's something that you're hopeful for?
Payne: Yeah, so you mentioned you were going to ask this question. This is probably the question I've spent the most time thinking about because it's --- you could go a lot of different directions with this.
One of my hot buttons is, software is being integrated into every product. It's being integrated into every business. It's becoming more and more business-critical, mission-critical. Yet so many of our organizations are not run by technologists. They're run by people who don't understand software at all. And I deal with them every day.
I feel like the world would be so much better --- and I'm biased, of course --- if organizations were run by technology-oriented people who understood software. They would make better decisions; they would invest in the right places; and they would just better understand their products than currently some organizations do.
So, my goal, and my wish, is that, over time, organizations recognize that people who understand software and technology make great senior executives, because they're going to make the right decisions for the organization and the shareholders. And I hope that comes to pass. We're definitely not there yet.
Carty: Will we get there eventually?
Payne: I almost feel like we have to, because organizations that do go that direction --- there'll be hits and misses, but over time, I personally believe they will be more successful, because they'll better understand their products, and that means they'll better understand their customers. And, at some point, it'll tip and it will become the in-fashion thing, right? Like, everybody is hiring a tech CEO, or whatever. It'll become the in thing, I hope.
Carty: Well, Jeff, I always appreciate talking with you. Next time we speak, it might have to be about sports memorabilia or cards, but I hope that we get to do that again soon.
Payne: Yeah, no, this has been great. Thank you, David. I appreciate it.
Carty: That was our conversation with Jeff Payne, CEO of Coveros. I always enjoy talking with Jeff, and I hope to get a peek at his vintage football collection again in the future. Thank you for tuning into this episode. Thanks as well to our producers, Joe Stella and Samsu Sallah and graphic designer Karley Searles. Feel free to reach out at firstname.lastname@example.org. That's plural, email@example.com. And we will catch you next time.