
Humans in the Loop: Foot Locker’s Approach to Quality, Innovation, and AI
Insights
- Embedding quality early—through shift-left practices, KPIs, and observability—turns testing into a driver of business outcomes rather than an afterthought.
- AI in testing delivers measurable productivity gains, but the true value comes from pairing automation with human judgment, creativity, and oversight.
- Building a culture of continuous learning, alignment, and psychological safety enables enterprises to adopt innovation confidently and scale transformation sustainably.
In this episode of the Infosys Knowledge Institute podcast, Samad Masood speaks with Sunita McCoy, Global Leader for Platform Quality Engineering, Observability, and DevX at Foot Locker. They explore how enterprises can balance innovation and governance, embed quality early in software development, and foster cultures of psychological safety and continuous learning. Sunita shares lessons on leveraging observability and running AI experiments in testing—highlighting the human factor as central to scaling transformation in the AI era.
Samad Masood:
Welcome to the Infosys Knowledge Institute podcast, where business leaders share what they've learned on their technology journey. Today, I'm speaking with Sunita McCoy, the Global Leader for Platform Quality Engineering, Observability, and DevX at Foot Locker. And we're going to be talking about humans in the loop, how you balance innovation, governance, and quality with the ever-important human factor. Hi, Sunita, thanks for joining us.
Sunita McCoy:
Hi, Samad. Thank you for having me today.
Samad Masood:
So Sunita, tell us a bit about yourself and your career, your interests and how you got here.
Sunita McCoy:
Sure. Thank you, Samad, for the question. You know, like I said, this has been a journey and thinking back three decades ago, this is probably not what I had anticipated I'd be doing. I started as a software engineer in my career and this was a very natural adaptation, more so out of need. I was busy developing code and engaging with clients and passing software over to them and they'd say, gosh, this is really not what we were expecting. And so our requirements and end product were really not a match. And I learned there was something called quality engineering in the midst of it all. And I started my first greenfield adventure of building a quality organization as part of a consulting firm. And from there on, I took a natural passion to leading software engineers through transformation. It's a very, very niche area in my opinion. It takes time for organizations to realize they've been in tech debt and have the opportunity to revive and become more new gen, and that's kind of where I've gotten myself into being a transformation leader.
Samad Masood:
What do you see typically going wrong in large innovation projects around governance and quality? Because I see there also a kind of a balance of innovation being creative, fast-moving. Move fast and break things, and governance and quality is literally the opposite of that. How do you marry these two?
Sunita McCoy:
When you think about innovation, transformation, software engineering, it truly is a journey. Rome wasn't built overnight. And when you think about that, today's enterprises are fundamentally focused on what am I giving my business? What am I giving my customers? That's number one. And I know that's revenue generating. To do that, you do want to transform. You want to anchor on innovation. And anchoring on innovation, it's really a thought, right? You got to understand that it is a capability that will propel you, make you faster, better, probably more economical, you'll have economies of scale. And so when you think about that, it really depends on the message and the mantra that the organization is conveying. I like to believe, as leaders, if innovation is truly important, we have to create that environment and the atmosphere culturally within the workplace. And so whether it is launch and learn, whether it is adopt a new tool, try it, fail fast, but bring all of those lessons back to the organization and talk about them openly so that you are truly encouraging folks in removing that fear. So ensuring there is psychological safety in really trying and being open about what you're learning.
The other thing that you talked about was that human factor and when I go back to being in college, I don't think we've really anchored on human factor. We're very, very focused on the technical skills and software engineering. And so that piece of it comes to us very naturally. However, to be successful in corporate America, or just corporate in general, it really is very, very important that leaders think differently and encourage our associates to do differently. So that we're not kind of squaring ourselves in a box. So I think it's really, really important to build that culture and encourage that advancement.
Samad Masood:
Do you think, because again, there is that necessity, isn't there, for governance and quality to kind of control and moderate and monitor? You know, that takes a different mindset, doesn't it? And what you're saying, I guess, coming out of college with a bit of that mindset, a bit of that structured mindset, but you're trying to engage more creativity, more trust. Can you give us some examples of how that works or where it goes wrong as well?
Sunita McCoy:
Sure. So I'm a metrics kind of girl. So, you know, my teams will tell you, you want to talk to Sunita, go ahead and bring some data and some quantitative measures and she'll really understand it more seamlessly. I think it's just that you can't argue data, right? What's in front of you, you can't really argue it. Now, it also is important that you understand the intent of the data and how you've come about it. A lot of organizations set goals and agendas. And I've noticed that the opportunity lies in making sure those goals and objectives are SMART and that they're actually measurable. And you don't have to hit the industry-best metrics at the get-go. I like to think about it as baby steps, right? Think about what's the first milestone that you want to hit and then kind of build on top of it over the course of time.
And where does it go wrong? I will say competing priorities are the biggest one, in my opinion. I've often noticed that competing priorities just create a lot of distraction and context switching for team members. So a lot of lost time there and inefficiencies. And the other is that you're using the same set of talent to scale across all the different priorities that you have. So I typically tend to walk in and have those conversations upfront. Let's talk about our priorities. What do we want to really achieve, whether it's in a month or quarterly or for this year, but then let's stay true to it, realizing there may be opportunities or needs, internal or external factors, that make us pivot. So there's a need to be agile, right? But then do it such that everybody understands, so it's well communicated and there's that transparency and the why behind what you're trying to do. So that's a big one.
Then the other thing I noticed is just skill gap. AI is a big topic and I know you're going to have questions for me around it, but that's another one that we're navigating and you will quickly realize, hey, the hype's there, you may not have those unicorns available to you as easily as you would like for them to be. Subject matter expertise, right? You are working with multi-generations at a given point in time and we all are at different points in our life. Some of us getting ready to retire, some just walking into the workforce. It takes time to build that knowledge acumen. Those would be three big ones that I would hit. And the fourth one I would throw in is culture, cultural barriers. The leadership tends to set the tone for some of these events to happen. And depending on who's leading and what their priorities and focus are, I think will be another piece to kind of navigate.
Samad Masood:
How do you recommend organizations should be structured in order to make quality really embedded? As you say, as you're describing it, effectively, having those short-term goals, aligning on the objectives that you have, not just blindly, it's not a waterfall culture anymore. What are your recommendations to make agile at scale work?
Sunita McCoy:
From an operating model perspective, first off, just within the organization, I would say there needs to be leadership alignment. Like leadership really truly needs to hold hands and sing kumbaya, in my opinion, for that to really transcend down to the operating teams, boots on the ground. I'm big on relationships and allyship, right? Sometimes I might say it and it may not really stick as well as if you were coming to the table and, you know, reiterating it and throwing your own flair at it. When more of us say it, I think it tends to stick.
From a shift-left perspective, a big piece of what I'm doing right now is really building a culture of shifting left from a quality standpoint within the realm of software engineering, right? So don't lean on the test phase as the area to really assess the quality of the product or the code that you're putting out, but shift left and build those quality aspects early on, from requirements, from design, right? Are your requirements testable? Is your design adequate? And can you start writing your test cases as developers are developing their code? Can it be automated? Could you leverage AI in tests? All of these things should really be conversations that are happening right at the start, not at the end.
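Writing the test cases while the code is still being developed can be as simple as encoding the requirement directly as assertions. A minimal sketch in Python, where the discount rule and function names are purely illustrative, not Foot Locker's actual code:

```python
# Illustrative requirement: orders of $100 or more get a 10% discount.
# The tests are written while the feature is still in development, so
# "are the requirements testable?" is answered before the test phase.

def apply_discount(total: float) -> float:
    """Candidate implementation the tests drive out."""
    return round(total * 0.9, 2) if total >= 100 else total

def test_discount_applies_at_threshold():
    assert apply_discount(100.0) == 90.0

def test_no_discount_below_threshold():
    assert apply_discount(99.99) == 99.99
```

Because the tests restate the requirement, an untestable or ambiguous requirement surfaces immediately, before any test phase begins.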
And then also setting yourself up to have indicators. Like what kind of KPIs? Like if I say, hey, I want to have a less than 10% defect find rate to actually make this operating model for delivery work, but then you actually see you're at 20%, you're at 30%, those are indicators that will quickly tell you, hey, you're falling off track here, and what can we do to pivot and mitigate and bring us back to green? So I think it's all-encompassing: alongside the partnerships, you need the tools and capabilities and the leadership and people alignment.
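An indicator like that reduces to a measured rate checked against a target. A minimal sketch in Python, assuming a per-test-case definition of defect find rate and the 10% threshold from above (both illustrative; teams define this KPI differently):

```python
def defect_find_rate(defects_found: int, tests_executed: int) -> float:
    """Defects found per executed test case, as a percentage."""
    if tests_executed == 0:
        return 0.0
    return 100.0 * defects_found / tests_executed

def kpi_status(rate: float, target: float = 10.0) -> str:
    """Green when the rate is at or under target, red when it drifts above."""
    return "green" if rate <= target else "red"
```

So a sprint that finds 30 defects across 150 executed tests sits at 20%, twice the target, and flags red early enough to pivot.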
Samad Masood:
And I can see what you're doing here is you're aligning quality to the outcomes of the business, the business outcomes, right? And making it relevant at the start. Because you know, historically testing and quality is seen as the thing you do at the end. Once everyone's decided what they wanted to build, but actually you're tying it directly to the business benefits of the application or the project.
Sunita McCoy:
Yeah, yeah. And you know, as you're talking about it, because I ask my quality leaders to establish their goals and objectives in alignment with the product roadmap. Because if I was doing it in a silo, it wouldn't be meaningful. And then, you know, bringing in the DevX and observability components, and why I really enjoy having those portfolios in my realm of things, is that it gives us an opportunity to really think more holistically. I am hugely focused on pivoting from a reactive mode to a proactive mode. And observability is amazing from that aspect, right? Thinking about your NFRs and talking about how you want to ensure you have end-to-end traceability early on so you can actually trace your customer journeys and react before it actually becomes a thing. And so those have been very helpful in building that capability. That muscle is hard to build. It takes time and a lot of focus. But if you can get yourself into a position where you're more proactive than reactive, I think you have a better chance at having a win-win.
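At its core, end-to-end traceability is one correlation ID minted at the edge and carried through every downstream step, so a customer journey can be stitched back together afterward. A minimal stdlib sketch with hypothetical service names; a production setup would more likely use W3C trace-context headers or an observability platform:

```python
import uuid
from typing import Optional

def handle_checkout(order: dict, trace_id: Optional[str] = None) -> str:
    # Mint a trace ID at the entry point if the caller didn't send one,
    # then hand the same ID to every downstream call.
    trace_id = trace_id or uuid.uuid4().hex
    reserve_inventory(order, trace_id)
    charge_payment(order, trace_id)
    return trace_id

def reserve_inventory(order: dict, trace_id: str) -> None:
    # In a real system this would be a log line or span tagged with the ID.
    print(f"[{trace_id}] inventory reserved for {order['sku']}")

def charge_payment(order: dict, trace_id: str) -> None:
    print(f"[{trace_id}] payment charged: {order['amount']}")
```

Searching logs for one trace ID then replays the whole journey, which is what makes reacting before a customer notices possible.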
Samad Masood:
We at Infosys and at the Infosys Knowledge Institute did a piece of research earlier this year called the AI Business Value Radar. And we looked at different use cases, AI use cases, how they had been deployed and how mature they were and how much value they were generating. We were able to identify which use cases, or which strategies, were generating more value out of AI. And a really interesting thing we found was we asked companies about their human engagement, engagement with their employees. And those companies that were engaging fully in change management, AI training, communication around AI and involving their employees in AI projects and design had significantly higher effectiveness from the AI project. A significantly larger proportion of their projects delivered value for the business. This was a bit of a surprise for us, though it makes sense. And certainly, talking to you, I imagine it doesn't surprise you at all. What's your perspective on the importance of continuous learning and change management? You mentioned earlier the multi-generational, you know, colleagues and all this. How do you go about this big AI change we are going through and sort of remain sane, I suppose, because everything seems to be moving so fast and there's so many pressures?
Sunita McCoy:
So, the idea of AI, when I talk to my husband, he's like, hey, AI has been around for the last 35 years. I learned it when I was in college, it was something else. And I will say in the workspace, I've seen it truly, truly pick up pace in the last five years, maybe five to six.
And those who want to run with it have probably taken off, and it's a very, very small percentage. Most of us, I would say, are more so in that laggard phase, right? Like we're slowly trailing. We talk about hallucinations, security concerns, oh my God, my data. Everyone's going to hear everything that I'm doing, saying, and so how much of it do I really put into the AI tool? Is it really safe? Am I secure? The big win, in my opinion, is truly being open to learning something new. And I'll give you a very simple example, ChatGPT, Copilot, Grok, like you name it, they're out there. Sure, some of us are probably just using it to fix an email or an essay or getting this math problem resolved. Now you just take that same thought, bring it into the workspace. If you are in an enterprise or organization that is truly supportive of adopting AI and to your point, really pushing for that continuous learning and creating those avenues and learning paths for the organization, for the team members, I think you have a win.
I'm currently running an experiment, and this will be the second time running an AI experiment. We think we could easily gain a 20% lift, like right off the bat, if you were to use AI in test. And more than that, as I was talking to my teams, I'm like, what's the advantage? I get it. You're telling me you get a 20% lift. But then what else? They're like, hey, I have higher confidence that the test cases that I've written are actually solid and are up to par, meet the requirements and the needs. It's almost like using AI as a peer review. And then also probably making me think about a scenario or two that I wasn't going to think about. So it's giving me a little more.
And for those of us who are not truly automation engineers, but can actually use AI and say, hey, just give me this automation, take this requirement, spit it out into automated test cases, use Gherkin, Cucumber, you know, whatever, and then spit out the code, you're already getting a framework. And then you can embellish it with whatever step definitions you want to add to it. It is helping you be more effective, more productive, perhaps giving you some peace of mind that you didn't have before. It's a very small experiment, but it's a good starting point for me. I personally feel that's a great win.
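What "get a framework, then embellish" can look like: a Gherkin scenario of the kind an AI tool might draft from a requirement, plus a few lines that list the steps still needing step definitions. The feature text and the requirement behind it are invented for illustration:

```python
# A Gherkin scenario an AI tool might draft from a hypothetical
# requirement such as "shoppers can filter search results by shoe size".
FEATURE = """\
Feature: Product search
  Scenario: Filter results by shoe size
    Given I am on the search results page for "running shoes"
    When I filter by size "10"
    Then every result shown is available in size "10"
"""

def pending_steps(feature_text: str) -> list[str]:
    """List the Given/When/Then lines that still need step definitions."""
    keywords = ("Given ", "When ", "Then ", "And ")
    return [line.strip() for line in feature_text.splitlines()
            if line.strip().startswith(keywords)]
```

In a Cucumber or behave project, each of those three steps would then be bound to a step-definition function: the generated scenario is the framework, the bindings are the embellishment.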
It also opens up or breaks those barriers of, oh my God, is my job going to walk out of here because now I have AI coming into play to do everything that I'm doing? This is where the human factor comes into play, right? AI is not going to have the level of human interaction that you and I are having right now. And while AI is good to tease my brain and make me think about something differently or cement the idea that I have, it's not going to fulfill this interaction. And so I think that will always stay. Our AI will be just as good as we as humans build it or make it out to be. And it will grow over the course of time as we're modeling them and feeding them content and data. But that is the advantage.
Definitely try it, right? Give it a shot. Be exploratory. Create goals. Like when we do organizational goals, we actually include, whether it is certifications or continuous learning, we encourage our team members to go ahead and sign up for one or two a year so that they're also staying relevant with time. Technology moves too fast otherwise.
Samad Masood:
Yeah. And I guess it frees you up to do those things that you were saying earlier that were so important, which is stakeholder alignment and that collaboration, and actually it gives you more time to focus on what you should be doing, even if some of the things you're doing are being sped up by AI. And increasingly, so many people are saying it's just a human assistant, and really, we shouldn't be calling it artificial intelligence; it could be assisted intelligence or something like that, because it really is just a tool to help.
So thank you so much, Sunita. It's been really great to get a bit of an insight into your approach and I loved hearing about the human side of quality engineering. I'd like to ask you, just for our audience and our listeners, for two or three quick takeaways that you would give as advice for someone in your position, overseeing quality, governance, and experience in this AI era. What would you say are the quick takeaways?
Sunita McCoy:
Gosh, I'll probably give you three. I'd say one is build a culture of continuous learning. Create those avenues and opportunities, invest in your people, right? Make sure you have measurable KPIs so you can actually celebrate those wins as you achieve them. And I would say priorities, align on your priorities right across the enterprise, not in your own little world, not in your own silo, but for the entire enterprise. I walk into conversations and I'm asking my C-suite, are they aligned? That's the level of alignment that I'm looking for. So those will be three things that I would say.
Samad Masood:
Great, Sunita, thank you so much for joining us on this interview and good luck with the AI revolution and transformation.
Sunita McCoy:
I really appreciate it. Thank you so much. I have a great team and I'm very much looking forward to the journey. Thank you.
Samad Masood:
This podcast is presented by MIT Tech Review in partnership with Infosys Topaz. Visit our content hub at technologyreview.com to learn more. And be sure to follow us wherever you get your podcasts. You can find more details in our show notes and transcripts at infosys.com/IKI in our podcast section. Thanks to our producers, Christine Calhoun and Yulia De Bari. Dode Bigley is our audio technician and I'm Samad Masood with the Infosys Knowledge Institute, signing off. Until next time, keep learning and keep sharing.