People Nerds

UX Leadership in the AI Disruption

February 25, 2026

Overview

Dscout CEO Michael Winnick encourages UX leaders and practitioners to face their fears and treat AI as a design material, moving beyond polarized debates.

Contributors

Michael Winnick

CEO @ Dscout

Thumy Phan

Illustrator


Finding your footing when the ground is moving

Everything is moving fast right now, especially for UX and research teams.

With new tools showing up weekly and expectations shifting monthly, the role many of us trained for is being quietly renegotiated in real time, often without us in the room.

For people responsible for understanding users, shaping decisions, and protecting product quality, this moment feels uniquely destabilizing. The rules are changing while the work continues, and there is no pause button. There’s no clean break between learning and leading.

So the question I keep returning to is a very specific one for our community: How do UX and research leaders make good decisions, maintain credibility, and lead our teams in this moment? 

Put differently, how do we find our footing when there is little solid ground?

Face our fears

People are generally clear that AI represents a pivotal and massive change. They agree that the stakes are high, but that is often where the clarity and agreement end. 

Everyone is confused about AI, including the people building it. Just look at Sam Altman, CEO of OpenAI; he's deeper into AI than 99.99% of us, yet he constantly pivots his strategy and public positioning.

The combination of high stakes and high uncertainty breeds fear, and everyone building software feels that. 

It’s showing up across product teams questioning their roadmaps, engineers watching AI write code, and executives placing bets they don’t fully understand. Everyone in the software development lifecycle is rethinking their value in real time.

UX teams have their own fears, too. They’re wrestling with fears of becoming irrelevant, defending work that feels slower than the business demands, and being continuously asked to justify their worth.

Let's be honest: many of these fears are grounded in reality. Teams are being restructured, research budgets are under pressure, and the value that was once assumed is now being questioned out loud.

But here's the trap. When humans feel threatened, we respond in predictable ways. 

We fight—over-asserting certainty, getting defensive about our methods. We flee—avoiding AI conversations, hoping the wave passes. Or we freeze—going quiet, waiting it out. These reactions are understandable. 

But they can also be self-fulfilling. When UX becomes defensive or invisible, it confirms the suspicion that we might not be essential to what comes next.

Momentum is the way

What helps UX teams move forward right now isn’t confidence theater or polished opinions. It’s momentum grounded in reality.

Momentum comes from engaging with users early, often, and visibly. It comes from learning fast enough to keep design and research tied to real decisions and showing the organization that uncertainty does not mean paralysis. This is where UX leaders have a real edge.

Our discipline exists to operate inside ambiguity. We are trained to make sense of incomplete information, surface patterns, and guide decisions without pretending the messiness isn’t there.

The challenge is turning those strengths into forward motion that the business can feel.

Get in the field

When AI conversations drift into speculation, the most stabilizing move UX teams can make is to return to users.

While many AI narratives are being shaped in boardrooms and social feeds, the real story is already unfolding in how people adapt their behavior, invent workarounds, and recalibrate expectations.

The UX teams I see navigating this moment well are grounding themselves in questions like:

  • How are our users actually using AI today, inside and outside our category?
  • What fears and expectations are emerging beneath the surface?
  • Where are the highest-friction moments in the experience, and how are those shifting?
  • Which jobs truly matter, and how are AI-native products approaching them?

Running this kind of research regularly does more than generate insight. It turns you and your team into a clarifier, replacing fear and abstract debate with observable reality.

Treat AI as a design material

Physical product design requires a deep understanding of materiality and material properties, and I think UX is the same. It's hard to understand AI unless you understand it as a medium and play with it yourself.

Materials reveal themselves through use. You don’t learn their properties by arguing about them. You learn by shaping, testing, and seeing where they behave in unexpected ways.

The same is true here with AI. I see UX and research leaders like Rose Beverly, David Eisner, and Noam Segal pushing the bounds of these tools to great effect.

Building something, even rough and imperfect, changes the relationship from intimidation to curiosity. The unknown becomes more navigable, questions become sharper, and judgment improves across the board.

This might look like:

  • Vibe coding something for personal or professional use
  • Prototyping with vibe coding tools
  • Partnering with engineers using AI coding tools
  • Exploring new tools simply to understand their constraints and affordances

Direct experience builds discernment, and discernment enables UX leaders to speak with credibility when trade-offs emerge.

Bring discernment to polarized debates

Strong opinions about AI are everywhere, including within the UX community. And many of them arrive as absolutes.

"AI should never moderate an interview."

"We should never use synthetic participants."

I respect principled stands. But many of them are born of fear rather than experience. They foreclose inquiry before it starts.

What moves teams forward is shifting these conversations toward conditions and context. Instead of asking "Is AI moderation good or bad?", ask "When and where is AI moderation helpful? Where does it fall short? What does it depend on?"

Instead of "Are synthetic participants acceptable?" ask, "When are synthetics useful? When are they misleading? What are we trading off?"

This is a muscle UX professionals already have. We are the home of “how might we,” after all. We're trained to notice nuance, understand edge cases, and evaluate experiences in real-world conditions rather than hypothetical ones—use those skills here.

When UX teams anchor AI discussions in observed behavior and specific tradeoffs, they elevate the conversation. Decisions become easier to explain, trust grows across functions, and you position yourself as someone who clarifies rather than just opines.

The ground will keep shifting

If the ground feels unstable beneath you, it does for your team, too—from junior designers and researchers quietly questioning their path to senior practitioners rethinking how their expertise fits this new landscape. Everyone is working to stay relevant while the bar keeps shifting.

“Your job as a UX leader isn't to pretend the ground is stable. It's to create footholds—small spots of clarity that give people something to stand on while everything else shifts.”

So be clear about priorities this quarter, and be explicit about what success looks like on the project in front of them. Name what you know, what you don't, and what you're figuring out together. These aren't grand gestures; they're just enough traction for people to keep moving.

The ground is going to keep moving. That's not a problem to solve. It's the condition we're operating in.

What UX leaders can control is how we respond: with curiosity instead of defensiveness, with momentum instead of paralysis, with honesty about what we know and what we don't.

The skills that got us here—making sense of ambiguity, understanding people, guiding decisions through uncertainty—are not obsolete. They're more necessary than ever.
