
3 UX Predictions for 2026

At the end of every year, everybody loves to make predictions for the next one. But at the speed of today’s technological progress, it’d be foolish to predict anything.

Unless, of course, you can influence it yourself. At AWS Applied AI Solutions, my team and I intend to influence three big things next year. We hope we’re not alone.

Prediction 1

UX moves from interaction design to relationship design

For decades, we’ve designed for machines, not humans. Computers couldn’t process fluid human thought, so we trained ourselves to speak in the rigid, atomized language they understood. We compressed questions into keywords. We translated nuance into boolean logic. We broke natural workflows into discrete steps code could handle.

This technological constraint became our design philosophy. Since machines needed fragmented inputs, we created fragmented interfaces. UX became a collection of static, isolated interactions. Open an app, do a task, close it. Click a button, toggle a filter, submit a form. We stitched these disconnected moments together and called it an “experience.”

But it was never really an experience. It’s episodic amnesia. Every session starts from zero, from nada. Your email client doesn’t remember you hate newsletters. Your calendar doesn’t understand that 8am meetings ruin your day. And your music app still cues up “sleep sounds” while you’re driving 70 mph on the highway to that damn 8am meeting. Our digital relationships are stuck on a bad first date — in perpetuity.

Modern language models break this cycle. For the first time, AI can hold context across sessions and remember your history. It learns your preferences. It anticipates your needs. It acts autonomously on your behalf. And when AI starts acting on your behalf, it shifts from being a tool you operate to a teammate you collaborate with.

But here’s the catch: nobody collaborates with teammates they don’t trust.

You need confidence that your teammate understands your goals, has your back, and won’t let you down. This confidence doesn’t happen overnight. It accumulates through repeated moments of competence, transparency, and respect for boundaries. You can’t design trust directly; instead, you must design the conditions where it can grow.

In 2026, UX’s center of gravity will shift from designing discrete interactions to designing ongoing relationships. The fundamental design challenge changes from capturing attention to cultivating trust.

Where interaction design used color, hierarchy, and animation to direct eyes, relationship design will create spaces where confidence develops naturally. We’ll stop building “onboarding flows” and start designing “first weeks” where new AI teammates learn your working style and over-explain their thinking. We’ll stop demanding blanket permissions upfront and start enabling gradual handshakes so AI teammates earn access to data only when they can immediately demonstrate value. We’ll stop measuring Daily Active Users and start tracking Daily Delegated Decisions to see how many choices people trust their AI teammates to handle.

The best companies have always run on trust between teammates. In 2026, some of those teammates will be AI. The winners will be the businesses and people who embrace those relationships.

Prediction 2

Devices shift from user interfaces to human interfaces

And yet, we aren’t ready for this relationship with AI teammates.

Since the Macintosh launched, we’ve been in love with the Desktop Metaphor. We’ve relied on physical stand-ins like trash cans, file cabinets, and documents that made abstract code feel like tangible objects we could manipulate. We trained ourselves to drag items, click buttons, and scroll documents. It worked beautifully — for tools.

But now, we’re mixing metaphors. We’re forcing AI — a dynamic, non-deterministic intelligence — into a rigid, object-oriented framework. We’re trying to treat teammates like tools. And the mismatch shows everywhere. The friction you feel every time you click an “AI sparkle icon” or open a bolt-on sidebar chat is less a design flaw than a category error.

We need a new design philosophy. If the era of tools was defined by skeuomorphism (mimicking objects), then the era of AI teammates should be defined by “humorphism” (mimicking people). The point isn’t that AI should look like a person; it’s that AI should collaborate like one.

The blueprint already exists. For as long as people have worked together, they’ve been refining collaboration patterns: the “standup” to sync, the “escalation” to handle blockers, the “huddle” to tackle complexity. These patterns might look different across companies, cultures, and contexts, but they’re foundational. We just need to translate them into digital equivalents.

Consider what this means in practice:

The "standup": Instead of a static dashboard, your AI teammate joins your morning sync, not as a chat window, but as a presence that reports progress on overnight tasks and listens for new context.

The "escalation": When it runs into a blocker or some ambiguity, it doesn’t just make stuff up. Instead, it "escalates" like a good employee and presents the context, the problem, and three potential solutions for your review.

The "huddle": When complexity peaks, the interface shifts from async text to real-time, multi-modal collaboration—a digital whiteboard where human and AI reason through a problem together before dispersing.

By 2026, we’ll stop forcing AI into user interfaces and start designing human interfaces. We’ll stop applying the physics of objects (drag, drop, click) and start applying the dynamics of humans (listen, interrupt, escalate). The desktop metaphor will finally yield to the collaboration metaphor.

Prediction 3

Designers evolve from power users to empowered humans

When AI teammates finally arrive to collaborate with us, what will your first reaction be?

Many designers will panic and resist. It’s human nature to resist change, especially when it feels like it threatens your identity.

But this resistance is self-defeating. It’s like a typesetter fighting to arrange metal letters on a printing press, instead of becoming the editor who decides what story needs to be told. Why defend your right to remain a machine when you could reclaim the privilege of being a human?

Designers have built entire careers on skills that were basically human API endpoints. We prized mechanical cognition like moving pixels, crafting flows, and writing strings. We convinced ourselves this labor was our superpower. But really, so much of design craft was just compensation for what computers couldn't do yet.

As AI teammates take on the mechanical work, value moves upstream. We’ll spend more time conceptualizing than producing, more energy curating than making, and more attention judging than generating.

In a word, we’ll all get promotions.

At this stage in my career, nobody lets me near design tools. And I’m thankful for that. I spend my workdays exercising the taste I’ve cultivated over the years, sharing intuitions about product decisions that feel off, and relying on my eye for what’s beautiful. I make judgment calls when data conflicts with my instincts. My work is irreducibly human, and it’s deeply fulfilling. I am an empowered human.

In 2026, we’ll see more designers evolve from “power users” to “empowered humans.” We’ll see a shift from being valued for mastery of tools and shortcuts to being valued for the aesthetic taste and instincts that got us into design in the first place.

The visionary leaders of tomorrow won’t run AI bootcamps. They’ll run human rehabilitation programs. They’ll help people excavate the creativity they buried under process, the intuition they apologized for having, and the judgment they were never asked to trust.

Closing

Let’s make humans more human

For 25 years, I’ve been on a mission to humanize technology. I’ve built multiple AI assistants so people could use their natural voice and gestures. But if I’m honest, in practice that mostly meant making the machine slightly less painful to obey.

Now, I finally have the ability to make AI more human. But the irony is, I’m not even really interested in doing that anymore. That’s just a means to an end. My real mission is to make humans more human.

Carl Jung wrote, “The privilege of a lifetime is to become who you truly are.” We’ve spent a lifetime being what the computer needed us to be. In 2026, let’s become ourselves. LFG!

– Hector Ouilhet, VP Design, AWS Applied AI Solutions