Medallia’s 2026 State of Customer Experience Report should make many CX teams uncomfortable. Not because the findings are surprising, but because they confirm something many teams have been avoiding for a long time: a lot of feedback programs still create the appearance of listening without proving that anything meaningful is being understood or improved.

The headline finding says it all. 66% of CX practitioners believe their company’s customer experience improved last year. Only 17% of consumers agree.

That is not a minor perception gap. It is a disconnect wide enough to make you question what many CX programs are actually measuring and whether internal signals are being mistaken for customer reality. That is also why the report sparked an immediate reaction.

Within days of the report’s release, Maurice FitzGerald, former VP of Customer Experience at HP and now Editor-in-Chief at OCX Cognition, responded with a sharp LinkedIn post stating that the findings point to the need for “immediate and radical change.” His conclusion was blunt: brands should move beyond survey-heavy thinking and go further with AI-generated insights. As he put it, this is not about moving from vinyl to cassettes. It is about moving straight from vinyl to Spotify.

It is a strong line. And he is right about one thing: the old model is clearly under pressure. But based on what we see in our own data, the conclusion goes too far. Because Medallia’s report does not prove that surveys are dead. It proves that generic, low-context survey programs are losing power.

That distinction matters.

Retently’s recent study analyzed over 25 million survey invitations across 600 ecommerce brands. Different sample, different scope, different methodology, but when you line our data up next to Medallia’s, the overlap is striking.

Not because the numbers match perfectly. They do not. Because the pattern does.

Both datasets point to the same broader truth: feedback programs start losing value when they are treated like volume systems instead of listening systems.

Where the alignment is hard to ignore

1. Response rates are declining, and likely more than many teams realize

Medallia’s practitioner survey puts the average response rate at 8.6% as of Q3 2025.

Our dataset, measured across actual ecommerce survey sends, shows a full-year average of 5.76%, and that includes a Q4 drop to 4.03% as brands nearly doubled survey volume during peak season.

The directional trend is the same. Whether you look at Medallia’s broader CX report or our invitation-level ecommerce data, response is getting harder to earn.

Retently’s data also suggests the problem often becomes more visible under real operating pressure. When brands push volume too hard, participation does not remain stable. It weakens.

That is what Q4 exposed. More sends did not create a more useful signal. They created a weaker response environment. That is not just a seasonal fluctuation, but a sign that many programs are operating beyond the point where more outreach produces more value.
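The Q4 pattern is easy to see with a back-of-the-envelope calculation. The response rates below (5.76% full-year average, 4.03% in Q4) come from the Retently dataset cited above; the send volumes are hypothetical, chosen only to illustrate the "nearly doubled" peak-season volume.

```python
# Back-of-the-envelope sketch of the Q4 pattern described above.
# Send volumes are hypothetical; the response rates come from the
# Retently figures cited in the text.

def responses(sends: int, rate: float) -> int:
    """Expected completed surveys for a given send volume and response rate."""
    return round(sends * rate)

baseline_sends, baseline_rate = 100_000, 0.0576  # full-year average rate
q4_sends, q4_rate = 200_000, 0.0403              # volume nearly doubled, rate fell

baseline = responses(baseline_sends, baseline_rate)
q4 = responses(q4_sends, q4_rate)

# Doubling outreach did not double the signal: each extra send earned less.
extra_sends = q4_sends - baseline_sends
marginal_rate = (q4 - baseline) / extra_sends
print(f"Baseline: {baseline} responses from {baseline_sends:,} sends")
print(f"Q4:       {q4} responses from {q4_sends:,} sends")
print(f"Marginal response rate on the extra Q4 sends: {marginal_rate:.2%}")
```

On these illustrative numbers, the extra Q4 sends convert at roughly 2.3%, well below even the depressed Q4 average: the marginal survey is worth far less than the average one, which is exactly what operating past saturation looks like.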

2. Medallia’s “inaction gap” shows up in our data, too, just from the customer side

One of the strongest findings in Medallia’s report is that 30-40% of departments do nothing at all with the customer insights they collect.

That should be treated as a much bigger deal than it often is. Because once customer feedback is gathered but not acted on, the damage is not only internal. It becomes visible externally, too. Customers may not know which department ignored the signal, but they do learn a simpler lesson: brands keep asking, and nothing seems to change.

That is exactly where our own data becomes useful. Medallia identifies the organizational problem. Insight gets collected, but progress stalls at the point of action. Our data shows the customer-side effect. When brands keep increasing survey volume without improving timing, relevance, and especially follow-through, the response starts to weaken. So yes, survey fatigue is real. But it is not just a volume problem. It is often a credibility issue.

Response rate, therefore, deserves to be treated as more than a campaign KPI. In many cases, it functions more like a health signal, reflecting whether the listening model still feels relevant and worth responding to.
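Treating response rate as a health signal rather than a campaign KPI can be as simple as watching for sustained drift below a program's own baseline. The sketch below is a minimal illustration of that idea; the window size and alert threshold are assumptions for demonstration, not recommendations from either report.

```python
# Minimal sketch: response rate as a program health signal.
# Window size and the 70% drop threshold are illustrative assumptions.

from collections import deque

class ResponseRateMonitor:
    def __init__(self, window: int = 8, drop_alert: float = 0.7):
        self.drop_alert = drop_alert   # alert when rolling rate < 70% of baseline
        self.rates = deque(maxlen=window)
        self.baseline = None

    def record(self, sends: int, responses: int) -> bool:
        """Record one campaign; return True if participation looks unhealthy."""
        self.rates.append(responses / sends)
        rolling = sum(self.rates) / len(self.rates)
        if self.baseline is None and len(self.rates) == self.rates.maxlen:
            self.baseline = rolling    # first full window becomes the baseline
        if self.baseline:
            return rolling < self.baseline * self.drop_alert
        return False
```

A monitor like this would have surfaced the Q4 pattern described earlier as a health alert rather than a line item in a campaign report: the question it answers is not "how did this send perform?" but "is the audience still willing to talk to us?"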

3. Consumers want brands to infer more from behavior, but that does not mean direct feedback no longer works

Medallia reports that more than half of consumers believe brands should infer satisfaction from behaviors and signals, not surveys alone.

That fits what we see in our channel data almost perfectly. Our in-app surveys, which intercept customers during active experiences, reached 32.3% response rates, roughly 10 times the email average.

That is not a small channel win. It is a fundamentally different participation environment. And it points to something important: customers are not refusing feedback altogether. They are refusing friction.

They are much more willing to respond when the feedback request is tied directly to the experience they are already having, rather than asking them later to reconstruct it from memory. That does not mean email has no place in CX programs. It means email often becomes less effective when it is used as the default channel for moments that would benefit from more immediate, contextual feedback.

So when consumers say they want brands to infer more and ask less, that should not be interpreted too literally. It does not necessarily mean, “never ask me.” More often, it means: do not make me work unnecessarily hard to help you understand something you should already be able to see.

Where the “drop surveys and move to AI” conclusion breaks down

This is where some of the more extreme reactions to Medallia’s report miss the point.

If the takeaway is that many traditional survey programs are underperforming, that is clearly true. If the takeaway is that teams need to broaden their listening models beyond surveys alone, that is also true. If the takeaway is that AI should play a larger role in modern CX systems, again, yes.

But if the conclusion is that surveys themselves are inherently obsolete, our data tells a different story. Because when surveys are done well, they work dramatically better.

In our dataset:

  • In-app surveys sustained 30-35% response rates year-round, including through Q4 when email dropped
  • CES surveys reached 22.5% across channels
  • Customers with 10+ orders responded at 2.5x the rate of first-time buyers
  • Personalized surveys in emotionally relevant categories saw lifts above 700% (e.g., +765% in Gifts & Specialty and +771% in Health & Beauty)

That is not what an obsolete method looks like. That is what a method looks like when outcomes depend heavily on execution. And that is the point that often gets lost. The survey programs that perform worst tend to share the same bad habits: broad email blasts, generic cadences, weak segmentation, poor lifecycle awareness, low emotional relevance, and no meaningful action loop.

If that is how a program is designed, then yes, surveys will start looking outdated. But that does not prove the instrument is obsolete. It proves the strategy is.

Three things the combined data makes very clear

Our data points to three shifts that separate high-performing programs from the average:

Channel matters more than most teams think

A 32.3% in-app response rate versus a much lower email average is not just a copywriting issue. In many cases, it reflects a context gap. Embedded feedback requests reach customers closer to the moment, while email often works harder to recreate relevance after the fact. That does not make email ineffective on its own. It means email is often less suited to immediate, context-sensitive moments than more embedded or transactional survey channels.

Volume does not create signal forever

Customers with 10+ orders respond at 2.5x the rate of first-time buyers, but the Q4 collapse shows that blasting more surveys at everyone simultaneously erases that advantage. Relationship depth enables feedback. Saturation kills it.

One of the clearest ideas in our research is that frequency can build willingness, but volume destroys it past saturation. And that fits Medallia’s report surprisingly well. If up to 40% of departments are not acting on feedback anyway, then collecting more and more of it without changing the operating model just compounds inefficiency.

Surveys still work, but only when they respect relevance and effort

The problem is not that customers refuse to answer questions. It is that they are less willing to answer bad questions, asked at bad times, through weakly matched channels, in systems that feel disconnected from reality.

When the ask is specific, timely, low-friction, and clearly tied to a real experience, people still respond. This helps explain why CES performs so well in the right moment, why loyal customers respond more readily, and why contextual formats outperform more detached outreach so dramatically.

So no, the future is not “keep surveying the old way.” But it is also not “never ask again.” The future is much stricter than either of those. It demands better judgment.

What the data actually suggests

Put Medallia’s report and our dataset side by side, and the real takeaway is not “surveys versus AI.” It is lazy listening versus intentional listening.

Too many brands are still running feedback as a habit: send on schedule, ask everyone, collect the score, circulate the dashboard, move on. That model is losing power fast. The programs that still work look very different. They ask selectively. They preserve context. They treat the channel as part of the experience. They understand that relationship depth changes willingness to respond. And they use feedback as part of a decision system, not as a decorative layer of reporting.

Medallia’s loyalty findings reinforce the same point. If only 22% of consumers feel “very loyal” to any brand and 40% have switched brands recently, this does not suggest that customers have stopped caring about their experience. It suggests that many have stopped believing brands are listening well enough to earn their loyalty.

That is why the real shift is not from surveys to no-surveys. It is from collecting indiscriminately to listening intentionally. Medallia’s own report notes that 78% of practitioners plan to adopt new metrics or approaches, so the appetite for change is clearly real. But the most useful leap is not abandoning direct feedback altogether. It is redesigning how, when, and why it is used.

AI should absolutely be part of that shift. It can help brands decide when to ask, whom to ask, what signals can be inferred passively, and what action should follow. Used well, it can reduce waste, improve orchestration, and make listening systems more intelligent – which is exactly where the inaction gap begins to narrow. 

Yet, replacing weak survey programs with AI-inferred sentiment does not automatically solve that problem. In some cases, it risks making it worse by creating an even more comfortable illusion of understanding while customers quietly disengage. AI can improve orchestration, but AI does not replace relevance or trust. And it definitely does not replace action.

So the real shift is not away from surveys, but away from undisciplined listening.

The bigger takeaway

Medallia’s report confirms that CX has reached a point where old listening habits are becoming harder to defend.

But the most useful conclusion is not that surveys, email, or established CX metrics no longer matter. It is that they work best when used with greater precision, clearer purpose, and stronger follow-through.

The brands that close the perception gap in 2026 will not be the ones that stop asking customers what they think. They will be the ones that become much more disciplined about when they ask, where they ask, what they ask, and what happens next.

That is the real rethink: not the death of surveys, but of lazy surveying.

Want to see the ecommerce data behind these patterns? Read Retently’s 2026 study, based on over 25 million survey invitations across 600 brands, to explore how channel, timing, volume, and customer context shape response behavior in practice.
