How our customer effort score survey revealed hidden friction points

We finally ran a proper CES survey last month and the results were eye-opening.

Turns out users were struggling with parts of our onboarding we thought were smooth. The payment flow had way more drop-off than we realized.

Anyone else had similar surprises when they actually measured effort scores?

Payment flows always seem to surprise. We had similar issues with a checkout page we were sure was fine.

Payment flows are brutal for this stuff. I ran CES on a subscription app and found users were bailing because our “quick checkout” still asked for the billing address twice: once for payment and once for receipts.

We thought the address autofill was helping, but people saw it as extra work. We removed the receipt address field and effort scores dropped by 30%.
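
If it helps, here's a rough sketch of what the fix amounted to. This is TypeScript with made-up field names for illustration, not our actual code: keep a single address as the source of truth and only diverge when the user explicitly supplies a different receipt address.

```typescript
// Hypothetical checkout payload: the receipt reuses the billing address
// instead of asking the user to type it a second time.
interface Address {
  line1: string;
  city: string;
  postalCode: string;
  country: string;
}

interface CheckoutForm {
  cardToken: string;
  billingAddress: Address;
  // Optional override; when omitted, receipts fall back to the billing address.
  receiptAddress?: Address;
}

// One source of truth: only diverge if the user explicitly provides a different address.
function receiptAddressFor(form: CheckoutForm): Address {
  return form.receiptAddress ?? form.billingAddress;
}
```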

The scary part is how long we ran ads to a broken funnel. We could have saved thousands in wasted spend if we had measured this earlier.

What specific part of your payment flow showed the highest effort scores? Sometimes it’s not the obvious stuff like card entry.
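
If you tag each CES response with the step it was shown after, a quick aggregation makes the worst step obvious. A minimal sketch in TypeScript, with all names hypothetical and assuming a 1–7 scale where higher means more effort:

```typescript
// Hypothetical CES responses tagged with the funnel step they were shown after.
interface CesResponse {
  step: string;  // e.g. "card-entry", "billing-address", "3ds-challenge"
  score: number; // 1 (very low effort) .. 7 (very high effort)
}

// Average effort per step, sorted so the worst offenders come first.
function effortByStep(responses: CesResponse[]): Array<[string, number]> {
  const totals = new Map<string, { sum: number; count: number }>();
  for (const { step, score } of responses) {
    const t = totals.get(step) ?? { sum: 0, count: 0 };
    t.sum += score;
    t.count += 1;
    totals.set(step, t);
  }
  return [...totals.entries()]
    .map(([step, { sum, count }]): [string, number] => [step, sum / count])
    .sort((a, b) => b[1] - a[1]); // highest average effort first
}
```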

This happens often. We assume users understand our flows when they're actually struggling. I've seen a ‘simple’ signup turn out to be a daunting seven steps once users count every field and confirmation screen. Features that took months to build can confuse instead of help. The gap between what we perceive and what users actually experience is huge, and CES surveys surface those blind spots quickly.

Yeah we missed obvious stuff too until we asked users directly.

CES surveys often surface issues that internal testing misses, simply because you know your own product too well.

In one case, users rated account creation as high effort even though we thought it was just two steps. It turned out the email verification button looked disabled on some devices, so people weren't sure they could click it.

After fixing the button styling, effort scores improved right away. Now I run these surveys every few months to catch new friction points.
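
To make the verification-button issue concrete, the bug was roughly this shape. A hypothetical TypeScript sketch, not the real code: the greyed-out palette was applied regardless of the button's real state, so tying the styling to the actual disabled flag was the whole fix.

```typescript
// Hypothetical sketch of the fix: the "verify email" button's greyed-out look
// was hard-coded, so on some devices it read as disabled even when it was active.
function styleVerifyButton(button: HTMLButtonElement): void {
  const inactive = button.disabled;
  // Tie the visual state to the real disabled state instead of always
  // rendering the grey "inactive" palette.
  button.style.backgroundColor = inactive ? "#d1d5db" : "#2563eb";
  button.style.color = inactive ? "#6b7280" : "#ffffff";
  button.style.cursor = inactive ? "not-allowed" : "pointer";
}
```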