Synkora

Chatbot

An AI customer support chatbot with high escalation rates and frustrated users. The problem wasn't the AI — it was the conversation design underneath it.

Client

Chatbot

Industry

SaaS / AI

Services

UX Design, Conversation Design, User Research

Duration

7 weeks

AI chatbot interface — the product that needed its conversation flows completely redesigned
43% Reduction in escalation rate within six weeks of launch
200+ Conversation logs analysed to identify drop-off patterns
9 Conversation flow patterns fully redesigned
Before & after the conversation redesign
Before
  • Generic opening message that asked users to "describe their issue" with no guidance
  • Responses that matched keyword patterns but misread intent
  • Escalation triggered abruptly — users suddenly transferred with no transition
  • Error messages that increased frustration without offering a path forward
  • No acknowledgement that the bot had failed before handing over
After
  • Structured opening with guided options for common issues
  • Plain language responses written around user intent, not keyword matching
  • Warm escalation: bot acknowledges the limit, sets expectations, hands over with context
  • Recovery flows for mismatched responses that offer a graceful reset
  • Human agents receiving clearer context and less frustrated users
Conversation log analysis — mapping drop-off points and escalation triggers across 200 real support conversations
Flow redesign session — mapping new conversation paths with improved escalation transitions

Left: Log analysis identifying the nine conversation patterns where escalation consistently occurred. Right: Flow redesign mapping cleaner paths and warmer handoff transitions.

The challenge

The client had built an AI chatbot to handle first-line customer support but was seeing escalation rates that were eroding the cost and efficiency gains it was supposed to deliver. Users were abandoning conversations mid-flow or requesting human agents almost immediately. Satisfaction scores for the chatbot channel were consistently below the rest of support.

The assumption had been that the AI needed better training data. The real problem turned out to be the conversation design framing the AI's responses.

What we found

We analysed more than 200 conversation logs to understand where and why users were escalating or abandoning. The patterns were consistent across nine conversation types. The chatbot's opening message offered no structured entry point: users had to describe their issue in open text, then received responses that matched surface keywords rather than actual intent. When the bot couldn't help, the handoff to a human agent was abrupt — no acknowledgement of the failure, no transition, just a sudden transfer.
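To illustrate the failure mode, here is a minimal sketch of keyword-based routing of the kind described above. This is not the client's code — the function name and flow labels are hypothetical — but it shows how matching surface keywords can route a user to the opposite of what they asked for.

```python
def keyword_route(message: str) -> str:
    """Naive keyword routing (illustrative): matches surface
    keywords with no awareness of what the user actually wants."""
    text = message.lower()
    if "cancel" in text:
        return "cancellation_flow"
    if "refund" in text:
        return "refund_flow"
    return "fallback"

# The user explicitly does NOT want to cancel — they want to pause.
# Keyword matching still fires on "cancel" and routes them to the
# cancellation flow, the opposite of their intent.
route = keyword_route("I don't want to cancel — can I pause my plan instead?")
print(route)  # cancellation_flow
```

A misroute like this is exactly the moment users described as "the bot didn't understand what I said" — the trigger for abandonment or an immediate request for a human.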

Users weren't frustrated by the limits of AI. They were frustrated by the absence of conversational awareness — responses that didn't seem to understand what they'd actually said, and a system that couldn't gracefully acknowledge when it had reached its limits.

"The AI was capable enough. The conversation design wasn't. Users were escalating because the bot couldn't acknowledge its own limits — let alone recover from them."

What we built

We redesigned the conversation flows for all nine identified patterns. The opening interaction was restructured to offer guided entry points for common support categories, reducing open-text ambiguity without removing flexibility. Response templates were rewritten using plain language principles, structured around user intent rather than keyword matching. The escalation experience was rebuilt as a warm handoff: the bot acknowledges what it couldn't resolve, sets expectations for the human conversation to follow, and passes context so users don't have to repeat themselves.
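The warm-handoff pattern can be sketched as a small function: acknowledge the limit, set expectations, and pass context forward so the user never repeats themselves. This is an illustrative model, not the client's implementation — the data structure and field names are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Conversation:
    """Running state of one support chat (hypothetical model)."""
    topic: str
    transcript: list = field(default_factory=list)
    bot_attempts: int = 0

def warm_handoff(convo: Conversation, agent_label: str) -> dict:
    """Build the two halves of a warm escalation:
    1. a user-facing message that acknowledges the bot's limit
       and sets expectations for the human conversation, and
    2. a context packet handed to the agent so the user does not
       have to start over."""
    user_message = (
        f"I haven't been able to resolve your {convo.topic} issue, "
        f"and I don't want to send you in circles. I'm connecting you "
        f"with {agent_label}, who can see our conversation so far — "
        f"you won't need to repeat yourself."
    )
    agent_context = {
        "topic": convo.topic,
        "bot_attempts": convo.bot_attempts,
        "transcript": convo.transcript,
    }
    return {"user_message": user_message, "agent_context": agent_context}

# Example: two failed bot attempts on a billing question before handoff.
convo = Conversation(
    topic="billing",
    transcript=["user: my invoice doubled", "bot: here is our pricing page"],
    bot_attempts=2,
)
result = warm_handoff(convo, "a billing specialist")
print(result["user_message"])
```

The design choice worth noting is that the message and the context packet are produced together: the acknowledgement to the user and the briefing to the agent are two views of the same handoff, which is what makes the transition feel continuous rather than abrupt.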

Six weeks after launch, escalation rates had fallen by 43%. The support team reported receiving better-qualified escalations — clearer context, less frustration, and users who had genuinely engaged with the chatbot rather than immediately bypassed it.

Redesigned chatbot interface — structured entry points, plain language responses, and warm escalation transitions

The redesigned conversation experience: guided entry, plain language throughout, and a warm handoff when the bot reaches its limits. Escalation rate fell 43% within six weeks.
