Turning Support Friction Into an AI-Powered System

UX RESEARCHER

problem statement

Monorail needed a better way to handle customer support as volume increased. Too many conversations were being closed without a real response, routine questions were eating up human time, and the team needed a more scalable way to help users without making the experience feel cold or automated.


Objectives

  • Understand the most common reasons users contacted support
  • Identify where conversations were being missed, delayed, or closed without resolution
  • Determine which issues were appropriate for AI handling versus human escalation
  • Improve support coverage without sacrificing trust or clarity
  • Create a scalable support model that could improve over time

Goals

  • Increase AI support coverage across customer conversations
  • Reduce “closed with no reply” conversations
  • Improve automated resolution rate
  • Keep human support focused on higher-complexity issues
  • Create a support system that felt faster, clearer, and more reliable for users

my process


From Complexity to Clarity

Map the Conversation Patterns

I reviewed recurring customer questions, unresolved conversations, no-reply closures, and escalation patterns to understand where the support experience was breaking down.

Structure the AI Support System

I helped organize support content, chatbot logic, routing paths, and knowledge base improvements so AI could handle common issues more consistently.

Measure and Iterate

I tracked AI coverage, resolution rate, human involvement, support volume, and failure points to identify where the system needed refinement.


Primary Product Users

The primary users were Monorail customers who needed help with account setup, deposits, portfolio activation, brokerage connections, ACATS transfers, app navigation, and general product questions.


Many of these users were dealing with financial tasks that required trust and clarity. They were not just looking for fast answers. They needed answers that felt accurate, calm, and reliable.



The internal users were also important: customer support, product, operations, and leadership needed better visibility into what customers were asking and where the support system was failing.

Qualitative Research & Ideation

We analyzed customer conversations, support patterns, unresolved issues, Intercom workflows, knowledge base gaps, and AI behavior to understand how the support experience could become faster and more reliable.

Key Observations

  1. Many customer questions were repetitive and could be answered with better-structured AI support.
  2. Some conversations were being closed without a meaningful reply, creating a poor user experience.
  3. Support volume was increasing, but the team did not need every issue handled manually.
  4. AI could help with common questions, but edge cases needed clearer escalation paths.
  5. Knowledge base quality directly affected how well the AI could respond.

Inferences

  1. The support system needed better structure before AI could perform well.
  2. AI should not replace human support completely. It should route work more intelligently.
  3. The highest-value opportunity was separating routine issues from complex cases.
  4. Better support visibility could also reveal product friction and onboarding problems.
  5. Continuous monitoring was necessary because AI performance changes as volume and complexity increase.

research findings


Defining User Needs

Fast First Response

Users needed quick acknowledgment and guidance instead of waiting or being ignored.

Clear Resolution Paths

Users needed to know whether their issue could be solved immediately or required human follow-up.

Trustworthy Answers

Users needed responses that were accurate, clear, and appropriate for a financial product.

Human Escalation

Users needed a path to real support when the issue was too specific, sensitive, or complex for AI.

Recommendations Turned Into Product Decisions

To support the research findings, the team shaped AI support around coverage, clarity, escalation, and continuous improvement.

AI First-Touch Coverage


AI was introduced as the first layer of support so users could receive immediate guidance instead of waiting for manual triage.

Knowledge Base Optimization


Support content was rewritten and organized to help AI answer common questions more accurately.

Conversation Routing


The system separated routine questions from issues that required human review, reducing unnecessary manual support.
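As a rough illustration of this routing idea, the sketch below classifies a conversation as AI-handled or human-escalated. The topic labels, the split between routine and complex topics, and the function names are all hypothetical; the real system was configured in Intercom workflows, not custom code.

```python
# Hypothetical topic labels for illustration only; the actual routing
# rules lived in Intercom, and this split is an assumption.
ROUTINE_TOPICS = {"account_setup", "deposits", "app_navigation", "general"}
COMPLEX_TOPICS = {"acats_transfer", "brokerage_connection"}

def route_conversation(topic: str, sentiment: str = "neutral") -> str:
    """Return 'ai' for routine questions and 'human' for complex or
    sensitive ones, so no conversation dead-ends in automation."""
    if topic in COMPLEX_TOPICS or sentiment == "frustrated":
        return "human"
    if topic in ROUTINE_TOPICS:
        return "ai"
    # Unknown topics default to human review rather than risking a
    # confidently wrong automated answer on a financial product.
    return "human"
```

The key design choice this mirrors is the default: when the system is unsure, it escalates instead of automating, which protects trust at the cost of some coverage.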

Escalation Paths


More complex cases were routed toward human support instead of forcing users through dead-end automation.

Performance Tracking


AI coverage, resolution rate, human involvement, and no-reply closures were monitored to evaluate system health.
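The four metrics above can be sketched as simple ratios over conversation counts. This is a minimal illustration, not the team's actual reporting code; the field names are assumptions, and "resolution rate" here uses one plausible definition (AI-resolved as a share of AI-touched conversations).

```python
from dataclasses import dataclass

@dataclass
class SupportStats:
    total: int            # all customer conversations in the period
    ai_touched: int       # conversations the AI responded to
    ai_resolved: int      # conversations the AI closed without a human
    human_involved: int   # conversations a human joined
    closed_no_reply: int  # conversations closed with no meaningful reply

def health_metrics(s: SupportStats) -> dict:
    """Compute the system-health ratios named in the case study."""
    return {
        "ai_coverage": s.ai_touched / s.total,
        "resolution_rate": (s.ai_resolved / s.ai_touched) if s.ai_touched else 0.0,
        "human_involvement": s.human_involved / s.total,
        "no_reply_rate": s.closed_no_reply / s.total,
    }
```

Tracking these as ratios rather than raw counts keeps the picture honest when volume grows: a rising conversation count with a flat human-involvement ratio is exactly the scaling behavior the team was aiming for.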

Ongoing Iteration


Conversation failures and unresolved patterns were reviewed so the system could improve over time.

outcome

The work helped turn fragmented customer support into a more scalable AI-assisted system.


AI support coverage increased from 0% to over 90%, meaning Fin (Intercom's AI agent) went from having no meaningful role to touching nearly every customer conversation.


“Closed with no reply” dropped from 25% to 1%, a 96% reduction in customers falling through the cracks.


Fin’s resolution rate improved from 42% to 66%, a 24-percentage-point lift in automated resolution.



At the same time, support volume increased from 266 to 323 conversations, a 21% increase, while human involvement stayed around 10%. That allowed the team to keep humans focused on higher-complexity issues instead of routine support.


The result was a faster, more visible, and more scalable support experience that improved both customer coverage and internal efficiency.

Let’s talk about your project

Fill in the form, call (315) 530-5269, or email me at gregorylifanov@gmail.com to set up a meeting.