AI-Powered Student Request Routing

E-Wolf Support is an AI-augmented request-routing platform designed to serve FRCC’s 30,000+ students with faster, more accurate support. After analyzing the failures of the previous AI chatbot—which frequently surfaced incorrect deadlines and policy information—I designed a human-in-the-loop model where AI classifies inquiries and drafts responses while staff verify all outbound communication.

This workflow maintains 100% accuracy for institutional information, reduces manual sorting time by 30–40%, and saves staff an estimated 5–6 hours per week that were previously spent triaging misrouted requests. For students, it delivers clearer routing and more trustworthy guidance; for FRCC, it reduces operational overhead, improves compliance, and creates a scalable foundation for future AI-enabled services.

A Stanford University–Led Research Project, Funded by FRCC

Timeline

July '25 – Dec '25

Role

Product Designer
Led end-to-end product strategy, research, IA, interaction design, prototyping, and testing—while also designing and configuring the LLM logic that powers the system’s AI-assisted classification, drafting, and human-in-the-loop workflows.

The Challenge

Fragmented Contact System

Students faced a complex contact page with multiple department options. Unable to determine the correct recipient, most students defaulted to sending all requests to E-Wolf, regardless of the actual department responsible.

Users faced FRCC's contact page with multiple department options.

Unreliable AI Chatbot

FRCC’s new AI chatbot routinely surfaced incorrect or incomplete answers on critical topics like deadlines, payments, and financial aid. With no human verification layer, students couldn’t trust the guidance they received—leading to misinformation risks and confusion.

Because of this, the chatbot did not reduce support workload at all. Students continued calling and emailing for clarification, leaving staff with the same volume of requests the bot was meant to relieve.

Legacy chatbot responses were repetitive and unreliable for sensitive topics like tuition—highlighting the need for a human-verified support system.

Manual Routing

E-Wolf staff spent 5–6 hours per week manually categorizing and routing student requests to the appropriate departments.

A review of 1,000+ student support submissions revealed frequent misrouting, unclear categories, and heavy manual triage. This dataset became the core training source for E-Wolf’s AI-assisted routing model.

High Level Goal

How might I design an AI-augmented support system that reduces manual categorization workload for E-Wolf staff and improves response clarity and routing accuracy for students—while maintaining the human oversight, professional judgment, and empathy essential to effective student support?

Research & Discovery

Understanding the needs of both students and E-Wolf staff through data analysis and user interviews.

Chatbot Evaluation

Tested FRCC's existing AI chatbot to document accuracy issues with critical information.

User Interviews

Conducted interviews with E-Wolf staff and students about pain points.

Workflow Mapping

Mapped current support workflows to identify inefficiencies.

Data Analysis

Analyzed Formstack submission data to identify patterns and common request types.

Key Findings

Chatbot Analysis

Existing chatbot provided outdated financial aid deadlines during testing

Gave inconsistent answers to the same question asked differently

Used a confident tone even when wrong—no way for students to verify accuracy

At institutional level, AI errors are unacceptable for critical information

Student Perspective

Students want immediate acknowledgment of their requests

Prefer clear, contextual responses over generic replies

Need to know their request reached the right department

Staff Perspective

Need to maintain oversight over AI-generated responses

Want AI to handle routine categorization while they focus on complex cases

Require ability to refine and personalize AI suggestions

Early Design Work

The solution must create a collaborative relationship between AI and humans, where AI handles pattern recognition and initial categorization while staff provide the nuance, empathy, and contextual understanding that students need.

Defining Structure

With insights in hand, I sketched an updated information structure and user flows. The structure was organized around two axes: user-facing portals and AI-powered backend processes.

Staff Dashboard

Request Queue Management

AI Suggestion Review

Response Editing Interface

Performance Analytics

Student Portal

Request Submission Form

AI-Generated Response View

Status Tracking

Department Routing Display

AI Processing Layer

Natural Language Processing: Analyzes student request text for intent, keywords, and context

Department Classification: Maps requests to 8 FRCC departments with confidence scoring 

Response Generation: Creates contextual draft responses for staff review
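The classification step above can be sketched as a toy example. This is an illustrative stand-in, not the production model: the real system uses LLM-based intent analysis, while this sketch scores hypothetical keyword lexicons to show how a department prediction and confidence score could be paired together.

```python
from dataclasses import dataclass

# Hypothetical keyword lexicons for illustration only;
# the production system uses LLM-based classification.
DEPARTMENT_KEYWORDS = {
    "Financial Aid": {"fafsa", "grant", "scholarship", "aid"},
    "Registration": {"enroll", "register", "drop", "withdraw"},
    "Academic Support": {"tutoring", "advisor", "study"},
}

@dataclass
class Classification:
    department: str
    confidence: float  # 0.0-1.0, share of keyword matches won by this department

def classify_request(text: str) -> Classification:
    """Score each department by keyword overlap and return the best match."""
    tokens = set(text.lower().split())
    scores = {
        dept: len(tokens & keywords)
        for dept, keywords in DEPARTMENT_KEYWORDS.items()
    }
    best = max(scores, key=scores.get)
    total = sum(scores.values())
    confidence = scores[best] / total if total else 0.0
    return Classification(best, confidence)
```

Low-confidence results (for example, when no lexicon matches) would be flagged for manual triage rather than auto-routed.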

The Solution

Design Philosophy: Augmentation, Not Automation

Unlike traditional automation that replaces human workers, this system is designed to augment staff capabilities. The AI handles the repetitive pattern recognition and initial categorization, freeing staff to focus on what humans do best: providing empathy, nuanced judgment, and personalized support.

Try E-Wolf AI Support Interface here

This creates a partnership model where AI and humans learn from each other—AI improves through staff corrections, while staff benefit from AI's ability to process patterns at scale.

A two-part system that combines AI intelligence with human oversight to create better outcomes for everyone.

Smart Request Submission (Student View)

On the student submission side, AI analyzes the user’s issue description and suggests relevant departments (e.g., Academic Support, Financial Aid, Registration). This lowers cognitive load for students who may not know which office to contact, while giving staff clean, pre-structured tickets for faster handling. Students can still override suggestions, ensuring transparency and control.

AI-enhanced request form that predicts the correct department and reduces misrouted tickets.

Staff Dashboard & Core Features

The staff-facing dashboard centralizes all student requests and layers AI assistance on top of existing workflows. The interface provides real-time visibility into workload, routing precision, and departmental trends—transforming a previously fragmented support process into a unified, data-informed operations hub.

Staff dashboard enabling AI-assisted triage, routing, and response review with full human oversight.

Key capabilities include:

1. Unified Request Management

Staff can view all pending, in-progress, and completed requests in one place, supported by search, filtering, and clear status indicators. This consolidates email-based workflows and provides a single operational view for all student support activity.

A dedicated AI Accuracy metric on the dashboard tracks model performance in real time, allowing staff to understand how reliably the system classifies requests and where manual oversight may be needed.

A high-level operational overview displays request volume, progress states, and AI precision at a glance.
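The AI Accuracy metric described above can be derived from staff behavior itself: every time staff confirm or override a predicted department, that outcome becomes a data point. A minimal sketch, assuming a hypothetical `ai_accuracy` helper fed pairs of (AI prediction, staff-confirmed department):

```python
def ai_accuracy(routed: list[tuple[str, str]]) -> float:
    """Share of requests where the staff-confirmed department matched
    the AI's prediction. Each tuple is (ai_prediction, final_department)."""
    if not routed:
        return 0.0
    correct = sum(1 for predicted, final in routed if predicted == final)
    return round(correct / len(routed), 2)
```

Because the metric is computed from staff corrections, it doubles as a feedback signal: departments with low accuracy indicate where the classifier needs retraining or where category definitions are ambiguous.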

2. AI-Powered Categorization, Routing & AI-Assisted Response Drafting

Every incoming request is analyzed by the model and assigned a predicted department with a confidence score. This dramatically reduces manual triage time and improves routing consistency across 13+ service areas.

For each request, the system generates a draft response aligned with institutional policies and tone. Staff can edit, refine, or regenerate the message before sending—maintaining 100% human oversight while accelerating turnaround times.

AI-assisted response panel provides auto-generated, high-accuracy drafts that staff can review and personalize before sending.
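The "100% human oversight" guarantee is easiest to see as a state machine: an AI draft can be edited, approved, or regenerated, but it cannot reach a student without passing through staff approval. A minimal sketch of that invariant, with hypothetical class and method names:

```python
from enum import Enum, auto

class DraftStatus(Enum):
    PENDING_REVIEW = auto()
    APPROVED = auto()
    SENT = auto()

class DraftResponse:
    """AI-generated draft that can only reach a student after staff approval."""

    def __init__(self, request_id: str, draft_text: str):
        self.request_id = request_id
        self.text = draft_text
        self.status = DraftStatus.PENDING_REVIEW

    def edit(self, new_text: str) -> None:
        # Any staff edit returns the draft to the review state.
        self.text = new_text
        self.status = DraftStatus.PENDING_REVIEW

    def approve(self) -> None:
        self.status = DraftStatus.APPROVED

    def send(self) -> str:
        # Enforces the oversight rule: unapproved drafts cannot be sent.
        if self.status is not DraftStatus.APPROVED:
            raise PermissionError("Draft must be staff-approved before sending")
        self.status = DraftStatus.SENT
        return self.text
```

Encoding the rule in the send path, rather than in UI conventions, means no workflow shortcut can bypass human review.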

3. Departmental Analytics & Operational Insights

The analytics view highlights request load, distribution, and throughput across all departments. Donut charts and stacked bar graphs make it easy to identify bottlenecks, spike periods, and under-resourced areas.
These insights enable data-driven staffing decisions and help leadership anticipate student needs more proactively.

Real-time analytics summarizing request distribution and resolution trends across 13+ service areas.

Testing Results & Projected Impact

Usability Testing Metrics

5-6h

Projected Weekly Time Savings

Based on current manual routing workload




100%

Human Oversight Integration

All AI suggestions require staff approval


8

Department Coverage

Comprehensive routing across FRCC services


91%

AI Categorization Accuracy

Validated across 300+ test scenarios


Testing Methodology

Sample Size: Prototype tested with 300+ authentic student request scenarios sourced from historical E-Wolf support data

Evaluation Criteria: AI routing accuracy, response completeness, confidence score reliability, and staff workflow integration

Stakeholder Validation: System design reviewed and approved by E-Wolf staff members who will utilize the platform upon implementation

Benefits for Students

Instant Initial Response: AI provides immediate acknowledgment and relevant information while staff prepare personalized follow-up

Accurate Department Routing: 91% accuracy means requests reach the right expert the first time

No More Request Ping-Pong: Students don't get bounced between departments trying to find help

24/7 Submission: Submit requests anytime; AI categorizes and queues for staff review

Benefits for E-Wolf Staff

5-6 Hours Saved Weekly: AI handles initial categorization, freeing staff for complex cases

AI-Generated Draft Responses: Start with a smart draft, then add personal touch and department-specific details

Transparency & Control: See AI confidence scores, override suggestions, and maintain full oversight

Focus on High-Impact Work: Spend time on empathy, nuance, and complex student situations

Personal Takeaway

Scaling Impact by Simplifying Complexity

The most meaningful insight from this work was seeing how much cognitive load and operational inefficiency came from unclear routing and inconsistent messaging. By simplifying the flow, tightening classifications, and designing AI-assisted tools that actually support staff, I saw how UX can unlock organizational efficiency—not by adding more technology, but by ensuring it’s used thoughtfully. This project strengthened my conviction that the best UX outcomes happen when technology, process, and human judgment work together, not in isolation.