User Research

Usability Testing

Information Strategy

BYU-Idaho

Creating BYU-Idaho's AI Chatbot

Working alongside a team of experts to create an AI chatbot that works for internal student support agents.

Quick Overview

My Role

UX Designer / Researcher

Duration

4 Months

The Team

AI Developer, Student Success & Support Team Directors, Leads of Student Support Services

Goal

Create an AI chatbot for both internal and external use that delivers accurate answers and reduces reliance on live agents.

The Challenge

BYU-Idaho’s Support Center had recently launched an AI chatbot for internal use by its support agents. However, both agents and students reported slow responses, irrelevant answers, and limited functionality. The vision was to transform this early attempt into a reliable, student-facing AI chatbot integrated with campus-wide data systems.

Setting the stage

When the BYU-Idaho Support Center first introduced an AI chatbot for internal use, the hope was to make finding information faster and easier for support agents.

Shortly after launch, feedback revealed a different story. Students weren’t getting helpful answers, and agents found the tool frustrating to work with. The technology was there, but the experience was falling short.

BYU-Idaho support center student employees - photo copyright BYU-Idaho

Observing the users

Before making improvements to the chatbot, I needed to understand the core issues. I created a user testing plan with clear goals, a script, and a method for capturing feedback. I asked stakeholders for a quick walkthrough, though at the time I wasn’t entirely sure what to expect.

User Testing & Interview Plan (current state, moderated)

Research Goals:

  • Understand a BSC agent’s general workflow for replying to student emails

  • How does the AI tool fit into a BSC agent’s workflow?

  • Observe: does the tool work the way it was intended? What doesn’t work about it?

  • What would make it a good user experience?

Script

  1. Can you walk me through how you typically respond to student emails, from start to finish? (Understand their baseline workflow.)

  2. How are you currently using the AI email tool in that process? (Where it fits in their workflow.)

  3. What’s working well about the AI tool? (Identify strengths.)

  4. What’s not working or feels frustrating when using the tool? (Identify pain points.)

  5. When the AI gives you a response, how do you check if the information is correct?
    Do you usually trust what it gives you? (Explore how they verify and their confidence in it.)

  6. If you could improve one thing to make the AI tool easier or more helpful, what would it be? (Get actionable UX ideas.)

The process

I spent a day at the heart of the issue, talking directly to the students. In just a few hours, I spoke with nine student employees, learning about the frustrations they faced, the questions they asked most often, and how they felt about using an AI tool for help.

Me conducting user testing & interviews with student employees

What I learned

From my observations, I analyzed and synthesized the data. I found that the interface was not being used in the way it was originally intended.

It was interesting to see how students interacted with the bot compared to its designed purpose. Since I wasn’t part of their day-to-day environment, they were honest and open with their feedback, which ultimately benefited the stakeholders.

View full research presentation 🔗

A few slides from my research presentation: what’s working well and areas of frustration.

The solution

UX suggestions

Several key areas needed improvement, and my role was to propose solutions that balanced user needs with technical feasibility. The most critical challenge was improving the tool’s contextual awareness. Other priorities included ensuring reliable link vetting, improving formatting, and refining the tone of the AI’s responses.

UX suggestions from research report

Before: AI chatbot UX analysis from research

Improvements from feedback

The team took my feedback and quickly got to work. While I credit the UI and interface entirely to them, my UX input helped highlight what was missing and gave them a clearer understanding of user needs. Their updates introduced a contextual input box, a redesigned interface and structure, and an improved user flow.

Results from UX feedback and suggestions

Integrating the ticketing system

Instead of starting with user profile information, the developers went straight to the end goal. Each time an email agent responds, the interaction is documented in a ticket. With the updated AI tool, agents now have access to a ticketing component that quickly summarizes the information and records the ticket automatically.

Additional functionality of the ticketing system includes autofilling and quick submission

Leaps to something bigger

Because the tool proved so powerful and effective for agents, the team began pushing for a student-facing version. This expansion would give both prospective and current students access to a digital chatbot at any time of day. The chatbot would handle the same tasks as phone and email agents, including updating contact information, automating parts of registration, and more.

A conceptual version of the campus-wide AI chatbot

Tracking metrics

Did it make a difference?

Agents responded positively to the changes, noting that the tool was much simpler to use. They appreciated the new context box, which provided clearer direction and more accurate answers.

Within a week, the Student Support Center saw an increase in satisfaction ratings, reflecting the team’s improved timeliness and accuracy in responses.

Customer Service Satisfaction Score

Increased by 10%

One week after AI email bot improvements were implemented

BYU-Idaho support center student employees - photo copyright BYU-Idaho

What I learned

This project reinforced that "context is king". No matter how advanced an AI tool may be, without the right context it will always fall short. I saw firsthand that the difference between frustration and success often came down to how the tool framed questions and guided users.

I also learned the power of small, intentional changes. Adding a simple context box and refining user flows transformed both agent confidence and student satisfaction. These improvements not only increased trust in the chatbot but also gave the team momentum to pursue larger goals, like building a fully student-facing version of the tool.

Lexi Harris

lexiharris.ux@gmail.com
