
Call Quality Monitoring Tool - UX Case Study

  • May 28, 2020
  • 3 min read


Background

I was working in the R&D department of a call-centre company in India, and one of our core projects was to build an in-house application for Call Quality Monitoring. The basic steps in quality monitoring involved:

  • Listening to agents’ call recordings

  • Evaluating their performance by filling in a monitoring form

  • Creating reports for progress tracking

  • Scheduling performance coaching for under-performing agents

  • Tracking and resolving any disputes raised by agents or supervisors over an evaluation

All of these steps were handled in separate applications, so information flowed slowly between teams.

Problem

A one-stop solution for all Call Quality Monitoring and performance-tracking needs

The Tool

Our tool, CQM (a made-up name for confidentiality reasons), was built for the quality team to access call-related resources such as audio and video recordings and customer-related information, fill in a built-in evaluation form for the call, and create reports with the evaluation scores.



Features list

  • Calls Search

  • Form builder

  • Evaluator

  • Call Recording player

  • Workflow for evaluations

  • Reporting

  • Dashboard

  • Scorecard

Since the tool as a whole is wide in terms of features, in this case study I am going to focus on only a few parts of it and how I ended up designing them.





Brainstorming with the team

My Design Process

Roles and responsibilities

My role in the team was UX and UI Designer, and I was responsible for:

  • gathering requirements

  • interviewing stakeholders

  • creating strategy documents

  • conducting competitor analysis

  • building wireframes

  • running usability testing

  • creating interactive prototypes

  • creating the design system

  • preparing design handoffs



Target Users

Our target users included the members of the Operations team - Agents and their supervisors, and the Quality team - Analysts and their managers.

Here is a gist of the personas that were created to understand their roles and goals in the company.

Analysing the users’ work process



Working on this project helped me understand how a call centre works. As the UX designer on the team, I took responsibility for talking to the QAs directly and understanding their day-to-day process and goals within the team.





Forms, forms, forms…!

The development team and I had interacted with the stakeholders and come up with a prioritised list of important things to be done. At the top of the list was converting their long Excel spreadsheet forms into simple ones using a UI built with HTML, CSS and the usual back-end interaction.

There were three aspects to these forms:

  1. A QA filling a form

  2. A manager creating a form

  3. An agent/supervisor viewing a filled form

So, for each of these, we had to deal with the UI, the back-end structure and the scripts mapping the two.


Form view — to fill a form

I built a minimal, clean UI that let the QA relate it back to their Excel form and keep track of scores while filling it in.

The QA has access to the call recording from the player in the footer of the screen while evaluating.


The current excel spreadsheet form

The form UI I designed as an alternative

Build view — to create a form

The thing about these forms is that they aren’t made up of ordinary fields: they are questions and answers with scores attached, so calculations are involved at every level. These calculations differed for each form and even for each section.

The team came up with common form elements and customised question elements with algorithms on how scores were calculated.
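The score arithmetic can be sketched roughly as below. This is an illustration only: the `Question`/`Section` shapes and the weighted-section rule are my assumptions, since each real form and section had its own algorithm.

```typescript
// Illustrative shapes only — the real forms each had their own algorithm.
interface Question {
  text: string;
  score: number;    // points the QA awarded
  maxScore: number; // maximum possible points for this question
}

interface Section {
  name: string;
  weight: number; // section's share of the overall score (weights sum to 1)
  questions: Question[];
}

// Percentage score for one section: points earned over points possible.
function sectionScore(section: Section): number {
  const earned = section.questions.reduce((sum, q) => sum + q.score, 0);
  const possible = section.questions.reduce((sum, q) => sum + q.maxScore, 0);
  return possible === 0 ? 0 : (earned / possible) * 100;
}

// Overall form score: weighted sum of the section percentages.
function formScore(sections: Section[]): number {
  return sections.reduce((sum, s) => sum + s.weight * sectionScore(s), 0);
}
```

With two sections weighted 0.4 and 0.6 scoring 100% and 60% respectively, `formScore` returns 76.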


The Wireframe built for Form creator screen

For technical purposes, we needed the form exported in JSON format, which would later be rendered as an HTML form by another component. So, to let developers test and experiment, we also had a JSON form editor.
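To make the idea concrete, here is a toy version of that JSON-to-HTML path; the `FormDef` schema and its field names are hypothetical, not the actual format the team used.

```typescript
// Hypothetical JSON form definition — not the team's real schema.
interface QuestionDef {
  id: string;                                   // field name in the rendered form
  label: string;                                // question text
  options: { label: string; points: number }[]; // scored answer choices
}

interface FormDef {
  title: string;
  questions: QuestionDef[];
}

// Minimal renderer: turns the JSON definition into an HTML fragment.
// The real component also handled layout, validation and score wiring.
function renderForm(form: FormDef): string {
  const body = form.questions
    .map((q) => {
      const inputs = q.options
        .map(
          (o) =>
            `<label><input type="radio" name="${q.id}" value="${o.points}"> ${o.label}</label>`
        )
        .join("");
      return `<fieldset><legend>${q.label}</legend>${inputs}</fieldset>`;
    })
    .join("");
  return `<form><h2>${form.title}</h2>${body}</form>`;
}
```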


Response view — to view a filled form

This view had to be built in a scorecard format showing the filled form, with options to export it as a PDF or an Excel spreadsheet. It also had to show the scores section-wise and for the whole form.



Evaluation Workflow

Each evaluation passes through various users in the following sequence before it is marked as complete. Each user has a role to play, and the following diagram shows a sample workflow.


Evaluation workflow

To bring this long workflow into the application, the team and I came up with role-based “Action” buttons, modelled on the stakeholders’ description of their actual workflow. Previously, this acknowledgement process happened over email, and important comments were either lost or left unanswered.
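The role-based “Action” buttons can be modelled as a small state machine: each role sees only the transitions available from the evaluation’s current state. The states, roles and actions below are a simplified sketch of the sample workflow, not the exact one we shipped.

```typescript
// Simplified sketch — the real workflow had more roles and states.
type State = "Draft" | "Submitted" | "Acknowledged" | "Disputed" | "Resolved";
type Role = "QA" | "Agent" | "QAManager";

interface Transition {
  from: State;
  role: Role;
  action: string; // label shown on the "Action" button
  to: State;
}

const transitions: Transition[] = [
  { from: "Draft", role: "QA", action: "Submit", to: "Submitted" },
  { from: "Submitted", role: "Agent", action: "Acknowledge", to: "Acknowledged" },
  { from: "Submitted", role: "Agent", action: "Dispute", to: "Disputed" },
  { from: "Disputed", role: "QAManager", action: "Resolve", to: "Resolved" },
];

// The "Action" buttons a user sees are just the transitions
// open to their role from the evaluation's current state.
function availableActions(state: State, role: Role): string[] {
  return transitions
    .filter((t) => t.from === state && t.role === role)
    .map((t) => t.action);
}

// Applying an action moves the evaluation to its next state, or fails loudly.
function apply(state: State, role: Role, action: string): State {
  const match = transitions.find(
    (t) => t.from === state && t.role === role && t.action === action
  );
  if (!match) throw new Error(`${role} cannot "${action}" from ${state}`);
  return match.to;
}
```

Keeping the transitions in one table means the UI, the notifications and the audit trail can all be driven from the same definition.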


Flow of the Evaluation through various users and "Actions"

This is how the pop up would look on an evaluation form screen:


We even added an alerts system for which I designed the notifications listing screen.


Usability testing

After each feature release, we ran a user-testing session with our key users. I was responsible for taking notes of the session, from which I created pain-point charts and conducted internal sessions to come up with solutions. These solutions then went on the board to be prioritised for the next release.



Observations from Usability testing sessions
Plan for bug fixes and feature additions for the next release

Takeaways

While creating a solution for users who have been accustomed to certain tools for a long time, it is necessary to introduce the change in small parts so as not to overwhelm them.

As designers, we need to remember that it is important to make applications that are both usable and learnable.

 
 

© 2025 by Ramya Ravishankar.