Increasing User Feedback by 30%

Monthly Iterations from User-Driven Competency Analysis

My Roles: UX Researcher, UX Designer, Content Developer, Internal Training Facilitator, Technical Writer

Background

VIPKid is an EdTech company that provides private one-on-one instruction for English as a Second Language (ESL) students.

Teachers undergo a rigorous onboarding process that includes preparing for and passing a mock lesson, in which their teaching is graded against a rubric. To help teacher applicants prepare to meet the rubric standards, they are encouraged to attend a free training session called the Mock Class Prep Course.

Project Details:

Client: VIPKid International

Duration: 1 year+, monthly iterations

Methods: Surveys, Data Analysis, Usability Testing, Revision, Beta Testing, Communication, Implementation

Tools: Google Suite, Surveys, Canva, Zoom

Project Overview

The onboarding Mock Prep Course was designed with zero user input

The Problems

The Mock Prep team had insufficient customer feedback

Customer concerns were being ignored

The Iterative Process

As a co-lead of the content development team, I noticed a lack of user input informing our designs. We set out to improve the experience of teacher applicants as they made their way through the onboarding process. We:

  • entered the design cycle by advocating for increased user feedback

  • analyzed feedback and implemented user-driven improvements

  • adapted when beta testing revealed unforeseen usability problems

  • iterated on the entire process every 4-6 weeks, ultimately increasing user feedback and improving usability


Surveys

We were designing in the dark, so we began collecting feedback to drive changes and create a more user-centric training session.

Our survey included both qualitative and quantitative components. The feedback revealed which sections were confusing, which skills were not sticking, and when a facilitator needed more training (an added cost to the company).

Each month we collected between 400 and 500 survey responses.

User Quotes Collected


Data Analysis

Creating graphs each month helped us quickly identify problem areas and decide what to target and revise.

For a period of time, we noticed a spike in confusion around the phonics skill. We took everything into account: where in the training the skill was covered, how much time was spent teaching it, whether attendees got hands-on practice with it, and what content was presented on the slide or conveyed in the speaker notes.
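As an illustration, the monthly roll-up can be sketched in a few lines. This is a hypothetical reconstruction, not our actual tooling (our graphs lived in Google Suite): it assumes the survey responses are exported to a CSV with made-up month, skill, and understood columns, and it charts each skill's monthly understanding rate so dips stand out.

    # Hypothetical sketch of the monthly roll-up; assumes responses are
    # exported to survey_responses.csv with columns: month, skill, understood (0/1).
    import pandas as pd
    import matplotlib.pyplot as plt

    responses = pd.read_csv("survey_responses.csv")

    # Share of respondents who reported understanding each skill, per month.
    rates = (
        responses.groupby(["month", "skill"])["understood"]
        .mean()
        .unstack("skill")
    )

    # One line per skill; a dip flags a problem area to target next iteration.
    ax = rates.plot(marker="o", title="Skill understanding by month")
    ax.set_ylabel("Share who understood the skill")
    plt.show()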


Revision and Beta Testing

During one iteration, we created the catchphrase “The Big Three Goal” to address three skills applicants were not remembering.

As co-leads of the content development team, we ran beta tests using the newly updated training decks. When the beta tests showed improvement, we had the assurance we needed to roll the new training out to the whole team. When the changes didn't improve the experience, we went back to the drawing board.

Revised slide that addressed three problem areas for applicants


Implementation

Each iteration, we created a newsletter to communicate the changes to facilitators. We included statistics for key skills (e.g., 58 applicants were confused about the extension skill vs. 252 who understood it).

To keep increasing user feedback, we asked facilitators to display the survey on their screens before starting the 10-minute Q&A session. Applicants could scan the QR code or copy the feedback link the facilitator had placed in the chatbox, and after the Q&A, facilitators gave them time to complete the feedback form before the session ended. This initiative quickly increased our user feedback.

Monthly Newsletter Sample

Feedback request


Continued Iterations

We continued this iterative process every 4-6 weeks for a year. During that period, the onboarding rubric changed once, and we had to re-prioritize the skills covered in the course. Ultimately, we increased understanding in almost all of the problem areas, including an 8% gain over five months in the least-understood skill, Synthetic Phonics.


Summary

The Problems and Solutions

This project is my origin story! It's how I discovered a passion for research-based, user-friendly design, and where I learned what UX is by stumbling upon it as a curious content developer who wanted to make sure our products were hitting the mark. I taught myself methods to collect and compile data, ask the right questions for the information I was gathering, analyze feedback, and make user-centric products. This iterative process increased our user feedback by 30%.

Results

The best feature of this system was that it kept working on its own: it let us consistently gather and analyze data every month, rigorously test our changes against that data, and implement user-centric improvements continuously.

Process
