Case Study

Improving our service with candidate surveys

We would like to share how we have improved our service through surveys and usability tests (UT).


Summary in 3 lines

  1. The ViewinterHR team has significantly improved its AI interview service through surveys and usability tests, making the service more convenient and engaging for candidates.
  2. Introducing AI humans, guided by candidates' feedback, led to a more positive perception of AI interviews and noticeably higher engagement.
  3. Our users' feedback continues to guide the ViewinterHR team's commitment to continuous, user-centered improvement of service quality.

 

Hello. This is the ViewinterHR team.

 

Today, we would like to share how we have improved our service through surveys and usability tests (UT).

 

ViewinterHR is a service we launched as a minimum viable product (MVP) in October 2019, designed to streamline hiring by using AI to conduct initial interviews. Since its launch, the service has grown to over 180,000 candidates and over 3 million interview videos. However, it had been growing increasingly complex without specific quantitative or qualitative feedback from the candidates actually taking AI interviews.

 

ViewinterHR had been adding features to meet corporate requirements, and the UI/UX built for the initial MVP no longer served current users well and needed to be improved.

* UI/UX: UI (User Interface) refers to everything on the user's screen; UX (User Experience) refers to the user's entire experience with the product.

 

We set out to improve the service in the following steps:

  1. Get to know our users through surveys.
    1. Who are our candidates?
    2. What do they think about AI interviews?
  2. Run usability tests (UT) to see whether our changes improved the candidate experience.
  3. Review the results of the post-redesign survey.

 

 

---

 

 

1. Get to know our users through surveys

 

ViewinterHR's internal data included system logs (recording errors, interview completion, etc.), but no direct user information (demographics, impressions after taking an AI interview, etc.). To fill this gap, we first ran a three-month survey, which received 1,586 responses.

 

1-1 Who are our candidates?

 

"Job seekers in their 20s"

 

Of the respondents, 42.5% were between the ages of 26 and 30, and 35.5% were between 19 and 25: about 80% were in their 20s. This demographic data is significant because it shows that the service is most used by young job seekers, a key target audience for many companies during hiring season.

 

By occupation, 35.8% of respondents were job seekers, 20.7% were college or graduate students, 11.8% worked in research and development, and 7% in management/office roles, confirming that most users are preparing for employment.

 

1-2 What do candidates think about AI interviews?

 

 "Fair but Unempathetic AI"

 

When asked about their expectations for AI interviews, 66.5% of the 1,039 respondents answered that AI would not discriminate because, unlike humans, it has no emotions or personal bias; 30.3% said it would be fast and accurate in its analysis; and 26.4% said it would be fairer than humans. Overall, people expect AI to be (1) less discriminatory, (2) fairer, and (3) more accurate than humans.

 

On the other hand, when asked about their worries about interviewing with an AI, 894 respondents (57.2%) said, "I don't think it will understand my emotions the way a human would," with the rest citing concerns about data errors and the reliability of the algorithm.

 

The candidate survey results resonated with us, and we immediately started an internal discussion. This led to the decision to introduce AI humans, a feature that makes the interview more interactive and engaging. The introduction of AI humans contributed significantly to improving interview engagement.

 

The AI human was introduced through prototyping: first, we created a testable demo program with the help of our developers, then followed up with UI/UX improvements and usability testing (UT).

* Prototyping: A basic model created with only core functions to verify and improve performance before full-scale commercialization.

 

 

--- 

 

 

2. Usability Test (UT): an experiment to see whether the candidate's interview experience improved

 

UT Process

In addition to introducing AI humans, we made various UX improvements.

① We improved the readability of the overall interview guide.

② We improved the environment check so candidates can see their currently connected devices, letting them verify their setup quickly and accurately.

③ We designed a more user-centered UI/UX for the entire interview process, so that questions appear close to the camera during the interview.

 

We conducted a usability test (UT) with 20-something job seekers to see how well the improvements would work for our target audience. The UT involved a series of tasks that simulated the interview process, allowing us to observe and measure the candidates' interactions and experiences.

 

UT results

 

The UT results showed that interviews with AI humans were more engaging than interviews without them, scoring 5.83 out of 7 for engagement versus 5.17 out of 7. This indicates that introducing AI humans significantly increased user engagement with the service.

 

However, an unexpected finding was that some candidates found it difficult to understand the interview questions, either because of how the AI human delivered them or because its voice sounded unnatural. We took this feedback seriously and decided to address it technically by improving TTS (text-to-speech) performance, reflecting our commitment to continuously improving the service based on user feedback.

 

During usability testing, we also observed tester behavior that pointed to other improvements. We realized that testers tended not to read the interview guide thoroughly, and even those who did often failed to remember it because the UI they saw in the interview didn't match the guide.

 

So we emphasized the most important parts of the guide, switched to one-page explanations with images, and added tooltips to the UI shown during the interview so candidates could connect the guide to the actual interview and remember it.

 

 

---

 

 

3. Results of the post-redesign survey

 

Our post-redesign survey found that favorability toward AI interviews increased significantly after the introduction of AI humans and the usability improvements, indicating that our efforts to enhance the service were well received by users.

 

Would you rather have an AI interview or an in-person interview?

  • Before AI humans: 17.8% chose AI interviews
  • After AI humans: 30.6% chose AI interviews

 

Reasons for this choice included ⓐ the convenience of being virtual, ⓑ the sense that it would be fairer than a human interviewer who might ask biased or inappropriate questions, ⓒ the feeling of having a conversation rather than being tested, and ⓓ the impression that the company was keeping up with the latest trends.

 

How did you feel when you interviewed with an AI human?

  • Before AI humans: 49.1% thought "the AI human will feel awkward," while 39.4% thought "I will be more immersed in the interview"
  • After: 15.4% felt awkward, and 55.2% felt more engaged in the interview

 

The share of respondents who found the AI interview more immersive rose by roughly 16 percentage points. In addition, in the survey conducted after introducing the AI human and the usability improvements, 67.9% of respondents said their perception of AI interviews changed for the better after taking the interview.

 

 

---

 

 

In addition to AI humans, we make many other usability improvements to ViewinterHR every day. While the improvements introduced today center on AI humans, we listen to our users and corporate customers and address a wide range of issues to keep improving the service.

 

The two greatest strengths of the ViewinterHR team are our capability and speed. We are always listening to our users and organizations, and we are quick to fix issues and ship improvements, as our customers who use ViewinterHR can attest.

 

Stay tuned for more stories from the ViewinterHR team as we continue to improve our service through close-knit teamwork and respect for user feedback.

 

 

---

 

 

Written by Junghee Lee (HR Business Development Team, Genesis Labs)
Reviewed & edited by Leo (Genesis Labs Marketing)

 

