AI Interview Assistant

An AI-powered interview assistant fills a valuable need for job-seekers

Role: UX Designer, UX Researcher

Tools: Figma, Keynote, Miro, Google Sheets, QuickTime, iMovie, pen and paper


Methods: Interviews, competitive audit, directed storytelling, sketched wireframes, usability testing, high-fidelity prototypes, stakeholder presentation


Impact

Although generative AI is in its infancy, a flood of applications has been developed to streamline processes and solve traditionally complex tasks. Preparing for a job interview will always be challenging for job-seekers in different ways, but predicting common interview questions and practicing strong responses shouldn’t be, especially when generative AI can draw on a job description, a user’s resume, and common job-specific information to tailor realistic mock interviews.

Research showed a variety of existing applications that, despite good foundations, failed to bring together video recording, transcripts, usable coaching, and data visualization in a way that keeps users coming back to improve. AI Interview Assistant is an effort to unite these features in one application and encourage job-seekers to hone their interview skills and land the job they deserve.

Final prototypes, dashboard, and Home Page

“Hello, World!” What Exists In The Current Market?

I began my research with a competitive audit of the existing AI-powered interview prep space to see which features were being utilized and what might be missing. Six competitors were analyzed: Huru, MyMockInterview, StandOut, InterviewGPT.ai, InterviewsBy.AI, and Interview Prep AI. Right away, some positives stood out. While it isn’t poetry, out of the box ChatGPT does a good job of generating high-quality, realistic, job-specific interview questions that users could expect to hear at a job interview. Here’s an example of the AI at work:
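For a flavor of how such questions might be requested programmatically, here is a minimal sketch; the prompt wording, model name, and OpenAI client usage are my assumptions for illustration, not the exact setup behind the example above:

```python
# Hypothetical sketch: asking a chat model for tailored interview questions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def generate_questions(job_description: str, resume: str, n: int = 5) -> str:
    prompt = (
        f"You are a hiring manager for the role below.\n"
        f"Job description:\n{job_description}\n\n"
        f"Candidate resume:\n{resume}\n\n"
        f"Write {n} realistic interview questions this candidate should expect."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; any capable chat model works
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```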

This was enough for me to believe generative AI could be the engine that powered a useful application. These questions were good! What was missing? A few themes emerged:

  • Few of the applications audited offered much encouragement or usable coaching

  • Data visualization was rare, even though tracked metrics are useful for improving a skill

  • Video recordings and transcribed interviews to study later were largely absent

A theory began to emerge: AI Interview Assistant could better prepare users by bringing these features together in one place, pairing a positive approach to the process with coaching and data users could act on to improve.

Excellent layout and functionality, but where does the data go? How do users know they are improving?

A competitive audit to fully understand the space

What Did Users Think? Testing the Theory

I continued research by surveying 17 likely job-seekers, mostly Millennials and Gen-Z, the predicted primary user groups for an application like AI Interview Assistant. Users answered baseline questions such as “Have you applied to a job in the past year or plan to do so in the next year?” and “How interested would you be in a web application that takes your resume and a job description and uses AI to generate likely interview questions and then (securely) records your verbal responses over time and analyzes your answers and keeps transcripts to help you prepare for job interviews?” The goal was to determine whether the app idea would be not just usable but desirable; a usable application that provides no value won’t keep users for long. Users were then asked to rate features such as audio recording, live coaching, a data dashboard, and actual video of the “hiring manager” to better simulate an interview, each on a 1-10 interest scale.

Key takeaways:

  • 17 users, average age 32

  • Respondents came from a range of fields, but mostly tech (7/17)

  • Users rated the concept of AI Interview Assistant a 7.65 on the 1-10 interest scale

  • Planned key features were generally desirable, including a data dashboard (7.47), live video (7.58), and saved transcripts (6.94); averages were tallied as in the sketch below

  • As always, users had important insights I hadn’t thought about
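A minimal sketch of that tally, assuming responses exported from Google Sheets; the rows shown are illustrative placeholders, not the real survey data:

```python
# Hypothetical survey rows; the real 17 responses lived in Google Sheets.
from statistics import mean

responses = [
    {"age": 29, "field": "Tech", "concept": 8, "dashboard": 7, "video": 9, "transcripts": 7},
    {"age": 35, "field": "Education", "concept": 7, "dashboard": 8, "video": 6, "transcripts": 7},
    # ...15 more respondents
]

# Average interest rating (1-10 scale) per feature
for feature in ("concept", "dashboard", "video", "transcripts"):
    print(feature, round(mean(r[feature] for r in responses), 2))

tech = sum(1 for r in responses if r["field"] == "Tech")
print(f"Tech respondents: {tech}/{len(responses)}; average age {mean(r['age'] for r in responses):.0f}")
```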

What had I missed?

  • Users were high on video integration, even of a virtual hiring manager. I had assumed users might feel a video representation of the hiring manager would look “fake” or create an uncanny valley effect. The survey challenged my assumption, with one user commenting:

“Consider video + audio recordings of interview answers rather than just audio. ‘It was Albert Mehrabian, a researcher of body language, who first broke down the components of a face-to-face conversation. He found that communication is 55% nonverbal, 38% vocal, and 7% words only.’”

  • Some users expressed hesitancy about AI and data security in an already sensitive process, with another commenting:

“I really don't like AI, but could see some use in something like [AI Interview Assistant]. If the program stated it wouldn't be trained on the responses I'd feel better.”

Users found value in video integration, and AI Interview Assistant could benefit from explicit assurances about the AI and user data. This made sense: these were two areas where the competitive audit found existing applications lacking, beyond a few cursory “Terms of Service” pages or simple video of interviewees with cues like “don’t forget to smile!” With the concept validated and a few additional ideas from users, I began sketching.

Representative users surveyed

Support for data visualization

Strong overall interest for app

Sketching Out The Plan

With initial research corroborating the “big idea” and a few more ideas about feature integration in hand, I set out to sketch a few key screens that I wanted to prototype further. I hoped that AI Interview Assistant would:

  • benefit from a clean, functional UI that helped users get started and into mock interviews quickly

  • visualize data through a dashboard where users could track progress

  • contain social proofing (testimonials) and numerous guarantees about user data and security

Landing Page Sketch

Interview Page Sketch

Dashboard Sketch

Design Goal 1: Establishing Trust

During research, users spoke clearly about concerns with data privacy and AI: worries that an app like this might be used to train AI further, sell data, or otherwise take actions that would alarm our more tech-savvy Gen-Z and Millennial primary user groups. It was important in designing AI Interview Assistant that guarantees were made about data and privacy. Beyond the overall stress of preparing for an interview, the job search involves sensitive information. While it may seem like a small design consideration, this area would be imperative for users. I used an old stump-speech adage to achieve this goal: “Tell ‘em what you’re gonna tell ‘em, tell ‘em, and then tell ‘em what you told ‘em!”

Repetition may be boring, but there are few more effective strategies for communicating key information.

The Main Page touches on data privacy in three areas: the initial step-by-step app explanation text, a nod in the “Our Promise To You” section, and social proofing from users.

On the FAQ Page, I take one more bite at the apple, with one of the selected questions confirming that user answers do not “train” the AI and that information is never sold to third parties.


Design Goal 2: Data Is Beautiful - Designing the Dashboard

During research, users spoke favorably about the value data could provide them for interview preparation. The dashboard should help separate AI Interview Assistant from the pack of existing applications. For the current dashboard, I implemented metrics that could be easily tracked and provide instant value.

This solution currently provides two tracks:

Speech metrics like speaking volume in decibels, speaking speed (words per minute), and “filler words” per minute require little additional coding bandwidth and give users instant feedback. No matter the content of an answer, interviewees should deliver it at a normal speaking volume and cadence, with as few “like, umm, ahh” verbal pauses as possible, to convey confidence. If how somebody sounds falls outside the normal range, cognitive bias against the interviewee’s answers is likely.
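A minimal sketch of the transcript-based pieces of these metrics, assuming a transcript and recording length are already available (the filler-word list is an illustrative assumption; volume would come from the audio stream itself):

```python
# Words-per-minute and filler-word rate from a transcript.
# Speaking volume (decibels) would be computed from the raw audio, omitted here.
FILLER_WORDS = {"like", "umm", "um", "ahh", "uh", "basically", "actually"}

def speech_metrics(transcript: str, duration_seconds: float) -> dict:
    words = transcript.lower().split()
    minutes = duration_seconds / 60
    fillers = sum(w.strip(",.!?") in FILLER_WORDS for w in words)
    return {
        "words_per_minute": round(len(words) / minutes, 1),
        "fillers_per_minute": round(fillers / minutes, 1),
    }

print(speech_metrics("So, umm, I led the migration and, like, cut costs by half.", 20))
# -> {'words_per_minute': 36.0, 'fillers_per_minute': 6.0}
```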

Time metrics are similarly easy to track and provide a valuable yardstick for users. AI Interview Assistant tracks the average response time for different categories of questions as well as the average time a user spends in a mock interview. Outside of technical interviews, best practices exist for the general time range for different types of questions. Think about how often you’ve practiced for an interview and received the feedback “that answer was very long, try to get to the point quicker” or the inverse, “try to provide more detail for that answer, focus on what you contributed to the solution.” The popular STAR interview answer method (Situation, Task, Action, Result) served as inspiration for coaching users to give answers with enough supporting detail without going overboard on length.
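As a sketch of how per-category response times might be tracked against target ranges (the category names and ranges below are illustrative assumptions, not researched best practices):

```python
# Average response time per question category, compared to assumed target ranges.
from collections import defaultdict
from statistics import mean

# (category, response time in seconds) from one user's mock interviews
answers = [
    ("behavioral", 95), ("behavioral", 140), ("background", 45),
    ("situational", 110), ("background", 60),
]

TARGET_RANGES = {"background": (30, 90), "behavioral": (60, 180), "situational": (60, 180)}

by_category = defaultdict(list)
for category, seconds in answers:
    by_category[category].append(seconds)

for category, times in by_category.items():
    avg = mean(times)
    low, high = TARGET_RANGES[category]
    status = "on target" if low <= avg <= high else "outside target range"
    print(f"{category}: avg {avg:.0f}s, {status}")
```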

Transcripts, videos, and an AI Interview Score provide value not found in competitor apps. The ability to simply review previous interviews was nowhere to be found in the dominant “one-off” strategy in the current market. The AI Interview Score will combine the verbal and time metrics already tracked by the app with AI analysis of supporting-detail statements to give users a clear picture of how much information they are providing employers, and how succinctly they are doing so - a solid measure of the strength of their answers.
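One possible shape for that score, sketched as a weighted blend of the metrics above; the weights, target values, and 0-100 scale are my assumptions, not a finalized scoring model:

```python
# Hypothetical composite score: pace, fluency, timing, plus an AI-judged
# "supporting detail" component (e.g., STAR coverage) passed in as 0-1.
def clamp(x: float, lo: float = 0.0, hi: float = 1.0) -> float:
    return max(lo, min(hi, x))

def interview_score(wpm: float, fillers_per_minute: float,
                    avg_answer_seconds: float, detail_score: float) -> float:
    pace = clamp(1 - abs(wpm - 140) / 140)                   # ~140 wpm reads as conversational
    fluency = clamp(1 - fillers_per_minute / 10)             # fewer fillers scores higher
    timing = clamp(1 - abs(avg_answer_seconds - 120) / 120)  # ~2-minute answers as a target
    score = 0.2 * pace + 0.2 * fluency + 0.2 * timing + 0.4 * detail_score
    return round(100 * score, 1)

print(interview_score(wpm=150, fillers_per_minute=3, avg_answer_seconds=100, detail_score=0.8))
# -> 81.2
```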

Design Goal 3: Realistic Interview Simulation

The best way to learn a skill is to practice it in a setting as close as possible to the real situation. Research found that existing apps had little video integration or, worse, had users type out their answers. With interviews overwhelmingly taking place in person, face-to-face, or through video conferencing, this seemed like a large deficit I could improve upon.

Furthermore, users spoke highly of using video for both the hiring manager and themselves.

The final design uses AI to generate a facsimile hiring manager who asks actual questions out loud, plus the ability to record user audio and video for future reference.

This solution allows users to see, hear, and read for themselves how they responded to each question, a helpful way to self-critique body language, confidence, tone, and the content of the answers themselves.
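As a minimal sketch of the “questions asked out loud” piece, here the pyttsx3 text-to-speech library stands in for the generated hiring-manager voice; the production design would pair this with the video avatar and in-browser recording:

```python
# Read generated interview questions aloud via text-to-speech (pyttsx3 as a stand-in).
import pyttsx3

questions = [
    "Tell me about yourself.",
    "Describe a time you resolved a conflict on your team.",
]

engine = pyttsx3.init()
engine.setProperty("rate", 160)  # speaking speed, roughly words per minute
for question in questions:
    engine.say(question)
engine.runAndWait()  # speaks all queued questions
```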

In the step-by-step instructions

In the Testimonials section

In the application “Promise To You”

And finally in the FAQ

Lessons Learned: Incorporating User Feedback For New Design Features

As I shared earlier, I was quite surprised by user feedback requesting video integration. The prototyping stage therefore required additional video production and editing. I’m happy with the result, but the final solution ended up looking much different than I had anticipated. My takeaways from the project:

  • Start with as few assumptions as possible. Conducting early research both informs the process and creates less mental fatigue if a pivot is necessary.

  • User needs for this application varied considerably. Future iterations could include more company information and specificity in questions. More experienced job-seekers or those in niche markets tended to favor these techniques for their preparation.

  • Design... minimally for minimum viable products. In a moment of overenthusiasm, I designed Features, FAQ, and Pricing pages. These filler pages added little to demonstrate proof of concept or the “meat” of the application and could have been skipped or added after more feedback and testing.

This project challenged me by necessitating some video editing, a skill I hadn’t used in some time. Likewise, reconciling the product users had in their mind’s eye with my own preconceived notions was an important part of the design process. I’m optimistic that AI Interview Assistant and tools like it will be able to help job-seekers in a real (measured, recordable, retrievable!) way.

Practice makes perfect.