Conducted a heuristic evaluation of current experience and prioritised the issues to share with the cross-functional team
Together with the UX designer from the employer team, I conducted a heuristic evaluation to audit the current state of the job seeker experience. After the audit, I compiled all the notes and prioritised the issues by severity. I then socialised the findings with the PM and engineering teams to discuss how to fix the broken parts of the experience, such as inconsistent messaging and accessibility and usability red flags.
Utilised a mix of research methods to understand user needs and did a deep dive with the PM into our current funnel data.
While the engineering team worked on the fixes, I carried out a round of discovery research to understand the outstanding pain points job seekers were facing. The methods I used:
- User interviews with job seekers who were invited but chose not to complete the interview
- Partnered with the PM on a detailed funnel data analysis to understand where we were losing job seekers’ engagement
- Read through a sample of post-interview survey feedback
- Watched 50 completed interviews and did a video/interview quality analysis
A simple user journey to capture the research findings
Through the research, several outstanding user problems emerged:
- Unfamiliarity: users were simply unfamiliar with the concept of one-way interviews
- Fake enthusiasm: the invitation messaging we provided was misleading
- Uncertainty: users lacked essential context at each stage of the journey
- Inhumane: interviews matter just as much to job seekers as an opportunity for meaningful interaction with employers, and the one-way format can feel impersonal
- “Sad path”: rejection and unresponsiveness that leave job seekers with negative feelings
Led a design workshop with cross-functional teams to map out user personas and JTBD
Now that I felt we had a solid research foundation on our job seekers, I compiled the findings and shared them with the team. Based on these findings, I then led two group brainstorming sessions to map out our user personas and JTBD. The purpose of the sessions was to build the team’s empathy towards our job seekers (we had been very employer-focused until then) and to align on design strategy.
Mapped out the ideal user journey and the essential context we needed to surface at each stage.
Based on the workshop and research findings, I started thinking through what an ideal user journey would look like and what context we should provide job seekers at each stage so that they feel well-informed and prepared to complete the interview.
Conducted a round of research sessions where I co-designed with users.
Once I had the first round of concept drafts, I invited 6 job seekers who had never experienced one-way interviews for user studies. In each session, I asked them to help me put together the pages/emails at each stage that would help them feel informed enough to evaluate whether they wanted to take the interview, and then feel prepared to start it. The co-design activity sparked very engaging conversations, as job seekers were actively thinking aloud, and it surfaced tons of useful insights, including their primary needs and questions, information architecture preferences, and how they evaluate whether or not to take the interview.
Conducted an audit across Indeed of interview messaging and user flows.
To ensure a consistent and connected experience (and to find valuable future integration points), I audited Indeed’s current interview messaging and user flows. I then connected with relevant UX design teams outside the Incubator to share our current state and our thinking, and to get their feedback.
Together with PM and Engineers, we stack ranked the experimentation priority.
At this stage, I felt confident in the design directions, so I started conversations with the PM and engineers to prioritise the core use cases and the experimentation order for the upcoming roadmap.
A new post-interview survey to measure UX success and satisfaction baseline.
One of our goals was to increase value delivery and the quality of the experience for job seekers, but I felt our existing post-interview survey could not really help us measure whether we were giving job seekers an experience that set them up to succeed in the interview. Hence, I adapted the SuperQ model and proposed a new set of questions to establish a better baseline for job seeker satisfaction and UX quality. I socialised the question list with the PM and content designer, and also got feedback from quantitative research experts within Indeed to ensure the survey was both effective and aligned with team goals.
Hi-fi Design and Usability Testing
Utilised usertesting.com to conduct unmoderated usability tests.
Based on the experimentation priority, I worked on detailed hi-fi designs for each epic, and before each major experiment release I conducted unmoderated usability tests.
We ran a highly iterative process on a 1-week engineering sprint cycle.
I worked closely with the PM to track experiment results and bounce around hypotheses. I would also launch rapid user studies as needed to understand why certain experiments performed the way they did. The PM and I would then discuss and reorder the experiments based on our hypotheses and user insights.
An example of how we iterated the experience over time based on experimentation results.
We added an intro page as a touchpoint to give job seekers more context. We then added an option to start the interview right away, instead of forcing job seekers to schedule a time, giving them the flexibility to choose their own journey and capturing their interest in the moment. This resulted in an uplift in interview starts but a -10% drop in interview recording completion.
To address these false starts, we added an abandoned-interview email, which later led to an option for job seekers to proactively save and exit the interview as needed.