

Indeed: Benchmarking Job Search 

A comparative usability test of job search flow 


  • There has been a significant surge in international job applicants in the US, with their numbers increasing exponentially over the years. From 2018 to 2019 alone, 2.2 million work-eligible visas were issued to adults, not including foreign international students on F-1 visas (Bier, 2021).

  • Indeed sought to evaluate the usability of its job search system for the international job market in the US, recognizing the heightened stress and complexity involved in job searching for this demographic, due to factors such as maintaining visa status.

My Role

Duration: 12 weeks (Sep–Nov 2021)
Tools: Zoom, Figma, Excel
Team: Five members

  • Moderated and took notes for 10 usability tests.

  • Conducted two unstructured interviews.

  • Collaborated on quantitative and qualitative analysis.


User Interviews

Understanding the user group and job platform usage

To uncover pain points and preferences, we conducted unstructured interviews with international applicants who had recently searched, or were currently searching, for jobs across multiple websites.

Heuristic Evaluation

Self-evaluating the usability of different task flows involved in the job search

In parallel with the user interviews, we conducted a heuristic evaluation to identify critical usability issues to focus on during usability testing. Using Nielsen's 10 Usability Heuristics for User Interface Design, we examined three task flows.

Task flows

Account
  • Create a new account.
  • Log in to an existing account.
  • Retrieve password.

Search Job/filters
  • Primary job search bar.

View/Save Job
  • View jobs list.
  • View job description.
  • Save jobs.

The overall analysis of the app was largely positive, with only minor issues. The most significant problems concerned selecting multiple filters, selecting multiple cities, and filtering through reviews to aid users in their job search.

Usability Test Plan

Key Objective

Evaluate the usability of job search on Indeed, compared with other platforms, for international job applicants in the U.S.


Comparative benchmarking

We decided to conduct a comparative analysis against Handshake, a competing application tailored to university students and alumni.

Screening & Recruitment

A screening survey was conducted using Google Forms to identify participants who met the required criteria.



  • Graduated or recently graduated

  • Require U.S. visa sponsorship for job

  • Recent or current job seekers

  • Mixed familiarity with Indeed and Handshake

Testing Set-up

Moderated usability tests were conducted remotely over Zoom, with participants sharing their screens while performing the given tasks.

Task objectives

Task 1

Evaluate the ease of using filters for the target users' major criteria: location, salary, and sponsorship.

Task 2

Test the usability of the filters and search bar for finding jobs based on specific skill sets, to evaluate issues found during the heuristic evaluation.

Task 3

Allow users to find a desired job of their choice in order to observe the functionalities they use and the difficulties they face in a natural setting.

Each moderated session had a moderator, two note-takers focusing on user actions, one note-taker for verbal feedback, and a timekeeper tracking task times. All three tasks were performed on one application first and then on the second. To prevent order bias, we counterbalanced the sessions so that Indeed and Handshake each appeared first an equal number of times. SUS ratings were recorded after completing all tasks on each application, and a few comparative questions about usability were asked at the end.
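The counterbalancing described above can be sketched as a simple alternating assignment; the participant indices and helper name below are illustrative, not part of our study materials:

```python
# Sketch of counterbalanced app ordering: across 10 participants,
# Indeed and Handshake each appear first an equal number of times.
APPS = ("Indeed", "Handshake")

def app_order(participant_index: int) -> tuple[str, str]:
    """Alternate which application is tested first."""
    first = APPS[participant_index % 2]
    second = APPS[(participant_index + 1) % 2]
    return (first, second)

orders = [app_order(i) for i in range(10)]
# Each app leads in exactly half of the sessions.
assert sum(o[0] == "Indeed" for o in orders) == 5
```

With an even number of participants, this guarantees a balanced design; randomizing within pairs would be an alternative if session scheduling were unpredictable.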



Quantitative results

SUS scores

4 of 10 users rated Indeed noticeably lower than Handshake on the SUS.

[Figure: Indeed vs Handshake SUS scores]
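For reference, the SUS scores above follow the standard scoring scheme: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is scaled by 2.5 to give a 0–100 score. A minimal sketch (the sample responses are illustrative, not our participants' data):

```python
# Standard SUS scoring (Brooke, 1996) from ten 1-5 Likert responses.
def sus_score(responses: list[int]) -> float:
    """Return a 0-100 SUS score for one participant's questionnaire."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded, even items negatively worded.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
```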

Time on tasks

Users spent the most time finding sponsorship information on Indeed. Hence, for Task 1, which focuses on sponsorship, the average completion time was higher on Indeed.

Indeed does not provide a direct way to find sponsorship details, which is a key requirement for international applicants.

[Figure: Time on task]

Task success

Users either ran over time or gave up while finding sponsorship information in Task 1 on Indeed. Users were more successful with Task 2 on Indeed, which focuses on custom filters.

Indeed provided more custom filters, which made it easier to narrow results by different specifications.

Ease of Tasks

Ambiguity in sponsorship information is the main reason users found Tasks 1 and 3 easier on Handshake; Task 2 was easier on Indeed.

[Figure: Task success (Indeed)]
[Figure: Task success (Handshake)]
[Figure: Ease of tasks (Indeed)]
[Figure: Ease of tasks (Handshake)]

Some key qualitative findings on attributes affecting the ease of finding relevant jobs on Indeed versus Handshake:

Varying job description formats on Indeed make it harder for users to find the information they are looking for.

Participant 8: "Handshake job description gives a quick overview and more convenient to read the job information, highlights key information"



Participant 7: "Felt overwhelmed by loads of inconsistent information, especially in the job description with huge blocks of texts and exhausting the eyes"



Allowing multiple selections within filter options reduces repeated searches.

Participant 3: "Handshake provides me the option to choose multiple options, such as location because in Indeed I would have to repeat the search process multiple times for every location I want."

Participant 7: "Confused on why the user cannot choose multiple choices under skill-set filter."





Multitasking as a moderator

The moderator plays a crucial role in guiding user behavior during the tests. I learned how to balance professionalism with making the participant comfortable. Being well prepared is important for judging when to interrupt the participant and how to provide subtle guidance.

Specificity of task questions

The quality of data obtained from tests depends largely on the tasks. Because some of our tasks combined multiple objectives, analyzing the data involved more ambiguity when isolating specific reasons and patterns in user actions.

Post task follow-up questions

Our moderated tests initially did not include a slot for the team to ask questions about unusual user actions we observed, which made it difficult to explain some of those actions later. Hence, in subsequent sessions, we added a time slot at the end of each test for follow-up questions.
