
Indeed Usability Test



The aim of this usability test is to understand and analyze the usability of the Indeed job search website for international applicants. Job searching is especially stressful for international applicants due to multiple factors, including maintaining visa status and location dependencies. In the US, international workers make up a significant and growing share of the workforce: in 2018-2019 alone, 2.2 million work-eligible visas were issued to adults, a figure that does not include international students on F-1 visas (Bier, 2021).



Prof Thomas Thornton


Team of five


Zoom, Figma, Excel

My contribution

I participated in 10 moderated user tests, serving as moderator for two, note-taker for seven, and timekeeper for one. I also conducted two unstructured interviews to understand the pain points of job seekers and collaborated on the quantitative and qualitative analysis.

12 weeks (Sep-Nov 2021)




To identify pain points and preferences, we conducted unstructured pre-interviews with international applicants who had recently searched, or were currently searching, for jobs across multiple websites.

Heuristic Evaluation

We conducted a heuristic evaluation to find critical usability issues to focus on in the usability testing. Using Nielsen's 10 Usability Heuristics for User Interface Design, we examined three task flows.

Task flows


Create a new account.

Log in to an existing account.

Retrieve password.

Search Job/filters

Primary job search bar.


View/Save Job

View jobs list.

View job description.

Save jobs.

The overall evaluation of the app was largely positive, with mostly minor issues. The most significant problems concerned selecting multiple filters, selecting multiple cities, and filtering through reviews to aid users in their job search.

Usability Test Plan


Key Objective

To understand the needs, frustrations, and expectations of international job seekers using the Indeed mobile app to look for jobs with desired specifications.


Comparative analysis with "Handshake", a competing application tailored to university students and alumni.

Screening & Recruitment

A screening survey was conducted using Google Forms to identify participants fulfilling the required criteria.



  • Graduated or recently graduated

  • Require U.S. visa sponsorship for job

  • Recent or current job seekers

  • Mixed familiarity with Indeed and Handshake

Testing Set-up

Moderated usability tests were conducted remotely over Zoom, with participants sharing their screens while performing the given tasks.

Task objectives

Task 1

Evaluate the ease of using filters for the target users' major criteria: location, salary, and sponsorship.

Task 2

Test the usability of the filters and search bar for finding jobs based on specific skill sets, to follow up on issues found during the heuristic evaluation.

Task 3

Allow users to find a desired job of their choice, in order to observe the functionalities they use and the difficulties they face in a natural setting.

Each moderated session had a moderator, two note-takers focusing on user actions, one note-taker for verbal feedback, and a timekeeper tracking task times. All three tasks were performed on one application first and then on the other. To prevent ordering bias, Indeed and Handshake each appeared first an equal number of times. SUS ratings were recorded after completing all tasks on each application, and a few comparative questions about usability were asked at the end.
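The SUS ratings above come from the standard ten-item System Usability Scale questionnaire. As a minimal sketch of the standard scoring procedure (the sample responses below are hypothetical, not from the study data):

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response).
    The summed contributions are multiplied by 2.5 to give a 0-100 score.
    """
    assert len(responses) == 10, "SUS requires exactly ten responses"
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5


# Hypothetical participant responses (illustrative only)
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 2]))  # → 82.5
```

Because the alternating item polarity is easy to get backwards, computing scores with a small helper like this, rather than by hand in a spreadsheet, reduces scoring errors.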



Quantitative results

SUS scores

4 of 10 users gave Indeed a markedly lower SUS score than Handshake.

Time on tasks

Users spent the most time finding sponsorship information on Indeed. Hence, for Task 1, which focuses on sponsorship, the average task completion time was higher on Indeed.

[Figures: Indeed vs Handshake SUS scores; Time on Task]

Task success

On Task 1, users either ran over time or gave up trying to find sponsorship information on Indeed. Users were more successful with Task 2 on Indeed, which focuses on custom filters.

[Figures: Task success (Indeed); Task success (Handshake)]

Ease of Tasks

Ambiguity in sponsorship information was the main reason users found Tasks 1 and 3 easier on Handshake; Task 2 was easier on Indeed.

[Figures: Ease of Tasks (Indeed); Ease of Tasks (Handshake)]

Indeed does not provide a direct way to find sponsorship details, which is a key requirement for international applicants.

Indeed provides more custom filters, making it easier to narrow results by different specifications.

Qualitative results

Some key qualitative findings:

Inconsistent job description formats on Indeed make it harder for users to find the information they are looking for.





Participant 8: "Handshake job description gives a quick overview and more convenient to read the job information, highlights key information"

Participant 7: "Felt overwhelmed by loads of inconsistent information, especially in the job description with huge blocks of texts and exhausting the eyes"

Allowing multiple selections within filter options reduces repeated searches.





Participant 3: "Handshake provides me the option to choose multiple options, such as location because in Indeed I would have to repeat the search process multiple times for every location I want."

Participant 7: "Confused on why the user cannot choose multiple choices under skill-set filter."


Multitasking as a moderator

The moderator plays a crucial role in guiding user behavior during the tests. I learned how to balance professionalism with making the participant comfortable. Being well prepared is important for judging when to interrupt the participant and how to provide subtle guidance.

Quantitative metric data is only meaningful with larger samples

Quantitative data helped us find a few focus points that could be linked to qualitative feedback. However, little conclusive could be derived from the quantitative data alone due to the very small sample size.

Specificity of task questions

The quality of data obtained from tests depends largely on the tasks. Because some of our tasks combined multiple objectives, analyzing the data involved more ambiguity when isolating specific reasons and patterns in user actions.

Post task follow-up questions

Our moderated tests initially did not include a slot for the team to ask questions about unusual user actions we observed, which made it difficult to explain some of those actions later. In subsequent sessions, we therefore added a time slot at the end of each test for follow-up questions.
