
Command Center: User Experience Assessment

An overall assessment and exploration of user experience for Command Center, a suite of operation tools


Overview
  • Command Center is a suite of operation tools developed for internal teams at Motional to operate Motional's Autonomous Fleet effectively and efficiently.

  • To optimize product development strategy, the Command Center team aimed to consistently assess user needs, pain points, and experience.

My Role

  • Designed and implemented a variety of research methods to explore tool usage and user needs.

  • Analyzed data and presented insights to product and research teams.

  • Collaborated with engineers, product managers, designers, and other researchers to establish UX metrics and prioritize product initiatives.

Duration

12 weeks (Sep 2023 - Dec 2023)

Tools

Miro, Google Analytics, Looker, Google Suite, Zoom

Methods

Contextual Inquiry, Google Analytics, Surveys, Workshops

Impact

Functional

  • Informed product roadmap with user needs and pain points, while identifying opportunities for scalability and efficiency.

Strategic

  • Ensured existing user data informed design and strategic decisions by running analytics and metric workshops that helped the team conceptualize quantitative data.

Timeline

An overview of research activities conducted.

Research

Research Goals


Contextual Inquiry

Why?

  • To gain a holistic understanding of real user behaviors.

  • To identify both explicit and implicit interactions, transitions between platforms, and communication patterns.

Who?

6 participants spanning key roles who regularly use the Command Center.

How?

In-person walkthroughs of day-to-day activities at participants' workspaces, with follow-up questions.

Analyzing contextual data uncovered intermediate steps in user journeys, surfacing usability issues and bottlenecks for scaling and efficiency.


Observing user environments and tool usage

Google Analytics

Collecting continuous aggregate usage data

Why?

  • To continuously gain insight into how users actually interact with the tools.

  • To analyze user engagement and feature adoption without requiring time from users.

Who?

All users of the Command Center.

How?

Defined 20+ success measures and a corresponding MVP list of user events to track, then built dashboards and explorations to generate insights.

Analysis through path exploration, funnels, segmentation, and charts revealed user task success, adoption of new features, mobile vs. desktop usage, and search behavior.
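To illustrate the kind of funnel analysis described above, here is a minimal pandas sketch over a GA4-style event export. The column names, event names, and funnel steps are hypothetical assumptions for illustration, not the actual Command Center event taxonomy or data.

```python
import pandas as pd

# Hypothetical GA4-style event export: one row per user event.
# (Illustrative data only — not actual Command Center events.)
events = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2, 3, 3, 3, 4],
    "event":   ["open_tool", "search", "view_result",
                "open_tool", "search",
                "open_tool", "search", "view_result",
                "open_tool"],
})

# Assumed funnel for a "find information" task.
funnel_steps = ["open_tool", "search", "view_result"]

# Count distinct users who reached each step, then compute
# conversion relative to the first step.
reached = [events.loc[events["event"] == step, "user_id"].nunique()
           for step in funnel_steps]
funnel = pd.DataFrame({"step": funnel_steps, "users": reached})
funnel["conversion"] = funnel["users"] / funnel["users"].iloc[0]

print(funnel)
```

Segmenting the same events by device or role before computing the funnel would yield the mobile vs. desktop comparisons mentioned above.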


Surveys

Why?

  • To systematically gather user feedback at scale with minimal user effort and time.

  • To quantify user perception and compare it over time.

Who?

32 responses from regular Command Center users.

How?

Derived 5 UX metrics with the product team to understand user experience and crafted a 5-point Likert scale questionnaire with open-ended follow-up questions for feedback.

Analyzing survey responses quantitatively and qualitatively revealed functional and technical usability pain points, their impact on user experience, and opportunities for new features.

The five UX metrics used to gather user feedback and measure experience:

Ease

How easy is it for users to accomplish their tasks?

Accuracy

How do users evaluate the precision and correctness of the information provided?

Comprehensiveness

How well do the tools cover all the necessary features and functionalities required by the users?

Awareness

How do users perceive their awareness of the features provided by the tools?

Navigation

How easy is it for users to move through the tools to find desired information?
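A minimal sketch of how 5-point Likert responses for metrics like the five above could be aggregated. The metric names mirror the list above, but the response data and the choice of a "top-2-box" summary (share of 4s and 5s) are illustrative assumptions, not the actual survey results or analysis.

```python
import pandas as pd

# Hypothetical Likert responses (1 = strongly disagree, 5 = strongly agree),
# one column per UX metric. Illustrative data only.
responses = pd.DataFrame({
    "ease":              [4, 5, 3, 4],
    "accuracy":          [5, 4, 4, 5],
    "comprehensiveness": [3, 3, 4, 2],
    "awareness":         [2, 3, 3, 4],
    "navigation":        [4, 4, 5, 3],
})

# Mean score per metric, plus top-2-box share (% rating 4 or 5) —
# a common way to compare Likert metrics across survey waves.
summary = pd.DataFrame({
    "mean": responses.mean(),
    "top2box": (responses >= 4).mean(),
}).round(2)

print(summary)
```

Re-running the same summary on each survey wave supports the over-time comparison of user perception described above.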

Key Insights

Lack of user permissions hindered feature usage for some users, while overly broad permissions introduced redundancy in certain contexts.

Impact - The team decided to conduct a user role audit to modularize user permission management and improve feature adoption.

There was repeated evidence of “searching” to find relevant features for specific tasks. 

Impact - Developed a tool guide and streamlined the feature release process to improve feature visibility.

Mobile usability was a major bottleneck for certain critical actions that were often, or only, performed on mobile.

Impact - Informed the interaction design of two features in development, using insights on specific usability issues and the extent of mobile engagement.

Information formats needed to align with users' analytics goals.

Impact - The team determined the need to start with a redesign of existing formats to meet the must-have analytics goals of Q1-Q2.

Note: Detailed research insights and artifacts are confidential and not shareable due to the non-disclosure agreement.

Reflections

Knowing when to pivot

I adapted the research dynamically to navigate setbacks, dependencies, and ambiguity by iteratively collaborating with the team on primary goals and expected outcomes.

Communicating to diverse audiences

I honed my communication skills by presenting research findings to diverse audiences and organizing workshops.

Keeping the team in the loop

Maintaining team engagement by consistently sharing findings and insights during the research process improved how those insights were translated into decisions. I aim to further explore quick yet effective ways to communicate interim findings to the cross-functional team.
