Command Center: User Experience Assessment
An overall assessment and exploration of user experience for Command Center, a suite of operations tools
Command Center is a suite of operations tools developed for internal teams at Motional to operate Motional's Autonomous Fleet effectively and efficiently.
To optimize product development strategy, the Command Center team aimed to consistently assess user needs, pain points, and experience.
12 weeks (Sep 2023 - Dec 2023)
Miro, Google Analytics, Looker, Google Suite, Zoom
Designed and implemented a variety of research methods exploring the usage of the tools and user needs.
Analyzed data and presented insights to product and research teams.
Collaborated with engineers, product managers, designers, and other researchers to establish UX metrics and prioritize product initiatives.
Contextual Inquiry, Google Analytics, Surveys, Workshops
Informed product roadmap with user needs and pain points, while identifying opportunities for scalability and efficiency.
Ensured existing user data was effectively used for design and strategic decisions by building a shared understanding of quantitative data through analytics and metrics workshops.
An overview of research activities conducted.
To gain a holistic understanding of real user behaviors.
Identify both explicit and implicit interactions, transitions between platforms, and communication patterns.
6 participants spanning key roles who regularly use the Command Center.
In-person walkthrough of day-to-day activities at each participant's workspace, with follow-up questions.
Analyzing contextual data uncovered intermediate steps in user journeys, surfacing usability issues and bottlenecks for scaling and efficiency.
Observing user environments and tool usage
Collecting continuous aggregate usage data
To consistently obtain insights into the actual user interaction behavior with the tools.
To analyze user engagement and adoption of different features, without time commitment from users.
All users of the Command Center.
Defined 20+ success measures and a corresponding MVP list of user events to track, then built dashboards and explorations to surface insights.
Analysis through path exploration, funnels, segmentation, and charts revealed user task success, adoption of new features, mobile vs. desktop usage, and search behavior.
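The funnel analysis mentioned above can be sketched in code. This is a minimal, hypothetical example of computing step-by-step funnel conversion from exported analytics events; the event names and the task funnel are illustrative assumptions, not Command Center's actual tracking schema.

```python
# Minimal sketch of a funnel analysis over exported analytics events.
# Event names and funnel steps are hypothetical, not the actual schema.
from collections import defaultdict

FUNNEL = ["open_tool", "run_search", "view_result", "complete_task"]

def funnel_conversion(events):
    """events: (user_id, event_name) tuples in timestamp order per user.
    Returns how many users reached each funnel step, in order."""
    progress = defaultdict(int)  # user_id -> index of next expected step
    for user, name in events:
        step = progress[user]
        if step < len(FUNNEL) and name == FUNNEL[step]:
            progress[user] = step + 1
    totals = [0] * len(FUNNEL)
    for reached in progress.values():
        for i in range(reached):
            totals[i] += 1
    return dict(zip(FUNNEL, totals))

events = [
    ("u1", "open_tool"), ("u1", "run_search"),
    ("u1", "view_result"), ("u1", "complete_task"),
    ("u2", "open_tool"), ("u2", "run_search"),
    ("u3", "open_tool"),
]
print(funnel_conversion(events))
# {'open_tool': 3, 'run_search': 2, 'view_result': 1, 'complete_task': 1}
```

In practice this kind of analysis ran inside Google Analytics and Looker explorations rather than custom scripts; the sketch only illustrates the funnel logic itself.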
To systematically gather user feedback at scale with minimal user effort and time.
To quantify user perceptions and compare them over time.
32 responses from regular Command Center users.
Derived 5 UX metrics with the product team to understand user experience and crafted a 5-point Likert-scale questionnaire with follow-up questions for open-ended feedback.
Analyzing survey responses quantitatively and qualitatively revealed functional and technical usability pain points, showcasing their impact on user experience and highlighting new opportunities for features.
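The quantitative side of such a survey analysis can be sketched briefly. This hypothetical example summarizes 5-point Likert responses per metric with a mean score and a "top-2-box" share (ratings of 4 or 5); the metric names and response values are invented for illustration, not the study's data.

```python
# Minimal sketch of summarizing 5-point Likert responses per UX metric.
# Metric names and response values are hypothetical examples.
from statistics import mean

responses = {
    "ease_of_use": [5, 4, 4, 3, 5, 2],
    "navigation":  [3, 3, 4, 2, 3, 4],
}

def summarize(scores):
    """Return (mean score, share of 4-5 'top-2-box' ratings)."""
    top2 = sum(1 for s in scores if s >= 4) / len(scores)
    return round(mean(scores), 2), round(top2, 2)

for metric, scores in responses.items():
    avg, top2 = summarize(scores)
    print(f"{metric}: mean={avg}, top-2-box={top2:.0%}")
# ease_of_use: mean=3.83, top-2-box=67%
# navigation: mean=3.17, top-2-box=33%
```

Pairing the mean with a top-2-box share guards against averages that hide a polarized response distribution.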
Gathering user feedback and measuring experience
How easy is it for users to accomplish their tasks?
How do users evaluate the precision and correctness of the information provided?
How well do the tools cover all the necessary features and functionalities required by the users?
How do users perceive their extent of awareness of the features provided by the tools?
How easy is it for users to move through the tools to find desired information?
Insufficient permissions hindered feature usage for some users, while overly broad permissions introduced redundancy in certain contexts.
Impact - The team decided to conduct a user role audit to modularize the user permission management for improved feature adoption.
There was repeated evidence of “searching” to find relevant features for specific tasks.
Impact - Development of a tool guide and a streamlined feature-release process to improve feature visibility.
Mobile usability was a major bottleneck for certain critical actions that were often, or only, performed on mobile.
Impact - Informed the design of interactions for two features in development, using insights on specific usability issues and the extent of mobile engagement.
Align information format with the analytics goals of users
Impact - The team determined the need to start with the re-design of existing formats to meet must-have analytics goals of Q1-Q2.
Note: Detailed research insights and artifacts are confidential and cannot be shared.
Knowing when to pivot
I adapted the research dynamically to navigate setbacks, dependencies, and ambiguity through iterative collaboration with the team on primary goals and expected outcomes.
Communicating to diverse audiences
I honed my communication skills by presenting research findings to diverse audiences and organizing workshops.
Keeping the team in the loop
Maintaining team engagement by consistently sharing findings and insights during the research process improved how those insights translated into decisions. I aim to further explore quick yet effective ways to communicate interim findings to the cross-functional team.