SaaS Analytics Experience
A multi-method research study generating formative & evaluative insights through qualitative in-depth interviews & iterative user testing.
Project overview
Opportunity
WM Technology clients lack adequate access to the performance data & analytics they need to run their businesses successfully, and feedback also points to the SaaS suite UX being unintuitive & disjointed.
Contribution
Responsible for designing, conducting, and synthesizing qualitative research to inform design strategy and kickstart ideation, followed by rapid iterative user testing to evaluate & optimize the user experience.
Outcome
An effective & enabling self-serve analytics solution tailored to the complex needs of retail business operators, increasing overall client satisfaction by 30% and reducing calls to Client Success teams by 20%.
Team
1 UX Researcher
2 Product Managers
1 Product Marketing Manager
2 UX Designers
Methods
- Internal interviews to gain insight into the problem and define the opportunity
- User interviews to understand the needs and expectations of clients and inform design strategy
- Usability testing to iteratively evaluate the UI & UX for comprehension and ease of use
Tools
UserTesting, Figma, Google Workspace, Atlassian Cloud
Background
WM Technology offers a SaaS product suite that empowers e-commerce and brick & mortar retail managers to scale their businesses through omnichannel advertising, marketing, operations, and branded e-commerce. However, the experience of navigating these products is disjointed and confusing.
Key performance analytics are decentralized across products or are unavailable on a self-serve basis. This fragmented experience makes it extremely difficult for clients to see the big picture and make informed business decisions, and because they lack access to some crucial data, they have taken to calling their Client Success representatives frequently for insight and guidance.
Project planning
The core team of Product Management, Product Marketing, UX Research, and UX Design held a project kick-off to discuss the following:
Define the problem space & project objectives
Capture considerations, requirements & constraints
Determine research objectives & strategy
Establish timelines, touchpoints, & deliverables
Define success & measurement strategy
Stakeholder Interviews
We conducted stakeholder interviews with representatives from various teams to gain context on the intricacies of the current state:
Product Managers for each of WM’s B2B products (e.g. CRM, Ads)
B2B Product Marketing Managers
Client Success Managers
Product Analytics
These conversations were rather informal, with each member of the core team actively asking questions to gain as much context on the problem space as possible.
Notes from interviews were captured and synthesized in FigJam.
Qualitative research
Research planning
I got to work developing a qualitative research plan documenting the research objectives, data collection strategy, and analysis procedure.
Objectives
Understand what clients need to run their businesses successfully & independently
Uncover underlying desires and expectations that are not being serviced today
Determine how client needs differ between account types and job functions
Learn firsthand about the frustrations and friction points clients have with the current experience
Develop a client hierarchy of analytics needs from must-have to nice-to-have
Understand the downstream impact the current state has on the Client Success team
Method
In-depth user interviews
6 WM clients, all active users of the WM product suite
Sample of single-location & multi-location business operators
Sample of various job functions including business owner, operations manager, and marketing manager
Conducted via UserTesting
Note-taking in UserTesting & FigJam
Process
Research brief documentation
Sampling & recruitment
Session scheduling
Discussion guide write-up
Session setup & observer invites
Collaborative note-taking prep
Analysis & mapping plan
Reporting & publishing protocol
I opted to conduct in-depth user interviews for this phase of research to untangle the complexity of client needs & unmet expectations and identify how best to enable & serve them.
Though time-consuming, interviews provide an opportunity to clarify ambiguous responses and probe specific issues further, leading to deeper understanding & more informed decisions.
I decided to first interview the Client Success Managers to help uncover the “what” of client behavior, then interview WM clients to discover the “why” behind it.
Client interviews
I conducted six 60-minute client interviews over two days.
I asked the core team to attend each session as anonymous observers and take notes, and invited any interested stakeholders to observe as well.
We used a color-coded note-taking system to sort client sentiments into categories such as pain points, needs, desires, and what’s working today.
Affinity diagramming
After data collection, I led the core team in a collaborative sorting activity, grouping notes into distinct clusters to identify themes.
This exercise allowed us to measure the frequency with which concepts arose, gauge the overall sentiment toward each, and prioritize accordingly.
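The tallying behind this prioritization can be sketched in a few lines of Python. The notes below are illustrative placeholders rather than actual session data, though the sentiment tags mirror our color-coded note-taking categories:
```python
from collections import Counter, defaultdict

# Hypothetical affinity notes as (theme, sentiment) pairs. The sentiment
# tags mirror the color-coded note-taking categories; the notes themselves
# are illustrative placeholders, not actual session data.
notes = [
    ("comparing to baseline", "pain point"),
    ("exporting raw data", "need"),
    ("comparing to baseline", "need"),
    ("filtering & segmenting", "desire"),
    ("exporting raw data", "pain point"),
    ("understanding metrics", "pain point"),
]

# Frequency: how often each concept arose across interviews.
frequency = Counter(theme for theme, _ in notes)

# Sentiment mix: the breakdown of tags within each theme.
sentiment = defaultdict(Counter)
for theme, tag in notes:
    sentiment[theme][tag] += 1

# Rank themes by mention count to inform prioritization.
for theme, count in frequency.most_common():
    print(f"{theme}: {count} mentions, {dict(sentiment[theme])}")
```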
Key themes
Our thematic analysis unveiled several key findings from the interviews.
Knowing the customer:
Clients want more information about customer demographics, shopping behavior, and overall satisfaction rates.
Comparing data to baseline:
Clients need context by comparison to understand their current business performance vs. previous time periods and/or vs. market trends.
Controlling data:
Clients unanimously value the ability to download/export raw data, and withholding this capability raises concerns about data integrity.
Filtering & segmenting data:
Clients want to filter their data by variables like date & location (if applicable), and create segments using multiple attributes at once.
Understanding metrics & impact:
Clients struggle to decipher analytical jargon and seek contextual definitions or a glossary for better comprehension.
Design & user testing
Design sprint
After synthesizing the qualitative data into key insights, the core project team embarked on a 4-day design sprint to expedite early-stage development.
By the end of the sprint, we had a functional prototype and a comprehensive list of key interactions & tasks for usability testing.
Iterative design & testing
Throughout the design phase, I conducted a series of 23 unmoderated usability tests, evolving the questions and tasks to reflect any changes in the design between tests.
Usability participants included 23 lookalike users (i.e. users who are not actually WM clients, but are operators of small/medium retail businesses with very similar goals and SaaS product needs). I pursued this recruitment strategy due to multiple constraints:
Access to clients is limited by Client Success teams to safeguard B2B relationships.
Client availability would not meet our testing timeline and participant needs.
WM is reluctant to show concepts to clients to avoid influencing expectations or causing frustration with development and release timelines.
Once six consecutive participants were able to complete all tasks successfully, offered minimal negative feedback, and rated their experience using the product as “positive” or “very positive,” we felt confident enough to conclude the iterative design & testing phase.
Typically, testing with 5 users exposes about 80% of existing usability problems. Given that we had already tested with 17 participants and acted upon several improvement opportunities, we felt we had reached saturation with the latest design iteration once we met the above criteria.
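For context, that 80% rule of thumb traces back to the Nielsen–Landauer problem-discovery model. A quick worked example, assuming the commonly cited average per-participant detection rate of λ ≈ 0.31:

\[
\text{found}(n) = N\left(1 - (1 - \lambda)^{n}\right), \qquad \text{found}(5) = N\left(1 - 0.69^{5}\right) \approx 0.84\,N
\]

In other words, five participants would be expected to surface roughly 84% of the N problems present in a given design iteration, which is why later rounds yield diminishing returns once feedback stabilizes.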
Results & next steps
Outcome
According to Client Success call logs and satisfaction surveys, the enhanced SaaS analytics experience has improved overall client satisfaction by 30% and reduced calls to Client Success Managers by 20%.
Next steps
The final design iteration was thoroughly usability tested and addressed as many client needs and concerns as possible within current technical and business constraints. However, continuous evaluation and development will be needed to keep pace with the evolving needs and expectations of clients.
My research for this project revealed that most unmet expectations centered on descriptive analytics, but I noted that several WM clients alluded to or explicitly expressed a desire for more advanced analytics.
Descriptive analytics:
Describes what is happening in the data.
Diagnostic analytics:
Explains why things are happening.
Predictive analytics:
Determines what is likely to happen in the future.
Prescriptive analytics:
Provides direction and recommendations.
While we could not address needs beyond descriptive analytics in this release, these qualitative data points provide generative evidence for future iterations of the SaaS analytics experience.