Beacon DriveScore is a telematics program that tracks driving behavior through a mobile app and uses that
data to inform customers of their personalized insurance rate. Beacon needed a redesign of the existing telematics experience
and the introduction of new features, with a major focus on making the DriveScore overview easier to understand,
easier to navigate, and easier to trust.
I used a research-led design process to establish baseline usability metrics, explore two concept directions, validate them through A/B testing,
and synthesize a final concept that balanced business goals with customer clarity.
The existing experience created friction in a few key areas:
The goal was to redesign the experience to improve comprehension, reduce friction, and create a foundation for ongoing measurement post-MVP.
Before designing solutions, I focused on establishing a clear baseline
and research approach that could support future iteration.
Baseline Metrics and Discovery
I started with stakeholder and user interviews to understand business goals, existing pain points,
and the current performance of the app. This work helped define what success needed to look like from both the business
and customer-experience sides. From there, I documented:
Usability Testing
To validate designs for MVP and set benchmarks for post-MVP testing, I planned and ran an unmoderated usability study focused on
the DriveScore overview experience. Participants completed task-based scenarios designed to surface friction,
confusion, and breakdown points across navigation, comprehension, and trust. I developed the study background, participant
expectations, KPIs and success benchmarks, structured task flows, targeted evaluation questions, and screening criteria.
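One lightweight way to make KPIs and success benchmarks like these concrete is a small structure pairing each task with its target threshold. The task name and numbers below are hypothetical illustrations, not values from the actual study:

```python
from dataclasses import dataclass

@dataclass
class TaskBenchmark:
    """A single usability-study task and its success benchmark."""
    task: str
    metric: str
    target: float  # minimum acceptable value for the metric

    def met(self, observed: float) -> bool:
        # Benchmark passes when the observed value reaches the target.
        return observed >= self.target

# Hypothetical benchmark: 80% of participants locate the score breakdown unaided.
benchmark = TaskBenchmark("Locate score breakdown", "task success rate", 0.80)
print(benchmark.met(0.85))  # → True
```

Keeping benchmarks in a form like this makes post-MVP re-testing a matter of plugging new observed values into the same targets.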
A/B Concept Testing
Based on baseline findings, stakeholder input, and usability best practices, I created two low-fidelity design concepts and ran A/B tests on them,
capturing both quantitative and qualitative data: where users hesitated, what language or visuals drove confusion, and what increased confidence.
Click here for the lo-fi prototype
Research Analysis and Decision Making
After testing, I analyzed qualitative and quantitative results across both concepts. These data points allowed me to identify:
That analysis directly informed the final concept, which I used to create high-fidelity prototypes and validated through an additional round of usability testing.
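As a sketch of how quantitative results from two concepts can be compared, a two-proportion z-test on task success counts is one common approach. The participant counts below are hypothetical, not the study's actual data:

```python
from math import sqrt, erf

def two_proportion_z(success_a: int, n_a: int, success_b: int, n_b: int):
    """Two-proportion z-test comparing task success rates for two concepts."""
    p_a, p_b = success_a / n_a, success_b / n_b
    # Pooled proportion under the null hypothesis of equal success rates
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: 18/24 participants completed the task on Concept A,
# 11/24 on Concept B.
z, p = two_proportion_z(18, 24, 11, 24)
```

A test like this separates real differences between concepts from noise in small samples, complementing the qualitative read on hesitation and confusion.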
The final experience focused on helping users understand the program quickly, navigate confidently, and feel in control of what affected their score. Key improvements included:
This project reinforced the value of research-led design in a space where user trust and clarity matter.
By grounding decisions in baseline data, structured usability testing, and A/B concept validation, I moved the team
from opinions to evidence and delivered a final experience that balanced business needs with customer understanding and product goals.