Redesigning the Employee Development Platform for Usability and Scale

The Employee Performance and Development (EPD) platform is one of Thrive’s flagship products, rebuilt under my leadership from a low-engagement legacy tool into a scalable, modular solution adopted across our client base. Over nine months I introduced structured sprints, defined product strategy from client and stakeholder insights, and worked with psychology, design, and engineering teams to turn complex psychometric data into actionable experiences. The new platform embedded success metrics into every feature, integrated accessibility and neurodiversity principles, and became a core commercial driver featured in investor pitches.

Role

Product Manager

Timeline

9 months

Industry

Employee Learning and Development

Team

  • UX Designer

  • Psychometrician

  • Back-end engineer

  • 2 Front-end engineers

Challenge

The legacy EPD platform had low engagement and limited value for clients. Data was unclear, hard to act on, and provided no way to track changes over time within teams. With no embedded analytics, clients and internal teams lacked visibility into usage or impact. To grow into enterprise accounts and attract investment, Thrive needed a modern platform that could drive adoption, deliver actionable insights, and show measurable value.

Results

  • Upgrade rate among existing clients rose from 24% to 44% within months of launch.

  • Task completion rates in usability testing were near-perfect across redesigned features.

  • Usage of core development tools grew by 30%+ post-launch.

  • Clients highlighted the platform’s clarity and actionability, noting that it gave managers visibility into team growth over time and employees greater ownership of their development.

  • Positioned the product as a key commercial driver, central to Thrive’s enterprise sales and investor pitch deck.

+20pts

Increase in client upgrade rate, from 24% to 44%

95%+

Near-perfect task completion rates in usability testing

30%+

Growth in usage of core tools

Process

1. Discovery

To understand why engagement was low, I led a multi-channel research process:

  • User research: Interviews and workshops to see how clients used features, what they valued, and where they struggled.

  • Qualitative synthesis: Analysis of support tickets, sales pitches, prospect feedback, and escalation notes to uncover recurring pain points.

  • Cognitive walkthroughs: Structured usability reviews with design to identify friction points.
    Models applied: Affinity mapping to cluster insights and Jobs-to-be-Done to clarify user needs.

2. Definition & Strategy

Insights were consolidated into a Product Requirements Document (PRD) co-created with psychology, design, engineering, and commercial teams, which I then prioritised as PM.

  • Feature prioritisation: Value vs. Effort framework to balance scalability and feasibility.

  • Metric strategy: Each feature had success metrics defined before development. For example:

    • Reports → scroll depth, heatmaps, pages per session, downloads/shares.

    • Feedback loops → frequency of use, completion rates, sentiment.
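The metric strategy above can be sketched as a simple lookup: each feature ships with targets defined before development, so post-launch numbers can be checked against them. All feature names, metric names, and target values below are illustrative, not the actual thresholds used at Thrive.

```python
# Hypothetical per-feature success metrics, defined before development.
# Names and target values are illustrative only.
FEATURE_METRICS = {
    "reports": {"scroll_depth_pct": 60, "downloads_per_week": 50},
    "feedback_loops": {"completion_rate_pct": 80},
}

def metrics_below_target(feature: str, observed: dict) -> list[str]:
    """Return the metrics whose observed value falls short of its target."""
    targets = FEATURE_METRICS[feature]
    return [name for name, target in targets.items()
            if observed.get(name, 0) < target]

# Example: reports hit the download target but fell short on scroll depth.
print(metrics_below_target(
    "reports", {"scroll_depth_pct": 45, "downloads_per_week": 70}))
# → ['scroll_depth_pct']
```

A check like this makes the "metrics feed into sprint planning" loop described later mechanical: any metric returned here becomes a candidate backlog item.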

3. Delivery

The redevelopment introduced agile practices at Thrive for the first time:

  • Two-week sprints with stand-ups, demos, and retros.

  • Engineering and psychology embedded in prototyping to ensure UX was both technically feasible and psychologically valid.

  • Incremental modernisation of the codebase alongside new modules.
    Model applied: Dual-Track Agile, with discovery and delivery running in parallel.

4. Iteration & Validation

Validation was continuous across cycles:

  • Prototype testing surfaced issues early, such as concerns around individual-level data privacy.

  • Amplitude and Hotjar were used to track engagement and behaviour at scale.

  • Iterations were made based on both analytics and qualitative insights.

5. Continuous Improvement

Post-launch, we established ongoing experimentation:

  • Each feature monitored against predefined metrics, feeding into sprint planning.

  • Larger clients engaged as pilot partners for complex features before scaling.
    Model applied: Continuous UX validation and data-driven iteration.

Workshop breaking down the usability issues in the old platform

“The results overview section in the new version is so much clearer. I really like the fact that my [Team Leads] can access their specific team results in as much detail as they want and get actionable insights in the report section.”

Anonymous user

Challenges Overcome

  • Keeping cross-functional teams aligned
    With psychologists, designers, and engineers all contributing, it was easy for priorities to drift. I solved this by running weekly check-ins that created space to critique ideas early, avoid wasted work, and make sure features were both technically feasible and behaviourally valid.

  • Securing leadership buy-in for a full rebuild
    At first there was limited appetite for major redevelopment. By consolidating fragmented client feedback and running structured user research, I built a clear story that showed where the product was falling short and why change was necessary.

  • Managing complexity during rapid iteration
    The pace of feature changes often caused confusion about what had been agreed. To overcome this, I introduced simple documentation and version tracking, which reduced misunderstandings, created transparency, and kept us moving forward with confidence.

