
Turning Skill Data into Confident Decisions

SkillsAnalysis dashboard showing leadership and behavioral skill insights

Role

Product Designer

Duration

8 weeks

Industry

Human Resources

Team

Product Manager, Engineering Team, Business Stakeholders, Designer, Marketing Team

Responsibilities

Workshop facilitation, Competitor Analysis, UX Design, UI Design, User Testing, Design System, Prototyping, Product Strategy

Core Problem

SkillsAnalysis aimed to help managers and HR leaders understand workforce capabilities and make better hiring, development, and planning decisions. In practice, the product was failing at its core promise: users had access to large volumes of skill data but lacked confidence in interpreting it. Instead of enabling decisions, the platform created hesitation. When users don't trust their interpretation, they don't act — a fatal flaw for a decision-support product.

Business Context and Stakes

1. Only ~30% took follow-up action

Users who opened a skill report struggled with exports, comparisons, and development planning.

2. Internal communication challenges

Users struggled to articulate skill insights internally, risking adoption and upsell to enterprise clients.

3. Decision-making delays

Misinterpretation slowed strategic hiring and L&D decisions across organizations.

If users couldn't confidently explain skill insights internally, the product risked being seen as interesting but non-essential, limiting expansion revenue and long-term adoption.

Discovery & Key Insight

Through interviews, usability testing, and support data analysis, a consistent pattern emerged: users weren’t struggling due to lack of data, but due to uncertainty in interpretation. Most users questioned whether their reading of skill scores was correct, repeatedly revisiting definitions and abandoning reports without taking action.

This reframed the problem from "how do we show skills better?" to "how do we help users feel confident making decisions from skill data?"

Interviewing candidates during discovery phase

Conducting moderated interviews with hiring managers to understand decision-making challenges

Design Strategy

🌟 Confidence before completeness

Users should understand the signal before seeing the full dataset.

🧠 Progressive disclosure by intent

Advanced detail should be available, but never forced.

Action over analysis

Every view should clearly suggest what to do next.

These principles weren't aspirational; they actively guided trade-offs across hierarchy, interaction depth, and visual emphasis. I anchored the work around one guiding idea: interpretation must come before information. The principles became a shared decision framework across product, data, and engineering, helping us evaluate trade-offs and avoid reverting to data-heavy defaults.

01. Progressive Information Hierarchy

Design Move

The original table showed 9 columns of raw scores with no hierarchy; users scrolled horizontally, unable to identify which metrics mattered. Only 2 of 8 managers could explain what made a candidate strong.

I restructured the experience into progressive layers. The default view consolidated 9 columns into 4 core signals: horizontal bars showing Leadership (green %) versus Behavioral gaps (red %), 5-star ratings, and expandable rows for detail. Users could now compare candidates at a glance: 86% green meant strong, with no mental math required.

Clicking expand revealed trait-level breakdowns with color-coded patterns: green checkered = strengths, red striped = interferences. This let users answer "Why this rating?" only when they needed to.

Key Decisions

  • Horizontal bars replaced numeric scores for instant visual comparison
  • 78% of users never expanded the detail view, suggesting the summary was sufficient
  • Removed competing "Overall" numeric scores

Result

Testing with 8 managers: 7 correctly ranked candidates, versus 2 before. Time to decision: 4:32 → 0:47 (~83% faster). Dashboard completion: 23% → 71% post-launch.

Before and After of progressive Information Hierarchy

02. Contextualizing Qualitative Insights

Design Move

Traits like "Ambivert" and "Extreme" appeared for every candidate, but users couldn't interpret them ("Is Extreme bad?"). The interface treated contextual traits as if they were scores to compare.

I made traits conditional: they appeared only when behaviorally significant. "Extreme" in red text flagged outliers. A red flag icon (🚩) appeared for serious interferences requiring review. Normal traits like "Norm" got no special treatment.

Hover states explained on-demand: "Ambivert: Balances intro/extroverted tendencies."

Key Decisions

  • Show only what's unusual, hide what's expected
  • Red flag = "pause before hiring"
  • Explanations behind hover, not upfront

Result

Before: 6 of 8 ignored traits entirely. After: 8 of 8 noticed flags, 7 clicked for context. Trust in insights: 2.8/5 → 4.1/5

Before and after of contextual presentation of qualitative indicators

03. Universal Rating System

Design Move

The original design used tiers (Gold A, Silver B) plus numeric scores (6.2, 5.4) that competed for attention. Users asked: "What's the difference?" "Is 6.2 good?" Two rating systems on one screen created confusion.

I replaced tiers with 5-star ratings. Everyone understands stars instantly—no explanation needed. Hover revealed depth for power users: "4 stars: 83% Leadership, 17% gaps."

Key Decisions

  • Universal language over proprietary tiers
  • Stars visible, data behind hover
  • Satisfies scanners and detail seekers

Result

Before: 5 of 8 asked "What do tiers mean?" After: 0 questions. Time to rank: 3:47 → 0:38 (~83% faster).

Before and after of universal rating system

Before and After

The Skills Analysis main dashboard plays a crucial role in candidate assessment, summarizing a candidate's leadership competencies and the behavioral tendencies that may interfere with those competencies. Despite its importance, the existing dashboard failed to clearly communicate what mattered most, leaving users uncertain rather than confident in their decisions. The updated dashboard addresses these challenges by presenting information in a more digestible way, enabling users to make faster and more intuitive decisions.

Before image of the dashboard UI
After image of the dashboard

Impact

70%

Dashboard completion rate

Increased from 23% to 70% after restructuring skill interpretation

80%

Faster decision-making from skill data

Average time for managers to correctly interpret a skill report reduced by 80%

55%

Monthly return usage

Increased from 30% to 55% as managers revisited skills for planning and comparison

My Strategic Contribution

Beyond interface design, my core contribution was reframing the product from data delivery to decision enablement. I established interpretation as a first-class product goal, influenced prioritization across teams, and created a shared language for evaluating complexity versus clarity.

What I'd Do Differently

With more time, I would instrument deeper analytics around confidence and hesitation signals earlier, allowing us to quantify trust more precisely. I'd also validate organizational-level views sooner, as individual comprehension improved faster than team-wide synthesis.

Reflection

This project reinforced that confident decision-making isn't about showing more data, but about making key insights clear and actionable. Progressive disclosure, contextual cues, and clear hierarchy helped users interpret information consistently, while embedding principles like "interpretation before information" ensured the product supported organizational adoption and long-term trust. Ultimately, designing for clarity and systemic understanding enables both individuals and teams to act with confidence.