The skill performance views show you how people performed in specific sessions. Use them to evaluate training effectiveness, identify skill gaps, and see who needs more practice.
Performance metrics reflect the specific sessions you have filtered to. When you change your date range, scenario, or group filters, the numbers update accordingly: performance answers "How did these sessions go?" for the exact context you're looking at. For a stable, person-level measure of capability that doesn't change with filters, see Skill Proficiency and Reports.

Understanding Performance Metrics

| Metric | What It Means |
| --- | --- |
| First | Score from the participant's earliest session in your filtered view |
| Best | Highest score achieved across all sessions in view |
| Average | Mean score across all sessions in view |
| Lift | Best minus First; shows improvement. Only appears for participants with 2+ sessions. |
Exec shows Best rather than Last because it surfaces what people are capable of at their peak. If someone’s most recent session was an off day, Best still captures their demonstrated ability.
When viewing aggregate tables, these metrics are averaged across participants: Avg. First, Avg. Best, and Avg. Lift (only participants with 2+ sessions contribute to Avg. Lift).
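Conceptually, these metrics reduce to simple arithmetic over a participant's ordered session scores. The sketch below is illustrative only; the function names and data shapes are assumptions, not Exec's API:

```python
def participant_metrics(scores):
    """Compute First, Best, Average, and Lift from one participant's
    session scores, ordered earliest to latest (illustrative sketch)."""
    first = scores[0]
    best = max(scores)
    average = sum(scores) / len(scores)
    # Lift only appears for participants with 2+ sessions
    lift = best - first if len(scores) >= 2 else None
    return {"first": first, "best": best, "average": average, "lift": lift}


def avg_lift(all_scores):
    """Avg. Lift across participants; only those with 2+ sessions contribute."""
    lifts = [max(s) - s[0] for s in all_scores if len(s) >= 2]
    return sum(lifts) / len(lifts) if lifts else None
```

For example, a participant who scored 60, then 72, then 85 would show First 60, Best 85, and Lift 25, while a single-session participant shows no Lift at all.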

Skill Performance in Roleplay Analytics

Navigate to Roleplays → Analytics in the left sidebar and scroll past the summary metrics to find the skill performance views.
| Filter | What It Does |
| --- | --- |
| Date Range | Limits to sessions within a specific time period |
| Users | Limits to specific people |
| Groups | Limits to members of selected groups |
| Programs | Limits to sessions from specific learning programs |
| Roleplays | Limits to specific roleplay assignments |
| Scenarios | Limits to specific scenarios |
| Skills | Limits to specific skills |

Skill Performance Table

Shows one row per skill with aggregate metrics across all participants in your filtered view.
| Column | What It Shows |
| --- | --- |
| Skill Name | Click to open the skill drilldown |
| # Participants | Count of people with sessions for this skill |
| Avg. First | Average first-session score across participants |
| Avg. Best | Average best score across participants |
| Avg. Lift | Average improvement (participants with 2+ sessions only) |
Score cells are color-coded: dark green (90+), light green (75+), orange (50+), red (below 50). Use the Skip unpracticed toggle to hide skills with no sessions in your filtered view.
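The color coding maps directly onto score bands. A minimal sketch of that mapping (the function name and tier strings are descriptive assumptions, not the product's internal labels):

```python
def score_color(score):
    """Map a 0-100 score to the color tier used in the table."""
    if score >= 90:
        return "dark green"
    if score >= 75:
        return "light green"
    if score >= 50:
        return "orange"
    return "red"
```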

Skill Drilldown

Click any skill row to open a side drawer with detailed performance data for that skill:
  • Performance summary — Total participants, total sessions, average score, first, best, and lift
  • First Score Distribution — Bar chart showing how many participants’ first scores fell into each tier (Good, Fair, Needs Improvement)
  • Best Score Distribution — Same format for best scores
  • Participant table — Each person’s session count, first score, best score, and lift. Sortable by any column. Default sort surfaces who needs the most attention.
Click a participant’s name to open all their sessions for that skill in a new tab. Use the arrows at the top to navigate between skills.

User × Skill Heatmap

Below the Skill Performance Table, a heatmap grid shows every participant’s performance across every skill.
  • Rows — Participants (with avatar and total session count)
  • Columns — Skills (loads more as you scroll right)
  • Cells — Color-coded by average score, displaying the rounded percentage
Hover over any cell to see a tooltip with participant name, skill name, average score, session count, score distribution, and last session time.
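Each heatmap cell is essentially the rounded average of that participant's scores for that skill. A toy aggregation, assuming sessions arrive as (participant, skill, score) records (the data shape is an assumption for illustration):

```python
from collections import defaultdict


def heatmap_cells(sessions):
    """Build {(participant, skill): rounded average score} from
    (participant, skill, score) session records (illustrative sketch)."""
    totals = defaultdict(lambda: [0, 0])  # running (sum, count) per cell
    for participant, skill, score in sessions:
        cell = totals[(participant, skill)]
        cell[0] += score
        cell[1] += 1
    return {key: round(total / count) for key, (total, count) in totals.items()}
```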

Heatmap Drilldown

Click any cell to open a drilldown drawer for that specific person × skill combination:
  • Performance summary — Sessions, average score, first, best, lift, and score distribution
  • Recent sessions — The 5 most recent sessions with scenario name, time, and score badge. Click any to view the full session detail.
  • Assign Scenarios — Scenarios tagged with this skill. Select one or more and click Assign to send them directly to the participant.
Use Assign Scenarios to take immediate action. If someone is struggling with a skill, you can assign relevant practice scenarios right from the analytics view.

Scenario-Specific Analytics

When you view analytics on a specific scenario page, the same skill performance views appear but are automatically scoped to that scenario. Navigate to your scenario and click the Analytics tab.

Evaluation Criteria Heatmap

Scenario analytics also includes an Evaluation Criteria Heatmap not available in the main Roleplay Analytics view. You may need to scroll down the page to view this. It shows performance at the individual criterion level:
  • Grouped by criterion (rubric section headers)
  • Each criterion row shows Avg. First Score, Avg. Best Score, and Avg. Lift
  • Criterion items appear as nested rows with the same metrics
This view helps identify which specific behaviors within a scenario are driving skill scores up or down.

Skill Performance in Call Analytics

Call Analytics provides the same skill performance views as Roleplay Analytics, with a few differences.
  • Participants vs. Users — Call analytics uses “participants” because people on calls may not have Exec accounts
  • Call Categories vs. Scenarios — Filter by call category instead of scenario
| Filter | What It Does |
| --- | --- |
| Date Range | Limits to calls within a specific time period |
| Call Categories | Limits to specific call types |
| Users / Participants | Limits to specific people |
| Groups | Limits to members of selected groups |
| Skills | Limits to specific skills |
The Call Skill Performance Table and Participant × Skill Heatmap work the same way as their roleplay equivalents.
To find skill information for Calls, head to Calls → Dashboard, then click the category you'd like to view skills in. Scroll until you see Skill Performance.

Score Attribution

How call skill scores are attributed to participants depends on how the scoring rubric is configured per criterion item:
  • Only participants who spoke during the relevant portion of the call receive the score; this is the most precise attribution method.
  • All internal participants receive the score regardless of who spoke; useful for criteria that apply to the whole team (e.g., "Meeting was well-structured").
  • The score is tracked at the call level only and is not attributed to any individual; it contributes to aggregate call analytics but doesn't affect individual skill tracking.
Attribution is configured per criterion item in the scoring rubric, not per call. If you’re not seeing expected skill data for specific participants, check the attribution settings on the relevant criterion items.
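The three attribution behaviors above can be modeled roughly as follows. This is a sketch under stated assumptions: the mode names `speaker`, `all_internal`, and `call_level` are illustrative labels, not Exec's actual configuration values:

```python
def attribute_score(mode, score, speakers, internal_participants):
    """Return {participant: score} for one criterion item, by attribution
    mode. An empty result means the score stays at the call level only.
    Mode names here are illustrative, not product configuration values."""
    if mode == "speaker":
        # Only people who spoke during the relevant portion get the score
        return {person: score for person in speakers}
    if mode == "all_internal":
        # Every internal participant gets the score, speaker or not
        return {person: score for person in internal_participants}
    if mode == "call_level":
        # Tracked on the call itself; no individual attribution
        return {}
    raise ValueError(f"unknown attribution mode: {mode}")
```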

Getting Help

Need help? Contact us at [email protected] for guidance on using skill performance data to drive training decisions.