The 2-Minute Gist
Many dashboards fail because teams conflate KPIs, metrics, and dimensions. This guide clarifies the hierarchy:
- Performance Dimensions (The Why): Context like 'Efficiency' or 'Target Achievement'.
- KPIs (The What): Specific indicators like 'Number of Visits'.
- Metrics (The How): The exact calculation rules and formulas.
Ask ten teams what a KPI is, and you’ll get ten different answers.
Some will say:
- “Anything we track”
- “A number on the dashboard”
- “A chart the leadership looks at”
Others will use metrics, KPIs, and indicators interchangeably, assuming they all mean the same thing.
They don’t.
This confusion is not just semantic. It’s the root cause of broken dashboards.
If your dashboards feel cluttered, confusing, or ignored, chances are you’re mixing up performance dimensions, KPIs, and metrics, designing everything at the same level.
Let’s fix that.
Why This Distinction Matters More Than You Think
Dashboards fail when:
- Everything looks equally important
- No one knows which number actually matters
- Metrics increase, but outcomes don’t improve
- Leaders ask questions dashboards can’t answer
The problem isn’t data quality. It’s conceptual design.
High-performing analytics systems follow a clear hierarchy:
Performance Dimensions → KPIs → Metrics & Calculations
Most dashboards collapse all three into one layer, and that’s where things go wrong.
Layer 1: Performance Dimensions (The “Why”)
Performance dimensions define how success is interpreted.
They answer the question:
What aspect of performance are we trying to understand?
Examples of performance dimensions include:
- Target Achievement
- Performance Progression
- Trend Analysis
- Peer Comparison
- Efficiency
- Coverage
- Quality
Notice something important: 👉 Performance dimensions are not numbers.
They are lenses through which performance is viewed.
Why Performance Dimensions Are Critical
Consider this number:
- “1,250 field visits”
Is this good or bad?
You can’t tell unless you know:
- Was the target 1,000 or 2,000? (Target Achievement)
- Is this improving month over month? (Trend)
- Is this better or worse than peer districts? (Peer Comparison)
Performance dimensions provide context. Without them, dashboards show activity, not insight.
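The point can be sketched in a few lines of Python. All numbers here are hypothetical, but they show how the same raw count reads completely differently under each dimension:

```python
# One raw KPI value, viewed through three performance dimensions.
# All figures are illustrative, not real program data.

visits_actual = 1250
visits_target = 2000          # Target Achievement lens
visits_last_month = 1100      # Trend lens
peer_average = 1400           # Peer Comparison lens

target_pct = visits_actual / visits_target * 100
trend_delta = visits_actual - visits_last_month
vs_peers = visits_actual - peer_average

print(f"Target achievement: {target_pct:.1f}%")   # 62.5% -> behind target
print(f"Trend vs last month: {trend_delta:+d}")   # +150  -> improving
print(f"Vs peer average: {vs_peers:+d}")          # -150  -> below peers
```

Same number, three verdicts: behind target, improving, below peers. No single figure on a dashboard can carry all three meanings on its own.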
Common Mistake #1
Treating KPIs as dimensions.
For example:
- “Number of Visits” shown once
- No comparison
- No trend
- No benchmark
This forces decision-makers to manually interpret meaning, which defeats the purpose of dashboards.
Layer 2: KPIs (The “What”)
Key Performance Indicators translate performance dimensions into specific, trackable indicators.
If the dimension is:
- Target Achievement
Then relevant KPIs might be:
- Number of field visits
- Percentage of planned activities completed
- Budget utilization
KPIs are:
- Measurable
- Actionable
- Aligned to objectives
But here’s the catch:
👉 The same KPI can exist under multiple performance dimensions.
Example: One KPI, Multiple Dimensions
KPI: Number of Field Visits
It can be used to analyze:
- Target Achievement → Actual vs planned visits
- Trends → Month-on-month movement
- Peer Comparison → District vs district
- Progression → Cumulative vs incremental
This is why simply listing KPIs on a dashboard is not enough. The dimension determines how the KPI is interpreted.
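A rough sketch of this reuse, with made-up district data: one KPI feeds every dimension, so there is only one definition of “field visits” to maintain.

```python
# One KPI (field visits), interpreted through several dimensions.
# District names and numbers are illustrative only.

visits = {                               # month -> district -> count
    "Jan": {"North": 400, "South": 350},
    "Feb": {"North": 450, "South": 300},
}
planned = {"North": 500, "South": 400}   # monthly targets

feb = visits["Feb"]

# Target Achievement: actual vs planned, latest month
achievement = {d: feb[d] / planned[d] * 100 for d in feb}

# Trend: month-on-month movement
trend = {d: visits["Feb"][d] - visits["Jan"][d] for d in feb}

# Peer Comparison: district vs district, latest month
best = max(feb, key=feb.get)

# Progression: cumulative vs incremental
cumulative = {d: visits["Jan"][d] + visits["Feb"][d] for d in feb}
```

Four views, zero duplicated KPI definitions: the dimensions are computed from the same underlying values.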
Common Mistake #2
Creating hundreds of KPIs instead of reusing KPIs across dimensions.
This leads to:
- Bloated dashboards
- Redundant data
- Confused stakeholders
Mature systems use fewer KPIs, applied across multiple dimensions.
Layer 3: Metrics & Calculations (The “How”)
Metrics define how KPIs are measured.
This is the most technical layer and the most dangerous if not standardized.
Metrics answer:
- What exactly is counted?
- How is it calculated?
- Which data fields are used?
- What happens with missing or late data?
The Three Attributes of a Metric
A robust metric definition includes:
1. Measurement Domain: What is being measured?
- Single-stage (e.g., a visit)
- Multi-stage (e.g., project lifecycle phases)
This is crucial in programs that evolve over time.
2. Measurement Expression: How is it expressed?
- Count
- Percentage
- Ratio
- Index
For example:
- “Number of visits” (count)
- “Percentage of target achieved” (percentage)
3. Measurement Calculation: How is it computed?
- Formula
- Dependencies
- Data sources
For example:
- Percentage Achievement = (Actual Visits / Planned Visits) × 100
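The three attributes can live together as a single, centrally owned definition rather than in scattered spreadsheets. A hypothetical Python sketch (the `Metric` class and field names are illustrative, not a specific platform’s API):

```python
# A metric as one governed object: domain, expression, and calculation
# defined in a single place so every team computes the same number.

from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class Metric:
    name: str
    domain: str                       # what is measured, e.g. "visit"
    expression: str                   # count, percentage, ratio, index
    calculate: Callable[..., float]   # the one authoritative formula

pct_achievement = Metric(
    name="Percentage Achievement",
    domain="visit",
    expression="percentage",
    calculate=lambda actual, planned: actual / planned * 100,
)

print(pct_achievement.calculate(actual=1250, planned=2000))  # 62.5
```

Every dashboard, alert, and report then calls the same `calculate`, so the number cannot drift between teams.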
Without this clarity:
- Different teams compute differently
- Numbers don’t match
- Trust erodes
Common Mistake #3
Letting metric logic live in:
- Excel files
- Ad-hoc SQL queries
- Individual analysts’ notebooks
This creates:
- Inconsistency
- Rework
- Endless reconciliation meetings
Platforms like ViewZen Analytics centralize metric definitions, ensuring:
- One version of truth
- Consistent dashboards
- Reliable alerts
How These Three Layers Work Together (End-to-End)
Let’s put it all together.
Example Framework
- Performance Dimension: Target Achievement
- KPI: Number of Field Visits
- Metric: Count of visits with valid visit_date
Now add:
- Granularity → Organization → Location → Sublocation
- Time → Month-to-date, Year-to-date
- Benchmark → Monthly target
- Alert → Below the 10th percentile
Suddenly, the same data becomes:
- Comparable
- Actionable
- Governable
That’s the power of hierarchy.
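Putting the layers together, a minimal sketch with made-up district data. For simplicity the alert rule here flags the bottom 10% of the achievement range rather than computing a true percentile, and the target figure is hypothetical:

```python
# End-to-end: metric -> benchmark -> alert, over several districts.
# District counts and the target are illustrative.

district_visits = {"A": 1250, "B": 900, "C": 1800, "D": 700}
monthly_target = 1500                      # benchmark layer

def pct_achievement(actual, planned):      # metric layer (one formula)
    return actual / planned * 100

scores = {d: pct_achievement(v, monthly_target)
          for d, v in district_visits.items()}

# Alert layer: flag districts in the bottom 10% of the achievement range
# (a simplification of a percentile rule, for brevity)
low, high = min(scores.values()), max(scores.values())
threshold = low + 0.10 * (high - low)
alerts = [d for d, s in scores.items() if s <= threshold]

print(alerts)  # ['D']
```

The same visit counts now yield a comparable score per district and an automatic flag for the laggard, without anyone re-deriving the formula.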
Why Most Dashboards Skip Performance Dimensions
Because:
- Tools don’t enforce this thinking
- Teams rush to “show data”
- Visual design is prioritized over logic
- Dimensions feel abstract at first
But once introduced, performance dimensions become:
- The backbone of reviews
- The language of leadership discussions
- The structure for accountability
Designing Dashboards That Scale
When dashboards are designed using this hierarchy:
- Adding new KPIs becomes easier
- Alerts become meaningful
- Drill-downs feel natural
- Governance is built-in
This is especially critical in:
- Government programs
- Large enterprises
- Multi-region operations
- Long-running initiatives
Dashboards stop being static reports and become management systems.
How ViewZen Analytics Applies This Framework in Practice
In platforms like ViewZen Analytics, this hierarchy is embedded at design time:
- Performance dimensions are first-class concepts
- KPIs are reused, not duplicated
- Metrics are centrally governed
- Alerts, access, and drill-downs inherit this structure
This allows organizations to:
- Scale dashboards without chaos
- Evolve KPIs without rework
- Maintain trust in numbers over years
A Simple Checklist for Your Next Dashboard
Before finalizing any dashboard, ask:
- What performance dimension does this support?
- Is the KPI reused across dimensions?
- Is the metric definition standardized?
- Can targets and benchmarks be applied?
- Can alerts be triggered meaningfully?
- Does this enable a decision?
If any answer is unclear, the dashboard isn’t ready.
Closing Thought
KPIs don’t fail.
Metrics don’t fail.
Dashboards fail when we mix layers and lose meaning.
Once you separate:
- Why (Dimensions)
- What (KPIs)
- How (Metrics)
Analytics stops being noisy and starts being powerful.