Model Performance Optimisation
Find and fix performance bottlenecks using Performance Analyzer and DAX query view. Learn to reduce model size by removing unnecessary rows and columns, and improve query speed by reducing granularity.
Making your model fast
Think of packing for a holiday. If you pack everything you own, your suitcase is huge and slow to move. Pack only what you need, and you travel light and fast.
Same with Power BI models. Remove columns you don't use, eliminate rows you don't need, and simplify your DAX: your reports load faster, refresh quicker, and use less memory.
Power BI gives you two tools to find the bottlenecks: Performance Analyzer (which visuals are slow?) and DAX query view (which measures are inefficient?).
Performance Analyzer
Performance Analyzer measures how long each visual takes to render on a report page.
How to use it:
- On the Optimize tab in Power BI Desktop, select Performance Analyzer, then Start recording
- Interact with the report (change slicers, navigate pages)
- View results: each visual shows:
  - DAX query time: how long the measure calculations took
  - Visual display time: how long the visual took to render
  - Other: anything else (data retrieval, etc.)
Riley at Coastal Fresh notices her sales dashboard takes 8 seconds to load. Performance Analyzer shows:
- Map visual: 4.2s (DAX query)
- Sales matrix: 2.1s (DAX query)
- Cards: 0.1s each
The map is the bottleneck. She investigates the DAX query it generates.
Exam tip: what Performance Analyzer shows
Performance Analyzer captures three categories per visual:
- DAX query: The time to generate and execute the DAX query against the model. You can copy the query for further analysis.
- Visual display: Time for the visual to render on screen after data arrives.
- Other: Any additional processing time not covered by the above two categories.
For DirectQuery sources, the source query latency is included within the DAX query time; it's not shown as a separate bucket.
If DAX query time is high, optimise the measure, the model, or the source database (for DirectQuery). If visual display time is high, simplify the visual (fewer data points, simpler formatting).
DAX query view
DAX query view lets you write and test DAX queries directly against the model. Use it to:
- Test measure performance in isolation
- Analyse query plans
- Compare alternative DAX approaches
Kenji at Apex Manufacturing tests a slow measure:
```dax
EVALUATE
SUMMARIZECOLUMNS (
    'Date'[Month],
    "Revenue", [Total Revenue],
    "YoY Change", [YoY Change %]
)
```
He can run this query, see execution time, and iterate on the DAX until performance improves.
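A common iteration is to rewrite a measure so that an expensive sub-expression is evaluated once instead of repeatedly. The sketch below is illustrative: the measure names and the year-over-year logic are assumptions, not taken from Kenji's actual model.

```dax
-- Hypothetical slow version: the prior-year expression appears twice,
-- so the engine may evaluate it twice per cell
YoY Change % (before) =
DIVIDE (
    [Total Revenue]
        - CALCULATE ( [Total Revenue], SAMEPERIODLASTYEAR ( 'Date'[Date] ) ),
    CALCULATE ( [Total Revenue], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )
)

-- Rewritten with a variable: the prior-year value is computed once
-- and reused in both the numerator and the denominator
YoY Change % (after) =
VAR PriorYear =
    CALCULATE ( [Total Revenue], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )
RETURN
    DIVIDE ( [Total Revenue] - PriorYear, PriorYear )
```

Running both versions from DAX query view and comparing execution times shows whether the variable actually helps for this model.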
Reducing model size
Remove unnecessary columns
The fastest way to shrink a model is to remove columns nobody uses.
| Column Type | Action |
|---|---|
| Foreign keys used only in relationships | Keep (needed for relationships) |
| Descriptive columns used in visuals | Keep |
| Source system IDs (GUIDs, internal codes) | Remove (high cardinality, no reporting value) |
| Audit columns (CreatedBy, ModifiedDate) | Remove unless used in reports |
| Free-text columns (comments, notes) | Remove (very high cardinality, compresses poorly) |
| Duplicate columns (same data, different name) | Remove the duplicate |
Dr. Ethan at Bayview Medical imports a patient table with 45 columns. After an audit, only 12 are used in reports or relationships. Removing the 33 unused columns cuts model size by 60%.
Remove unnecessary rows
Filter data in Power Query to load only whatβs needed:
- Date range: Only load the last 3 years instead of all history
- Status filter: Exclude cancelled/archived records
- Region filter: Only load regions relevant to the report audience
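In Power Query's M language, these filters become a Table.SelectRows step so that excluded rows never reach the model. The sketch below is a minimal example; the source, table, and column names are illustrative assumptions.

```m
// Hypothetical filter step: keep only the last 3 years of
// non-cancelled orders before loading into the model
let
    Source = Sql.Database("server", "SalesDB"),   // illustrative source
    Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],
    RecentRows = Table.SelectRows(
        Orders,
        each [OrderDate]
                >= Date.AddYears(Date.From(DateTime.LocalNow()), -3)
            and [Status] <> "Cancelled"
    )
in
    RecentRows
```

Against a relational source like this one, the filter can fold back to the database, so the excluded rows are never even transferred.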
Best practice: remove in Power Query, not DAX
Always remove rows and columns in Power Query (before loading), not with DAX filters at query time. Power Query removal reduces model size and refresh time; DAX filtering still loads the data and merely hides it during queries.
Think of it as: Power Query decides what goes INTO the suitcase. DAX decides what you look at AFTER the suitcase is packed.
Reducing granularity
Granularity is the level of detail in your data. Transaction-level data (one row per sale) has high granularity. Daily summaries have lower granularity.
Nadia at Prism Agency has 50 million ad impression rows, but her dashboard only shows daily metrics per campaign. She uses Group By in Power Query to aggregate to the daily level, reducing 50 million rows to about 50,000.
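A Group By step like Nadia's compiles to Table.Group in M. This sketch assumes an Impressions table with CampaignID, ImpressionDate, Clicks, and Spend columns; the names are illustrative, not from her actual query.

```m
// Hypothetical Group By step: collapse impression-level rows
// to one row per campaign per day
GroupedRows = Table.Group(
    Impressions,
    {"CampaignID", "ImpressionDate"},
    {
        {"Impressions", each Table.RowCount(_), Int64.Type},
        {"Clicks", each List.Sum([Clicks]), Int64.Type},
        {"Spend", each List.Sum([Spend]), type number}
    }
)
```

Each aggregate column lists a name, a function applied to the grouped rows, and a type, which helps the loaded column compress well.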
| Granularity | Rows | Use When |
|---|---|---|
| Per transaction | 50M | Detailed drill-down needed |
| Per day | 50K | Daily trends are sufficient |
| Per week | 7K | Weekly reporting only |
| Per month | 600 | Monthly summary dashboards |
Trade-off: Lower granularity = faster model, but you lose the ability to drill into detail.
Performance optimisation checklist
| Action | Impact | Effort |
|---|---|---|
| Remove unused columns | High: biggest size reduction | Low |
| Remove unnecessary rows (date range, filters) | High: fewer rows means a smaller model | Low |
| Reduce granularity (aggregate in Power Query) | High: dramatically fewer rows | Medium |
| Avoid bi-directional cross-filtering | Medium: reduces query complexity | Low |
| Replace calculated columns with Power Query columns | Medium: transformation happens earlier and can benefit from query folding | Medium |
| Simplify complex DAX (use variables) | Medium: reduces evaluation time | Medium |
| Avoid DISTINCTCOUNT on high-cardinality columns | Medium: expensive operation | Low |
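To illustrate the "replace calculated columns with Power Query columns" row: the same derived value can be added as a Table.AddColumn step instead of a DAX calculated column. The table and column names here are hypothetical.

```m
// Hypothetical step: derive Margin in Power Query rather than as a
// DAX calculated column, so it is computed at refresh time and
// compressed on load (and may fold to the source)
AddedMargin = Table.AddColumn(
    Orders,
    "Margin",
    each [Revenue] - [Cost],
    type number
)
```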
Knowledge check
Dr. Ethan's patient dashboard loads slowly. Performance Analyzer shows the map visual takes 5 seconds (DAX query: 4.8s, visual display: 0.2s). What should he investigate first?
Nadia has a 50-million-row impressions table but only needs daily campaign metrics. What's the best optimisation?
Video coming soon
Next up: Choosing and Formatting Visuals. Pick the right chart for your data story.