
DP-600 Study Guide

Domain 1: Maintain a Data Analytics Solution

  • Workspace Access Controls
  • Row-Level & Object-Level Security
  • Sensitivity Labels & Endorsement
  • Git Version Control in Fabric
  • Deployment Pipelines: Dev → Test → Prod
  • Impact Analysis & Dependencies
  • XMLA Endpoint & Reusable Assets

Domain 2: Prepare Data

  • Microsoft Fabric: The Big Picture Free
  • Lakehouses: Your Data Foundation Free
  • Warehouses in Fabric Free
  • Choosing the Right Data Store Free
  • Data Connections & OneLake Catalog
  • Shortcuts & OneLake Integration
  • Ingesting Data: Dataflows Gen2 & Pipelines
  • Star Schema Design Free
  • SQL Objects: Views, Functions & Stored Procedures
  • Transforming Data: Reshape & Enrich
  • Data Quality & Cleansing
  • Querying with SQL
  • Querying with KQL
  • Querying with DAX

Domain 3: Implement and Manage Semantic Models

  • Semantic Models: Storage Modes
  • Relationships & Advanced Modeling
  • DAX Essentials: Variables & Functions
  • Calculation Groups & Field Parameters
  • Large Models & Composite Models
  • Direct Lake Mode
  • DAX Performance Optimization
  • Incremental Refresh

Domain 3: Implement and Manage Semantic Models · ⏱ ~12 min read

Large Models & Composite Models

Scale beyond default limits. Large semantic model storage format and composite model design — when and how to use each for enterprise analytics.

When default limits are not enough

☕ Simple explanation

Think of a suitcase with a weight limit.

A standard suitcase (semantic model) has a weight limit — typically around 10 GB compressed. If your data weighs more than that, you have two options: get a bigger suitcase (large model format) or split your trip across a carry-on and checked bag (composite model).

Large model format increases the size limit. Composite models let you mix local and remote data. Both are enterprise patterns for when your data outgrows the defaults.

By default, Power BI semantic models have a size limit determined by capacity SKU (e.g., ~10 GB for P1/F64). The large semantic model storage format removes this per-model limit, allowing models to grow beyond 10 GB (up to the total capacity memory). This enables multi-billion-row fact tables and wide dimension tables.
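
The sizing decision reduces to simple arithmetic. A minimal Python sketch, using the ~10 GB default per-model limit quoted above (the capacity memory figure passed in is a placeholder, not an official SKU specification):

```python
# Illustrative sketch of the sizing decision described above.
# DEFAULT_MODEL_LIMIT_GB comes from the ~10 GB figure in the text;
# the capacity memory value is a placeholder assumption.

DEFAULT_MODEL_LIMIT_GB = 10.0  # approximate default per-model limit (P1/F64)

def storage_recommendation(model_size_gb: float, capacity_memory_gb: float) -> str:
    """Rough recommendation for a given compressed model size."""
    if model_size_gb <= DEFAULT_MODEL_LIMIT_GB:
        return "default format is sufficient"
    if model_size_gb <= capacity_memory_gb:
        return "enable large semantic model storage format"
    return "exceeds capacity memory: consider a composite model or a larger SKU"

print(storage_recommendation(15, 25))  # -> enable large semantic model storage format
```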

Composite models combine tables with different storage modes (Import, DirectQuery, Direct Lake) in a single model, enabling cross-source analytics without moving all data into one location.

Large semantic model storage format

What it does

| Feature | Default format | Large format |
| --- | --- | --- |
| Model size limit | ~10 GB (capacity-dependent) | Up to total capacity memory |
| Segment size | ~1 million rows | ~4 million rows (larger segments) |
| Compression | Standard VertiPaq | Enhanced VertiPaq with larger segments |
| Refresh | Full or incremental | Full or incremental (incremental especially beneficial) |
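
To see why larger segments matter, here is a back-of-envelope sketch of the segment arithmetic, using the approximate segment sizes from the table above (the row count is illustrative):

```python
import math

# Segment-count arithmetic for a large fact table, using the approximate
# segment sizes quoted above (~1M rows default vs ~4M rows large format).
# Fewer, larger segments generally compress and scan more efficiently.

def segment_count(rows: int, segment_size: int) -> int:
    """Number of VertiPaq segments needed for a table of `rows` rows."""
    return math.ceil(rows / segment_size)

rows = 500_000_000  # illustrative fact-table size
print(segment_count(rows, 1_000_000))  # 500 segments (default format)
print(segment_count(rows, 4_000_000))  # 125 segments (large format)
```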

When to enable it

  • Your model exceeds 10 GB compressed
  • You have fact tables with billions of rows
  • You are using incremental refresh (large format works well with incremental refresh, especially for real-time data partitions)
  • XMLA endpoint read/write is needed for management tools

How to enable it

  1. In the Power BI portal or Fabric workspace, go to Semantic model settings
  2. Under Large dataset storage format, toggle it On
  3. Apply changes — the model reformats on next refresh
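
The toggle can also be scripted. A hedged sketch building the call for the Power BI REST API's Update Dataset endpoint, which accepts a `targetStorageMode` of `PremiumFiles` (large format) or `Abf` (default); the workspace and dataset IDs are placeholders you would supply, and the endpoint details should be verified against Microsoft Learn:

```python
# Sketch: enable large format programmatically via the Power BI REST API
# (Datasets - Update Dataset In Group). The "PremiumFiles" / "Abf" values
# reflect the public API docs; verify against Microsoft Learn.
# GROUP_ID / DATASET_ID are placeholders.

GROUP_ID = "<workspace-id>"
DATASET_ID = "<dataset-id>"

def build_large_format_request(group_id: str, dataset_id: str):
    """Build the PATCH URL and JSON body that switch a model to large format."""
    url = (
        f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
        f"/datasets/{dataset_id}"
    )
    body = {"targetStorageMode": "PremiumFiles"}  # "Abf" reverts to default
    return url, body

url, body = build_large_format_request(GROUP_ID, DATASET_ID)
# Send with: requests.patch(url, json=body,
#                           headers={"Authorization": f"Bearer {token}"})
```
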

💡 Exam tip: Large format prerequisites

Large format requires:

  • Premium or Fabric capacity (P1/F64 or higher) β€” not available on shared capacity
  • Incremental refresh configured (recommended but not strictly required)
  • Model must be published to the service (cannot enable in Desktop)

Large format is especially valuable with incremental refresh — only new/changed partitions refresh, keeping refresh times manageable even for enormous models.

Composite models in depth

A composite model mixes storage modes within one semantic model. You saw the concept in the Storage Modes module — here we go deeper.

Architecture patterns

Pattern 1: Direct Lake + Import (most common in Fabric)

Direct Lake tables        Import tables
─────────────────         ────────────
fact_sales (500M rows)    dim_exchange_rates (200 rows)
fact_returns (10M rows)   dim_holidays (365 rows)

Large fact tables stay as Direct Lake (fast, no refresh). Small, slow-changing reference tables are imported for maximum query speed.

Pattern 2: Direct Lake + DirectQuery (cross-source)

Direct Lake tables            DirectQuery tables
─────────────────             ──────────────────
fact_sales (Fabric lakehouse) fact_crm (Salesforce)
dim_product (Fabric)          dim_crm_contacts (Salesforce)

Fabric data via Direct Lake, external CRM data via DirectQuery. One model serves both.

Pattern 3: Import + DirectQuery (classic hybrid)

Import tables                 DirectQuery tables
─────────────                 ──────────────────
dim_product (imported)        fact_realtime (live SQL)
dim_date (imported)
agg_monthly (imported)

Historical aggregates imported for speed; live detail table via DirectQuery for drilldown.
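
The three patterns above share one decision: pick a storage mode per table from its size, volatility, and source. A minimal sketch of that logic (the thresholds are illustrative assumptions, not official guidance):

```python
# Hedged sketch of the per-table decision behind Patterns 1-3 above.
# The 1M-row threshold is an illustrative assumption, not a documented rule.

def choose_storage_mode(rows: int, in_fabric: bool, needs_live: bool) -> str:
    """Pick a storage mode for one table in a composite model."""
    if needs_live and not in_fabric:
        return "DirectQuery"   # live external sources (e.g. the CRM tables)
    if in_fabric and rows > 1_000_000:
        return "Direct Lake"   # large Fabric-resident fact tables
    return "Import"            # small, slow-changing reference tables

print(choose_storage_mode(500_000_000, in_fabric=True, needs_live=False))  # Direct Lake
print(choose_storage_mode(200, in_fabric=False, needs_live=False))         # Import
print(choose_storage_mode(5_000_000, in_fabric=False, needs_live=True))    # DirectQuery
```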

Composite model considerations

Composite models add flexibility but require careful design:

| Consideration | Impact |
| --- | --- |
| Relationship storage | Relationships between different storage modes are 'limited' — some DAX functions behave differently |
| Security | RLS on Import tables works as expected; RLS on DirectQuery tables is pushed to the source |
| Performance | Cross-mode joins are slower than same-mode joins; keep frequently joined tables in the same mode |
| Chaining | Users can chain DirectQuery to a published model, creating 'composite models over composite models' |

💡 Scenario: James designs a multi-client composite model

James at Summit Consulting builds a composite model for a client that needs:

  1. 3 years of sales data (2 billion rows) from a Fabric lakehouse → Direct Lake
  2. Live CRM pipeline data from Salesforce → DirectQuery
  3. Exchange rates (200 rows, updated weekly) → Import
  4. Target budgets (50 rows per department) → Import

The composite model connects all four. The relationship between the Direct Lake and DirectQuery tables is "limited", so James places the dimension tables shared between both sources in Direct Lake for best performance.

Question

What does the large semantic model storage format do?

Answer

Removes the default per-model size limit (~10 GB) and allows models to grow up to total capacity memory. It uses larger VertiPaq segments (~4M rows vs ~1M) for better compression. Requires Premium/Fabric capacity.

Question

What is a 'limited' relationship in a composite model?

Answer

A relationship between tables with different storage modes (e.g., Import and DirectQuery). Limited relationships may restrict certain DAX functions and can be slower than same-mode relationships. Design tip: keep frequently joined tables in the same storage mode.

Question

What is a composite model?

Answer

A semantic model that mixes tables with different storage modes (Import, DirectQuery, Direct Lake) in a single model. This enables cross-source analytics — e.g., Fabric lakehouse data via Direct Lake combined with live CRM data via DirectQuery — without moving all data into one location.

Knowledge Check

Raj at Atlas Capital has a semantic model that reaches 15 GB compressed. The model is on an F64 Fabric capacity with a default 10 GB per-model limit. What should Raj do?

Knowledge Check

James at Summit Consulting builds a composite model with fact_sales (Direct Lake, 2B rows) and crm_contacts (DirectQuery to Salesforce). A report visual joins both tables. What should James expect?


Next up: Direct Lake Mode — configure Fabric's recommended storage mode, including fallback behavior and OneLake vs SQL endpoints.


© 2026 Sutheesh. All rights reserved.

Guided is an independent study resource and is not affiliated with, endorsed by, or officially connected to Microsoft. Microsoft, Azure, and related trademarks are property of Microsoft Corporation. Always verify information against Microsoft Learn.