
DP-700 Study Guide

Domain 1: Implement and Manage an Analytics Solution

  • Workspace Settings: Your Fabric Foundation
  • Version Control: Git in Fabric
  • Deployment Pipelines: Dev to Production
  • Access Controls: Who Gets In
  • Data Security: Control Who Sees What
  • Governance: Labels, Endorsement & Audit
  • Orchestration: Pick the Right Tool
  • Pipeline Patterns: Parameters & Expressions

Domain 2: Ingest and Transform Data

  • Delta Lake: The Heart of Fabric
  • Loading Patterns: Full, Incremental & Streaming
  • Dimensional Modeling: Prep for Analytics
  • Data Stores & Tools: Make the Right Choice
  • OneLake Shortcuts: Data Without Duplication
  • Mirroring: Real-Time Database Replication
  • PySpark Transformations: Code Your Pipeline
  • Transform Data with SQL & KQL
  • Eventstreams & Spark Streaming: Real-Time Ingestion
  • Real-Time Intelligence: KQL & Windowing

Domain 3: Monitor and Optimize an Analytics Solution

  • Monitoring & Alerts: Catch Problems Early
  • Troubleshoot Pipelines & Dataflows
  • Troubleshoot Notebooks & SQL
  • Troubleshoot Streaming & Shortcuts
  • Optimize Lakehouse Tables: Delta Tuning
  • Optimize Spark: Speed Up Your Code
  • Optimize Pipelines & Warehouses
  • Optimize Streaming: Real-Time Performance

Domain 1: Implement and Manage an Analytics Solution (~12 min read)

Deployment Pipelines: Dev to Production

Create and configure deployment pipelines to promote Fabric content safely across development, test, and production workspaces.

What are deployment pipelines?

β˜• Simple explanation

Think of a restaurant kitchen with three stations.

Station 1 (Dev) is where the chef experiments with new recipes. Station 2 (Test) is where a taster checks the dish. Station 3 (Production) is where it goes to the customer’s table.

A deployment pipeline in Fabric is that three-station system for your analytics content. You build in Dev, validate in Test, and promote to Production. The pipeline handles the transfer β€” you don’t manually copy anything. If a pipeline breaks in Test, Production stays untouched.

Deployment pipelines in Microsoft Fabric are a release management tool that enables controlled promotion of content across up to 10 stages (typically Development β†’ Test β†’ Production). Each stage maps to a Fabric workspace.

When you deploy from one stage to the next, Fabric compares the items in both workspaces, identifies what’s new, changed, or deleted, and applies only the diff. Deployment rules let you swap data sources, connection strings, and parameters between stages β€” so your Dev pipeline reads from a dev database while Production reads from the real one.

Deployment pipelines work alongside Git integration: Git tracks what changed (version control), while deployment pipelines control when and where changes are promoted (release management).

How deployment pipelines work

The stage model

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”    Deploy    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”    Deploy    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚   Dev    β”‚ ──────────► β”‚   Test   β”‚ ──────────► β”‚  Production  β”‚
β”‚ Workspaceβ”‚              β”‚ Workspaceβ”‚              β”‚  Workspace   β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜              β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜              β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
  • Each stage is linked to exactly one Fabric workspace
  • Deploy forward promotes content from one stage to the next
  • Deploy backward is also supported (e.g., resetting Test from Production)
  • Up to 10 stages β€” most teams use 3 (Dev/Test/Prod)

What happens during deployment

  1. Compare: Fabric compares the items in the source and target stages
  2. Diff: shows what's new, modified, deleted, or unchanged
  3. Review: you decide which items to include in this deployment
  4. Deploy: selected items are copied to the target workspace
  5. Apply rules: deployment rules swap parameters, data sources, and connections
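
The compare-and-diff step can be pictured with a small sketch. This is an illustrative model only, not the Fabric API: item names and version hashes are made up, and the real comparison works on item definitions, not strings.

```python
# Illustrative model of the deployment comparison: classify items by
# diffing the contents of the source and target stage workspaces.
def compare_stages(source: dict, target: dict) -> dict:
    """source/target map item name -> content hash (hypothetical model)."""
    diff = {"new": [], "modified": [], "deleted": [], "unchanged": []}
    for name, version in source.items():
        if name not in target:
            diff["new"].append(name)          # exists only in source
        elif target[name] != version:
            diff["modified"].append(name)     # exists in both, content differs
        else:
            diff["unchanged"].append(name)
    diff["deleted"] = [name for name in target if name not in source]
    return diff

dev = {"sales_notebook": "v3", "etl_pipeline": "v1", "report": "v2"}
test = {"sales_notebook": "v2", "report": "v2", "old_dataset": "v1"}
print(compare_stages(dev, test))
# {'new': ['etl_pipeline'], 'modified': ['sales_notebook'],
#  'deleted': ['old_dataset'], 'unchanged': ['report']}
```

Only the items you select from the "new" and "modified" buckets are copied forward, which is why a deployment touches nothing you did not review.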

Deployment rules

Deployment rules are the key to making the same content work across environments. Without them, your production pipeline would try to read from your dev database.

  • Data source rules: Dev reads from dev-sqlserver.database.windows.net; Prod reads from prod-sqlserver.database.windows.net
  • Parameter rules: an environment parameter is "dev" in Dev and "prod" in Production
  • Lakehouse rules: the Dev pipeline loads to dev-lakehouse; Prod loads to prod-lakehouse

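
The effect of these rules can be sketched in a few lines. Server names and the `risk_` database prefix below are hypothetical; the point is that identical pipeline code resolves different values per stage:

```python
# Hypothetical per-stage overrides, mimicking what deployment rules do:
# the same code runs in every stage, only the resolved config differs.
RULES = {
    "dev":  {"sql_server": "dev-sqlserver.database.windows.net",  "environment": "dev"},
    "test": {"sql_server": "test-sqlserver.database.windows.net", "environment": "test"},
    "prod": {"sql_server": "prod-sqlserver.database.windows.net", "environment": "prod"},
}

def connection_string(stage: str) -> str:
    """Resolve the stage-specific connection string (illustrative format)."""
    cfg = RULES[stage]
    return f"Server={cfg['sql_server']};Database=risk_{cfg['environment']}"

print(connection_string("dev"))   # Server=dev-sqlserver...;Database=risk_dev
print(connection_string("prod"))  # Server=prod-sqlserver...;Database=risk_prod
```

In Fabric the swap happens inside the service when content lands in the target stage, so the pipeline definition itself never hard-codes an environment.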
πŸ’‘ Scenario: Ibrahim's three-stage pipeline

Ibrahim configures a deployment pipeline for Nexus Financial’s risk analytics:

  • Dev workspace: Engineers iterate on notebooks and pipelines. Data source: a sample of 10,000 trades.
  • Test workspace: QA runs the full pipeline against a copy of production data. Deployment rules swap the data source to the test database.
  • Production workspace: Serves the risk dashboard to traders. Deployment rules point to the live trading database.

Every Friday, the lead engineer reviews changes in Dev, deploys to Test, runs validation overnight, and if tests pass, deploys to Production on Monday morning.

Git integration + deployment pipelines

These two features solve different problems and work best together.

Git tracks changes; deployment pipelines promote them
  • Purpose: Git integration is version control (track who changed what and when); deployment pipelines are release management (promote content between environments)
  • Main action: Git commit/update syncs the workspace with the repo; pipelines deploy (copy items between stages)
  • Rollback method: revert to a previous Git commit; or deploy backward from a known-good stage
  • Environment config: Git uses a branch per environment (dev, main); pipelines use deployment rules to swap data sources and parameters
  • Review process: Git offers pull requests with code review; the deployment comparison shows the diff between stages
  • Best for: Git suits collaboration, audit trail, and branching; pipelines suit controlled releases and environment-specific config

πŸ’‘ Exam tip: When to use which

Exam questions often describe a scenario and ask which tool solves it:

  • β€œAn engineer needs to see what changed last week” β†’ Git integration (commit history)
  • β€œThe team needs to promote tested changes to production” β†’ Deployment pipeline
  • β€œTwo engineers changed the same notebook” β†’ Git integration (branch + merge via PR)
  • β€œProduction pipeline needs to read from a different server than Dev” β†’ Deployment rules
  • β€œThe team wants to automate deployment after a PR is merged” β†’ Git integration triggers a deployment pipeline (CI/CD)

Automating deployments

Deployment pipelines have a REST API, which means you can trigger deployments from external CI/CD tools:

  • Azure DevOps Pipeline β€” merges a PR β†’ calls the Fabric deployment pipeline API β†’ content promotes to Test
  • GitHub Actions β€” same pattern, different tool
  • Scheduled runs: call the same REST API on a timer to deploy at a fixed time (e.g., every Monday at 6 AM)
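
A CI/CD job would build a request along these lines. The endpoint path and payload fields shown are illustrative, not a verified API contract; check the Fabric REST API reference on Microsoft Learn for the current shape before using it.

```python
# Sketch of triggering a stage deployment via REST from a CI/CD job.
# Endpoint path and body fields are assumptions for illustration only.
def build_deploy_request(pipeline_id: str, source_stage: int,
                         target_stage: int, token: str) -> dict:
    """Assemble the HTTP call a CI/CD job would send (no network here)."""
    return {
        "method": "POST",
        "url": f"https://api.fabric.microsoft.com/v1/deploymentPipelines/{pipeline_id}/deploy",
        "headers": {"Authorization": f"Bearer {token}",
                    "Content-Type": "application/json"},
        "body": {"sourceStageOrder": source_stage,   # e.g., 0 = Dev
                 "targetStageOrder": target_stage,   # e.g., 1 = Test
                 "note": "Scheduled Dev -> Test promotion"},
    }

req = build_deploy_request("a1b2c3", source_stage=0, target_stage=1,
                           token="<aad-token>")
print(req["method"], req["url"])
```

The token comes from Microsoft Entra ID authentication in the CI/CD pipeline; the job then sends the request and polls the returned operation for completion.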
ℹ️ Scenario: Carlos automates Friday deployments

Carlos sets up an Azure DevOps pipeline for Precision Manufacturing’s Fabric workspace:

  1. Engineers commit changes to the dev branch during the week
  2. Friday at 3 PM, a scheduled Azure DevOps pipeline runs
  3. It calls the Fabric deployment pipeline API to promote Dev β†’ Test
  4. Overnight, automated tests validate the ETL outputs
  5. Monday at 7 AM, if tests pass, a second pipeline promotes Test β†’ Production

No manual clicks. The entire release cycle is automated and auditable.


Question

How many stages can a Fabric deployment pipeline have?


Answer

Up to 10 stages. Most teams use 3 (Development β†’ Test β†’ Production). Each stage maps to exactly one workspace.


Question

What are deployment rules?


Answer

Configuration overrides that swap values when content moves between stages. Common rules: data source URLs, connection strings, parameters, and lakehouse references. They ensure Dev content reads from dev resources while Production reads from live resources.


Question

Can you deploy backward in a Fabric deployment pipeline?


Answer

Yes. You can deploy from any stage to any adjacent stage β€” forward (Dev β†’ Test) or backward (Production β†’ Test). Backward deployment is useful for resetting a broken Test environment to match Production.



Knowledge Check

A data engineer deploys a pipeline from Dev to Production without configuring deployment rules. The production pipeline starts pulling data from the development database. What should the engineer have done?

Knowledge Check

Ibrahim wants to automate the promotion of Fabric content from Test to Production every Monday at 7 AM, but only if weekend tests pass. Which approach is most appropriate?


Next up: Access Controls: Who Gets In β€” configure workspace and item-level permissions to control access.



© 2026 Sutheesh. All rights reserved.

Guided is an independent study resource and is not affiliated with, endorsed by, or officially connected to Microsoft. Microsoft, Azure, and related trademarks are property of Microsoft Corporation. Always verify information against Microsoft Learn.