
AB-620 Study Guide

Domain 1: Plan and Configure Agent Solutions

  • Getting Started: Copilot Studio for Developers Free
  • Planning Enterprise Integration and Reusable Components Free
  • Identity Strategy for Agents Free
  • Channels, Deployment and Audience Design Free
  • Responsible AI and Security Governance Free
  • Agent Flows: Build, Monitor and Handle Errors Free
  • Human-in-the-Loop Agent Flows Free
  • Topics, Tools and Variables Free
  • Advanced Responses: Custom Prompts and Generative Answers Free
  • API Calls, HTTP Requests and Adaptive Cards Free

Domain 2: Integrate and Extend Agents in Copilot Studio

  • Enterprise Knowledge Sources: The Big Picture
  • Copilot Connectors and Power Platform Connectors
  • Azure AI Search as a Knowledge Source
  • Adding Tools: Custom Connectors and REST APIs
  • MCP Tools: Model Context Protocol in Action
  • Computer Use: Agent-Driven UI Automation
  • Multi-Agent Solutions: Design and Agent Reuse
  • Integrating Foundry Agents
  • Fabric Data Agents: Analytics Meets AI
  • A2A Protocol: Cross-Platform Agent Collaboration
  • Grounded Answers: Azure AI Search with Foundry
  • Foundry Model Catalog and Application Insights

Domain 3: Test and Manage Agents

  • Test Sets & Evaluation Methods
  • Reviewing Results & Tuning Performance
  • Solutions & Environment Variables
  • Power Platform Pipelines for Agent ALM
  • Agent Lifecycle: From Dev to Production
  • Exam Prep: Diagnostic Review

Domain 3: Test and Manage Agents (Premium, ~14 min read)

Power Platform Pipelines for Agent ALM

Automate agent deployment across environments using Power Platform Pipelines — with automated testing, approval gates, and continuous delivery.

☕ Simple explanation

The Problem with Manual Deployment

Before Pipelines, deploying a solution from dev to production meant:

  1. Export the solution as a zip file from the dev environment
  2. Log into the target environment
  3. Import the zip file
  4. Configure environment variables
  5. Test manually
  6. Repeat for each stage (dev to UAT, UAT to production)

This process is error-prone, undocumented, and depends entirely on the person doing it remembering every step. Miss a step? You find out when something breaks in production.

⚡ Jordan’s scenario: Jordan is the Power Platform admin at AgentForge. With three clients and three environments each (dev, UAT, production), that’s nine environments. Manually exporting and importing solutions across nine environments every time Priya’s team ships an update is a full-day job — and Jordan has made mistakes. Last month, a UAT environment received a solution meant for production.

Jordan needs automation. That’s Power Platform Pipelines.

What Are Power Platform Pipelines?

Pipelines are a managed deployment infrastructure built directly into the Power Platform. No third-party tools, no custom scripts — it’s a first-party feature designed specifically for Power Platform ALM.

A pipeline defines:

  • Stages — the ordered sequence of environments a solution travels through (e.g., Dev → UAT → Production)
  • Target environments — which environment each stage deploys to
  • Approval gates — who must approve before a deployment proceeds to the next stage

Architecture: Host and Target

Pipelines use a host environment model:

  • Host environment — A dedicated environment where the pipeline configuration lives. This is the control plane — it stores pipeline definitions, deployment history, and run metadata.
  • Target environments — The environments that receive deployments (your dev, UAT, and production environments). These are registered as stages in the pipeline.

The host environment is typically a production-tier Managed Environment that exists solely for pipeline management. All target environments must also be Managed Environments. The host shouldn’t be your dev or business production environment.

Question

What is the role of the 'host environment' in Power Platform Pipelines?


Answer

The host environment is a dedicated production-tier Managed Environment that stores pipeline configurations, definitions, deployment history, and run metadata. It acts as the control plane for all deployments. All target environments must also be Managed Environments.


Setting Up a Pipeline

Jordan sets up a pipeline for AgentForge’s recruitment agent:

Step 1: Prepare the host environment

Create or designate a Dataverse environment as the pipeline host. Install the Power Platform Pipelines application from the admin center.

Step 2: Register target environments

Link your dev, UAT, and production environments to the host. Each becomes an available deployment target.

Step 3: Create the pipeline

Define the pipeline with its stages:

  • Stage 1: Development (source — where solutions are authored)
  • Stage 2: UAT (first deployment target — for testing)
  • Stage 3: Production (final target — for end users)

Step 4: Configure approvals

For each stage transition, optionally assign approvers. Jordan configures:

  • Dev → UAT: auto-approved (any developer can push to UAT)
  • UAT → Production: requires approval from Priya (CEO) or Jordan (admin)

Step 5: Run the pipeline

From the maker portal, a developer selects the solution and clicks Deploy. The pipeline handles export, transport, and import automatically. If approvals are configured, the pipeline pauses until the approver acts.

Question

What are the five steps to set up a Power Platform Pipeline?


Answer

1) Prepare the host environment (install Pipelines app). 2) Register target environments (link dev, UAT, production). 3) Create the pipeline with ordered stages. 4) Configure approvals for stage transitions. 5) Run the pipeline — deploy from the maker portal.


Extending Pipelines with Power Automate

Out of the box, Pipelines handle solution transport and approval gates. But real-world ALM often needs more — pre-deployment validation, post-deployment testing, notifications, audit logging.

This is where extensibility comes in. You can attach Power Automate cloud flows to pipeline events:

Pre-deployment flows

These run before a deployment starts. Use cases:

  • Validation checks — verify the solution meets naming conventions, version requirements, or has required components
  • Notifications — alert the team that a deployment is about to start
  • Environment preparation — disable certain features or put the target in maintenance mode
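A validation check like the first bullet reduces to a few rules evaluated before deployment. The sketch below is illustrative; the naming convention and version rule are assumptions for AgentForge, not platform requirements, and in a real flow the inputs would come from the pipeline trigger:

```python
import re

def validate_solution(name: str, version: str) -> list:
    """Return a list of validation failures; an empty list means the solution may deploy."""
    failures = []
    # Assumed convention: solution names look like "AgentForge_Something".
    if not re.fullmatch(r"AgentForge_[A-Za-z0-9]+", name):
        failures.append(f"name '{name}' does not match the AgentForge_* convention")
    # Assumed rule: versions must have three numeric parts, e.g. 1.3.0.
    if not re.fullmatch(r"\d+\.\d+\.\d+", version):
        failures.append(f"version '{version}' is not in major.minor.patch form")
    return failures

print(validate_solution("AgentForge_Recruitment", "1.3.0"))  # []
print(validate_solution("recruitment agent", "1.3"))         # two failures
```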

Post-deployment flows

These run after a deployment completes. Use cases:

  • Automated testing — trigger test sets against the newly deployed agent and fail the pipeline if scores drop below threshold
  • Notifications — alert stakeholders that deployment succeeded (or failed)
  • Configuration — set environment variable current values automatically
  • Audit logging — record deployment details to a SharePoint list or Dataverse table

Jordan builds a post-deployment flow that:

  1. Triggers after UAT deployment
  2. Runs Mira’s test sets against the newly deployed agent
  3. If accuracy drops below 85%, sends a Teams notification to Mira and blocks the production stage
  4. If all tests pass, sends a “ready for production approval” notification to Priya

This effectively creates automated quality gates — no human has to manually test after every UAT deployment.
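Jordan’s gate logic boils down to a threshold check over test-set scores. A minimal sketch, assuming the scores have already been produced by the test-set run (the test-set names are made up, and the print calls stand in for the Teams notification steps):

```python
def quality_gate(scores: dict, threshold: float = 0.85) -> bool:
    """Post-deployment gate: block the production stage if any test set
    scores below the accuracy threshold; otherwise signal readiness."""
    failing = {name: score for name, score in scores.items() if score < threshold}
    if failing:
        # In the real flow: Teams notification to Mira, and the pipeline stage is blocked.
        print(f"BLOCK production stage; failing test sets: {failing}")
        return False
    # In the real flow: "ready for production approval" notification to Priya.
    print("All test sets passed; ready for production approval")
    return True

quality_gate({"screening-questions": 0.91, "policy-answers": 0.88})  # passes
quality_gate({"screening-questions": 0.91, "policy-answers": 0.79})  # blocked
```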

Question

How can you add automated testing to a Power Platform Pipeline?


Answer

Attach a Power Automate post-deployment flow to the UAT stage. The flow triggers after deployment, runs test sets against the deployed agent, checks scores against thresholds, and either allows progression to the next stage or blocks it with a notification if quality drops.


Manual export/import vs. Power Platform Pipelines:

  • Process: manual means downloading the zip, logging into the target, uploading, and configuring by hand; Pipelines offer one-click deploy from the maker portal, with export, transport, and import automated
  • Consistency: manual depends on the person performing the steps, so it’s easy to miss one; Pipelines run every deployment through the same defined stages and validation checks
  • Approvals: manual relies on email or chat to request sign-off, with no enforcement; Pipelines provide built-in approval gates that pause the pipeline until authorized users approve
  • Audit trail: manual leaves none unless documented by hand; Pipelines track full deployment history in the host environment
  • Testing: manual means someone has to remember to test; Pipelines are extensible with Power Automate flows for automated post-deployment testing
  • Rollback: manual requires re-importing a previous solution version by hand; Pipelines’ deployment history enables targeted rollback to a previous run
  • Scale: manual is time-consuming across multiple environments; Pipelines manage any number of stages and environments through a single pipeline definition

Pipeline Best Practices

Based on Jordan’s experience deploying across AgentForge’s nine environments:

Keep the host environment clean. Don’t build agents or store business data in the pipeline host. It’s a control plane, not a workspace.

Version your solutions. Before every pipeline run, increment the solution version. This creates a clear timeline: “Version 1.3.0 went to UAT on Tuesday, 1.3.1 on Thursday after bug fixes.”
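The version bump itself is easy to automate as part of the pipeline run. A sketch of a patch increment on major.minor.patch version strings:

```python
def bump_patch(version: str) -> str:
    """Increment the patch part of a major.minor.patch version string."""
    major, minor, patch = (int(part) for part in version.split("."))
    return f"{major}.{minor}.{patch + 1}"

# 1.3.0 went to UAT on Tuesday; the Thursday bug-fix build becomes 1.3.1.
print(bump_patch("1.3.0"))  # 1.3.1
```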

Start with manual approvals, then automate. Jordan initially required manual approval at every stage. Once the team trusted the automated tests, he changed Dev → UAT to auto-approve and kept manual approval only for UAT → Production.

Test the pipeline itself. Before using a pipeline for real deployments, do a dry run with a test solution. Verify that each stage targets the correct environment, approvals route to the right people, and extensibility flows trigger properly.

Document environment variable expectations. Every solution should include documentation of which environment variables need current values set after import. Jordan keeps a checklist in the pipeline’s SharePoint documentation.
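A checklist like Jordan’s can be made executable in a post-deployment flow: compare the variables the solution declares against the current values actually set in the target. The variable names below are illustrative:

```python
def missing_env_values(required: list, current_values: dict) -> list:
    """Return the required environment variables that have no current value
    set in the target environment (missing or blank)."""
    return [name for name in required
            if current_values.get(name) in (None, "")]

required = ["agentforge_ApiBaseUrl", "agentforge_TeamsChannelId"]
current = {"agentforge_ApiBaseUrl": "https://api.example.com"}
print(missing_env_values(required, current))  # ['agentforge_TeamsChannelId']
```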

Question

Why should you keep the pipeline host environment clean (no agents, no business data)?


Answer

The host environment is a control plane for managing pipeline configurations, definitions, and deployment history. Mixing business components into it creates clutter, security concerns, and potential conflicts. A clean host is easier to manage and audit.


Knowledge Check

Jordan wants deployments to UAT to be automatic, but deployments to production should require CEO approval. How should he configure the pipeline?

Knowledge Check

Jordan wants to automatically run test sets after every UAT deployment and block production deployment if tests fail. What should he build?

Knowledge Check

What is the architectural role of the host environment in Power Platform Pipelines?

Key Takeaways

  • Power Platform Pipelines automate solution deployment across environments — replacing error-prone manual export/import
  • Architecture uses a host environment (control plane) and target environments (deployment destinations)
  • Pipelines support approval gates per stage — auto-approve for dev stages, manual approval for production
  • Extend with Power Automate flows: pre-deployment validation, post-deployment testing, notifications, and audit logging
  • Post-deployment flows with test set execution create automated quality gates
  • Always version solutions, keep the host clean, and document environment variable expectations



© 2026 Sutheesh. All rights reserved.

Guided is an independent study resource and is not affiliated with, endorsed by, or officially connected to Microsoft. Microsoft, Azure, and related trademarks are property of Microsoft Corporation. Always verify information against Microsoft Learn.