Planning Enterprise Integration and Reusable Components
Architect agent solutions that connect to enterprise systems using the right integration pattern, and design reusable components that scale across projects.
Why integration planning matters
Imagine your agent is a new employee who just joined a giant company.
On day one, that employee needs access to the CRM, the HR system, the claims database, and the document library. Without the right access badges, they sit at their desk doing nothing. Integration planning is giving your agent the right badges, in the right order, with the right security clearance. Pick the wrong badge type and you are rebuilding from scratch two sprints later.
Reusable components? Those are the templates and SOPs the company already has. Instead of writing a new onboarding checklist for every hire, smart teams reuse what works. Same idea: build a connector once, share it across ten agents.
The six integration patterns
Every enterprise integration in Copilot Studio falls into one of these six patterns. The exam expects you to pick the right one for each scenario.
| Pattern | When to use | Auth model | Latency | Developer effort |
|---|---|---|---|---|
| Copilot connectors (first-party) | Microsoft data: SharePoint, Outlook, Dataverse, Teams | Entra ID SSO; inherits user context automatically | Low (direct platform calls) | Minimal; configure, not code |
| Power Platform connectors (1400+) | SaaS systems with existing connectors: Salesforce, ServiceNow, SAP | OAuth 2.0 or API key, configured per connection | Medium (connector proxy layer) | Low; use prebuilt actions in cloud flows |
| Custom connectors | Internal APIs, niche SaaS without prebuilt connectors | OAuth 2.0, API key, or Windows auth via gateway | Medium (connector proxy + your API) | Medium; author OpenAPI spec, test, deploy |
| REST/HTTP direct (via cloud flow) | Any HTTP endpoint: legacy SOAP, GraphQL, webhooks | Any; headers, tokens, certificates managed in flow | Variable (depends on endpoint) | Medium-high; build flow, handle errors, parse responses |
| MCP tools (Model Context Protocol) | External tool servers, AI-native integrations, multi-step tool chains | Token-based or managed identity, configured at tool level | Medium (tool server round-trip) | Medium; register MCP server, define tool schema |
| Azure AI Search (knowledge) | Grounding agent responses in large document corpora | Managed identity or API key to search index | Low-medium (vector/hybrid search) | Medium; index pipeline + search config in Copilot Studio |
Exam tip: connector vs. custom connector vs. HTTP
The exam loves giving you a scenario and asking which integration pattern to use. Rule of thumb: if a prebuilt connector exists, use it; it handles auth, pagination, and throttling for you. Custom connectors are for APIs without prebuilt support. Direct HTTP (via cloud flow) is the escape hatch when you need full control over headers, retries, or non-REST protocols like SOAP.
Choosing the right pattern
Work through these questions in order: (1) Data in M365/Dataverse? Use Copilot connectors. (2) Prebuilt connector exists? Use it. (3) REST API you control? Custom connector. (4) Non-REST/legacy? HTTP action in cloud flow. (5) Multi-step tool reasoning? MCP tools. (6) Document grounding? Azure AI Search.
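The six-question checklist above can be sketched as a small decision function. This is purely illustrative: the flags and the first-match-wins ordering mirror the checklist, but the function itself is hypothetical, not a Copilot Studio API.

```python
def choose_pattern(*, m365_data=False, prebuilt_connector=False,
                   own_rest_api=False, legacy_protocol=False,
                   multi_step_tools=False, document_grounding=False):
    """Walk the six questions in order and return the first match."""
    if m365_data:
        return "Copilot connectors (first-party)"
    if prebuilt_connector:
        return "Power Platform connector"
    if own_rest_api:
        return "Custom connector"
    if legacy_protocol:
        return "HTTP action in a cloud flow"
    if multi_step_tools:
        return "MCP tools"
    if document_grounding:
        return "Azure AI Search"
    return "Re-examine the requirement"

# Kai's ServiceNow case hits question 2:
assert choose_pattern(prebuilt_connector=True) == "Power Platform connector"
```

The ordering matters: data already in M365/Dataverse short-circuits everything else, which is exactly why the checklist is worked through in order.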
Scenario: Kai maps Pacific Mutual's integrations
Kai is deploying Copilot Studio for Pacific Mutual Insurance (15,000 employees). He maps their integration needs:
- Claims database (internal SQL via API gateway) → Custom connector with OAuth 2.0. The API team already has an OpenAPI spec.
- ServiceNow tickets → Prebuilt Power Platform connector. Already used by their Power Automate flows.
- Policy documents (200,000 PDFs in Azure Blob Storage) → Azure AI Search with hybrid retrieval. Documents indexed nightly.
- SharePoint sites (HR policies, compliance docs) → Copilot connector. Zero config needed.
- Underwriting model (Python microservice on AKS) → REST/HTTP via cloud flow. Returns JSON risk scores.
- Document analysis (extract data from scanned forms) → MCP tool server wrapping Azure Document Intelligence.
Kai maps each need to the right pattern before building topics, avoiding the mid-sprint rebuilds that come from choosing the wrong auth model.
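The underwriting model is the only raw HTTP integration in Kai's map. A minimal sketch of what the cloud flow's HTTP action does (build a JSON request, POST it, parse the risk score) is shown below; the endpoint URL, auth header, and response shape are assumptions for illustration, since the real contract lives with the AKS service team.

```python
import json
import urllib.request

# Hypothetical endpoint for Kai's underwriting microservice.
SCORE_URL = "https://underwriting.internal.example/score"

def build_request(policy_id: str, coverage: int) -> urllib.request.Request:
    """The HTTP action's request: JSON body plus auth header."""
    payload = json.dumps({"policyId": policy_id, "coverage": coverage})
    return urllib.request.Request(
        SCORE_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer <token managed by the flow>"},
        method="POST",
    )

def parse_risk_score(body: str) -> float:
    """The flow's Parse JSON step, as plain Python (assumed shape)."""
    return float(json.loads(body)["riskScore"])

# Parsing a sample response; no network call needed to illustrate:
assert parse_risk_score('{"riskScore": 0.42}') == 0.42
```

Note that error handling and retries, which the table lists under developer effort for this pattern, would sit in the flow around these two steps.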
Reusable components
Building one agent is easy. Building ten agents that share common logic without copy-pasting is where reusable components come in.
| Component type | What it contains | Reusability scope |
|---|---|---|
| Topics | Trigger phrases, nodes, variables, messages | Within an agent (can be imported via solutions) |
| Cloud flows | Automation logic, connector calls, data transforms | Across agents in the same environment (or exported via solution) |
| Custom connectors | OpenAPI spec, auth config, action definitions | Across all agents and flows in the environment |
| Knowledge sources | SharePoint sites, files, Dataverse tables, search indexes | Configured per agent, but underlying data shared |
| Card templates | Adaptive Card JSON for structured responses | Reusable JSON; store in a shared library or Dataverse |
| Environment variables | API URLs, feature flags, tenant-specific config | Solution-level; values change per environment without code changes |
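Card templates and environment variables combine naturally: store one Adaptive Card JSON template and fill in per-client values at deploy time. A sketch, with hypothetical placeholder names and a deliberately simple string-substitution renderer:

```python
import json

# A minimal Adaptive Card; the ${...} placeholder names are
# hypothetical and would be filled from environment-variable values.
CARD_TEMPLATE = {
    "type": "AdaptiveCard",
    "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
    "version": "1.5",
    "body": [
        {"type": "TextBlock", "size": "Large", "text": "${companyName}"},
        {"type": "Image", "url": "${logoUrl}"},
    ],
}

def render_card(template: dict, values: dict) -> str:
    """Substitute ${name} placeholders with per-environment values."""
    text = json.dumps(template)
    for name, value in values.items():
        text = text.replace("${" + name + "}", value)
    return text

card = render_card(CARD_TEMPLATE, {
    "companyName": "Pacific Mutual",
    "logoUrl": "https://contoso.example/logo.png",
})
assert json.loads(card)["body"][0]["text"] == "Pacific Mutual"
```

One template in a shared library, many branded agents: this is the reusability scope the table describes for card templates.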
Scenario: Priya builds AgentForge's component library
Priya is building AgentForge's agent marketplace for mid-market clients. Every client agent needs: (1) a greeting topic with company branding, (2) a fallback escalation flow to Teams, (3) a custom connector to the client's CRM, and (4) environment variables for API endpoints that differ between dev/test/prod.
She packages these into a Power Platform solution: the greeting topic uses environment variables for company name and logo URL, the escalation flow references a connection reference (not a hardcoded connection), and the CRM connector is parameterised by an environment variable. When onboarding a new client, her team imports the solution, sets the variables, and the agent is 80% done.
Solutions as the reusability mechanism
Power Platform solutions are the container for cross-environment reusability:
- Managed vs. unmanaged: Unmanaged for dev. Managed for deployment, which is locked and updated only by importing a new version.
- Connection references: Decouple flow logic from specific user connections. During import, admins map references to production connections.
- Environment variables: Config values (URLs, flags, tenant IDs) that change per environment without code changes.
- Solution layering: Multiple solutions can contribute components. Conflicts resolved by layer order.
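Environment variables are the piece that makes one solution portable. Conceptually, solution import resolves each variable name to that environment's value, something like the sketch below (the variable names and URLs are hypothetical):

```python
# Hypothetical per-environment values for one solution's variables.
ENV_VALUES = {
    "dev":  {"crmApiUrl": "https://dev-crm.example/api"},
    "test": {"crmApiUrl": "https://test-crm.example/api"},
    "prod": {"crmApiUrl": "https://crm.example/api"},
}

def resolve(environment: str, name: str) -> str:
    """What import does: same variable name, environment-specific value."""
    return ENV_VALUES[environment][name]

# Flows and connectors reference only the name; the value changes
# per environment without touching any logic.
assert resolve("dev", "crmApiUrl") != resolve("prod", "crmApiUrl")
```

This is why Priya's team can onboard a client by importing the solution and setting values, rather than editing flows.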
Why connection references matter
Without connection references, a cloud flow is bound to a specific user's connection. Export to production and it still points to dev credentials, which either fails or creates a security hole. Connection references decouple the flow from the identity. During import, admins map each reference to the correct production connection.
Kai needs his Copilot Studio agent to search 200,000 policy PDFs stored in Azure Blob Storage. Which integration pattern should he use?
Priya wants her AgentForge solution to work in any client's environment without code changes. Which combination of solution components enables this?
A prebuilt Power Platform connector exists for ServiceNow, but Kai's team also has a custom internal REST API for claims processing. What should Kai do?
🎬 Video coming soon