Fabric Data Agents: Analytics Meets AI
Integrate Microsoft Fabric data agents to give your Copilot Studio agents natural-language access to lakehouses, warehouses, and semantic models.
What are Fabric data agents?
Imagine your company’s data warehouse is a massive library with millions of records.
Today, if someone wants to know “What were our top-selling products last quarter?”, they need a data analyst to write a SQL query. A Fabric data agent is like giving your AI agent a library card — it can walk into the warehouse, look up the answer, and bring it back in plain English.
Fabric data agents (currently in preview) let your Copilot Studio agent query Microsoft Fabric data — lakehouses, warehouses, and semantic models — using natural language. The user asks a question, the agent translates it into a query, runs it against Fabric, and returns the answer. No SQL required from the end user.
How Fabric data agents work
The flow from user question to data answer has four steps:
- User asks a business question — “What were delivery times for the Auckland region last month?”
- Copilot Studio routes to the Fabric data agent — the orchestrator recognises this as an analytics question
- Fabric data agent translates to a query — the agent’s AI converts natural language to SQL or DAX, depending on the data source
- Results return to the user — structured data comes back, often as a table or summary
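The routing step above can be sketched in code. This is an illustrative toy only: Copilot Studio's orchestrator is a managed service you configure, not something you implement, and the trigger keywords here are invented. It shows the idea of matching a question against a connected agent's trigger description.

```python
# Toy sketch of orchestrator routing (assumption: keyword matching stands in
# for Copilot Studio's actual AI-based intent routing, which is not public).
ANALYTICS_TRIGGERS = {"delivery", "carrier", "transit", "sales", "region"}

def route(question: str) -> str:
    """Return which connected agent should handle the question."""
    words = set(question.lower().replace("?", "").split())
    if words & ANALYTICS_TRIGGERS:
        return "fabric-data-agent"   # analytics question -> Fabric
    return "general-agent"           # everything else stays with the default

print(route("What were delivery times for the Auckland region last month?"))
```

The real orchestrator uses the trigger descriptions you write in step 5 below the same way: richer descriptions give it more signal for routing analytics questions to Fabric.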
What data sources can Fabric data agents query?
Fabric data agents currently support three data source types:
- Lakehouses — Delta Lake tables in Fabric, queried via SQL
- Data warehouses — Fabric’s cloud-native SQL warehouse, queried via T-SQL
- Semantic models — Power BI semantic models (formerly datasets), queried via DAX
The Fabric data agent handles the translation. You configure which data source it connects to when you create it in Fabric.
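To make the translation concrete, here is the kind of SQL the agent might generate for the earlier question "What were our top-selling products last quarter?". The `sales` schema is hypothetical, and SQLite stands in so the sketch is runnable; a real Fabric warehouse would receive T-SQL (or DAX, for a semantic model) from the data agent.

```python
import sqlite3

# Hypothetical schema: a "sales" table such as the agent might find in a
# lakehouse or warehouse. The table name, columns, and data are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (product TEXT, quarter TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("Widget", "2025-Q1", 1200.0), ("Gadget", "2025-Q1", 950.0),
     ("Widget", "2025-Q2", 800.0)],
)

# "What were our top-selling products last quarter?" might become:
query = """
    SELECT product, SUM(revenue) AS total
    FROM sales
    WHERE quarter = '2025-Q1'
    GROUP BY product
    ORDER BY total DESC
"""
for product, total in conn.execute(query):
    print(product, total)
```

Notice that the agent must infer the filter (`quarter = '2025-Q1'`), the aggregation, and the sort order from plain English, which is why clear table and column names matter so much (see the preview limitations below).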
Adding a Fabric data agent to Copilot Studio
The integration follows the same connected agent pattern:
- Create the Fabric data agent in Fabric — select your data source (lakehouse, warehouse, or semantic model) and configure the agent’s description and sample questions
- Publish the Fabric data agent — this makes it available for connection
- In Copilot Studio, open your orchestrator agent and go to Settings then Connected agents
- Select Add, then Fabric data agent — choose the published agent from the list
- Configure trigger descriptions — tell the orchestrator when to route analytics questions to Fabric
- Test with sample questions — verify the agent translates questions correctly and returns accurate results
Preview limitations to be aware of
Fabric data agents are in preview as of 2025. Key limitations:
- Natural-language-to-SQL translation can misinterpret ambiguous questions
- Complex joins across multiple tables may not translate correctly
- The agent relies on good table/column naming — poorly named schemas produce poor results
- Row-level security (RLS) from the Fabric data source is respected, but verify this during testing
Fabric vs other data patterns
The exam may present scenarios where multiple data access methods could work. Knowing which one to pick is essential.
| Pattern | Data type | Best for | Query method |
|---|---|---|---|
| Fabric data agent | Structured analytics in Fabric (lakehouse, warehouse, semantic model) | Ad-hoc business questions against BI data — 'What were sales last quarter?' | Natural language → SQL/DAX translation by the agent |
| Dataverse knowledge | CRM/ERP data already in Dataverse (Dynamics 365, Power Apps) | Structured lookups — 'Show me open cases for Contoso' | Direct Dataverse queries using built-in knowledge source |
| Custom connector / REST API | External databases (SQL Server, Snowflake, SAP) | Any structured data not in Fabric or Dataverse | Developer builds connector with specific query endpoints |
| SharePoint / document knowledge | Unstructured documents (PDFs, Word docs, wiki pages) | Policy questions, how-to guides, reference documents | Generative answers with RAG — search + summarise |
Scenario: Dev connects Fabric for delivery analytics
Dev’s logistics company has a Microsoft Fabric lakehouse containing two years of delivery data — shipment dates, transit times, carrier performance, and regional breakdowns. The operations team wants to ask the company’s Copilot Studio agent questions like “Which carrier had the longest average delivery time in March?” or “Show me on-time delivery rates by region.”
Today, these questions require a data analyst to write SQL queries. Dev’s solution: create a Fabric data agent pointing at the delivery lakehouse, then add it as a connected agent to the logistics assistant bot. He configures trigger descriptions like “delivery statistics, carrier performance, transit times, regional analytics.”
Now when a dispatcher asks “How did NZ Post perform this week?”, the orchestrator routes to the Fabric data agent. The agent translates the question to SQL, queries the lakehouse, and returns a summary: “NZ Post delivered 847 packages this week with a 94.2% on-time rate, up from 91.8% last week.”
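The query the Fabric data agent generates for that dispatcher question might look like the sketch below. The `deliveries` table, its columns, and the sample rows are all invented for illustration — they are not Dev's real lakehouse schema — and SQLite stands in for the Fabric SQL endpoint so the example runs.

```python
import sqlite3

# Invented stand-in for the delivery lakehouse (assumption: one row per
# shipment, with an on_time flag of 1 or 0).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE deliveries (carrier TEXT, on_time INTEGER)")
conn.executemany(
    "INSERT INTO deliveries VALUES (?, ?)",
    [("NZ Post", 1), ("NZ Post", 1), ("NZ Post", 0), ("CourierCo", 1)],
)

# "How did NZ Post perform this week?" might translate to something like:
query = """
    SELECT COUNT(*) AS packages,
           ROUND(100.0 * SUM(on_time) / COUNT(*), 1) AS on_time_pct
    FROM deliveries
    WHERE carrier = 'NZ Post'
"""
packages, pct = conn.execute(query).fetchone()
print(f"NZ Post delivered {packages} packages with a {pct}% on-time rate")
```

The agent then wraps the numeric result in a natural-language summary, as in the dispatcher example above.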
Exam tip: Fabric = structured analytics data in Fabric
If the exam mentions lakehouses, data warehouses, semantic models, or BI data in Fabric, the answer is a Fabric data agent. If the data is in Dataverse, use Dataverse knowledge. If it is in an external database, use a custom connector. Do not confuse Fabric data agents with generative answers — those are for unstructured documents.
Check your understanding
- Dev’s company has delivery performance data in a Fabric lakehouse. Operations managers want to ask the company bot 'What were on-time rates by region last month?' What should Dev use?
- What is the key difference between a Fabric data agent and a SharePoint knowledge source?