
AB-730 Study Guide

Domain 1: Understand Generative AI Fundamentals

  • Welcome to Copilot: AI at Work Free
  • Copilot Across Your M365 Apps Free
  • How Context Shapes Copilot's Answers Free
  • Chat vs Agents: Two Ways to Work Free
  • Data Safety, Privacy & AI Risks Free
  • Verifying AI Outputs: Your Quality Check Free

Domain 2: Manage Prompts and Conversations by Using AI

  • Crafting Effective Prompts Free
  • Referencing the Right Resources Free
  • Saving and Sharing Prompts
  • Scheduling Prompts That Run Themselves
  • Managing Your Copilot Conversations
  • Agent Store vs Building Your Own
  • Building Your First Agent
  • Configuring and Sharing Agents

Domain 3: Draft and Analyze Business Content by Using AI

  • Creating Documents and Communications
  • Working with Existing Documents
  • Moving Insights Between M365 Apps
  • Copilot in Meetings: Before, During & After
  • Copilot Pages: Your Collaboration Canvas
  • Copilot Memory and Instructions
  • Exam Prep: Scenario Capstone

Domain 1: Understand Generative AI Fundamentals (Free) · ⏱ ~10 min read

Verifying AI Outputs: Your Quality Check

Copilot is fast but not infallible. Learn practical techniques for checking citations, conducting human review, and protecting sensitive data in AI outputs.

Why verification isn’t optional

☕ Simple explanation

Think of Copilot like a brilliant but occasionally overconfident intern.

It writes fast, sounds professional, and covers a lot of ground. But sometimes it states things with total confidence that are slightly (or completely) wrong. And it never says “I’m not sure about this one.”

Your job? Be the editor. Every output from Copilot should be reviewed before you share it, act on it, or make a decision based on it. The level of review depends on what’s at stake:

  • Internal draft? Quick scan.
  • Client-facing report? Thorough review.
  • Legal or financial content? Multiple reviewers.

Verification is a core responsible AI practice. LLMs generate responses based on statistical patterns, not factual databases — meaning outputs can contain fabricated citations, inaccurate summaries, or misattributed information.

The exam tests your ability to select the right verification method for the task. Low-stakes internal drafts may need only a quick read. High-stakes content (client communications, financial reports, legal documents) requires citation checking, cross-referencing with source material, and human review by subject-matter experts.

Additionally, sensitive data can appear in AI outputs unexpectedly. Even when permissions are correct, Copilot might surface information in a summary that the user didn’t intend to share externally. Reviewing outputs for unintended data exposure is a critical step.

The verification toolkit

1. Citation checks

When Copilot provides references (like file names, email senders, or dates), verify them:

  • Click the citation link — does the source actually say what Copilot claims?
  • Check the date — is this the most recent version?
  • Confirm the author/sender — did this person actually say or write this?
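
The three checks above can be expressed as a small script. This is an illustrative sketch only, not a Copilot API: the `Citation` and `SourceDocument` structures are hypothetical stand-ins for whatever metadata your review process records.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Citation:
    """What Copilot claims about a source (hypothetical structure)."""
    source_name: str
    author: str
    cited_date: date

@dataclass
class SourceDocument:
    """The actual source as found in your tenant (hypothetical structure)."""
    name: str
    author: str
    last_modified: date

def check_citation(citation: Citation, source: SourceDocument) -> list[str]:
    """Return a list of discrepancies between the claim and the source."""
    problems = []
    if citation.source_name != source.name:
        problems.append("source name does not match")
    if citation.author != source.author:
        problems.append("author/sender does not match")
    if citation.cited_date != source.last_modified:
        problems.append(
            f"date mismatch: cited {citation.cited_date}, "
            f"source last modified {source.last_modified}"
        )
    return problems

# Illustrative dates: the claim says 3 April, the source was written 3 March.
claim = Citation("Design revision", "Tomas", date(2025, 4, 3))
actual = SourceDocument("Design revision", "Tomas", date(2025, 3, 3))
print(check_citation(claim, actual))  # one discrepancy: the date
```

The point of the sketch is the shape of the check, not the tooling: compare each claimed attribute against the real source, and treat any mismatch as a reason to pause before sharing.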
💡 Real-world: Ava's citation catch

Ava asked Copilot to summarise client feedback from the past month. Copilot cited an email from “Tomas, 3 April” about a design revision.

Ava clicked the citation and discovered the email was from March 3rd, not April. The feedback was outdated — the client had already approved a new design.

Without the citation check, Ava would have included old feedback in a client report. A 10-second click saved a potentially embarrassing mistake.

2. Human review

Not every Copilot output needs the same level of scrutiny. Match the review to the stakes:

| Scenario | Review Level | What to Check |
| --- | --- | --- |
| Internal brainstorming notes | Light scan | General accuracy, nothing offensive |
| Email to a colleague | Quick read | Tone, key facts, correct names |
| Client-facing report | Thorough review | Every fact, every number, every citation |
| Legal/financial document | Expert review | Subject-matter expert validates content, compliance check |
| Executive summary for leadership | Senior review | Strategic accuracy, appropriate framing, no sensitive data leaks |

Exam tip: When a question asks “what verification is appropriate?” — look at the stakes of the output. Higher stakes = more thorough verification. The exam doesn’t expect you to verify a casual internal draft the same way you’d verify a client contract.
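
The stakes-to-scrutiny mapping in the table can be sketched as a simple lookup. The scenario labels and review levels come straight from the table; the fallback behaviour (unknown scenario gets the heaviest review) is an assumption, but a safe one.

```python
# Review levels from the table, keyed by scenario (lowercase).
REVIEW_LEVELS = {
    "internal brainstorming notes": "light scan",
    "email to a colleague": "quick read",
    "client-facing report": "thorough review",
    "legal/financial document": "expert review",
    "executive summary for leadership": "senior review",
}

def required_review(scenario: str) -> str:
    """Look up the review level; default to the heaviest scrutiny
    when the scenario is unrecognised (fail safe, not fail open)."""
    return REVIEW_LEVELS.get(scenario.lower(), "expert review")

print(required_review("Client-facing report"))  # thorough review
print(required_review("board presentation"))    # expert review (unknown → safest)
```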

3. Sensitive data review

Even with permissions in place, Copilot might surface information you didn’t intend to share:

  • A meeting summary might include someone’s salary mentioned in passing
  • A document summary might pull in data from a linked spreadsheet with confidential figures
  • A cross-app search might surface information from a private Teams chat

Before sharing any Copilot output externally, ask:

  • Does this contain any information that shouldn’t leave the organisation?
  • Does this include personal information about colleagues?
  • Is there any data here that the recipient shouldn’t see?
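
A first-pass automated sweep can flag obvious patterns before the human read. This is a rough sketch using simple regular expressions; the patterns are illustrative, and no regex list replaces the human review described above.

```python
import re

# Illustrative patterns only; tune these for your organisation's data.
SENSITIVE_PATTERNS = {
    "currency amount": re.compile(r"[$£€]\s?\d[\d,]*(?:\.\d+)?[KkMm]?"),
    "possible salary mention": re.compile(r"\bsalary\b", re.IGNORECASE),
    "confidential marker": re.compile(r"\bconfidential\b", re.IGNORECASE),
}

def flag_sensitive(text: str) -> list[tuple[str, str]]:
    """Return (label, matched text) pairs worth a second look."""
    hits = []
    for label, pattern in SENSITIVE_PATTERNS.items():
        for match in pattern.finditer(text):
            hits.append((label, match.group()))
    return hits

summary = "Notable deal: Acme Corp renewal at $450K, flagged as at-risk."
for label, snippet in flag_sensitive(summary):
    print(f"⚠ {label}: {snippet}")  # ⚠ currency amount: $450K
```

A hit does not mean the output is unsafe, only that a human should look at that line before it leaves the organisation.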

Matching verification to the task

| Verification Step | When to Use | What to Look For |
| --- | --- | --- |
| Citation check | Any time Copilot references a source | Source exists, content matches, date is current |
| Quick scan | Internal, low-stakes content | Obvious errors, wrong names, tone issues |
| Thorough review | External-facing or important content | Every fact, number, and citation verified against sources |
| Expert review | Legal, financial, or compliance content | Subject-matter accuracy, regulatory compliance, appropriate language |
| Sensitive data check | Before sharing any output externally | Personal data, confidential information, unintended data exposure |

The verification workflow

A practical four-step process for any Copilot output:

  1. Read the output — does it make sense? Anything obviously wrong?
  2. Check citations — click through, verify sources exist and match
  3. Scan for sensitive data — names, numbers, confidential information that shouldn’t be shared
  4. Match to audience — is this appropriate for who’s going to read it?
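
The four steps can be tracked as a per-output checklist. A minimal sketch: the step names mirror the list above, and the sign-off logic (every step ticked before sharing) is an assumption about how you might record it.

```python
# The four-step workflow from the list above.
WORKFLOW = [
    "Read the output: does it make sense?",
    "Check citations: sources exist and match",
    "Scan for sensitive data",
    "Match to audience",
]

def ready_to_share(completed_steps: set[int]) -> bool:
    """An output is ready only when every step has been ticked off."""
    return completed_steps == set(range(len(WORKFLOW)))

print(ready_to_share({0, 1, 2}))     # False: audience check (step 4) not done
print(ready_to_share({0, 1, 2, 3}))  # True: all four steps complete
```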
💡 Real-world: Jordan's data exposure near-miss

Jordan asked Copilot Chat to prepare a summary of the Peak Solutions Q3 pipeline for a partner meeting. Copilot generated a great overview — but included a line:

“Notable deal: Acme Corp renewal at $450K, currently flagged as at-risk due to competitor pricing pressure.”

This was pulled from an internal Teams chat between Jordan and Raj. Accurate — but absolutely not something to share with external partners. Jordan caught it in the sensitive data check and removed it before the meeting.

Copilot did its job (found relevant pipeline data). Jordan did his job (reviewed before sharing).

🎬 Video walkthrough

Video coming soon: Verifying AI Outputs — AB-730 Module 6 (~8 min)

Flashcards

Question

What is a citation check in the context of AI verification?

Answer

Clicking through Copilot's referenced sources (emails, files, meetings) to verify that the source actually exists, the content matches what Copilot claims, and the information is current.

Question

How should you match verification effort to the task?

Answer

Low-stakes (internal notes) = quick scan. Medium-stakes (client email) = thorough review. High-stakes (legal/financial) = expert review. The higher the consequences of an error, the more rigorous the verification.

Question

Why should you check Copilot outputs for sensitive data before sharing?

Answer

Copilot might surface information from private chats, linked documents, or mentioned-in-passing data that the user didn't intend to include. Always review for personal information, confidential figures, and unintended data exposure before sharing externally.

Knowledge Check

Dana at Oakfield Healthcare asks Copilot to draft an email to the board summarising employee satisfaction survey results. The draft includes specific complaint quotes from anonymous survey responses. What should Dana do?

Marcus uses Copilot to generate a report on warehouse efficiency for a partner company. Copilot includes internal profit margin data from a linked spreadsheet. Marcus didn't ask for this. What type of risk is this?

Ava is writing a client proposal using Copilot in Word. The draft cites 'BrightLoop's Q3 Performance Report, page 12' for a key statistic. What verification step is MOST important?


Next up: You understand the fundamentals and risks. Now let’s get practical — learn how to craft prompts that actually get great results from Copilot.

← Previous: Data Safety, Privacy & AI Risks

Next →: Crafting Effective Prompts


© 2026 Sutheesh. All rights reserved.

Guided is an independent study resource and is not affiliated with, endorsed by, or officially connected to Microsoft. Microsoft, Azure, and related trademarks are property of Microsoft Corporation. Always verify information against Microsoft Learn.