DP-600 Study Guide

Domain 1: Maintain a Data Analytics Solution

  • Workspace Access Controls
  • Row-Level & Object-Level Security
  • Sensitivity Labels & Endorsement
  • Git Version Control in Fabric
  • Deployment Pipelines: Dev β†’ Test β†’ Prod
  • Impact Analysis & Dependencies
  • XMLA Endpoint & Reusable Assets

Domain 2: Prepare Data

  • Microsoft Fabric: The Big Picture Free
  • Lakehouses: Your Data Foundation Free
  • Warehouses in Fabric Free
  • Choosing the Right Data Store Free
  • Data Connections & OneLake Catalog
  • Shortcuts & OneLake Integration
  • Ingesting Data: Dataflows Gen2 & Pipelines
  • Star Schema Design Free
  • SQL Objects: Views, Functions & Stored Procedures
  • Transforming Data: Reshape & Enrich
  • Data Quality & Cleansing
  • Querying with SQL
  • Querying with KQL
  • Querying with DAX

Domain 3: Implement and Manage Semantic Models

  • Semantic Models: Storage Modes
  • Relationships & Advanced Modeling
  • DAX Essentials: Variables & Functions
  • Calculation Groups & Field Parameters
  • Large Models & Composite Models
  • Direct Lake Mode
  • DAX Performance Optimization
  • Incremental Refresh

Domain 2: Prepare Data · Premium · ⏱ ~12 min read

Querying with KQL

Kusto Query Language for time-series analytics. Filter, aggregate, and visualize streaming data in Eventhouse β€” with practical KQL examples.

What is KQL?

β˜• Simple explanation

Think of KQL as a language built for timestamps.

SQL was designed for tables of customers and orders. KQL was designed for streams of events β€” log entries, sensor readings, trade events, and application telemetry. Every KQL query starts with β€œtake this stream of events and…” β€” it reads like a recipe, step by step, top to bottom.

In Fabric, KQL queries run against Eventhouse databases. If your data is time-stamped and append-heavy, KQL is your tool.

Kusto Query Language (KQL) is a read-only query language optimised for large-scale time-series and log analytics. It uses a pipe-based syntax where each step transforms the result of the previous step. KQL powers Azure Data Explorer, Microsoft Sentinel, and Fabric Eventhouse.

KQL excels at: time-based filtering (last 24 hours, between dates), aggregation over time windows (bin by hour/day), pattern detection (where events follow a sequence), and near real-time analysis of streaming data.

KQL basics: The pipe model

KQL queries flow top-to-bottom through a pipe (|):

TradeEvents
| where Timestamp > ago(24h)
| where TradeType == "Buy"
| summarize TotalVolume = sum(Quantity), TradeCount = count() by Symbol
| top 10 by TotalVolume desc

What’s happening line by line:

  1. Start with the TradeEvents table
  2. Filter to events in the last 24 hours
  3. Filter to β€œBuy” trades only
  4. Group by symbol, calculate total volume and count
  5. Return the top 10 by volume
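
The five steps above can be sketched in plain Python over an in-memory list of events. The table contents and timestamps here are hypothetical stand-ins for the TradeEvents table; this only illustrates the filter/summarize/top semantics, not how Eventhouse executes the query.

```python
from datetime import datetime, timedelta
from collections import defaultdict

# Hypothetical stand-in for the TradeEvents table (illustrative data only).
now = datetime(2026, 1, 15, 12, 0)
trade_events = [
    {"Timestamp": now - timedelta(hours=1), "TradeType": "Buy",  "Symbol": "MSFT", "Quantity": 100},
    {"Timestamp": now - timedelta(hours=2), "TradeType": "Buy",  "Symbol": "MSFT", "Quantity": 50},
    {"Timestamp": now - timedelta(hours=3), "TradeType": "Sell", "Symbol": "AAPL", "Quantity": 200},
    {"Timestamp": now - timedelta(days=2),  "TradeType": "Buy",  "Symbol": "AAPL", "Quantity": 999},
]

# Steps 1-3: | where Timestamp > ago(24h) | where TradeType == "Buy"
recent_buys = [e for e in trade_events
               if e["Timestamp"] > now - timedelta(hours=24) and e["TradeType"] == "Buy"]

# Step 4: | summarize TotalVolume = sum(Quantity), TradeCount = count() by Symbol
by_symbol = defaultdict(lambda: {"TotalVolume": 0, "TradeCount": 0})
for e in recent_buys:
    by_symbol[e["Symbol"]]["TotalVolume"] += e["Quantity"]
    by_symbol[e["Symbol"]]["TradeCount"] += 1

# Step 5: | top 10 by TotalVolume desc
top10 = sorted(by_symbol.items(), key=lambda kv: kv[1]["TotalVolume"], reverse=True)[:10]
print(top10)  # MSFT only: 150 shares across 2 trades survive the filters
```

Note how each Python stage consumes the previous stage's output, exactly mirroring the pipe: the AAPL sell fails the TradeType filter and the two-day-old AAPL buy fails the time filter.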

KQL vs SQL comparison

KQL reads top-to-bottom with pipes; SQL uses nested clauses
  • Select all columns: SQL `SELECT * FROM trades`, KQL `trades`
  • Filter rows: SQL `WHERE price > 100`, KQL `| where price > 100`
  • Select columns: SQL `SELECT symbol, price FROM trades`, KQL `trades | project symbol, price`
  • Count rows: SQL `SELECT COUNT(*) FROM trades`, KQL `trades | count`
  • Group and aggregate: SQL `SELECT symbol, SUM(qty) FROM trades GROUP BY symbol`, KQL `trades | summarize sum(qty) by symbol`
  • Sort results: SQL `ORDER BY price DESC`, KQL `| sort by price desc`
  • Limit results: SQL `SELECT TOP 10`, KQL `| take 10` or `| top 10 by column`
  • Time filter: SQL `WHERE timestamp > DATEADD(hour, -24, GETDATE())`, KQL `| where Timestamp > ago(24h)`

Time-series operations

KQL’s strength is time-based analysis. The bin() function groups timestamps into intervals:

Aggregate by time window

// Trades per hour for the last 7 days
TradeEvents
| where Timestamp > ago(7d)
| summarize TradeCount = count(), AvgPrice = avg(Price) by bin(Timestamp, 1h)
| render timechart

Time-based comparison

// Compare today's trade volume to yesterday
TradeEvents
| where Timestamp > ago(2d)
| extend DayLabel = iff(Timestamp > ago(1d), "Today", "Yesterday")
| summarize TotalVolume = sum(Quantity) by DayLabel

Bucketed average

// 5-minute bucketed average of trade price
TradeEvents
| where Timestamp > ago(1h)
| summarize AvgPrice = avg(Price) by bin(Timestamp, 5m)
| render timechart
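
Under the hood, bin() is just a floor operation: each timestamp is rounded down to the start of its interval, so every event in the same bucket gets the same group key. A minimal Python sketch of that semantics (the function name and example timestamps are my own, not KQL internals):

```python
from datetime import datetime, timedelta

def bin_timestamp(ts: datetime, interval: timedelta) -> datetime:
    """Floor ts to the start of its interval, like KQL's bin(Timestamp, 5m)."""
    seconds = int((ts - datetime.min) / timedelta(seconds=1))
    step = int(interval.total_seconds())
    return datetime.min + timedelta(seconds=seconds - seconds % step)

# 12:07:42 falls in the 12:05 five-minute bucket
print(bin_timestamp(datetime(2026, 1, 15, 12, 7, 42), timedelta(minutes=5)))
# 2026-01-15 12:05:00
```

Because every event maps to a bucket start, `summarize ... by bin(Timestamp, 5m)` is an ordinary group-by over those floored keys.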

Common KQL operations for DP-600

  • `where column == value`: filter rows that match a condition
  • `project col1, col2`: select (and optionally rename) columns
  • `extend NewCol = expression`: add a computed column
  • `summarize agg() by groupcol`: aggregate rows by group
  • `bin(Timestamp, 1h)`: group timestamps into intervals
  • `ago(24h)`, `ago(7d)`: relative time filter
  • `top 10 by column desc`: return the top N rows by a column
  • `render timechart`: visualize results as a chart
  • `join kind=inner (table2) on key`: combine two tables on a key
  • `distinct column`: return unique values of a column
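
Of these, ago() is the one that trips people up in exam snippets: it returns an absolute timestamp computed relative to "now", which then feeds an ordinary comparison. A small Python analogue (the function and sample timestamps are illustrative, not the KQL implementation):

```python
from datetime import datetime, timedelta, timezone

def ago(delta, now=None):
    """Mimic KQL's ago(): a timestamp `delta` before the current time."""
    return (now or datetime.now(timezone.utc)) - delta

# Fix "now" so the example is deterministic.
now = datetime(2026, 1, 15, 12, 0, tzinfo=timezone.utc)
cutoff = ago(timedelta(hours=24), now)

event_time = datetime(2026, 1, 15, 3, 0, tzinfo=timezone.utc)
# Equivalent of: | where Timestamp > ago(24h)
print(event_time > cutoff)  # True: the event is within the last 24 hours
```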
πŸ’‘ Scenario: Raj monitors trade anomalies

Raj at Atlas Capital writes a KQL query to detect unusual trading activity:

TradeEvents
| where Timestamp > ago(1h)
| summarize TradeCount = count(), TotalValue = sum(Price * Quantity) by Trader
| where TradeCount > 100 or TotalValue > 10000000
| sort by TotalValue desc

This query surfaces traders who have made more than 100 trades in the last hour or traded more than $10 million β€” potential anomalies that the compliance team should investigate.
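
The logic of Raj's query can be traced in Python: aggregate per trader, then apply the two thresholds and sort. Trader names and trade values below are hypothetical; the point is that the post-summarize `where` filters aggregated rows, not raw events.

```python
from collections import defaultdict

# Hypothetical last-hour trades (the | where Timestamp > ago(1h) step is assumed done).
trades = [
    {"Trader": "alice", "Price": 250.0, "Quantity": 60000},  # 15M notional in one trade
    {"Trader": "bob",   "Price": 10.0,  "Quantity": 5},
]

# | summarize TradeCount = count(), TotalValue = sum(Price * Quantity) by Trader
stats = defaultdict(lambda: {"TradeCount": 0, "TotalValue": 0.0})
for t in trades:
    stats[t["Trader"]]["TradeCount"] += 1
    stats[t["Trader"]]["TotalValue"] += t["Price"] * t["Quantity"]

# | where TradeCount > 100 or TotalValue > 10000000 | sort by TotalValue desc
flagged = sorted(
    ((name, s) for name, s in stats.items()
     if s["TradeCount"] > 100 or s["TotalValue"] > 10_000_000),
    key=lambda kv: kv[1]["TotalValue"], reverse=True)

print([name for name, _ in flagged])  # ['alice'] crosses the $10M threshold
```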

πŸ’‘ Exam tip: When KQL appears on the exam

The DP-600 exam does not expect deep KQL expertise (that is for KQL-specific certs). You need to:

  1. Read a KQL query and understand what it does
  2. Identify when KQL is the right tool (time-series, streaming data, Eventhouse)
  3. Know basic operations β€” where, summarize, bin, ago, project
  4. Know KQL vs SQL β€” choose the right language for the scenario

You will NOT be asked to write complex KQL from scratch.

Question

What is KQL's pipe model?


Answer

KQL queries flow top-to-bottom through pipe operators (|). Each step transforms the result of the previous step: start with a table, filter, project, summarize, sort. This makes queries readable as a sequence of transformations.


Question

What does the bin() function do in KQL?


Answer

bin() groups continuous values (usually timestamps) into discrete intervals. bin(Timestamp, 1h) groups events into hourly buckets. Essential for time-series aggregation β€” 'events per hour', 'average per 5 minutes', etc.


Question

What does ago() do in KQL?


Answer

ago() creates a relative timestamp. ago(24h) means '24 hours before now', ago(7d) means '7 days ago'. Used in where clauses for time-based filtering: | where Timestamp > ago(24h) returns the last 24 hours of data.


Knowledge Check

Raj at Atlas Capital needs to find the total trade volume per hour for the last 7 days in his Eventhouse. Which KQL query is correct?

Knowledge Check

An exam question describes a scenario with application logs arriving at 10,000 events per second. The team needs to find error patterns over the last 24 hours, grouped by 15-minute intervals. Which approach is most appropriate?



Next up: Querying with DAX β€” Data Analysis Expressions for analytical querying and measure evaluation.



© 2026 Sutheesh. All rights reserved.

Guided is an independent study resource and is not affiliated with, endorsed by, or officially connected to Microsoft. Microsoft, Azure, and related trademarks are property of Microsoft Corporation. Always verify information against Microsoft Learn.