# Data Sync: Change Tracking, Alternate Keys & Upsert
Synchronise data between Dataverse and external systems reliably. Learn change tracking for incremental sync, alternate keys for matching records, and UpsertRequest for idempotent operations.
## Keeping systems in sync
Think of data sync like updating a phone's contact list from a shared directory.
The slow way: download ALL contacts every time (full sync). The smart way: ask "what changed since my last sync?" and only download the delta (change tracking). For matching records, alternate keys define how to find "the same person" across systems. And Upsert automatically creates or updates based on whether a match exists, so there are no duplicates.
## Change tracking

### How it works
- Enable change tracking on the table
- First sync: retrieve ALL records plus a version token
- Subsequent syncs: send the token to get only the created, modified, and deleted records
- Store the new token for the next sync
### Web API pattern

```http
// First sync: get everything plus a delta token
GET /api/data/v9.2/contacts?$select=fullname,emailaddress1
Prefer: odata.track-changes

// Response includes @odata.deltaLink with the token

// Next sync: only changes
GET /api/data/v9.2/contacts?$deltatoken=919042!...

// Returns new, modified, and deleted (marked @removed) records
```
### Organisation Service pattern

```csharp
// First sync: request all changes for the contact table
var request = new RetrieveEntityChangesRequest
{
    EntityName = "contact",
    Columns = new ColumnSet("fullname", "emailaddress1"),
    PageInfo = new PagingInfo { Count = 5000, PageNumber = 1 }
};

var response = (RetrieveEntityChangesResponse)service.Execute(request);
string dataToken = response.EntityChanges.DataToken; // Store this!

// Next sync: pass the stored token
request.DataVersion = dataToken;
var delta = (RetrieveEntityChangesResponse)service.Execute(request);

foreach (var change in delta.EntityChanges.Changes)
{
    if (change.Type == ChangeType.NewOrUpdated)
    {
        Entity record = ((NewOrUpdatedItem)change).NewOrUpdatedEntity;
    }
    else if (change.Type == ChangeType.RemoveOrDeleted)
    {
        EntityReference removed = ((RemovedOrDeletedItem)change).RemovedItem;
    }
}
```
| Feature | Full Sync | Change Tracking |
|---|---|---|
| Data transferred | ALL records every time | Only changes since last sync |
| Performance | Slow for large tables | Fast: proportional to change volume |
| Detects deletes? | Must compare entire dataset | Yes: deleted records are flagged |
| First run | Returns all records | Same: all records + version token |
## Alternate keys

Alternate keys let external systems reference records using natural business identifiers instead of GUIDs.

```csharp
// Update using an alternate key (no GUID needed)
Entity contact = new Entity("contact", "employeenumber", "EMP-12345");
contact["emailaddress1"] = "new@company.com";
service.Update(contact);
```

```http
// Web API: alternate key in the URL
PATCH /api/data/v9.2/contacts(employeenumber='EMP-12345')
{ "emailaddress1": "new@company.com" }
```
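Before an alternate key can be used, it has to be defined on the table (via the maker portal or programmatically). A minimal SDK sketch using `CreateEntityKeyRequest`, assuming the `employeenumber` column already exists on `contact`; the schema name shown is just an example:

```csharp
// Define an alternate key on contact over the employeenumber column.
// Key creation runs asynchronously: Dataverse builds a supporting index,
// and the key must reach Active status before it can be used in lookups or upserts.
var createKey = new CreateEntityKeyRequest
{
    EntityName = "contact",
    EntityKey = new EntityKeyMetadata
    {
        SchemaName = "contact_employeenumber_key", // example name
        DisplayName = new Label("Employee Number Key", 1033),
        KeyAttributes = new[] { "employeenumber" }
    }
};
service.Execute(createKey);
```

Columns used in a key must contain unique values; if existing data violates uniqueness, the key stays in a failed state until the duplicates are resolved.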
## UpsertRequest: Create or Update in one call

Upsert checks whether a record already exists (matched by primary key or alternate key). If yes, it updates; if no, it creates. The operation is idempotent, so it is safe for retries.

```csharp
// Upsert a product, matched by its alternate key (productcode)
Entity product = new Entity("product");
product.KeyAttributes.Add("productcode", "SKU-A100");
product["name"] = "Widget A";
product["price"] = new Money(29.99m);

var response = (UpsertResponse)service.Execute(new UpsertRequest { Target = product });

if (response.RecordCreated)
    tracingService.Trace("Created new product");
else
    tracingService.Trace("Updated existing product");
```
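The same semantics are available in the Web API: a `PATCH` against an alternate-key URL behaves as an upsert by default, and conditional headers let you restrict it to create-only or update-only. A sketch:

```http
// Upsert by alternate key: creates the product if SKU-A100 is new,
// updates it otherwise
PATCH /api/data/v9.2/products(productcode='SKU-A100')
{ "name": "Widget A", "price": 29.99 }

// If-None-Match: * blocks the update path (create only)
// If-Match: *      blocks the create path (update only)
```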
## Scenario: Marcus builds a product sync

Marcus syncs 10,000 products nightly from ERP to Dataverse:

- Change tracking returns only the 50 changed products (not all 10,000)
- UpsertRequest with the product SKU as an alternate key handles create vs. update
- Deleted products are flagged by change tracking, so Marcus deactivates them in Dataverse

Result: a 30-second sync instead of 10 minutes.
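Putting the pieces together, the nightly job can be sketched as below. This is a minimal sketch, not Marcus's actual code: `GetErpDelta()` and `GetErpDeletes()` are hypothetical helpers standing in for the ERP's change feed, and the inactive `statecode` value is an assumption that should be checked against the target table.

```csharp
// Nightly ERP -> Dataverse sync sketch.
// Assumes: 'service' is a connected IOrganizationService, and a
// 'productcode' alternate key is defined on the product table.
foreach (var row in GetErpDelta())          // hypothetical ERP delta feed
{
    var product = new Entity("product");
    product.KeyAttributes.Add("productcode", row.Sku); // match on alternate key
    product["name"] = row.Name;
    product["price"] = new Money(row.Price);

    // Creates the record if the SKU is new, updates it otherwise;
    // safe to retry because the SKU match makes the call idempotent
    service.Execute(new UpsertRequest { Target = product });
}

foreach (string sku in GetErpDeletes())     // hypothetical ERP delete feed
{
    // Deactivate rather than delete, preserving history
    var product = new Entity("product", "productcode", sku);
    product["statecode"] = new OptionSetValue(1); // assumed inactive state value
    service.Update(product);
}
```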
- An ERP needs to sync 10,000 employees to Dataverse nightly. Each has a unique Employee ID. Some are new, some updated, some removed. Most efficient approach?
- A sync job sends UpsertRequest twice for the same product (SKU-A100) due to a retry. What happens?
🎬 Video coming soon
Congratulations! You have completed all 26 modules of the PL-400 study guide. You now have the knowledge to design, build, extend, and integrate Power Platform solutions at a professional developer level.