The Dispatch · OccuNX
Privacy Intelligence
Published April 21, 2026
Surveillance Capitalism · Feature

From "Productivity" to Profiling: The Business Model You Didn't Agree To

Fifty years of "features." One continuous side hustle in surveillance. You thought you were opening a document. Microsoft opened a file on you.

Open Word. Type a sentence. Close it. You think that's the deal. It isn't. Somewhere between the save icon and the little cloud logo, Microsoft turned the most boring software on Earth into a behavioral telescope pointed at your desk — and they didn't bother asking. They rebranded. "Smart." "Intelligent." "Connected." "Personalized." In 2026 those words mean one thing: we are watching what you do, and we are keeping it.

This isn't a hunch. It's Microsoft's business model: written in their own documentation, tallied in their own transparency reports, and, last summer, admitted under oath by their own lawyer before the French Senate. Let's go.

01 The "Productivity" Lie

Surveillance capitalism doesn't show up in a trench coat. It shows up as autosave. As a little pop-up asking if you'd like smarter grammar. As a Copilot assistant that politely — very politely — wants to read every document you've ever touched so it can help you "be more productive." Cute.

The deal was never productivity. The deal is that you do your job inside a product built to profile you while you do it, and in exchange Microsoft gets a permanent, searchable record of how your brain works at work. That record isn't free. It pays for itself. In enterprise pricing. In AI training data. In the upsell engine that decides, next quarter, you need Copilot.

You didn't buy software. You became the product.

02 What Office Actually Sees

Click a Microsoft 365 icon. At least five distinct data streams light up. None of this is leaked. It's in Microsoft's own privacy disclosures. It's just not on the box.

📊 Telemetry — every click, scroll, feature touch, crash
📄 Content — portions of your files, sent for "smart" processing
💻 Device — OS, hardware, IP, timezone, geolocation
⏱️ Usage — which apps, how long, how often
🔗 Linked account — everything tied to your Microsoft ID

The mechanism is the quiet part. Connected Experiences — on by default. AutoSave — on by default. OneDrive sync — on by default. Cloud-backed Editor, Designer, Copilot — all defaulting to send content to Microsoft servers so the cloud can "assist." Microsoft will tell you required telemetry is small and optional telemetry is your call. What they won't tell you: for most SKUs — including the ones your firm is probably paying for — there is no supported way to fully shut telemetry off. Only certain volume-licensed enterprise and LTSC builds let you drop to the "Security" level. The rest of us get whatever Microsoft decided is "required." Microsoft privacy docs ↗

"Privacy settings" exist. They're a maze. The Microsoft privacy dashboard shows some of what's collected. Not all of it. Enterprise admins can flip telemetry levels you can't. In a Microsoft 365 tenant, the admin's settings override yours — and no, they didn't ask you first either.

03 The CLOUD Act Makes Your Privacy Policy Decorative

Microsoft loves to remind you it doesn't sell your data. Technically true. Technically irrelevant. Since March 23, 2018, Microsoft has been legally obligated to hand it over.

The CLOUD Act — Clarifying Lawful Overseas Use of Data — got tucked into a 2,232-page omnibus spending bill and signed into law without a standalone vote. The effect is exactly what the name advertises: any US-based service provider must disclose data in its "possession, custody, or control" when US law enforcement asks. Server in Seattle, Dublin, Frankfurt, Singapore — doesn't matter. If a US company touches it, the US can pull it. H.R. 4943, 115th Congress ↗

How often does Microsoft actually field one of these demands? Twice a year, they publish the number themselves. The answer is: a lot.

25,182
legal demands for consumer customer data received by Microsoft from law enforcement agencies worldwide in a single six-month period (July–December 2021). Roughly 138 every single day. When anyone says Microsoft "occasionally" complies with lawful requests, recalibrate — it's tens of thousands per half year.

For years, the Microsoft PR line on this was: relax, your European data stays in Europe. That line died last summer. On June 10, 2025, Microsoft France's director of public and legal affairs, Anton Carniaux, sat in front of the French Senate inquiry commission on public procurement and digital sovereignty and took a yes-or-no question under oath. Can you guarantee the data of French citizens hosted in Microsoft's EU data centers will never be handed to US authorities without French authorization? His answer: no. He could not guarantee it. The Register, July 2025 ↗

That's the whole thing. On the record. From Microsoft's own lawyer. Contractual guarantees, EU Data Boundary, Sovereign Cloud, the entire marketing cathedral Microsoft built around European data — all of it yields to a single CLOUD Act order. Carniaux added that Microsoft would ask permission to notify the affected customer, and that no European public sector body has been targeted so far. Neither is a guarantee. Both are pleasantries.

So when Microsoft's sales team promises your law firm the data stays in the region of your choice, here's the honest translation:

Microsoft in marketing
  • 🛡️ "Your data stays in the region you choose."
  • 🇪🇺 "EU Data Boundary. Sovereign Cloud. Localized."
  • 🔒 "We never sell customer data."
  • "Enterprise-grade compliance and privacy controls."
Microsoft under oath
  • ⚖️ We must comply with CLOUD Act orders regardless of storage location.
  • 🚫 We cannot guarantee EU data won't be transferred to US authorities.
  • 🔕 We may not be permitted to tell the customer a demand was served.
  • 📊 Tens of thousands of government demands arrive every six months.

04 The Productivity Score Disaster (And What It Told Us)

October 2020, Microsoft shipped a Microsoft 365 feature called Productivity Score. Pitched to IT as a way to measure "adoption" across the org. Out of the box, it had a dirty little default: managers could see named data about individual employees. How many days they'd sent email. Whether they used "mentions" in chat. Meeting attendance. Content collaboration counts. The whole kit.

Austrian researcher Wolfie Christl of Cracked Labs flagged it on November 24, 2020, and called it what it was — a workplace surveillance tool dressed up as a productivity dashboard. Microsoft caved inside a week. On December 1, Microsoft 365 corporate VP Jared Spataro announced user names would be stripped from the five worst metrics. Microsoft's line was that Productivity Score "was never designed to score individual users" — a genuinely impressive claim about a product that had, until that exact moment, been scoring individual users. GeekWire, Dec 2020 ↗

  • 📅 Oct 2020 — Productivity Score launches with per-employee tracking
  • 📢 Nov 24, 2020 — Cracked Labs publishes analysis calling it workplace surveillance
  • 🔄 Dec 1, 2020 — Microsoft removes individual-level reporting from five contested metrics
  • 🪞 2021–22 — MyAnalytics rebranded and folded into Viva Insights
  • 🧩 2024–26 — Copilot, Purview, and Viva continue generating per-user signals in the tenant

Here's the takeaway. Microsoft built a feature that surveilled individual workers by default. When caught, Microsoft killed the dashboard. What Microsoft didn't kill was the underlying collection. The signals still exist. They still flow. They still feed the next thing. Viva Insights, Copilot activity logs, Purview audit trails, tenant analytics — all running off the same data spine. After December 2020, the dashboard went quiet. The telemetry didn't.


05 The EU Already Said It's Illegal

If the Microsoft defense is "this is just how modern software works," a European regulator — the one that supervises the European Commission itself — formally disagreed two years ago.

The European Data Protection Supervisor opened a first investigation into EU institutions' use of Microsoft products in April 2019. In May 2021, after the Schrems II ruling, it opened a follow-up aimed specifically at the Commission's use of Microsoft 365. On March 8, 2024, the EDPS handed down its decision: the Commission's use of Microsoft 365 infringed multiple provisions of Regulation (EU) 2018/1725 — the data protection law that governs EU institutions. Among the findings: the Commission failed to specify what categories of data Microsoft was collecting, failed to specify the purposes of that processing, and failed to provide appropriate safeguards for data transferred outside the EU. The EDPS ordered the Commission, effective December 9, 2024, to suspend all data flows from its use of Microsoft 365 to Microsoft and sub-processors in non-adequate third countries. EDPS Press Release ↗

Both the Commission and Microsoft have appealed. The point survives the appeal. The EU's top privacy regulator looked at Microsoft 365, as designed and as contracted, and ruled it non-compliant. Not a technical quibble. A regulator saying the product, deployed at scale, is out of compliance by default.

06 What Actually Reduces Exposure

No single setting fixes this. You can't check a box and opt out of a business model. But you can absolutely stop feeding the thing as enthusiastically as Microsoft would like. The short playbook for a small professional firm in 2026:

  • 🧱 Stop defaulting to Microsoft 365. If your work can live offline, run Office LTSC 2024 (volume-licensed, perpetual, no cloud dependency) or Office 2024 (consumer perpetual). Office 2016 and 2019 hit end of support October 14, 2025. If you're still running them on client matters, stop.
  • ⚙️ Turn off Connected Experiences. File → Account → Account Privacy → Manage Settings. Disable optional connected experiences. Disable content analysis. Disable the Customer Experience Improvement Program. Ninety seconds.
  • ☁️ Unlink OneDrive and AutoSave for sensitive work. Save locally. Sync deliberately. The default is a live feed of your work-in-progress going up to the cloud — and you probably didn't mean to turn that on.
  • 🪪 Sign out of the Microsoft account when you don't need it. A signed-in Office app is a profile session. Treat it that way.
  • 🧭 Audit tenant-level telemetry and Copilot governance. On Microsoft 365 Business or Enterprise, your admin can and should set telemetry to the lowest supported level and review how Copilot is touching your data. If you don't know what that looks like in your tenant right now, that's the problem.
  • 🧰 Keep a non-Microsoft option in the rotation. LibreOffice, OnlyOffice, CryptPad — all handle common document formats without the surveillance tax. Even drafting sensitive work in one of those and exporting a clean final meaningfully cuts your exposure.
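For anyone who'd rather script the Connected Experiences shutdown across a few machines than click through the dialog on each one, the toggles above map to Group Policy-backed registry values. A minimal sketch as a .reg fragment, using the value names from Microsoft's documented Office privacy-control policies; treat the exact paths and names as something to verify against current Microsoft documentation for your Office build before deploying:

```reg
Windows Registry Editor Version 5.00

; Office privacy policy values (per-user). 2 = disabled, 1 = enabled.
; Value names follow Microsoft's documented privacy-control policies;
; confirm against current docs for your build before relying on them.
[HKEY_CURRENT_USER\SOFTWARE\Policies\Microsoft\office\16.0\common\privacy]
; "Allow the use of connected experiences in Office" -> off
"disconnectedstate"=dword:00000002
; "...connected experiences that analyze your content" -> off
"usercontentdisabledstate"=dword:00000002
; "...connected experiences that download online content" -> off
"downloadcontentdisabledstate"=dword:00000002
; "Allow the use of additional optional connected experiences" -> off
"controllerconnectedservicesenabled"=dword:00000002

; Diagnostic data level: 3 = "Neither" (send neither optional nor
; required diagnostic data; honored only where the SKU supports it)
[HKEY_CURRENT_USER\SOFTWARE\Policies\Microsoft\office\common\clienttelemetry]
"sendtelemetry"=dword:00000003
```

In a managed Microsoft 365 tenant, the same controls surface as "Privacy" entries in the Office administrative templates and Cloud Policy, which is the supportable route at scale; the registry fragment is for standalone machines you administer yourself.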

None of this is radical. None of it requires a hair shirt or a Linux sticker on your laptop. It requires one honest admission: the default configuration of Microsoft 365 is not designed in your interest. It's designed to generate signal. Your job is to generate less.


You didn't agree to be the dataset. You agreed to edit a document. The gap between those two things — legal, contractual, technical — is where every modern privacy incident at a small firm is born. The fix isn't a product. It's an architecture. Collect less. Send less. Link less. That's the whole discipline.

References & Further Reading Microsoft Law Enforcement Requests Report • EDPS Decision of 8 March 2024 • French Senate Inquiry Hearing, 10 June 2025 • CLOUD Act (H.R. 4943) • Cracked Labs analysis of Productivity Score • Microsoft Office LTSC 2024 product documentation