The Dispatch · Occu·NX — Privacy Intelligence
Published March 7, 2026

The Children's Privacy Paradox, Part II: What the Law Does — and Doesn't — Do

On paper, American kids have federal protections. In practice, those protections were written for filing cabinets and registrar's offices — not AI dashboards, behavioral exhaust data, and a $170 billion ed-tech market that treats schools as acquisition channels.

There are three federal laws that are supposed to protect children's information in and around school. If you only read the statute summaries, you'd think we had this covered. FERPA. COPPA. PPRA. Acronyms on acronyms. Then you look at how they actually work inside a district running 40 ed-tech tools on 1:1 Chromebooks, and the picture changes fast.

01 FERPA: The Privacy Law Stuck in 1974

FERPA — the Family Educational Rights and Privacy Act — is the big one. Signed into law in 1974, it protects student education records at any school receiving federal funds, which is effectively every public school and most private K–12 institutions in the country.

On its face, FERPA gives parents (and students who turn 18) the right to inspect their education records, request corrections to inaccurate information, and block disclosure of personally identifiable information without consent. So far so good.

The problem is a single phrase baked into the law:

Schools are allowed to share student information without parental consent with "school officials" who need it to do their jobs. In 1974, that meant principals, teachers, counselors, maybe a bus contractor. In 2008 and 2011, the Department of Education expanded that definition to explicitly include contractors and outside companies operating under agreement with a school. SPLC ↗

Today, districts routinely classify private companies as "school officials" — unlocking access to student records without any direct parental consent:

  • 🖥️ Learning management system (LMS) providers
  • 🔍 Student "safety" monitoring platforms
  • 📊 Behavior analytics dashboard vendors
  • Any ed-tech tool rubber-stamped onto a district's approved apps list

Once a company is classified as a "school official," it can receive student names, internal IDs, contact information, class schedules, grades, assignments, and sometimes disciplinary or counseling notes — all under the umbrella of "legitimate educational interest," without any direct parental involvement.

FERPA was written for filing cabinets. It's now being stretched over full-blown data pipelines. The law hasn't changed much. The definition of "school official" quietly has.

02 COPPA: A Strong Law With a School-Sized Escape Hatch

COPPA — the Children's Online Privacy Protection Act — is the law most people have heard of. It applies to online services directed at children under 13, or services with actual knowledge they're collecting data from that age group. The requirements are real: a clear privacy policy, verifiable parental consent before collecting or sharing kids' personal information, and strict limits on how children's data can be used, retained, and monetized.

In January 2025, the FTC finalized the first COPPA Rule amendments since 2013 — a unanimous 5-0 vote. The updated rule, effective June 23, 2025, adds a separate opt-in consent requirement for targeted advertising and tightens data retention obligations. On paper, that's a meaningful tightening. FTC, January 2025 ↗

Here's the catch: in the school context, COPPA has its own back door. The FTC's own guidance permits schools to consent on behalf of parents for educational tools — if the data is used only for school-authorized educational purposes and the vendor doesn't use it for broader commercial tracking or advertising. In theory, that's a sensible workaround: teachers can use tools without chasing 30 permission slips per app.

In practice, that "school can consent for you" model gets abused. Routinely. The FTC even declined to finalize its proposed ed-tech amendments in the 2025 rule update, citing pending FERPA regulatory changes — leaving the school consent loophole where it's always been. Federal Register, April 2025 ↗

What COPPA promises parents
  • Verifiable consent before data collection begins
  • Separate opt-in for any targeted advertising
  • Right to delete data already collected
  • Vendor bears the compliance responsibility — not the school

What schools actually deliver
  • "The district accepted terms of service on your behalf."
  • Contracts allowing vendors to reuse "de-identified" data for research and product improvement
  • Parents left out of the loop until something goes wrong

The FTC's 2023 enforcement action against ed-tech platform Edmodo made the stakes explicit. The agency found that Edmodo had collected personal data from hundreds of thousands of children under 13 — including students as young as kindergarteners — without proper parental consent, and used that data for advertising. Worse: Edmodo buried a clause in its terms of service telling schools and teachers that they were "solely responsible" for COPPA compliance, then provided them with none of the information they'd need to actually do it. The FTC called this what it was — an illegal attempt to offload compliance responsibilities downstream. The order imposed a $6 million penalty, suspended because the company couldn't pay it. Edmodo no longer operates in the United States. FTC, May 2023 ↗

$170M
Google and YouTube paid a record FTC fine in 2019 for collecting personal data from children on child-directed channels without parental consent — then using it for targeted advertising. Disney settled a follow-on COPPA violation for $10 million in late 2025 for mislabeling child-directed YouTube videos as "not made for kids," allowing behavioral data collection without parental consent to continue for years after YouTube had flagged the problem directly to the company.

03 PPRA: Good for Surveys, Useless for Surveillance

PPRA — the Protection of Pupil Rights Amendment — is the quiet, narrow cousin in this story. It governs surveys, questionnaires, and evaluations touching on sensitive areas: political beliefs, mental health, sexual behavior, religious practices, and certain family matters. Parents have the right to opt out of many of these surveys and to be notified when federally funded surveys are used.

PPRA is useful when your kid is handed a form asking about their inner life. It is essentially useless for what schools are actually deploying today. PPRA has nothing to say about:

  • 🔴 Always-on browser logging and search history capture
  • 🔴 AI-generated risk scores for self-harm or violence potential
  • 🔴 Behavior tracking dashboards labeling kids as "low engagement" or "off-task"
  • 🔴 Real-time keystroke scanning and document interception

PPRA is about what kids say on a form. It has very little authority over what software infers about them behind their backs, continuously, without their awareness.

04 The Loopholes You Could Drive a School Bus Through

Three federal laws. Zero of them written with AI dashboards, behavioral exhaust data, or 1:1 device monitoring in mind. Here's how the gaps get exploited in practice.

Loophole 01 "School Officials" Expands to Cover Any Approved Vendor
Once a district labels a company a "school official" with "legitimate educational interest," that vendor can receive student names, contact information, grades, class schedules, and discipline records — without parental consent. The LMS, the monitoring platform, the analytics dashboard, and whatever new tool a teacher was encouraged to adopt last semester all potentially qualify. To a parent, it looks like outsourcing parts of the school to private companies. To the law, it still reads as sharing records with school staff.
Loophole 02 Behavioral Exhaust Isn't Treated as an "Education Record"
FERPA revolves around "education records" — things you can put in a file: transcripts, report cards, disciplinary notes. Modern ed-tech generates something else entirely. Web browsing logs, keystroke scans, AI-generated risk scores, time-on-task metrics, engagement heatmaps — companies categorize all of this as "metadata," "signals," or "monitoring data," not education records. That keeps it in a gray zone where FERPA's protections are slower, weaker, or simply inapplicable. Parents rarely see these dashboards. They have no clear legal pathway to challenge what a risk score says about their child.
Loophole 03 "Safety" as a Political Override Switch
Call it surveillance and you have a fight. Call it safety and you get a budget line. AI monitoring platforms pitch to school boards on the promise of catching self-harm and violence signals before anything happens. The political math is brutal: administrators who skip the tool and then face an incident are blamed for doing nothing. Administrators who buy the tool and shred student privacy rarely hear about it in time to face accountability. Lawsuits and investigations have documented the reality: journalism drafts intercepted, mental health emails flagged and never delivered, students outed for their sexual orientation by automated keyword hits, visual art flagged as child pornography and seized, false positives burying staff in noise. Safety becomes the override switch. Privacy law becomes a checkbox.
Loophole 04 School Consent Under COPPA Becomes a Blank Check
COPPA's school-consent exception exists so teachers can actually use tools in class. In practice, district contracts sometimes allow vendors to reuse "de-identified" or "aggregated" data for "product improvement," research, or analytics that can still refine behavioral models and ad systems. The FTC has been clear that school consent only covers data used solely for educational purposes — any commercial use requires direct parental consent. Vendors who ignored that line ended up in enforcement actions. The ones who keep their data reuse language vague enough keep operating.

05 States Tried to Patch the Holes. Vendors Adapted.

Starting around 2014, state legislatures stopped waiting for Congress to modernize FERPA and COPPA and started writing their own student privacy laws. By 2018, at least 40 states had passed student data privacy laws targeting vendor practices, data sharing, breach notification, and restrictions on advertising and data sales. As of 2024, nearly 150 student privacy laws are on the books across 47 states and Washington, D.C. Parent Coalition for Student Privacy ↗

California's SOPIPA — the Student Online Personal Information Protection Act, passed in 2014 — became the national template. It bans vendors from using K–12 student data for targeted advertising, prohibits the sale of student data, and limits even "de-identified" data reuse to tightly defined educational purposes. More than 20 states have since adopted SOPIPA-style laws. Public Interest Privacy Center ↗

That's real progress. It's also a patchwork. A kid in one state has substantially stronger protections than a kid in the next state over. Enforcement is complaint-driven — nothing moves until someone files, and most parents don't know what to file about. Smaller districts don't have the legal resources to negotiate hard terms with major ed-tech vendors. They sign what's put in front of them. And the technology keeps advancing — AI proctoring, biometrics, more granular behavioral analytics — faster than any legislature can write rules to govern it.

06 Enforcement: Real Fines, Same Architecture

To be fair, regulators have landed real punches. Google and YouTube paid $170 million in 2019 for tracking children on child-directed content without parental consent. Edmodo was hit with a $6 million penalty in 2023 for using student data for advertising and illegally offloading compliance onto schools. Disney paid $10 million in late 2025 for mislabeling child-directed YouTube videos and allowing children's data to be collected and used for targeted ads for years. The pattern, though, runs in one direction: enforcement happens after a breach, a scandal, or a press investigation. The underlying architecture — routine, normalized data capture and behavioral monitoring of children at scale — stays in place.

In Kansas, nine current and former students filed a federal civil rights lawsuit against Lawrence Public Schools in 2025 over the district's use of Gaggle, an AI surveillance platform that scans everything connected to a school's Google Workspace — Gmail, Drive, documents — and flags content for self-harm risk, violence, drug use, and "inappropriate content." The lawsuit alleged the platform violated First and Fourth Amendment rights, intercepted students' mental health emails to trusted teachers, seized journalism drafts before publication, and flagged original artwork as child pornography. A district spreadsheet obtained by student journalists showed Gaggle had flagged phrases including "my mental health," "I'm not good enough," and "are you okay." According to Gaggle's own website, more than 1,500 school districts across the country use the same software. The district has since ceased using Gaggle; the lawsuit continues. Lawrence Times, August 2025 ↗

1,500+
School districts across the U.S. use Gaggle's AI surveillance software, according to the company's own website — scanning student emails, documents, and messages for "safety" signals. Independent research has not confirmed that these systems reliably prevent violence or self-harm. What is documented: false positives, civil rights violations, and students being outed for their sexual orientation by automated keyword detection.

07 What People Can Actually Do Right Now

Until the law grows teeth that match the technology, the realistic power is local: school boards, parent groups, students, individual educators. These are the levers that still work.

A
Ask the Annoyingly Specific Questions

At school board meetings, skip the general "do we care about privacy?" speech. Ask which vendors receive student data and what categories they see. Ask whether AI monitoring tools are in use, what they scan, and on which devices. Ask how long each vendor retains student data, whether it can be reused for product improvement or research, and whether any of it is combined with third-party ad-tech. Ask whether families can opt out of specific platforms, and how a parent challenges a wrong risk score. Get the answers in writing — policy language, board minutes, contract terms. A paper trail creates both political pressure and legal foundation.

B
Use the Existing Laws, Even Imperfect Ones

Under FERPA, you can request your child's education records and explicitly ask whether records from third-party vendors — the LMS, the monitoring platform, the behavior dashboard — are included. If you believe data was shared improperly, you can file with the Department of Education's Student Privacy Policy Office at studentprivacy.ed.gov. Under COPPA, if a service is collecting data from children under 13 without proper parental consent, or using that data beyond its stated educational purpose, file a complaint with the FTC at reportfraud.ftc.gov. FTC enforcement priorities are shaped by complaint patterns. And many states now publish parent-friendly guides and explicit processes for invoking their own student privacy laws — check your state attorney general's website or the Parent Coalition for Student Privacy's state law database.

C
Don't Do It Alone

A small coalition of parents, students, librarians, and teachers can push a district to re-evaluate or drop a surveillance vendor. A few well-constructed public records requests can reveal exactly how much student data is leaving the district and under what contractual terms. Local media will often engage if you can hand them: here's the contract, here's what the law says, here's what the vendor is actually doing. This isn't about chasing perfection. It's about moving the balance of power a few inches back toward kids and families.

08 Resources

Where to Go From Here
  • U.S. Department of Education — Student Privacy Policy Office
    FERPA and PPRA explainers, complaint filing, and district compliance guidance. studentprivacy.ed.gov ↗
  • FTC — COPPA Rule Overview and Complaint Portal
    Full rule text, business guidance, and consumer complaint reporting. ftc.gov/coppa ↗
  • Parent Coalition for Student Privacy / Student Privacy Matters
    State law database, report cards, and plain-language parent guides. studentprivacymatters.org ↗
  • Electronic Frontier Foundation — Student Privacy
    Research, surveillance vendor analysis, and digital rights resources for students and educators. eff.org/issues/student-privacy ↗
  • Knight First Amendment Institute at Columbia University
    Litigation, legal analysis, and documentation of school surveillance overreach. knightcolumbia.org ↗

The law hasn't failed children because of bad intentions. It's failed them because it was built for a world that no longer exists — and because every incentive in the ed-tech market points away from restraint. Knowing the structure of what's broken is the first step toward actually changing it.
