Logged In, Left Behind: Why FERPA Cannot Keep Up with AI in the Classroom
Abigail Clarke
In Austin, Texas, some fourth graders no longer have teachers. Instead, they log onto an AI-driven platform called 2 Hour Learning, which guides them through personalized lessons in mathematics and reading and records every click, every wrong answer, and every pause along the way.[1] Alpha School, the private institution behind this model, charges up to $40,000 per year in tuition and has expanded to campuses in Miami, Dallas, and Washington, D.C.[2] Across public school districts, similar, if less dramatic, transformations are underway. AI-driven adaptive learning platforms such as DreamBox and Lexia Core5 now shape daily instruction for K–12 students nationwide.[3]
These tools share a common architecture. They collect vast quantities of student behavioral data, use that data to personalize instruction, and transfer it to third-party vendors who operate the platforms.[4] What they do not share is meaningful federal oversight. The Family Educational Rights and Privacy Act (FERPA), enacted in 1974, remains the primary federal statute governing student data privacy.[5] Its core framework was not designed for adaptive algorithms that harvest keystroke patterns from eight-year-olds.[6] The result is a widening gap between the data practices of modern edtech and the legal protections students, families, and educators can actually rely on.
This piece examines that gap. It argues that FERPA’s structural deficiencies, particularly the “school official” exception and its de-identification loophole, create serious risks for students whose data increasingly powers the very AI systems evaluating them. It briefly notes the environmental dimension of this technological surge before turning to reform models that could bring oversight up to date with reality.
I. FERPA’s Architecture and Its Limits
FERPA grants parents and eligible students the right to access, review, and challenge their education records. It restricts schools from disclosing personally identifiable information from those records without consent, subject to a list of enumerated exceptions.[7] For most of its history, FERPA functioned tolerably well. The data at issue was static, stored locally, and accessible only to a finite set of school employees who could plausibly be supervised.
AI-driven learning platforms deviate from this norm on every point. They generate continuous behavioral records, including a student's hesitation before answering a question, the pace at which the student moves through modules, and patterns that correlate with emotional states.[8] They store that data remotely, often across networks the school never explicitly approved. And they are operated by commercial vendors whose primary obligations run to shareholders rather than students.[9]
The Department of Education has made only modest attempts to update FERPA’s regulatory framework to address these realities. The last significant regulatory amendments date to 2011.[10] As of 2025, FERPA lacks clear cybersecurity requirements, even as schools rely on hundreds of edtech tools. Experts have repeatedly urged lawmakers to modernize FERPA with vendor security obligations, but Congress has not acted.[11]
II. The School Official Exception and the De-Identification Loophole
The central mechanism that enables edtech vendors to access student records without parental consent is FERPA's "school official" exception. Under this provision, schools may share education records with third parties if those parties perform an institutional service or function, have a legitimate educational interest in the records, remain under the school's direct control with respect to the records, and do not use or disclose the data beyond their authorized purpose.[12] In practice, this means a school can designate a commercial AI platform as a "school official" through a contractual clause, and the platform then handles student data with little additional oversight.[13]
De-identification opens a second loophole, and it is where the consequences are most serious. FERPA permits vendors to use de-identified student data relatively freely: remove a set of direct identifiers, and the data is no longer considered an education record.[14] But FERPA's de-identification requirements are minimal, and researchers have repeatedly demonstrated that de-identified educational datasets can be re-identified using auxiliary information.[15] A model trained on de-identified student behavioral data learns the patterns of real students. When that model is later deployed as a tutoring AI or a risk-prediction system, it embeds those patterns back into educational contexts.
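To make the re-identification risk concrete, the following is a minimal illustration of a linkage attack, the technique the cited research describes. Every record, field name, and threshold below is invented for demonstration; no real platform or student data is involved. The point is that stripping names leaves behavioral "fingerprints" that auxiliary information can match.

```python
# Illustrative sketch only: a toy linkage attack on "de-identified" records.
# All data below is fabricated; field names are hypothetical.

# A dataset with direct identifiers removed, but with behavioral
# quasi-identifiers (pace, hesitation counts) left intact.
deidentified = [
    {"record_id": "r1", "avg_pace_sec": 41.2, "hesitations": 7, "modules_done": 12},
    {"record_id": "r2", "avg_pace_sec": 88.5, "hesitations": 2, "modules_done": 9},
    {"record_id": "r3", "avg_pace_sec": 41.0, "hesitations": 7, "modules_done": 12},
]

# Auxiliary information an adversary might plausibly hold about one
# known student (e.g., from another dataset or direct observation).
known_student = {"name": "Student A", "avg_pace_sec": 41.2, "hesitations": 7}

def link(records, aux, pace_tolerance=0.5):
    """Return records whose behavioral fingerprint matches the auxiliary data."""
    return [
        r for r in records
        if abs(r["avg_pace_sec"] - aux["avg_pace_sec"]) <= pace_tolerance
        and r["hesitations"] == aux["hesitations"]
    ]

# The "anonymous" dataset collapses to a short list of candidates for
# the known student, defeating the nominal de-identification.
matches = link(deidentified, known_student)
print([r["record_id"] for r in matches])
```

Even this crude matcher narrows three "anonymous" records to two candidates; with the dozens of behavioral dimensions real platforms record, matches are typically unique.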
FERPA’s enforcement structure compounds the problem. The Department of Education has never imposed a financial penalty on an institution for violating FERPA, instead relying on voluntary, monitored compliance.[16] The Supreme Court’s holding in Gonzaga University v. Doe (that FERPA creates no privately enforceable rights under 42 U.S.C. § 1983) eliminates the most obvious alternative remedy, leaving aggrieved students with no direct cause of action.[17] With the Department’s future uncertain following a 2025 executive order directing substantial reorganization, the prospect of meaningful federal enforcement has grown dimmer.[18]
III. The Environmental Footnote
The data privacy problem does not exist in isolation from environmental concerns. The adaptive learning platforms collecting student data run on cloud infrastructure powered by energy-intensive data centers. U.S. data centers consumed approximately 4.4 percent of total domestic electricity in 2023, a figure projected to nearly triple by 2028.[19] As AI-powered schooling scales from private academies to public charter networks, the aggregate energy demand of edtech is an increasingly non-trivial contributor to the nation’s carbon footprint.
IV. Reform Pathways
The inadequacy of FERPA's current framework does not mean the problem is intractable. Several reform models merit consideration.
First, Congress could tighten the school official exception by requiring that any third-party vendor designated as a school official be subject to the same substantive obligations as the school itself, including prohibitions on using student data to train commercial AI models, with enforcement through Federal Trade Commission (FTC) jurisdiction. The FTC's enforcement action against Edmodo in 2023 illustrates that federal consumer protection authority can reach edtech vendors directly when the Children's Online Privacy Protection Act (COPPA) applies, providing a workable template for expanded vendor-level accountability.[20] This would not eliminate the exception's practical utility but would give it meaningful teeth.[21]
Second, Congress could update FERPA’s de-identification standard. The current framework of removing a list of named identifiers was developed before behavioral analytics existed. A modern standard would require demonstrable re-identification risk assessment, consistent with technical literature showing that sparse, high-dimensional behavioral data is substantially harder to anonymize than demographic records.[22]
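A "demonstrable re-identification risk assessment" could be as simple as measuring how many records are unique on their quasi-identifiers before release. The sketch below shows one such metric under invented data; FERPA prescribes no such test today, and the field names are hypothetical.

```python
# Illustrative sketch only: a uniqueness metric of the kind a
# re-identification risk assessment might report. Data is fabricated.
from collections import Counter

# Quasi-identifier tuples drawn from behavioral data: (pace bucket,
# error-pattern code, modules completed). Sparse, high-dimensional
# behavioral traces tend to make most such tuples unique.
records = [
    ("fast", "E7", 12),
    ("slow", "E2", 9),
    ("fast", "E7", 11),
    ("medium", "E5", 10),
    ("fast", "E7", 12),  # the only tuple that repeats
]

def uniqueness_rate(rows):
    """Fraction of records whose quasi-identifier tuple appears exactly once."""
    counts = Counter(rows)
    unique = sum(1 for row in rows if counts[row] == 1)
    return unique / len(rows)

print(f"{uniqueness_rate(records):.0%} of records are unique")
```

A standard keyed to a metric like this, rather than to a fixed list of named identifiers, would track the actual risk the technical literature documents.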
Third, and most ambitiously, the European Union (EU)'s General Data Protection Regulation (GDPR) offers a persuasive model for what data minimization, purpose limitation, and algorithmic accountability can look like at a regulatory level. The GDPR requires that data processing be limited to what is necessary for a specified purpose, grants individuals a right to contest automated decisions, and imposes substantial financial penalties for violations.[23] The EU Artificial Intelligence Act goes further by classifying AI used in educational settings as high-risk and requiring mandatory human oversight and pre-deployment conformity assessments, in stark contrast to FERPA's absence of any pre-deployment review requirement.[24] EU enforcement has proven consequential. When Danish regulators found in 2022 that a municipality's contract with Google for classroom tools did not prevent Google from processing student data for its own commercial purposes, they banned the deployment entirely and suspended data transfers to the United States: the precise conduct that FERPA's school official exception permits to continue unchecked in American schools.[25] Ireland's TikTok enforcement action concerning children's data demonstrates the same principle at scale, with regulators imposing substantial financial penalties of a kind FERPA's enforcement structure cannot replicate.[26]
In the near term, state-level action may fill the gap. California's Student Online Personal Information Protection Act (SOPIPA) directly regulates edtech vendors, prohibiting them from selling student personal information or using it for targeted advertising.[27] As of early 2025, at least thirteen states have enacted student data privacy laws that supplement FERPA in various ways, creating a patchwork of protections that varies substantially by jurisdiction.[28]
Conclusion
AI-powered learning platforms are not simply on their way into U.S. classrooms. They are already here. A typical K–12 district now relies on thousands of edtech applications, most adopted without formal privacy review.[29] The students using those platforms are generating behavioral data at scale, under legal protections designed for a world that no longer exists. FERPA is not beyond repair. However, repairing it requires acknowledging what it was never built to do. It was never built to regulate algorithmic systems that learn from children, retain that learning indefinitely, and feed it back into consequential decisions about those same children's educational futures. The EU's experience with meaningful enforcement, real financial penalties, and mandatory pre-deployment review for high-risk AI demonstrates that robust oversight is achievable. The cost of inaction is not abstract. It is measured in the behavioral profiles of students nationwide, in the training datasets of tomorrow's edtech products, and in the absence of any meaningful right to challenge either.
[1] Janet Shamlian, Inside the $40,000 a Year School Where AI Shapes Every Lesson, Without Teachers, CBS News (Oct. 8, 2025), https://www.cbsnews.com/news/alpha-school-artificial-intelligence/ (describing fourth and fifth graders at Alpha School Austin working through personalized AI-driven software with no classroom teachers present, and every click and keystroke tracked by the platform).
[2] Sunlen Serfaty, Linda Gaudino & Nicky Robertson, ‘What If I Told You This School Had No Teachers?’: Is AI Schooling the Future of Education — or a Risky Bet?, CNN (Jan. 29, 2026), https://www.cnn.com/2026/01/29/politics/alpha-school-trump-ai-teaching (reporting on Alpha School’s expansion to Miami, Dallas, and Washington, D.C., and U.S. Secretary of Education Linda McMahon’s endorsement of the AI-powered model).
[3] See Lexia, Core5 U.S. Infographic (2023–24), https://www.lexialearning.com/resources/infographics/core5-us-infographic (reporting that Lexia Core5 reached more than 3.7 million students during the 2023–24 school year); see also Alyson Klein, Discovery Education to Acquire DreamBox’s Math, Reading Tools, Gov’t Tech (Aug. 30, 2023), https://www.govtech.com/education/k-12/discovery-education-to-acquire-dreamboxs-math-reading-tools (noting that DreamBox’s tools serve approximately 6 million K–12 students and 600,000 teachers across the United States).
[4] Alpha Sch., The Program, https://alpha.school/the-program/ (describing the 2 Hour Learning platform’s adaptive AI instruction model and data-driven personalization of each student’s daily curriculum); see also Amy C. Pimentel et al., EdTech and Privacy: Navigating a Shifting Regulatory Landscape, McDermott Will & Emery (Oct. 30, 2024), https://www.mcdermottlaw.com/insights/edtech-and-privacy-navigating-a-shifting-regulatory-landscape/ (observing that edtech tools collect more personal data about students than ever before, including survey results, school performance, study habits, and data sufficient to create psychological profiles and predict academic performance).
[5] Family Educational Rights and Privacy Act, 20 U.S.C. § 1232g (1974).
[6] Family Educational Rights and Privacy Act, 20 U.S.C. § 1232g (1974); see also Elana Zeide, Student Privacy Principles for the Age of Big Data: Moving Beyond FERPA and FIPPs, 8 Drexel L. Rev. 339, 353 (2016).
[7] Family Educational Rights and Privacy Act Regulations, 34 C.F.R. pt. 99 (2011); see Amy Rhoades, Big Tech Makes Big Data Out of Your Child: The FERPA Loophole EdTech Exploits to Monetize Student Data, 9 Am. U. Bus. L. Rev. 441, 469 (2021).
[8] Rhoades, supra note 7, at 450–55.
[9] Rhoades, supra note 7.
[10] Family Educational Rights and Privacy Act Regulations, 34 C.F.R. pt. 99 (2011); see Rhoades, supra note 7, at 469 (noting that FERPA regulations have not meaningfully updated enforcement mechanisms for edtech vendors).
[11] See Ariel Fox Johnson, 50 Years After FERPA’s Passage, Ed Privacy Law Needs an Update for the AI Era, The 74 (Aug. 20, 2024), https://www.the74million.org/article/50-years-after-ferpas-passage-ed-privacy-law-needs-an-update-for-the-ai-era (arguing FERPA lacks cybersecurity requirements despite schools relying on hundreds of edtech tools).
[12] 34 C.F.R. § 99.31(a)(1)(i)(B) (2011) (permitting educational agencies to disclose student personally identifiable information to contractors under the direct control of the institution performing services the institution would typically perform).
[13] See Rhoades, supra note 7, at 460–69; see also Ctr. for Democracy & Tech., Commercial Companies and FERPA’s School Official Exception: A Survey of Privacy Policies 1–2 (2021), https://cdt.org/insights/commercial-companies-and-ferpas-school-official-exception-a-survey-of-privacy-policies/.
[14] See 34 C.F.R. § 99.3 (2011) (defining “personally identifiable information” and providing de-identification standards); see also U.S. Dep’t of Educ., Family Educational Rights and Privacy Act: Guidance for Reasonable Methods and De-identification of Education Records 3–5 (2011), https://studentprivacy.ed.gov/sites/default/files/resource_document/file/Guidance_for_Reasonable_Methods%20final_0_0.pdf.
[15] See Zeide, supra note 6, at 374–78 (discussing research demonstrating that nominally de-identified educational behavioral datasets can be re-identified using auxiliary data sources).
[16] See President Trump Orders Closure of the Department of Education: What Schools and EdTech Companies Need to Know About FERPA, Nat’l L. Rev. (2025), https://natlawreview.com/article/president-trump-orders-closure-department-education-what-schools-and-edtech (noting that as of 2025, the Department of Education has never imposed a financial penalty on an institution for violating FERPA, instead relying on voluntary, monitored compliance).
[17] Gonzaga Univ. v. Doe, 536 U.S. 273, 290 (2002) (holding that FERPA’s nondisclosure provisions “create no personal rights to enforce under § 1983” because the statute’s provisions lack the individually focused rights-creating language required to support a § 1983 action).
[18] Exec. Order No. 14,242, 90 Fed. Reg. 13,679 (Mar. 20, 2025) (directing the Secretary of Education to take steps to facilitate the closure of the Department of Education “to the maximum extent appropriate and permitted by law”).
[19] See Lawrence Berkeley Nat’l Lab., U.S. Dep’t of Energy, United States Data Center Energy Usage Report 3 (2024) (finding U.S. data centers consumed approximately 4.4 percent of total domestic electricity in 2023, with projections to triple by 2028).
[20] Children’s Online Privacy Protection Act, 15 U.S.C. §§ 6501–6506 (1998); United States v. Edmodo, LLC, No. 3:23-cv-02495 (N.D. Cal. May 22, 2023) (consent order) (finding COPPA violations including collection of persistent identifiers from children under thirteen without parental consent; ordering deletion of AI models and algorithms trained on unlawfully collected student data).
[21] See Pub. Interest Privacy Ctr., Fixing FERPA: Enhancing EdTech Accountability 6–8 (2022), https://publicinterestprivacy.org/edtech-data-sharing/ (proposing statutory amendment requiring vendor-level accountability requirements and FTC enforcement authority over edtech vendors receiving student data).
[22] See Zeide, supra note 6, at 374–78; see also Rhoades, supra note 7, at 470–74 (arguing that FERPA’s de-identification standard is technically inadequate).
[23] Council Regulation 2016/679, Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data (General Data Protection Regulation), arts. 5(1)(b), 5(1)(c), 8, 22, 25, 83, 2016 O.J. (L 119) 1; see Lilian Edwards & Michael Veale, Slave to the Algorithm? Why a ‘Right to an Explanation’ Is Probably Not the Remedy You Are Looking For, 16 Duke L. & Tech. Rev. 18, 21–27 (2017) (analyzing the limits of GDPR’s Article 22 right to explanation for automated decisions and its application to algorithmic educational systems).
[24] Council Regulation 2024/1689, Laying Down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act), arts. 5(1)(b), 13–14, Annex III § 3, 2024 O.J. (L 1689) 1 (classifying AI systems used in educational and vocational training settings as high-risk; prohibiting AI systems that exploit age-related vulnerabilities of children; requiring mandatory human oversight and pre-deployment conformity assessments for all high-risk AI systems).
[25] Marcelo Corrales Compagnucci, Danish DPA Banned the Use of Google Chromebooks and Google Workspace in Schools in Helsingør Municipality, 8 Eur. Data Prot. L. Rev. 405, 406–08 (2022).
[26] Data Prot. Comm’n (Ir.), TikTok Technology Ltd., Decision DPC-V-2021-002 (Sept. 15, 2023) (imposing €345 million penalty for GDPR violations in the processing of children’s personal data, including failure to implement data protection by design and default).
[27] Student Online Personal Information Protection Act, Cal. Bus. & Prof. Code § 22584 (West 2014) (prohibiting edtech operators from selling student personal information or using it for targeted advertising and requiring vendors to maintain reasonable security procedures).
[28] See Pimentel et al., supra note 4 (noting that at least thirteen states have passed laws modeled after SOPIPA and that state privacy laws range in scope and effect from state to state); see also Leah Plunkett & Urs Gasser, Student Privacy and Ed Tech (K-12) Research Briefing 8–12 (Berkman Klein Ctr., Rsch. Pub. No. 2016-15, 2016) (surveying state-level student privacy legislation and finding widespread variation in vendor accountability requirements across states with enacted student data privacy laws).
[29] See TechPolicy Press, The Case for Making EdTech Companies Liable Under FERPA (Nov. 2025), https://www.techpolicy.press/the-case-for-making-edtech-companies-liable-under-ferpa/ (citing estimate that schools rely on thousands of edtech applications per school year, most adopted without formal privacy review).
