AI Governance for School Boards
Practical governance frameworks for school boards navigating AI adoption — protecting student data, parent trust, and educational integrity.
The Reality
Teachers and administrative staff across your school board are already experimenting with AI tools like ChatGPT, Copilot, and other automation systems — often without clear policies. Every day, they're using AI to draft report card comments, create lesson plans, summarize student progress, generate assessment rubrics, and communicate with parents.
School boards have a duty of care that extends beyond academics. When AI touches student data, the stakes are fundamentally different.
The question is not whether AI is being used in your schools. The question is whether it is governed.
Key Risk Areas
Teachers might paste student names, grades, IEP accommodations, behavioural notes, or health information into ChatGPT to draft report cards or parent communications — violating education privacy legislation.
AI can generate inaccurate lesson content, fabricate citations in curriculum materials, or produce misleading student assessments. In education, accuracy directly impacts student outcomes.
Many AI tools store prompts or use them for training. School boards must protect student records, special education documentation, staff evaluations, and internal communications.
Parents entrust schools with their children's information. If families learn AI is processing student data without oversight, it can quickly erode trust and generate public concern.
Teachers may come to rely on AI for drafting report cards, creating assessments, or summarizing student progress. Without training, that reliance can introduce errors that affect student records and damage parent relationships.
Framework
Clear guidelines on what educators and staff can and cannot use AI for — from report cards to lesson planning to parent communications.
Explicit rules on what student information must never be entered into AI tools — including names, grades, IEPs, behavioural notes, and health records.
AI outputs must always be reviewed by educators before use in any student-facing, parent-facing, or official capacity.
Guidelines for when AI-assisted content must be disclosed to parents, students, and the broader school community.
Staff need to understand how AI works, where it fails, how to verify outputs, and why student data requires special protection.
Regulatory Context
School boards operate under some of the strictest privacy obligations in the public sector. Student data is among the most sensitive categories of personal information, and the regulatory framework reflects that.
Freedom of Information and Protection of Privacy Act
Governs how public bodies collect, use, and disclose personal information (in Ontario, school boards fall under the municipal counterpart, MFIPPA). AI tools that process student data must comply with these requirements.
Personal Information Protection and Electronic Documents Act
Federal privacy legislation governing the commercial vendors behind most AI tools. It applies whenever AI services transfer student data across provincial or national borders, as many cloud-based tools do.
Education-specific privacy provisions
Most provinces have education-specific legislation that imposes additional obligations on how student information is handled, stored, and shared.
Artificial Intelligence and Data Act (AIDA)
Canada's proposed AI legislation would, if enacted, introduce new compliance requirements. School boards that establish governance frameworks now will be better positioned when regulations take effect.
Interactive Assessment
Answer 12 questions to assess your school board's current AI governance posture. This is not a pass/fail exercise; it's a starting point for structured improvement.
Do you have a written policy outlining what student information may and may not be entered into AI tools by staff or educators?
Are teachers and staff explicitly prohibited from entering student names, grades, IEP details, behavioural notes, or health information into public AI systems?
Have you reviewed the data storage and retention practices of AI tools used by educators and administrative staff?
Are your AI usage practices compliant with FIPPA, PIPEDA, and provincial education privacy legislation?
Is AI-generated content (report card comments, parent communications, curriculum materials) reviewed by a human prior to distribution?
Are educators trained to verify AI-generated lesson plans, assessment rubrics, or student summaries before use?
Is there a documented process requiring human validation of AI-assisted recommendations about students?
Have you defined disclosure expectations when AI contributes to parent-facing or board-facing materials?
Do you have a formal AI Acceptable Use Policy for educators and administrative staff?
Is AI-related training documented and provided to all relevant staff across the board?
Have staff acknowledged AI usage guidelines in writing?
Is there a defined reporting pathway for AI-related errors, misuse, or parent concerns?
Explore More
Explore more assessments, articles, and training options for education.
© 2026 Beth Andress | Street Safe Self Defence. All rights reserved.
This resource may be shared internally within your school board but may not be reproduced, modified, or distributed externally without written permission.