AI Governance for School Boards

Digital Risk & AI Awareness for School Boards

Practical governance frameworks for school boards navigating AI adoption — protecting student data, parent trust, and educational integrity.

The Reality

Your Educators Are Already Using AI

Teachers and administrative staff across your school board are already experimenting with AI tools like ChatGPT, Copilot, and other automation systems — often without clear policies. Every day, they are using AI to draft report card comments, create lesson plans, summarize student progress, generate assessment rubrics, and communicate with parents.

School boards have a duty of care that extends beyond academics. When AI touches student data, the stakes are fundamentally different.
The question is not whether AI is being used in your schools. The question is whether it is governed.

Key Risk Areas

The 5 Biggest AI Risks School Boards Face

Student Privacy Violations

Teachers might paste student names, grades, IEP accommodations, behavioural notes, or health information into ChatGPT to draft report cards or parent communications — potentially violating education privacy legislation.

AI Hallucinations in Education

AI can generate inaccurate lesson content, fabricate citations in curriculum materials, or produce misleading student assessments. In education, accuracy directly impacts student outcomes.

Data Security Risks

Many AI tools retain prompts or use them to train future models. School boards must protect student records, special education documentation, staff evaluations, and internal communications from that exposure.

Parent & Community Trust

Parents entrust schools with their children's information. If families learn AI is processing student data without oversight, it can quickly erode trust and generate public concern.

Educator Overreliance

Teachers may begin relying on AI for drafting report cards, creating assessments, or summarizing student progress. Without training, this can lead to errors that affect student records and parent relationships.

Framework

What School Board AI Governance Includes

Acceptable Use

Clear guidelines on what educators and staff can and cannot use AI for — from report cards to lesson planning to parent communications.

Student Data Protection

Explicit rules on what student information must never be entered into AI tools — including names, grades, IEPs, behavioural notes, and health records.

Human Oversight

AI outputs must always be reviewed by educators before use in any student-facing, parent-facing, or official capacity.

Transparency & Disclosure

Guidelines for when AI-assisted content must be disclosed to parents, students, and the broader school community.

Educator Training

Staff need to understand how AI works, where it fails, how to verify outputs, and why student data requires special protection.

Regulatory Context

The Regulatory Landscape for Education

School boards operate under some of the strictest privacy obligations in the public sector. Student data is among the most sensitive categories of personal information, and the regulatory framework reflects that.

FIPPA

Freedom of Information and Protection of Privacy Act

Governs how school boards collect, use, and disclose personal information. AI tools that process student data must comply with FIPPA requirements.

PIPEDA

Personal Information Protection and Electronic Documents Act

Federal private-sector privacy legislation that can apply to the commercial vendors behind AI tools — particularly when student data moves across provincial or national borders, as it does with many cloud-based AI services.

Provincial Education Acts

Education-specific privacy provisions

Most provinces have education-specific legislation that imposes additional obligations on how student information is handled, stored, and shared.

Proposed AI Legislation

Artificial Intelligence and Data Act (AIDA)

Canada's proposed AI legislation would introduce new compliance requirements if enacted. School boards that establish governance frameworks now will be better positioned when regulations take effect.

Interactive Assessment

School Board AI Governance Diagnostic

Answer 12 questions to assess your school board's current AI governance posture. This is not a pass/fail test — it's a starting point for structured improvement.


Student Privacy & Data Protection

Do you have a written policy outlining what student information may and may not be entered into AI tools by staff or educators?

Are teachers and staff explicitly prohibited from entering student names, grades, IEP details, behavioural notes, or health information into public AI systems?

Have you reviewed the data storage and retention practices of AI tools used by educators and administrative staff?

Are your AI usage practices compliant with FIPPA, PIPEDA, and provincial education privacy legislation?

Output & Representation Risk

Is AI-generated content (report card comments, parent communications, curriculum materials) reviewed by a human prior to distribution?

Are educators trained to verify AI-generated lesson plans, assessment rubrics, or student summaries before use?

Is there a documented process requiring human validation of AI-assisted recommendations about students?

Have you defined disclosure expectations when AI contributes to parent-facing or board-facing materials?

Governance & Oversight

Do you have a formal AI Acceptable Use Policy for educators and administrative staff?

Is AI-related training documented and provided to all relevant staff across the board?

Have staff acknowledged AI usage guidelines in writing?

Is there a defined reporting pathway for AI-related errors, misuse, or parent concerns?

© 2026 Beth Andress | Street Safe Self Defence. All rights reserved.
This resource may be shared internally within your school board but may not be reproduced, modified, or distributed externally without written permission.