Getting Started: How to Develop AI Guidelines for Schools and Districts
Artificial intelligence is shaping nearly every aspect of our lives, from the entertainment we consume and the way we book doctor’s appointments to the directions we follow to our next destination and the thermostats that regulate the temperature in our homes.
Yet, it’s still very much “uncharted terrain” in schools and districts, both in terms of how students, teachers, and leaders use the technology and the guidelines that govern that usage.
If your state or district hasn’t provided specific guidance around using AI, you might be feeling uncertain about the next step to take in developing your own policies. AI innovation and adoption have grown rapidly over the last year. How do you create policies for technology that’s changing so quickly?
Rather than focusing on hard-and-fast, complex policies, many schools are opting to create frameworks and guardrails that are flexible and can change as more information becomes available. All the same, you’ll need a place to begin—and that means asking yourself some initial questions about your school community’s AI usage and goals. Below are four questions to help you get started:
What guidance on developing AI policies or guardrails currently exists?
A good place to start is to find what guidance already exists. Several education organizations have stepped up to fill the AI guidance void, including the International Society for Technology in Education (ISTE), which has created an “Artificial Intelligence Explorations for Educators” self-paced online course, and the Consortium for School Networking (CoSN), which has issued a set of guidelines for using AI in the classroom.
CoSN also recently partnered with the Council of Great City Schools to release the K-12 AI Readiness Checklist, which contains 93 questions for schools and districts to consider when using AI as it relates to data, technical, operational, security, and legal/risk readiness.
In addition, Teach AI, an initiative launched by over 20 partners including Code.org, Digital Promise, the European Tech Alliance, and others, has developed an AI Guidance Toolkit that gives samples and suggestions for not only developing policies and guidelines, but also for communicating them through teacher and staff letters, parent letters, and student agreements.
What is my school or district’s level of “AI literacy”?
If you’re not sure, here’s a framework to get you started. Digital Promise defines AI literacy as “the knowledge and skills that enable humans to critically understand, use, and evaluate AI systems and tools to safely and ethically participate in an increasingly digital world.” The organization uses the verbs in their definition as part of their three-part “AI Literacy Framework”: Understand, Use, and Evaluate. We’ve summarized their framework below:
- Understanding AI: AI users need to have a technical understanding of how AI uses data to develop associations and automate predictions. The importance of this step is also echoed by Massachusetts Institute of Technology professor Cynthia Breazeal, who says that “we need to demystify how these systems work…people talk about these things like a conscious ether that surrounds us. We need to understand that we’re really talking about something people actually make and control and engineer.”
- Using AI: According to Digital Promise, there are three ways humans typically engage with AI in an educational context. We interact with systems that collect data to provide automated decisions, questions, and suggestions (e.g., using Netflix or Amazon). We create by leveraging those systems to develop content (such as AI-generated photos and written content). Or, we apply AI by developing specific systems to process and predict information (by developing our own AI programs).
- Evaluating AI: Digital Promise says this is the most critical element of AI literacy—taking a more active approach around being aware of the data a particular AI algorithm is using and how it’s being applied and shared.
With your teachers, staff, and leaders, audit your own understanding of AI through these three lenses. Your answers will give you a good place to start in understanding the professional development your school or district will need around the specific facets of AI in addition to gathering the information that will help you establish any guidelines.
{{blue-form}}
Are we thinking about our AI usage in a holistic way?
Because there’s so much to learn about AI, we might find we’re thinking about the technology in terms of particular types of usage, like using ChatGPT to help us write an email or brainstorm lesson plans. We might be narrowing the topic of AI without realizing it.
To truly set our guardrails and create eventual AI policies or guidelines, there’s a need to think more broadly—and that begins with viewing AI through the context of what you’re already focusing on as part of your school vision.
For example, when this school district created their AI guidebook, they did so around an acronym: VIEW, short for Value of Understanding, Implementation Safety, Educational Learning, and World Preparation. Examining AI through these four lenses enabled the district to ensure the guidebook aligned with its goals, mission, and vision:
- Value of Understanding: How does valuing and understanding the role of AI impact our students, our communities, and our schools?
- Implementation Safety: How can AI be safely and effectively used to support the mission of the district?
- Educational Learning: What tools are available, and how can they be utilized to aid teaching and learning for student success?
- World Preparation: How is AI necessary to prepare our students for college and career?
As you think about the role of AI in your community, consider your school or district’s specific values and priorities and how AI might enhance, advance, or limit your objectives.
Do we have a cross-functional team that can lead this work?
Developing AI guidelines is something that can’t be done in a vacuum, and it’s not just the responsibility of a single school or district leader either. Here’s where distributed leadership can be a big help.
AI technology is evolving faster than formal training and guidance for teachers, staff, and students can be developed, so it will be critical to form a cross-functional team that will be able to think through not only how to train a school or district’s workforce to responsibly use AI, but how to prepare for the fundamental shifts in teachers’ roles and student opportunities in the years ahead.
Think of your school or district’s AI exploration team as similar to a representative instructional leadership team (ILT). Typically, ILTs include the school principal, assistant principal, grade-level team leaders, content-area department heads, and school counselors. An AI team should have many of those same voices, with a few additions: early adopters among your teachers and staff, representatives from your IT department, parents, and your district’s legal team are all stakeholders who will be able to spot opportunities and repercussions you may not be considering. In addition, consider partnering with education organizations that have AI expertise and can weigh in on best practices.
Also, don’t forget to include students in this cross-functional group. Not only are students the end-users of this tech in many circumstances, but they’ll need AI skills in many future career roles, so their perspective is critical.
{{blue-form}}
Focus on developing “living, breathing documents”
Policies are something we’re used to developing as school and district leaders, but the advice when it comes to AI educational policy is to resist the urge to be too heavy-handed and instead focus on baseline parameters.
“Policy is always behind technology,” says Kristina Ishmael, the former deputy director of the U.S. Department of Education’s Office of Educational Technology and now a strategic advisor at Ishmael Consulting. “In some cases, that’s very intentional, because it’s policy; once you put it in, it’s hard to take it off.” She suggests that all AI policies and guidance need to be “living, breathing documents, because the technology is changing so quickly.”
The sooner there are guardrails in place within your school or district community, the sooner you and your teachers, staff, and students can begin exploring AI in an ethical, safe way.