His district offers Cyber Week, an optional week during the summer for teachers to explore innovative teaching practices. Last summer, the theme of Cyber Week was artificial intelligence.
Additionally, the district has monthly low-stakes, hour-long meetings where teachers can explore generative AI without the expectation of immediately integrating it into their classrooms or teaching. “I think the lack of that expectation of outcome… breeds more innovation in our schools,” Guidotti said.
Wider implications of artificial intelligence
Part of the AI experience is about helping teachers improve their teaching.
Although AI tools may seem useful for everyday tasks, educators must bring a critical lens to generative AI tools used for instructional design, according to Mark Watkins, director of the Mississippi Teacher Institute.
Watkins pointed to Harvard’s Teaching AI Project as a resource for educators looking to learn more about the ethical use of AI and practical tools. The Modern Language Association has also collaborated with the Conference on College Composition and Communication to form a task force on writing and artificial intelligence dedicated to developing guidelines and resources.
“When it comes to creating content for distribution to students, we ask teachers to be transparent” about using AI-generated activities or lesson plans, Guidotti said. He added that when a teacher discloses their use of AI to students, it creates an opportunity to have a broader conversation about when it may or may not be appropriate to use AI in an educational setting.
According to Dukes, AI is not particularly good at creating curriculum. Instead, he suggested using artificial intelligence to generate creative word problems and activities that fit the existing curriculum.
“Experimentation (with AI) can be useful and fun, especially if the teacher is intellectually engaged in the process and paying close attention because AI makes a lot of mistakes,” Dukes said.
Dukes also warned of explicit and implicit biases in tools such as AI detection software and AI classifiers, especially when their outputs are used to justify punishment or disciplinary action. “(Teachers’) biases will shape the decisions they make about who to investigate, and that has implications,” Dukes said.
Protecting student privacy and data, avoiding copyright infringement, and disclosing use are major ethical considerations for educators working with AI. For example, “You definitely don’t want to give ChatGPT the names of your students,” Dukes said.
According to Watkins, AI tools that provide feedback to students, such as those built by OpenAI, could prioritize standard white vernacular English, excluding students who speak and write from a different cultural framework. Students may also have “neurological diversity that requires a different level of nuance to be brought into the assessment process,” Watkins continued.
Even with an agreed set of policies and tools, change is inevitable. According to Dukes, the real challenge is that within a few years, once understanding of AI technologies improves, “then we may have a whole new generation of AI capabilities, AI-powered tools.”
Teachers are still hesitant about using AI
For Marcus Luther, a high school English teacher in Oregon, the adoption of AI in classrooms and K-12 teaching has moved too quickly. He does not use AI in lesson or classroom planning, and his current curriculum standards do not require him to teach his students how to use AI. He does not feel confident enough in the rapidly evolving technology to use it outside the parameters of the curriculum in a thoughtful, ethical, and academically sound way.
He said he has held one professional development session on AI tools for teachers, but the approaches he encountered did not make him feel supported in implementing AI in the classroom, given its broader implications.
What he is looking for is a deepening of the learning process, and he is not sure the tools he has seen achieve this; instead, they may offer a “shortcut to proficiency.”