As the use of artificial intelligence-assisted technology increases in K-12 instruction and learning, many educators and education businesses see opportunity in the tools, including enhanced instruction that can be personalized for individual students and efficiencies in student- or teacher-led research.
Others, however, worry about cheating, false accusations of cheating, and the overuse or inappropriate use of AI systems, which analyze large amounts of data to make predictions and perform tasks.
“We’ve done this over the decades because technologies, when they’re first introduced, we either say that they are going to be detrimental or they’re going to be lifesaving,” said Shelley Pasnik, senior advisor to the Center for Children and Technology, a nonprofit that researches technology’s influences on teaching and learning.
In fact, the use of technology in classrooms, including AI, can be much more complex because humans are actually guiding the application of the technology, said Pasnik, who is also senior vice president at Education Development Center, a nonprofit that designs, implements, and evaluates programs to improve education. The Center for Children and Technology is affiliated with the Education Development Center.
AI in education has been used for tutoring support, language translation, checking for plagiarism, verifying student absences, teacher coaching, the administration and scoring of assessments, and more.
With teachers and students in the driver’s seat of AI use in classrooms, Pasnik suggests five ways they can maintain trusting relationships as AI platforms grow.
Discussing what’s known and unknown
Conversations about the tech tools in use or under consideration can help teachers and students better understand shared goals for their application, the guardrails needed against inappropriate uses, and any apprehensions, anxieties, or excitement about using the technology.
“Ask open-ended questions and find out what students know, what they’re thinking, what teachers know, what they may be thinking,” Pasnik said.
These conversations can also reveal what teachers and students understand well and what they need to learn more about regarding classroom use of artificial intelligence. This understanding of the known and unknown can be a helpful step as policy is written regarding AI-assisted instruction and learning, Pasnik said.
Having a shared set of expectations
As policies are drafted around uses of AI in classrooms, schools will need to consider governance and expectations. This may mean adding AI-assisted activities to student codes of conduct or to classroom-level teacher expectations for students, Pasnik said.
She added that some teachers are very clear that if students generate answers to assignments through a large language model that uses algorithms to develop text, they will fail.
Pasnik added that expectations should also be paired with consequences for when trust is broken.
Allowing for teacher collaborations
Teachers should be given time to consult with each other about their experiences with AI in the classroom, including how AI may be changing their lesson plan development or how it’s influencing pedagogy.
Additionally, schools should reach out to parents to ask if they have questions, worries or suggestions.
“So often, parents and teachers alike are confronting an environment and a set of conditions that is different, and perhaps even radically different, from their own lived experience,” Pasnik said, because today’s classrooms don’t mirror their own educational experiences.
Understanding reasons for overuse or improper use
Overuse or improper use of AI by teachers and students should be explored with the goal of better understanding why these actions are taking place. A teacher who is prone to a surveillance mentality may see AI as helpful in preventing and catching cheating.
Likewise, a student who is struggling academically may be more inclined to rely on AI assistance.
For those reasons, it’s important for educators to look at AI in a larger context of instruction and learning and in relation to other supports available to teachers and students, Pasnik said. Teachers and students together can also explore the benefits and limitations of AI assistance in learning by comparing AI-generated and student-generated work and discussing the differences.
The moral panic about AI & cheating is the same as the panic over proctoring & cheating is the same as the panic that led to plagiarism detection software.
People, the vast majority of students DO NOT want to cheat. They want to learn. The lack of trust undermines education.
— Joshua Eyler (@joshua_r_eyler) January 3, 2023
Addressing biases
Whether perceived biases are coming from technology or humans, it’s important that students and teachers feel seen and heard, Pasnik said. That means making sure all students feel welcomed and included and that their social-emotional needs are being addressed.
“Algorithms are not accurate reflections of the full diversity of humanity,” Pasnik said. “How are students and teachers thinking about their own biases and also the biases of these new tools?”