Ed tech experts urge caution on ChatGPT’s student data privacy

Dive Brief:

  • School districts should be concerned about ChatGPT’s terms of use when permitting the artificial intelligence tool on school devices, especially when it comes to protecting students’ personally identifiable information, according to Pete Just, founding chair of the Indiana CTO Council, speaking during the Consortium for School Networking (CoSN) conference this month.
  • OpenAI, the AI research lab and company behind ChatGPT, is “very elusive” about its data privacy policy and will share the information it collects with anybody, said panelist Keith Bockwoldt, chief information officer of Hinsdale Township High School District 86 in Illinois.
  • Even if schools block ChatGPT on their networks and devices due to a fear of exposing student data, Bockwoldt said, those students can still use the technology at home.

Dive Insight:

Inherent data privacy issues within ChatGPT, such as the potential for someone to search student information with AI, are scary, Bockwoldt said during a March 21 CoSN panel discussion on the pros and cons of AI in schools.

“We need to teach appropriate use,” he said. “This is a new world for us. We have to navigate it. We have to protect that data as much as we can, and having those conversations, I think, is going to be really important.”

It’s important to explain these privacy issues to staff, he said. While teachers can choose to use this technology, Bockwoldt added, they should be aware that the information they enter could be shared elsewhere by OpenAI.

Ed tech experts have long stressed that the increased use of technology tools and apps in the classroom puts student data at risk. Meanwhile, recent research has found that a majority of ed tech companies use “extensive” tracking technologies and share students’ personal information with third parties.

Despite these student data privacy concerns, panelists expressed cautious optimism about the use of AI tools in schools.

Right now, districts have a short and closing window of opportunity to connect with students and show them how to use this new technology in an ethical and responsible way, said panelist Allison Reid, senior director of digital learning and libraries at the Wake County Public School System in Cary, North Carolina.

“Or we can choose to tell them it’s bad and that we want to block it,” Reid said. “When we do that, that puts us in an adversarial role between our students and their learning.”

Blocking these AI tools can also create inequities, Reid said, especially as some students won’t have access to the technology at home while others will. “We widen that … digital learning gap, because we’re slowing some kids down while we accelerate others.”

Panelist Alison Schlotfeldt, a curriculum integration specialist at the School City of Mishawaka in Indiana, has already found helpful ways to use ChatGPT in her classroom. For instance, Schlotfeldt said she encourages students to plug their essays or assignments into the generative AI tool for immediate feedback so they can improve their work before turning it in.

Schlotfeldt said she has also heard of an engineering teacher who has urged students to problem solve using ChatGPT, allowing the instructor to spend more time with students thanks to this extra support.

“It pushes our students to be more analyzing and not just regurgitating,” Schlotfeldt said. “So we have to teach them how to use these AI tools in order to better themselves.”

This article originally appeared on www.k12dive.com
