Study: Teachers prefer AI for specific problem-solving, not just efficiency
When it comes to integrating artificial intelligence into K-12 education, teachers may benefit from hands-on learning as much as their students. A recent study finds that educators aren't looking for efficiency for its own sake; they need a clear vision of how AI tools can solve specific classroom challenges. This insight comes from an ongoing analysis of educator-designed AI pilots in California.
Researchers from the Center on Reinventing Public Education (CRPE) at Arizona State University tracked over 80 teachers and administrators across 18 California schools—a mix of district, charter, and private campuses. These educators participated in the Silicon Schools Fund’s “Exploratory AI” program during the 2024-25 school year, where they received six training sessions to understand generative AI, identify potential applications, and then build and test their own tools. Their creations ranged from systems designed to differentiate lessons for students of varying academic levels to applications that fostered teacher collaboration and improved student behavior.
David Whitlock, who led a development team as a vice principal at Gilroy Prep charter school in Gilroy, California, described the experience as “really freeing to just play around with AI and explore use cases.” He highlighted a key benefit: “We can now adapt our tech to meet students and staff where they’re at versus them having to adapt to a new platform.”
The CRPE study found that even with relatively limited training, teachers quickly became proficient in building and customizing AI tools. However, whether those tools were truly integrated into instructional practice hinged on whether they addressed a specific problem, rather than merely offering generalized efficiency. Chelsea Waite, a senior researcher at CRPE, emphasized this distinction: “AI could be a core accelerator, fueling the teachers’ capacity to deliver on an instructional goal, but in other places it was more like a paint job. In absence of a clear vision, it ended up seeming like an interesting tool but not much else.” This analysis arrives as many educators nationwide report feeling unprepared to use AI effectively in their classrooms.
One compelling example of problem-solving AI emerged from Summit Tamalpais High School, a charter school in Richmond, California. Jackie Wilson, the executive director who participated in the pilot, noted the common concern among staff that AI might diminish human interaction. To counter this, her team developed a chatbot designed to encourage greater human engagement. This bot helps teachers use an Enneagram personality assessment to plan collaborations and has since become a staple in the school’s professional development meetings and even parent-teacher conferences, prompting users to improve communication, resolve conflict, and enhance team dynamics.
Similarly, the team at Gilroy Prep, part of the Navigator Schools charter network, tackled a widespread issue: the challenge of implementing restorative justice practices effectively while managing time constraints and communication with parents and administrators. Whitlock, now Navigator Schools’ technology innovation director, and his colleagues created an app that generates restorative activities based on a discipline incident’s description, severity, student grade and reading level, desired behavioral goals (like empathy or responsibility), and available time.
Ally Funk, a former sixth-grade STEM teacher at Gilroy Prep and a member of the development team, used the app last year after students misbehaved during a field trip. The app swiftly generated a relevant reading passage with reflection questions, along with a draft letter to parents to reinforce the lesson at home. “Once I hit start, it comes up with a reading passage and questions to go with it, and then a whole message that I can kind of proofread and send to parents,” Funk explained. “That way, I’m not having to overthink my workload over students that just didn’t want to participate in a fun field trip.”
Funk, now an assistant principal at Gilroy Prep, noted that fine-tuning the tool took weeks of trial and error. While school policies could be uploaded, privacy concerns prevented the input of personal student data, limiting the app’s ability to detect patterns. “A chatbot is only as knowledgeable as what you teach it, and so you have to keep either feeding it information or practicing the outcome you want,” Funk cautioned.
Despite these limitations, the restorative practice generator is regularly used by Gilroy Prep teachers and is expanding across the Navigator Schools network. However, Funk stressed that the app’s effectiveness depends on strong pre-existing student-teacher trust. “I still think there obviously needs to be human interaction,” she affirmed. “This restorative assignment generator just gives a piece of paper with questions based on their behavior. You have to have the relationships to build it on. So if you haven’t built [student-teacher] relationships, that should be priority No. 1.” The study ultimately underscores that AI serves best as an accelerator within a robust educational framework, not as a replacement for fundamental human connections and clear instructional goals.