HEAR FROM THE EXPERTS

Torrey Trust

Could you briefly introduce yourself and your current role?

I am a Professor of Learning Technology in the College of Education at the University of Massachusetts Amherst. I have spent more than a decade exploring the relationship between teaching, learning, and technology, with a focus on how technology can be used to create more effective, meaningful learning experiences for teachers and students alike. However, I am not a techno-hype person. My goal is to help teachers and students make informed decisions about the role technology should play in their educational experiences.

What AI-related initiatives or programs have you been involved with?

For teaching, I designed and teach a First Year Seminar called AI for College Success. The first part of the course helps students learn what GenAI is, how it works, and the ethical implications of using these technologies. The second part focuses on six dimensions of wellness (academic, physical, mental/emotional, financial, career, social), and students investigate when GenAI tools might (and might NOT) be helpful in developing each area of well-being throughout their college career. The goal of the course is to provide students with resources, information, social connections, tools, and ideas to successfully navigate college.

For research, I have been studying AI-generated lesson plans. I keep seeing emails from tech companies that promise to save teachers hours each week if they just use AI tools to do all their teaching work (write lesson plans, design assessments) for them. And that worries me! Especially knowing how these tools are designed (LLMs are basically prediction machines that guess which words go together to make the most plausible human-sounding response; they don't know what good teaching is; they don't understand students or context). Not surprisingly, my recent study found that, in AI-generated lesson plans for Massachusetts' 8th Grade Civics and Government curriculum standards, most learning experiences sat at the bottom of Bloom's Taxonomy (asking students to remember/repeat back information, or to summarize it). Ultimately, AI-generated lesson plans for Civics reproduced traditional teacher-centric practices that put the teacher in control of teaching, with students following along with what the teacher says. These lessons don't prepare students to be active, engaged, or informed citizens, which we NEED today!

Additionally, I was recently selected to be a Public Interest Technology fellow at my university (https://www.umass.edu/news/article/public-interest-technology-initiative-announces-2025-26-pit-faculty-fellows). As part of this fellowship, I will continue exploring AI-generated teaching and learning materials, with a focus on expanding to different standards, like the AP African American Studies standards.

How do you address AI in your own courses?

I actively engage my students in using GenAI in my classes. It fits, for one thing, since my classes are all about using tech for teaching and learning. But I also find that, even now, many teachers and some students haven't tried GenAI tools at all (or, if they have, they haven't tried more than one tool). You can't make an informed decision about a technology without trying it out first. So, we use GenAI in my classes: we analyze these tools, we investigate the ethical issues surrounding them, and we explore whether and how they might be used to enrich and enhance teaching and learning. Here's my AI syllabus policy as an example of my thoughts on when GenAI use is allowed or not: https://docs.google.com/document/d/1caSLk2JM40K4tdQHlLRwftYVGM6k8z0ZA2J12SwLhtU/edit?tab=t.0#heading=h.i7bagmx726nl

What opportunities, if any, do you see for AI to have a genuinely positive impact on education?

For one thing, I hope it encourages educators to reflect on their practice and consider why they ask students to do what they ask them to do. Why have students write a paper? Just to grade it? Or is it to help students clarify their thinking? Make their thinking visible? Build their writing skills for a future career? 

Then, they might change any assignments that GenAI can easily complete, shift the way they grade, and/or be more transparent and actually tell students why they are being asked to do what they are being asked to do.

For another thing, I think GenAI tools have the potential to support individualized learning. Teachers can design prompts that create interesting, meaningful, and relevant learning experiences for every student. For example, in Gemini's Canvas mode: "Design an interactive learning tool that explores [insert topic] in a way that connects to a student's personal interest in [insert favorite hobby]. The tool should: show how the topic relates to the hobby in fun or meaningful ways; include at least one interactive element (e.g., game, simulation, quiz, or design activity); be age-appropriate and engaging for students; and encourage the student to apply what they learn to both the topic and the hobby." I had a student interested in classic rock who was learning about rhetorical analysis (I'm not sure I could have figured out how those two go together), and Gemini created a fascinating tool to explore that topic more deeply through the student's interest. Check out Ethan & Lilach Mollick's Assigning AI: Seven Approaches for Students, with Prompts for examples of how to write simulation prompts and other prompts, or their prompt library (https://www.moreusefulthings.com/prompts) to find prompts you can remix/revise to create more individualized learning experiences.

What do you see as the biggest threats to education, if any, from AI or how it’s used?

It seems that Big Tech is making all the decisions regarding GenAI tools in education without any educators present. They launched these tools into the world without any warning and, in many cases, without any real idea of what they could do. Educators have been scrambling for more than 2.5 years to figure out how to teach and how to support learning in the age of AI. All of this is happening while these tools constantly release new models, updates, and features, and while many more tools, like Canva or Padlet or Grammarly, are embedding AI. It's exhausting to keep up with! I think the biggest threat to education is that Big Tech will continue to push GenAI into education as a time saver, tutor, and maybe even teacher, when it really ISN'T!

What advice would you give faculty who feel overwhelmed by AI or unsure how to address it in their classroom?

Talk with your students! Please! Students just need an open, safe space to have a conversation with teachers about GenAI use. At the very least, this could get everyone on the same page about what is and what is not allowed. It could also strengthen the relationship between the instructor and student so students feel more comfortable asking the instructor about GenAI use or sharing their GenAI use, when allowed.

What do you wish more people knew? Are there any misconceptions about AI or education that you’d like to correct?

GenAI tools are NOT intelligent. They are designed to mimic human intelligence, so they might seem intelligent. They are not. They are not trustworthy. They don't think. They don't understand context. They are literally guessing machines. As such, they are great for things like brainstorming or generating ideas, but less so for providing accurate information.
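The "guessing machine" point can be made concrete with a deliberately tiny sketch (this is an editorial illustration, not the interviewee's example, and the toy corpus and function names are invented for demonstration). A simple bigram model just counts which word most often follows another and picks the likeliest continuation: plausible-sounding, but with no notion of truth, context, or understanding. Real LLMs use neural networks trained on vast corpora, but the underlying task is the same kind of next-word prediction:

```python
from collections import Counter, defaultdict

# Toy corpus (invented for illustration). A bigram "guessing machine"
# learns only which word tends to follow which.
corpus = (
    "good teaching centers students . "
    "good teaching builds understanding . "
    "good teaching centers curiosity ."
).split()

# Count how often each word follows each preceding word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def guess_next(word):
    """Return the most frequent continuation seen in the toy corpus,
    or None if the word was never observed."""
    counts = follows[word]
    return counts.most_common(1)[0][0] if counts else None

# The model "guesses" the statistically likeliest word -- plausible,
# not verified: "teaching" is followed by "centers" twice, "builds" once.
print(guess_next("teaching"))  # prints "centers"
```

The model has no idea what "teaching" means; it simply reproduces the most common pattern in its data, which is why fluent output can still be wrong.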

What policies, supports, or resources do you think higher ed needs to implement to address AI well?

I believe that every higher ed faculty member needs an AI policy statement in their syllabus. Every institution has its own academic integrity or honesty policy, but these are often broad enough to allow for academic freedom (a.k.a. instructor flexibility in decision making). This also means that if each instructor has their own opinion about what is and what is not allowed when it comes to using GenAI tools (and if students have 5-6 classes), students are left trying to figure out what each instructor allows; if they don't know, they have to guess, and if they guess wrong, the consequences can be dire, like failing an assignment or losing a scholarship. The policy shouldn't just cover popular GenAI LLMs like ChatGPT, either; it also needs to consider AI media generators and AI-embedded tools like Zoom, Grammarly, Canva, etc.

How can readers find and follow your work?

Here’s my website: https://www.torreytrust.com/

Or follow me on LinkedIn: https://www.linkedin.com/in/torreytrust/