Teaching & Learning: Embracing Emerging Tech in the Classroom

Summer 2024

By Ben Farrell

This article appeared as "Uncharted Waters" in the Summer 2024 issue of Independent School.     

In the fall of 2022, something big, new, and important was happening all around us. I remember a certain buzz around campus—one associated with teenagers knowing something adults don’t. That isn’t totally unusual. The older I get, the more I find myself scouring the internet to decode pop culture references or to understand a new turn of phrase I overheard in the hall. New vocabularies invented overnight spread like wildfire across social platforms. All this new lingo, as any parent can attest, can be kind of overwhelming.

But October 2022 was different. Words like “generative AI” and “ChatGPT” were whispered across the school, quickly followed by the sometimes furtive glances of students wondering, “Do they know yet? Are the adults on to us?” Some, more than likely, might even have thought they had finally found the cheat code to an easy A. 

For almost all educators, these conversations around a disruptive and emerging technology were fraught. Most of my early exchanges with colleagues were shaped by what we read in the media, most of which skewed negative and was framed around our ability to “detect” the use of generative AI. Without any concrete research to go on, generative AI felt like the final nail in the coffin of creativity and originality. At the risk of sounding like an early ’90s movie, if assignments could be so easily “hacked” by a new tool, did that make our assignments obsolete? Did it make us obsolete?

The quick answer is an emphatic no! The longer answer is a bit more nuanced. At the New England Innovation Academy (NEIA), where I am assistant head of school and director of the upper school, we took a different approach, one that leans into the human-centered design (HCD) process. HCD embraces humanity and teaches students to solve real-world problems by designing with the user in mind. This approach is woven into our mission and all of the work we do in educating the next generation of innovators and entrepreneurs. We’ve been here before, building an integrated curriculum from the ground up and creating a multidisciplinary, cutting-edge design studio. We’re not afraid to question, and radically rethink, centuries-old structures. 

To help haul myself into the light of this new day, I invited my administrators to join me in meeting with upper school students. We asked, “What do we do? What do we do with this new and revolutionary tool?” And we listened to their wisdom. 

Human-Centered Design Process

The first step in the human-centered design process is to understand your stakeholders—the users of this emerging technology—on a human level. We got a whole spectrum of responses. “AI is the death of original thought,” said one 11th grader. A 10th grader quietly added, “I don’t want it to write the papers for me, but it could help me with my research.” Another 11th grader bookended the conversation with a very hopeful interpretation of what generative AI could do for us: “It’ll teach us to be more human.”

During the past year, these three sentiments have helped guide our school, not just in how we approach generative AI but in how we approach all emergent technologies. AI can’t replace original thinking, that much is clear. But what if we approach it like a helpful research tool, a new resource that opens up possibilities we can’t even conceive? Could it even teach us something about the human condition? 

We work hard to focus on the “Innovation” in our school’s name. Sometimes, being innovative means we don’t have all the answers. As scary as that might be, we have to be brave enough to acknowledge that reality. The authenticity that comes from saying this out loud, in front of a group of teenagers no less, can lead to wonderful truths. As another prescient 10th grader pointed out, with the pace of change regarding generative AI, the second we formalized a policy in writing and included it in our handbook updates over the summer, it would already be out of date. 

So what do we do? At NEIA, we are choosing to dive in, knowing that at times we might feel out over our skis. If we act with transparency and openness in setting up the guard rails for how we use emerging technology, the sky’s the limit. We must teach our students to act with empathy and a strong moral compass as we jump into generative AI and other emerging technologies. 

Taking Action

From that first upper school meeting, the biggest question was always, “How are the students using this technology?” After engaging our students, our team got a wide array of responses. Students are using AI for rapid design concept exploration, grammar editing, research assistance, and ideation support. However, our teachers emphasize the importance of analyzing AI-generated results and maintaining a focus on fundamental learning, particularly in writing. Students are asked to cite and acknowledge all resources used, whether they’re quoting AI or creating an image.

In May 2023, we ran a forward-thinking forum on generative AI and collective intelligence, where we invited other local schools and the community to hear from cutting-edge experts. We centered students to give them voice and agency in this evolving space. Kathleen Kennedy, executive director of the MIT Center for Collective Intelligence (a multidisciplinary research center), delivered a keynote address. After that event, students resoundingly said, “Let us use generative AI in school!”

Faculty members are also encouraged by the community (students, parents, and faculty) to think about and explore the boundaries of AI in the name of optimizing their role as educators. They begin with their own work (so that their voice and thinking are present) and then use AI to enhance it: assessing the tone and clarity of their communications, generating images to create personas, and generating additional problem sets and examples.

In execution, this can lead to a lot of exciting developments. NEIA’s director of the Library & Learning Commons has students enter their own writing into ChatGPT to check for bias, which offers an interesting lens on the concern that generative AI is itself biased. As our Innovation Studio creates real-world solutions, ideas, and products, our industrial designer in residence is helping his classes use image and concept generation in Stable Diffusion XL to more quickly realize ideas for clients to respond to. Our head of teaching and learning is encouraging students to use AI as a partner in writing: AI can help ideate on topics, surface research connections, edit for basic grammar, and flag bad writing habits. But we also stay vigilant to ensure that students originate their own work. AI will not replace student work but enhance it.

What’s Next?

AI is just one of the many technologies that have been introduced to (and ultimately improved) our courses. There are other emerging technologies that other educators may be quick to dismiss but that our faculty use to explore and enhance their lessons.

For example, our integrated curriculum allows the Innovation Studio and Entrepreneurship programs to help solve real-world issues. Recently, we focused on a large but unprogrammed space in our newly built local library. When we found out that the library was considering using it as a maker space, we reached out to the team and jump-started an exciting collaboration: to have our students design a layout for them to consider.

Our students used the human-centered design process, and after multiple rounds of ideation, creation, and feedback, library staff came to campus for a tour of what their maker space could be—in virtual reality. The library’s administrative staff remarked how much more compelling the designs were because they were able to experience them as if they were walking through the finished space. Students learned valuable skills about designing for an actual client, presenting, and community involvement. 

Everyone is using AI differently. And that’s what’s so exciting about a new tool. We don’t know its limitations, and it’s only as powerful as the user wielding it. We believe, as a new innovation-minded institution, that we must embrace this tool and develop digital citizenship by providing students with instruction on AI usage, examining writing for bias, and facilitating discussions on copyright, fair use, and authenticity. If we help students harness AI to better optimize and collaborate while ensuring they retain their originality and scrutiny of sources, we can set them up for greater success in both college and the workplace. 

We understand that generative AI and other emerging technologies are at once exciting, messy, and a bit overwhelming. We also understand that they aren’t going away. It’s our work to help prepare our students for a future that is constantly shifting. We believe that, like generations of educators before us, we must teach our students, our children, to act and think with empathy, to consider their moral compass as the future unfolds in ways we haven’t dreamed of yet, and to always remember that technology, if used appropriately, can help us shape a more hopeful and optimistic future.



Ben Farrell

Ben Farrell is the assistant head of school and director of the upper school at New England Innovation Academy in Marlborough, Massachusetts.