AI, Ethics, and "Friends" – A Thanksgiving Morning Reflection.

BY ANDREW DAVIS, HEAD OF SCHOOL

I started my Thanksgiving morning with a cup of coffee, Strudel – my dog – snuggling on the sofa, and seventy pages left in Bruce Holsinger’s Culpability. In no time, the coffee was gone, Strudel was on his back enjoying a belly rub, and I had finished this riveting page-turner about a car crash, artificial intelligence, and ethics. While I probably should have been haunted by the titular theme and by a future of AI-powered cars and war drones, I found myself most disturbed by the “chats” between a supporting character and her AI “friend” woven throughout the book.

This fall, I’ve been teaching an eighth-grade class called Futures, focused largely on artificial intelligence. Recently, students explored ethical challenges, including bias, environmental impact, and deepfakes. Channeling the G.I. Joe adage, “Knowing is half the battle,” I asked students to design laptop stickers that raise awareness about the ethical dilemmas surrounding AI. Their work has been impressive. Here are two examples, each with a student's sticker and explanatory paragraph:

Joy’s sticker on privacy and “data-fication”


“This AI awareness sticker is effective because it immediately shows how AI surveillance constantly monitors our personal data and privacy. The sticker features an eye surrounded by binary code and circuit patterns, which represents a real concern with AI systems: AI collects our data often without us knowing or agreeing to it.… Together, the image and text clearly communicate that AI monitoring can invade people’s privacy, making the sticker a memorable and effective reminder to think about surveillance in AI systems.”

Charlotte’s sticker on AI’s environmental impact

“The AI awareness sticker is effective because it highlights the link between AI chatbots and the amount of water and energy they use. The sticker features a hand holding a phone with the ChatGPT app open next to a scene with an AI server in a desolate, smog-filled landscape with a dry lake. This highlights how AI uses so much more energy and water than traditional search engines. The words THINK BEFORE YOU CHAT urge people to use AI sparingly and to think about the pros and cons of chatting with AI, and the words AI COSTS OUR EARTH reinforce the message that AI is harmful to the environment.”


Holsinger’s Culpability takes on even larger AI ethical quandaries, stirring up deep concerns about the future we will soon inhabit. Ever one to bring it back to my own sphere of influence, I was most unsettled by the “chats” that the protagonist’s daughter—likely the age of an MTS student—has with an AI companion. The bot ingratiates, praises, guilts, and demands. Only in a final moment of human willpower does she cancel her account, ending the relationship. My fear—one widely supported—is that many teens and certainly tweens may begin turning to AI companions as stand-in “friends” for advice, comfort, and emotional validation. And they may do so at precisely the developmental moment when they most need real, sometimes challenging, human relationships to learn who they are and how to live alongside others.

My greatest AI worry, both as a school leader and as a parent, is the loss of critical human experiences, including—perhaps especially—the hard ones. It is certainly our responsibility to teach children the skills they need to thrive in an AI-shaped economy. But it is an even greater responsibility to teach the skills of humanity.

When I drafted the syllabus this past July, I scheduled our class on the dangers of AI companions for the week after Thanksgiving. That timing was, perhaps, prescient. Now, both in Futures and in all of our homes, we should be guiding children away from artificial relationships and toward human ones.

No pressure, Andrew—the Futures lesson on Monday better be good!

Photo of Andrew Davis, Head of School

TGIAM is the blog of Andrew Davis, Head of School. TGIAM = Thank Goodness It's Almost Monday.
