Speaker Jay Martin stood before the young audience until the gym fell quiet. He warned them about the growing dangers of AI and the ways the technology can harm young students. As the weight of Martin's words settled on their shoulders, Yutan students and staff realized they had underestimated the dangers the technology in their devices brings.

“I’m going to talk about what you should be concerned about if you put these devices in your pocket, how they can actually impact your life, how they manipulate you,” Martin, Nebraska Department of Education School Safety and Security Director, said.
Principal Stefanie Novotny explained that she learned about Martin through the Nebraska Department of Education and invited him to speak at Yutan so that students could better understand how AI technology affects them.
“There are so many dangers, and people are finding ways to exploit children or predators out there; I want to keep kids safe by helping inform them,” Novotny said.
Martin spoke to students in grades 7-12 on Sept. 5 about the dangers of AI and technology, covering topics ranging from the science behind the devices to real-life examples. One of his first points was that people underestimate how addictive devices can be because of how they trigger the brain's reward pathway.
“Our devices, guys, are built to activate our dopamine,” Martin said. “It’s just like a drug.”
Martin then moved into the mental health issues people are facing as a result of technology, issues that have been on the rise in recent years, especially among younger generations.
“In 2012, we saw a huge increase in loss of purpose, loss of sleep, anxiety, depression, mental health issues,” Martin said.
To illustrate the mental health effects, Martin told the story of Kiana. He wants Kiana's message, and her name, remembered because she experienced the life-threatening impact of cyberbullying herself. After relentless cyberbullying from peers, she attempted to take her own life; she survived, but she is now confined to a wheelchair and receives continuous care. In Colorado, Kiana's story changed the state's cyberbullying law, which was renamed after her.
“Two years and two months after trying to take her life, a lack of oxygen led Kiana to need around-the-clock care because of the severe brain injury, communicating only with her eyes. Cyberbullying had sent her into a spiral, and her family had no recourse, no one to hold accountable. That’s where Kiana’s law comes in, taking parts and turning them into something positive. When it was renamed after her, actually, she smiled,” Martin said.
Using Kiana's case, Martin explained that emotional growth requires real-world practice, not just digital interaction. Phone use can weaken a person's ability to manage emotions in a healthy way.
“[Students] don’t know how to deal with things because they didn’t have to practice with their emotions because they were too busy typing it into the phone instead of actually crying and learning to work through it, actually getting mad and learning to get over it,” Martin said.

The last topic Martin discussed was a warning about the dangers of AI creating false information, which can spread quickly and outpace the truth.
“Harvard actually did a study, guys, and the study showed that fake news goes farther, faster than people think to be real news,” Martin said.
As Martin finished his speech, students and staff took away several points from the message. Teacher Alyssa Hansen emphasized the importance of students being able to distinguish true information from false.
“I think kids need to get better at discerning what is true and what isn’t,” Hansen said. “They need to spend time really thinking critically about the information they are getting.”
Junior Leah Thompson took away how artificial intelligence is advancing to mimic humans and their actions.
“I think the speech opened my eyes to artificial intelligence and how it’s starting to become its own like a person and thing and how scary it is that it can do its own thing,” Thompson said.
For Novotny, the speech heightened her existing worries that AI reduces students' ability to problem-solve and that technology can become a crutch.
“AI makes me worried about the future, as to how we will continue to allow students to problem solve and think on their own,” Novotny said. “I worry about how our students will continue to gain knowledge without just searching for an answer.”