Should kids use artificial intelligence? Parent reactions are mixed




Jared Miller, centre, sits with his children, Elyse, left, and Caleb, right, as they explore AI programs together on a tablet and phone at their home in Niagara Falls, Ont., on Friday. Mr. Miller believes supervised AI use can spark creativity and learning. ALEX FRANKLIN/The Globe and Mail

At bedtime, seven-year-old Elyse Miller quizzes herself on her favourite subject: animals. Her trivia partner isn’t a parent or sibling, but ChatGPT, speaking through her father’s phone.

“Which animal is known as the king of the jungle?” the chatbot asks.

“Lion,” Elyse says, without missing a beat.

“Okay, now time for a harder one. What’s the biggest animal in the world? Hint: It’s a gentle giant that lives in the ocean.”


Elyse often chats with ChatGPT about animals, while her younger brother Caleb draws on an AI-generated colouring page. ALEX FRANKLIN/The Globe and Mail

Elyse pauses, thinking. “Whale shark.”

“Good guess,” ChatGPT replies. “But the biggest one of all is actually the blue whale.”

In the Miller home in Niagara Falls, ChatGPT has also been an artistic savant, whipping up colouring pages based on any wild prompt from Elyse or her four-year-old brother, Caleb. It’s also a storyteller, crafting fantastical choose-your-own-adventure style bedtime tales based on the children’s interests.

“It now has memory baked in, so if it hears Elyse’s voice, it remembers certain things and reintroduces them, which is horrifying when you think about it too much,” says Mr. Miller. “But from my daughter’s perspective, she really enjoys that it seems to know her.”

Mr. Miller believes AI will fundamentally change the way kids learn, so he’s eager to let his children use it under parental supervision.

But that’s becoming harder as generative AI creeps into more apps his kids use on their own, such as the language-learning app Duolingo.

“The thing that scares me is that AI is going to evolve at such a pace that parents won’t be able to continue to be that firewall and say, ‘this is how we’re using this,’” says Mr. Miller. “Now, this is how it’s using us.”


Mr. Miller believes AI will play a key role in how children learn and express creativity. ALEX FRANKLIN/The Globe and Mail

Artificial intelligence is increasingly being built into everyday apps, and major tech companies are now tailoring versions of their AI tools specifically for children. Earlier this year, Google launched a version of its Gemini chatbot for kids under 13, while Meta introduced “digital companions” on Instagram, Facebook and WhatsApp. Elon Musk has mused about building “Baby Grok,” a version of his chatbot for kids two to 21.

While some tech-savvy parents are embracing AI as a patient helper that can assist with tricky math equations and promote creativity, others worry it will encourage cheating and harm kids’ mental health.

Daniel Browne has introduced his kids to AI as a helpful tool at home. His 11-year-old son uses ChatGPT like a search engine, asking it about Easter eggs in Minecraft or the YouTuber Technoblade. With his six-year-old daughter, Mr. Browne uses ChatGPT to suggest words for spelling quizzes.

“I feel strongly that if you’re not teaching your kids about technology, then when they grow up, they’re not going to have the antibodies to be able to resist it,” says Mr. Browne, who works at the Schwartz Reisman Institute, a technology research lab at the University of Toronto.

“The younger that you can impart critical learning skills and an understanding of how things actually work, the more likelihood that they will understand its influence and they will be able to maintain their own agency.”


Leeanne Morrow, a librarian at the University of Calgary whose children are 12 and 16, says it’s up to parents to learn how to use AI tools so they can then teach their kids. Her older child uses it to help with homework, including brainstorming ideas, checking math solutions or copyediting assignments.

“When your kid signed up for Snapchat, what do we all do? We all learned Snapchat. It should be the same thing,” says Ms. Morrow.

Other parents, however, do not allow any AI tools unless it’s a school requirement, blocking ChatGPT and Gemini on their kids’ phones, as well as apps that provide AI companions, such as Character.AI, PolyBuzz and Replika. These apps allow kids to create customized chatbots based on characters from pop culture or with animated avatars.

As these apps become more popular, however, there’s growing concern teenagers are forming unhealthy relationships with chatbots. Last year, a 14-year-old Florida boy died by suicide after allegedly becoming obsessed with a chatbot made by Character.AI modelled on a Game of Thrones character.

Since then, around a dozen parents who have lost children to suicide have come forward with similar stories and filed lawsuits in the United States against the major AI companies, alleging the chatbots contributed to their children’s deaths. Earlier this month, OpenAI rolled out new parental controls, including an alert system to notify a parent if ChatGPT detects signs of potential self-harm.


Jennifer Estrela, a psychotherapist in Toronto who works primarily with teens, doesn’t let her three kids, aged 9, 11 and 16, use social media, and if they need to use AI for school, it must be discussed first.

She recalls a constructive conversation after her youngest daughter used ChatGPT to come up with a script for a play, explaining that she could have used her own creativity to write the script instead of relying on AI. “She said, ‘Well, I didn’t think about it like that, but you’re right.’”

In her own practice, she’s seen the impacts AI can have on young people’s mental health.

“Kids are relying on AI for friendships, for intimacy, for connections – and it’s all false,” says Ms. Estrela. “If something happens, at the end of the day, there’s nobody there to tell them that they’re worth something or they matter.”

Ms. Estrela hasn’t spoken to her two youngest kids about companion apps yet – her eldest daughter brushes off these apps as silly – but she’s prepared to explain how they can be harmful when the time comes.

“I’m not foolish and naive to believe I can control this, or I can supervise this at all times. But if the child has knowledge and awareness of what can happen, then it’s like the ‘don’t touch the hot stove,’” says Ms. Estrela. “I’m not going to be around you, but I’ve taught you that it’s hot and you can hurt yourself.”


Mr. Miller watches as his son, Caleb, and daughter, Elyse, use ChatGPT for creative storytelling and learning games under his guidance. ALEX FRANKLIN/The Globe and Mail

