As artificial intelligence becomes more common in classrooms and everyday life, researchers at Emory University are asking a new question: How can children learn to use AI thoughtfully from the start?

A team in Emory’s Natural Language Processing lab is developing Tinker Tales, an interactive storytelling experience designed to help young children understand how AI works by building stories with it.

Led by Jinho Choi, associate professor of computer science, the project introduces elementary school students to AI through guided, hands-on experiences that emphasize creativity, collaboration and reflection.

The Tinker Tales AI-powered app works in conjunction with a physical board game. As the story unfolds, players choose game pieces, including characters, objects, scenery and emotions, to add to it. Players scan each piece with a mobile phone running the app. Each piece contains an NFC (near-field communication) chip that lets the app recognize the player’s choices and use them to guide how the AI develops the story. The project has received funding from the Georgia Research Alliance to support its development.
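In broad strokes, that loop works like any tag-to-prompt pipeline: each chip carries a unique ID, the app maps the ID to a story element, and the elements steer the AI narrator. A minimal sketch of that idea follows; the tag IDs, piece names and prompt wording are illustrative assumptions, not details of the actual Tinker Tales app.

```python
# Hypothetical sketch: mapping scanned NFC tag IDs to game pieces and
# turning them into guidance for an AI storyteller. All IDs, names and
# prompt text below are invented for illustration.

# Each NFC chip exposes a unique ID; the app would map IDs to pieces.
PIECE_REGISTRY = {
    "04:A1:3B:22": {"type": "character", "name": "brave prince"},
    "04:9F:11:C8": {"type": "emotion", "name": "scared of water"},
    "04:77:E0:05": {"type": "scenery", "name": "stormy sea"},
}

def build_story_prompt(scanned_ids, story_so_far):
    """Turn the child's scanned pieces into a prompt for the AI narrator."""
    pieces = [PIECE_REGISTRY[i] for i in scanned_ids if i in PIECE_REGISTRY]
    choices = ", ".join(f"{p['type']}: {p['name']}" for p in pieces)
    return (
        f"Continue this children's story: {story_so_far}\n"
        f"Weave in the child's new choices ({choices}), "
        "keeping the child's ideas central to the plot."
    )

prompt = build_story_prompt(
    ["04:A1:3B:22", "04:9F:11:C8"],
    "Once upon a time, a prince set sail...",
)
print(prompt)
```

The key design point the article describes survives even in this toy version: the child's physical choices, not the model, determine what the story must contain.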

“We have seen this pattern before,” Choi says. “When the internet became widely accessible, students who knew how to navigate it effectively had a significant edge. AI is a similar inflection point, but the gap it creates will be wider and its impact on human thinking will be more fundamental.”

AI is already something children interact with directly, he adds, and those early experiences can shape how they understand and use the technology.

“As a parent of young children myself, I did not want to address this by putting more screens in front of them,” Choi says. “So, we built Tinker Tales as a physical storytelling experience. The child drives the story; the AI is a collaborator, not the author.”

Building stories and understanding technology

Tinker Tales is currently being tested with students in kindergarten through third grade. During the activity, children co-create stories with an AI system, making narrative choices and watching how the story evolves in response.

After making choices for their characters, players are asked reflective questions designed to encourage empathy and build social awareness, such as, “How do they feel about each other right now?” and “Does anyone feel differently from the others?” The goal is for the questions to make children think more deeply about the story and their role in shaping it.

Researchers say the goal is to move beyond passive use of AI tools and instead teach children how to engage with them critically and creatively.

“The clearest sign that something has gone wrong in education is when a student submits work that AI produced but cannot explain or defend it,” Choi says. “That failure reflects a misunderstanding of what AI collaboration should be.”

AI cannot replace human perspective, he notes.

“What AI cannot do is bring your specific experiences, perspective and imagination to the work,” he says. “That human contribution is what makes something original.”

Tinker Tales is designed to reinforce that creative instinct early.

“When a child places a piece on the board and says, ‘The prince is brave but scared of water,’ the AI builds on the child’s idea,” Choi says. “The goal is for children to develop a sense of ownership and confidence in what they create.”

Teaching thoughtful AI use early

The Tinker Tales project grew out of concern that many children are already interacting with AI tools without guidance. Researchers say introducing these concepts early may help shape more thoughtful and responsible AI use over time.

“AI is powerful, but it is not clinically validated for most of what people are using it for, and the consequences of ignoring that are not abstract,” Choi says.

He points to the growing use of general-purpose AI tools in sensitive areas such as mental health.

“People regularly turn to tools like ChatGPT for support, where there are no guardrails or clinical grounding,” he says. “If adults do this, children will too, and they are far less equipped to recognize when AI is wrong or inappropriate.”

That makes early exposure especially important.

“The scariest part is that AI safety is not keeping pace with how quickly these tools are spreading,” Choi says. “How children first encounter AI is not a minor question.”

“AI will be part of our lives no matter where we go in the future,” Kim adds. “I hope my child learns to use AI wisely and thoughtfully so that people stay in control and use it when it’s helpful, not depend on it for everything.”

Expanding access to classrooms and communities

Early testing suggests the platform is particularly effective for younger elementary students, especially first- and second-graders. Researchers are continuing to refine the experience, with plans to expand access to schools and public libraries.

“We are expanding Tinker Tales in two directions,” Choi says.

The first focuses on social-emotional learning, including a future multi-child mode where AI helps students collaborate and navigate creative differences.

“As learning shifts increasingly to individual screens, those interpersonal skills get less practice,” he says. “We want Tinker Tales to work against that.”

The second focuses on computational thinking, allowing older students to design and customize their own AI agents.

Choi says the more important divide may be between people who can customize AI for their own needs and those who can only use standard AI tools created by others.

“A well-directed personal agent can compound your specific knowledge in ways a general AI cannot,” he says.

Across both efforts, Choi says the goal remains the same.

“We want to prepare the next generation to be effective with AI without losing what makes their thinking distinctly human.”

This is sponsored content.
