Friday, September 22, 2023

New AI tool calls out gender bias in classic fairy tales 

by News Desk

A new AI tool created in the U.S. is exposing the gender biases embedded in traditional fairy tales such as Snow White, Cinderella and Sleeping Beauty.

Hundreds of stories were run through the AI-driven tool to assess their level of gender bias and stereotyping. Researchers said they ultimately wanted to examine how AI can help young children develop language skills without absorbing harmful gender stereotypes.

“If we can develop a technology to automatically detect or flag those kinds of gender biases and stereotypes, then it can at least serve as a guardrail or safety net not just for ancient fairy tales but the new stories being written and created every day today,” said lead researcher Dakuo Wang, an associate professor at Northeastern University.

Hundreds of fairy tales and stories were collected from various countries and run through the AI tool to identify patterns in the sequences of events or actions experienced by characters.

The tool first isolated character names, genders and events from each story; an automated process then arranged the events into “temporal narrative event chains” for each character, so that they could be categorised accordingly.
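The chain-building step described above can be illustrated with a short sketch. This is not the researchers' actual code; the story annotation below is hypothetical, and it simply groups events by character in narrative order:

```python
from collections import defaultdict

# Hypothetical simplified story annotation: (character, event) pairs
# listed in the order they occur in the narrative.
story = [
    ("Cinderella", "clean"),
    ("Cinderella", "sew"),
    ("Prince", "search"),
    ("Cinderella", "marry"),
    ("Prince", "marry"),
]

# Build one event chain per character, preserving narrative order.
chains = defaultdict(list)
for character, event in story:
    chains[character].append(event)

print(chains["Cinderella"])  # ['clean', 'sew', 'marry']
print(chains["Prince"])      # ['search', 'marry']
```

As Wang notes below, it is the events *before* the shared ending that differ between the chains.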

Each event was then assigned a ‘likelihood’ ratio to establish how frequently it was associated with male or female characters.
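The paper's exact scoring formula is not given in this article, but a minimal sketch of a gender-association ratio of the kind described, computed over hypothetical (gender, event) pairs, might look like this:

```python
from collections import defaultdict

# Hypothetical (character_gender, event) pairs extracted from stories.
events = [
    ("female", "cook"), ("female", "sew"), ("female", "clean"),
    ("male", "fight"), ("male", "fight"), ("male", "cook"),
    ("female", "cook"), ("male", "rescue"),
]

# Count how often each event co-occurs with each gender.
counts = defaultdict(lambda: {"male": 0, "female": 0})
for gender, event in events:
    counts[event][gender] += 1

def likelihood_ratio(event):
    """Fraction of an event's occurrences tied to female vs. male characters."""
    c = counts[event]
    total = c["male"] + c["female"]
    return c["female"] / total, c["male"] / total

print(likelihood_ratio("cook"))  # e.g. (0.667, 0.333) for this sample
```

Aggregating such ratios across all events is what yields overall figures like the 69/31 split reported below.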

Of the 33,577 events analysed with the tool, 69 per cent were associated with male characters, while only 31 per cent were tied to female characters.

Female characters were mostly found to be engaged in domestic tasks, such as grooming, cleaning, cooking and sewing, while male characters were connected to events that revolved around success, failure, or aggression.

“It’s actually the experience and the action that defines who this person is, and those actions influence our readers about what [they] should do or shouldn’t do to mimic that fictional character,” Prof. Wang, who studies natural language processing, said. 

“Someone is being saved and then getting married and then living happily ever after; some others killed the monster, saved the princess and lived happily ever after.”

“It’s not the ‘lived happily ever after’ part or ‘get married’ part that are different. It’s actually the events happening before these events in a chain that make a difference.”

Wang hopes that the tool can be used by writers and publishers to create stories that won’t perpetuate outdated attitudes about gender. Individuals can upload their drafts into the tool to receive a score indicating potential gender biases, as well as suggestions for improvement.

“If in the future I have a baby girl, I don’t want her to feel discouraged to take on those tasks or conquer those challenges [or] say, someone will come save me or it’s not supposed to be something I would do as a girl,” Wang said.

The latest research confirms previous studies that showed pervasive stereotypical portrayals of females and males in traditional fairy tales, including one from last November that concluded female characters were “more associated with care, loyalty and sanctity related moral words, while male characters were more associated with fairness and authority related moral words.”

The post New AI tool calls out gender bias in classic fairy tales  appeared first on Women’s Agenda.
