Women are perceived differently from men — even when they’re robots

We tend to bond better with bots that we perceive to be female. (AP Photo/Elaine Thompson)

Female bots are perceived to have more positive human qualities, such as warmth, experience and emotion, than male bots, and this greater humanness leads consumers to prefer female artificial intelligence, according to a new study published March 22 in Psychology & Marketing. Consumers think that female AI is not only more human but also more trustworthy and more likely to consider the unique needs of users.

The large-scale study is the first of its kind about gendered AI and is key to understanding how current technology may objectify women.

The preference for female AI is evident in the market today, from virtual assistants such as Siri and Alexa to robots such as Sophia (who recently sold an NFT for nearly $700,000). Machines lack warmth and friendliness, and humanizing technology makes users more apt to trust robots — especially when the robots are cast as female.

Previous studies have shown that gendered and realistic robots are seen as more human, but researchers have largely overlooked the differences between female and male bots. Sylvie Borau, lead author of the study and a professor of ethical marketing at TBS Business School in Toulouse, France, said this distinction is important.

Borau’s research uses “what really makes something human as a building block [to] explain why female features are so prominent in AI settings.” She tested human characteristics, including competence, warmth and experience, in five separate studies. Online interactions and surveys with over 3,000 people compared gender perceptions not just in machines but in humans and animals, too.

In one study, participants were randomly assigned to either a female or a male chatbot and then answered questions about the chatbot’s humanness, such as how mechanical, cold and evolved the bot was perceived to be. The male and female chatbots’ ages, facial expressions and physical attractiveness were identical. Even their names, Oliver and Olivia, were similar.

In another study, participants were shown an image called “The March of Progress,” which illustrates man’s progression from ape to human. Though the drawing is an iconic depiction of human evolution, there has never been a female version of this “Road to Homo Sapiens.” Borau had to create a gendered image for her research, which measured dehumanization in humans and animals.

Participants reported that their views on gender were influenced by qualities such as age and parenthood, which have been shown to increase gender biases. People who follow gender stereotypes are more likely to dehumanize women, whether human or machine, according to the researchers.

This study found that older participants were less likely to humanize bots and, perhaps unsurprisingly, that narcissists were more negative toward bots.

Overall, participants perceived human women as having more positive human qualities than human men for nearly all variables in the study. People made both subtle and blatant links between female robots and their human character. In health care services, the female bots and chatbots were thought to provide more individualized treatment solely because of their gender. In all cases, male bots were perceived to be less human. But relying on female AI creates an ethical issue.

Transforming women into objects as virtual bots “could actually lead to women’s objectification,” Borau told The Academic Times. Using a female personality to make technology seem more human promotes an “idea that women are simple tools designed to fulfill their owners’ needs,” she said.

And most virtual programs use only the mental state of women, while the woman’s body disappears. Similar to how women are “sexually objectified based on their physical appearance in advertising,” female bots are “mentally objectified in AI,” Borau explained.

Gender-neutral robots are one possible solution, though Borau sees their voices as a big issue. “AI engineers will have to work very hard to make gender-neutral robots sound human,” because their voices tend to sound more robotic, she said. Another solution is to achieve gender parity in AI, which will be the focus of Borau’s future research.

While governments and workplaces sometimes struggle to have an equal number of women and men, Borau thinks gender parity can be very easy in AI. Companies choose the sex of the robot as part of the design, and there’s no need for gender biases to get in the way of equality.

“It’s impossible that we’re going to let these companies feminize all AI,” she said. Borau also plans to research how humans verbally abuse female robots. She assumes that people are more likely to make sexist statements to female bots. “I don’t understand why there’s not more backlash on women in AI,” Borau said. “AI should reflect the ideal world we want to live in, not be a selection of the current sexist biases” in our society, she said.

The study, “The most human bot: Female gendering increases humanness perceptions of bots and acceptance of AI,” published March 22 in Psychology & Marketing, was authored by Sylvie Borau and Samuel Fosso Wamba, TBS Business School; Tobias Otterbring, University of Agder and the Institute of Retail Economics; and Sandra Laporte, Toulouse School of Management.