People are more likely to exploit female AI partners than male AI partners, suggesting that gender-based discrimination extends beyond human-to-human interactions.
A recent study published November 2 in the journal iScience investigated how people’s willingness to cooperate changes when their human or AI partner is labeled as female, male, nonbinary, or without a gender.
The researchers asked participants to play the prisoner’s dilemma, a well-known game in which each of two players chooses either to cooperate with the other or to act independently. If both cooperate, they achieve the best combined outcome.
However, if one player cooperates and the other does not, the player who did not cooperate earns the higher score, giving each side an incentive to “exploit” the other. If both players refuse to cooperate, both end up with lower scores.
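The payoff structure described above can be sketched as follows. The point values here are hypothetical (the article does not give the study's actual scores); they just follow the standard ordering that creates the dilemma:

```python
# Illustrative payoffs for one round of the prisoner's dilemma.
# The numbers are hypothetical, not from the study; they preserve the
# standard ranking: unilateral defection (5) > mutual cooperation (3)
# > mutual defection (1) > being the lone cooperator (0).
PAYOFFS = {
    # (my_choice, partner_choice): (my_score, partner_score)
    ("cooperate", "cooperate"): (3, 3),  # best joint outcome
    ("defect", "cooperate"): (5, 0),     # defector "exploits" the cooperator
    ("cooperate", "defect"): (0, 5),
    ("defect", "defect"): (1, 1),        # both score lower
}

def score(my_choice, partner_choice):
    """Return (my_score, partner_score) for one round."""
    return PAYOFFS[(my_choice, partner_choice)]
```

Note that whatever the partner does, defecting yields the individually higher score, which is exactly the incentive to exploit that the study measures.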
The study found that people were about 10% more likely to exploit an AI partner than a human partner. Participants were also more likely to cooperate with partners labeled female, nonbinary, or without a gender than with partners labeled male, because they expected those partners to cooperate in return.
People were less likely to cooperate with male-labeled partners because they did not trust them to choose cooperation. In particular, female participants were more likely to cooperate with other “female” agents than with agents perceived as male, an effect known as “homophily.”
“The biases observed in human-AI agent interactions are likely to influence the design of AI agents, for example to maximize people’s engagement and build trust in interactions with automated systems,” the researchers wrote in their study. “Designers of these systems need to be aware of unwanted biases in human interaction and actively work to mitigate them in the design of conversational AI agents.”
Risks of anthropomorphizing AI agents
When participants chose not to cooperate, it was for one of two reasons. Either they expected their partner to defect and did not want to be left with the lowest score, or they expected their partner to cooperate and defected anyway, gaining a higher score at the partner’s expense. The researchers defined this second choice as exploitation.
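That two-way classification can be written as a small helper. This is a hypothetical sketch of the distinction the researchers draw, not their actual analysis code:

```python
def defection_motive(choice, expected_partner_choice):
    """Classify why a player defected, following the study's two categories.

    Hypothetical helper: a defection made while expecting the partner to
    defect is defensive ("fear"); a defection made while expecting the
    partner to cooperate is "exploitation".
    """
    if choice != "defect":
        return None  # only defections are classified
    if expected_partner_choice == "defect":
        return "fear"
    return "exploitation"
```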
Participants were more likely to exploit partners labeled female, nonbinary, or without a gender than partners labeled male, and the likelihood of exploitation increased when the partner was an AI. Men were more likely to exploit AI partners and more likely to cooperate with human partners, while women cooperated more often than men and did not differentiate between human and AI partners.
This study did not have enough participants who identified as a gender other than female or male to draw conclusions about how other genders interact with gendered human and AI partners.
According to the study, AI tools are increasingly being anthropomorphized (given human-like characteristics such as gender and names) to encourage people to trust and engage with them.
But anthropomorphizing AI without considering how gender-based discrimination shapes people’s interactions risks reinforcing existing biases and exacerbating discrimination.
Many of today’s AI systems are online chatbots, but in the near future people may routinely share the road with self-driving cars or let AI manage their work schedules. This means we may have to work with AI in the same ways we are currently expected to work with other humans, making awareness of gender bias in AI even more important.
“Although exhibiting discriminatory attitudes toward gendered AI agents may not be a major ethical challenge in itself, it can foster harmful practices and exacerbate gender-based discrimination that exists within our society,” the researchers added.
“By understanding the underlying patterns of bias and user perceptions, designers can work towards creating effective and trustworthy AI systems that can meet the needs of users while promoting and maintaining positive social values such as fairness and justice.”