Navigating the complex relationship between race and AI


Facial recognition. Credit: Pixabay/CC0 Public Domain

Race is intricately woven into the fabric of daily culture, and as a result, it influences artificial intelligence (AI) systems that automate tasks traditionally performed by humans.

Whether in disease detection, facial recognition, loan approvals, or generative imagery, race is an embedded element in numerous technologies. Consequently, AI systems have frequently been criticized for their insensitivity and inaccuracies in addressing race, often neglecting the complex social nuances it entails.

Two members of the Pamplin College of Business—marketing doctoral candidate Angela Yi and Assistant Professor of Marketing Broderick Turner—recently explored the complex relationship between race and AI in their paper, “Representations and Consequences of Race in AI Systems,” appearing in the August edition of the journal Current Opinion in Psychology.

To address the issue, Yi and Turner first had to settle on a definition of race.

“There’s actually no biology behind race,” Yi said. “It’s more of a social categorization system that we still see used today.”

The concept of race first emerged during the European Enlightenment. It was used to classify people based on superficial traits, which were then used to reinforce existing social hierarchies. These categories, established during periods of colonialism and slavery, have persisted and continue to influence modern AI systems, often perpetuating outdated and inaccurate assumptions.

According to Yi, race is often treated as a fixed category—such as Black, white, or Asian—in many AI systems. This static representation fails to capture the social and cultural dimensions of race, leading to inaccuracies and potential biases.

“For example, when a person using Google Gemini searched for an image of the Founding Fathers of the United States, the system outputted an image that included non-white individuals,” Yi said. “It is speculated that this occurred because Google was trying to overcorrect for diverse representation.”

As the Gemini example illustrates, integrating race into AI systems presents significant challenges: treating race as a fixed category overlooks its social and historical dimensions and invites both inaccuracy and bias.

In their paper, Turner and Yi offered recommendations for appropriately incorporating race in AI systems.

  • Reassess the necessity of race: Evaluate whether incorporating race into an AI system is necessary for achieving its objectives. If including race does not improve accuracy, it may be best to exclude it from the model.
  • Capitalize on the probabilistic nature of AI: Instead of relying on fixed racial categories, AI systems should use odds-based approaches that reflect the variability and complexity of race. Probabilistic models can provide a more nuanced representation than assigning individuals to fixed categories (see the sketch after this list).
  • Use proxy variables: In contexts such as medical AI, where race is often used to explain differences in health outcomes, other variables should be considered. For example, rather than attributing differences in health outcomes solely to race, AI systems could use specific proteins or other relevant biomarkers to provide more accurate predictions and insights.
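To make the second and third recommendations concrete, here is a minimal Python sketch (not from the paper) that contrasts a fixed single-label representation of race with an odds-based representation and with a proxy-biomarker representation. All field and marker names (self_id_distribution, cystatin_c, creatinine) are illustrative assumptions for this sketch, not variables used by Yi and Turner.

```python
# Minimal sketch, not from the paper: three ways an AI pipeline might
# represent (or avoid) race. Field and biomarker names are hypothetical.
from __future__ import annotations

from dataclasses import dataclass, field


@dataclass
class FixedCategoryRecord:
    # Static, single-label representation the authors caution against.
    race: str  # e.g. "Black", "white", "Asian"


@dataclass
class ProbabilisticRecord:
    # Odds-based representation: a distribution over self-identified
    # categories rather than one fixed label.
    self_id_distribution: dict[str, float] = field(default_factory=dict)

    def normalized(self) -> dict[str, float]:
        # Normalize the weights so they sum to 1 before downstream use.
        total = sum(self.self_id_distribution.values()) or 1.0
        return {k: v / total for k, v in self.self_id_distribution.items()}


@dataclass
class BiomarkerRecord:
    # Proxy-variable representation for medical AI: use measured
    # biomarkers directly instead of race as a stand-in.
    biomarkers: dict[str, float] = field(default_factory=dict)


if __name__ == "__main__":
    fixed = FixedCategoryRecord(race="Black")
    probabilistic = ProbabilisticRecord(
        self_id_distribution={"Black": 0.7, "white": 0.2, "Asian": 0.1}
    )
    proxy = BiomarkerRecord(biomarkers={"cystatin_c": 0.95, "creatinine": 1.1})

    print(fixed)
    print(probabilistic.normalized())
    print(proxy)
```

The design point is modest: the probabilistic record preserves uncertainty and mixed identification instead of forcing one label, while the biomarker record removes race from the model entirely in favor of measured quantities.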

“People need to recognize that race is a dynamic social construct that changes over time,” Yi said. “AI systems should reflect this complexity rather than relying on outdated or overly simplistic categories.”

Yi also suggested that developers consider the broader implications of including race in AI systems and embrace more nuanced representations.

“Including race in AI systems is not always going to be a simple answer, but it’s going to need to be a nuanced answer because race is social, and understanding the social and historical context of race can help developers create more equitable and accurate models,” she said.

More information:
Angela Yi et al., Representations and consequences of race in AI systems, Current Opinion in Psychology (2024). DOI: 10.1016/j.copsyc.2024.101831

Provided by
Virginia Tech


Citation:
Navigating the complex relationship between race and AI (2024, August 27)
retrieved 27 August 2024
from https://phys.org/news/2024-08-complex-relationship-ai.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.
