Computer programs using large language models display racist stereotypes about speakers of African American English (AAE per OED), even when the software has been designed not to incorporate overt negative stereotypes about Black people, a study has found. The research team used text written in AAE, which is used by many Black Americans.
The team fed social media posts—some written in AAE, others in Standardized American English—into a range of artificial intelligence (AI) systems, such as GPT4, and prompted them to choose five adjectives from a list to describe each speaker. The systems most strongly associated the AAE sentences with negative adjectives such as “aggressive,” “ignorant,” and “lazy.”
Collectively the AI-generated stereotypes were even more negative than those expressed by U.S. residents about Black people in studies conducted decades ago before the Civil Rights era, the researchers report this week in Nature.
The systems also predicted AAE speakers were more likely to be convicted of crimes. The authors note that employers are increasingly using AI to guide hiring decisions and conclude that these computer-generated negative stereotypes are likely to add to the documented prejudices that Black people already face.
[Link to full article w/ graphs Web or PDF download]