Too frequently, racial capitalism is all show and no substance. We have a big problem with race relations in America. Racial capitalism doesn’t help, and it sometimes makes things worse…
Racial capitalism breeds racial resentment. People of color know when they’re being showcased to benefit someone else — for example, a student of color whose photo is plastered all over their school’s website often resents the school for using them.

Sean Illing, “How capitalism reduced diversity to a brand” at Vox
However, writer and theatre producer Libby Emmons sees the problem a bit differently. In her view, “using AI for tribalism” works precisely because the tribalism is already accepted in society. Big business uses big tech to respond profitably to an existing trend:
When we identify ourselves and allow ourselves to be identified, when we tribe up and proclaim the characteristics that are uniform throughout our tribe, we give advertisers and marketers just what they’ve been looking for all these years: groups of conformed individuals to whom they can sell things. The last century proved to marketers and advertisers that they could create products that were geared to be consumed by specific subsets of the population, from fan bases to ethnic groups. The new way to do this is through AI and machine learning algorithms that do more than target individuals who subscribe to group identities—it actually herds us into identities.

Libby Emmons, “How Identity Politics Plays Right Into The Hands Of Big Business” at The Federalist
She finds that we often insist on going where the Machine is taking us. She recounts meeting an old friend who was horrified by the suggestion that she should not be judged by others based in part on her ethnic identity. The friend seemed unaware that judging people by their identity was, until recently, considered a form of prejudice. To the extent that the friend's view represents a growing force, big business must listen or lose ground. Emmons asks,
If we do not want to be categorized by identity, then why do we continue to push ourselves into these categories? And if we do want to be recognized by group identifiers—if we want our tribes to be known—then why do we have an issue with this data being used to give us what our past selections, or presumptions about what our group identifiers, signify we want? If we are going to proclaim our identity, and demand that it be relevant information to political and social forces, then we should want our feeds and choices to be calibrated to our identity.

Libby Emmons, “How Identity Politics Plays Right Into The Hands Of Big Business” at The Federalist
The simmering controversy sometimes explodes into serious charges. For example, the Department of Housing and Urban Development has launched a Fair Housing Act complaint against Facebook for targeting customers in a way that may constitute discrimination:
“Facebook is discriminating against people based upon who they are and where they live,” HUD Secretary Ben Carson said. “Using a computer to limit a person’s housing choices can be just as discriminatory as slamming a door in someone’s face.”
The social network allowed those advertising housing to exclude people it classified as parents; non-American-born; non-Christian; interested in accessibility and Hispanic culture; as well as other groups deemed protected classes, HUD said in a statement.

Terrence Dopp and Jesse Westbrook, “Facebook Violated Fair Housing Act With Ad Practice, HUD Charges” at Bloomberg
In any event, in addition to the facts about ourselves that we want others to know, high tech firms are continuing to gather and store data about us whether we know about it or not. That may prove to be an even more significant issue in the long run.
See also: Did AI teach itself to “not like” women?
Can an algorithm be racist?