Why tech made racial injustice worse, and how to fix it

A month ago, George Floyd was killed when a Minneapolis police officer knelt on his neck for 8 minutes and 46 seconds while Floyd said he could not breathe. His death has prompted a re-examination of long-standing issues of racial injustice between people, communities and governments, and of the role of the tech community.

The technology industry is often centered around the idea that science and technology can solve many of humanity’s biggest problems and overcome long-standing obstacles to progress. In recent years, that narrative has been called into question as systems such as artificial intelligence and facial recognition have reinforced racial divisions and made the problem worse.

As part of CNET's Now series, we explored the impact of technology on race relations with Dr. Ruha Benjamin, a professor of African American Studies at Princeton University and author of the book Race After Technology. Benjamin is a sociologist focused on technology, and she brings a unique perspective on how technology shapes race relations.

We talked about the ways technology is making things worse when it comes to race and social justice in America. These include facial recognition systems that, according to US government research, produce false positives for people of color at rates ranging from 10 to 100 times higher than for white faces. They also include recruiting systems that are meant to screen candidates in an automated, fair way but instead reinforce existing divisions and prejudices.


We also talked about how people of color have long proposed an alternative narrative on the social impact of technology. Dr. Martin Luther King, Jr. once warned, “When scientific power outpaces moral power, we end up with guided missiles and misguided men.”

Read more: Facial recognition has always been a problem for people of color. Everyone should listen

Benjamin offers constructive suggestions for how we can better design future systems to address present injustices.

"We need to look at who is actually developing the technology, what kinds of incentives are built into the structure [and] what kind of ecosystem it is," says Benjamin. "The fact is that our technology is being designed and developed by a small sliver of humanity, and that sliver of humanity is projecting its vision of the good life onto all the rest of us."

Dr. Ruha Benjamin

The tech industry's current panacea is to diversify, and companies are setting aggressive goals to transform themselves. But Benjamin said that won't be enough to bring about meaningful change. "We need to think about more than just diversifying who's behind the screen," says Benjamin.

"It's important, but it's not enough, because if the ecosystem stays the same," she says, "if the context and incentive structure in which that diverse workforce is developing technology stays the same, where profit inevitably tramples other kinds of public goods, then you can have as diverse a workforce as you want and you're still going to get many of the problems we see today."

As a sociologist, Benjamin has focused on the concept of "discriminatory design." In her 2015 TED Talk, Benjamin said, "At the heart of discriminatory design is the idea that we can create technological fixes for social crises ... Instead of dealing with the underlying circumstances, we create short-term responses that put the issue out of sight, out of mind."

Read more: Black Lives Matter: What you can do now and throughout the year

Benjamin's appeal is for the technology industry, and society more broadly, to think more critically about the role technology plays in our lives and in modern communities.

"Part of what we want is not just to focus our attention on creating better, less biased technology that works well, but to think about the whole ecosystem," she says. "What would it mean to develop technology in the public interest, for the public good, and not just in rhetoric ... I mean literally in the incentive structure, in the economic and social governance systems, so that we create an ecosystem that doesn't rest on trusting any one person's good intentions."

To learn more about Benjamin's thinking on how to reverse the impact of technology on racial injustice, take a look at her book Race After Technology.

From Jim Crow to the 'New Jim Code': how technology expanded racial …

