Ruha Benjamin

In 2016-’17, when Ruha Benjamin, associate professor in the Department of African American Studies at Princeton University, was on leave at the Institute for Advanced Study and thinking about her next research effort, she saw lots of headlines warning people about “racist robots.” What this means, she says, is that the way we train our machines and apps “reproduces or reinforces existing social biases.” As a result, applications developed to solve a real problem can at the same time reinforce existing prejudices.

Benjamin offers as an example an application developed to make resume handling more efficient and equitable that in the end sorts resumes based on existing prejudices.

Not only do these programs sort through resumes more quickly, Benjamin says, “but they also promise to be less biased than a human who is interviewing or sorting resumes based on the prejudices humans have.”

When Amazon was developing a hiring tool in 2014, for example, it trained its computer models by observing patterns in resumes submitted over a 10-year period. Because most of those resumes came from men, reflecting male dominance across the tech industry, the models learned to favor male candidates, according to a 2018 Reuters article by Jeffrey Dastin.

“Even if a resume doesn’t have a box that has a choice of female or male, gender is coded in many other areas of the application,” Benjamin says, citing as an example membership in a club or activity that is gendered.
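To make the proxy idea concrete, here is a minimal sketch in Python with scikit-learn. The data, feature names, and labels are all invented for illustration, not drawn from Amazon's actual system: a toy screener trained on simulated historical hiring outcomes that favored men ends up penalizing a feature like membership in a women's club, even though that feature says nothing about ability.

```python
# Toy illustration (invented data and feature names, not Amazon's system)
# of how a resume screener trained on biased historical outcomes can
# learn a gendered proxy feature.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Two features: years of experience, and membership in a women's club,
# a proxy that correlates with gender but says nothing about skill.
experience = rng.normal(5, 2, n)
womens_club = rng.integers(0, 2, n)

# Simulated historical labels: past hiring favored men, so the outcome
# depends on the proxy, not just on experience.
hired = ((experience > 4) & (womens_club == 0)).astype(int)

X = np.column_stack([experience, womens_club])
model = LogisticRegression().fit(X, hired)

# The trained model puts a large negative weight on the proxy feature:
# it has "learned" the historical bias, not merit.
print(dict(zip(["experience", "womens_club"], model.coef_[0].round(2))))
```

The point of the toy example is that the model never sees a gender field at all; it simply rewards whatever correlated with past hiring decisions, proxies included.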

Benjamin, a sociologist, has tackled this problem in two recently published books: “Race After Technology: Abolitionist Tools for the New Jim Code” (Polity, July 2019) and “Captivating Technology: Race, Carceral Technoscience, and Liberatory Imagination in Everyday Life” (Duke University Press, June 2019). She will share some of her findings in a talk, “Are Robots Racist? Reimagining Technology, Equity, and the New Jim Code,” on Friday, October 4, at 2 p.m. in McCosh Hall 10 at Princeton University.

Technology is making decisions about the future based on “patterns we take for granted or make excuses for,” Benjamin says, and we need to decide whether “we want to maintain those patterns or change them.” She suggests a number of changes to bring more equity into the tech arena.

Amplify the social dimensions of scientific and technological education. The idea behind the training of engineers and computer scientists, she says, has been that “technology is going to be able to save us from all the imperfections in society. They don’t learn how technology has been used to abuse or harm people.”

If ethics and knowledge of the social sphere were more thoroughly integrated into technical training, software creators would be more likely to ask questions about the social context of, and the human needs served by, a particular technology. When they don’t, the technology often fails to meet the needs of the people who will be using it.

One example of a well-intentioned technology that may not be meeting the needs of the students who use it is Summit Learning, a web-based curriculum designed by Facebook engineers. Its website states that it “gives every student support from a caring mentor, life skills that they can apply to real-world situations, and an ability to use self-direction.”

But on the ground its creators missed the fact that kids could be spending close to five hours a day in front of a computer, mediated by only 10 to 15 minutes of teacher mentoring per week.

Nearly 100 students at the Secondary School for Journalism in Park Slope, Brooklyn, fed up with Summit Learning, walked out of classes in protest of the curriculum. In response, the Department of Education immediately dropped the program in the 11th and 12th grades.

Similarly in Kansas, according to the New York Times, many towns that were dealing with underfunded schools and declining test scores initially embraced the free new curriculum. But it wasn’t working for many students, and a number of parents took their children out of the public schools.

“These are pockets of organizational activism in which people are pushing back against the idea that technology is the solution,” Benjamin says. “It’s not that we’re against technology, but we need to be more involved in the decisions about how to incorporate this. It can’t be a top-down rollout that displaces the role of human relationships and people.”

Lobby for laws and policies to monitor existing technologies. “Right now tech companies spend a lot of money lobbying government officials to create policies in their favor,” Benjamin says. But at the grassroots level, people are working on “policies less about maximizing the good of the company, and more about maximizing the public good.”

People are lobbying, for example, to create moratoriums and bans on certain technologies, like facial recognition. In response, San Francisco has banned city agencies from using facial recognition software. Whatever benefits the technology may offer in investigating crimes, it has potentially serious downsides: it can invade privacy or be used to target particular neighborhoods. And, as MIT graduate student Joy Buolamwini found in her research, some facial recognition software is more likely to produce false positives when identifying darker-skinned people.
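The kind of disparity Buolamwini documented can be illustrated with a simple per-group audit. The sketch below uses invented toy data, not her actual findings, to compute the false positive rate (the share of non-matching faces wrongly flagged as matches) separately for each group:

```python
# Minimal per-group audit sketch (invented toy data, not Buolamwini's
# actual results): compare false positive rates of a hypothetical
# face-matching system across skin-tone groups.
import numpy as np

is_match  = np.array([0, 0, 0, 0, 1, 0, 0, 0, 1, 0])   # ground truth
predicted = np.array([0, 1, 0, 1, 1, 0, 0, 1, 1, 0])   # system output
group = np.array(["lighter", "lighter", "lighter", "darker", "darker",
                  "lighter", "darker", "darker", "darker", "lighter"])

for g in ["lighter", "darker"]:
    non_matches = (group == g) & (is_match == 0)   # true non-matches only
    fpr = predicted[non_matches].mean()            # share wrongly matched
    print(f"{g}: false positive rate = {fpr:.2f}")
```

Even this small example shows how a single aggregate accuracy number can hide a large gap between groups.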

Bring input from the general public into private sector decision-making about technology. Although we are socialized to think that decisions about the software design process are entirely private-sector concerns, Benjamin says, “if an app is going to have a widespread public impact, it is no longer feasible to think about it in this separate sphere.”

Benjamin offers as examples two apps that try to address the crisis of over-incarceration. The first, Promise, developed by a private-sector startup, was heralded as a way to get people out of jail sooner by tracking their movements. The second, Appolition, developed in a communal context, is an automated bail fund: people donate spare change, which community organizations around the country use to pay bail for people who can’t afford it so they can be released from pretrial detention.

“On the surface there is a similarity [between the two apps], but one [Promise] continues to feed and bloat the jails and prisons in the U.S.,” Benjamin says. Whereas the goals of Promise may seem to be equitable, its tracking “makes it easy for people to have technical violations that have nothing to do with breaking preexisting laws.” One example is when the mother of a man being tracked got sick after his curfew. “He had to choose between whether to violate curfew by taking his mom to the hospital or not taking his mom.” She adds that people have to pay for the monitors themselves, and missing a payment is also a violation.

Benjamin was born in India and moved to a black and Latino community in Los Angeles at age 3. Her mother is of Iranian heritage and her father is African American. They met in India at a Bahai program during the 1970s and decided to get married the next day.

From ages 9 to 16, she lived in Conway, South Carolina, where her father worked at the Louis G. Gregory Bahai Institute and the attached Bahai radio station, which focused on socioeconomic development in the surrounding rural area.

It was in school that Benjamin became cognizant of race. “I was seeing different treatment in the schools,” she says.

Aware of how race and class operated, Benjamin felt drawn to the anti-apartheid movement in South Africa and the role students played in it. So for two years she studied at the United World College in Swaziland. The school, located near the South African border, was created by anti-apartheid activists of different backgrounds so that their children would not have to go to school under apartheid.

For college, she had considered schools in South Africa, but ended up applying only to Spelman College, an all-black women’s school in Atlanta, Georgia. Spelman was a deep learning experience for her. “The main thing I learned was that despite the demographic similarity there was so much difference in the student body that I could appreciate and see because we were all black women — class, which was very pronounced, nationality, religion, sexual preference, region of the country,” she says.

Starting as a child development major with a minor in drama, she switched to sociology after her first class.

Her thesis, a comparison of the black midwifery tradition to what happens in medical/obstetrical practice, paralleled her decision to use a black midwife to deliver her older son, Malachi, who was born a week before she graduated from Spelman in 2001.

After being accepted into the sociology graduate program at the University of California-Berkeley, Benjamin and her husband moved to California. In the second of their six years there, the couple’s younger son, Khalil, was born, this time delivered by a certified nurse midwife whose services could be covered by insurance.

At Berkeley her focus was on the sociology of science and medicine, and biotech in particular. She took the same set of concerns and questions about inequality, power, and race she had looked at in her thesis and studied how they worked during the creation and implementation of an agency devoted to stem cell research. Studying the social dimensions of the agency, she concluded that “science doesn’t take place in a bubble; it is not outside of society. It takes place in an already unequal context — what we do in science, technology, and medicine is affected by the existing social order.”

After receiving her degree in 2008, Benjamin spent two years as a postdoctoral fellow at the Institute for Society and Genetics at the University of California-Los Angeles, then four years at Boston University, teaching classes on science and technology as well as on race and ethnicity, sometimes bringing the two areas together in one class.

In talking about her work recently with community groups, K-12 educators, and technologists, Benjamin says she has been “most surprised by the fact that in the last couple of years there has been a growing movement and pushback against harmful technology by people working in the tech industry themselves.”

Although on the surface the people in these groups may not seem like they have shared interests, Benjamin says, “when you dig beneath the surface, you see that the values and commitments are converging.”
