By Kathleen Parrish

If you think algorithms are neutral bits of data and code, free of human bias, consider that a 2015 study found Google’s online advertising system showed ads for high-paying jobs more often to men than to women.

Algorithms are written by people and can perpetuate social bias around identities and groups, Mary Armstrong, professor and chair of women’s and gender studies, explained during her talk “Figuring What’s Hidden: How Algorithms Help Discriminate,” organized by the student Women in Computing group as part of Hidden Figures Week.

Mary Armstrong, head of women’s and gender studies, at a previous event with feminist activist and author Rita Mae Brown

Unless you work in computing or study informatics, Armstrong said, you likely don’t realize we’re all soaked in algorithms, “invisible hands that guide us” that help form our perceptions of the world.

“Algorithms turn data into information,” she said. “The first is a set of items, the other is a form of conclusion, and it generates trends and eventually ‘facts.’ If you get enough facts, you get reality, and if you get reality, you own everything.”

Big data drives decisions about employment, education, housing, health care, policing, “the rules we live by,” said Armstrong. “The question becomes this: What if that data is reflecting the biases and inequities that already exist?”

Consider:

  • A Harvard University study found ads for arrest records were 81 to 93 percent likely to turn up in searches for names culturally associated with being black, compared with 20 to 23 percent of the time for names culturally associated with being white. “What is it teaching a person who Googles that?”
  • The Federal Trade Commission has struggled with questions of regulation of advertisers who target people who live in low-income neighborhoods with high-interest loans.
  • A University of Washington study found a Google image search for CEO turns up women only 11 percent of the time, even though women hold 27 percent of CEO positions in the United States. “The algorithm got it wrong because of assumptions.”

Machine learning evolves based on online human behavior, so when you punch a topic into a search engine, it takes you where it thinks you want to go. From there, already biased human “clicking patterns” further reinforce those assumptions.

“A search could have taken you here, here, and there, but it didn’t. That’s incredibly important because it took you away from certain kinds of outcomes and information,” said Armstrong. “As the algorithm learns and starts to get smarter, you get a strong feedback loop. That’s why it’s so important that all of us, from programmers to web users, become actively aware that what’s driving seemingly objective searches can be biased.”
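
To make that feedback loop concrete, here is a minimal, purely hypothetical Python sketch (not from Armstrong’s talk): a toy ranker that learns from clicks keeps rewarding whatever already sits in the top slot, so even a tiny initial skew between two results compounds into a large gap. The result labels, probabilities, and update rule are illustrative assumptions, not how any real search engine works.

```python
import random

# Toy click-feedback loop: a ranker that "learns" from clicks keeps
# boosting whatever is already ranked first. All labels and numbers
# are illustrative assumptions, not data from the talk.

random.seed(42)
scores = {"result_a": 1.01, "result_b": 1.00}  # result_a starts 1% ahead

for _ in range(5_000):
    ranked = sorted(scores, key=scores.get, reverse=True)
    # Attention bias: assume the top-ranked result draws 80% of clicks,
    # regardless of which result it is.
    clicked = ranked[0] if random.random() < 0.8 else ranked[1]
    scores[clicked] += 0.01  # the ranker updates its score from the click

print(scores)
# Roughly {'result_a': 41, 'result_b': 11}: a 1% head start becomes a
# ~4x scoring gap, because the loop rewards position, not relevance.
```

In this sketch, the only thing separating the two results is which one happened to start on top, which is the sense in which a seemingly objective ranking can quietly entrench an arbitrary or biased starting point.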
