How algorithms rule our world
By Kathleen Parrish
If you think algorithms are neutral bits of data and code devoid of bias, consider that a 2015 study discovered Google’s online advertising system showed ads for high-paying jobs more often to men than women.
Algorithms are written by people and can push forward social bias around identities and groups, Mary Armstrong, professor and chair of women’s and gender studies, explained during her talk “Figuring What’s Hidden: How Algorithms Help Discriminate,” organized by the student Women in Computing group as part of Hidden Figures Week.
Unless you’re in computing or studying informatics, you may not know that we’re all soaked in algorithms, “invisible hands that guide us” that help form our perceptions of the world, she said.
“Algorithms turn data into information,” she said. “The first is a set of items, the other is a form of conclusion, and it generates trends and eventually ‘facts.’ If you get enough facts, you get reality, and if you get reality, you own everything.”
Big data drives decisions about employment, education, housing, health care, policing, “the rules we live by,” said Armstrong. “The question becomes this: What if that data is reflecting the biases and inequities that are already existing?”
Machine learning evolves based on online human behavior, so when you punch a topic into a search engine, it takes you where it thinks you want to go. From there, already biased human “clicking patterns” further reinforce those assumptions.
“A search could have taken you here, here, and there, but it didn’t. That’s incredibly important because it took you away from certain kinds of outcomes and information,” said Armstrong. “As the algorithm learns and starts to get smarter, you get a strong feedback loop. That’s why it’s so important that all of us—from programmers to web users—become actively aware that what’s driving seemingly objective searches can be biased.”
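The feedback loop Armstrong describes can be illustrated with a toy simulation, assuming a simple click-weighted ranker (the result names, starting counts, and user behavior here are all made up for illustration): a small initial bias in which result gets clicked is amplified as the ranker learns from its own biased output.

```python
# Toy search index: two results start out essentially equal.
clicks = {"result_a": 1, "result_b": 1}

def rank(clicks):
    """Order results by accumulated clicks (a click-weighted ranker)."""
    return sorted(clicks, key=clicks.get, reverse=True)

def simulate_user(ranking):
    """Assume users click the top-ranked result (simplified here to: always)."""
    return ranking[0]

# A small initial bias: result_a happens to get a few extra early clicks.
clicks["result_a"] += 3

for _ in range(1000):
    choice = simulate_user(rank(clicks))
    clicks[choice] += 1  # the ranker learns from the biased clicks

print(rank(clicks))  # result_a stays on top; its early lead only grows
print(clicks)
```

Because the ranker treats clicks as a neutral relevance signal, the arbitrary early lead compounds on every iteration: the top result gets seen, so it gets clicked, so it stays on top. Swap “a few extra early clicks” for a socially biased clicking pattern and this is the loop the talk warns about.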