A people's guide to finding algorithmic bias: By all people, for all people, regardless of technical background. (2022). Center for Critical Race + Digital Studies.
https://www.criticalracedigitalstudies.com/peoplesguide.
Algorithms that touch our lives are imbued with the values of their programmers, organizations, institutions, culture, and history. As such, they absorb the biases present in society. Because every stage of development requires human judgment, there are opportunities for society's prejudices against marginalized communities to be incorporated into algorithmic systems at virtually every stage. This guide explains what machine learning algorithms are, what algorithmic bias is, the types of bias, and the effects of these biases. [Site]
Reading selection of the HCC Library's Summer 2023 Algorithmic Bias Reading & Discussion Group
Broussard, M. (2018). Artificial unintelligence: How computers misunderstand the world. The MIT Press.
In this book, the author argues that our collective enthusiasm for applying computer technology to every aspect of life has resulted in a tremendous amount of poorly-designed systems. We are so eager to do everything digitally - hiring, driving, paying bills, even choosing romantic partners - that we have stopped demanding that our technology actually work. The author, a software developer and journalist, reminds us that there are fundamental limits to what we can (and should) do with technology. Making a case against technochauvinism - the belief that technology is always the solution - the author argues that it's just not true that social problems would inevitably retreat before a digitally-enabled Utopia. [Amazon, WorldCat]
Winner of the 2019 PROSE Award in Computing and Information Sciences, 2019 Hacker Prize from the Society for the History of Technology
Broussard, M. (2023). More than a glitch: Confronting race, gender, and ability bias in tech. The MIT Press. https://0-search-ebscohost-com.librus.hccs.edu/login.aspx?direct=true&db=nlebk&AN=3309073&site=ehost-live&scope=site.
Data scientist Meredith Broussard demonstrates in More Than a Glitch how neutrality in tech is a myth and why algorithms need to be held accountable. Broussard, one of the few Black female researchers in artificial intelligence, masterfully synthesizes concepts from computer science and sociology. She explores a range of examples: from facial recognition technology trained only to recognize lighter skin tones, to mortgage-approval algorithms that encourage discriminatory lending, to the dangerous feedback loops that arise when medical diagnostic algorithms are trained on insufficiently diverse data. [Publisher]
FT's Best Summer Books of 2023: Technology; getAbstract International Book Award Winner 2023: Business Impact; 2024 PROSE Award Finalist: Popular Science and Mathematics
Buolamwini, J. (2023). Unmasking AI: My mission to protect what is human in a world of machines. Random House.
Buolamwini explains how we've arrived at an era of AI harms and oppression, and what we can do to avoid its pitfalls. Unmasking AI goes beyond the headlines about existential risks produced by Big Tech. It is the remarkable story of how Buolamwini uncovered what she calls "the coded gaze" -- the evidence of encoded discrimination and exclusion in tech products -- and how she galvanized the movement to prevent AI harms by founding the Algorithmic Justice League. Applying an intersectional lens to both the tech industry and the research sector, she shows how racism, sexism, colorism, and ableism can overlap and render broad swaths of humanity "excoded" and therefore vulnerable in a world rapidly adopting AI tools. Computers, she reminds us, are reflections of both the aspirations and the limitations of the people who create them. [Publisher]
National Bestseller, Los Angeles Times Best Book of the Year
Criado-Perez, C. (2019). Invisible women: Data bias in a world designed for men. Abrams Press. [Paperback version available]
From economic development to health care to education and public policy, we rely on numbers to allocate resources and make crucial decisions. But because so much data fails to take into account gender, because it treats men as the default and women as atypical, bias and discrimination are baked into our systems. And women pay tremendous costs for this bias, in time, money, and often with their lives. The author investigates this shocking root cause of gender inequality. Examining the home, the workplace, the public square, the doctor's office, and more, the author uncovers a dangerous pattern in data and its consequences on women's lives. Product designers use a 'one-size-fits-all' approach to everything from pianos to cell phones to voice recognition software, when in fact this approach is designed to fit men. [Book Jacket]
Royal Society Insight Investment Science Book Prize (2019), Financial Times and McKinsey Business Book of the Year Award (2019)
Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. NYU Press. https://0-search-ebscohost-com.librus.hccs.edu/login.aspx?direct=true&db=nlebk&AN=1497317&site=ehost-live&scope=site. [Print version available]
Noble challenges the idea that search engines like Google offer an equal playing field for all forms of ideas, identities, and activities. Data discrimination is a real social problem; Noble argues that the combination of private interests in promoting certain sites, along with the monopoly status of a relatively small number of Internet search engines, leads to a biased set of search algorithms that privilege whiteness and discriminate against people of color, specifically women of color. Through an analysis of textual and media searches as well as extensive research on paid online advertising, Noble exposes a culture of racism and sexism in the way discoverability is created online. [Publisher]
The author is a MacArthur Foundation Fellow, aka "Genius Award" recipient (2021), and recipient of the inaugural NAACP-Archewell Digital Civil Rights Award (2022)
O'Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Crown. https://0-search-ebscohost-com.librus.hccs.edu/login.aspx?direct=true&db=nlebk&AN=1109940&site=ehost-live&scope=site. [Hardcover & softcover versions available]
We live in the age of the algorithm. Increasingly, the decisions that affect our lives—where we go to school, whether we can get a job or a loan, how much we pay for health insurance—are being made not by humans, but by machines. In theory, this should lead to greater fairness: Everyone is judged according to the same rules. But as former Wall Street "quant," mathematician, and data scientist Cathy O'Neil, PhD, reveals, the mathematical models being used today are unregulated and uncontestable, even when they're wrong. Most troubling, they reinforce discrimination—propping up the lucky, punishing the downtrodden, and undermining our democracy in the process. Welcome to the dark side of Big Data.
National Book Award for Nonfiction Longlist (2016), NYT Bestseller, MAA Euler Book Prize (2019)
Samai, E. E. (2019, March 25). Garbage in, landfill out: How data worship amplifies bias, division, and oppression [Post]. LinkedIn.
https://www.linkedin.com/pulse/garbage-landfill-out-how-data-worship-amplifies-bias-division-samai/.
"What I'm concerned with is the practice of spreading data without interrogating the research machinery behind it, the blind faith, the worship of data. Methods, people, funding, circumstances all affect outcomes of any scientific query, not to mention whether or not the results of a certain study are suppressed or promoted..." [Post snippet]
Schellmann, H. (2024). The algorithm: How AI decides who gets hired, monitored, promoted, and fired and why we need to fight back now. Hachette Books.
Hilke Schellmann, an Emmy Award-winning investigative reporter and journalism professor at NYU, investigates the rise of artificial intelligence (AI) in the world of work. AI is now being used to decide who has access to an education, who gets hired, who gets fired, and who receives a promotion. Drawing on exclusive information from whistleblowers, internal documents, and real-world tests, Schellmann discovers that many of the algorithms making high-stakes decisions are biased, racist, and do more harm than good. Schellmann takes readers on a journalistic detective story, testing algorithms that have secretly analyzed job candidates' facial expressions and tone of voice. She investigates algorithms that scan our online activity, including Twitter and LinkedIn, to construct personality profiles a la Cambridge Analytica. Her reporting reveals how employers track employees' locations and keystrokes, access everything on their screens, and analyze group discussions during meetings to diagnose problems in a team. Even universities are now using predictive analytics for admission offers and financial aid. [Publisher]
Emmy Award-winning author