

Virginia Eubanks

Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor

Nonfiction | Book | Adult | Published in 2018



Chapter 4 Summary: “The Allegheny Algorithm”

The Allegheny County Office of Children, Youth and Families (CYF) in Pennsylvania uses a judgment protocol called the Key Information and Demographics System (KIDS) to process social services requests. The aim of the KIDS system is to identify which indicators for kids in vulnerable situations are connected to child abuse. However, “Many struggles common among poor families are officially defined as child maltreatment, including not having enough food, having inadequate or unsafe housing, lacking medical care, or leaving a child alone while you work” (130).

In the late 1990s, when Marc Cherna took over CYF, “Seventy percent of children in the foster care system were Black, though African Americans made up only 11 percent of the population of Allegheny County” (134). There was huge racial bias inherent in the system. Cherna’s first priority was a comprehensive data warehouse, meant to “increase agency communication and accountability, provide wraparound services for clients, and cut costs” (135). Again, as with Governor Daniels in Indiana, the aim was to funnel public money to private business: In 2012, the county offered a million-dollar contract to build an automated triage data system. The winning proposal, from New Zealand economist Rhema Vaithianathan, was for “a predictive model using 132 variables” whose results were only of “fair, approaching good” accuracy (137). Although academic researchers in New Zealand warned that the model (also in use in that country) “was wrong about nearly 70 percent of the children it identified as at highest risk of harm in the historical data” (138) and neglected to account for inherent bias against Maori families and the poor, Allegheny County awarded the contract anyway.

The new predictive risk tool was called the Allegheny Family Screen Tool (AFST). It was supposed to partner with human call screeners, supporting “human decision-making in the call center. And yet, in practice, the algorithm seems to be training the intake workers” (142). Human decision-making elements are visible in three components of the AFST: outcome variables, predictive variables, and validation data.

Outcome variables measure the phenomenon the model is trying to predict. In this case, these variables were the number of children initially screened out but later screened again, and the number of children initially screened in and later placed in foster care. Predictive variables are data points that correlate with the outcome variables. Finally, validation data is held back and used to test how well a model performs on cases it has not seen. The AFST turned out to be only 76% accurate, “about the same accuracy as a yearly mammogram” (145), which raises questions about its data sources. The reason the data was biased and skewed is immediately apparent: “Allegheny County has an extraordinary amount of information about the use of public programs stored in its data warehouse. But the county has no access to data about people who do not use public services” (147). By never comparing the poor to the non-poor, the dataset’s makers could not teach their algorithm which behaviors were normal across the whole population, thus perpetuating disparities. As a result, as of 2016, “African American children are more than two and a half times as likely to end up in foster care than they should be, given their proportion of the population” (153). A study of Allegheny County CYF found that the racial disparity in the data arose from bias in how the data was collected and applied, not from a true difference in rates of maltreatment.
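To make these three components concrete, here is a minimal, purely illustrative sketch (in Python, with fabricated data) of how an outcome variable, predictive variables, and held-out validation data fit together in a generic risk-scoring model. The feature names, numbers, and model choice are invented for illustration only; this is not the county’s actual AFST, whose data and code Eubanks does not reproduce.

```python
# Toy sketch of a generic predictive risk model: NOT the real AFST.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Predictive variables: hypothetical administrative-record features
# (e.g., counts of prior referrals, benefit-program usage flags).
X = rng.normal(size=(5000, 10))

# Outcome variable: a stand-in for "screened in and later placed in
# foster care" (generated at random here, purely for illustration).
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=5000) > 1.0).astype(int)

# Validation data: held back so the model is scored on cases it never saw.
X_train, X_valid, y_train, y_valid = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# The chapter reports roughly 76% accuracy for the AFST; the score printed
# here reflects only this fabricated data.
print("validation AUC:", roc_auc_score(y_valid, model.predict_proba(X_valid)[:, 1]))
```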

“Poverty profiling” is the assessment of poor people’s morals according to their spending habits and data about “TANF, Supplemental Security Income, SNAP, and county medical assistance” (156). By contrast, information from private sources used by the non-poor, such as nannies, therapists, and rehabilitation centers, could also help predict child abuse in the population above the poverty line, but this data is protected by privacy laws. Eubanks concludes that the poor have no privacy and that their automated monitoring is akin to being in prison. A source familiar with the CYF system seconds that opinion, saying that expunging a CYF record is more complex than expunging a criminal record.

Chapter 4 Analysis

The fourth chapter of Eubanks’s book drives home the point that automated solutions to poverty are just as flawed as the data they are built on. One problem is ownership: by privatizing and diffusing a system’s architecture, governments invite clashing intentions. In Allegheny County, for instance, system author Vaithianathan and her team prioritized academic laurels, the corporations that used the AFST model prioritized their binding contract, and the politicians who proposed the whole thing prioritized burnishing their image with their voter base. The goal left behind was the only one that truly matters: addressing poverty at its roots.

This diffusion of concerns also brings with it a diffusion of responsibility, leaving no one to answer for system failures and biases. Vaithianathan refused to consider negative externalities resulting from her model, dismissively suggesting that her team would address such an unlikely scenario only if and when it occurred. Eubanks criticizes this stance: “the assumption that academics speaking out against the way their research is used will have a significant impact on public policy or agency practice is naive” (173). Similarly, corporations that take on public works contracts find it relatively easy to legally contest their contractual responsibilities for fixing, upgrading, or troubleshooting the systems they create, which is what happened in Indiana. All of this means that even when the results of new automated systems are drastically bad—for example, “only 37 percent of calls [to CYF] that triggered a mandatory investigation were found to have merit” (170)—elites who are disconnected from the impacts of their work never face consequences for its improper implementation or conceptualization.

Eubanks contrasts the severe loss of privacy and rights that is the cost of accessing services (giving up self-determination, freedom, and access to one’s own children) with the extreme secrecy and privacy accorded to the automated systems that run the godlike algorithms: “Once the big button is clicked and the AFST runs, it manifests a thousand invisible human choices. But it does so under a cloak of evidence-based objectivity and infallibility” (167). Another injustice of automated systems is that there is no right to be forgotten, which undermines familial rights and freedoms. Parents who access mental health, drug, or alcohol services pass on their higher AFST scores to their children. This punishes poor individuals, their family members, and even neighbors in the community. It also further entrenches racist bias into the dataset, compounding the problem.
