social defense systems

I started teaching Cathy O’Neil’s book Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy in my class last week. Although O’Neil is a mathematician by training (she goes by the moniker mathbabe online), her book makes a strong case for the importance of social science generally and sociology in particular.

O’Neil looks out at the land of big data and its various uses in algorithms and sees problems everywhere. Quantitative and statistical principles are badly abused in the service of “finding value” in systems, whether through firing bad teachers, targeting borrowers for predatory loans, reducing the risk of employee turnover with models that incorporate past mental health issues, or designing better ads to sniff out for-profit university matriculates. Wherever we look, she shows, we can find mathematical models used to eke out gains for their creators. Those gains destroy the lives of people affected by algorithms they sometimes don’t even know exist.

Unlike treatises that declare algorithms universally bad or always good, O’Neil asks three questions to determine whether we should classify a model as a “weapon of math destruction”:

  1. Is the model opaque?
  2. Is it unfair? Does it damage or destroy lives?
  3. Can it scale?

These questions eliminate the math entirely. By doing so, O’Neil makes it possible to study WMDs by their characteristics, not their content. One need not know anything about the internal workings of a model to attempt to answer these three empirical questions. More than any other contribution O’Neil makes, this opacity-damage-scalability schema, which identifies WMDs as social facts, is what makes the book valuable.

The classification also helped me realize that the failures of many of the WMDs she describes could be mitigated through the application of basic sociological principles. Sociology, I think, offers two defense systems against Weapons of Math Destruction.

Methods to study the disparate impact of mathematical algorithms provide the first defense. Our discipline has demonstrated the importance of diverse development teams as a necessary (though insufficient) condition of fairness in the sense that O’Neil uses it in her book. The tech media – and tech companies themselves – often frame the lack of representation among staff at large tech firms as a problem because it stymies the upward mobility of women and people of color in a growth sector. But O’Neil’s work demonstrates that the downstream effects are just as important. Assuming that “what people want” is equivalent to “what people I know want” (her anecdote about the CEO who didn’t want to see ads for the University of Phoenix was priceless) can ruin the lives of those not included in those discussions. As Lauren Rivera’s work shows, these biases around cultural fit are very difficult to overcome, though she has some useful suggestions for how to do so.

The principles of audit methods could be applied during the development of algorithms to assess whether they end up being unfair. There are already overlaps with the data science world, since audit studies are essentially post-hoc versions of the A/B tests that developers already run, and they could even be implemented as unit tests in the software development process. Sociologists on data science teams, or computer scientists versed in sociology, could relatively easily build profiles, based on nationally representative data, to run against the algorithms and see whether they can, for example, really disentangle creditworthiness from the racial composition of neighborhoods. As an extra step, sociologists could develop “ideal type” profiles likely to confuse algorithms, based on close ethnographic study and representative samples. Again, these could be fed through unit tests to assess how well the algorithms disentangle signal from noise. O’Neil has set up a firm to conduct these audits; I hope she will incorporate the insights of sociology when she does.
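To make the idea concrete, here is a minimal sketch of what such an audit-style unit test might look like. Everything in it is hypothetical: the score_applicant function stands in for whatever scoring model is under audit, the profile fields and paired profiles are invented, and a real audit would draw its pairs from nationally representative data as described above. The test feeds pairs of profiles that are identical on financial signals but differ on a neighborhood proxy through the model and flags any pair whose scores diverge by more than a chosen tolerance.

```python
# A minimal, hypothetical sketch of an audit-style unit test for a scoring model.
# score_applicant, the profile fields, and the tolerance are stand-ins, not any
# real company's model or data.

import unittest


def score_applicant(profile):
    """Stand-in for the model under audit; returns a score in [0, 1].

    This toy version uses only financial signals, so the audit below passes.
    In practice this would be the production scoring function.
    """
    raw = profile["income"] / 100_000 - profile["debt_ratio"]
    return max(0.0, min(1.0, raw))


# Pairs of profiles identical on creditworthiness signals but differing on a
# characteristic the model should not rely on (here, a neighborhood proxy).
PAIRED_PROFILES = [
    (
        {"income": 54_000, "debt_ratio": 0.21, "zip": "60629"},
        {"income": 54_000, "debt_ratio": 0.21, "zip": "60614"},
    ),
    (
        {"income": 38_000, "debt_ratio": 0.35, "zip": "48205"},
        {"income": 38_000, "debt_ratio": 0.35, "zip": "48236"},
    ),
    # ...in practice, many more pairs drawn from representative samples
]

TOLERANCE = 0.05  # maximum acceptable score gap within a matched pair


class DisparateImpactAudit(unittest.TestCase):
    def test_matched_pairs_score_similarly(self):
        for profile_a, profile_b in PAIRED_PROFILES:
            gap = abs(score_applicant(profile_a) - score_applicant(profile_b))
            self.assertLessEqual(
                gap,
                TOLERANCE,
                f"score gap {gap:.3f} exceeds tolerance for "
                f"{profile_a['zip']} vs. {profile_b['zip']}",
            )


if __name__ == "__main__":
    unittest.main()
```

The same pattern would extend to the “ideal type” profiles mentioned above: hard cases identified through ethnographic work would simply become additional entries in the paired-profile list, run automatically every time the model changes.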

If auditing constitutes the first-line sociological defense system, then the application of sociological theory to real-world policy problems related to algorithms could offer a second-line defense. The first-line defense assumes, to some degree, noble intentions: unit-testing software to prevent disparate impact assumes that companies are willing to run such tests internally. It’s hard to imagine that the University of Phoenix or unscrupulous lenders would want to be bound by such rules. Instead, we are left to guess whether companies discriminate in unethical or illegal ways.

Sociologists have a unique opportunity to demand a seat at the table in the discussion of how to regulate these WMDs. The initial conversation will be dominated by tech insiders, data scientists, and mathematicians like O’Neil. But O’Neil clearly shows the social implications of these methods, a topic that I have some confidence in saying we know something about. If we had a seat at the table in the drafting of these regulations (as Zeynep Tufekci should) and we trained our students to understand how these forms of social interaction are structured through algorithmic media, then we could be sure that sociological solutions are included among the inevitable technological or mathematical solutions that will be offered. The applications within sociology, while important, are thus far relatively limited and largely focused on only a few areas, such as financialization.

Speaking of students, we also have the opportunity to train a cadre of professional sociologists who could confront these issues, if we act relatively fast. Our discipline has typically devalued professional training relative to academic training (I have been as guilty as others, though exceptions exist). But I think it would be incredibly valuable for sociologists to work out what a professional degree would look like that could help address the problems of Weapons of Math Destruction.

What an appropriate course of study would look like should be a topic of its own discussion. It seems clear that a training program would require some instruction in computer science, which might take away from traditional topics of research or reduce the amount of methods and theory that could be taught. But if the tradeoff resulted in a cadre of professionals positioned to advise Congress and executive regulatory agencies on the social costs (and perhaps social benefits) of algorithmic interactions, we could influence the world in ways that we have – as yet – been reluctant to do.

Many of us come into this profession and this discipline with the hopes of seeing a better world come about because of our work. Developing these sociological defense mechanisms seems like a way to do it.

Update: I meant to include in the original post the new Master’s program in digital sociology offered at Virginia Commonwealth University, started by, among others, Tressie McMillan Cottom, who reminded me of the program.

5 thoughts on “social defense systems”

  1. Thanks for this post. I agree that social scientists, and sociologists in particular, have a lot to add to this conversation. Right now, though, I’m not seeing the obvious cluster of people in sociology who would come together around such a project. There are sociologists who study the online world, and sociologists are represented at organizations like Data & Society. And a very small handful of sociologists study financial algorithms. But it seems like it would take a pretty big field-building project to make something like this happen (in sociology) — it’s not like all the pieces are in place just waiting for somebody to assemble them. Maybe you disagree?


    1. Beth, I agree that it won’t necessarily be easy to build a coherent field. As Tressie points out (and she is in a much better position to know than I am), there is a nucleus from which I think the discipline can build. I would say two other things in response to your post. First, we spend such a large amount of our time interacting with digital media that we, as a discipline, should figure out how to study these topics if we care about describing and explaining the social world. Second, part of what reading the book made me realize is that one need not necessarily be part of a field of digital sociology. In my own field of housing, one need not become an expert in digital sociology to consider how algorithms could affect housing discrimination. That is part of the real contribution of the book: it turns those algorithms into objects of study that can be investigated as social facts within existing fields of inquiry.


  2. Great post. I have the book on my desk. My colleague Victor Chen just reviewed it in The Atlantic. I’ve not yet had a chance to read it. I have, however, had the chance to write the curriculum for our new master’s program in digital sociology and to co-edit a book of the same title that will hopefully start a conversation building on the issues raised here. While we cannot do it alone (we are growing but small), our department thinks it is ridiculous to assume that sociologists cannot be trained to understand algorithms, write a data-scraping script, or read financial statements while also thinking as critical sociologists. For my money, recent theoretical work by Marion Fourcade and Kieran Healy provides a great model for thinking about algorithmic stratification. Victor Ray, Louise Seamster, and I are thinking through what that looks like empirically for status groups and organizations. We hold conferences at ESS. We do some of the foundational work of discipline-building that, as Elizabeth points out, has to happen. But it is likely true that until someone can review this work, a mainstream journal supports its knowledge production, and the profession acknowledges it as sociology, we are at risk of being absent from important social problems. That is bad for sociology and society.


    1. Tressie, I can’t wait to read your edited volume; I’ll find some time once the craziness of the semester ends. I think your department has done a great job of combining training as critical sociologists with the methods needed to do what is necessary. As someone who has recently rethought a master’s program and, therefore, looked at lots of MA programs, I think your department is way ahead of the curve, and the willingness to think outside a normal program of study will do great things for the discipline and VCU. Not all departments are where your department is at the moment…

