University of Virginia, School of Data Science (interior)
Very happy to report that I’ve moved to the new School of Data Science at the University of Virginia, where I will be a tenured Associate Professor from Fall 2023 onward. I will miss Chicago, but I am very excited to be in a growing program at a thriving institution, and to see what the future brings!
Many folks have asked how this disciplinary move will affect my research. The short answer is that I will still be doing the same historical and humanistic work. I will also be helping build (and define) the growing field of academic data science from within, at one of the country’s leading data science programs. Find me at my new institutional home here.
It was a delight to recently speak to the University of Edinburgh’s Centre for Data, Culture, and Society about my new co-edited volume, Your Computer Is On Fire, with co-editor Prof. Kavita Philip. One of the most important things this volume tries to do is make topics that have too long been seen as the purview of those inside the tech industry approachable to people outside tech, drawing on the latest research from scholars across a variety of fields. You can view the recorded event here, and read more about the book on the MIT Press website.
I recently wrote an article for MIT Technology Review about TikTok’s decision to suddenly change the voice used for the “automatic” voiceover feature. This appeared to be due in part to a lawsuit brought against ByteDance, TikTok’s parent company, by the voice actor who had–unbeknownst to her–provided the voice that ByteDance decided to use on TikTok.
The interesting thing about this change is that it’s not the only time a woman’s voice has been used for a multibillion-dollar company’s product without due credit, or without the voice actor even being aware. This dovetails in an interesting–and illuminating–fashion with the way so many Silicon Valley corporations have recently seemed unable to listen to women’s voices when the call is coming from inside the building, so to speak. In other words, using women’s voices and reaping value from them as a commodity, versus valuing what women have to say, are two very different things. You can read the full article here.
If you’re interested in learning more about the historical background to this kind of gendered dynamic, I was recently on the Tech Won’t Save Us Podcast talking about my research with host Paris Marx. Take a listen on the pod app of your choice or in your web browser here.
I’m happy to announce that a project I’ve been working on for several years as a co-editor has just been released. It’s a book called Your Computer Is On Fire that collects some of the sharpest thinkers working today in the history of technology, STS, information studies, and more to ask the question: how did we get into this mess, and how do we get out?
“The collection of impactful tech issues interrogated over the span of decades in this book makes it recommended reading for anyone interested in the impact of tech policy in businesses and governments, as well as people deploying AI or interested in the way people shape technology,” writes Khari Johnson (VentureBeat). Tamara Kneese, writing for the LA Review of Books, calls it “the book tech critics and organizers have been waiting for.” New Scientist says: “Technology is so embedded in our lives that we can sometimes forget it is there at all. Your Computer is on Fire is a vital reminder not only of its presence, but that we urgently need to extinguish the problems associated with it.”
Each essay looks at a different aspect of our global, interconnected computing landscape, and the effects that our computing tools and infrastructures have on our political, economic, and social systems. With a special focus on labor and workers’ ability to fight back against unethical and overbearing tech, the book sounds an alarm about the dangers of uncritical techno-utopianism. It trains a spotlight on the inequality, marginalization, and biases in our technological systems, showing how they are not just minor bugs to be patched, but part and parcel of technologically deterministic ideas that assume technology can fix—and control—society. After decades of being lulled into complacency by narratives of technological neutrality, people are waking up to the large-scale consequences of Silicon Valley–led technophilia. The essays in Your Computer Is on Fire interrogate how our human and computational infrastructures overlap, showing why technologies that centralize power tend to weaken democracy. These practices are often kept out of sight until it is too late to question the costs of how they shape society. From energy-hungry server farms to racist and sexist algorithms, the digital is always IRL, with everything that happens algorithmically or online influencing our offline lives as well. Each essay proposes paths for action to understand and solve technological problems that are often ignored or misunderstood.
You can hear me talk about the book on the Tech Won’t Save Us podcast, or order the book through your library, your local indie bookstore, or the MIT Press website. You can read a review in the LA Review of Books: “A primer in tech criticism and activism, Your Computer Is On Fire uses the underexamined past to better understand the present and shape the future. The four editors are historians of computing, and the contributors, a quarter of whom are women of color, represent varied academic fields and areas of expertise. The volume coalesces around their collective conviction that the tech world is in a state of emergency. The ‘fire’ of the title, inherent from computing’s inception, has three interconnected meanings: the first is literal, in that computing technology runs hot and eats energy; the second refers to computing’s current crisis; and the third emphasizes fire’s ability to propagate. The fire threatens to destroy our digital worlds from within.”
Recently, I was asked to offer an answer to this question posed by Gizmodo for their “Giz Asks” feature. The original is here (along with the answers provided by several other historians of technology), and I’ve also posted my response separately, below. Given everything that’s going on right now in the United States, with anti-Black racism and police brutality reaching a crisis point, I think that we should use the history of technology to look more critically at the past, especially the stories that get told about technology.
If we are going to look at causes and effects in the history of technology we have to be honest: technology doesn’t simply equal progress. More often it is a way for people to wield power over others, and to intensify and centralize that power. Right now, there’s a huge debate surrounding surveillance and policing: critics worry that more facial recognition and surveillance tools will simply amplify and extend racist policing. There is a lot of historical precedent to support that contention. I tried to use an example from the more distant history of technology to help illustrate the way that seemingly neutral technologies interact with their historical contexts to deepen inequalities and cause real harm. And to show why it’s so important to learn from these histories.
The Cotton Gin And The Expansion of Slavery
Images from Eli Whitney’s 1793 patent application for the cotton gin (original held in US National Archives).
When we think about technologies that have killed a lot of people by accident, we have to think about technologies that have been around a long time, and whose utility has been so great for industrial expansion that their negatives have been overlooked—or, worse yet, intentionally hidden.
The cotton gin, patented by Eli Whitney in 1794 and in widespread use across the US throughout the 19th century, is one such technology. The cotton gin (short for “enGINe”) was a machine that made cleaning and preparing raw cotton much quicker and more efficient—and therefore made the growing of cotton much more profitable.

What the cotton gin also did was make slavery far more entrenched, by making cotton picking by enslaved people in the United States much more profitable. Slavery had not been expanding rapidly until the invention of the gin encouraged more and more white cotton growers to expand their production. White southerners “imported” more than 80,000 Africans as slaves between 1790 and the ban on “importing” enslaved Africans in 1808. Between 1790 and 1850, the number of enslaved people in the US rose from 700,000 to more than 3 million through generational enslavement (chattel slavery). By the start of the Civil War, one third of all Southerners were enslaved people.
This was all in the service of the booming cotton industry that the cotton gin created: the US supplied the vast majority of the world’s cotton by the mid-19th century, and the production of cotton doubled every decade after 1800. When people say that the US economy was built on the backs of enslaved Black people, they are talking about industries like cotton and all the personal and national wealth created at the expense of enslaved Black people’s lives.
Had it not been for the invention of the cotton gin, it is likely that slavery would have been abolished more quickly, instead of expanding massively in a relatively short period of time, as it did. The calculation of deaths that includes enslaved Black people who died en route to the US, and enslaved Black people who died or were killed while in the US, already more than qualifies this technology for a high spot on this list—to say nothing of the widespread misery and pain caused to enslaved people, and the generations of their descendants who have been deprived of their full civil rights as a result.
Right now, we are seeing all too clearly how Black people living in the US today lose their lives as a result of this economic and technological history. White business owners in the South in the 18th and 19th centuries used technology to amplify and extend racism, misery, and death, much in the same way that we see happening with certain technologies today. The goal, then as now, is both profit and power.
So I think this is an important history of technology to keep in mind, because it shows how technologies are always constructed for and by the contexts in which they come into being. If that context is racist, they are likely to uphold racism, because what they do is make the existing economic and social structures stronger and more efficient without regard for existing inequalities. When technologists try to “fix” things with merely technical solutions, they ignore the broader context and how those technologies work within it.
This is one reason why it’s so important for STEM practitioners to learn and know history, and why STEM programs at universities do their students, and all of us, a disservice by not having more humanists and historians. Narrowly technical “advances” that don’t understand the broader context can lead to terrible unintended—but not unforeseen—outcomes. And that isn’t real progress at all.
I recently created and taught two new courses, one on “Diversity in the History of Technology” (fall 2019) and a seminar on “History and Historiography” (spring 2020). See the full syllabi here.
The diversity in technology course is a history of technology course that reorients students’ understanding by balancing the often triumphalist, technophiliac accounts of tech’s past with stories from the margins, and with histories of technologies that center previously ignored or submerged voices and narratives. For instance, we read Professor Deirdre Cooper Owens’s book Medical Bondage, on how white supremacy was part and parcel of the development of gynecological surgery, and we read Prof. Lisa Nakamura’s work on the Navajo hardware manufacturing workers at Fairchild Semiconductor in the 1970s, who protested unfair labor conditions. We read Margot Shetterly’s Hidden Figures and relate this history to the present by reading work by scholars like Prof. Safiya Noble and Prof. Ruha Benjamin on Black producers and users of computing technologies, and on the overlapping systems of oppression that large-scale commercial information technologies rely on and strengthen.
Annie Easley, scientist at NASA. Photo circa 1981, courtesy of nasa.gov
Over the course of the semester, students expand their understanding of what usually “counts” as history of technology, and who usually gets to count within it. The course asks students to think about how oppression and the power relationships inherent in powerful, centralized technological systems have shaped what we think are the best ways to implement technologies today, and how ignoring these factors (or failing to contend with the history that created them) often leads to problematic, myopic strategies for “diversifying” technological products and workforces today.
The historiography course introduces students to the field of historiography—the study of how history gets written. Readings in the course focus on recent, innovative historical works that reconfigure the way histories of certain topics have been written in the past. We read, among other works, Prof. Hazel Carby’s most recent book on empire and family (Imperial Intimacies), Clyde Ford’s latest on race in the history of computing (Think Black), and Lauren Jae Gutterman’s new book on queer history (Her Neighbor’s Wife). Students investigate what went into writing them, think about why certain stories haven’t been written until now, and begin to conceptualize history as a dynamic, changing set of narratives and ideas about the world, rather than simply a static, unchanging record of past events.
In the second half of the course, students also look at how different media (written, oral, visual) influence how histories are constructed and conveyed to an audience, and how the knowledge they create eventually becomes taken for granted as “common knowledge,” after initially being seen as novel or even radical.
My book, Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge In Computing, published by MIT Press, has been awarded the 2019 Herbert Baxter Adams Prize from the American Historical Association. (It also won the 2018 PROSE Award for History of Science, Technology, and Medicine from the Association of American Publishers; the 2018 Sally Hacker Prize from the Society for the History of Technology; the 2018 Stansky Prize from the North American Conference on British Studies; and the 2018 Wadsworth Prize from the British Business Archives Council.)
It is currently available in paperback, audiobook, and e-book. And a graphic novel version may be on the horizon!
Drawing I did in a signed copy of Programmed Inequality for a reader.
The short guide below evolved out of a conversation with Miriam Posner (@miriamkp) of UCLA who was looking for ways to help her students read more quickly and effectively. These tips can help you retain more when reading academic texts and allow you to get through them at a quicker pace.
Here’s what I tell my students if they have trouble keeping up with the reading for my history and STS classes.
This year, at the University of Wisconsin-Madison, I am teaching a new and improved version of my popular course Women in Computing History. It was initially taught at Illinois Tech in Chicago last year, where it garnered some press attention.
Due to the interest the course generated among people beyond the walls of our classroom, I annotated the syllabus with discussion topics and class notes to give a sense of what we did in each class meeting–and what kinds of questions might be useful if you do the readings on your own.
See my syllabus page for the newest version of the course–the old version is still available as well, for all you completionists who might want to look at the details of how the course has changed!
I recently wrote a piece for the Washington Post using history to debunk the infamous “Google Memo” and its contention that women are somehow less innately suited to technical pursuits. Truth is, for a long time women were predominant in the field of computing because technical work wasn’t seen as important. Their disappearance has everything to do with structural discrimination and little to do with “innate” differences.
I was also very glad to get a few mentions in The Guardian. See this (delightfully acerbic) article about memogate in general, and this one that’s specifically about the history of computing’s role in helping us better understand power and (the lack of) diversity in our technological landscape in the present.
Quick note about the latter article–it made a small mistake in the first few lines (read more here and here). Both SUSIE and SADIE were computers; the typist/programmer in the ad was unnamed.
BCL Computer ad from 1967 that talks all about the “typist” that will program your newly-purchased computer for you.