Remembering Professor Halcyon Lawrence

On the last day of Black History Month, and the eve of Women’s History Month, I wanted to take some time to tell you about Professor Halcyon Lawrence, who is a part of both of these histories. She was a brilliant, kind, pathbreaking, and warm colleague, and she died suddenly in late October 2023.

It has taken me a while to make these thoughts public because she was such a special person, and I wanted to do justice both to her and her work in writing about her. I know I have missed many things below, but one other reason I feel it’s important to post this is that, thanks to the boom in “generative” AI, hundreds of random sites started posting fake, ChatGPT-generated obituaries of her right after her death. These were full of misstatements and outright falsehoods, but unfortunately many sounded true enough that well-meaning people intending to memorialize her and spread word of her passing accidentally (and understandably) shared them. In this current moment, the hype around AI-generated informational poison knows no bounds, and unfortunately that means that nothing is off limits.

One of the things that was so important to Halcyon in life was making sure that humanity did not get lost in the course of deploying technology, and that people were centered in any conversation about how machines should work. So I am trying to honor that part of her message with this post that pushes back, in some small way, against the inhumane AI-generated text that is starting to take over the web. If you are a person who knew and admired her, please leave a comment to add your memories of, and experiences with, Dr. Lawrence. (There is a moderation delay as I manually approve comments.)

Dr. Halcyon Lawrence was a good friend and colleague whom I had known for over a decade. She was a graduate of the technical communication PhD program (now defunct) at Illinois Institute of Technology, and at the time of her death was an Associate Professor of technical communication and user design at Towson University in Maryland, having earned tenure early. (A Towson graduate has written a very moving remembrance of her here.)

She was an amazing scholar, and a unique and original thinker who studied how linguistic imperialism embedded itself into speech technologies, and how this issue could be averted and countered by both technical and nontechnical means. At the time of her death, she was one of only a few scholars working on this problem, which affects the majority of English speakers in the world.

She was also the kind of person who let everyone else know how much she valued and appreciated them every time she talked to them, and her good example rubbed off on others. As a result, I always tried to let her know how much I valued and appreciated her every time we talked. I was so thankful for how she had taught me to be more open in that way, particularly when I realized, in retrospect, that we had talked for the final time.

Dr. Halcyon Lawrence delivering a keynote at a conference at the Computer History Museum in Mountain View, California in 2017. Photo by Mar Hicks.

Prof. Lawrence was originally from Trinidad and Tobago, and this deeply influenced the work that she did: she focused on linguistics and technical communication, with a special interest in studying how accent bias in automated technologies opened up new questions and areas of study. She often pointed out that even though the people in the world who speak English with a “nonstandard” accent–in other words, not British, American, or Australian–match or outnumber those who speak with a “standard” one, speech recognition technologies barely take this into account.

As someone who herself often needed to code-switch to be understood by automated systems, she felt this inequitable part of technological “progress” acutely. One thing she always highlighted was that she could use clear speech strategies to communicate with people who didn’t immediately understand her accent (such as speaking more slowly, hyper-enunciating, etc.), but doing these things with machines was rarely as successful. People can meet each other halfway, while machines have often been programmed to refuse to do so. And this, she pointed out, was an intentional, political, technical choice on the part of the largely white and U.S.-based technologists who initially designed and deployed these technologies. If you’d like to read a chapter she wrote on this topic from a book I co-edited, you can read a preprint for free here: Siri Disciplines. Below, you can see a photo of her presenting this work at Stanford, at one of the conferences that led up to the publication of that book.

Dr. Halcyon Lawrence speaks at Stanford in 2017. Photo by Mar Hicks.

Halcyon had the ability to hold audiences rapt when she spoke about her work, both because of what she was saying and because of how she communicated. In the spring of 2023 she gave a talk to my graduate seminar that my students gushed about for weeks afterward. She made her lectures participatory and engaging, always meeting people where they were without giving up where she was coming from. I was so glad to have funding to invite her to come speak to my fall 2023 class as well (via videoconference), and she had been looking forward to it. But when we talked via email just a week before, she knew she was ill and needed time off to rest. She thought she had more time left than just a few short days, so I was deeply saddened and shocked when I heard about her passing. The historian in me recognizes that if there were less structural racism in the medical systems that we rely on in this country, she might well not have ended up in the sudden, dire position that she did.

The memory that seems to resonate for many people who knew Dr. Lawrence is how unusually kind and engaged she was. Halcyon was well known and loved in her field, and many other fields, for her kindness and generosity of spirit, her intellectual fearlessness, and her willingness to mentor, help, and support her fellow scholars. Her razor sharp insights and her dedication to building more inclusive communities and technologies, especially speech technologies, impacted multiple disciplines and powerfully influenced how people thought about the relationships between speech, empire, and technology. Colleagues in her field are in the midst of preparing a special section of Communication Design Quarterly to commemorate her work’s impact, and I greatly look forward to reading it. Even in her lifetime, her work was highlighted in the press for pointing out an important new dimension in the fight against biased and broken tech.

When Halcyon and I were both at Illinois Tech, it was always a treat when she was in her office and free to talk at the same time I had free time. A few minutes of conversation with her could truly light up your entire day. One afternoon, we walked out to the lakeshore in Hyde Park, where she told me about her upbringing in Trinidad and Tobago, surrounded by devices on the cutting edge of computing because of the work her father did. She was always comfortable with technology–just never comfortable with needlessly ceding power and agency to it. She wanted it to be firmly human-centered, and to serve people’s needs better. While still a graduate student, for instance, she collaborated on a project to help the Chicago Transit Authority make the stop announcements on the ‘L’ and on buses more easily understood in high-noise environments. If you have ridden public transit in Chicago, you have probably benefited from her work.

One of my first meetings with Halcyon is the memory that feels most fitting to end with: I will always remember how, when I interviewed for my job at Illinois Tech, she (as a grad student) was the only member of the department who stayed for the final meeting of the day, to talk to me some more as the sun set and the roads iced over on a cold Chicago winter night, instead of leaving early like all the faculty in the department had. I kept letting her know that she didn’t have to stay just to keep me company, but she truly wanted to stay and talk to me. She valued ideas, but valued the people they came from even more. Her intellectual practice was strengthened so much by her approach to people, and I will always remember the impact this had. Over the years, I have often found myself thinking of how she did things when I am trying to do better. It was an honor to have known her and learned from her.

Rest in peace, Professor. You made such a positive impact on so many.

News: Moving Institutions

University of Virginia, School of Data Science (interior)

Very happy to report that I’ve moved to the new School of Data Science at the University of Virginia, where I will be a tenured Associate Professor from Fall 2023 onward. I will miss Chicago, but I am very excited to be in a growing program at a thriving institution, and I can’t wait to see what the future brings!

Many folks have asked how this disciplinary move will affect my research. The short answer is that I will still be doing the same historical and humanistic work. I will also be helping build (and define) the growing field of academic data science from within, at one of the country’s leading data science programs. Find me at my new institutional home here.

University of Virginia, Rotunda

Your Computer Is On Fire in Edinburgh

It was a delight to recently speak to the University of Edinburgh’s Centre for Data, Culture, and Society about my new co-edited volume, Your Computer Is On Fire, with co-editor Prof. Kavita Philip. One of the most important things that I think this volume tries to do is to make topics that have for too long been seen as the purview of those inside the tech industry approachable for people outside tech, using the latest research from a variety of scholars in different fields. You can view the recorded event here, and read more about the book on the MIT Press website.

The voices of women in tech are still being erased

I recently wrote an article for MIT Technology Review about TikTok’s decision to suddenly change the voice used for the “automatic” voiceover feature. This appeared to be due in part to a lawsuit brought against ByteDance, their parent company, by the voice actor who had–unbeknownst to her–provided the voice that ByteDance decided to use on TikTok.

The interesting thing about this change is that it’s not the only time a woman’s voice has been used for a multibillion dollar company’s product without due credit, or without the voice actor even being aware. This dovetails in an interesting–and illuminating–fashion with the way so many Silicon Valley corporations have recently seemed unable to listen to women’s voices when the call is coming from inside the building, so to speak. In other words, using women’s voices and reaping value from them as a commodity, versus valuing what women have to say, are two very different things. You can read the full article here.

If you’re interested in learning more about the historical background to this kind of gendered dynamic, I was recently on the Tech Won’t Save Us Podcast talking about my research with host Paris Marx. Take a listen on the pod app of your choice or in your web browser here.

Your Computer Is On Fire

New book on an old computer.

I’m happy to announce that a project I’ve been working on for several years as a co-editor has just been released. It’s a book called Your Computer Is On Fire that collects some of the sharpest thinkers working today in the history of technology, STS, information studies, and more to ask the question: how did we get into this mess, and how do we get out?

“The collection of impactful tech issues interrogated over the span of decades in this book makes it recommended reading for anyone interested in the impact of tech policy in businesses and governments, as well as people deploying AI or interested in the way people shape technology,” writes Khari Johnson (VentureBeat). Tamara Kneese, writing for the LA Review of Books, calls it “the book tech critics and organizers have been waiting for.” New Scientist says: “Technology is so embedded in our lives that we can sometimes forget it is there at all. Your Computer is on Fire is a vital reminder not only of its presence, but that we urgently need to extinguish the problems associated with it.”

Cover of book--yellow with flames

Each essay looks at a different aspect of our global, interconnected computing landscape, and the effects that our computing tools and infrastructures have on our political, economic, and social systems. With a special focus on labor and workers’ ability to fight back against unethical and overbearing tech, the book sounds an alarm about the dangers of uncritical techno-utopianism. It trains a spotlight on the inequality, marginalization, and biases in our technological systems, showing how they are not just minor bugs to be patched, but part and parcel of technologically deterministic ideas that assume technology can fix—and control—society. After decades of being lulled into complacency by narratives of technological neutrality, people are waking up to the large-scale consequences of Silicon Valley–led technophilia. The essays in Your Computer Is on Fire interrogate how our human and computational infrastructures overlap, showing why technologies that centralize power tend to weaken democracy. These practices are often kept out of sight until it is too late to question the costs of how they shape society. From energy-hungry server farms to racist and sexist algorithms, the digital is always IRL, with everything that happens algorithmically or online influencing our offline lives as well. Each essay proposes paths for action to understand and solve technological problems that are often ignored or misunderstood.


You can hear me talk about the book on the Tech Won’t Save Us podcast, or order the book through your library, your local indie bookstore, or the MIT Press website. You can read a review in the LA Review of Books:
“A primer in tech criticism and activism, Your Computer Is On Fire uses the underexamined past to better understand the present and shape the future. The four editors are historians of computing, and the contributors, a quarter of whom are women of color, represent varied academic fields and areas of expertise. The volume coalesces around their collective conviction that the tech world is in a state of emergency. The “fire” of the title, inherent from computing’s inception, has three interconnected meanings: the first is literal, in that computing technology runs hot and eats energy; the second refers to computing’s current crisis; and the third emphasizes fire’s ability to propagate. The fire threatens to destroy our digital worlds from within.”

Old Macintosh computer with the book inside of it where the screen should be

What technology has killed the most people by accident?

Recently, I was asked to offer an answer to this question posed by Gizmodo for their “Giz Asks” feature. The original is here, (along with the answers provided by several other historians of technology) and I’ve also posted my response separately, below. Given everything that’s going on right now in the United States with anti-Black racism and police brutality reaching a crisis point, I think that we should use the history of technology to look more critically at the past, especially the stories that get told about technology.

If we are going to look at causes and effects in the history of technology we have to be honest: technology doesn’t simply equal progress. More often it is a way for people to wield power over others, and to intensify and centralize that power. Right now, there’s a huge debate surrounding surveillance and policing: critics worry that more facial recognition and surveillance tools will simply amplify and extend racist policing. There is a lot of historical precedent to support that contention. I tried to use an example from the more distant history of technology to help illustrate the way that seemingly neutral technologies interact with their historical contexts to deepen inequalities and cause real harm. And to show why it’s so important to learn from these histories.

The Cotton Gin And The Expansion of Slavery

Images from Eli Whitney’s 1793 patent application for the cotton gin (original held in US National Archives).

When we think about technologies that have killed a lot of people by accident, we have to think about technologies that have been around a long time, and whose utility has been so great for industrial expansion that their negatives have been overlooked—or, worse yet, intentionally hidden.

The cotton gin, patented by Eli Whitney in 1794 and in widespread use across the US throughout the 19th century, is one such technology. The cotton gin (short for “enGINe”) was a machine that made cleaning and preparing raw cotton much quicker and more efficient—and therefore made the growing of cotton much more profitable.

What the cotton gin also did was to make slavery far more entrenched, by making cotton picking by enslaved people in the United States much more profitable. Slavery had been expanding far less rapidly until the invention of the gin encouraged more and more white cotton growers to expand their production. White southerners “imported” more than 80,000 Africans as slaves between 1790 and the ban on “importing” enslaved Africans in 1808. Between 1790 and 1850 the number of enslaved people in the US rose from 700,000 to more than 3 million through generational enslavement (chattel slavery). By the start of the Civil War, one third of all Southerners were enslaved people.

This was all in the service of the booming cotton industry that the cotton gin created: the US supplied the vast majority of the world’s cotton by the mid-19th century, and the production of cotton doubled every decade after 1800. When people say that the U.S. economy was built on the backs of enslaved Black people, they are talking about industries like cotton and all the personal and national wealth created at the expense of enslaved Black people’s lives.

Had it not been for the invention of the cotton gin, it is likely that slavery would have been abolished more quickly instead of massively expanding in the way that it did, in a relatively short period of time. The calculation of deaths that includes enslaved Black people who died en route to the US, and enslaved Black people who died or were killed while in the US, already more than qualifies this technology for a high spot on this list—to say nothing of the widespread misery and pain caused to enslaved people, and the generations of their descendants who have been deprived of their full civil rights as a result.

Right now, we are seeing all too clearly how Black people living in the US today lose their lives as a result of this economic and technological history. White business owners in the South in the 18th and 19th centuries used technology to amplify and extend racism, misery, and death, much in the same way that we see happening with certain technologies today. The goal, then as now, is both profit and power.

So I think this is an important history of technology to keep in mind, because it shows how technologies are always constructed for and by the contexts in which they come into being. If that context is racist, they are likely to uphold racism, because what they do is make the existing economic and social structures stronger and more efficient without accounting for existing inequalities. When technologists try to “fix” things with merely technical solutions, they ignore the broader context and how those technologies work within it.

This is one reason why it’s so important for STEM practitioners to learn and know history, and why STEM programs at universities do their students, and all of us, a disservice by not having more humanists and historians. Narrowly technical “advances” that fail to understand the broader context can lead to terrible unintended—but not unforeseen—outcomes. And that isn’t real progress at all.

Two New Courses

Image from my archival research process, 2016.

I recently created and taught two new courses, one on “Diversity in the History of Technology” (fall 2019) and a seminar on “History and Historiography” (spring 2020). See the full syllabi here.

The diversity in technology course is a history of technology course that reorients students’ understanding by balancing the often triumphalist, technophiliac accounts of tech’s past with stories from the margins, and histories of technologies that center previously ignored or submerged voices and narratives. For instance, we read Professor Deirdre Cooper Owens’s book Medical Bondage, on how white supremacy was part and parcel of the development of gynecological surgery, and we read Prof. Lisa Nakamura’s work on Navajo hardware manufacturing workers at Fairchild Semiconductor in the 1970s, who protested unfair labor conditions. We read Margot Shetterly’s Hidden Figures and relate this history to the present by reading work by scholars like Prof. Safiya Noble and Prof. Ruha Benjamin on Black producers and users of computing technologies, and the overlapping systems of oppression that large-scale commercial information technologies rely on and strengthen.

Annie Easley, scientist at NASA. Photo circa 1981.

Over the course of the semester, students expand their understanding of what usually “counts” as history of technology, and who usually gets to count within it. The course asks students to think about how oppression and the power relationships inherent in powerful, centralized technological systems have shaped what we think are the best ways to implement technologies today, and how ignoring these factors (or failing to contend with the history that created them) often leads to problematic, myopic strategies for “diversifying” technological products and workforces today.

The historiography course introduces students to the field of historiography—the study of how history gets written. Readings in the course focus on recent, innovative historical works that reconfigure the way histories of certain topics have been written in the past. We read, among other works, Prof. Hazel Carby’s most recent book on empire and family (Imperial Intimacies), Clyde Ford’s latest on race in the history of computing (Think Black), and Lauren Jae Gutterman’s new book on queer history (Her Neighbor’s Wife). Students investigate what went into writing them, think about why certain stories haven’t been written until now, and begin to conceptualize history as a dynamic, changing set of narratives and ideas about the world, rather than simply a static, unchanging record of past events.

In the second half of the course, students also look at how different mediums (written, oral, visual) influence how histories are constructed and conveyed to an audience, and how the knowledge they create eventually becomes taken for granted as “common knowledge,” after initially being seen as novel or even radical.

Find both syllabi here.

Good News!

My book, Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge In Computing, published by MIT Press, has been awarded the 2019 Herbert Baxter Adams Prize from the American Historical Association. (It also won the 2018 PROSE Award for History of Science, Technology, and Medicine from the Association of American Publishers; The 2018 Sally Hacker Prize from the Society for the History of Technology; The 2018 Stansky Prize from the North American Conference on British Studies; and the 2018 Wadsworth Prize from the British Business Archives Council.)

It is currently available in paperback, audiobook, and e-book. And a graphic novel version may be on the horizon!

Drawing I did in a signed copy of Programmed Inequality for a reader.

Interview with Dame Stephanie Shirley

I had the great fortune to interview Dame Stephanie “Steve” Shirley for the Computer History Museum in Silicon Valley. A child refugee from Nazi Germany, she went on to found a feminist software startup at a time when few people even knew what software was, and even fewer cared about feminist business models that put women’s needs first. She and her mostly-women employees wrote some of the most important software for 20th century British industry and government–including programming the black box flight recorder for the Concorde. She eventually became a billionaire and now focuses on philanthropy–particularly autism-related causes. She was also the founding donor of the Oxford Internet Institute. Listen and watch here or read the transcript here. The interview was conducted over video link between her home in London and my home in the US. It runs about an hour.

For a much longer and more exhaustive oral history (many hours), check out Dr. Tom Lean’s interview with her for the British Library Oral History “Voices of Science” collection.


SHOT Book Launch

At the Society for the History of Technology Meeting this coming weekend in Philadelphia, MIT Press will be doing a joint launch of my book and Edward Jones-Imhotep’s terrific new volume on a unique set of Cold War technological failures. It runs from 3:30 to 4:30 on Friday, October 27, at the MIT Press table in the SHOT book hall. Discounts, free bookmarks, and snacks will be available! Come on by.