Category: history of computing

MOOCs are doing a head-fake on higher education

Clay Shirky recently did an interesting post on Massive Open Online Courses (MOOCs) and the future of “non-elite” educational institutions and the students they serve: http://www.shirky.com/weblog/2012/11/napster-udacity-and-the-academy/

There are some misunderstandings of higher educational funding models in Shirky’s post, and the first few paragraphs comparing higher education to the demise of the music industry are problematic and likely to get your dander up, but if you stick with it through the end, it has some interesting insights to offer–as do the comments.

Shirky’s argument is that most current critiques leveled against online learning, and especially MOOCs, don’t take into account the reality of our higher educational landscape, but instead focus on comparing elite, top-50 liberal arts colleges and their educational benefits to MOOCs etc., ensuring that the comparison is specious and unhelpful. He cautions that we may be missing the point of where higher education is going in our devotion to an admirable, but limited set of ideals rooted in a rather classist and nationalistic view of how higher education “works.”

Shirky notes: “The possibility MOOCs hold out isn’t replacement; anything that could replace the traditional college experience would have to work like one, and the institutions best at working like a college are already colleges. The possibility MOOCs hold out is that the educational parts of education can be unbundled [from the physical/social ones].” The historian of computing in me can’t help thinking about how unbundling (of software) was also once thought preposterous and largely undoable, but soon thereafter helped to nearly destroy the market dominator–IBM–and completely changed the landscape of computing.

In other words, MOOCs and online learning, intentionally or not, are doing a “head-fake” of sorts on the “traditional academic model” (or rather, the academic model of the top 50-150 colleges and universities in the nation). Many of us working in these institutions may think that online, free courses are competing with what we have to offer, when really they’re setting up a whole new landscape: they’re looking to replace our teaching, learning, and research models without competing on the same terms. This is dangerous, but also exciting. It means that it doesn’t matter if, like IBM, we can say the way we do things is better; what matters is if the way we do things is still as applicable and flexible as the new modes and methods coming into play.

This got me thinking about what more I can do to meet these issues head on in my own academic practice. As Shirky’s post crucially notes–and many who write on MOOCs neglect to remember this–universities are not just about teaching, but about research. Research, and knowledge production, are really the raison d’être of the university, with transmission of that knowledge (teaching and public engagement) being only about half of that equation or less. Right now, research publications are the main way knowledge is produced and disseminated to students, colleagues at other institutions, the government, and the public. It’s fair to say this isn’t the greatest system–many more people could be reached, for instance, than currently are. Soon, it’s reasonable to assume, universities will transition to using tools like MOOCs to disseminate research and new knowledge. In other words, we lack vision when we think of MOOCs as merely low-level teaching tools for getting standardized courses out to anonymous students.

Therefore, for academics to adapt to the coming new models of education, we will need to–perhaps counterintuitively–focus more on research and squeeze teaching into less time: teaching through MOOCs will, quite possibly, become the new way that we get our research out into the wild, taken seriously, and used as part of larger intellectual, social, and economic debates. (Don’t believe me? Think about how radio, TV, and podcasts have all, in historical turn, stood in for reading in serious and major ways.) Just as we effectively give our research publications away “for free” to advance the state of human knowledge now, we may give our teaching and research content away for free via online courses for the same reason. That there don’t currently exist the same economic gatekeepers (Gale, SAGE, IEEE, etc.) for the latter as for the former is of little long-term concern: they will evolve as we begin to transition en masse to new forms and methods of creating and distributing research content.

One thing I am thinking of doing to start to meet this transition head on is to continue my exploration of blended learning models and tools. I’ve used this blog over the course of the last semester to invite–ok, require–students to participate in public intellectual exercises. They’ve engaged in online discussion and written their short “papers” in the comments of this blog rather than writing them on paper, for only me to read, or posting them behind the great wall of Blackboard, where they would effectively be lost after the semester’s end.

Next, I plan to try to use a teaching model that further “flips” the class lecture/discussion model, perhaps by (as some of my colleagues do) recording lectures in advance for students to watch online, and then using in-person class time for something that more resembles a discussion section. Right now, I try to combine both lecture and discussion in the class period, and this only works well some of the time–usually when the students have been diligent in doing the reading (and unfortunately many often aren’t–or they come to class and mentally doze behind laptops even if they’re “prepared”).

The downside of these new tactics is that they will leave less room for error, on my part or on the students’: misunderstandings that could be easily rectified IRL will assume more importance and negative impact when students rely on a non-interactive, time-shifted recording. And this model will require more student prep time outside of class for the average student–not only will they have to do the readings before each class, but they will also have to devote over an hour to watching a lecture. Many will not do both, I am sure. For my part, the enhanced prep time will require me to offer fewer graded assignments, likely reverting to a more traditional model of midterms, finals, and perhaps one or two papers during the term.

But as Shirky points out, to think that this sea-change will present us with a new option equivalent to the old is to misunderstand the whole point of change. And I’m all right with that uncertainty, because these types of problems–and the intellectual stretching required to solve them–are the whole reason I got into this game in the first place.

History of Computing Class: 3rd Blog Comment

What is the (implicit) argument about who or what controls data in the chapter we read from Steven Levy’s In the Plex? And what is the argument of the chapters we’ve read so far from Goldsmith and Wu’s Who Controls the Internet?

Write a comment of 3-4 concise paragraphs (no more than 450 words total) that talks about how these two arguments are actually at odds with each other. How can you bring them into conversation, or alignment, by using your own insights and historical examples from class? If you’re stumped, take a look at this article to help you think through the connections.

THIRD BLOG COMMENT DUE Thursday Nov. 15 by 10pm

Conference Cultures

This year I had the odd fortune to have all three of my major academic conference commitments occur right in a row. I went directly from the Society for the History of Technology in Copenhagen, to the Turing in Context II Conference in Brussels, to the Midwest Conference on British Studies in Toronto.

Although this was a bit grueling, it gave me perspective that I don’t think I’d have gotten if not for the close juxtaposition of conferences. I began to notice things about the conferences’ cultures that made each intellectual environment unique, and I think they can be neatly summed up by characterizing them, in order, as being
1) convivial
2) questioning
3) collaborative
(Say it out loud to appreciate the alliteration.)

1) Society for the History of Technology (SHOT)–> Convivial

Although the name of the conference might sound like we’re stuck in the past, doing hopelessly dry technical histories, that’s thankfully not the case at all. SHOT is a big tent in the best sense of the word, and it welcomes people who work on technology either as their main interest or as a part of their larger constellation of intellectual concerns. The definition of “technology” is as wide as one wishes to make it: last year the paper that got the prize for best new scholarship was on the technology of ballet pointe shoes and dancers’ bodies. The year before it was on British imperial geographic surveying tools. A few years before that, the prize went to a paper on the language surrounding abortion techniques.

Skyline of Copenhagen

Which brings me to another point I only just realized while writing this: women are very well-represented at SHOT. More than you might imagine given the name of the conference. Of the three prize papers mentioned above, all of the presenters were women, and I believe this year’s prize went to a woman as well. The main book prize this year also went to a woman: Eden Medina’s Cybernetic Revolutionaries, which won the computing subgroup’s smaller book prize as well. Perhaps even more important (to me, at least) is that there is not one rigid, right way to perform gender at SHOT: I’m not saying it’s perfect, but I can be myself there more than at other conferences I’ve attended, and still have lots of similar people to talk to.

Overall, this 250-450 person conference generally feels much smaller than it is because there is a huge emphasis on conviviality and on welcoming new members who have a wide range of interests. SHOT was the most welcoming and friendly crowd I encountered early in my career, and it encouraged me to stick around and pay it forward. SHOT strives to expand the scope and perspectives at work in the history of technology as a field, which paradoxically means welcoming folks who don’t necessarily see that as their main field. I think that’s all to the better.

2) Turing in Context II–> Questioning

The outside of the conference venue, the Royal Flemish Academy of Arts and Sciences

This conference was one of several European conferences set up to celebrate the centenary of Alan Turing’s birth by looking at aspects of his work and life. Aside from meeting folks from a variety of fields, from robotics to history, one of the highlights of the conference was the screening of a new docudrama about Turing’s work and, perhaps more importantly, how his work impacted his life. It will have a few limited screenings in the US: Codebreaker.

Perhaps to be expected for an interdisciplinary conference with a high proportion of philosophers and scientists, the mood of this conference was interrogative. Not in a bad way at all, but there was much more critical engagement with and amongst the presenters and the participants. It certainly kept one on one’s toes!

3) Midwest Conference on British Studies–> Collaborative

A subconference of the larger North American Conference on British Studies, the MWBCS gives midwestern scholars of Britain an additional chance to meet and present their work. This was my first time going to the conference, and I enjoyed it greatly.

Snatching a view of Toronto from the airport ferry was about all I could muster at this point

I was struck by the culture of paper delivery at the MWBCS. The emphasis was firmly on reading very eloquent prose that had been committed to paper well in advance (by contrast, a more conversational/explanatory mode of paper delivery reigned at both SHOT and Turing in Context). At times, this paper-reading could get to be a bit much: some presenters, short on time, sped up the reading of their papers to an almost comical pace. But I shouldn’t complain: by the time my 9am Sunday presentation slot rolled around, it was all I could do to just read my paper!

The most interesting thing about my interactions at MWBCS was how collaborative they were: everyone to whom I spoke went out of their way to connect their work to mine, either in theme or in topic. It made for a conference that was both friendly and also extremely useful–I came home with a list of articles and authors to follow up on. My only complaint is that I wish there had been more folks committing their thoughts to twitter: at SHOT, one of the ways I often meet like-minded scholars is through seeing their tweets, responding, and then meeting them in person before the conference closes. At MWBCS, the more formal, paper-note-taking culture of interaction made this unlikely, and in fact, I sometimes wondered if tweeting in a session might come off as rude, whereas at SHOT it is common and expected.

Whew, well, that’s about it. I’d be interested to hear more about your experiences at these conferences, if you went. Check out these two other posts on SHOT 2012 for more perspective:

Laine Nooney

Alex Bochannek

History of Computing Class: Assignment 2

In one concise paragraph, discuss one technical advance that we’ve learned about since Sept. 3 and why that advance was important. It can be a machine, a technique (like a programming technique), or a specific idea. Your response should show us its importance in a broad sense: this is your opportunity to answer the “so what?” question of why a particular historical event matters. Be original and creative: your response should tell us something non-obvious about the advance you choose.

Grace Hopper with the UNIVAC I, from the Computer History Museum’s website:
http://www.computerhistory.org/timeline/?year=1952 Collection reference: 102635875 (Courtesy Gwen Bell)

Please use formal English and write your response as you would a short academic paper. Keep it to one concise paragraph, and make your point as well as you can in that space.

Your comment will not show up right away: I will approve the best 5 or so comments after the deadline.

As noted on your syllabus, your comment is due by 10pm on Thursday, Sept. 20. There will be no credit given for late responses.

Have fun.

History of Computing Class, Assignment 1

Students, comment on this post by writing a three-paragraph response to the following:

So far, we’ve discussed the precursors to electronic computing. What are the three most important things we’ve learned?

Please use formal English and write your response as you would a short academic paper. Include relevant, specific historical details to make your points, but remember to keep it concise: this should only be three paragraphs.

Fanny Bindon Bailey, a clerk at the US Coastal and Geodetic Survey Office, with her Remington No. 2, c. 1909.   From: http://www.officemuseum.com/typewriters_office_models.htm

Your comment will not show up right away: I will approve the comments after the deadline, once everyone’s had a chance to respond, so as not to bias your answers.

As noted on your syllabus, your comment is due by 10pm on Thursday, Sept. 6. There will be no credit given for late responses or technical difficulties (so don’t leave it until the last minute).

Have fun.

Note: If you see bracketed text in the comments, that represents text I added or text I corrected–in other words, alterations from the student’s original essay. My objective in correcting your posts is to make your blog comments a useful learning resource for the class and anyone else on the web who may come across them later. I want to ensure we put as little misinformation out on the web as possible.

Welcome, students.

It’s that time of year again, when we return to the classroom and try to make old lessons new. At least in history. Fortunately, that isn’t hard when you teach history of technology. There’s nothing like a rapidly changing contemporary landscape to put past technological developments into new perspective on a continual basis.

This year I’m doing a new course, somewhat cheekily titled “Disasters!” It looks at technological change through the lens of regulatory and social paradigm shifts caused by disasters: environmental, organizational, medical, and more. It also shows students how to work with historical newspaper sources and databases, because as we discussed in class earlier this week, one of the key defining elements of a disaster is the public perception of an event as such. We are fortunate enough to now have the London Times Digital Archive for this purpose (prior to this year IIT had only very limited newspaper databases–I hope we’ll be able to get the historical New York Times archive sometime soon, despite its expense).

Sketch of early electronic (mostly) computing landscape doodled as a study aid for my students last fall.

I’m also teaching my History of Computing course (aka Computing in History), revamped with new articles and learning activities that incorporate just-opened primary documents from my summer trip to the UK National Archives. Later in the semester you’ll be seeing some blog commentary from my students as part of their class assignments.

In fact–students–this post would be a great time for you to test out leaving a comment. You don’t have to use your real name if you don’t want to, but be sure to pick one handle and stick with it for the rest of the semester. Answer this: how many unread messages do you currently have in your main email inbox? For me, it’s 9,607. Yikes.

Don’t worry if your comment doesn’t show up immediately: I need to approve them.

Digital humanities, tacit knowledge, and (re)making the world in whose image?

This year, I helped set up a digital humanities speaker series for our department, titled Goals and Boundaries in the Digital Humanities. The series will bring in speakers from inside and outside IIT to discuss the current state of the art in digital humanities and explore disciplinary issues associated with the field. The speakers come from many backgrounds–different academic humanities disciplines, library and archive work, computer science, museum studies, design, and public history.

As I was working on it, I ran into some articles that seemed especially apropos given that our speaker series is part of a larger effort to define what we should be aiming for as we try to create a digital humanities program within the department.

‘‘Another Bobby Pin.’’ This cartoon appeared in the British Tabulating Machine Company magazine Tabacus in January 1957 (p. 13). (Courtesy of the British Tabulating Machine Company)

The first looks at the implications of tacit knowledge and the “commonsense” divisions thrown up between being, thinking, doing, and discourse. It struck an especial chord with me because of what I work on–in addition to the implications of race and privilege the author points out, there is a subtly gendered order at work here as well. Framing the debate as being between those who do (hackers) and those who can only sit on the sidelines and talk (yackers) implicitly leverages a long history of gendered categories–from those surrounding masculine professional expertise, to those enabling and privileging amateur tinkering.

The second article is a response to the first that hits on many of my concerns, and additionally points out how queer and postmodern analysis may in fact be deprecated by the hack ideal. I like how this piece encourages us to think about the hidden issues at work in the creation of canonical knowledge in the digital humanities: what exactly do we know as we’re trying to create new knowledge? And how does this old knowledge in fact predetermine much of the new?

I’m sure we’ll discuss these issues (and more) as we proceed through our seminar series. Anyone at IIT (and other local universities) is welcome to attend. The first meeting is on September 20th, in the Siegel Hall room 218 conference room.

Bletchley Park’s Colossus Rebuild

This summer, while doing research at the UK National Archives, I was fortunate enough to be able to take a side-trip to the Bletchley Park historical site in Milton Keynes.

The site of some of the most important codebreaking of World War II, Bletchley Park now functions as a museum of early British codebreaking and computing. A dedicated team there has painstakingly constructed a working model of the Colossus, the 1,900-vacuum-tube behemoth designed and built by London Post Office engineer Tommy Flowers in 1943 to speed “Tunny” codebreaking operations using codebreaker William Tutte’s statistical method. The Colossus rebuild is a sight to behold–and hear–as you can see from the video below:

Though Turing and his electromechanical Bombes get a lot of credit for wartime codebreaking successes (there is a working Bombe rebuild at BP too), it was Flowers’s Colossi that sped up British codebreaking to the point of maximal utility. They were the first digital, programmable computers to harness the speed of electronics for time-sensitive, mission-critical work. His team produced 10 of the massive machines between late 1943 and the war’s end, frantically working out of a factory-like workshop in Birmingham after building the first Colossus at the Dollis Hill Research Station in London. Colossus II, installed just days before D-Day, was so critical to the success of the D-Day landings that (as B. Jack Copeland reports) the operators had to keep the machine running despite the floor being flooded–they put on thick rubber boots so that they wouldn’t be electrocuted.

You can see an image of Women’s Royal Naval Service members working on a Colossus here. (From I. J. Good, D. Michie, and G. Timms, General Report on Tunny With Emphasis on Statistical Methods, 1945, 332, HW 2/25, TNA). The workers have been identified as Dorothy Du Boisson (left) and Elsie Booker (right) by historian B. Jack Copeland.)

While I was at the exhibit, I overheard a father trying to explain to his young son what the Colossus was, in terms he thought the boy might understand. Brandishing his iPhone, he said: “See, this phone is millions of times more powerful than that big computer.”

Call me back when your iPhone wins a war, I thought.