Toward Algorithmic Criticism

“How can computers make our lives better?” is a question asked routinely by computer scientists, engineers, and the enigmatic entrepreneurs who bankroll the digital world. The answers they arrive at often point to monumental changes: after all, the personal computer, the cell phone, and any number of other technological achievements have ushered in new epochs. Digital humanists ask similar questions about how technology can transform the humanities. In Reading Machines: Toward an Algorithmic Criticism (University of Illinois Press, 2011), Stephen Ramsay asks: “How can computers make literary criticism better?” His answer provides insights into using technology that are not limited to the field of literary criticism but are applicable to the humanities more broadly.

Reading Machines posits that many of us are already practicing algorithmic criticism, which Ramsay defines as “human-based criticism with computers.” Many of the ways in which we research, organize our thoughts, and put them into words happen as a result of forms of algorithmic criticism that blend seamlessly into what we have always called research: for example, using an online library catalog, or following a trail of hyperlinks down a rabbit hole.

Ramsay’s text is a call for us to do these tasks more mindfully. This is a two-pronged project. The first principle of algorithmic criticism is to be aware of the ways we are already using technology in our scholarship. The second is to expand our imagination about the ways we could use it. Ramsay introduces several computing methods and applications that can, in a sense, read text in a manner that may offer new lines of thought for critics. These include textual analytics, such as word frequencies across texts, as well as tools that can analyze text for tone and subject, among other characteristics. Ramsay envisions a world where these technologies are as commonplace as the index card, informing our research and analysis as profoundly as the humble keyword search.
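To make the idea of machine “reading” a little more concrete, here is a minimal sketch of the simplest technique mentioned above, comparing word frequencies across texts. It is my own illustration in Python, not an example drawn from the book, and the sample texts and names are placeholders.

```python
# A minimal sketch of comparing word frequencies across two texts.
# The snippets below are placeholders, not passages from the book.
from collections import Counter
import re

def word_frequencies(text):
    """Count lowercase word tokens in a string."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(words)

# Hypothetical snippets standing in for two full texts.
text_a = "The sea was calm, and the sea was wide."
text_b = "The moor was dark, and the wind was cold."

freq_a = word_frequencies(text_a)
freq_b = word_frequencies(text_b)

# Words that appear in one text but not the other -- the kind of surface
# pattern a critic might take as a starting point for interpretation.
only_in_a = sorted(set(freq_a) - set(freq_b))
only_in_b = sorted(set(freq_b) - set(freq_a))

print("Most common in A:", freq_a.most_common(3))
print("Most common in B:", freq_b.most_common(3))
print("Distinctive to A:", only_in_a)
print("Distinctive to B:", only_in_b)
```

Nothing in this sketch interprets anything; it simply surfaces patterns a human critic can then read against, which is close to the division of labor Ramsay has in mind.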

The tools and techniques that Ramsay describes have a wide range of applications in the humanities and in any field that analyzes text. Advertisers are already using the same techniques to glean data from social media posts to better target consumers. Many of these techniques fall under the quickly growing branch of computer science called natural language processing, which is often described as one of the end goals of artificial intelligence.

Ramsay asks the reader not only to imagine the possibility of computer-based humanities but also to acknowledge that we, historians included, are already doing it, and can do it better. He writes, “Algorithmic criticism looks forward to when we will have understood computer-based criticism to be what it has always been: human-based criticism with computers” (81). Browsing digital archives and libraries that are tagged and hyperlinked, and, as Ramsay himself notes, even the keyword search, are all forms of practicing algorithmic criticism. A large part of what Ramsay proposes is that we do what we are already doing in more mindful, constructive ways.

Ramsay’s work is nothing short of groundbreaking. Two of the books I will be reviewing next in this series directly trace their impetus back to Reading Machines, applying Ramsay’s ideas to the study of history. As a historian, I see much of Ramsay’s vision as being in line with traditional historical methods. This kind of reading is analogous to methods historians already employ to analyze and pull meaning from large swathes of sources. Ramsay’s impact on history as a discipline is that algorithmic criticism lets us, as historians, ask new questions of new sources with the tools that computers offer.