The following is a brief, informal response to Noah Wardrip-Fruin’s The Prison-House of Data from the perspective of a PhD student in Digital Media Studies.
From August 2010 to August 2011 I worked for a software company that specialized in pattern recognition and data mining. The company’s objective was to produce software that collected and processed data in order to make factories more efficient. As a tech writer and web designer, I was tasked with explaining this process to a range of audiences. But no matter who I was speaking to, I always had to make it clear that the data-processing software was a supplement, useful only to a highly trained specialist. The role of the software, in other words, was simply to decrease the noise, enabling the specialist, over time, to identify information-rich patterns. Once a pattern was identified, it could be encoded into the system for future, automated detection.
Does that description sound familiar? Perhaps not, but it’s proven eerily similar to the characterization of the (digital) humanities in the mainstream press. Take for instance a recent article in The Wall Street Journal, which opens with the question: “Can physicists produce insights about language that have eluded linguists and English professors?” The article proceeds to detail ‘Culturomics,’ a fledgling field described as “the application of data-crunching to subjects typically considered part of the humanities.” This sets up a rather depressing comparison: the humanities (minus the digital) are essentially data-mining specialists of the kind mentioned above, only without the software. Put another way, the humanities are like ENIAC in relation to modern PCs; they are performing today’s tasks with yesterday’s tools. The corollary, then, is that the digital humanities realize their full potential only by transforming themselves into software-enabled pattern-recognizers.
Noah Wardrip-Fruin follows up on this characterization in his recent Inside Higher Ed piece. In this case, it’s comments from within the humanities, made by Stanley Fish in The New York Times, that drive the debate:
“Stanley Fish wrote…that digital humanities is concerned with ‘matters of statistical frequency and pattern,’ and summarized digital humanities methodology as ‘first you run the numbers, and then you see if they prompt an interpretive hypothesis.’”
The article closes with Wardrip-Fruin proclaiming that “digital humanists must begin by recognizing and developing important areas of work, already part of the field’s history, that such conceptions marginalize.” Nevertheless, he concedes that a portion of humanities work has always involved data-mining, though that role has been transformed by computational advances:
“Certainly a grasp of data — the historical record, our cultural heritage — is a great strength of the humanities. But in the digital world, the storage, mining, and visualization of large amounts of data is just one small corner of the vast space of possibility and consequence opened by new computational processes — the machines made of software that operate within our phones, laptops, and cloud servers.”
As an early exemplar of where digital humanities should stake its claim, Wardrip-Fruin puts forward Phil Agre’s Computation and Human Experience as a text that “serves a primary humanities mission of helping us understand the world in which we live, while also helping reveal sources of recurring patterns of difficulty for computer scientists working in AI” [italics mine]. In other words, the digital humanist can help build a better machine.
And here’s where things get problematic for me. As a PhD student in digital media studies, I feel that such a role, defined (apparently) in order to provide some degree of relevance to the digital humanities, casts me as a glorified beta-tester. This is something I’ve been struggling with since I started my PhD. And so I’ve used the handful of conferences I’ve attended to criticize texts in the digital humanities whose aim seems to be largely to expose such ‘patterns of difficulty’ present in new media projects.
Yes, it’s true, for example, that VR systems do not take into consideration embodied aspects of human experience (Hansen’s Bodies in Code). But it’s also true that in two thousand years of Western literature, no texts were as ‘mimetic’ as those of the Modernists. Erich Auerbach exhaustively details this in Mimesis. However, at no point during that historical cataloging (data-mining, if you will) did Auerbach question mimesis (the recurrent pattern), either as an ideal or as a phenomenon.
Mimesis was, in a number of ways, composed entirely within a closed system,1 centered around a term that is never defined, perhaps because, once examined, it would have collapsed. Instead, it’s used as the perspective from which all of Western literature is judged. And yet mimesis (the concept) is the very unquestioned signifier that Derrida, Foucault, and the ensuing postmodernist texts would proceed to problematize. Mimetic in relation to what? From whose perspective: the synaesthete’s? The blind’s? The deaf’s? It becomes clear that mimesis for Auerbach is largely a visual term, for those texts that lack visual similitude to reality are quickly discredited. Oddly enough, VR, with its own mimetic ambitions, has recently been criticized for focusing too much on the visual spectrum. But we are at risk of overlooking the moral of the postmodernist tale: no single encoding mechanism can achieve a perfect, synchronized mapping with ‘reality.’
So while under the aegis of digital humanities we help debug AI programs, and isolate omissions in VR systems, and occasionally beta-test computer-science research projects, it’s worth asking: to what end? To build a better, more accurate simulation?
To reiterate, Wardrip-Fruin concludes his article by stating that “digital humanists must begin by recognizing and developing important areas of work, already part of the field’s history, that such conceptions marginalize” [italics mine]. Is it not part of our history to recognize patterns so that we do not replicate them? Put differently, the question shouldn’t be ‘How can the digital humanities remain relevant?’ but rather ‘How can the digital humanities demonstrate the irrelevance of uncritical media (research)?’
In this, I can only speak for myself and for what I wish to pursue and achieve in my career. That includes the desire to be a countervailing force that follows from a firm belief that no matter how complex the encoding mechanism, the system will always fall short. And so I choose to characterize my role, as someone entering the digital humanities, as a critic and not as a product tester.
I’ll conclude by turning to the beginning of Wardrip-Fruin’s article. In those opening paragraphs he laments that the digital humanities don’t have a seat at the table set for computer scientists and digital artists. For me, the absence is obvious; it’s because we simply don’t belong at that table. We’ll be seated at the next one over, eavesdropping on their conversations from time to time. And when we meet up at the after-party, we’ll tell them how they’re characterizing their work and how that resembles patterns in other fields, perhaps pointing out that the visualization on the screen/lens cannot replace the embodied aspects of experience (Hansen), not to divert research into embodiment (a current trend) but to point out the limitations of all media.
To invoke another metaphor, our role is two steps back from the screen, watching them watching it. The difference here is between recognizing patterns for the system (‘How can we enrich the viewer’s experience?’) and recognizing patterns of a pattern-recognition system (‘What is it about this medium that captures the attention of this audience?’ ‘What aspects of experience have been filtered out, and how are the viewers compensating for such an absence?’). The former asks us to help fine-tune a medium, as though our participation in the conversation could perfect the VR system. The latter asks us to consider a question far better suited to the humanist: What kind of culture would produce such media?
1. Auerbach, largely confined to Istanbul due to the Nazi regime in Germany, wrote much of the book from the limited selection of texts available at Istanbul State University. In fact, Auerbach credits the existence of Mimesis to the ‘lack of a rich and specialized library.’ On this basis, he operated much like a data-processing program — a closed system that analyzes only that data properly encoded for it to process, autopoietically consuming its output as input.