I’ve read a number of articles lately that posit “digital humanities” as a coherent intellectual movement that makes strong, scary normative claims about the proper future of the humanities as a whole.
Adam Kirsch’s piece in The New Republic is the latest of these; he constructs an opposition between a “minimalist” DH that simply uses computers to edit or read things as we have always done, and a “maximalist” version where technology is taking over English departments and leveling solitary genius in order to impose a cooperative but “post-verbal” vision of the future.
I think there’s a large excluded middle in that picture, where everything interesting actually happens. But I’m resisting — or trying to resist — the urge to write a blog post of clarification and explanation. Increasingly, I believe that’s a futile impulse, not only because “DH” can be an umbrella for many different projects, but more fundamentally because “the meaning of DH” is a perspectival question.
I mean it’s true, objectively, that the number of scholars actually pursuing (say) digital history or game studies is still rather small. But I nevertheless believe that Kirsch is sincere in perceiving them as the narrow end of a terrifying wedge. And there’s no way to prove he’s wrong about that, because threats are very much in the eye of the beholder. Projects don’t have to be explicitly affiliated with each other, or organized around an explicit normative argument, in order to be perceived collectively as an implicit rebuke to some existing scheme of values. In fact, people don’t even really get to choose whether they’re part of a threatening phenomenon. Franco Moretti hasn’t been cheerleading for anything called “digital humanities,” but that point is rapidly becoming moot.
I’m reminded of a piece of advice Mark Seltzer gave me sixteen years ago, during my dissertation defense. Like all grad students in the 90s, I had written an overly long introduction explaining what my historical research meant in some grander theoretical way. As I recall, he said simply, “You can’t govern your own reception.” A surprisingly hard thing to accept! People of course want to believe that they’re the experts about the meaning of their own actions. But that’s not how social animals work.
So I’m going to try to resist the temptation to debate the meaning of “DH,” which is not in anyone’s control. Instead I’m going to focus on doing cool stuff. Like Alexis Madrigal’s reverse-engineering of Netflix genres, or Mark Sample’s Twitter bots, or the Scholars’ Lab project PRISM, which apparently forgot to take over English departments and took over K-12 education instead. At some future date, historians can decide whether any of that was digital humanities, and if so, what it meant.
(Comments are turned off, because you can’t moderate a comment thread titled “you can’t govern reception.”)
Postscript May 10th: This was written quickly, in the heat of the occasion, and I think my anecdote may be better at conveying a feeling than explaining its underlying logic. Obviously, “you can’t govern reception” cannot mean “never try to change what other people think.” Instead, I mean that “digital humanities” seems to me a historical generalization more than a “field” or a “movement” based on shared premises that could be debated. I see it as closer to “modernism,” for instance, than to “psychology” or “post-structuralism.”
You cannot really write editorials convincing people to like “modernism.” You’d have to write a book. Even then, understandings of the historical phenomenon are going to differ, and some people are going to feel nostalgic for impressionist painting. The analogy to “DH” is admittedly imperfect; DH is an academic phenomenon (mostly! at times it’s hard to distinguish from data journalism), and has slightly more institutional coherence than modernism did. But I’m not sure it has more intellectual coherence.