What is “Digital History”? Is it a field of study, a genre, a methodology, a promise? Can it be defined? What is it not? How can I, a young graduate student studying “Digital History,” succinctly define this minor field to future employers? How well does the title chosen for this minor field, “Digital History: Theory and Practice,” articulate and represent what I will learn and produce? These questions run through most of this week’s readings, which discuss the origins of humanities computing, the advantages and disadvantages of employing digital media in historical research, and the elusive definition of the term itself. Defining Digital Humanities: A Reader, edited by Melissa Terras and Edward Vanhoutte, even states that no universal definition exists; that text alone collects numerous articles and debates among scholars over time, each offering its own evolving definition. “Digital History” encompasses both a field of study and a methodology, and it is dynamic, collaborative, fluid, and generative.
Digital history is a field of study, as numerous undergraduate and graduate programs offer majors, minors, and certificates in some form of “digital humanities.” As discussed in “Interchange: The Promise of Digital History,” these courses and programs focus on the theory of digital media within humanities disciplines and strive to emphasize the new narrative forms, non-linearity, and new arguments that digital media can create for varied audiences. They tend to focus less on the concrete technology, though some scholars want programming to be an essential element. The article “Grounding Digital History in the History of Computing” stresses programming’s ability to give historians code tailored to their own questions. George Mason requires its doctoral students in history to complete two courses in digital media and history, the Clio Wired sequence: the first introduces students to theory, and the second teaches the practice of web design using HTML and CSS. Together, these courses strike a good balance between theory and practice. Digital history can and does encompass both the theory of technology and history and the digital tools themselves.
Digital history is also a methodology, a way to tackle historical questions and problems. Cameron Blevins best communicated digital history’s ability to create new historical arguments in his article “The Perpetual Sunrise of Methodology.” He argues that historical scholarship needs to focus on using digital methods to actively present new historical arguments, not merely on promoting their potential to do so. This is an important aspect of digital history that George Mason’s Clio Wired courses introduced to me in the first year of the program: What can you do with digital media and history that you cannot do without it? How does digital media unveil new historical arguments, or generate new historical questions? Blevins clearly shows how this is possible with his analysis of nineteenth-century Houston newspapers. The script he wrote to identify geographic terms revealed a regionally based imagined geography rather than the nationally based one that traditional scholarship assumed. He set a new historical argument in motion.
Digital history also involves more than just historians; it is a collaborative effort involving museum curators, librarians, and computer programmers, among others. At the same time, it allows historians to learn and practice those skills themselves. Nearly all of the digital projects and articles from this introduction to digital history feature multiple authors, collaborators, and editors. Andrew Prescott argues more than once that digital humanities must sustain collaboration between scholar, curator, and technician. Digital history thus involves collaboration across professions and institutions of knowledge. It promotes new historical scholarship, shows the non-linear complexities and multiple perspectives of the past to a variety of audiences, and prompts new historical questions.