Text analysis is just one of many methods at the disposal of digital humanists. With computational methods, scholars can perform "distant reading" (Franco Moretti) or "macroanalysis" (Matthew Jockers) of large collections of text, such as a corpus of 2,958 19th-century British novels. Robert K. Nelson, a recent visitor for the "Computing and the Practice of History" program, used topic modeling to uncover linguistic themes in the Richmond Daily Dispatch during the U.S. Civil War. Cameron Blevins, a PhD student at Stanford's Center for Spatial and Textual Analysis, combined textual and geospatial analysis tools to work with several decades of Houston newspapers. Blevins was able to visualize the "imagined geographies" of the world as seen from Houston, and how those geographies changed over the course of the 19th century (his interactive data visualizations are embedded in his methodological essay, "Mining and Mapping the Production of Space: A View of the World from Houston").
Scholars from a variety of disciplines, including sociology, rhetoric, history, and literature, use a similar set of tools to explore and analyze collections of text. The newly formed Text Analysis Working Group (TAWG) will meet at the D-Lab, bringing together researchers from these disciplines to explore and discuss these methods. Members will work with a diverse set of texts, including tweets, news articles, sermons, novels, and open-ended survey responses. Group activities will include tutorials, hackathons, invited speakers, discussions of scholarship, group work, collaborations, and more.
TAWG will meet every other Tuesday from 4 to 6 PM at the D-Lab. The group's next meeting is on December 9th. Please contact the group facilitator, Nick Adams, to be added to the mailing list.