The Information Age has provided us with both a flood of measurable data and a variety of new tools for analyzing and presenting that data. This course considers how the analysis and visualization of information through digital technologies have changed the way we look at our world, both within the academic community and in society at large. The data analysis portion of the class begins with the ontologies of data and metadata and then addresses techniques such as distant reading, topic modeling, text encoding, and text analysis. The visualization portion interrogates what it means to visualize an argument, combining critique of and experimentation with the timelines, maps, infographics, charts, and time-based experiences used as alternatives to textual explanation. Alongside practical work with digital tools for analyzing and visualizing humanities data, such as Voyant, Google Ngrams, CartoDB, Omeka Neatline, Topic Modeling Tool, and Agisoft Photoscan, the course includes readings by authors such as Manovich, Tufte, Moretti, Ramsay, Presner, and Grafton.
This is one of two non-sequential survey courses in the Digital Humanities (the other, DH: Collections and Connections, is usually offered by the Center for Experimental Humanities in the fall). Both consider questions and technologies fundamental to modes of academic inquiry made possible by new media and computational methods. While the two courses cover different sets of technologies and digital practices, both consider how we make our work public via digital platforms that offer rhetorical and design flexibility in making intellectual arguments.
LaGuardia Co-op Mac Lab
Prof. Kimon Keramidas
Center for Experimental Humanities
Office Hours: Tuesday 2-4