Jeff Drouin


I am a Post-Doc in Digital Humanities and Visiting Assistant Professor of French at the University of Illinois at Urbana-Champaign. I recently completed a Ph.D. in English at the Graduate Center of the City University of New York, with a dissertation on James Joyce, the new physics, and modernist print culture. I am currently developing an electronic research environment for Marcel Proust's À la recherche du temps perdu that will involve text mining, social network analysis, and visualization. I am also drawing plans to write a grant for a social network analysis of avant-garde publishing networks in conjunction with the Modernist Journals Project.

  • Network Analysis and Lit Crit


    I’ve recently gotten interested in network analysis (using ORA, Gephi, and other tools) to support research and criticism of both large corpora and single works. So I’d be interested in a session on the use of quantitative visualization for text-based research and discovery. I’m a literature scholar and digital humanist with my feet in both the print and digital realms; my main research areas are Joyce, modernist periodicals (with an interest in digital infrastructures like the Modernist Journals Project), and Proust. Here are some of the topics I’d be interested in presenting on or discussing:

    • The transition from statistical analysis to humanistic interpretation: methodological and/or theoretical problems.
    • Network representation of a single work: What can it really tell us?
    • Since most network analysis applications are designed for corporations and sociologists, are there any that are particularly useful for humanists? If not, what would one look like? And what does this say about the nature of digital humanism?
    • Digitization of print culture and dissemination platforms in the shadow of the Google Monster: Should archival projects like the MJP and others (perhaps at UVA?) start focusing more attention on developing and using interpretive tools built on Google data (and on improving Google data, which is usually not up to snuff for academic research), as opposed to digitizing materials ourselves? This is likely to be a thorny issue in applying for grants, since funding institutions like the NEH, NSF, and Mellon will probably start giving out less money for digitization efforts. Maybe some strategy speculation would be of interest here.
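
    To make the single-work question above a bit more concrete, here is a toy sketch of one common approach: a character co-occurrence network, where nodes are characters and edge weights count how often two characters appear in the same paragraph. The character names and paragraph groupings below are invented for illustration (not drawn from any real text), and the code uses only the Python standard library; a weighted edge list like this could be exported for visualization in Gephi or ORA.

    ```python
    # Toy sketch: character co-occurrence network for a single work.
    # The "paragraphs" below are invented sets of character mentions.
    import itertools
    from collections import Counter

    paragraphs = [
        {"Bloom", "Stephen"},
        {"Bloom", "Molly"},
        {"Stephen", "Mulligan"},
        {"Bloom", "Stephen", "Molly"},
    ]

    # Edge weight = number of paragraphs in which a pair of characters co-occurs
    edges = Counter()
    for chars in paragraphs:
        for pair in itertools.combinations(sorted(chars), 2):
            edges[pair] += 1

    # Weighted degree gives a crude centrality measure per character
    centrality = Counter()
    for (a, b), weight in edges.items():
        centrality[a] += weight
        centrality[b] += weight
    ```

    Even this crude measure raises the interpretive question at issue: a high weighted degree identifies a structurally central character, but whether that structural centrality tells us anything a careful reader didn’t already know is exactly what a session could debate.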

    Aside from network analysis, I’m also intrigued by the current crisis in the humanities, with the closure of foreign language and theater departments, etc., and the responses that have come from scientists and other humanists. It would be interesting to have a discussion about the role that digital humanities plays and could play in the midst of these battles. Not sure if we need to devote a session to it, but informal discussions could be stimulating.
