Director of Digital Technologies
Center for Digital Scholarship
Brown University Library
Linked data, programming for digital humanities, digital repositories and library systems
Digital Humanities Librarian at Brown University
Data modeling; database architecture
The Early American Foreign Service Database (EAFSD)
Women Writers Project
Markup languages, programming for digital humanities projects, XML technologies
Syd Bauman works at the WWP on modeling texts in XML. He also toys around with modeling other kinds of data (personnel records, medical charts, financial transactions, etc.), mostly in XML, but sometimes in relational databases or spreadsheets.
Professor for French, Francophone and Italian Linguistics
University of Leipzig
Elisabeth Burr holds the chair of French, Francophone and Italian Linguistics in the Department of Romance Studies of the University of Leipzig. Her research and teaching focus on corpus design, corpus markup, and the corpus-based investigation of language use (language varieties, gender, linguistic categories, multilingualism); the relationship between technological developments, the status of languages, and the identity construction of speakers; the (conscious) creation of norms via tools such as grammars, dictionaries, and discourses about languages; multilingualism and multiculturalism in digital, urban, and scholarly spaces; and the conceptual changes which information models bring to humanities research and teaching. She also founded the European Summer School “Culture & Technology”, which aims to bring together young scholars from the Humanities, Engineering, and Information Sciences in order to create the conditions for future project-oriented collaboration and networking across the borders of the individual disciplines in a Digital Humanities perspective.
King’s College London
Literary studies, theory of text encoding, digital editing, digital humanities
Paul Caton is a Research Associate in the Department of Digital Humanities at King’s College London. His current work includes modeling entity types and relations for the digital version of the new Cambridge Edition of the Works of Ben Jonson, and creating an ontology of manuscripts and related phenomena for the DigiPal project.
Berlin-Brandenburg Academy of Sciences and Humanities
History, practical data modeling, data models for complex multi-level encoded texts
Alexander Czmiel studied Digital Humanities, Egyptology, and History at the University of Cologne, Germany. Since 2005 he has worked at the TELOTA initiative (“The Electronic Life of the Academy”) at the Berlin-Brandenburg Academy of Sciences and Humanities, where he is responsible for consulting on and realizing projects in the field of Digital Humanities.
Women Writers Project
Julia Flanders directs the Brown University Women Writers Project, where her work focuses on the use of text encoding and other forms of digital representation in scholarly communication, and also on developing curricula and tools for teaching text encoding in a humanities context. She has been closely involved with the Text Encoding Initiative, serving as its chair, and also with the Association for Computers and the Humanities (serving as president, 2008-11). Her research interests include digital editing, digital humanities project development, information modeling, text encoding and markup languages, and literary studies.
Humboldt Universität Berlin
Library and information science, data modeling for digital libraries (Europeana)
Dr. Stefan Gradmann has been a full Professor teaching knowledge management and semantic knowledge architectures at the School of Library and Information Science of Humboldt-Universität zu Berlin since 2008. Other focal areas of his teaching and research are digital libraries, library automation, and the use of information technology in the realm of signification and interpretation in the ‘Digital Humanities’, with a specific focus on the notion of the ‘document’ and its deconstruction. He has been heavily involved in the creation of Europeana from its beginnings; there he is responsible for semantic interoperability and is one of the architects of the Europeana Data Model (EDM).
University of Bergen
Textual ontology, markup language theory, digital humanities
Claus Huitfeldt is Associate Professor of philosophy in Bergen, Norway, and was founding director of the Wittgenstein Archives at the University of Bergen. His research focuses on the ontology, epistemology, and semantics of markup languages and digital documents.
Professor for Literary Computing and German Literary History
University of Würzburg
Fotis Jannidis has worked on questions of text encoding, especially in the context of digital editions, for example on ways to encode documentary or genetic information. In recent years he has also worked on a history of the German novel and on ways to complement this research through the use of large corpora, specifically a corpus of German novels from 1500-1930. He is involved in a new edition of Goethe’s ‘Faust’, the virtual research environment ‘TextGrid’, and ‘DARIAH’, a European Digital Humanities network.
Lecturer and Academic Technology Specialist
Department of English
Jockers’s research involves computational approaches to the study of large collections of literature, what he calls “macroanalysis.” His approach has much in common with corpus linguistics and borrows from text-mining, information retrieval, and natural language processing. His research focus, however, is on strictly literary questions, especially questions related to literary history and the nature of literary change over time. His published work includes essays on computational approaches to authorship attribution, as well as papers on Irish and Irish-American literature. His book, Macroanalysis: Methods for Digital Literary History, is under contract with the University of Illinois Press. Jockers is the Co-Founder and Co-Director, with Franco Moretti, of the Stanford Literary Lab. Jockers teaches Irish literature and both introductory and advanced courses in humanities computing.
Assistant Director, Humanities Digital Workshop
Washington University in St. Louis
Douglas Knox recently joined Washington University in St. Louis as assistant director of the Humanities Digital Workshop. Prior to November 2011 he was Director of Publication and Digital Initiatives at the Newberry Library in Chicago. He was managing editor of the Encyclopedia of Chicago (University of Chicago Press, 2004) and directed a project to create a full-text digital version of the Chicago Foreign Language Press Survey (http://flps.newberry.org). He is curious how digital evidence relates to historical arguments, and how stories bridge the scales at which we make up our collective minds.
College of Information Studies
University of Maryland
Textual scholarship, preservation and modeling of virtual worlds, image analysis, literary studies, information science
Kari Kraus is an Assistant Professor in the College of Information Studies and the Department of English at the University of Maryland. Her research and teaching interests focus on digital preservation, Alternate Reality Games and transmedia storytelling, and textual scholarship and print culture. Kraus is a local Co-PI on an Institute of Museum and Library Services grant for preserving virtual worlds; a Co-PI on an IMLS Digital Humanities Internship grant; and the co-Principal Investigator of an NSF grant to study Alternate Reality Games (ARGs) and transmedia fiction in the service of education and design. Her work has been published or received coverage in the New York Times, the Atlantic, Wired, Baltimore Public Radio, and the Long Now Foundation. In addition to the University of Maryland, she has taught at the University of Rochester and the Eastman School of Music, and in the Art and Visual Technology program at George Mason University.
Jim Kuhn is Head of Collection Information Services at the Folger Shakespeare Library. He is responsible for planning and managing technical services operations (Acquisitions, Cataloging, and Photography and Digital Imaging). In addition to an MLS, Jim has a Master of Arts in Philosophy with a focus on philosophies of language and science.
University of Hamburg
Semantic annotation, text analysis, narratology, machine learning, markup language theory, computer-assisted reader research
Jan Christoph Meister is Professor of Modern German Literature (Theory of Literature, Methodology of Textual Analysis and Literary Computing) in the Department of Language, Literature and Media I, Faculty of Humanities, University of Hamburg. His research focuses on Literary Computing, Cognitive Modelling, Narratology, Austrian Literature of the 20th century, and Fantastic Literature.
University of Würzburg
Gregor Middell currently works as a software developer on a genetic digital edition of Goethe’s Faust and as a research assistant at the Chair of Computer Philology, Julius-Maximilians-Universität Würzburg. His main research interests are, firstly, computer-supported collation, with the aim of semi-automatically correlating electronic texts, analyzing textual variance, and determining intertextual relationships; and, secondly, markup theory and practice, specifically with regard to the constraints of applying context-free grammars to the modeling of complex, natural language texts. Before joining the Faust project, he earned a Master’s degree from Humboldt-Universität zu Berlin, majoring in both Modern German Literature and Computer Science. Further information can be found on his homepage: http://gregor.middell.net/
Associate Director, Maryland Institute for Technology in the Humanities
Assistant Dean for Digital Humanities Research, University of Maryland Libraries
University of Maryland
Trevor Muñoz is Assistant Dean for Digital Humanities Research at the University of Maryland Libraries and an Associate Director of the Maryland Institute for Technology in the Humanities (MITH). Trevor holds an MA in Digital Humanities from the Department of Digital Humanities at King’s College London and an MS in Library and Information Science from the Graduate School of Library and Information Science at the University of Illinois, Urbana-Champaign. He works on developing digital research projects and services at the intersection of digital humanities centers and libraries. Trevor’s research interests include electronic publishing and the curation and preservation of digital humanities research data.
Center for Digital Scholarship
Brown University Library
Digital humanities, information design, classics, hypertext systems, rhetoric of digital interfaces
Elli Mylonas is Associate Director for Projects and Research. She came to STG early in 1994 as Lead Project Analyst. She is principally responsible for developing and managing STG research projects and also participates in them as an STG consultant. Her areas of expertise lie in hypertext, SGML, structured text problems, and digital libraries. She has published and spoken on hypertext and electronic text, as well as on project management and academic software projects. Before coming to STG, Elli was Managing Editor of the Perseus Project at Harvard University, a multimedia database on classical Greek civilization.
Elena Pierazzo gained a PhD in Italian Philology from the Scuola Normale Superiore di Pisa in 2001. After a few years at the University of Pisa, where she worked as Research Assistant and lecturer, she moved in 2006 to the Department of Digital Humanities (formerly Centre for Computing in the Humanities), King’s College London, where she is now Chair of the Teaching Committee and Director of the MA in Digital Humanities. At King’s she has played leading roles in major research projects such as Jane Austen’s Fiction Manuscripts, CHARM – AHRC Research Centre for the History and Analysis of Recorded Music, and the Jonathan Swift Archive. She has chaired the TEI SIG on Manuscripts since 2004 and has been an elected member of the TEI Council since 2006; from 2012 she will serve on the TEI Board.
Professional information modeling, humanities scholarship, digital humanities, markup language theory and practice, LMNL
In 2002, Wendell Piez co-authored with Jeni Tennison a paper for the Extreme Markup Conference in which they described LMNL, the Layered Markup and Annotation Language. Like XML, LMNL is a metalanguage, a specification for how to model arbitrary structures in documents. Unlike XML, LMNL is defined as a data model not a markup syntax, supports description of overlapping regions (ranges) of text within a document, and allows for providing ranges with annotations that are not simply name/value pairs, but can themselves be structured like documents. While never (to date) fully implemented, the LMNL model has been the basis for several experiments and prototypes demonstrating the potential of markup beyond XML, and continues to stimulate the thinking of researchers in this area. Piez has also worked more generally to articulate the need for a capable data model to support research in the humanities, through both theoretical arguments and demonstrations of the potentials of LMNL processing on an XML platform. Most recently, his efforts have included parsing LMNL syntax (implementations in Python and XSLT 2.0) into XML representations (using both inline and standoff annotations) of the LMNL model.
Institute for Advanced Technology in the Humanities
University of Virginia
Archival metadata, digital humanities, digital project development, markup languages
Daniel Pitti is Associate Director of the Institute for Advanced Technology in the Humanities at the University of Virginia. Pitti has extensive experience in the design and development of international archival description communication standards, serving as the principal technical architect of Encoded Archival Description (EAD) and Encoded Archival Context–Corporate Bodies, Persons, and Families (EAC-CPF), and in working collaboratively with humanities scholars on the analysis and digital representation of artifacts (objects of interest).
Associate Professor of English
University of Nebraska-Lincoln
Text analysis, digital literary studies, programming for large text collections, digital humanities
Stephen Ramsay is an Associate Professor of English and a Fellow at the Center for Digital Research in the Humanities at the University of Nebraska-Lincoln. He is the author of Reading Machines: Toward an Algorithmic Criticism (University of Illinois Press), and the co-author (with Brian Pytlik-Zillig) of Abbot — an interoperability framework for large XML-encoded text corpora.
Institut für Sprach- und Literaturwissenschaft
Medieval studies, digital editions, digital libraries, digital dictionaries
Since 2010 Andrea Rapp has been a professor of medieval studies and computer philology at the Department of Linguistics and Literary Studies at Technical University Darmstadt. She was head of the Goettingen Digitization Center at the State and University Library in Goettingen (2003-2004) and afterwards one of the executive directors of the Center for Digital Humanities at Trier University (2004-2010). She has been working in the field of digitization, digital editions, electronic dictionaries, and digital humanities in general for over 20 years. She is in charge of several DFG- and BMBF-funded projects and is one of the initiators of the TextGrid project.
University of Würzburg
History, digital editing, genetic editions, visualization
Malte Rehbein is a research associate in the Department of Computer Philology at Würzburg University, director of the Würzburg Research Centre for Digital Editing, and lecturer in Digital Humanities as well as in Medieval History. His research interests cover a wide range of topics in the emerging discipline of Digital Humanities, focusing on computational methods and tools for historical research (“digital history”). Among other activities in this area, he is developing methods for the processing of complex historical texts, for textual variation and digital editing, and for advanced manuscript studies.
Graduate School of Library and Information Science
University of Illinois, Urbana-Champaign
Markup language theory, analytic philosophy, ontologies
Allen Renear is a professor at UIUC’s Graduate School of Library and Information Science, where he teaches courses and leads research in information modeling, data curation, and digital publishing. He also has an appointment in the philosophy department. Prior to coming to GSLIS Allen was the founder and director of the Brown University Scholarly Technology Group. His current research is focused on issues in the development of formal ontologies for scientific and cultural objects, and the exploitation of those ontologies in data curation, scientific publishing and information system design.
Research Director at INRIA; Visiting Scientist
Department of German Language and Linguistics
XML, TEI, language resources, digital repositories, lexica, digital humanities
University of Cologne
History, digital editing, large text collections, large archival collections, meta data alignment
Patrick Sahle has an MA in History, Philosophy and Political Science and a PhD in Humanities Computer Science (with a thesis on digital scholarly editing), both from the University of Cologne (Germany). Currently he coordinates the affairs of the Cologne Center for eHumanities, where he also oversees digital methods and information architecture in various Digital Humanities research projects. Through the Chair for Humanities Computer Science, he is involved in DARIAH-DE, a large project on building infrastructures for the Digital Humanities. He is also a founding member of the ‘Institute for Documentology and Scholarly Editing’.
Center for Complex Network Research
Maximilian Schich is a Visiting Research Scientist at BarabásiLab, where his work focuses on Complex Networks in Art History and Archaeology. His Ph.D. (Humboldt University 2007) deals with Reception and Visual Citation as Complex Networks and builds on more than a decade of consulting work with large graph databases in art research. He is an Editorial Advisor at Leonardo Journal (MIT-Press) and aims to foster the community of practice related to Arts, Humanities, and Complex Networks. Max is interested in understanding the nature and emergence of complexity in the arts and humanities, beyond given data model definitions.
Desmond Schmidt has a BA from the University of Queensland (1980) in Classical Greek language and Ancient History, a PhD from the University of Cambridge, UK, in Classical Greek papyrology (1987), and a second PhD (2009) from the ITEE School at UQ on ‘Multiple Versions and Overlap in Digital Text’. From 1989 until 2005 he worked full or part time for the Cambridge Wittgenstein Archive, making an edition of Wittgenstein. He has commercial experience in software development with XtreamLok in Brisbane, working on a license management system for OSX, and with Leximancer, on the development of a text-mining application. He currently works as a software engineer at the Information Security Institute at the Queensland University of Technology, on a denial-of-service testbed. Since 2002 he has collaborated with the Digital Variants team at Roma Tre, Italy, on ways to represent modern manuscripts in the digital medium, and for the past two years with the HRIT group at Loyola University Chicago.
Susan Schreibman is the founding Director of the MPhil in Digital Humanities and Culture at Trinity College Dublin. She was previously Director of the Digital Humanities Observatory. Her digital projects include The Thomas MacGreevy Archive and the Versioning Machine. She is a co-editor of The Blackwell Companion to Digital Humanities and the Blackwell Companion to Digital Literary Studies.
Black Mesa Technologies
Professional information modeling, standards development, textual scholarship, scholarly editing, German literature, markup language theory
C. M. Sperberg-McQueen (Black Mesa Technologies LLC) is a consultant specializing in helping memory institutions solve information management problems and preserve cultural heritage information for the future by using descriptive markup: XML, XSLT, XQuery, XML Schema, and related technologies. He co-edited the XML 1.0 specification and the Guidelines of the Text Encoding Initiative.
Digital library, modeling historical texts in library and research contexts, meta data formats for digital texts
Dr. Stäcker is Deputy Director of the Herzog August Bibliothek, where he is responsible for reader services, cataloguing and acquisition, reproduction services, and the digitization projects of the Wolfenbuettel Digital Library. He is the editor of the journal Wolfenbütteler Notizen zur Buchgeschichte. His current research interests include the digitization of cultural heritage material, book history, and digital editions of early modern imprints.
Lisa Swanstrom is an Assistant Professor of English at Florida Atlantic University. Her areas of research include science fiction, fantasy, and the digital humanities. Before joining the English Department at FAU, she was a postdoctoral research fellow in the Digital Humanities at Umeå University’s HUMlab in northern Sweden (2010), as well as the Florence Levy Kay Fellow in the Digital Humanities in the English Department at Brandeis University in Massachusetts (2008-2009). She completed her Ph.D. in Comparative Literature in June 2008 at the University of California, Santa Barbara.
University of Saarbrücken
Corpus linguistics, designing corpora for research questions, multi-level annotation
Elke Teich’s research focuses on the corpus-based investigation of linguistic variation, both in contemporary language (synchronic variation) as well as historically (diachronic variation). She is especially interested in the evolution of scientific writing, including linguistic diversification according to scientific domains. In a current project called Registers in Contact (DFG; 2010-2013), the focus is on the development of new scientific registers from the 1960s/1970s to the early 2000s (e.g., computational linguistics, bioinformatics). Informed by the experiences gained in her computational and linguistic work, Elke Teich’s research seeks to advance the use of text corpora and language technology in the philologies/humanities as well as to inform information technology by linguistic knowledge. Elke Teich is PI in projects on applied research, e.g., the Clarin-D project, as well as foundational research, notably the Saarbrücken Cluster of Excellence ‘Multimodal Computing and Interaction’.