Computation and the Humanities interviews

Item set

Title (Dublin Core)
Computation and the Humanities interviews

Description (Dublin Core)
Audio files of the interviews that were transcribed in Julianne Nyhan and Andrew Flinn (2016), Computation and the Humanities: Towards an Oral History of Digital Humanities, Springer.

Creator (Dublin Core)
Julianne Nyhan and respective authors

Rights (Dublin Core)
Interview audio files are made available under a Creative Commons “by-nc-nd” licence with the following characteristics:
• by: the content must be attributed to the interviewee and the interviewer.
• non-commercial: commercial use of the content is not allowed.
• no derivative works: the material may only be distributed in its original form and may not be edited.
See: http://creativecommons.org/licenses/by-nc-nd/3.0/ and http://creativecommons.org/licenses/by-nc-nd/3.0/legalcode.
Relation (Dublin Core)
Transcripts of interviews are published here: http://www.springer.com/it/book/9783319201696

fileFormat (schema)
mp3

Language (Dublin Core)
English

Type (Dublin Core)
Oral history interviews about the history of Digital Humanities

Temporal Coverage (Dublin Core)
c. 1949-present

Items

  • hic Rhodus, hic salta: Tito Orlandi and Julianne Nyhan
  • I heard about the arrival of the computer: Hans Rutimann and Julianne Nyhan
    This oral history interview was conducted between Hans Rutimann and Julianne Nyhan via Skype on 15 November 2012. Rutimann was provided with the core questions in advance of the interview. In this interview he recalls that his first encounter with computing was at the MLA, c.1968/9. Following a minor scandal at the organisation, which resulted in the dismissal of staff connected with the newly arrived IBM 360/20, Rutimann was persuaded to take on some of their duties. After training with IBM in operating and programming he set about transferring the membership list (about 30,000 contact details) from an 'addressograph machine' to punched cards. After its early use to support such administrative tasks the MLA began investigating the feasibility of making the research tool called the MLA International Bibliography remotely accessible. Rutimann worked with Lockheed to achieve this. It was in Lockheed's information retrieval lab that the system known as Dialog, an online information retrieval system, was developed (Summit 1967). He vividly recalls how he travelled the 3,000 miles to San Francisco to deliver the magnetic tape to Lockheed so that they could make the database available online. He “jumped for joy” when, once back in New York, the data was available to him via the newly acquired terminal of the MLA. While making clear that his roles in MLA, Mellon and EIF have primarily been enabling ones (and to this we can add advocacy, strategy and foresight) he also recalls the strong influence that Joseph Raben had on him and mentions some of the projects and conferences that he found particularly memorable.
  • So, into the chopper it went: Gabriel Egan and Julianne Nyhan
    This interview took place at the AHRC-organised Digital Transformations Moot held in London, UK on 19 November 2012. In it Egan recalls his earliest encounters with computing when he was a schoolboy along with some memories of how computers were represented in science fiction novels, TV programmes and advertising. His first job, at the age of 17, was as a Mainframe Computer Operator. He continued to work in this sector throughout the 1980s but by the end of the decade he recognised that such roles would inevitably disappear. In 1990 he returned to University where he completed a BA, MA and PhD over the next 7 years. He recalls his shock upon returning to University as he realised how little use was then made of computers in English Studies. Nevertheless, he bought a relatively cheap, second-hand Sinclair Z88 and took all his notes on it. Later he also digitised his library of 3000 books, destroying their hard copy versions in the process. The interview contains a host of reflections about the differences that computing techniques and resources have made to Shakespeare Studies over recent years, along with insightful observations about the contributions and limitations of DH. In this interview Egan describes himself as a 'would-be Digital Humanist'; indeed, it is the landscape that he describes from this vantage point that makes his interview so interesting and useful.
  • Getting computers into Humanists' thinking: John Bradley and Julianne Nyhan
    This interview took place in Bradley's office in Drury Lane, King's College London on 9 September 2014 around 11.30am. Bradley recalls that his interest in computing started in the early '60s. As computer time was not then available to him he sometimes wrote out in longhand the FORTRAN code he was beginning to learn from books. One of his earliest encounters with Humanities Computing was the concordance to Diodorus Siculus that he programmed in the late '70s. The printed concordance that resulted filled the back of a station wagon. The burgeoning Humanities Computing community in Toronto at that time collaborated both with the University of Toronto Computer Services Department (where Bradley was based) and the Centre for Computing in the Humanities, founded by Ian Lancashire. Aware of the small but significant interest in text analysis that existed in Toronto at that time and pondering the implications of the shift from batch to interactive computing he began work as a developer of Text Analysis Computing Tools (TACT). He also recalls his later work on Pliny, a personal note management system, and how it was at least partly undertaken in response to the lack of engagement with computational text analysis he noted among Humanists. In addition to other themes, he reflects at various points during the interview on models of partnership between Academic and Technical experts.
  • I mourned the University for a long time: Michael Sperberg-McQueen and Julianne Nyhan
    This interview took place on 9 July 2014 at the Digital Humanities Conference, Lausanne, Switzerland. In it Sperberg-McQueen recalls having had some exposure to programming in 1967, as a thirteen-year-old. His next notable encounter with computing was as a graduate student when he set about using computers to make a bibliography of secondary literature on the Elder Edda. His earliest encounters with Humanities Computing were via books, and he mentions proceedings from the 'Concordances and the Dictionary of Old English' conference and a book by Susan Hockey (see below) as especially influential on him. In 1985 a position in the Princeton University Computing Center that required an advanced degree in Humanities and knowledge of computing became available; he took on the post while finishing his PhD dissertation and continuing to apply for tenure-track positions. Around this time he also began attending the 'International Conference on Computers and the Humanities' series and in this interview he describes some of the encounters that took place at those conferences and contributed to the formation of projects like TEI. As well as reflecting on his role in TEI he also compares and contrasts this experience with his work in W3C. On the whole, a somewhat ambivalent attitude towards his career emerges from the interview: he evokes Dorothy Sayers to communicate how the application of computers to the Humanities 'overmastered' him. Yet, he poignantly recalls how his first love was German medieval languages and literature and the profound sense of loss he felt at not securing an academic post related to this.
  • There had to be a better way: John Nitti and Julianne Nyhan
    This oral history conversation was carried out via Skype on 17 October 2013 at 18:00 GMT. Nitti was provided with the core questions in advance of the interview. He recalls that his first encounter with computing came about when a fellow PhD student asked him to visit the campus computing facility of the University of Wisconsin-Madison, where a new concordancing programme had recently been made available via the campus mainframe, the UNIVAC. He found the computing that he encountered there rather primitive: input was in uppercase letters only and via a keypunch machine. Nevertheless, the possibility of using computing in research stuck with him and when his mentor Professor Lloyd Kasten agreed that the Old Spanish Dictionary project should be computerised, Nitti set to work. He won his first significant NEH grant c.1972; up to that point (and, where necessary, continuing for some years after) Kasten cheerfully financed out of his own pocket some of the technology that Nitti adapted to the project. In this interview Nitti gives a fascinating insight into his dissatisfaction with both the state and provision of the computing that he encountered, especially during the 1970s and early 80s. He describes how he circumvented such problems not only via his innovative use of technology but also through the many collaborations he developed with the commercial and professional sectors. As well as describing how he and Kasten set up the Hispanic Seminary of Medieval Studies (HSMS) he also mentions less formal processes of knowledge dissemination, for example, his so-called lecture 'roadshow' in the USA and Canada where he demonstrated the technologies used on the dictionary project to colleagues in other universities.
  • It's a little mind-boggling: Helen Agüera and Julianne Nyhan
    This interview was carried out between London and Washington via Skype on 18 September 2013, beginning at 17:05 GMT. Agüera was provided with the core questions in advance of the interview. She recalls that her first encounters with computing and Digital Humanities came about through her post in NEH, where she had joined a division that funded the preparation of research tools, reference works and scholarly editions. Thus, she administered grants to a large number of projects that worked, at a relatively early stage, at the interface of Humanities and Computing, for example, Thesaurus Linguae Graecae. In this interview she recalls some of the changes that the division where she worked made to its operating procedures in order to incorporate digital projects. For example, in 1979 a section was added to application materials asking relevant projects to provide a rationale for their proposed use of computing or word processing. She also discusses issues like sustainability that became apparent over the longer term and reflects on some of the wider trends she saw during her career. Computing was initially taken up by fields like Classics and Lexicography that needed to manage and interrogate masses of data and thus had a clear application for it. She contrasts this with the more experimental and exploratory use of computing that characterises much of DH today.
  • They took a chance: Susan Hockey and Julianne Nyhan
    This interview was carried out via Skype on 21 June 2013. Hockey was provided with the core questions in advance of the interview. Here she recalls how her interest in Humanities Computing was piqued by the articles that Andrew Q. Morton published in the Observer in the 1960s about his work on the authorship of the Pauline Epistles. She went on to secure a position in the Atlas Computer Laboratory where she was an advisor on COCOA version 2 and wrote software for the electronic display of Arabic and other non-ASCII characters. The Atlas Computer Laboratory was funded by the Science Research Council and provided computing support for Universities and researchers across the UK. While there she also benefitted from access to the journal CHum and built connections with the emerging Humanities Computing community through events she attended starting with the 'Symposium on Uses of the Computer in Literary Research' organised by Roy Wisbey in Cambridge in 1970 (probably the earliest such meeting in the UK). Indeed, she emphasises the importance that such gatherings played in the formation of the discipline. As well as discussing her contribution to organisations like ALLC and TEI she recalls those who particularly influenced her such as, inter alia, Roberto Busa and Antonio Zampolli.
  • It's probably the only modestly widely used system with a command language in Latin: Manfred Thaller and Julianne Nyhan
    This interview was carried out on 10 July 2014 at the Digital Humanities Conference in Lausanne, Switzerland. In it Thaller recalls that his earliest memory of encountering computing in the Humanities dates to c. 1973 when he attended a presentation on the use of computational techniques to map the spatial distribution of medieval coins. The difficulties of handling large, paper-based datasets were impressed upon him as he compiled some 32,000 index cards of excerpts for use in his PhD thesis. When he later encountered standard statistical software at the Institute for Advanced Studies in Vienna he found that such software could not be beneficially applied to historical data without first transforming the historical data in the sources in some way (indeed, the formalisation of historical and cultural heritage data is an issue that recurs in this interview, much as it did in Thaller's research). In light of his experience of the problems of using such software 'out of the box' to work with historical data he went on to teach himself the programming language SNOBOL. Within a few weeks he had joined a project on daily life in the middle ages and was building software to manage the descriptions of images that the project compiled and stored on punched cards. Having contributed to various other projects with computational elements, in 1978 he took up a post at the Max-Planck-Institut for History in Göttingen. As well as discussing the research he carried out there, for example, CLIO/κλειω, a database programming system for History with a command language in Latin, he discusses the immense freedom and access to resources that he benefitted from. He also goes on to discuss some of the later projects he worked on, including those in the wider context of digital libraries, infrastructure and cultural heritage.
  • Moderate expectations, tolerable disappointments: Claus Huitfeldt and Julianne Nyhan
    This interview was conducted on 11 July at the 2014 Digital Humanities Conference, Lausanne, Switzerland. Huitfeldt recounts that he first encountered computing at the beginning of the 1980s via the Institute of Continental Shelf Research when he was a philosophy student at the University of Trondheim. However, it was in connection with a Humanities project on the writings of Wittgenstein that he learned to programme. When that project closed he worked as a computing consultant in the Norwegian Computing Centre for the Humanities and in 1990 he established a new project called the 'Wittgenstein Archives', which aimed to prepare and publish a machine-readable version of Wittgenstein's Nachlass. Here he discusses the context in which he began working on the encoding scheme (A Multi-Element Code System (MECS)) that he developed for that project. In addition to discussing matters like the trajectory of DH research and his early encounters with the conference community he also discusses some of the fundamental issues that interest him like the role of technology in relation to the written word and the lack of engagement of the Philosophy community with such questions. Ultimately he concludes that he does not view DH as a discipline, but rather as a reconfiguration of the academic landscape as a result of the convergence of tools and methods within and between the humanities and other disciplines.
  • I would think of myself as sitting inside the computer: Mary Dee Harris and Julianne Nyhan
    This oral history interview was conducted on 3 June 2015 via Skype. In it Mary Dee Harris recalls her early encounters with computing, including her work at the Jet Propulsion Lab in Pasadena, California. Despite these early encounters with computing she had planned to leave it behind when she returned to graduate school to pursue a PhD; however, the discovery of c.200 pages of a Dylan Thomas manuscript prompted her to rethink this. Her graduate study was based in the English Department of the University of Texas at Austin, which did not have an account with the Computer Centre, and so it was necessary for her to apply for a graduate student grant in order to buy computer time. Her PhD studies convinced her of the merits of using computers in literary research and she hoped to convince her colleagues of this too. However, her applications for academic jobs were not successful. After working in Industry for a time she went on to secure academic positions in Computer Science at various universities. During her career she also held a number of posts in Industry and as a Consultant. In these roles she worked on a wide range of Artificial Intelligence and especially Natural Language Processing projects. Her interview is a wide-ranging one. She reflects on topics like the peripheral position of a number of those who worked in Computers and the Humanities in the 1970s and her personal reactions to some of the computing systems she used, for example, the IBM 360. She also recalls how she, as a woman, was sometimes treated in what tended to be a male-dominated sector, for example, the physics Professor who asked “So are you going to be my little girl?”
  • The University was still taking account of the meaning of universitas scientiarum: Wilhelm Ott and Julianne Nyhan
    This oral history interview between Wilhelm Ott and Julianne Nyhan was carried out on 14 July 2015, shortly after 10am, in the offices of pagina in Tübingen. Ott was provided with the core questions in advance of the interview. He recalls that his earliest contact with computing was in 1966 when he took an introductory programming course in the Deutsches Rechenzentrum (German Computing Centre) in Darmstadt. Having become slightly bored with the exercises that attendees of the course were asked to complete he began working on programmes to aid his metrical analysis of Latin hexameters, a project he would continue to work on for the next 19 years. After completing the course in Darmstadt he approached, among others such as IBM, the Classics Department at Tübingen University to gauge their interest in his emerging expertise. Though there was no tradition in the Department of applying computing to philological problems they quickly grasped the significance and potential of such approaches. Fortunately, this happened just when the computing center, up to then part of the Institute for Mathematics, was transformed into a central service unit for the University. Drawing on initial funding from the Physics department a position was created for Ott in the Tübingen Computing Centre. His role was to pursue his Latin hexameters project and, above all, to provide specialised support for computer applications in the Humanities. In this interview Ott recalls a number of the early projects that he supported such as the concordance to the Vulgate that was undertaken by Bonifatius Fischer, along with the assistance they received from Roberto Busa when it came to lemmatisation. He also talks at length about the context in which his TUSTEP programme came about and its subsequent development. 
The interview strikes a slightly wistful tone as he recalls the University of Tübingen's embrace of the notion of universitas scientiarum in the 1960s and contrasts this with the rather more precarious position of the Humanities in many countries today.
  • Individuation is there in all the different strata: John Burrows, Hugh Craig and Willard McCarty
    This oral history interview between Willard McCarty (on behalf of Julianne Nyhan), John Burrows and Hugh Craig took place on 4 June 2013 at the University of Newcastle, Australia. Harold Short was also present for much of the interview. Burrows recounts that his first encounter with computing took place in the late 1970s, via John Lambert, who was then the Director of the University of Newcastle's Computing Service. Burrows had sought Lambert out when the card-indexes of common words that he had been compiling became too difficult and too numerous to manage. Craig's first contact was in the mid-1980s, after Burrows put him in charge of a project that used a Remington word processor. At many points in the interview Burrows and Craig reflect on the substantial amount of time, and, indeed, belief, that they invested not only in the preparation of texts for analysis but also in the learning and development of new processes and techniques (often drawn from disciplines outside English Literature). Much is said about the wider social contexts of such processes: Craig, for example, reflects on the sense of possibility and purposefulness that having Burrows as a colleague helped to create for him. Indeed, he wonders whether he would have had the confidence to invest the time and effort that he did had he been elsewhere. Burrows emphasises the network of formal and informal, national and international expertise that he benefitted from, for example, John Dawson in Cambridge and Susan Hockey in Oxford. So too they reflect on the positive effects that the scepticism they sometimes encountered had on their work. As central as computing has been to their research lives they emphasise that their main aim was to study literature and continuing to publish in core literature journals (in addition to Digital Humanities journals) has been an important aspect of this. 
Though they used techniques and models that are also used by Linguists and Statisticians their focus has remained on questioning rather than answering.
  • The influence of algorithmic thinking: Judy Malloy and Julianne Nyhan
    This interview was carried out via Skype on 11 August 2015 at 20:30 GMT. Malloy was provided with the core interview questions in advance. Here she recalls that after graduating from University she took a job as a searcher/editor for the National Union Catalog of the Library of Congress. About a year after she arrived Henriette D. Avram began work on the process of devising a way to make the library’s cataloguing information machine readable (work that would ultimately lead to the development of the MARC format (Schudel 2006)). Malloy recalls this wider context as her first encounter, of sorts, with computing technology: though she did not participate in that work it made a clear impression on her. She had learned to programme in FORTRAN in the 1960s when working as a technical librarian at the Ball Brothers Research Corporation. She had also held other technical roles at Electromagnetic Research Corp and with a contractor for the Goddard Space Flight Center, which was computerising its library around the time she worked there. She recalls that she did not use computers in her artistic work until the 1980s (when she bought an Apple II for her son). However, she had been working in an interactive, multimedia and associative mode for some time before then, as evidenced by the card catalog poetry and electronic books that she created in the 1970s and early 80s. In this interview she traces the importance of card catalogs, systems analysis and algorithmic thinking to many aspects of her work. She also reflects on why it was that the idea of combining computing and literature did not occur to her (and also was not practically feasible) until a later stage in her career. 
Among other topics, she reflects on the kinds of computing and computing environments that she encountered, from the reactions in the 1960s of some male engineers to the presence of a female technical librarian in the mainframe room to the thrill of discovering the community that was connected via the Whole Earth 'Lectronic Link (The WELL).