Dissertations, Theses, and Capstone Projects

Date of Degree

9-2025

Document Type

Master's Thesis

Degree Name

Master of Arts

Program

Digital Humanities

Advisor

Jeffrey Allred

Subject Categories

Digital Humanities | History of Science, Technology, and Medicine | Technical and Professional Writing

Keywords

media archaeology, command-line interfaces, Unix, Bell Laboratories, computational textuality, word processing

Abstract

This thesis provides a media archaeological and theoretical examination of ed(1), the Unix line editor developed by Ken Thompson at Bell Labs in 1969, arguing that its approach to "invisible text" processing established lasting patterns of human-computer collaboration that persist in contemporary digital systems. Through historical research, technical analysis, and critical interpretation, this study demonstrates how ed(1) embodied a particular epistemology of text processing that appears alien in a contemporary context but proves instructive in the age of generative text creation.

ed(1) emerged from the unique convergence of Bell Labs' collaborative research culture, the severe hardware constraints of early Unix systems, and the telecommunications infrastructure that provided its operational environment. Working within the limitations of teletype machines that could print only one line at a time, ed(1) users learned to manipulate documents held entirely in computer memory through abstract, terse commands that operated on textual structure rather than visual appearance. This mode of interaction required users to collaborate, in effect, with computational systems: maintaining mental models of document organization while delegating pattern matching, addressing, and transformation operations to algorithmic processes they could not directly observe.
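
To make this concrete, consider a minimal, hypothetical ed(1) session (the filename and text are invented for illustration; ed(1) prints nothing except what it is explicitly asked to print):

    $ ed notes.txt
    512
    2p
    the memo circulated on the teletype
    2s/teletype/terminal/
    2p
    the memo circulated on the terminal
    w
    512
    q

The editor announces only a byte count on reading and writing the file; every view of the buffer must be requested line by line, with commands such as 2p.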

The thesis examines how this delegation established what I call an "epistemology of invisible text": a mode of textual knowledge based on trust in computational processes rather than direct visual manipulation. Through detailed analysis of ed(1)'s command syntax, addressing systems, and integration within Unix tool chains, this study reveals how users learned to specify textual intentions through declarative commands while accepting responsibility for transformations they could not immediately verify. This cognitive compact prefigures contemporary interactions with machine learning and so-called artificial intelligence systems, where users routinely delegate complex textual processing to neural networks operating according to statistical patterns learned from vast corpora.
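
The global command, a documented ed(1) feature, condenses this delegation into a single line (the pattern and replacement here are illustrative):

    g/teletype/s//terminal/g

This instructs the editor to find every line matching /teletype/ and rewrite it, sight unseen; the empty pattern in the substitution reuses the search expression. Verification requires a further explicit act, such as ,p to print the whole buffer or g/terminal/p, the idiom from which grep takes its name.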

By tracing the institutional origins and technical evolution of ed(1), this thesis contributes to digital humanities scholarship on the cultural history of computing while offering critical insights into contemporary debates about algorithmic mediation, digital literacy, and human agency in computational systems. Understanding this genealogy of text processing, the study demonstrates, illuminates both the possibilities and the limitations of human-machine collaboration in an era of increasing technological opacity.
