Cut-Copy-Paste Constructionism: Teaching Encoding and Text Mining with Erasure Poetics

Luca Messarra, Stanford University

Nichole Nomura, University of Wyoming

Abstract

This article reports on a classroom experiment in digital humanities (DH) erasure-poetry pedagogy. Using art supplies, students “erased” words as a means of experimenting with and visualizing encoding and text-mining processes, making high-level conceptual decisions about encoding and text-mining with low technical barriers to entry. We propose that using physical media (including variation in marker and paint opacity, and the ability to cut, fold, and move strips of paper) affords a hybrid, constructionist approach to encoding and text-mining that encourages experimental play and creative approaches to critical data studies. Encoding and mining are not only essential concepts in computational literary study—they are also essential to creative humanist inquiry writ large, notably utilized by erasure poets such as M. NourbeSe Philip, Solmaz Sharif, and Hugo García Manríquez. Building on the work of Christopher Ohge and Charlotte Tupman, we argue that situating computational practices only within the context of the “digital” humanities forecloses on their wider application. Among other things, embodying and enacting encoding and text-mining processes facilitates what Daniel Scott Snelson calls a “poetics of the search,” a playful, performance-based approach to digital humanities inquiries, providing students with contingent pathways to imaginatively reconstruct the digital, the humanities, and their variable sites of intersection on/off the page/screen.

Keywords: erasure; constructionism; text mining; text encoding; poetics.

Introduction

It is easy for students starting out in computational literary studies to see text mining and text encoding as separate processes. The act of computation abets this impression by hiding much of the actual encoding—not just the encoding of a TEI (Text Encoding Initiative) edition, but also the character encoding of UTF-8 and the in-between steps of NLP (natural language processing)—so encoding is frequently visible only when something breaks. We taught a literary text mining class to undergraduates in which seeing encoding and mining processes as interconnected was an essential learning outcome. Drawing on Seymour Papert’s work in teaching Logo, a coding language, and following recent calls to “visceralize” data and its creation, we turned to constructionist pedagogy to scaffold student understanding of the relationship between encoding and text mining. Constructionist pedagogy, like minimal computing, asks us to consider whether and how we should teach coding, and how important it is that the “digital” in “digital humanities” be coextensive with electronic computing. In what follows, we report on a constructionist class session that used the creation of erasure poetry as a minimal computing method to achieve our learning outcomes.
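
Character-encoding breakage of this kind is easy to stage. The following minimal sketch (our illustration; the example words are not from the course) decodes the same bytes under two different encodings, surfacing the layer that usually stays hidden:

```python
# A minimal sketch: character encoding becomes visible when it breaks.
text = "naïve café"
raw = text.encode("utf-8")   # encoding: putting characters into bytes

print(raw.decode("utf-8"))   # "naïve café" -- round-trips cleanly
print(raw.decode("latin-1")) # "naÃ¯ve cafÃ©" -- mojibake: the hidden layer made visible
```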

Constructionist pedagogy and minimal computing have important theoretical and practical overlaps. The specific term of art, “minimal computing,” established and maintained by the work of the eponymous “Minimal Computing Working Group,” has moved from “a bag of tools” to a “methodology” (Risam and Bessette 2024, 750), a shift accompanied by increasing applications of minimal computing to pedagogy (e.g., Bessette 2023; McGrail 2017) in digital humanities contexts. Minimal computing has at its core a set of commitments that recursively shape the practices associated with the method, commitments that find new form in a set of design questions posed by Risam and Gil (2022): “1) ‘what do we need?’; 2) ‘what do we have?’; 3) ‘what must we prioritize?’; and 4) ‘what are we willing to give up?’” In Risam and Gil, we hear echoes of Understanding by Design—the foundational text for what is today, in education, called “backwards design” (Wiggins and McTighe 2008)—and in the discourse of minimal computing more broadly, a history of teaching computation grounded in constructionism. Minimal computing is therefore valuable in the classroom both because of the values that underwrite our commitment to the minimal (reducing environmental impact and cost, increasing access, etc.), and because it has specific, proven pedagogical utility in the computational digital humanities via constructionism.

Our article draws out some of constructionism’s and minimal computing’s alignments by discussing a guest lecture by Luca on erasure poetry in Nichole’s “Introduction to Literary Text Mining” course. On that class day, students studied and produced erasure poems in order to physicalize the computational logics of text mining and encoding. As we will discuss, erasure provides productive scaffolds for understanding the interconnected operations often hidden by these computational processes. In particular, it offers variable opacity, the preservation of context and location data, process-based, layered annotation, and an opportunity to teach logic without coding languages. Erasure, a method which poet M. NourbeSe Philip describes as “intrinsic to colonial and imperial projects,” also provides a unique opportunity to visceralize data violence.1 The use of erasure in a digital humanities (DH) context bridges the destructive, transformative, and liberating valences of the technique with computational processes more broadly. Humanists can then use erasure as a gateway into critical data studies, fostering conversations about exclusion and visibility in our work.

The methods we offer here—grounded in a constructionist approach to teaching the relationship between encoding and mining—are cheap, durable, and nonproprietary. They will work every year, in traditional and non-traditional settings, under many constraints. We also believe that constructionist pedagogy for teaching literary text mining is sound in general, not just a band-aid for spaces that are low-tech by necessity. It is, as we discovered based on student feedback, especially useful in introductory literary-critical and text-mining settings, where our students often expressed hesitation about the computational elements of the course in the first weeks of instruction.

Before moving to the specifics of our class session, we want to concretize some of these claims using an erasure poem generated by one student, Rachelle Weiss, during the class session. Weiss’s work is shared with consent.

An image of Rachelle Weiss's erasure poem, constructed on top of a page of The Little Prince. Most lines are inked out in black sharpie, several are underlined in pencil, and a crown-shaped network in light blue connects a constellation of "king"s across the page. "subject," at the top of the page, has had quotes added around it, highlighted in orange. At the bottom of the page, these words are left unerased: "Hum! Hum" / judge / old rat. condemn / him to death. his life will depend on your justice. pardon. "death" and "life" are underlined in orange, and a question mark has been added after "pardon."
Figure 1. An erasure by Rachelle Weiss, page from Antoine de Saint-Exupéry’s The Little Prince

Weiss’s erasure reveals much about the rhetoric of page forty-six of Antoine de Saint-Exupéry’s The Little Prince. Certainly, digital methods alone could quickly tell us that the king speaks the most here. But Weiss’s work shows us what might be lost in that methodological ease and what material, humanist approaches to text encoding and mining can reveal about these processes. From a pedagogical perspective, Weiss’s poem demonstrates how erasure can be a useful tool for teaching the interconnectivity between encoding and text mining, as every stroke of the sharpie simultaneously marks up and extracts meaning from the text. From a critical data studies perspective, Weiss’s erasure transforms what might otherwise be a simple data visualization about the dominance of juridical discourse on page forty-six into a visceralization of that data.

The translucency of the black sharpie is key for these perspectives and our ability to move between them, as the relative opacity allows students and readers to see and comment upon the decision-making process. The creative method also encourages us to read the data connotatively, as one would read a poem. Here, what we can read through the sharpie reveals that the “subject” of the page/poem is not a rotely operationalized grammatical subject, since the translucency of the erasure reveals that several other nouns throughout the page were omitted, and plenty of additional words remain at the end. Rather, the poem’s “subject,” marked both by erasure and the addition of emphatic orange quotation marks, is thematic: necropolitics. The poem visualizes state power’s control over “death” and “life.” That is affirmed by the overbearing, crownlike network established around the keyword “king,” a visualization which, like despotic monarchs, overreaches on our interpretative attention and the possibilities of any (social) text. Flipping Le Petit Prince’s ironic take on decrepit monarchs on its head, the erasure poem posits—and spatially demonstrates by preserving context and location data—how monarchy circumscribes justice.

Once more, digital methods might reveal that these are the themes of the book, but Weiss’s work goes beyond visualization by encoding choice into how we read these themes. Life, death, and subjecthood are contingent upon our reading processes, as suggested by the poet’s hand-drawn, orange question mark next to “pardon.” Weiss’s procedural visualization, crafted within the constraints of the material page, builds space for a humanist response to monarchical overreach when we carefully account for what exists in and beyond the crown. The terms visible within the king’s crown are “old” and “judge,” rendering the king as an aging judicial body who, if we judge it fit, can be “condemn[ed]...to death,” from the outside. The king is that subject whose “life will depend on your / justice.” The query then posed to the reader: to pardon or not pardon those old judges over life and death.

Of course, this is just one of many possible ways to read Weiss’s work. Our point is that these sorts of playful and political humanities approaches to encoding/mining are more possible when we abandon an emphasis on code and embrace an emphasis on process—both the procedural logics of computation and the process of asking questions, interpreting, and sharing our decision-making with others.

Course Context and Design

We draw our data for this article from a course taught in 2023, in which Nichole Nomura was the primary instructor. The course, “Introduction to Literary Text Mining,” is a seminar (16 students) taught in the Department of English at Stanford University that attracts undergraduates with a variety of training, needs, and interests. It fills “Methods” requirements for several majors outside of English, meets an Applied Quantitative Reasoning General Education requirement for all majors, and serves students minoring in Digital Humanities. There are no technical or literary prerequisites for enrollment, which means that students enter with a wide range of comfort with both computing and literary meaning-making. It is important here to note that even at well-resourced schools such as Stanford, student access to computational resources is uneven. Requiring students to have particular kinds of computing power and operating systems available to them in the classroom was not an option. A pre-course survey to identify the computing devices available to students and ongoing adjustments throughout the quarter mitigated, but did not solve, some of the challenges related to access. Instead of the initial instructional plan, which had students work by hand and eventually produce their own code, the course shifted to instructor-supplied computing, optional coding tutorials on the side for students who wanted to learn how to run such analyses themselves, and significantly more emphasis on computing and calculating by hand or in tabular formats in cloud-supported software like Google Sheets.

A Constructionist Approach to Literary Text Mining Pedagogy

There is no text-mining without encoding. The question “how does the computer know what to pull out?” requires, repeatedly, a conversation about what we must first put in, how we store it, and how we manipulate it. In the literary-text-mining stack,2 conversations about encoding have nearly infinite scale—a conversation about file formats is a conversation about encoding. So is a conversation about UTF-8, about keyboard mapping, about multilingual DH, about natural-language processing and named-entity recognition, or about the Text Encoding Initiative (TEI) and digital editions. At all moments in the vertical procedure, we find encoding decisions, interactions, and consequences.

In the pedagogical stack, we emphasized “encoding” as a scalable, persistent refrain in our essential understandings. Our shorthand for “text-mining” throughout the quarter was “taking stuff out,” and so our shorthand for encoding was “putting stuff in ____” (where “in” was frequently followed by something like “a structure,” but could also be “a text” or “a sentence” or “a word”). Such a shorthand—putting stuff in—brought these practices into alignment with annotation. Encoding became aligned not just with the annotation of scene segments or annotations intended to train LLMs, but also with the annotations we make for processes that train our thinking, like schools, tests, and papers.3 Encoding therefore became something we already did all the time: not a new concept, but a familiar one that we were working to expand in the context of the course.
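
The shorthand can be made concrete in a few lines. In the sketch below, the tag scheme and sentence are invented for illustration (echoing the page of The Little Prince discussed above): encoding puts stuff in, and mining can only take out what an earlier pass put in.

```python
# A toy illustration of the course shorthand: encoding puts stuff in,
# mining takes stuff out. The tag scheme and sentence are invented.
import re

sentence = "Hum! Hum! said the king."
encoded = 'Hum! Hum! said the <character ref="king">king</character>.'  # putting stuff in

# Mining is bounded by encoding: we can only extract what a prior pass marked up.
characters = re.findall(r'<character ref="(\w+)">', encoded)
print(characters)  # ['king']
```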

To help our students see the relationship between encoding and mining within the literary-text-mining stack, we required pedagogical practices that visceralize and physicalize the invisible or difficult-to-see—in this case, pedagogy grounded in Papert’s constructionism. Constructivism (with a “v”), the epistemic theoretical framework most commonly associated with Jean Piaget, argues that learners actively construct their own knowledge in the context of their learning environment and in relation to their existing schemas and knowledge. Constructionism (with an “n”), put forward by Piaget’s student Papert, develops constructivism by emphasizing that learners construct that knowledge when they build things in the real world. Constructionism undergirds much of the research on “learning by making” and, famously, comes out of Papert’s work developing Logo, the coding language that initially supported Lego Mindstorms.4 Common in STEM and data science education, constructionism’s emphasis on building out loud, on building real things, and on scaffolding novice entry into the field at any level aligns it with many of DH’s pedagogical values.

For instance, Catherine D’Ignazio and Rahul Bhargava take an explicitly constructionist approach in their work to theorize and test pedagogy for creative data literacy. In particular, they highlight the role of constructionism in inviting novices into the field:

[T]his paper argues that the best way forward is through engaging learners where they are with hands-on creative activities that build their capacity. Without such invitations any efforts to work with novices will fall into a techno-centric focus on software skills acquisition, which has little chance of connecting learners to the opportunity of data to help them achieve their goals. (D’Ignazio and Bhargava 2019)

They identify three main principles for a constructionist approach to data visualization that we argue are instructive for the digital humanities more broadly and for literary text mining specifically: project-based learning (tied to learners’ interests), hands-on learning (embodied), and peer learning. Project-based learning and peer learning are well-established cornerstones of DH pedagogy. Hands-on learning might seem to be quite obviously a part of DH pedagogy as well—but D’Ignazio and Bhargava, along with the constructivists and constructionists they draw on (Dewey, Montessori, Papert, Ackerman, and others), remind us that there is an important literality to “hands-on.” Highlighting in particular the challenge of creating hands-on learning experiences in fields that mostly work with screens, they stress that there are opportunities to “visceralize” and “physicalize” data at all stages of the pedagogical process, even if having physical or literal objects as the final product is impractical.

Zooming out from D’Ignazio and Bhargava’s work on data visualization to literary text-mining and the digital humanities more broadly, constructionist approaches ask us to consider how “real” coding is. Does working on a computer screen count as hands-on? Where are the opportunities to physicalize and visceralize as process, in addition to product, in our pedagogy? How important is it that “digital” becomes synonymous with (electronic) “computing,” and what do we lose when we drop the ways that digital and computing have historically been done by hand? Literary text mining courses are a productive site for exploring these questions.

This class therefore took a deliberately, and explicitly, constructionist approach to digital humanities pedagogy. The framework was explained and cited on the first day of term, and the hows and whys of constructionist lesson planning were frequently detailed for students. This explicit approach matters in the classroom because the activities that students did in literary text mining might look like “fun crafts”—and sometimes they were also fun crafts—but all constructionist activities were presented as essential, rigorous, and quite serious play that supported the learning outcomes of the course. Our erasure lesson was taught in the context of this framework, and was likewise not an “extra” or “fun” activity. Instead, it was positioned as building on and stretching our previous encoding, qualitative-coding, and text-mining lessons.

In this class, students worked with a single book (a children’s chapter book of their choosing) over the course of the entire quarter to create a new physical edition, and they had the corpus of their peers’ books to draw on for text-mining concepts related to corpora. We began by destructively scanning the books, running and hand-correcting a small sample of OCR, and creating metadata structures (and the metadata) for our texts. Every text-mining exercise was first completed by hand, in the loose pages of the book and on accompanying paper, before it was completed using an electronic computer. Students did natural-language processing, including co-reference resolution and parsing, by hand—first using their expert judgment, and second attempting to articulate logical “rules” for a peer to follow with similar-enough results—and then we compared our tables to those built by BookNLP and UDPipe. We calculated sentiment analysis and a variety of readability metrics by hand, annotating the text with scores from different dictionaries, including one we built ourselves, before exploring scores calculated across the entire book. Frequency tables, built with and without lemmatization, were compiled first on the page, with highlighter schemes to document what changed between the two rounds of counting. Work always followed the same flow: 1) by hand, in your own book, on a single page; 2) compare with a classmate; 3) compare the results of a single page, produced by hand, to the results of the electronic computer, at the scale of the book; then, if appropriate, 4) compare the results across the corpus of classroom texts. At each of these stages, students were asked to make literary meaning, in addition to checking if their code “worked.” Brandon Walsh (2023) describes a “three-speed problem” for DH pedagogy, in which students cannot shift between different settings/speeds when they need to; his proposed solution—low-tech concept work by hand before introducing the code/software/tool—was likewise implemented in this course as the answer to a highly differentiated classroom. Further, summative and formative course assessments could always be completed “by hand” or “by computer,” and there was a blanket one-free “my computer ate my homework” policy for extensions.
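
One of these by-hand exercises translates directly into procedural form. The sketch below runs the frequency-table comparison with a toy lemma dictionary standing in for the highlighter schemes used on the page; the example text and lemma table are our inventions:

```python
# A sketch of counting the same page twice: once raw, once lemmatized.
# In class, the "lemma dictionary" was a hand-built highlighter scheme.
from collections import Counter

page = "the king judges and the judges judge the king"
tokens = page.split()

raw_counts = Counter(tokens)

lemmas = {"judges": "judge", "judged": "judge"}   # hypothetical hand-built rules
lemma_counts = Counter(lemmas.get(t, t) for t in tokens)

print(raw_counts["judges"], raw_counts["judge"])  # 2 1
print(lemma_counts["judge"])                      # 3 (what changed between the rounds)
```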

We are not the first to experiment with low-stakes, by-hand, pencil-and-paper DH in research and in classrooms (e.g., Josephine Miles’s hand-tallies or Kalani Craig’s students’ hand-produced word clouds [Craig 2017]). What we want to stress here is the particular effect of the repeated use of the same text over a ten-week period, an effect that had implications for the success of our erasure lesson. There’s operationalizing of the kind digital humanists do on large corpora for the purposes of research—the kind that is aggressively theorized because we simply cannot check all of our work—and then there’s operationalizing of the kind we practiced in this classroom: tentatively theorized, and aggressively stress-tested over and over again on the same book, on the same chapter, on the same page, on the same sentence. While text mining only one book might look like minimalism, it gave rise to an expansive, proliferating, almost exhausting maximalism—counting the same and same-ish things again and again in different ways. Layers of encoding built up on pages over time. Failed experiments were revisited and reworked because a new method was a better fit for capturing what the student had tried to capture the week before. Students got very familiar with their chosen chapter book. In some ways, therefore, our lesson on erasure broke the routines of the class, as students had to start on a new page they hadn’t done any work on previously. It was presented as a challenge, an exercise in synthesizing their understanding of key concepts—an opportunity to test their understanding of their own stack and the ways in which they might choose to rearrange the component parts.

Erasure Class Session

Luca Messarra was the guest lecturer for the class session, and Nichole co-designed the lesson plan on erasure with Luca. The Essential Understandings for the unit were:

  1. Text-mining and text-encoding are not mutually exclusive and are in fact deeply interdependent.
  2. Argument, visualization, text-mining, and text-encoding are all meaning-making processes that require choices.
  3. These choices are grounded in theory, politics, practicality, and feasibility, and interact with our desired outcomes.
  4. Our visualizations can reveal or conceal process and decision-making; erasure makes these processes more explicit.

The lesson began with a brief lecture challenging students to theorize erasure poetry as a type of data visualization. Fusing Rosa Menkman’s theorization of glitch (Menkman 2010) with the Situationist practice of détournement (Debord and Wolman 1956), the lecture further prompted students to consider the ways in which an encoding/text-mining process enacted by physical erasure could expose material-political networks latent within a given text. That prompt was then followed by an open book-exploration period, in which students were given the opportunity to read through erasure works such as M. NourbeSe Philip’s 2008 Zong!, Hugo García Manríquez’s 2014 Anti-Humboldt, Tom Phillips’s 2016 A Humument, Luca and Sean Messarra’s 2019 edition of Niccolò Machiavelli’s The Prince, and several others.5 They were then asked to group up around an erasure of their choice and respond to a series of scaffolded questions about the “data” being visualized in these works. After some collaborative discussion, students were prompted to create their own erasure poem of a page of their choice, paying special attention to the relationship between their method of erasure and the final product.

Some students had a strict rules-based approach to their erasures, and others opted to mine a relevant poem out of the text by simply removing all unrelated words. Their interpretations of their poems, in exit tickets and discussion with peers, varied. Some students made reference to frequency, and some to newly noticed connotations and relationships. Several of the connotation-based arguments rested on students’ ability to read, thanks to the variable opacity of their erasure, the contexts from which the words were drawn: pulling location data off the page, or co-references, or grammatical parses. Encoding practices using physical materials helped them experiment with both procedural logics and more traditionally humanistic logics, blending them and making visible what would have been lost if they had relied on only one or the other.
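
Such a rules-based erasure can be written out as executable logic. In the following sketch, an invented keep-list (drawn loosely from the vocabulary of Weiss’s page) blacks out every other word while preserving each word’s location on the line:

```python
# A sketch of a strict rules-based erasure: keep only words on the list,
# replacing everything else with blocks so each word keeps its location.
KEEP = {"king", "judge", "death", "life", "justice", "pardon"}  # invented rule

def erase(line: str) -> str:
    out = []
    for word in line.split(" "):
        bare = word.strip(".,!?\"").lower()  # ignore surrounding punctuation
        out.append(word if bare in KEEP else "█" * len(word))
    return " ".join(out)

print(erase("his life will depend on your justice."))
# ███ life ████ ██████ ██ ████ justice.
```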

At the conclusion of the lesson, students were asked to “perform a reading” of their erasure by reading it aloud in a “poetic voice.” Some took that call earnestly, bringing airs of Gwendolyn Brooks and Allen Ginsberg to the classroom, while others presented in a more colloquial fashion. Regardless, clapping, laughter, and astonishment filled the room as students discovered unforeseen critical insights by verbalizing their work. In reading a “data visualization” aloud as a poem, the students called attention to the role of the poetics of a given medium (the opacity of a sharpie, the intonation of a human voice, the particularities of a computer’s hardware and software) in structuring the meaning-making process and, by extension, to how the (in)visibility of decision-making protocols makes (im)possible various political responses and interpretations.6

In our specific example of constructionist pedagogy, the use of erasure for teaching encoding/text mining provides at least four affordances for digital humanities pedagogy: variable opacity, preservation of context and location data, process-based and layered annotation, and instruction in computational logic without coding.

Variable opacity

Markers, highlighters, Wite-Out, and strips of paper all present different options in terms of opacity. Students had done many annotation exercises before the erasure poetry unit, including a guided qualitative-coding exercise dependent on color-coded annotations, but the erasure exercise was the first time they were asked to make decisions about opacity. Variable opacity is the material quality undergirding the preservation of context and location data, as well as supporting process-based, layered annotation. It is also the predicate upon which students are able to see how encoding and text mining are co-constitutive, as every act of erasure simultaneously encodes and mines the text.

Preservation of context and location data

The act of erasure is the act of negotiating how much context to preserve. Erasure provides an opportunity to visualize the critical decisions about encoding that must be made alongside, before, and within any mining work. The ghosts of encoding, too often rendered invisible in spreadsheets and the products of text mining, can remain visible in an erasure via the preservation of context and location data. The more opaque the erasure, the less information about context is available to a reader, but location data—the physical location of a word on a page—remains. Weiss’s sharpied erasure implies an extraction of the text-mining kind without actually removing the text from context. Another student’s cut-out windows, created by layering one sheet of paper over the book page, completely obscured context while emphasizing location.
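
This trade-off can itself be modeled as data. In the sketch below, a representation of ours rather than classroom code, each erased span keeps its character offsets (its location), while only translucent spans keep the underlying context readable:

```python
# A sketch of opacity as a data property: every erasure keeps its location
# (character offsets); only fully opaque erasures discard the context beneath.
page = "condemn him to death. his life will depend on your justice."

# (start, end, opacity): 1.0 is black sharpie, 0.4 is a translucent wash
spans = [(0, 7, 0.4), (8, 20, 1.0)]

for start, end, opacity in spans:
    context = page[start:end] if opacity < 1.0 else None
    print((start, end), context)
# (0, 7) condemn   <- location and context both preserved
# (8, 20) None     <- location preserved, context erased
```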

Process-based, layered annotation

Layered annotations—literally layered on the same page—were how we did the work of the quarter. Sets of articulated rules, inter-annotator agreement, and sequential passes counting different things on the same page of text were the bread and butter of the class. By the time we got to the erasure assignment, students were used to working over their previous annotations, to explaining their process to a peer, to changing the rules halfway through, to returning to the top of the page, and to re-doing their annotations. Erasure offers an opportunity for students to see the process and product as more deeply interconnected: the act of erasure results in an erasure, an object with high humanistic value, rather than the “scratch work” of procedural annotation we had done up until this point.

Logic without coding languages

What are we willing to give up? Machine-readable code.

Risam and Gil frame minimal computing with a “heuristic comprising four questions to determine what is, in fact, necessary and sufficient when developing a digital humanities project under constraint: 1) ‘what do we need?’; 2) ‘what do we have?’; 3) ‘what must we prioritize?’; and 4) ‘what are we willing to give up?’” (Risam and Gil 2022). This class was willing to give up machine-readable code, and it gained much by doing so. The logistics of teaching and performing actual coding are arguably one reason why we treat encoding and text mining as distinct, and materially engaging with texts via erasure offers an opportunity to see how these processes are interdependent.

Digital humanities educators might be tired of debating how important it is for students to learn to code—we’re tired of debating whether we need to learn to code. But practically, we all have to make and enact a decision about how, when, and why to teach (or not to teach) coding. We have no universal objection to teaching coding; the question is when it is pedagogically productive to give up code in exchange for other learning outcomes. We’re not convinced an intro to literary text mining class for undergraduates is the ideal space to teach coding—not on the quarter system, not at Stanford, not with the mix of students brought to this particular classroom, and not with the requirements this class needs to fill.7

Constructionism invites us to rethink the relationship between minimal computing and “digital humanities project[s] under constraint.” Rather than thinking of minimal computing solely as a defensive response or even proactive activism for digital humanities under (unwilled) constraint, we can also think of constraint as an intentional choice that benefits both our students and ourselves as scholars and teachers. Unwilled “constraint” on what we need, have, prioritize, and give up in a digital humanities context is bad. Constraints-from-above such as budget cuts and curricular censorship are bad, and we should collaboratively work to resist these as a field. But what about willing constraints? The long history of constraint-based writing, from transhistorical experiments with the sonnet form to the difficult shenanigans of OuLiPo, has taught us that willing constraint can be immensely generative for artistic production and intellectual inquiry. Constraint can lead to good poetry. Constraint can lead to clever solutions in programming. Constraint is a necessity for scaffolding in the classroom.

In thinking about what we are willing to constrain in our digital humanities pedagogy, we move toward an approach to minimal computing as giving up nothing and, possibly, gaining quite a bit. It takes a particular definition of “minimalist” to look at the amount and variety of art supplies and mess moving in and out of literary text mining biweekly and assert that this is, in fact, more minimalist than the thing that could be happening with “just” laptops. We had to talk our way into rooms with better tables (the desks in our assigned room had definitely been designed with only laptops in mind), to use resources most people simply do not have on hand (bandsaws), and to “destroy” a lot of books. But the gains: students left the course being able to think like computers and also like humanists. They could explain the principles of many different kinds of text-mining operations with concrete examples, perform calculations on the fly (the math in the course was heavily influenced by Tukey’s Exploratory Data Analysis, a different kind of constraint that merits long-form discussion elsewhere), read (really read, in the sense that they could make meaning, some of it literary, out of) equations that used words, tokens, sentences, and other literary features as variables, and signify off of the things they noticed, whether they noticed them via traditional reading or computational reading. We gave up learning to write code in Python or R, but it felt like giving up nothing at all. We adamantly did not give up thinking like a humanist, nor did we give up thinking like a computer.

A maximalist view of this kind of computational abundance—a stack in which nothing is given up in return—is only available when students know how to think like computers, or when they are themselves, to lose the analogy and draw on the older sense of the word, computers, and when they may choose where to insert computation and calculation into their own mental stack. Turning ourselves into computers, into beings capable of procedural thinking, happened not in spite of our humanistic training, intuitions, sense of values, and powers of judgment—it happened because of them, and it only took a few art supplies to help us navigate between them.

Conclusion: Hybrid Computing

Our experiment with erasure poetry shows how “giving up” coding in a digital humanities classroom does not necessarily create a skill vacuum. Rather, teaching computational logics without teaching coding languages allows instructors to focus on teaching the procedural logics and essential understandings of text mining in a language-agnostic way. Dropping code grants students the ability to focus less on getting the code “right” and more on developing an understanding of what different computational processes can do, when to use them, and their limitations. It does not mean trading Risam and Gil’s symbolic computational literacy for metaphorical computer literacy (Risam and Gil 2022), but rather means being quite comfortable with the idea that symbolic computational literacy can happen in a wide range of symbolic registers—including Python, R, math, English, and flowcharts explaining rules for annotations—and makes space for the idea that we have to translate between those symbolic registers all the time anyway, as many of them are embedded in interdependent semantic networks.

Utilizing physical erasure as a method for teaching encoding and text mining is not only an effective and engaging instantiation of minimal computing—it also brings unique affordances to the instruction of these computational processes by centering humanist inquiry and playful constructionist methods. Materially enacting and embodying encoding and text-mining processes within a constructionist context makes possible a form of contingent reading, as described by Daniel Scott Snelson, that “makes do with the situation at hand: scanning, gathering, deciphering, contextualizing, and, in the end, carrying out the work of fetching whatever it might find, given the situation, for an ongoing improvisatory performance of interpretation” (Snelson 2022). Just as search engines are simultaneously “tool[s] for use” and “active agent[s] in writing the world and scripting … users,” erasures enfold both writing and being written, or in the literary-text mining context, encoding and mining. The affordances in a pedagogical context in which we produce rather than consume erasure allow us to, in Snelson’s words, “process the poetics of the search”: to query, as we create, more reflectively and more speculatively. Contingent reading, scaffolded by minimalist, constructionist pedagogies, allows digital humanities students to see that computational skills and thinking are transferable: they are not limited to the context of screens, and they are related to the humanist practices they already know and will continue to develop—a maximalist ability to scan, gather, decipher, annotate, contextualize, and fetch patterns in their everyday lives.

These contingencies challenged us as pedagogues, not with their interdisciplinarity, but rather with their hybridity. Instead of switching between computational and literary thinking, we found our students building something new that did not quite fit the conventions of either methodological camp. That novelty might seem unsurprising, especially given Jo Guldi’s remarks in The Dangerous Art of Text Mining that “what is produced in the process [of text mining] is more than an adjustment to old conversations. Something genuinely new is being produced where old fields meet” (Guldi 2023, 13). Yes, their erasures were “new” objects, each being a unique poem and material interaction with “old fields.” But more importantly for their learning, their individual stacks—the layered interactions and processes between humanistic thought and computational logics—were new. Unlike previous lessons in text mining, where we more clearly moved between literary and computational logics as separate stages, the erasure exercise, because of its contingencies, had students developing their own stack. This forced them to choose from the tools of the quarter, in their own sequence, with their own dependencies.

Students had built a hybrid digital humanities stack—here, a hybrid literary text mining stack—in response to their constructionist experiments. Each stack was a unique assemblage of various material and mental tools. If disciplinarity suggests epistemic commitments in the form of full, generalizable, and transferable stacks (e.g., the scientific method or the assemblages of close-reading), and interdisciplinarity suggests a fluency in switching between them, the hybrid stack suggests something more contingent, modular, and granular. Our previous pedagogical efforts in this class had resembled something more like so-called “human-in-the-loop” infrastructure: a sequential and strategic deployment of humanistic reasoning to respond to, correct, or contest the computational. In the erasure lesson, and in reading our students’ work, we found we needed a different model to account for the relationship between computational and humanistic thinking.

Papert proposed such a model when defending Logo from its critics. Papert’s defense of “thinking like a computer” articulates a vision of computational thinking as one element in a toolkit:

People often fear that using computer models for people will lead to mechanical or linear thinking: They worry about people losing respect for their intuitions, sense of values, powers of judgment… The advice “think like a computer” could be taken to mean always think about everything like a computer. This would be restrictive and narrowing. But the advice could be taken in a much different sense, not precluding anything, but making a powerful addition to a person’s stock of mental tools. Nothing is given up in return. (Papert 2020)

Papert’s claim that the computational does not have to preclude anything—that it can exist as one tool among many—helps students make decisions about their own interpretive stacks in alignment with minimal computing principles. When our students have access to a wide stock of mental tools, including the ability to think like computers without using an electronic computer, their ability to assess what they have and what they need expands beyond disciplinary constraints.

Using physical erasure to teach core digital humanities skills builds on the work of Christopher Ohge and Charlotte Tupman, who contend that situating encoding and other computational processes only within the context of the “digital” humanities forecloses on their wider application (Ohge and Tupman 2020). We see affinities in our hybrid approach to computing with Sheila Liming’s vision of the digital humanities as “an evolving continuum of humanities methods and tools” which works to:

unyoke the idea of advancement from that of intellectual abandonment, encouraging students to see the full gamut of skills gleaned within the humanities classroom as mutually reinforcing and related. And it would furnish a platform for the criticism of newness for newness’s sake, from which students might turn their critical attentions not to the devaluation of their own skills and labor but toward those who would see them devalued, defunded, and, yes, destroyed. (Liming 2023)

What we need (to create that platform, to better engage our students, to sponsor broad humanist inquiry, to resist proprietary overreach—to name only a few of our ambitions) is a digital humanities pedagogy attendant to constructionist tenets. This does not have to be “learn by making” in the “you need to produce an object” sense. Rather, it asks us to recognize the many opportunities for minimal computing in all layers of the digital humanities stack.

Notes

  1. In lieu of a substantial literature review on the politics of erasure, and mindful of the politics of absence in such footnotes, we suggest the reader begin with King (2012), a compendium of six erasure poets (including Philip) being interviewed about their process. For an explicit political discussion of erasure, see Sharif (2013). ↑

  2. By stack we mean both the metaphorical shape of the assemblage of skills and processes we use to conduct literary text-mining (in the same sense as Benjamin Bratton’s 2016 The Stack: On Software and Sovereignty) and the more literal, technical sense of the term. The distinction between the metaphorical assemblage and the literal one is relatively small in literary text mining, and really depends on one’s willingness to adjudicate “computer.” ↑

  3. See Remi Kalir and Antero Garcia, “Annotation Aids Learning,” in their 2021 Annotation, for an excellent introduction to the relationship between annotation and learning. ↑

  4. For a bird’s-eye view of constructionism and constructivism aimed at DH, we suggest D’Ignazio and Bhargava 2019. For an introduction to constructionism, see Papert’s Mindstorms: Children, Computers, and Powerful Ideas. ↑

  5. “Erasure” was also considered capaciously beyond the act of linguistic removal; students were prompted to consider anticolonial images in Felipe Guamán Poma de Ayala’s El primer nueva corónica y buen gobierno (~1615) as erasures. ↑

  6. More specifically, we are referring to a widely recognized need in the field of digital humanities to make visible the materialities and politics underlying computation: from the human labor involved in microchip construction to state-sponsored data surveillance initiatives. Erasure makes evident, if only through analogy, how much is rendered invisible from step-to-step. Navigating between the published, finished erasures of others and their own procedural erasures invites students to return to other moments of erasure in the process/stack of the course, including in their own digitization work, in their decisions about what to include/exclude in their metadata assignments (guided by readings from Klein and D’Ignazio’s Data Feminism), and in the choices they make in visualization. ↑

  7. Let us stress that this is context-specific. Graduate classrooms are a different matter, as is any other class in any other context, taught by anyone else to any other group of students. If the learning outcomes of the course were code, then we would teach code. ↑

References

Bessette, Lee Skallerup. 2023. "Digital Redlining, Minimal Computing, and Equity." In Critical Digital Pedagogy in Higher Education. Issues in Distance Education Series. AU Press.

Bratton, Benjamin H. 2016. The Stack: On Software and Sovereignty. The MIT Press.

Craig, Kalani. 2017. "Analog Tools in Digital History Classrooms: An Activity-Theory Case Study of Learning Opportunities in Digital Humanities." International Journal for the Scholarship of Teaching and Learning 11 (1). https://doi.org/10.20429/ijsotl.2017.110107.

Debord, Guy, and Gil Wolman. 1956. "A User's Guide to Détournement." Translated by Ken Knabb. Bureau of Public Secrets. https://www.bopsecrets.org/SI/detourn.htm.

D'Ignazio, Catherine, and Rahul Bhargava. 2019. "Creative Data Literacy: A Constructionist Approach to Teaching Information Visualization." Digital Humanities Quarterly 12 (4).

García Manríquez, Hugo. 2014. Anti-Humboldt: A Reading of the North American Free Trade Agreement. Litmus Press.

Guldi, Jo. 2023. The Dangerous Art of Text Mining: A Methodology for Digital History. 1st ed. Cambridge University Press. https://doi.org/10.1017/9781009263016.

Kalir, Remi, and Antero Garcia. 2021. Annotation. The MIT Press Essential Knowledge Series. The MIT Press.

King, Andrew David. 2012. "The Weight of What's Left [Out]: Six Contemporary Erasurists on Their Craft." The Kenyon Review, November 6. https://kenyonreview.org/2012/11/erasure-collaborative-interview/.

Liming, Sheila. 2023. "How to Teach DH without Separating New from Old." In What We Teach When We Teach DH: Digital Humanities in The Classroom, edited by Brian Croxall and Diane K. Jakacki. Debates in the Digital Humanities 10. University of Minnesota Press. https://dhdebates.gc.cuny.edu/read/what-we-teach-when-we-teach-dh/section/11162808-26fc-4d83-8995-2e1240963aea#ch17.

Machiavelli, Niccolò. 2019. The Prince. Edited by Luca Messarra and Sean Messarra. Undocumented Press. https://archive.org/details/the-prince_UP.

McGrail, Anne B. 2017. "Open Source in Open Access Environments: Choices and Necessities." Paper presented at Minimal Computing Working Group. February 17. https://go-dh.github.io/mincomp/thoughts/2017/02/17/mcgrail-choices/.

Menkman, Rosa. 2010. "Glitch Studies Manifesto." https://amodern.net/wp-content/uploads/2016/05/2010_Original_Rosa-Menkman-Glitch-Studies-Manifesto.pdf.

Ohge, Christopher, and Charlotte Tupman. 2020. "Encoding and Analysis, and Encoding as Analysis, in Textual Editing." In Routledge International Handbook of Research Methods in Digital Humanities, 1st ed. Routledge.

Papert, Seymour. 2020. Mindstorms: Children, Computers, and Powerful Ideas. Basic Books.

Philip, M. NourbeSe. 2008. Zong! Wesleyan University Press.

Phillips, Tom. 2016. A Humument: A Treated Victorian Novel. Final edition. Thames & Hudson.

Risam, Roopika, and Lee Skallerup Bessette. 2024. "Introduction: Minimal Computing and EdTech." Learning, Media and Technology 49 (5): 747–54. https://doi.org/10.1080/17439884.2024.2435200.

Risam, Roopika, and Alex Gil. 2022. "Introduction: The Questions of Minimal Computing." Digital Humanities Quarterly 16 (2). https://www.digitalhumanities.org/dhq/vol/16/2/000646/000646.html.

Sharif, Solmaz. 2013. "The Near Transitive Properties of the Political and Poetical: Erasure." Evening Will Come: A Monthly Journal of Poetics, no. 28 (April). https://web.archive.org/web/20180819040736/https://thevolta.org/ewc28-ssharif-p1.html.

Snelson, Daniel Scott. 2022. "Contingent Reading: A Poetics of the Search." ASAP/Journal 7 (2): 385–407. https://doi.org/10.1353/asa.2022.0025.

Walsh, Brandon. 2023. "The Three-Speed Problem in Digital Humanities Pedagogy." In What We Teach When We Teach DH: Digital Humanities in the Classroom, edited by Brian Croxall and Diane K. Jakacki. Debates in the Digital Humanities 10. University of Minnesota Press. https://dhdebates.gc.cuny.edu/read/what-we-teach-when-we-teach-dh/section/fdb7ea0a-5c9e-404a-ab9d-3a24fe6de61c#ch18.

Wiggins, Grant P., and Jay McTighe. 2008. Understanding by Design. Expanded 2nd ed. Association for Supervision and Curriculum Development.

About the Authors

Luca Messarra is a media scholar, poet, and book artist working as a PhD candidate in English at Stanford University. He researches historical and contemporary publishing communities and technologies of literary production. His dissertation, “Literature, on Demand!: Book Production, Cultures, and Fictions after 1960,” builds a literary history and criticism of print-on-demand publishing. He is currently a visiting scholar in the Black Studies department at the City College of New York, where he teaches courses on poetry, video games, and critical theory.

Nichole Nomura is an Assistant Professor of Public Humanities in the Department of English at the University of Wyoming. She researches digital humanities pedagogy and how literature teaches/is taught, using methods from the digital humanities, literary criticism, and the educational social sciences.

This entry is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International license.
