
Original texts: Thinking about the Bible and beyond

I recently read the book Misquoting Jesus: The Story Behind Who Changed the Bible and Why, by Bart D. Ehrman, on my brother’s recommendation. The book traces the transformations—intentional and otherwise—that gave us the Bible that we have today. Ehrman discusses how copyists over the ages altered the text to, for instance, obscure the role of women and harmonize the accounts of Jesus in the New Testament. The book problematizes the situation for those who favor a literal interpretation of the Bible: If we believe the Bible is God’s word, or at least divinely inspired, then it’s a major challenge that we don’t have any of the original texts.

(As an aside, Ehrman doesn’t seem to appreciate a narrative, pathic reading of the Bible as a tome of mythological force. He himself seems to fall prey to the literal reading of the Bible that he denounces. For a literate, rather than literal, view of the Bible, see Rob Bell’s recent book What is the Bible?)

“The original texts”—what does that mean? While this maybe ought to be a straightforward question, it is anything but. Ehrman describes how, for example, Paul’s letter to the Galatians (part of the New Testament canon) was most likely originally dictated and immediately existed in multiple manuscript copies that were sent out (to the “Galatians,” a demonym that is itself a bit ambiguous) and then copied further. Our earliest version of Galatians is from over a hundred years after these “originals.” Since then, the letter has been copied and transformed by any number of hands and cultures, generating a family tree of differences.

The printing press, when it came along in the 15th century, lent some stability to textual reproduction. But even with print, it’s no easier to say what the “original” is. Take Shakespeare, for instance: there’s the perduring question of the original scripts (and also of pronunciation!). In any printed text, which is the original? Is it the first edition, which may itself span multiple printings? The printer’s proof? The second edition, which corrects the printer’s errors? The author’s final manuscript (if such a thing exists)?

And today, when many documents are “digital native,” our situation is in many ways more like a scriptorium than a printing house. Getting to the “original” is as devilish a task as ever. Think of the quotations we come across that circulate with slight variations and are attributed to any number of people, and whose actual origins we can’t pin down.

All over, we’re reaching for originals. What we don’t seem to ask is why. Why do we care about the original? It’s sure to be a case-by-case question. In some disputes, discerning what the original document actually said is of central importance. But in cases such as the Bible, I am tempted to conclude that it’s irrelevant.

Writing, memory and freedom

Is this memory?

One of the reasons we write things down is to help us remember. That seems clear; I don’t want to forget your phone number or what I was supposed to get from the grocery store, so I write it down.

Some ancient philosophers worried that writing would wipe away our capacity for organically remembering things. Even today, writers such as Nicholas Carr, author of The Shallows: What the Internet is Doing to Our Brains, make analogous arguments.

But those worries seem secondary to the problem that writing (and information systems generally) creates the sense that all memories have been recorded and are retrievable.

First of all, this is an illusion. It is impossible to record everything, and gaps are inevitable. Frighteningly, we may not notice when things are missing, and we fill in the gaps through inference. We know this in everyday life as “jumping to conclusions.” The notion that we can record everything seems to come from the 20th-century enchantment with computing, whose metaphors have permeated many aspects of life.

Psychologist Arthur Glenberg argues against this position in his 1997 paper What Memory is For. The idea of total recall—that our brains are recording everything we ever experience but that it’s locked away and can be coaxed out through psychotherapy—is a myth, Glenberg argues. Rather, Glenberg advances a view of memory as a facilitator of action in our environments, and something that is not totally accurate (and shouldn’t be!). More recently, Julia Shaw makes similar points in her book The Memory Illusion, in which she goes over many ways that our memories can betray us. The pernicious thing is not when we forget—it’s when we misremember things and think we’re correct. You can watch an animated abstract for The Memory Illusion here:

Second, remembering may not always be the best thing. Also in 1997, Geof Bowker published a paper on the importance of organizational forgetting. Organizations need to manage a lot of information and knowledge. Normally we think of this only in terms of remembering, but organizations also need to think about forgetting—especially that which no longer serves.

In the everyday lives of individuals, too, memories can sometimes be straitjackets. Sure, it’s easy to argue that forgetting an unpleasant memory could be problematic, but the ability to forget aspects of one’s past is also important for moving forward.

Modern technology has made this so much more complicated. Consider, for example, that you once had an abusive lover. You’ve since broken up, but remembering that person feels more painful than helpful. In the days of yore, you could simply burn all the photographs of the two of you and that would be that. But today, traces of your past relationship may be strewn about Facebook forever. For another sort of example, the work of Oliver Haimson on the intersection of gender transition and social media presents a fascinating case.

The point is, the (im)possibility of forgetting becomes a crucial ethical issue today. Philosopher Luciano Floridi writes in The Ethics of Information:

Recorded memories tend to freeze the nature of their subject. The more memories we accumulate and externalize, the more narrative constraints we provide for the construction and development of personal identities. Increasing our memories also means decreasing the degree of freedom we might enjoy in defining ourselves. Forgetting is also a self-poietic [creative] art… Capturing, editing, saving, conserving, and managing one’s own memories for personal and public consumption will become increasingly important not just in terms of protection of informational privacy… but also in terms of a morally healthy construction of one’s personal identity.

We should remember that lapses of memory are not all bad, whether mental forgetting or gaps in the written record. And we should also be more humble about what we do “remember,” because it may very well be wrong.

Thinking the relationship between writing and speech

An example of quipu, a knot-based information system used by the Incas. This could be considered a form of proto-writing.

What is writing, exactly? That question began to fascinate me as I worked on my master’s degree (in Spanish linguistics). There are a few posts on this blog on the topic (for example), but it’s a question that I haven’t yet grown tired of.

On the surface, we think of writing as a simple representation of speech. But that may not be quite right. That view would imply that the whole history of writing, from pictographs and other early information systems (flags, knots, etc.) onward, was teleological from the start: a search for the “correct” way to represent a spoken language that was already fully understood.

A 1996 paper by David R. Olson, “Towards a Psychology of Literacy,” challenges the assumption of writing-as-representation and introduces instead a view of writing as a model for language.

Olson traces the evolution of writing in this light, discussing the movement from representing tokens to representing words (towards the abstract). This movement occurred as syntax (what you might loosely call “grammar”) was introduced and began to complexify. With syntax, writing came to shed light on the structures of speech. The next movement was from thinking of whole words to thinking of units of sound, and from there the development of writing continued to facilitate the analysis of language in many ways—on the level of sentence meaning, for instance.

The cognitive changes that go along with the development of writing and literacy (discussed, for instance, by Walter Ong) are due to this mode of analysis. Among the changes Olson discusses are a reduction in the felt magic of symbols and the transformation of words and sentences into objects of contemplation and philosophy.

In summation, Olson writes:

In this way, writing systems, rather than transcribing a known, provide concepts and categories for thinking about the structure of spoken language. The development of a functional way of communicating with visible marks was, simultaneously, a discovery of the representable structures of speech.

On this view, learning to read is not simply a matter of learning the symbols that correspond to already-understood units of language, as is often assumed in pedagogy. Rather, it is a metalinguistic activity—it involves the discovery of those very units. Learning to read is essentially learning to tune in to your language and analyze it.

Thus writing is, by its very nature, a reflective activity. And so we have further grounds for why the practice of writing is conducive to self-reflection (what can be called the hermeneutic capacity of writing). As historian Lynn Hunt says, writing leads to thinking.