Category Archives: Technology

Writing, memory and freedom

Is this memory?

One of the reasons we write things down is to help us remember. That seems clear; I don’t want to forget your phone number or what I was supposed to get from the grocery store, so I write it down.

Some ancient philosophers worried that writing would wipe away our capacity for organically remembering things. Even today, writers such as Nicholas Carr, author of The Shallows: What the Internet Is Doing to Our Brains, make analogous arguments.

But those worries seem secondary to the problem that writing (and information systems generally) creates the sense that all memories have been recorded and are retrievable.

First of all, this is an illusion. It is impossible to record everything, and gaps are inevitable. Frighteningly, we may not notice when things are missing, because we fill in the gaps through inference. We know this in everyday life as “jumping to conclusions.” The belief that everything can be recorded seems to come from the 20th-century enchantment with computing, whose metaphors have permeated many aspects of life.

Psychologist Arthur Glenberg argues against this position in his 1997 paper What Memory Is For. The idea of total recall—that our brains are recording everything we ever experience but that it’s locked away and can be coaxed out through psychotherapy—is a myth, Glenberg argues. Rather, Glenberg advances a view of memory as a facilitator of action in our environments, and something that is not totally accurate (and shouldn’t be!). More recently, Julia Shaw makes similar points in her book The Memory Illusion, in which she goes over many ways that our memories can betray us. The pernicious thing is not when we forget—it’s when we misremember things and think we’re correct.

Second, remembering may not always be the best thing. Also in 1997, Geof Bowker published a paper on the importance of organizational forgetting. Organizations need to manage a lot of information and knowledge. Normally we think of this only in terms of remembering, but organizations also need to think about forgetting—especially that which no longer serves.

In the everyday lives of individuals, too, memories can sometimes be straitjackets. Sure, it’s easy to argue that forgetting an unpleasant memory could be problematic, but the ability to forget aspects of one’s past is also important for moving forward.

Modern technology has made this so much more complicated. Consider, for example, that you once had an abusive lover. You’ve since broken up, but remembering that person feels more painful than helpful. In the days of yore, you could simply burn all the photographs of the two of you and that would be that. But today, traces of your past relationship may be strewn about Facebook forever. For another sort of example, the work of Oliver Haimson on the intersection of gender transition and social media also presents a fascinating case.

The point is, the (im)possibility of forgetting becomes a crucial ethical issue today. Philosopher Luciano Floridi writes in The Ethics of Information:

Recorded memories tend to freeze the nature of their subject. The more memories we accumulate and externalize, the more narrative constraints we provide for the construction and development of personal identities. Increasing our memories also means decreasing the degree of freedom we might enjoy in defining ourselves. Forgetting is also a self-poietic [creative] art… Capturing, editing, saving, conserving, and managing one’s own memories for personal and public consumption will become increasingly important not just in terms of protection of informational privacy… but also in terms of a morally healthy construction of one’s personal identity.

We should remember that memory slips are not all bad—both mental forgetting and missing written information. And we should also be more humble about what we do “remember,” because it may very well be wrong.

Writing in hyperhistory

There’s a well-known distinction between prehistory and history. History, we say, is everything after the invention of writing, which first happened a little over 5,000 years ago.

But we shouldn’t think of history as referring to a period in the earth’s development, but rather as a descriptor of how people live. That is, even after writing was first invented, many societies still lived prehistorically—without writing. Indeed, writing was invented many times, independently, in different parts of the world (China, Sumer, Egypt, Mesoamerica…). Even today, there are some remote tribes that live prehistorically.

But is the prehistory/history distinction enough? The philosopher Luciano Floridi posits that many of us today have moved into a different way of life, which he calls hyperhistory: “societies or environments where ICTs [information and communication technologies] and their data processing capabilities are the necessary condition for the maintenance and any further development of societal welfare, personal well-being, as well as intellectual flourishing.”

We can summarize the distinction in this way:

  • Prehistory – No ICTs
  • History – Society is enriched by ICTs that store and transmit information
  • Hyperhistory – ICTs have overtaken other technologies and now society depends on ICTs to function

Floridi briefly discusses the concept of hyperhistory and what it means for warfare in a Computerphile video.

Across his many works on the topic, Floridi discusses how hyperhistory plays out in economics, politics, warfare, information quality, and other areas. The bottom line is that hyperhistory is new, and we are only beginning to grapple with its immense challenges. Floridi writes:

Processing power will increase, while becoming cheaper. The amount of data will reach unthinkable quantities. And the value of our network will grow almost vertically. However, our storage capacity (space) and the speed of our communications (time) are lagging behind. Hyperhistory is a new era in human development, but it does not transcend the spatio-temporal constraints that have always regulated our life on this planet.

And elsewhere:

It may take a long while before we shall come to understand in full such transformations, but it is time to start working on it.

When it comes to writing, what does hyperhistory mean? That’s a big question, but we can sketch some initial thoughts:

  • When there’s more information, there’s less time to devote to any given piece of it. Not only is there more to read, but we have to spend more time organizing it and searching for it. Accordingly, writing is taking different forms: shorter sentences, more lists, etc.
  • Non-text forms of information are proliferating. Big data visualizations, videos, etc.
  • Things change fast, and so writing on many topics quickly obsolesces.
  • Text is being processed more by other technologies than by humans. Machines cannot understand in the same way humans can—their “reasoning faculties” are quite different from ours. What does it mean to think of writing for the audience of both humans and machines?

Ah, things to think about…

Documenting the self

I’ve been hard at work on my dissertation proposal—I’m studying the processes of artistic self-portraiture—and I’ve been thinking about self-documentation. In modern society we seem to be compelled to write about ourselves. We make resumes and CVs, and we write bios for our social media profiles, which are becoming central for everything from everyday communication to dating and business. There are, of course, also many non-verbal ways in which we document ourselves, which is a focus of my dissertation.

The later work of Michel Foucault suggests that self-documentation is not new. On the contrary, many in Ancient Greece and Rome apparently kept hupomnēmata, or notebooks “to collect what one has managed to hear or read, and for a purpose that is nothing less than the shaping of the self.” These were fragmentary notebooks, but their result was not merely a collection of disjointed scraps; rather, they contributed to the formation of a new whole, and of the writer as well. According to Foucault, the purpose of the hupomnēmata was to care for the self, which was an ancient directive. (Foucault laments that today we only recall know thyself, having forgotten about care for thyself.) As Foucault writes, “writing transforms the things seen or heard ‘into tissue and blood.’” People regularly returned to their hupomnēmata for nourishment.

The function of the hupomnēmata is quite different from the modern genre of autobiography, whose purpose is not to care for the self but to care for others. Autobiographies and many other self-documents are packaged for sale (in various senses), but the hupomnēmata were intensely private. They were more about the process than the product.

Today, some of us keep hupomnēmata. Mine, if you could call it that, is in Evernote. But I think this is a rare practice. On the other hand, many people cultivate something like hupomnēmata in their social media feeds. A Twitter feed, for instance, presents a seemingly disjointed collection of thoughts and snippets from the world, and it seems to be both like and unlike hupomnēmata. On Twitter (and other social media, or even ICT-made self-documents in general), are posts revisited as a means of self-care? Is the primary audience the self or another?