At the end of 2016, Gerald Lynch, my editor at TechRadar, asked me to write a feature about how technology is changing what we remember. This was one of my favourite pieces to research – I love getting to pester futurists. Read it here or below.
How technology is changing what we remember
I’m supposed to be making sure I can use the term ‘down with the fourth wall’ in the context I want to. But within a few seconds I’ve found an article titled ‘25 Classic Moments When Movies Broke the Fourth Wall’ and within a few more I’m learning that The Big Short enlisted Anthony Bourdain and a metaphor around a seafood stew to explain what a collateralized debt obligation is.
That’s great. Within a few more seconds I’ve saved all of those things, and I can continue on my original path, finding out more about fourth-wall usage.
If I were reading a book, it would be the same as using a contents page to search for the relevant chapter, finding a more compelling chapter title, skim-reading that for a few pages, and then returning to the task at hand.
This way of learning hasn’t changed all that much; yet there is a growing concern in contemporary neurological studies around what is changing. With vast plains of external memory now available, can the human brain afford to be purposefully forgetful?
Gone are the days when we needed to remember phone numbers, house numbers, birthdays or appointments. The details we need access to require far less context, and virtually no mapping out of information. Your brother’s house number, the names of your girlfriend’s family members in case they pick up the home phone, even your Nan’s birthday – forgetting any of these will no longer leave you stranded in a social quagmire (although you should probably try to remember that last one).
The reliance that we place on our brains to accurately recall information is in decline, while the bond we form with technology that can outsource this information strengthens.
Google Effects on Memory: Cognitive Consequences of Having Information at Our Fingertips, a landmark 2011 report published in the journal Science by Betsy Sparrow, Jenny Liu and Daniel Wegner, found that “[The Web] is an external memory storage space, and we make it responsible for remembering things”.
In one of Sparrow’s studies, two groups of undergraduates were given trivia statements. One group were told they could retrieve the information later on their computer, while the other were told they wouldn’t be able to check back. The first group had worse recall than the second, suggesting that our brains are learning to disregard information found online.
This effect, as with the plasticity of brains, becomes stronger each time we experience it. The more we turn to Google for our answers, the less likely we are to retain what we find there. Instead of remembering a fact, Sparrow’s findings suggest that internet users have learned to remember how to find a fact.
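Sparrow’s distinction – remembering where to find a fact rather than the fact itself – maps loosely onto the difference between storing a value and storing a route back to it. A toy sketch in Python, with every name and fact purely hypothetical:

```python
# Toy analogy for the Google effect (all names and facts hypothetical).
# "Internal" memory stores the fact itself; "route" memory stores only
# a way to look the fact up again in an external store.

external_store = {  # stands in for the web
    "capital_of_mongolia": "Ulaanbaatar",
}

# Remembering the fact: the answer lives in your head.
internal_memory = {"capital_of_mongolia": "Ulaanbaatar"}

# Remembering how to find the fact: only the lookup route lives in your head.
route_memory = {
    "capital_of_mongolia": lambda: external_store["capital_of_mongolia"],
}

print(internal_memory["capital_of_mongolia"])  # direct recall
print(route_memory["capital_of_mongolia"]())   # recall via the external store
```

Both calls surface the same answer, but the second only works while the external store is reachable – which is roughly the trade-off Sparrow’s undergraduates were making.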
The good news is that our brains have never been that adept at remembering. Instead, we’ve historically used a technique called transactive memory, a term Sparrow’s peer Wegner proposed to signify the group mind (or hive mind, if you’re insufferable).
A transactive memory system is a mechanism through which groups collectively encode, store and retrieve knowledge. According to Wegner, a transactive memory system consists of the knowledge stored in each individual's memory, but mixed in with metamemory containing information regarding a teammate's domains of expertise. Effectively, you know what you know, and you know what kind of thing the others know.
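Wegner’s two-part structure – each member’s own knowledge, plus a shared metamemory of who knows what – can be sketched as a tiny routing table. A toy model, with all names and facts invented for illustration:

```python
# Toy model of a transactive memory system (all names and facts hypothetical).
# Each member holds their own knowledge; the group shares a metamemory that
# records only each person's domain of expertise, not the facts themselves.

class Member:
    def __init__(self, name, knowledge):
        self.name = name
        self.knowledge = knowledge  # what this person actually knows

members = {
    "Alice": Member("Alice", {"film": "The Big Short came out in 2015"}),
    "Bob": Member("Bob", {"finance": "A CDO bundles loans into tranches"}),
}

# Metamemory: who to ask for each domain.
metamemory = {"film": "Alice", "finance": "Bob"}

def group_recall(topic):
    """Route a question to whoever the group believes knows that domain."""
    expert = members[metamemory[topic]]
    return expert.knowledge[topic]

print(group_recall("finance"))  # Bob answers the finance question
```

No one member holds everything; the group “remembers” by knowing where each piece of knowledge lives – swap a search engine in for Alice or Bob and you have the modern version.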
A transactive memory system used to mean relying on your local community, whether that was your family or your pub quiz team. Now search engines, Evernote and smartphones are replacing personal contacts as our trusted external memory sources.
How many times have you been at a pub quiz aching to reach for your phone for a quick Google check? That’s your brain deciding that a search engine is a more reliable 'phone a friend' than your best mate.
It’s a subject that Nicholas Carr explores at length when deliberating over the future of the adult brain in his book The Shallows, published in 2010.
He argues that the internet is exposing the human brain to mind-altering technology, the levels of which haven’t been experienced since the printing press.
Our brains, according to Carr, are being rewired so they can only accommodate superficial understanding. Carr saw the ease of online searching and the distractions of browsing through the web as possibly limiting our capacity to concentrate.
However, with a few years' breathing space from The Shallows it’s actually beginning to look like our brains, plasticity and all, deserve a bit more credit.
After all, they were never specifically ‘programmed’ to read, to make it into town for bang-on 11am, to write, or to use a printing press – we’ve just been able to adapt. In spite of Carr’s claims that we’re drifting into the shallows of comprehension, it’s likely that, inside our heads, something much more complex is happening.
Contrary to the assertions Carr made in his 2008 article Is Google Making Us Stupid?, futurist Jamais Cascio argued in a 2009 issue of The Atlantic that technology is actually making us smarter. Cascio made the case that the array of problems facing humanity will turn the fight for survival into an intelligence race:
“Most people don’t realize that this process is already under way,” he wrote. “In fact, it’s happening all around us, across the full spectrum of how we understand intelligence. It’s visible in the hive mind of the internet, in the powerful tools for simulation and visualization that are jump-starting new scientific disciplines, and in the development of drugs that some people (myself included) have discovered let them study harder, focus better, and stay awake longer with full clarity.”
It’s now been five years since the Google Effect first made headlines. The idea that the internet can change how we think is no longer new or revolutionary. But the next generation of apps, building on these findings with promises of total recall, is still evolving – and you can bet our brains are too.
Apps like Evernote and team collaboration tools such as Slack and Trello free up the ridiculous amount of time it can take to track down pesky snippets of information by making it intuitive to find them. Since its inception in 2008, note-taking service Evernote has been busy building a base of more than 100 million users who are probably best described as converts, such is their commitment to the Evernote way of never forgetting.
Slack, founded in 2013, and Trello, launched in 2011, are often mentioned in the same breath: Slack stands for Searchable Log of All Conversation and Knowledge, while Trello is a web-based project management system. These apps can be seen as the direct descendants of Google searches, taking those initial instincts to bookmark or even (lest we forget) physically write down entire URLs in a separate notebook to return to later. Although each app serves different purposes, their end goal is the same: increase productivity and reduce time spent remembering.
The numbers are persuasive. Slack says its users report 25 percent fewer meetings, a 48 percent reduction in internal emails and a 32 percent increase in productivity.
James Sherrett, Slack’s senior manager of accounts, said the changes Slack makes to its users’ memories are very specific to different individuals, but he added: “One of the things Slack users consistently tell us is that having Slack for all their team communication is like having a searchable, external super brain. They can trust that anything they need to know is in Slack – it's all pulled together into one place and searchable.”
I asked Sherrett if he thinks there's a risk of losing other skills when we don’t need to remember as often. He told me: “Rote memorization used to be a skill taught in schools and valued in adults. Being able to recite a speech from memory or cite a fact was considered a marker of intelligence.
“But that's changed. We no longer teach rote memorization in schools. We teach ways to recall and search for information. This change has coincided with the increasing use of computers to store and manage information. And computers are very good at storing and managing information, and our brains are very flexible in how they adapt to get the information they need.”
There it is again – that emphasis on storing the memory away, instead of keeping it just to hand. I turned to Cascio to find out what he makes of the rise of Evernote, Slack, Trello et al, and whether they are making subtle or seismic changes to our memory.
“The impact of these apps is an interesting dilemma, and one without an obvious ‘right’ answer,” Cascio said.
“Precision of memory is probably the greatest advantage these apps have over human ‘meat’ memory – our brains are notoriously bad at remembering the fiddly little details of things, and it’s commonly understood that when we remember something, our brains are functionally remembering the last time we remembered it, allowing errors to creep in. With an app, the accurate details are preserved.
“There is one area where our evolved biological brains still excel over digital memories: inference-based recall. It’s the ‘that reminds me of something…’ moment – the new recollection may not be directly related to what you were thinking about, but contains some non-obvious triggers for a link.
“This is all moving towards a world where recollection from a memory locker app wouldn’t require an active search. Imagine an app that pays attention to what you’re writing or saying and offers (in a non-obtrusive way) relevant items from your stored memories. Something like this would really serve as a brain co-processor, and not just a data dumpster.
"If it works, it would be like an extension of your own mind; if it doesn’t, it would be like Clippy with a power rating of over 9,000. ‘You appear to be engaged in a romantic encounter. Here’s a list of all of the times you’ve said the wrong thing during a make-out session’.”
Following the release of her study in 2011, Sparrow emphasised in interviews that “memory is so much more than memorisation”. She argued that this shift away from memorising may ultimately help people improve their comprehension and become better learners. The Google effect may allow us to free up more space on our internal hard drives, and focus on processing as opposed to memorising.
When we're no longer expected to remember birthdays, postcodes, kings and queens, or even the century a country was founded in, are we saying goodbye to depth and fluidity? Are the days of serendipitous thoughts, malapropisms and parallel thinking over? Or does it leave our fervently human brains with more room for these very freedoms?
Certainly, all these promises to Keep Track Of Everything and Never Forget (claims made by Trello and Evernote respectively) can feel tempting. But for anyone who's grown up straddling two centuries, there’s something inherently human in being forgetful. While memories of a time without smartphones or Wi-Fi grow fainter, there’s something wonderfully comforting in not remembering every detail. Because, after all, we’re only supposed to recall what was truly memorable.