Thoughtfully engaging modernity

Jonathan Franzen’s diatribe in The Guardian, a preview of his forthcoming book The Kraus Project, is provocatively entitled “What’s wrong with the modern world?” Trotting out many standard objections to techno-utopianism, he particularly bemoans the overuse of Twitter and of Apple products generally, and even calls out Jeff Bezos as one of the four horsemen of the Apocalypse. But it is not the Apocalypse of the Bible to which he refers, but rather the more personal apocalyptic crises that we all experience. He was introduced to this idea by the early 20th century Viennese cultural critic Karl Kraus, also known as The Great Hater, an individual with whom Franzen has been obsessed for a couple of decades. He explains that,

Kraus’s signal complaint – that the nexus of technology and media has made people relentlessly focused on the present and forgetful of the past – can’t help ringing true to me. Kraus was the first great instance of a writer fully experiencing how modernity, whose essence is the accelerating rate of change, in itself creates the conditions for personal apocalypse. Naturally, because he was the first, the changes felt particular and unique to him, but in fact he was registering something that has become a fixture of modernity. The experience of each succeeding generation is so different from that of the previous one that there will always be people to whom it seems that any connection of the key values of the past has been lost. As long as modernity lasts, all days will feel to someone like the last days of humanity.

Alexander Nazaryan is a bit dyspeptic himself in response to Franzen’s take on modernity, arguing that Franzen’s cri de coeur offers critique sans cure (notably, Nazaryan offers no remedy either). Michael Jarvis is a bit more sympathetic to the neo-Luddism of Thomas Pynchon in his review of Bleeding Edge, the new novel by the famously reclusive author. Unlike Franzen, who tells us “Not only am I not a Luddite, I’m not even sure the original Luddites were Luddites,” Pynchon is unabashed about his views on modernity. Just as Franzen glorifies Kraus as The Great Hater, Pynchon is on the record exalting Ned Lud, the original Luddite, as a Badass. Pynchon argues that the Luddite movement, which took its name from Ned Lud’s machine-smashing of 1779, was not a response to technology per se but class war, a reaction to the disenfranchisement of poor workers by modern machines. While there is a kernel of truth to his assertion, the mastery over technology wrought by the Industrial Revolution represents a cultural shift that amounts to more than just concern for jobs. Tellingly, it was only a few decades later that Mary Shelley sat by the fireside at Villa Diodati, weaving the story that was to become the mainstay of every subsequent backlash against technology: Frankenstein, or the Modern Prometheus.

Frankenstein still touches a chord, but in today’s world it is decidedly unwise to rail against modernity. Not only will you be pilloried in the press, but even if people buy the argument they will remain in thrall to modern technology – it is just too seductive to ignore. Moreover, the rants miss the point. Personally, I want to engage with modernity and live a rich, juicy life that is authentic and true. In short, I want to flourish as a modern. The better question asks how we might embrace modern technology and do it thoughtfully.

And here the best lesson comes from the most surprising of sources: the Amish. This past January, I had a chance to have an extended conversation with Jamey Wetmore in his office at the Consortium for Science, Policy & Outcomes at Arizona State University, where he enlightened me about Amish attitudes towards technology. According to Jamey’s studies, the Amish are not anti-technology. Rather, they think often and deeply about how technology affects their values. For the Amish, these are closely interwoven with their religion, and so they decline to adopt technologies that conflict with their religious values. But the choice is active – they gather together, consider the pluses and minuses, and then collectively decide on a course of action. Those with a more secular take on the world (me!) may harbour a different set of values, but values we have, and it seems to me that it is worth following the example of the Amish and asking: how does modernity impact my value system? Posed in this way, the question naturally leads to answers that lack the crankiness of Franzen and Pynchon’s tirades, while providing a way to engage that is more thoughtful than the techno-utopian musings of their interlocutors: weigh your engagement with technology against your own personal values.

Ah, but you might say that knowing something is a problem and doing something about it are two different things. Small steps are often the most effective way to modify behaviour, and here is one that might help. A common complaint about modern life is that in the middle of a conversation, someone glances at their computer or smartphone (are they even different anymore?), checking for what can best be described as I-don’t-know-what-but-something-might-be-new. The person who looks away is distracted; the one who was ignored is, well, ignored. Everyone acknowledges the problem. And everyone does it from time to time. So for the next three days, just do this: notice. Don’t chuck your technology out the window, and definitely don’t beat yourself up when you sneak a peek at some digital screen in the middle of a conversation. You might try practicing what Buddhists suggest doing with any behaviour you want to forestall – get curious about it. What was being said when you looked away? How do you think the other person felt? How did you feel about the whole thing? Most of all, ask yourself whether your actions align with your values. If you want the exercise to really bear fruit, make a note each time it happens – it could be on paper, or in some electronic file, but the simple act of jotting down what was going on when your mind drifted from present to virtual will help change your brain’s ingrained pattern of behaviour. At first it will be hard, awkward, and maybe even a bit uncomfortable. You will start out catching yourself checking your whatever in the middle of a conversation as frequently as before, but by the third day it will become a rarity. And you will be better for it.
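
For those who prefer the electronic file, here is one way the note-taking might look – a minimal sketch in Python, where the script name, log filename, and prompts are placeholder choices of mine rather than a prescribed tool:

```python
# distraction_log.py – a minimal sketch of the "make a note each time it happens" exercise.
# The log filename and the prompts are illustrative assumptions, not a recommended app.
from datetime import datetime
from pathlib import Path

LOG_FILE = Path("distraction_log.txt")  # hypothetical location for the notes


def log_distraction() -> None:
    """Ask the questions from the exercise and append the answers to a plain text file."""
    context = input("What was being said when you looked away? ")
    other = input("How do you think the other person felt? ")
    you = input("How did you feel about the whole thing? ")
    values = input("Did your actions align with your values? ")
    stamp = datetime.now().isoformat(timespec="minutes")
    with LOG_FILE.open("a", encoding="utf-8") as f:
        f.write(f"{stamp}\n")
        f.write(f"  what was being said: {context}\n")
        f.write(f"  how they felt: {other}\n")
        f.write(f"  how I felt: {you}\n")
        f.write(f"  aligned with my values? {values}\n\n")


if __name__ == "__main__":
    log_distraction()
```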

Use it or lose it

As the technology of memorializing dialogue (in stone, no less) came into vogue, Socrates famously admonished his protégé Phaedrus on its dangers: if people are able to write everything down, their ability to remember what was said will diminish. Plato, being an early version of an early adopter, memorialized the debate, and that is why the apocryphal story is with us today. But even without a grounding in modern neurobiology, Socrates had a valid point: the plasticity of our brains is such that the less we use them for a given function, the more our ability to carry out that function is impaired.

This becomes a tricky issue when thinking about the world in which we live today. In a thoughtful essay over at The Atlantic, Evan Selinger reviews a number of arguments for and against the use of ‘apps’ to make us, as he puts it in his title, a better person. What Evan is particularly concerned with are digital willpower enhancements: the suite of technologies that have been developed to help us do everything from not being distracted by a tweet to not eating more than we would like.

Music on (or projecting off?) your Mind

The Multimodal Brain Orchestra, led by the SPECS research group, recently gave its inaugural performance. Oh, and there were no instruments.

“…four performers were fitted with caps littered with electrodes that take a real-time electroencephalograph [EEG] – an image of the brain’s electrical activity.

“There is a first violin, a second violin and so on, except that instead of violins they are brains,” says Dr Mura.

The graphs of those brain waves are projected onto one of two large screens above the orchestra. The performers launch sounds or affect their frequencies and modulations based on two well-characterised effects seen in EEGs: the steady-state visually evoked potential (SSVEP), and the so-called P300 signal.

When expectation is fulfilled, 300 thousandths of a second later, a signal known as the P300 appears in the EEG.

In the Multimodal Brain Orchestra, the P300 signal is registered – with a dot demarcating it on the EEG trace projected to the audience, so that they can see the effect of the performer’s thought – in turn launching a sound or recorded instrument.” (links added).
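
To make the mechanics a little more concrete, here is a toy sketch of the general idea of EEG-triggered sound: watch the window roughly 300 ms after a stimulus for a P300-like positive deflection, and launch a sound when one appears. This is not the orchestra’s actual software (the article gives no implementation details); the sampling rate, detection threshold, and trigger_sound stand-in are all illustrative assumptions:

```python
# A toy illustration of EEG-triggered sound, in the spirit of the description above.
# The sampling rate, threshold, and trigger_sound() stand-in are assumptions for the
# sketch; the Multimodal Brain Orchestra's real pipeline is not documented here.
import numpy as np

SAMPLING_RATE_HZ = 250           # assumed EEG sampling rate
P300_WINDOW_S = (0.25, 0.45)     # look for the deflection roughly 300 ms post-stimulus
AMPLITUDE_THRESHOLD_UV = 5.0     # illustrative detection threshold, in microvolts


def trigger_sound(label: str) -> None:
    """Stand-in for whatever launches a sound or recorded instrument."""
    print(f"launching sound: {label}")


def detect_p300(epoch_uv: np.ndarray) -> bool:
    """Return True if a post-stimulus epoch (1-D array of samples in microvolts,
    starting at stimulus onset) contains a P300-like positive peak."""
    start = int(P300_WINDOW_S[0] * SAMPLING_RATE_HZ)
    stop = int(P300_WINDOW_S[1] * SAMPLING_RATE_HZ)
    window = epoch_uv[start:stop]
    return window.size > 0 and float(window.max()) > AMPLITUDE_THRESHOLD_UV


if __name__ == "__main__":
    # Synthetic epoch with a positive bump around 300 ms, standing in for real EEG data.
    t = np.arange(0.0, 0.6, 1.0 / SAMPLING_RATE_HZ)
    epoch = 8.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.03 ** 2))
    if detect_p300(epoch):
        trigger_sound("first violin (brain #1)")
```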

While research and exciting activities of this kind re-ignite important and fascinating dialogue around consciousness, I was particularly intrigued by this quote from Dr Anna Mura, a biologist who is also the producer of the project:

“What we want to show here is the use of your brain without your body. Embodiment – we should get rid of it sometimes.”

So, I have to ask, somewhat rhetorically: when do you ever use your body without your brain? Sure, there is autonomic activity of certain organs, but without brain function they would cease to work. Now, I understand where Dr Mura is going with this statement, and it made me think of the notion of the extended mind, a topic in neurophilosophy and the philosophy of mind which, according to Neil Levy, has “far reaching” implications for neuroethics. Broadly, the extended mind thesis holds that mental states extend beyond the skulls of the brains that produce them. Conveying emotion through an instrument in a musical performance, for instance, is on this view an instance of the extended mind at work. According to Levy, the extended mind thesis “alters the focus of neuroethics, away from the question of whether we ought to allow interventions into the mind, and toward the question of which interventions we ought to allow and under what conditions.”

Further, Grant Gillett would likely disagree that “we should get rid of” embodiment – in fact, it is foreseeable that he would consider the idea impossible, and would probably argue that dis-embodiment can occur only where the brain is no longer able to embody the person, as in a condition such as locked-in syndrome. Gillett would hold that human subjectivity is inscribed in the brain by a history of neurological, social, psychological, environmental, and other processes, which together embody the meaning of being a human being-in-the-world-with-others.

Link to the BBC article here.

Hat tip to Ryan Nadel for “drawing my attention” to the piece.

- daniel buchman