Is boredom good for us?

Sandi Mann’s The Upside of Downtime and Marc Wittmann’s Felt Time: The psychology of how we perceive time, reviewed for New Scientist, 13 April 2016.

 

VISITORS to New York’s Museum of Modern Art in 2010 got to meet time, face-to-face. For her show The Artist is Present, Marina Abramovic sat, motionless, for 7.5 hours at a stretch while visitors wandered past her.

Unlike all the other art on show, she hadn’t “dropped out” of time: this was no cold, unbreathing sculpture. Neither was she time’s plaything, as she surely would have been had some task engaged her. Instead, Marc Wittmann, a psychologist based in Freiburg, Germany, reckons that Abramovic became time.

Wittmann’s book Felt Time explains how we experience time, anticipate it and remember it, all in the same moment. We access the future and the past through the 3-second chink that constitutes our experience of the present. Beyond this interval, metronome beats lose their rhythm and words fall apart in the ear.

As unhurried and efficient as an ophthalmologist arriving at a prescription by placing different lenses before the eye, Wittmann reveals, chapter by chapter, how our view through that 3-second chink is shaped by anxiety, age, boredom, appetite and feeling.

Unfortunately, his approach smacks of the textbook, and his attempt at a “new solution to the mind-body problem” is a mess. However, his literary allusions – from Thomas Mann’s study of habituation in The Magic Mountain to Sten Nadolny’s evocation of the present moment in The Discovery of Slowness – offer real insight. Indeed, they are an education in themselves for anyone with an Amazon “buy” button to hand.

As we read Felt Time, do we gain most by mulling Wittmann’s words, even if some allusions are unfamiliar? Or are we better off chasing down his references on the internet? Which is the more interesting option? Or rather: which is “less boring”?

Sandi Mann’s The Upside of Downtime is also about time, inasmuch as it is about boredom.

Once we delighted in devices that put all knowledge and culture into our pockets. But our means of obtaining stimulation have grown so routine that they are now themselves a source of boredom. By removing the tedium of waiting, says psychologist Mann, we have turned ourselves into sensation junkies. It’s hard for us to pay attention to a task when more exciting stimuli are on offer, and being exposed to even subtle distractions can make us feel more bored.

Sadly, Mann’s book demonstrates the point all too well. It is a design horror: a mess of boxed-out paragraphs and bullet-pointed lists. Each is entertaining in itself, yet together they render Mann’s central argument less and less engaging, for exactly the reasons she has identified. Reading her is like watching a magician take a bullet to the head while “performing” Russian roulette.

In the end Mann can’t decide whether boredom is a good or bad thing, while Wittmann’s more organised approach gives him the confidence he needs to walk off a cliff as he tries to use the brain alone to account for consciousness. But despite the flaws, Wittmann is insightful and Mann is engaging, and, praise be, there’s always next time.

 

Eugenic America: how to exclude almost everyone

Imbeciles: The Supreme Court, American eugenics, and the sterilization of Carrie Buck by Adam Cohen (Penguin Press)

Defectives in the Land: Disability and immigration in the age of eugenics by Douglas C. Baynton (University of Chicago Press)

for New Scientist, 22 March 2016

ONE of 19th-century England’s last independent “gentleman scientists”, Francis Galton was the proud inventor of underwater reading glasses, an egg-timer-based speedometer for cyclists, and a self-tipping top hat. He was also an early advocate of eugenics, and his Hereditary Genius was published two years after the first part of Karl Marx’s Das Kapital.

Both books are about the betterment of the human race: Marx supposed environment was everything; Galton assumed the same for heredity. “If a twentieth part of the cost and pains were spent in measures for the improvement of the human race that is spent on the improvement of the breed of horses and cattle,” he wrote, “what a galaxy of genius might we not create! We might introduce prophets and high priests of civilisation into the world, as surely as we… propagate idiots by mating cretins.”

What would such a human breeding programme look like? Would it use education to promote couplings that produced genetically healthy offspring? Or would it discourage or prevent pairings that would otherwise spread disease or dysfunction? And would it work by persuasion or by compulsion?

The study of what was then called degeneracy fell to a New York social reformer, Richard Louis Dugdale. During an 1874 inspection of a jail in New York State, Dugdale learned that six of the prisoners there were related. He traced the Jukes family tree back six generations, and found that some 350 people related to this family by blood or marriage were criminals, prostitutes or destitute.

Dugdale concluded that, like genius, “degeneracy” runs in families, but his response was measured. “The licentious parent makes an example which greatly aids in fixing habits of debauchery in the child. The correction,” he wrote, “is change of the environment… Where the environment changes in youth, the characteristics of heredity may be measurably altered.”

Other reformers were not so circumspect. An Indiana reformatory promptly launched a eugenic sterilisation effort, and in 1907 Indiana enacted the world’s first compulsory sterilisation statute. California followed suit in 1909. Between 1927 and 1979, Virginia forcibly sterilised at least 7450 “unfit” people. One of them was Carrie Buck, a woman labelled feeble-minded and kept ignorant of the details of her own case right up to the point in October 1927 when her fallopian tubes were tied and cauterised using carbolic acid and alcohol.

In Imbeciles, Adam Cohen follows Carrie Buck through the US court system, past the desks of one legal celebrity after another, and not one of them, not William Howard Taft, not Louis Brandeis, not Oliver Wendell Holmes Jr, gave a damn about her.

Cohen anatomises in pitiless detail how inept civil society can be at assimilating scientific ideas. He also does a good job explaining why attempts to manipulate the genetic make-up of whole populations can only fail to improve the genetic health of our species. Eugenics fails because it looks for genetic solutions to what are essentially cultural problems. The anarchist biologist Peter Kropotkin made this point as far back as 1912. Who were the unfit, he asked the first international eugenics congress in London: workers or monied idlers? Those who produced degenerates in slums or those who produced degenerates in palaces? Culture exerts a huge influence over the way we live our lives, hopelessly complicating our measures of strength, fitness and success.

Readers of Cohen’s book would also do well to watch out for Douglas Baynton’s Defectives in the Land, to be published in June. Focusing on immigrant experiences in New York, Baynton explains how ideas about genetics, disability, race, family life and employment worked together to exclude an extraordinarily diverse range of men and women from the shores of the US.

“Doesn’t this squashy sentimentality of a big minority of our people about human life make you puke?” Holmes once exclaimed. Holmes was a miserable bigot, but he wasn’t wrong to thirst for more rigour in our public discourse. History is not kind to bad ideas.

How the forces inside cells actually behave

Animal Electricity: How we learned that the body and brain are electric machines by Robert B. Campenot (Harvard University Press) for New Scientist, 9 March 2016.

IF YOU stood at arm’s length from someone and each of you had 1 per cent more electrons than protons, the force pushing the two of you apart would be enough to lift a “weight” equal to that of the entire Earth.
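Feynman’s figure is easy enough to sanity-check with Coulomb’s law. The sketch below is mine, not Campenot’s or Feynman’s, and assumes a 70-kilogram body at a one-metre arm’s length:

```python
# Back-of-envelope check of Feynman's estimate. The inputs are assumptions,
# not figures from the book: 70 kg per person, 1 m apart, and roughly half
# of each body's nucleons being protons (each matched by an electron).
K = 8.99e9            # Coulomb constant, N m^2 / C^2
E_CHARGE = 1.602e-19  # elementary charge, C
M_NUCLEON = 1.67e-27  # nucleon mass, kg

protons = 70.0 / (2 * M_NUCLEON)    # about 2e28 protons per body
excess = 0.01 * protons * E_CHARGE  # 1 per cent surplus electrons, in coulombs
force = K * excess**2 / 1.0**2      # Coulomb repulsion at r = 1 m, in newtons

earth_weight = 5.97e24 * 9.81       # Earth's mass times g, in newtons
print(f"repulsion: {force:.1e} N, Earth's weight: {earth_weight:.1e} N")
```

Both numbers land in the region of 10^25 newtons, which is all an order-of-magnitude boast like Feynman’s needs.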

This startling observation, from Richard Feynman’s Lectures on Physics, so impressed cell biologist Robert Campenot that he built quite a peculiar career around it. Not content with the mechanical metaphors of molecular biology, Campenot has studied living tissue as a delicate and complex mechanism that thrives by tweaking tiny imbalances in electrical charge.

If only the book were better prepared. Campenot’s enthusiasm for Feynman has him repeat the anecdote about lifting the world almost word for word, in the preface and introduction. Duplicating material is a surprisingly easy gaffe for a writer, and it is why we have editors. Where were they?

Campenot’s generous account ranges from Galvani’s discovery of animal electricity to the development of thought-controlled prosthetic limbs. He has high regard for popular science. But his is the rather fussy appreciation of the academic outsider who, uncertain of the form’s aesthetic potential, praises it for its utility. “The value of popularising science should never be underestimated because it occasionally attracts the attention of people who go on to make major contributions.” The pantaloonish impression he makes here is not wholly unrepresentative of the book.

Again, one might wish Campenot’s relationship with his editor had been more creative. Popular science writing rarely handles electricity well, let alone ion channels and membrane potentials. So, when it comes to developing suitable metaphors, Campenot is thrown on his own resources. His metaphors are as effective as one could wish for, but they suffer from repetition. One imagines the author wondering if he has done enough to nail his point, but with no one to reassure him.

Faults aside, this is a good book. Its mix of schoolroom electricity and sophisticated cell biology is highly eccentric but this, I think, speaks much in Campenot’s favour. The way organic tissue manipulates electricity, sending signals in broad electrical waves that can extend up to a third of a metre, is a dimension of biology we have taken on trust, domesticating it behind high-order metaphors drawn from computer science. Consequently, we have been unable to visualise how the forces in our cells actually behave. This was bound to turn out an odd endeavour. So be it. The odder, the better, in fact.

Putting the wheel in its place

The Wheel: Inventions and reinventions by Richard W. Bulliet (Columbia University Press), for New Scientist, 20 January 2016

IN 1870, a year after the first rickshaws appeared in Japan, three inventors separately applied for exclusive rights. Already, there were too many workshops serving the burgeoning market.

We will never know which of them, if any, invented this internationally popular, stackable, hand-drawn passenger cart. Just three years after its invention, the rickshaw had totally displaced the palanquin (a covered litter carried on the shoulders of two bearers) as the preferred mode of passenger transport in Japan.

What made the rickshaw so different from a wagon or an ox-cart and, in the eyes of many Westerners, so cruel, was the idea of it being pulled by a man instead of a farm animal. Pushing wheelchairs and baby carriages posed no problem, but pulling turned a man into a beast. “This quirk of perception,” Bulliet says, “reflects a history of human-animal relations that the Japanese – who ate little red meat, had few large herds of cattle and horses, and seldom used animals to pull vehicles – did not share with Westerners.”

In answer to some questions that seem far more difficult, Bulliet provides extraordinarily precise answers. He proposes an exact birth for the wheel: the wheel-set design, whereby wheels are fixed to rotating axles, was invented for use on mine cars in copper mines in the Carpathian mountains, perhaps as early as 4000 BC.

Other questions remain intractable. Why did wheeled vehicles not catch on in pre-Columbian America? The peoples of North and South America did not use wheels for transportation before Christopher Columbus arrived. They made wheeled toys, though. Cattle-herding societies from Senegal to Kenya were not taken with wheels either, though they were happy enough to feature the chariots of visitors in their rock paintings.

Bulliet has a lot of fun teasing generations of anthropologists, archaeologists and historians for whom the wheel has been a symbol of self-evident utility: how could those foreign types not get it? His answer is radical: the wheel is actually not that great an idea. It only really came into its own once John McAdam, a Scot born in 1756, introduced a superior way to build roads. It’s worth remembering that McAdam insisted the best way to manufacture the small, sharp-edged stones he needed was to have workers, including women and children, sit beside the road and break up larger rocks. So much for progress.

The wheel revolution is, to Bulliet’s mind, a recent and largely human-powered one. Bicycles, shopping carts, baby strollers, dollies, gurneys and roll-aboard luggage: none of these was conceived before 1800. At the dawn of Europe’s Renaissance, in the 14th century, four-wheeled vehicles were not in common use anywhere in the world.

Bulliet ends his history with the oddly conventional observation that “invention is seldom a simple matter of who thought of something first”. He could have challenged the modern shibboleth (born in Samuel Butler’s Erewhon and given mature expression in George Dyson’s Darwin Among the Machines) that technology evolves. Add energy to an unbounded system, and complexity is pretty much inevitable. There is nothing inevitable about technology, though; human agency cannot be ignored. Even a technology as ubiquitous as the wheel turns out to be a scrappy hostage to historical contingency.

I may be misrepresenting the author’s argument here. It is hard to tell, because Bulliet approaches the philosophy of technology quite gingerly. He can afford to release the soft pedal. This is a fascinating book, but we need more, Professor Bulliet!

 

 

 

The disaster of the cloud itself

Tung-Hui Hu’s A Prehistory of the Cloud reviewed for New Scientist

LAST week, to protect my photographs of a much-missed girlfriend, I told all my back-up services to talk to each other. My snaps have since been multiplying like the runaway brooms in Disney’s Fantasia, and I have spent days trying to delete them.

Apart from being an idiot, I got into this fix because my data has been placed at one invisible but crucial remove in the cloud, zipping between energy-hungry servers scattered across the globe at the behest of algorithms I do not understand.

By duplicating our digital media to different servers, we insure against loss. The more complex and interwoven these back-up systems become, though, the more insidious our losses. Sync errors swallow documents whole. In the hands of most of us, JPEGs degrade a tiny bit each time they are saved. And all formats fall out of fashion eventually.

“Thus disaster recovery in the cloud often protects us against the disaster of the cloud itself,” says Tung-Hui Hu, a former network engineer whose A Prehistory of the Cloud poses some hard questions of our digital desires. Why are our commercial data centres equipped with iris and palm recognition systems? Why is Stockholm’s most highly publicised data centre housed in a bunker originally built to defend against nuclear attack?

Hu identifies two impulses: “First, a paranoid desire to pre-empt the enemy by maintaining vigilance in the face of constant threat, and second, a melancholic fantasy of surviving the eventual disaster by entombing data inside highly secured data vaults.”

The realm of the cloud does not countenance loss, but when we touch it, we corrupt it. The word for such a system – a memory that preserves, encrypts and mystifies a lost love-object – is melancholy. Hu’s is a deeply melancholy book and for that reason, a valuable one.

How we see now

 

For New Scientist, a review of Nicholas Mirzoeff’s book How to See the World

NICHOLAS MIRZOEFF, a media, culture and communication professor at New York University, wants to justify the study of visual culture by describing, accessibly, how strange our visual world has become.

This has been done before. In 1972 artist and writer John Berger made Ways of Seeing, a UK TV series and a book. This was also the year that astronaut Harrison Schmitt took the Blue Marble picture of Earth from Apollo 17, arguably the most reproduced photograph ever.

By contrast, in How to See the World, Mirzoeff’s mascot shot is the selfie taken by astronaut Akihiko Hoshide during his 2012 spacewalk. This time, Earth is reflected in Hoshide’s visor: the planet is physically different and changing fast. Transformations that were once imperceptible because they unfolded so slowly now happen within a single lifetime. “We have to learn to see the Anthropocene,” writes Mirzoeff.

Images are ubiquitous, and we have learned to read them as frames in a giant, self-assembling graphic novel. Visual meaning is found in the connections we make between those images. We used to flock to the cinema for that sort of peculiar dream logic, but now we struggle to awaken. Mirzoeff cites artist Clement Valla writing that “we are already in the Matrix”.

Simple iconography is in retreat. During the 1962 Cuban missile crisis, Soviet missile trailers were visible in photos shown to the media. By 2003, the photos that US general Colin Powell showed of supposed weapons-making kit were lathered in yellow labelling, claiming to show what we could not in fact see.

Tracing the political, social and environmental implications of our visual culture, in words and black and white images, is a job of work. Mirzoeff succeeds: this is a dizzying and delightful book.

More than human

 

For New Scientist: a review of Ian Tattersall’s The Strange Case of the Rickety Cossack, and other cautionary tales from human evolution

THE odd leg bones and prominent brow ridges of a fossil hominid unearthed in Germany’s Neander Valley in 1856 clearly belong to an ancient relative of Homo sapiens. But the anatomist August Mayer wasn’t having that: what he saw were the remains of a Cossack cavalryman who had spent his life on horseback despite a severe case of rickets, furrowing his brow in agony as a consequence, and who hid himself away to die under 2 metres of fossil-laden sediment.

The “Cossack” in Ian Tattersall’s new book, The Strange Case of the Rickety Cossack, exemplifies the risk of relying too much on the opinion of authorities and not enough on systematic analysis. Before they were bureaucratised and (where possible) automated, several sciences fell down that particular well.

Palaeoanthropology made repeated descents, creating a lot of entertaining clatter in the process. For example, Richard Leakey’s televised live spat with Donald Johanson over human origins in 1981 would be unimaginable today. I think Tattersall, emeritus curator at the American Museum of Natural History, secretly misses this heroic age of simmering feuds and monstrous egos.

The human fossil record ends with us. There are many kinds of lemur but, as he writes, only one kind of human, “intolerant of competition and uniquely able to eliminate it”. As a result, there is an immense temptation to see humans as the acme of an epic evolutionary project, and to downplay the diversity our genus once displayed.

Matters of theory rarely disturbed the 20th-century palaeontologists; they assigned species names to practically every fossil they found until biologist Ernst Mayr, wielding insights from genetics, stunned them into embarrassed silence. Today, however, our severely pruned evolutionary tree grows bushier with every molecular, genetic and epigenetic discovery.

Some claim the group of five quite distinct fossil individuals discovered from 1991 onwards at Dmanisi, east of the Black Sea, belongs to a single species. Use your eyes, says Tattersall; around 2 million years ago, four different kinds of hominid shared that region.

Tattersall explains how epigenetic effects on key genes cascade to produce radical morphological changes in an eye blink, and why our unusual thinking style, far from being the perfected product of long-term selective pressures, was bootstrapped out of existing abilities barely 100,000 years ago.

He performs a difficult balancing act with aplomb, telling the story of human evolution through an accurate and unsparing narrative of what scientists actually thought and did. His humility and generosity are exemplary.

The past is like Oakland: there is no there there

Longing for the Bomb: Oak Ridge and atomic nostalgia by Lindsey A. Freeman (University of North Carolina Press)
Seeing Green: The use and abuse of American environmental images by Finis Dunaway (University of Chicago Press)
for New Scientist, 4 April 2015

THE past can’t be re-experienced. It leaves only traces and artefacts, which we constantly shuffle, sort, discard and recover, in an obsessive effort to recall where we have come from. This is as true of societies as it is of individuals.

Lindsey Freeman, an assistant professor of sociology at the State University of New York, Buffalo, is the grandchild of first-generation residents of Oak Ridge, Tennessee. Once a “secret city”, where uranium was enriched for the US’s Manhattan Project, Oak Ridge opened its gates to the world in 1949 as America’s first “Atomic City”: a post-war utopia of universal healthcare, zero unemployment and state-owned housing.

In Longing for the Bomb, Freeman describes how residents of Oak Ridge dreamed up an identity for themselves as a new breed of American pioneer. She visits Oak Ridge’s Y-12 National Security Complex (an “American Uranium Center of Excellence”) during its Secret City Festival, boards its Scenic Excursion Train and cannot decide if converting a uranium processing site into a wildlife reserve is good or bad.

It would have been easy to turn the Oak Ridge story into something sinister, but Freeman is too generous a writer for that. Oak Ridge owes its existence to the geopolitical business of mass destruction, but its people have created stories that keep them a proud and happy community. Local trumps global, every time.

This is good for the founders of communities, but a problem for those who want to wake up those communities to the need for change. As historian Finis Dunaway puts it in Seeing Green, his history of environmental imagery, “even as media images have made the environmental crisis visible to a mass public, they often have masked systemic causes and ignored structural inequalities”.

Reading this, I was reminded of a talk by author Andrew Blackwell, where he told us just how hard it is to take authentic pictures of some of the world’s most polluted places. Systemic problems do not photograph well. Some manipulation is unavoidable.

Dunaway knows this. Three months after the nuclear accident at Three Mile Island in 1979, the worst radioactive spill in US history occurred near Church Rock, New Mexico, on lands held by the Navajo nation. It took a week for the event to be reported, once, on a single news channel.

The remoteness of the site and a lack of national interest in Native American affairs might explain the silence but, as Dunaway points out, the absence of an iconic and photogenic cooling tower can’t have helped.

The iconic environmental images Dunaway discusses are essentially advertisements, and adverts address individuals. They assume that radical social change will catch on like any other consumer good. For example, the film An Inconvenient Truth, chock full of eye-catching images, is the acme of the sincere advertiser’s art, and its maker, former US vice-president and environmental campaigner Al Gore, is a vocal proponent of carbon offset and other market initiatives.

Dunaway, though, argues that you cannot market radical social action. For him, the moral seems to be that sometimes, you just have to give the order – as Franklin Roosevelt did when he made Oak Ridge a city.

A feast of bad ideas

This Idea Must Die: Scientific theories that are blocking progress, edited by John Brockman (Harper Perennial)

for New Scientist, 10 March 2015

THE physicist Max Planck had a bleak view of scientific progress. “A new scientific truth does not triumph by convincing its opponents…” he wrote, “but rather because its opponents eventually die.”

This is the assumption behind This Idea Must Die, the latest collection of replies to the annual question posed by impresario John Brockman on his stimulating and by now venerable online forum, Edge. The question is: which bits of science do we want to bury? Which ideas hold us back, trip us up or send us off in a futile direction?

Some ideas cited in the book are so annoying that we would be better off without them, even though they are true. Take “brain plasticity”. This was a real thing once upon a time, but the phrase spread promiscuously into so many corners of neuroscience that no one really knows what it means any more.

More than any amount of pontification (and readers wouldn’t believe how many new books agonise over what “science” was, is, or could be), Brockman’s posse capture the essence of modern enquiry. They show where it falls away into confusion (the use of cause-and-effect thinking in evolution), into religiosity (virtually everything to do with consciousness) and cant (for example, measuring nuclear risks with arbitrary yardsticks).

This is a book to argue with – even to throw against the wall at times. Several answers, cogent in themselves, still hit nerves. When Kurt Gray and Richard Dawkins stuck their knives into categorisation, for instance, I was left wondering whether scholastic hand-waving would really be an improvement. And Malthusian ideas about resources inevitably generate more heat than light when harnessed to the very different agendas of Matt Ridley and Andrian Kreye.

On the other hand, there is pleasure in seeing thinkers forced to express themselves in just a few hundred words. I carry no flag for futurist Douglas Rushkoff or psychologist Susan Blackmore, but how good to be wrong-footed. Their contributions are among the strongest, with Rushkoff discussing godlessness and Blackmore on the relationship between brain and consciousness.

Every reader will have a favourite. Mine is palaeontologist Julia Clarke’s plea that people stop asking her where feathered dinosaurs leave off and birds begin. Clarke offers lucid glimpses of the complexities and ambiguities inherent in deciphering the behaviour of long-vanished animals from thin fossil data. The next person to ask about the first bird will probably get a cake fork in their eye.

This Idea Must Die is garrulous and argumentative. I expected no less: Brockman’s formula is tried and tested. Better still, it shows no sign of getting old.

 

The toughest job

Parentology: Everything you wanted to know about the science of raising children but were too exhausted to ask by Dalton Conley
and
It’s Complicated: The social lives of networked teens by Danah Boyd
reviewed for New Scientist

 

As early as page 14 of Parentology, a neonatologist explains to Conley that his sure-to-be-premature daughter should stay in her mother’s womb as long as possible, since “each week is ten more points of IQ”. Conley was furious. “A spark of rage landed on my sleeve. An urge to grab the doctor’s head and bash it against the sharp corner of the sonogram machine seized hold… I wanted to smash his head one time for every IQ point,” he recalls.

For all its insightful, funny, fully researched, conscientiously cited, Freakonomics-style approach to science and statistics, what really powers Parentology is a species of loving rage. The numbers teach us a great deal about what parents cannot do, cannot change and cannot help. However, we learn something quite different and very valuable from Conley. Love, care, interest and empathy won’t change a child’s chances, but they render most of the measures discussed in this book profoundly unimportant.

By all means keep score – it’s a tough world out there, and your kids need all the help they can get. But if you measure your worth as a parent by the numbers, you’ve missed the point of the enterprise.

If parenting is about learning how little influence we have over people and events, then pity also the youths interviewed by Boyd for It’s Complicated. Patronised, legally marginalised and even subject to curfew, US teenagers – to hear Boyd tell it – have but one means to engage with the outside world: via the imperfect medium of the computer screen. “Obsessed” with social media, they are simply trying to recreate, for themselves and each other, a social space denied to them by anxious parents, hostile civic authorities and a mass media bent on exaggerating every conceivable outdoor danger.

Of course, a life online is not simply a life lived behind glass. There are serious problems with social media: chiefly, the obstacle they present to self-reinvention, and the ease with which bullies can weaponise them.

But Boyd has little time for technological determinism. Her fieldwork with worried-well parents and their kids reveals the fault is not in our computers but in ourselves, that we scare our kids into their bedrooms, then spy on them constantly once they’re there. And she marshals a huge body of sociological evidence, anecdotal and statistical, to support this.

Parents, you’ve had your chance. Of course you blew it. Now leave the kids alone.