Thoughts from a Camden bench

Reading The Lonely Century by Noreena Hertz for the Telegraph, 26 September 2020

Economist Noreena Hertz’s new book, about our increasing loneliness in a society geared to mutual exploitation, is explosive stuff. I would guess it was commissioned a while ago, then retrofitted for the Covid-19 pandemic — though I defy you to find any unsightly welding marks. Hertz is too good a writer for that, and her idea too timely, too urgent, and too closely argued to be upstaged by a mere crisis.

Loneliness is bad for our physical and mental health. Lack of social interaction makes us both labile and aggressive, a sucker for any Dionysian movement that offers us even a shred of belonging. These lines of argument precede Covid-19 by years, and have been used to explain everything from changing dating patterns among the young to Donald Trump’s win in 2016. But, goodness, what a highly salted dish they make today, now that we’re consigned to our homes and free association is curbed by law.

Under lockdown, we’re now even more reliant on the internet to maintain our working and personal identities. Here again, our immediate experiences sharpen Hertz’s carefully thought-out arguments. Social media is making us unhappy. It’s eroding our civility. It’s driving up rates of suicide. And so on — the arguments here are well-rehearsed, though Hertz’s synthesis is certainly compelling. “Connecting the world may be their goal,” she writes, thinking of the likes of Instagram and TikTok, “but it seems that if in the process connections become shallower, crueller or increasingly distorted, so be it.”

Now that we spend so much time indoors, we’re becoming ever more aware of how our outdoor civic space has been dismantled. And here is Hertz once again, waiting for us outside the shuttered wreck of an abandoned library. Actually libraries are the least of it: these days we don’t even get to exchange a friendly word with the supermarket staff who used to check out our shopping. And heaven help us if, on our way back to house arrest, we try to take the weight off our feet on one of those notorious municipal “Camden benches” — tiered and slanted blocks of concrete designed, not to allow public rest of any sort, but to keep “loiterers” moving.

Having led us, Virgil-like, through the circles of virtual and IRL Hell, Hertz ushers us into the purgatory she calls The Loneliness Economy (the capital letters are hers), and describes some of the ways the private sector has tried to address our cultural atomisation. Consider, for example, the way WeWork’s designers built the corridors and stairways in their co-working spaces deliberately narrow, so that users would have to make eye contact with each other. Such strategies are a bit “nudgy” for my taste, but sooner them than that bloody Camden bench.

The trouble with commercialising solutions for loneliness should be obvious: the lonelier we are, the more money this sector will make. So the underlying social drivers of loneliness will be ignored, and only the symptoms will be addressed. “They want to sell the benefit of working in close proximity with others, but with none of the social buy-in, the *hard work* that community requires,” Hertz writes, and wonders whether the people joining many of these new, shiny, commercialised communities “have the time or lifestyles that community-building demands.”

Bringing such material to life necessarily means cherry-picking your examples. An impatient or hostile reader might take exception to the graduate student who spent so long curating her “Jen Goes Ziplining” Instagram post that she never actually went ziplining; also to the well-paid executive who lives out of his car and spends all his money on platonic cuddling services. The world is never short of foolish and broken people, and two swallows do not make a summer.

Still, I can’t see how Hertz’s account is harmed by such vividly rendered first-person research. And readers keen to see Hertz’s working have 130 pages of notes to work from.

More seriously, The Lonely Century suffers from the same limitation as Rutger Bregman’s recent (and excellent) Humankind, about how people are basically good. Let’s give each other a break, says Bregman. Let’s give each other the time of day, says Hertz. But neither author (and it’s not for want of trying) can muster much evidence to suggest the world is turning in the direction they would favour. Bregman bangs on about a school that’s had its funding cut; Hertz about a community cafe that’s closed after running out of money. There are other stories, lots of them, but they are (as Hertz herself concedes at one point) a bit “happy-clappy”.

One of Hertz’s many pre-publication champions calls her book “inspiring”. All it inspired in me, I’m afraid, was terror. The world is full of socially deprogrammed zombies. The Technological Singularity is here, the robots are us, and our open plan office-bound Slack messages constitute, at a global level, some great vegetative Overmind.

Hertz doesn’t go this far. But I do. Subtitled “coming together in a world that’s pulling apart”, The Lonely Century confirmed me in my internal exile, and had me bolting the door to the world even more firmly. I don’t even care that I am the problem Hertz is working so desperately hard to fix. I’m not coming out until it’s all over.

Flame brightly and flame out

Reading Kindred by Rebecca Wragg Sykes for New Scientist, 19 August 2020

How we began to unpick our species’ ancient past in the late 19th century is an astounding story, but not always a pretty one. As well as attaining tremendous insights into the age of Earth and how life evolved, scholars also entertained astonishingly bad ideas about superiority.

Some of these continue today. Why do we assume that Neanderthals, who flourished for 400,000 years, were somehow inferior to Homo sapiens or less fit to survive?

In Kindred, a history of our understanding of Neanderthals, Rebecca Wragg Sykes separates perfectly valid and reasonable questions – for example, “why aren’t Neanderthals around any more?” – from the thinking that casts our ancient relatives as “dullard losers on a withered branch of the family tree”.

An expert in palaeolithic archaeology, with a special interest in the cognitive aspects of stone tool technologies, Sykes provides a fascinating and detailed picture of a field transformed almost out of recognition over the past thirty years. New technologies involve everything from gene sequencing, isotopes and lasers to powerful volumetric algorithms. Well-preserved sites are now not merely dug and brushed: they are scanned and sniffed. High-powered optical microscopes pick out slice and chop marks, electron beams trace the cross-sections of scratches at the nano-scale, and rapid collagen identification techniques determine the type of animal from even tiny bone fragments.

The risk with any new forensic tool is that, in our excitement, we over-interpret the results it throws up. As Sykes wisely says, “A balance must be struck between caution and not ignoring singular artefacts simply because they’re rare or wondrous.”

Many gushing media stories about our ancient relatives don’t last beyond the first news cycle: though Neanderthals may have performed some funerary activity, they didn’t throw flowers on their loved ones’ graves, or decorate their remains in any way. Other stories, though, continue to accumulate a weight of circumstantial evidence. We’ve known for some years that some Neanderthals actually tanned leather; now it seems they may also have spun thread.

An exciting aspect of this book is the way it refreshes our ideas about our own place in hominin evolution. Rather than congratulating other species when they behave “like us”, Sykes shows that it is much more fruitful to see how human talents may have evolved from behaviours exhibited by other species. Take the example of art. Were the circular stone assemblies, discovered in a cave near Bruniquel in southern France in 2016, meant by their Neanderthal creators as monuments? What is the significance of the Neanderthal handprints and ladder designs painted on the walls of three caves in Spain? In both cases, we’d be asking the wrong questions, Sykes says: while undoubtedly striking, Neanderthal art “might not be a massive cognitive leap for hominins who probably already understood the idea of representation.” Animal footprints are effectively symbols already, and even simple tracking “requires an ‘idealised’ form to be kept in mind.”
Captive chimpanzees, given painting materials, enjoy colouring and marking surfaces, though they’re not in the least bit invested in the end result of their labours. So the significance and symbolism of Neanderthal art may simply be that Neanderthals had fun making it.

The Neanderthals of Kindred are not cadet versions of ourselves. They don’t perform “primitive burials”, and they don’t make “proto-art”. They had their own needs, urges, enjoyments, and strategies for survival.

They were not alone and, best of all, they have not quite vanished. Neanderthal nuclear DNA contains glimmers of very ancient encounters between them and other hominin species. Recent research suggests that interbreeding between Neanderthals and Denisovans, and Neanderthals and Homo sapiens, was effectively the norm. “Modern zoology’s concept of allotaxa may be more appropriate for what Neanderthals were to us,” Sykes explains. Like modern cattle and yaks, we were closely related species that varied in bodies and behaviours, yet could also reproduce.

Neanderthals were never very many. “At any point in time,” Sykes says, “there may have been fewer Neanderthals walking about than commuters passing each day through Clapham Junction, London’s busiest train station.” With dietary demands that took a monstrous toll on their environment, they were destined to flame brightly and flame out. That doesn’t make them less than us. It simply makes them earlier.

They were part of our family, and though we carry some part of them inside us, we will never see their like again. This, for my money, is Sykes’s finest achievement. Seeing Neanderthals through her eyes, we cannot but mourn their passing.

Know when you’re being played

Calling Bullshit by Jevin D West and Carl T Bergstrom, and Science Fictions by Stuart Ritchie, reviewed for The Telegraph, 8 August 2020

Last week I received a press release headlined “1 in 4 Brits say ‘No’ to Covid vaccine”. This was such staggeringly bad news, I decided it couldn’t possibly be true. And sure enough, it wasn’t.

Armed with the techniques taught me by biologist Carl Bergstrom and data scientist Jevin West, I “called bullshit” on this unwelcome news, which after all bore all the hallmarks of clickbait.

For a start, the question on which the poll was based was badly phrased. On closer reading it turns out that 25 per cent would decline if the government “made a Covid-19 vaccine available tomorrow”. Frankly, if it was offered *tomorrow* I’d be a refusenik myself. All things being equal, I prefer my medicines tested first.

But what of the real meat of the claim — that daunting figure of “25 per cent”?  It turns out that a sample of 2000 was selected from a sample of 17,000 drawn from the self-selecting community of subscribers to a lottery website. But hush my cynicism: I am assured that the sample of 2000 was “within +/-2% of ONS quotas for Age, Gender, Region, SEG, and 2019 vote, using machine learning”. In other words, some effort has been made to make the sample of 2000 representative of the UK population (but only on five criteria, which is not very impressive. And that whole “+/-2%” business means that up to 40 of the sample weren’t representative of anything).

For this, “machine learning” had to be employed (and, later, “a proprietary machine learning system”)? Well, of course not.  Mention of the miracle that is artificial intelligence is almost always a bit of prestidigitation to veil the poor quality of the original data. And anyway, no amount of “machine learning” can massage away the fact that the sample was too thin to serve the sweeping conclusions drawn from it (“Only 1 in 5 Conservative voters (19.77%) would say No” — it says, to two decimal places, yet!) and is anyway drawn from a non-random population.

Exhausted yet? Then you may well find Calling Bullshit essential reading. Even if you feel you can trudge through verbal bullshit easily enough, this book will give you the tools to swim through numerical snake-oil. And this is important, because numbers easily slip past the defences we put up against mere words. Bergstrom and West teach a course at the University of Washington from which this book is largely drawn, and hammer this point home in their first lecture: “Words are human constructs,” they say; “Numbers seem to come directly from nature.”

Shake off your naive belief in the truth or naturalness of the numbers quoted in new stories and advertisements, and you’re half way towards knowing when you’re being played.

Say you diligently applied the lessons in Calling Bullshit, and really came to grips with percentages, causality, selection bias and all the rest. You may well discover that you’re now ignoring everything — every bit of health advice, every over-wrought NASA announcement about life on Mars, every economic forecast, every exit poll. Internet pioneer Jaron Lanier reached this point last year when he came up with Ten Arguments for Deleting Your Social Media Accounts Right Now. More recently the best-selling Swiss pundit Rolf Dobelli has ordered us to Stop Reading the News. Both deplore our current economy of attention, which values online engagement over the provision of actual information (as when, for instance, a review like this one gets headlined “These Two Books About Bad Data Will Break Your Heart”; instead of being told what the piece is about, you’re being sold on the promise of an emotional experience).

Bergstrom and West believe that public education can save us from this torrent of micro-manipulative blither. Their book is a handsome contribution to that effort. We’ve lost Lanier and Dobelli, but maybe the leak can be stopped up. This, essentially, is what the authors are about; they’re shoring up the Enlightenment ideal of a civic society governed by reason.

Underpinning this ideal is science, and the conviction that the world is assembled on a bedrock of fundamentally unassailable truths.

Philosophical nit-picking apart, science undeniably works. But in Science Fictions Stuart Ritchie, a psychologist based at King’s College, shows just how contingent and gimcrack and even shoddy the whole business can get. He has come to praise science, not to bury it; nevertheless, his analyses of science’s current ethical ills — fraud, hype, negligence and so on — are devastating.

The sheer number of problems besetting the scientific endeavour becomes somewhat more manageable once we work out which ills are institutional, which have to do with how scientists communicate, and which are existential problems that are never going away whatever we do.

Our evolved need to express meaning through stories is an existential problem. Without stories, we can do no thinking worth the name, and this means that we are always going to prioritise positive findings over negative ones, and find novelties more charming than rehearsed truths.

Such quirks of the human intellect can be and have been corrected by healthy institutions at least some of the time over the last 400-odd years. But our unruly mental habits run wildly out of control once they are harnessed to a media machine driven by attention.  And the blame for this is not always easily apportioned: “The scenario where an innocent researcher is minding their own business when the media suddenly seizes on one of their findings and blows it out of proportion is not at all the norm,” writes Ritchie.

It’s easy enough to mount a defence of science against the tin-foil-hat brigade, but Ritchie is attempting something much more discomforting: he’s defending science against scientists. Fraudulent and negligent individuals fall under the spotlight occasionally, but institutional flaws are Ritchie’s chief target.

Reading Science Fictions, we see field after field fail to replicate results, correct mistakes, identify the best lines of research, or even begin to recognise talent. In Ritchie’s proffered bag of solutions are desperately needed reforms to the way scientific work is published and cited, and some more controversial ideas about how international mega-collaborations may enable science to catch up on itself and check its own findings effectively (or indeed at all, in the dismal case of economic science).

At best, these books together offer a path back to a civic life based on truth and reason. At worst, they point towards one that’s at least a little bit defended against its own bullshit. Time will tell whether such efforts can genuinely turn the ship around, or are simply here to entertain us with a spot of deckchair juggling. But there’s honest toil here, and a lot of smart thinking with it. Reading both, I was given a fleeting, dizzying reminder of what it once felt like to be a free agent in a factual world.

An intellectual variant of whack-a-mole

Reading Joseph Mazur’s The Clock Mirage for The Spectator, 27 June 2020 

Some books elucidate their subject, mapping and sharpening its boundaries. The Clock Mirage, by the mathematician Joseph Mazur, is not one of them. Mazur is out to muddy time’s waters, dismantling the easy opposition between clock time and mental time, between physics and philosophy, between science and feeling.

That split made little sense even in 1922, when the philosopher Henri Bergson and the young physicist Albert Einstein (much against his better judgment) went head-to-head at the Société française de philosophie in Paris to discuss the meaning of relativity. (Or that was the idea. Actually they talked at complete cross-purposes.)

Einstein won. At the time, there was more novel insight to be got from physics than from psychological introspection. But time passes, knowledge accrues and fashions change. The inference (not Einstein’s, though people associate it with him) that time is a fourth dimension, commensurable with the three dimensions of space, is looking decidedly frayed. Meanwhile Bergson’s psychology of time has been pruned by neurologists and put out new shoots.

Our lives and perceptions are governed, to some extent, by circadian rhythms, but there is no internal clock by which we measure time in the abstract. Instead we construct events, and organise their relations, in space. Drivers, thinking they can make up time with speed, acquire tickets faster than they save seconds. Such errors are mathematically obvious, but spring from the irresistible association we make (poor vulnerable animals that we are) between speed and survival.

The more we understand about non-human minds, the more eccentric and sui generis our own time sense seems to be. Mazur ignores the welter of recent work on other animals’ sense of time — indeed, he winds the clock back several decades in his careless talk of animal ‘instincts’ (no one in animal behaviour uses the ‘I’ word any more). For this, though, I think he can be forgiven. He has put enough on his plate.

Mazur begins by rehearsing how the Earth turns, how clocks were developed, and how the idea of universal clock time came hot on the heels of the railway (mistimed passenger trains kept running into each other). His mind is engaged well enough throughout this long introduction, but around page 47 his heart beats noticeably faster. Mazur’s first love is theory, and he handles it well, using Zeno’s paradoxes to unpack the close relationship between psychology and mathematics.

In Zeno’s famous foot race, by the time fleet-footed Achilles catches up to the place where the plodding tortoise was, the tortoise has moved a little bit ahead. That keeps happening ad infinitum, or at least until Newton (or Leibniz, depending on who you think got to it first) pulls calculus out of his hat. Calculus is an algebraic way of handling (well, fudging) the continuity of the number line. It handles vectors and curves and smooth changes — the sorts of phenomena you can measure only if you’re prepared to stop counting.

But what if reality is granular after all, and time is quantised, arriving in discrete packets like the frames of a celluloid film stuttering through the gate of a projector? In this model of time, calculus is redundant and continuity is merely an illusion. Does it solve Zeno’s paradox? Perhaps it makes it 100 times more intractable. Just as motion needs time, time needs motion, and ‘we might wonder what happens to the existence of the world between those falling bits of time sand’.

This is all beautifully done, and Mazur, having hit his stride, maintains form throughout the rest of the book, though I suspect he has bitten off more than any reader could reasonably want to swallow. Rather than containing and spotlighting his subject, Mazur’s questions about time turn out (time and again, I’m tempted to say) to be about something completely different, as though we were playing an intellectual variant of whack-a-mole.

But this, I suppose, is the point. Mazur quotes Henri Poincaré:

Not only have we not direct intuition of the equality of two periods, but we have not even direct intuition of the simultaneity of two events occurring in two different places.

Our perception of time is so fractured, so much an ad hoc amalgam of the chatter of numerous, separately evolved systems (for the perception of motion; for the perception of daylight; for the perception of risk, and on and on — it’s a very long list), it may in the end be easier to abandon talk of time altogether, and for the same reason that psychologists, talking shop among themselves, eschew vague terms such as ‘love’.

So much of what we mean by time, as we perceive it day to day, is really rhythm. So much of what physicists mean by time is really space. Time exists, as love exists, as a myth: real because contingent, real because constructed, a catch-all term for phenomena bigger, more numerous and far stranger than we can yet comprehend.

In praise of tiny backpacks

Reading The Bird Way by Jennifer Ackerman for New Scientist, 17 June 2020

Visit the Australian National Botanic Gardens in Canberra, and you may stumble upon an odd sight: a human figure, festooned with futuristic-looking monitoring gear. Is it a statue? No: when children poke it (this happens a lot) the statue blinks.

Meet Jessica McLachlan, a researcher at Australian National University, at work studying the fine, never-before-detected details of bird behaviour. The gear she wears is what it takes to observe the world as birds themselves see it. Not to put too fine a point on it, birds are a lot faster than we are.

Bird brains are miracles of miniaturisation. Their neurons are smaller, more numerous, and more densely packed. They differ architecturally too, cramming more and faster processing power into a smaller space.

This means that in the bird world, things happen fast — sometimes too fast for us to see. (Unless you film blue-capped cordon-bleus at 300 frames per second, how could you possibly know that they tap-dance in time with their singing?)

Modern recording equipment enables us to study an otherwise invisible world. For example, by strapping tiny backpacks to seabirds, and correlating their flight with known ocean activities, we’ve discovered that birds have a quite extraordinary sense of smell, following krill across apparently featureless horizons that are, for them, “elaborate landscapes of eddying odor plumes.”

Science and nature writer Jennifer Ackerman’s fresh, re-invigorated account of the world of birds is arranged on traditional lines (sections cover her subjects’ singing, work, play, love and parenting), but it is informed, and transformed, by accounts of cybernetically enhanced field studies, forensic analyses and ambitious global collaboration. She has travelled far and talked to many, she is generous to a fault with her academic sources, and her descriptions of her visits, field trips and adventures are engaging, but never obtrusive.

Her account centers on the bird life of Australia. This is reasonable: bird song began here. Songbirds, parrots and pigeons evolved here. In Australia, birds fill more ecological niches, are smarter, and longer-lived.

Like every ornithological writer before her, Ackerman is besotted by the sheer variety of her subjects. One of the most endearing passages in her book compares parrots and corvids. Both are intelligent, highly social species, but, after 92 million years of evolutionary separation, the similarities stop there. Ravens are frightened of novelty. Keas lap it up. Keas, confronted by a new researcher, will abandon a task to go play with the stranger. Ravens will simply wig out. Ravens have a strict feeding hierarchy. Curious male keas will good-naturedly stuff food down an unfamiliar youngster’s gullet to the point where it’s fending them off.

Ackerman’s account is often jaw-dropping, and never more shocking than when she assembles the evidence for the cultural sophistication of bird song. Birds decode far more from sounds than we do and until recently we’ve been deaf to their acoustic complexity. Japanese tits use 11 different notes in their songs, and it’s the combination of notes that encodes information. Swap two notes around, and you elicit different responses. If this isn’t quite syntax, it’s something very like it. The drongo has absolute conscious control over its song, using up to 45 mimicked alarm calls to frighten other species, such as meerkats, into dropping their lunch — and it will target specific warning calls at individuals so they don’t twig what’s going on.

Meanwhile, many different species of bird worldwide, from Australia to Africa to the Himalayas, appear to have developed a universal, and universally comprehensible signal to warn of the approach of brood parasites (cuckoos and the like).

If the 20th century was the golden age of laboratory study, the 21st is shaping up to become a renaissance for the sorts of field studies Charles Darwin would recognise. Now that cybernetically enhanced researchers like Jessica McLachlan can follow individuals, we have a chance to gauge and understand the intelligence exhibited by the animals around us. Intelligence en masse is, after all, really hard to spot. It doesn’t suit tabulation, or statistical analysis. It doesn’t make itself known from a distance. Intelligent behaviour is unusual. It’s novel. It takes a long time to spot.

If birds are as intelligent as so many of the stories in Ackerman’s eye-opening book suggest, then this, of course, may only be the start of our problems. In Sweden, corvid researchers Mathias and Helena Osvath have befriended a raven who turns up for their experiments, aces them, then flaps off again. “In the ethics section of grant applications,” says Mathias, “it’s difficult to explain…”

Humanity unleashed

Reading Rutger Bregman’s Humankind: A hopeful history for New Scientist, 10 June 2020

In 1651 English philosopher Thomas Hobbes startled the world with Leviathan, an account of the good and evil lurking in human nature. Hobbes argued that people, left to their own devices, were naturally vicious. (Writing in the aftermath of Europe’s cataclysmic Thirty Years War, the evidence was all around.) Ultimately, though, Hobbes’s vision was positive. Humans are also naturally gregarious. Gathering in groups, eventually we become more together than we were apart. We become villages, societies, whole civilisations.

Hobbes’s argument can be taken in two ways. We can glory in what we have built over the course of generations. Or we can live in terror of that future moment when the thin veneer of our civilisation cracks and lets all the devils out.

Even as I was writing this review, I came across the following story. In April, at the height of the surge in coronavirus cases, Americans purchased more guns than at any other point since the FBI began collecting data over 20 years ago. I would contend that these are not stupid people. They are, I suspect, people who have embraced a negatively Hobbesian view of the world, and are expecting the apocalypse.

Belief in innate human badness is self-fulfilling. (Bregman borrows from medical jargon and calls it a nocebo: a negative expectation that makes one’s circumstances progressively worse.) And we do seem to try everything in our power to think the worst of ourselves. For instance, we give our schoolchildren William Golding’s 1954 novel Lord of the Flies to read (and fair enough — it’s a very good book) but nobody thinks to mention that the one time boys really were trapped on a desert island for over a year — on a rocky Polynesian atoll without fresh water, in 1965 — they survived, stayed fit and healthy, successfully set a lad’s broken leg, and formed friendships that have lasted a lifetime. Before Bregman came along you couldn’t even find this story on the internet.

From this anecdotal foundation, Bregman assembles his ferocious argument, demolishing one Hobbesian shibboleth after another. Once the settlers of Easter Island had chopped down all the trees on their island (the subject of historian Jared Diamond’s bestselling 2005 book Collapse), their civilisation did not fall apart. It thrived — until European voyagers arrived, bringing diseases and the slave trade. When Catherine Susan Genovese was murdered in New York City on 13 March 1964 (the notorious incident which added the expression “bystander effect” to the psychological lexicon), her neighbours did not watch from out of their windows and do nothing. They called the police. One neighbour rushed out into the street and held her while she was dying.

Historians and reporters can’t be trusted; neither, alas, can scientists. Philip Zimbardo’s Stanford prison experiment, conducted in August 1971, was supposed to have spun out of control in less than two days, as students playing prison guards set about abusing and torturing fellow students cast in the role of prisoners. Almost everything that has been written about the experiment is not merely exaggerated, it’s wrong. When in 2001 the BBC restaged the experiment, the guards and the prisoners spent the whole time sitting around drinking tea together.

Bregman also discusses the classic “memory” experiment by Stanley Milgram, in which a volunteer is persuaded to electrocute a person nearly to death (an actor, in fact, and in on the wheeze). The problem here is less the experimental design and more the way the experiment was interpreted.

Early accounts took the experiment to mean that people are robots, obeying orders unthinkingly. Subsequent close study of the transcripts shows something rather different: that people are desperate to do the right thing, and their anxiety makes them frighteningly easy to manipulate.

If we’re all desperate to be good people, then we need a new realism when it comes to human nature. We can’t any longer assume that because we are good, those who oppose us must be bad. We must learn to give people a chance. We must learn to stop manipulating people the whole time. From schools to prisons, from police forces to political systems, Bregman visits projects around the world that, by behaving in ways that can seem surreally naive, have resolved conflicts, reformed felons, encouraged excellence and righted whole economies.

This isn’t an argument between left and right, between socialist and conservative; it’s about what we know about human nature and how we can accommodate a better model of it into our lives. With Humankind Bregman moves from politics, his usual playground, into psychological, even spiritual territory. I am fascinated to know where his journey will lead.

 

Joy in the detail

Reading Charles Darwin’s Barnacle and David Bowie’s Spider by Stephen Heard for the Spectator, 16 May 2020

Heteropoda davidbowie is a species of huntsman spider. Though rare, it has been found in parts of Malaysia, Singapore, Indonesia and possibly Thailand. (The uncertainty arises because it’s often mistaken for a similar-looking species, Heteropoda javana.) In 2008 a German collector sent photos of his unusual-looking “pet” to Peter Jäger, an arachnologist at the Senckenberg Research Institute in Frankfurt. Consequently, and in common with most other living finds, David Bowie’s spider was discovered twice: once in the field, and once in the collection.

Bowie’s spider is famous, but not exceptional. Jäger has discovered more than 200 species of spider in the last decade, and names them after politicians, comedians and rock stars to highlight our ecological plight. Other researchers find more pointed ways to further the same cause. In the first month of Donald Trump’s administration, Iranian-Canadian entomologist Vazrick Nazari discovered a moth with a head crowned with large, blond comb-over scales. There’s more to Neopalpa donaldtrumpi than a striking physical resemblance: it lives in a federally protected area around where the border wall with Mexico is supposed to go. Cue headlines.

Species are becoming extinct 100 times faster than they did before modern humans arrived. This makes reading a book about the naming of species a curiously queasy affair. Nor is there much comfort to be had in evolutionary ecologist Stephen Heard’s observation that, having described 1.5 million species, we’ve (at very best) only recorded half of what’s out there. There is, you may recall, that devastating passage in Cormac McCarthy’s western novel Blood Meridian in which Judge Holden meticulously records a Native American artifact in his sketchbook — then destroys it. Given that to discover a species you must, by definition, invade its environment, Holden’s sketch-and-burn habit appears to be a painfully accurate metonym for what the human species is doing to the planet. Since the 1970s (when there were twice as many wild animals as there are now) we’ve been discovering and endangering new species in almost the same breath.

Richard Spruce, one of the Victorian era’s great botanical explorers, who spent 15 years exploring the Amazon from the Andes to its mouth, is a star of this short, charming book about how we have named and ordered the living world. No detail of his bravery, resilience and grace under pressure comes close to the eloquence of this passing quotation, however: “Whenever rains, swollen streams, and grumbling Indians combined to overwhelm me with chagrin,” he wrote in his account of his travels, “I found reason to thank heaven which had enabled me to forget for the moment all my troubles in the contemplation of a simple moss.”

Stephen Heard, an evolutionary ecologist based in Canada, explains how extraordinary amounts of curiosity have been codified to create a map of the living world. The legalistic-sounding codes by which species are named are, it turns out, admirably egalitarian, ensuring that the names amateurs give species are just as valid as those of professional scientists.

Formal names are necessary because of the difficulty we have in distinguishing between similar species. Common names run into this difficulty all the time. There are too many of them, so the same species gets different names in different languages. At the same time, there aren’t enough of them, so that, as Heard points out, “Darwin’s finches aren’t finches, African violets aren’t violets, and electric eels aren’t eels”; robins, blackbirds and badgers are entirely different animals in Europe and North America; and virtually every flower has at one time or another been called a daisy.

Names also tend, reasonably enough, to be descriptive. This is fine when you’re distinguishing between, say, five different types of fish. When there are 500 different fish to sort through, however, absurdity beckons. Heard lovingly transcribes the pre-Linnaean species name of the English whiting, formulated around 1738: “Gadus, dorso tripterygio, ore cirrato, longitudine ad latitudinem tripla, pinna ani prima officulorum trigiata”. So there.

It takes nothing away from the genius of Swedish physician Carl Linnaeus, who formulated the naming system we still use today, to say that he came along at the right time. By Linnaeus’s day, it was possible to look things up. Advances in printing and distribution had made reference works possible. Linnaeus’s innovation was to decouple names from descriptions. And this, as Heard reveals in anecdote after anecdote, is where the fun now slips in: the mythopoeic cool of the baboon Papio anubis, the mischievous smarts of the beetle Agra vation, the nerd celebrity of the lemur Avahi cleesei.

Heard’s taxonomy of taxonomies makes for somewhat thin reading; this is less a book, more a dozen interesting magazine articles flying in close formation. But its close focus, bringing to life minutiae of both the living world and the practice of science, is welcome.

I once met Michael Land, the neurobiologist who figured out how the lobster’s eye works. He told me that the trouble with big ideas is that they get in the way of the small ones. Heard’s lesson, delivered with such a light touch, is the same. The joy, and much of the accompanying wisdom, lies in the detail.

Goodbye to all that

Reading Technologies of the Human Corpse by John Troyer for the Spectator, 11 April 2020

John Troyer, the director of the Centre for Death and Society at the University of Bath, has moves. You can find his interpretative dances punctuating a number of his lectures, which go by such arresting titles as ‘150 Years of the Human Corpse in American History in Under 15 Minutes with Jaunty Background Music’ and ‘Abusing the Corpse Even More: Understanding Necrophilia Laws in the USA — Now with more Necro! And more Philia!’ (Wisconsin and Ohio are, according to Troyer’s eccentric-looking and always fascinating website, ‘two states that just keep giving and giving when it comes to American necrophilia cases’.)

Troyer’s budding stand-up career has taken a couple of recent knocks. First was the ever more pressing need for him to crack on with his PhD (his dilatoriness was becoming a family joke). Technologies of the Human Corpse is yanked, not without injury, from that career-establishing academic work. Even as he assembled the present volume, however, there came another, far more personal, blow.

Late in July 2017 Troyer’s younger sister Julie was diagnosed with an aggressive brain cancer. Her condition deteriorated far more quickly than anyone expected, and on 29 July 2018 she died. This left Troyer — the engaging young American death scholar sprung from a family of funeral directors — having to square his erudite and cerebral thoughts on death and dead bodies with the fact he’d just kissed his sister goodbye. He interleaves poetical journal entries composed during Julie’s dying and her death, her funeral and her commemoration, between chapters written by a younger, jollier and of course shallower self.

To be brutal, the poems aren’t up to much, and on their own they wouldn’t add a great deal by way of nuance or tragedy. Happily for us, however, and to Troyer’s credit, he has transformed them into a deeply moving 30-page memoir that now serves as the book’s preface. This, then, is Troyer’s monster: a powerful essay about dying and bereavement; a set of poems written off the cuff and under great stress; and seven rather disconnected chapters about what’s befallen the human corpse in the past century or so.

Even as the book was going to print, Troyer explains in a hurried postscript, his father, a retired undertaker, lost consciousness following a cardiac arrest and was very obviously dying:

“And seeing my father suddenly fall into a comatose state so soon after watching my sister die is impossible to fully describe: I understand what is happening, yet I do not want to understand what is happening.”

This deceptively simple statement from Troyer the writer is streets ahead of anything Troyer the postgrad can pull off.

But to the meat of the book. The American civil war saw several thousand corpses embalmed and transported on new-fangled railway routes across the continent. The ability to preserve bodies, and even lend them a lifelike appearance months after death, created a new industry that, in various configurations and under several names, now goes by the daunting neologism of ‘deathcare provision’. In the future, this industry will be seen ‘transforming the mostly funeralisation side of the business into a much broader, human body parts and tissue distribution system’, as technical advances make increasing use of cadavers and processed cadaver parts.

So how much is a dead body worth? Between $30,000 and $50,000, says Troyer — five times as much for donors processed into medical implants, dermal implants and demineralised bone matrices. Funds and materials are exchanged through a network of body brokers who serve as middlemen between biomedical corporations such as Johnson & Johnson and the usual sources of human cadavers — medical schools, funeral homes and mortuaries. It is by no means an illegal trade, nor is it morally problematic in most instances; but it is rife with scandal. As one involved party remarks: ‘If you’re cremated, no one is ever going to know if you’re missing your shoulders or knees or your head.’

Troyer is out to show how various industries serve to turn our dead bodies into ‘an unfettered source of capital’. The ‘fluid men’ of Civil War America — who toured the battlefields showing keen students how to embalm a corpse (and almost always badly) — had no idea what a strange story they had started. Today, as the anatomist Gunther von Hagens poses human cadavers in sexual positions to pique and titillate worldwide audiences, we begin to get a measure of how far we have come. Hagens’s posthumous pornography reveals, says Troyer, ‘the ultimate taxonomic power over nature: we humans, or at least our bodies, can live forever because we pull ourselves from nature’.

Technologies of the Human Corpse is a bit of a mess, but I have a lot of time for Troyer. His insights are sound, and his recent travails may yet (and at high human cost — but it was ever thus) make him a writer of some force.

 

“Fat with smell, dissonant and dirty”

Reading Robert Muchembled’s Smells: A Cultural History of Odours in Early Modern Times and Isabel Bannerman’s Scent Magic: Notes from a Gardener for the Telegraph, 18 April 2020

The revolt against scentlessness has been gathering for a while. Muchembled namechecks avant garde perfumes with names like Bat and Rhinoceros. A dear friend of mine favours Musc Kublai Khan for its faecal notes. Another spends a small fortune to smell like cat’s piss. Right now I’m wearing Andy Tauer’s Orange Star — don’t approach unless you like Quality Street orange cremes macerated in petrol…